Semantic analysis (machine learning)
In machine learning, semantic analysis of a corpus is the task of building structures that approximate concepts from a large set of documents. It generally does not require prior semantic understanding of the documents. One approach analyzes human speech with a metalanguage based on predicate logic.[1]:93– Another strategy for understanding the semantics of a text is symbol grounding: if language is grounded, recognizing its meaning amounts to mapping it onto a machine-readable representation. For the restricted domain of spatial analysis, a computer-based language understanding system of this kind has been demonstrated.[2]:123
Latent semantic analysis (sometimes called latent semantic indexing) is a class of techniques in which documents are represented as vectors in a term space. A prominent probabilistic example is PLSI (probabilistic latent semantic indexing).
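A minimal sketch of the idea, assuming scikit-learn and a toy corpus (neither is prescribed by the sources): documents are turned into term-space vectors and then projected into a low-dimensional latent space with a truncated SVD.

```python
# Latent semantic analysis sketch: tf-idf term vectors + truncated SVD.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD

# Illustrative toy corpus; each document becomes a vector in term space.
docs = [
    "the cat sat on the mat",
    "dogs and cats are pets",
    "stock markets fell sharply today",
    "investors sold shares as markets dropped",
]

tfidf = TfidfVectorizer()
X = tfidf.fit_transform(docs)          # term-document matrix

# Truncated SVD projects documents into a small "latent concept" space
# in which related terms and documents end up close together.
lsa = TruncatedSVD(n_components=2, random_state=0)
doc_coords = lsa.fit_transform(X)

for doc, coords in zip(docs, doc_coords):
    print(coords.round(2), doc)
```

In this latent space, the two pet-related documents and the two finance-related documents cluster separately even though they share few exact terms.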
Latent Dirichlet allocation is a generative model that attributes the terms of each document to a mixture of topics.
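A small sketch, again assuming scikit-learn and an illustrative corpus: LDA is fit on raw term counts, and each fitted topic is a distribution over terms while each document is a mixture over topics.

```python
# Latent Dirichlet allocation sketch on raw term counts.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

docs = [
    "the cat sat on the mat",
    "dogs and cats are common pets",
    "stock markets fell sharply today",
    "investors sold shares as markets dropped",
]

counts = CountVectorizer(stop_words="english")
X = counts.fit_transform(docs)

# Two topics: each document gets a topic mixture, each topic a term distribution.
lda = LatentDirichletAllocation(n_components=2, random_state=0)
doc_topic = lda.fit_transform(X)

terms = counts.get_feature_names_out()
for k, topic in enumerate(lda.components_):
    top_terms = [terms[i] for i in topic.argsort()[-3:][::-1]]
    print(f"topic {k}: {top_terms}")
```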
n-grams and hidden Markov models work by representing the term stream as a Markov chain, in which each term is derived from the few terms before it.
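The Markov-chain view can be illustrated with a tiny bigram (n = 2) model in plain Python; the corpus and the maximum-likelihood estimate are illustrative assumptions, not part of the cited sources.

```python
# Bigram Markov-chain model of a term stream.
from collections import Counter, defaultdict

tokens = "the cat sat on the mat and the cat slept".split()

# Count how often each term follows the previous term.
bigrams = defaultdict(Counter)
for prev, curr in zip(tokens, tokens[1:]):
    bigrams[prev][curr] += 1

def next_term_probs(prev):
    """Maximum-likelihood estimate of P(term | previous term)."""
    total = sum(bigrams[prev].values())
    return {term: count / total for term, count in bigrams[prev].items()}

print(next_term_probs("the"))  # e.g. {'cat': 0.67, 'mat': 0.33}
```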
References
- Nitin Indurkhya; Fred J. Damerau (22 February 2010). Handbook of Natural Language Processing. CRC Press. ISBN 978-1-4200-8593-8.
- Michael Spranger (15 June 2016). The evolution of grounded spatial language. Language Science Press. ISBN 978-3-946234-14-2.