Computing Meaning: Volume 4 (Text, Speech and Language Technology)
This book is a collection of papers by leading researchers in computational semantics. It presents a state-of-the-art overview of recent and current research in computational semantics, including descriptions of new methods for constructing and improving resources for semantic computation, such as WordNet, VerbNet, and semantically annotated corpora. It also presents new statistical methods in semantic computation, such as the application of distributional semantics in the compositional calculation of sentence meanings. Computing the meaning of sentences, texts, and spoken or texted dialogue is the ultimate challenge in natural language processing, and the key to a wide range of exciting applications. The breadth and depth of coverage of this book make it suitable as a reference and overview of the state of the field for researchers in Computational Linguistics, Semantics, Computer Science, Cognitive Science, and Artificial Intelligence.
natural language expression are semantically related, or what is the function of a subexpression. This is illustrated in (7) for the function of a discourse connective (the temporal or causal sense of "since"); in (8) (discussed in Sect. 3.3.2) for the implicit coherence relation connecting two sentences in a discourse; in (9) (discussed in more detail in Sect. 3.3) for the function of a temporal expression ("at six o'clock" indicating the time of occurrence of the set-event or the time at which the
were introduced for anaphoric expressions. These unifications are all possible because the only information about the variables c, d, . . . , h in the AIR is that they are either equal to the discourse referent a or to the discourse referent b, but that doesn’t impose any constraints on how they may unify with the USR variables z, u, . . . , r. This reveals an inadequacy in the AIR: the interpretation of the anaphoric links in the annotation has lost the information concerning which token of
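The unification freedom described above can be made concrete: if the AIR records only that each anaphoric variable equals referent a or referent b, then any AIR-to-USR assignment is consistent as long as variables bound to the same referent receive the same USR variable. The following toy consistency check is our own sketch (the dictionary representation and the particular variable subset are illustrative assumptions, not part of the formalism in the text):

```python
# AIR constraints: each anaphoric variable equals discourse referent 'a' or 'b'.
# Illustrative subset of the variables c, d, ..., h from the passage.
air = {'c': 'a', 'd': 'b', 'e': 'a'}

def consistent(unifier, air):
    """Check that a proposed AIR->USR unifier respects the AIR equalities:
    variables bound to the same discourse referent must map to the same
    USR variable."""
    chosen = {}
    for var, referent in air.items():
        usr = unifier[var]
        if chosen.setdefault(referent, usr) != usr:
            return False
    return True

# Both unifiers below pass, illustrating the lost information: the AIR
# cannot say which USR variable (z, u, ..., r) each referent should take.
print(consistent({'c': 'z', 'd': 'u', 'e': 'z'}, air))  # True
print(consistent({'c': 'r', 'd': 'z', 'e': 'r'}, air))  # True
```

The only rejected unifiers are those that split one referent across two USR variables, which is exactly why so many unifications remain possible.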
length and grammatical complexity of the sentence. In contrast, the model we propose does not require the combination of the tensored vectors, e.g. $\overrightarrow{dogs} \otimes \overrightarrow{chase} \otimes \overrightarrow{cats}$, to be represented explicitly. Note that we have taken the vector of the transitive verb, e.g. $\overrightarrow{chase}$, to be an entangled vector in the tensor space $N \otimes S \otimes N$. If this were a separable vector, the meaning of the verb would be as follows:

$$\overrightarrow{chase} \;=\; \Big(\sum_i C_i\,\vec{n}_i\Big) \otimes \Big(\sum_j C'_j\,\vec{s}_j\Big) \otimes \Big(\sum_k C''_k\,\vec{n}_k\Big)$$

The meaning of
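The separability point can be illustrated numerically: a separable (rank-1) tensor is the outer product of three vectors, whereas an entangled verb tensor cannot be factored this way. A minimal sketch using NumPy, with dimensions and values that are purely illustrative (not taken from the text):

```python
import numpy as np

# Illustrative dimensions for the noun space N and the sentence space S.
dim_n, dim_s = 2, 3

# One vector per tensor factor of N ⊗ S ⊗ N, standing in for the three sums.
n_subj = np.array([1.0, 2.0])        # plays the role of Σ_i C_i n_i
s_vec  = np.array([0.5, 1.0, 1.5])   # plays the role of Σ_j C'_j s_j
n_obj  = np.array([3.0, 1.0])        # plays the role of Σ_k C''_k n_k

# A separable verb tensor: the outer product of the three factors.
separable = np.einsum('i,j,k->ijk', n_subj, s_vec, n_obj)

# An entangled tensor: a sum of two distinct outer products, which in
# general admits no rank-1 factorization.
entangled = separable + np.einsum('i,j,k->ijk',
                                  np.array([0.0, 1.0]),
                                  np.array([1.0, 0.0, 0.0]),
                                  np.array([1.0, 0.0]))

# Unfolding along the first mode exposes the difference: a separable
# tensor's unfolding has matrix rank 1, the entangled one's does not.
rank_sep = np.linalg.matrix_rank(separable.reshape(dim_n, -1))
rank_ent = np.linalg.matrix_rank(entangled.reshape(dim_n, -1))
print(rank_sep, rank_ent)  # 1 2
```

The rank test is one standard way to detect separability; an entangled verb tensor is precisely one whose unfoldings have rank greater than one.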
parameter ensures that an interpretation will always be returned by the procedure even if reasoning could not be completed in a reasonable time.

Abductive Reasoning with a Large Knowledge Base for Discourse Processing

Algorithm 1 Mini-TACITUS reasoning algorithm: interaction of the time and depth parameters
Require: a logical form LF of a text fragment, a knowledge base KB, a depth parameter D, a cost parameter C, a time parameter T
Ensure: the best interpretation I_best of LF
1: I_init :=
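The interaction of the time and depth parameters can be sketched as an anytime loop: the procedure keeps the best interpretation found so far and returns it as soon as either the depth bound or the time budget is exhausted. The following is a schematic reconstruction only, not the actual Mini-TACITUS code; the `expand` and `cost` callbacks and the representation of interpretations are hypothetical stand-ins for abductive rule application and interpretation scoring:

```python
import time

def best_interpretation(lf, kb, depth, cost_limit, time_limit, expand, cost):
    """Anytime search sketch: return the cheapest interpretation of the
    logical form `lf` reachable within `depth` expansion steps and
    `time_limit` seconds. `expand(interp, kb)` and `cost(interp)` are
    hypothetical callbacks, not part of the published algorithm."""
    start = time.monotonic()
    best = lf                       # the initial interpretation is LF itself
    frontier = [(lf, 0)]
    while frontier:
        if time.monotonic() - start > time_limit:
            break                   # time budget exhausted: return best so far
        interp, d = frontier.pop()
        if cost(interp) < cost(best):
            best = interp
        if d < depth:               # depth parameter bounds rule application
            for nxt in expand(interp, kb):
                if cost(nxt) <= cost_limit:   # cost parameter prunes candidates
                    frontier.append((nxt, d + 1))
    return best
```

Because `best` is initialized to the input logical form, the function always returns some interpretation, mirroring the guarantee stated in the text.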
text-to-semantics mappings. For this purpose, NLF has a number of desirable properties:

Fig. 1 Example of an NLF semantic expression (H. Alshawi et al.):

[acquired /stealthily :[in, ^, 2002],
 Chirpy+Systems,
 companies.two :profitable :[producing, ^, pet+accessories]]

1. Apart from a few built-in logical connectives, all the symbols appearing in NLF expressions are natural language words.
2. For an NLF semantic expression corresponding to a sentence, the word tokens of the sentence appear
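Property 1 is easy to check mechanically once the bracketed expression is mirrored as a nested data structure. The list encoding below is our own assumption for illustration (NLF itself does not prescribe it); only the atoms are taken from the Fig. 1 expression:

```python
# Nested-list encoding of the Fig. 1 NLF expression (encoding is our choice):
# [acquired /stealthily :[in, ^, 2002],
#  Chirpy+Systems,
#  companies.two :profitable :[producing, ^, pet+accessories]]
nlf = ["acquired", ["stealthily"], ["in", "^", "2002"],
       "Chirpy+Systems",
       "companies.two", ["profitable"], ["producing", "^", "pet+accessories"]]

BUILTINS = {"^"}  # built-in symbols; every other atom is a natural language word

def atoms(expr):
    """Yield every atomic symbol in a nested NLF-like expression."""
    if isinstance(expr, list):
        for sub in expr:
            yield from atoms(sub)
    else:
        yield expr

words = [a for a in atoms(nlf) if a not in BUILTINS]
print(words)
```

Filtering out the built-ins leaves only word-like tokens (`acquired`, `Chirpy+Systems`, `companies.two`, ...), which is exactly the content of property 1.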