I will describe a new approach to semantics which combines the benefits of formal and distributional approaches. Formal semantics offers an elegant account of composition and logical operators, but typically shows low recall due to inadequate models of lexical semantics. Conversely, distributional semantics has been successful in describing the meanings of content words, but it is unclear how to effectively represent composition and function words in a vector space. I will introduce a model which closely follows formal semantics, except that content words are represented with distributional cluster identifiers. I will show that it is capable of complex multi-sentence first-order inference, while also improving performance on a question-answering task. I will then describe a semi-supervised extension for building a richer lexical semantics.
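To make the core idea concrete, the following is a minimal illustrative sketch, not the talk's actual system: content-word predicates in a first-order logical form are swapped for distributional cluster identifiers, while the logical structure and function words stay symbolic. The `CLUSTERS` table and `to_logical_form` helper are hypothetical names invented for this example.

```python
# Toy word -> cluster mapping. A real system would derive clusters from
# distributional statistics (e.g. clustering over word vectors); here the
# IDs are hand-picked purely for illustration.
CLUSTERS = {
    "dog": "C17", "puppy": "C17",   # near-synonyms share a cluster
    "bark": "C42", "yap": "C42",
}

def to_logical_form(noun, verb):
    """Build a simple existential logical form, replacing content words
    with their cluster identifiers when one is available."""
    subj = CLUSTERS.get(noun, noun)
    pred = CLUSTERS.get(verb, verb)
    return f"exists x. {subj}(x) & {pred}(x)"

# "a dog barks" and "a puppy yaps" yield the same logical form, so an
# inference between them reduces to symbolic identity.
print(to_logical_form("dog", "bark"))
print(to_logical_form("puppy", "yap"))
```

The point of the sketch is that lexical similarity is handled before logical inference: once near-synonyms collapse to one cluster identifier, a standard theorem prover can operate on the resulting forms unchanged.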