The Treehouse presents:

Luke Zettlemoyer, University of Washington CSE

November 5, 2010 12:30-1:20 pm

Recent research has focused on the problem of learning to map natural
language sentences to semantic representations of their underlying
meaning. Most of this work has focused on analyzing sentences in
isolation. In this talk, I will present a new approach for learning
context-dependent mappings from sentences to logical form.

In this setup, the training examples are sequences of sentences
annotated with lambda-calculus meaning representations. I will present
an algorithm that maintains explicit, lambda-calculus representations
of salient discourse entities and uses a context-dependent analysis
pipeline to recover logical forms. The method uses a hidden-variable
variant of the perceptron algorithm to learn a linear model used to
select the best analysis. Experiments on context-dependent utterances
from the ATIS travel-planning corpus show that the approach recovers
fully correct logical forms with 83.7% accuracy.
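To make the learning step concrete, here is a minimal sketch (not the speaker's code) of a hidden-variable perceptron update. It assumes a toy feature function and a small enumerable candidate space; in the talk's setting, y would be a lambda-calculus logical form, h a hidden derivation or context-resolution choice, and f a feature vector over the analysis.

```python
from itertools import product

def hidden_variable_perceptron(examples, candidates_y, candidates_h, f, epochs=5):
    """Learn a sparse linear weight vector scoring (input, output, hidden) triples.

    examples: list of (x, gold_y) pairs; the hidden h is never annotated.
    f: feature function mapping (x, y, h) -> dict of feature counts.
    """
    w = {}  # sparse weight vector

    def score(x, y, h):
        return sum(w.get(k, 0.0) * v for k, v in f(x, y, h).items())

    for _ in range(epochs):
        for x, gold_y in examples:
            # Best (y, h) under the current model: the prediction.
            pred_y, pred_h = max(product(candidates_y, candidates_h),
                                 key=lambda yh: score(x, *yh))
            # Best hidden h consistent with the gold output: the target.
            gold_h = max(candidates_h, key=lambda h: score(x, gold_y, h))
            if pred_y != gold_y:
                # Standard perceptron update on the feature difference.
                for k, v in f(x, gold_y, gold_h).items():
                    w[k] = w.get(k, 0.0) + v
                for k, v in f(x, pred_y, pred_h).items():
                    w[k] = w.get(k, 0.0) - v
    return w

# Toy usage (hypothetical labels and features, for illustration only):
def f(x, y, h):
    return {("word", x, y): 1.0, ("hidden", h, y): 1.0}

examples = [("flight", "FLIGHT"), ("fare", "FARE")]
w = hidden_variable_perceptron(examples, ["FLIGHT", "FARE"], ["h0", "h1"], f)
```

The key difference from an ordinary perceptron is that the gold output's hidden structure is itself chosen by the current model before the update, which is what lets the algorithm learn from sentence sequences whose intermediate context-dependent analyses are unobserved.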

Topic revision: r1 - 2010-11-03 - 00:05:58 - ebender
 
