
We consider a new perspective on dialog state tracking (DST), the task of estimating a user's goal through the course of a dialog. By formulating DST as a semantic parsing task over hierarchical representations, we can incorporate semantic compositionality, cross-domain knowledge sharing, and co-reference. We present TreeDST, a dataset of 27,000 conversations annotated with tree-structured dialog states and system acts. We describe an encoder-decoder framework for DST with hierarchical representations, which leads to a 20 percent improvement over state-of-the-art DST approaches that operate on a flat meaning space of slot-value pairs.
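To make the contrast between flat and tree-structured states concrete, here is a minimal sketch, not the paper's implementation: the `Node` class, slot names, and example domains below are illustrative assumptions. A flat state is a single dict of slot-value pairs, while a tree can share a subtree (here, a city) across domains, which is what enables compositionality and cross-domain reference.

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    """One node in a tree-structured dialog state: a label plus child nodes."""
    label: str
    children: list["Node"] = field(default_factory=list)

    def render(self, depth: int = 0) -> str:
        """Pretty-print the subtree with indentation."""
        lines = ["  " * depth + self.label]
        for child in self.children:
            lines.append(child.render(depth + 1))
        return "\n".join(lines)

# Flat DST: one slot-value map; no structure is shared across domains.
flat_state = {"hotel-name": "Hilton", "hotel-city": "Boston", "flight-dest": "Boston"}

# Tree DST (hypothetical schema): the destination city is a nested subtree,
# so "a hotel in the same city" can point at shared structure.
tree_state = Node("user.book", [
    Node("hotel", [
        Node("name", [Node("Hilton")]),
        Node("location", [Node("city", [Node("Boston")])]),
    ]),
    Node("flight", [
        Node("destination", [Node("city", [Node("Boston")])]),
    ]),
])

print(tree_state.render())
```

An encoder-decoder DST model over this representation would decode such trees node by node rather than predicting independent slot-value pairs.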

Related readings and updates.

Apple at EMNLP 2020

Apple is sponsoring the Empirical Methods in Natural Language Processing (EMNLP) conference, which will be held virtually from November 16 to 20. EMNLP is a leading conference focused on natural language processing.


Can Global Semantic Context Improve Neural Language Models?

Entering text on your iPhone, discovering news articles you might enjoy, finding answers to questions you may have, and many other language-related tasks depend on robust natural language processing (NLP) models. Word embeddings are a category of NLP models that mathematically map words to numerical vectors. This capability makes it fairly straightforward to find numerically similar vectors or vector clusters, then reverse the mapping to get relevant linguistic information. Such models are at the heart of familiar apps like News, search, Siri, keyboards, and Maps. In this article, we explore whether we can improve word predictions for the QuickType keyboard using global semantic context.
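As a small illustration of "finding numerically similar vectors," here is a sketch using toy, hand-made embeddings; the vocabulary and vector values are invented for this example, whereas a real model learns them from large text corpora.

```python
import math

# Toy 3-dimensional word embeddings (illustrative values only).
embeddings = {
    "king":  [0.9, 0.8, 0.1],
    "queen": [0.9, 0.7, 0.2],
    "apple": [0.1, 0.2, 0.9],
    "pear":  [0.2, 0.1, 0.8],
}

def cosine(u, v):
    """Cosine similarity: a standard measure of closeness between embeddings."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

def nearest(word, k=2):
    """Return the k words whose vectors are most similar to the query word,
    i.e. 'reverse the mapping' from vectors back to words."""
    query = embeddings[word]
    scores = {w: cosine(query, v) for w, v in embeddings.items() if w != word}
    return sorted(scores, key=scores.get, reverse=True)[:k]

print(nearest("king"))  # with these toy vectors: ['queen', 'pear']
```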
