WHAT THE RESEARCH IS:

A novel approach to using word embeddings (where words or phrases are mapped to vectors of numbers that represent their meaning) for natural language processing (NLP) that dynamically selects the right types of embeddings for the task at hand. These dynamic meta-embeddings outperform comparable models that use a single type of word embedding. Facebook’s AI researchers have open-sourced the code.
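For readers new to the idea, a word embedding is just a lookup from a token to a fixed-length vector. Here is a minimal sketch in PyTorch, using a made-up toy vocabulary; real systems load large pretrained tables such as GloVe or fastText:

```python
import torch
import torch.nn as nn

# Toy three-word vocabulary; production systems load pretrained tables
# with tens of thousands of words and hundreds of dimensions per word.
vocab = {"the": 0, "cat": 1, "sat": 2}
embed = nn.Embedding(num_embeddings=len(vocab), embedding_dim=4)

# Look up the vector for "cat": a tensor of shape (1, 4).
vector = embed(torch.tensor([vocab["cat"]]))
print(vector)
```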

HOW IT WORKS:

Current NLP systems often rely on a single set of pretrained embeddings that engineers choose beforehand. This new approach improves efficiency and overall performance by training a neural network on multiple sets of embeddings at once, letting the system learn how useful each embedding is for the language of a given task. Results from a range of experiments show that these learned combinations of word embeddings outperform traditional single-embedding approaches, while also delivering specific benefits, such as making it easier to inspect which embeddings an NLP system prioritized during a given operation.
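As a rough illustration of the idea, the sketch below projects several pretrained embeddings into a shared space and learns per-token attention weights over them. This is not Facebook's released implementation; the class, parameter names, and dimensions here are invented for illustration:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DynamicMetaEmbedding(nn.Module):
    """Sketch of a dynamic meta-embedding layer (names are illustrative)."""

    def __init__(self, embedding_dims, proj_dim=256):
        super().__init__()
        # One linear projection per pretrained embedding type, mapping
        # each into a shared space of size proj_dim.
        self.projections = nn.ModuleList(
            nn.Linear(d, proj_dim) for d in embedding_dims
        )
        # Scores each projected embedding so the network can learn,
        # per token, how much weight to give each embedding type.
        self.scorer = nn.Linear(proj_dim, 1)

    def forward(self, embedded_inputs):
        # embedded_inputs: list of tensors, one per embedding type,
        # each of shape (batch, seq_len, embedding_dims[i]).
        projected = torch.stack(
            [proj(e) for proj, e in zip(self.projections, embedded_inputs)],
            dim=2,
        )  # (batch, seq_len, num_types, proj_dim)
        # Softmax over embedding types yields per-token attention weights.
        weights = F.softmax(self.scorer(projected), dim=2)
        # Weighted sum collapses the embedding types into one vector.
        return (weights * projected).sum(dim=2)  # (batch, seq_len, proj_dim)

# Example usage with two embedding types (dimensions are illustrative):
# dme = DynamicMetaEmbedding(embedding_dims=[300, 200])
# out = dme([glove_vectors, fasttext_vectors])  # (batch, seq_len, 256)
```

Because the attention weights are computed explicitly, they can be logged per token to see which embedding type the network favored, which is the inspectability benefit described above.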

WHY IT MATTERS:

This context-based approach leads to better results across a range of benchmarked NLP tasks. By providing new insights into the use and effectiveness of word embeddings, dynamic meta-embeddings also improve our collective understanding of how neural networks make sense of language.

READ THE FULL PAPER:

Dynamic Meta-Embeddings for Improved Sentence Representations
