Salesforce open sources research to advance state of the art in AI for common sense reasoning

Deep learning excels at many applications, but common sense reasoning is not one of them. New research from Salesforce promises to change that, improving on previous results by a considerable margin.

Ten percent is a considerable margin by which to improve on the state of the art in anything. That is what Salesforce Research has just achieved in common sense reasoning for deep learning language models.

In its paper, Explain Yourself! Leveraging Language Models for Commonsense Reasoning, to be presented tomorrow at the Association for Computational Linguistics (ACL) 2019 annual meeting, Salesforce researchers unveil two important contributions: CoS-E, a dataset of Common Sense Explanations; and CAGE, a model for Commonsense Auto-Generated Explanations. ZDNet took the opportunity for a Q&A with two of the Salesforce Research scientists who worked on this, Nazneen Rajani and Bryan McCann.

As a reminder, Salesforce Research is focused on question answering as a way to facilitate access to data via Einstein. We have previously seen how other Salesforce researchers investigated the use of knowledge graphs toward the same end.

Rajani and McCann’s work takes a different approach, but it also builds on a number of previous contributions. Common sense reasoning remains an open problem for some of the world’s leading researchers. For example, one of the key ingredients in building CAGE was OpenAI GPT. Dubbing this language model, recently open sourced by Elon Musk’s OpenAI, “too dangerous” to be released into the wild may have been overly precautionary.

Nevertheless, it represents the state of the art in language models. As Rajani and McCann point out, these natural language processing networks are limited to text alone, a poor substitute for living in the real world. So researchers train the models by having them read a mind-boggling amount of text: all of Wikipedia, thousands of books and, in some approaches, the results of querying Google, too.

Read the full article on ZDNet

