EmbryonicAI

NLU: A search for a generalizable model that consistently performs well across multiple tasks

Below I attempt to paraphrase the following paper in plain English: Better Fine-Tuning by Reducing Representational Collapse, by Armen Aghajanyan, Akshat Shrivastava, Anchit Gupta, Naman Goyal, Luke Zettlemoyer, and Sonal Gupta of Facebook, submitted to arXiv on August 6th, 2020. tl;dr: Today NLU is about leveraging pre-trained models on large datasets like RoBERTa…
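For context, here is a minimal sketch of the plain fine-tuning setup that "leveraging pre-trained models like RoBERTa" refers to: load a pre-trained checkpoint and continue training it on a downstream classification task. This is only an illustrative baseline, not the regularized fine-tuning method the paper itself proposes; the checkpoint name, toy sentence, label, and hyperparameters are all assumptions for the sake of the example.

```python
# Minimal sketch, assuming Hugging Face transformers and PyTorch are installed.
# Illustrative baseline fine-tuning only, not the paper's proposed method.
import torch
from transformers import RobertaTokenizer, RobertaForSequenceClassification

tokenizer = RobertaTokenizer.from_pretrained("roberta-base")
model = RobertaForSequenceClassification.from_pretrained("roberta-base", num_labels=2)

# One toy labeled sentence; a real task would loop over a full dataset (e.g. a GLUE task).
batch = tokenizer(["the movie was great"], return_tensors="pt", padding=True)
labels = torch.tensor([1])

optimizer = torch.optim.AdamW(model.parameters(), lr=1e-5)
model.train()
outputs = model(**batch, labels=labels)  # classification head returns a cross-entropy loss
outputs.loss.backward()
optimizer.step()
```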

Computers Learning to Reason

The paper End-to-end Differentiable Proving describes a breakthrough way for computers to reason, i.e., to learn first-order logic rules from a knowledge base: learning to answer questions such as "What country is next to Germany?" or "Who was George Washington's nephew?" without being explicitly programmed to do so. The technique is called Neural Theorem…
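To make the knowledge-base setting concrete, here is a tiny hand-written symbolic example of facts, a rule, and a query. It is only an analogue for illustration, with made-up facts and relation names; the Neural Theorem Prover in the paper does not work this way, since it replaces exact symbol matching with learned vector representations so that similar symbols can unify softly.

```python
# Toy symbolic knowledge base: purely illustrative facts and relation names,
# NOT the differentiable prover described in the paper.
facts = {
    ("borders", "france", "germany"),
    ("borders", "poland", "germany"),
}

def borders(x, y):
    # Rule: bordering is symmetric, so check the fact in both directions.
    return ("borders", x, y) in facts or ("borders", y, x) in facts

def countries_next_to(country):
    # Query: collect every entity that appears in a "borders" fact with `country`.
    out = {a for (rel, a, b) in facts if rel == "borders" and b == country}
    out |= {b for (rel, a, b) in facts if rel == "borders" and a == country}
    return sorted(out)

print(countries_next_to("germany"))  # ['france', 'poland']
print(borders("germany", "france"))  # True, via the symmetry rule
```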

Knowledge a necessary but not sufficient ingredient for artificial intelligence

Intelligence comes in many flavors, from the kinesthetic intelligence of the sports all-star, to the logical processing of the math genius, to the interpersonal skills of the charismatic salesperson. No matter the domain, intelligence is often measured by the information people know or by what they can do. Similarly, in order for artificial intelligence to become a…