Learning to Construct and Reason with a Large Knowledge Base of Extracted Information
Channel: Microsoft Research
Subscribers: 344,000
Video link: https://www.youtube.com/watch?v=KFwLcbttFR8
Carnegie Mellon University's "Never Ending Language Learner" (NELL) has been running for over three years, and has automatically extracted from the web millions of facts concerning hundreds of thousands of entities and thousands of concepts. NELL works by coupling together many interrelated large-scale semi-supervised learning problems. In this talk I will discuss some of the technical problems we encountered in building NELL, and some of the issues involved in reasoning with this sort of large, diverse, and imperfect knowledge base. This is joint work with Tom Mitchell, Ni Lao, William Wang, and many other colleagues.
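The coupling idea described above — many category learners bootstrapping from seed facts while constraining one another — can be illustrated with a toy sketch. This is not NELL's actual code; the categories, corpus, and the simple mutual-exclusion rule are all invented for illustration. Two pattern-based learners propose new instances from text patterns, and a noun phrase proposed for two mutually exclusive categories is trusted by neither:

```python
# Illustrative sketch (not NELL's implementation): coupled semi-supervised
# bootstrapping. Category learners grow from seed facts; a mutual-exclusion
# constraint between categories filters contradictory proposals.

SEED_FACTS = {
    "city": {"paris", "tokyo"},
    "company": {"google", "toyota"},
}

# (noun phrase, pattern) co-occurrences standing in for web extractions
CORPUS = [
    ("paris", "mayor of X"), ("tokyo", "mayor of X"),
    ("berlin", "mayor of X"),
    ("google", "X hired"), ("toyota", "X hired"),
    ("siemens", "X hired"),
    ("amazon", "mayor of X"), ("amazon", "X hired"),  # ambiguous phrase
]

def bootstrap(rounds=2):
    known = {cat: set(insts) for cat, insts in SEED_FACTS.items()}
    pattern_category = {}  # learned mapping: pattern -> category
    for _ in range(rounds):
        # 1. Learn patterns from currently trusted instances.
        for phrase, pattern in CORPUS:
            for cat, insts in known.items():
                if phrase in insts:
                    pattern_category.setdefault(pattern, cat)
        # 2. Propose new instances via the learned patterns.
        proposals = {cat: set() for cat in known}
        for phrase, pattern in CORPUS:
            cat = pattern_category.get(pattern)
            if cat and phrase not in known[cat]:
                proposals[cat].add(phrase)
        # 3. Coupling: drop any phrase proposed for two exclusive categories.
        for cat, phrases in proposals.items():
            others = set().union(
                *(p for c, p in proposals.items() if c != cat))
            known[cat] |= phrases - others
    return known
```

Here "berlin" and "siemens" are accepted because only one learner claims each, while "amazon" (matched by both a city pattern and a company pattern) is rejected by the coupling constraint — the mechanism that lets many imperfect learners correct one another at scale.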