October 08, 2013
Natural language understanding plays a huge role in making knowledge graphs work. Expect Labs’ Director of Research discusses the paradigm shift that is happening in this area, as a result of these new knowledge graphs.
Follow up with this installment, which explains how knowledge graphs are constructed.
We are now in a position where most of human knowledge is already encoded in digital form. These knowledge graphs grow in a mostly automated fashion: they crawl many sources of data and carefully incorporate new information, though some degree of human curation remains. So very soon all of human knowledge, at least as far as objects and their relations are concerned, will be encoded in a knowledge graph. Any person you can think of, any institution, any work of art, any location will be part of this knowledge graph.

Which means, and this is really the key point of the paradigm shift in natural language understanding, that to a computer system with a knowledge graph, words and phrases are no longer ethereal, disembodied strings of characters, but are anchored in the real world. You can tie each noun phrase to a specific entity. The string “Tim,” uttered in the context of this conversation at Expect Labs, uniquely refers to our CEO Tim Tuttle, that is, to the specific node in the knowledge graph that represents Tim and presumably contains a ton of information about him, taken from social media profiles (Facebook, LinkedIn, TechCrunch) and many other sources.
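To make the idea concrete, here is a minimal sketch of what anchoring a noun phrase to a graph node might look like. The node ids, attributes, alias table, and the context-based disambiguation rule are all illustrative assumptions, not a description of any real system.

```python
# Toy knowledge graph: node ids -> attributes.
# All names and fields here are hypothetical examples.
knowledge_graph = {
    "tim_tuttle": {
        "name": "Tim Tuttle",
        "type": "Person",
        "role": "CEO, Expect Labs",
        "sources": ["Facebook", "LinkedIn", "TechCrunch"],
    },
    "expect_labs": {
        "name": "Expect Labs",
        "type": "Organization",
    },
}

# Alias table: surface strings -> candidate node ids.
aliases = {
    "tim": ["tim_tuttle"],
    "tim tuttle": ["tim_tuttle"],
    "expect labs": ["expect_labs"],
}

def resolve(mention, context_entities=()):
    """Map a noun phrase to a graph node, preferring candidates
    whose attributes mention an entity already in the conversation."""
    candidates = aliases.get(mention.lower(), [])
    if not candidates:
        return None
    for node_id in candidates:
        attrs = " ".join(str(v) for v in knowledge_graph[node_id].values())
        if any(ctx in attrs for ctx in context_entities):
            return node_id
    # No contextual signal: fall back to the first candidate.
    return candidates[0]

# "Tim", uttered in a conversation about Expect Labs,
# resolves to the node for the CEO.
print(resolve("Tim", context_entities=["Expect Labs"]))  # tim_tuttle
```

A production system would of course rank thousands of candidates with learned features rather than a single string-match rule, but the shape of the problem, string in, graph node out, is the same.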