Almost as soon as people imagined the robot, they wanted machines that could comprehend human language and text. For decades it was a fantasy confined to the pages of science fiction novels and short stories, or to the screen. Today that notion is a reality, and it is called Natural Language Processing (NLP).

It is not a simple goal to reach. First, people rarely speak in the concise, unambiguous way a system needs in order to understand them. Second, many words sound exactly the same but have distinct meanings, such as way and weigh, or wait and weight. You can learn more about language processing at https://www.mobilize.ai/.


Processing written or spoken language relies heavily on Big Data: vast quantities of structured, semi-structured, and unstructured information that can be mined for insight. Computers can sift through this information, analyze it, and discover patterns or trends.
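As a toy illustration (not from the article), even the simplest kind of pattern discovery over unstructured text can be sketched as counting which terms recur across raw documents; the function name and sample text below are hypothetical:

```python
import re
from collections import Counter

def top_terms(text, n=3):
    """Tokenize free-form text and return the n most frequent words.

    A deliberately minimal stand-in for 'mining patterns' from
    unstructured data: lowercase, split on non-letters, count.
    """
    tokens = re.findall(r"[a-z']+", text.lower())
    return Counter(tokens).most_common(n)

# Hypothetical snippet of unstructured customer feedback.
reviews = "Great phone. The battery lasts long. Battery life is great."
print(top_terms(reviews))
```

Real systems work at vastly larger scale with far richer features, but the principle is the same: frequency and co-occurrence patterns surface from raw text without anyone labeling them in advance.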

Originally, NLP relied on hand-written rules: algorithms were told which words and phrases to look for in text and were given specific responses to produce once those phrases appeared. The field has since moved to deep learning, a more flexible and intuitive approach in which algorithms train a system to recognize a speaker's intent from a collection of examples.
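The early, rule-based approach described above can be sketched in a few lines. The keywords and canned replies here are invented for illustration; the point is the brittleness: any phrasing the rules don't anticipate falls through.

```python
def rule_based_intent(utterance):
    """Hypothetical keyword rules in the style of early NLP systems:
    match hand-picked phrases, return a pre-written response."""
    rules = {
        "weather": "Here is today's forecast.",
        "music": "Playing your playlist.",
    }
    text = utterance.lower()
    for keyword, response in rules.items():
        if keyword in text:
            return response
    # Anything the rule author didn't anticipate is a dead end.
    return "Sorry, I didn't understand that."

print(rule_based_intent("What's the weather like?"))
print(rule_based_intent("Will I need an umbrella?"))
```

A deep-learning system, by contrast, would be trained on many labeled utterances and could infer that "Will I need an umbrella?" is also a weather question, even though no rule mentions umbrellas.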

Early in the development of NLP, algorithms were poor at translating. With advances in deep learning and AI, they can now translate successfully. If you own an Amazon Echo or a Google Home, you are already interacting with artificial intelligence and NLP.