Hey Siri!! What is Natural Language Processing?

Back in 2011, with the original release of Siri on the iPhone 4S, the virtual assistant was praised for its voice recognition and processing, and gradual development since then has produced a lot of breakthroughs in this field. Today, alongside Siri, you have Cortana, Alexa, and Google Assistant, plus Google Translate rendering languages into ones you can understand, all with excellent speech recognition engines built behind them. While we comfortably enjoy using all of these, and criticize them when we find issues yet to be improved, what lies behind these huge technologies? The answer lies in natural language processing and machine learning algorithms.


Caption: Virtual Assistants - Siri, Alexa, Google Assistant

For an extremely simple, improvised way to distinguish a natural language processing application from other data processing systems, consider a word counter program used to count the total number of bytes, lines, and words in a text file. When used to count bytes and lines it is an ordinary data processing application, but when used to count the words in the file it becomes a language processing system, because it requires knowledge of what it means to be a word. A minimal sketch of this idea follows below.
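Here is a rough sketch of that distinction in Python (the file name "sample.txt" is just a hypothetical input). Counting bytes and lines needs no notion of language at all, while even the crudest word count already assumes a definition of what a word is:

```python
# A minimal sketch: bytes and lines are pure data processing,
# but counting "words" needs a (crude) definition of a word.
def count_stats(path):
    with open(path, "rb") as f:
        data = f.read()
    byte_count = len(data)                     # no language knowledge needed
    text = data.decode("utf-8", errors="replace")
    line_count = text.count("\n")              # still just data processing
    # Splitting on whitespace is the simplest possible "word" definition;
    # real NLP tokenizers also handle punctuation, contractions, hyphens, etc.
    word_count = len(text.split())
    return byte_count, line_count, word_count

if __name__ == "__main__":
    print(count_stats("sample.txt"))  # hypothetical input file
```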


Caption: Natural Language Processing
Source: Forbes

Natural language processing is a discipline under artificial intelligence that covers models and algorithms drawn from the standard toolkits of computer science, mathematics, and linguistics to get computers to perform tasks that enable human-machine communication in natural language, and further to extract information embedded in text on web pages or in audio signals, or to infer and synthesize information from these sources. Speech recognition, conversational agents, web-based question answering, and language translation are the major applications of NLP.

A sophisticated NLP system incorporates knowledge of phonetics and phonology, morphology, syntax, semantics, pragmatics, and discourse. Simply put, NLP draws on models such as state machines, formal rule systems, logic models, and vector-space models to process language and resolve ambiguity. Ambiguity refers to the phenomenon in which one sentence can carry more than one meaning. For example: “Look at the dog with one eye”. This sentence can mean either "look at the dog that has only one eye" or "look at the dog with one of your eyes closed". How do computers understand this? It is solved through NLP’s various models and algorithms, such as part-of-speech tagging, word-sense disambiguation, and probabilistic parsing; a small tagging sketch follows below.
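As a rough illustration, here is how part-of-speech tagging on that ambiguous sentence might look with the NLTK toolkit (assuming NLTK is installed and its tokenizer and tagger data have been downloaded; the tag output shown is only indicative):

```python
import nltk

# One-time downloads of the tokenizer and tagger models:
# nltk.download("punkt")
# nltk.download("averaged_perceptron_tagger")

sentence = "Look at the dog with one eye"
tokens = nltk.word_tokenize(sentence)   # split the sentence into words
tags = nltk.pos_tag(tokens)             # assign a part-of-speech tag to each word
print(tags)
# e.g. [('Look', 'VB'), ('at', 'IN'), ('the', 'DT'), ('dog', 'NN'),
#       ('with', 'IN'), ('one', 'CD'), ('eye', 'NN')]
```

Note that the tags alone do not decide whether "with one eye" attaches to "dog" or to "Look"; resolving that attachment is the job of parsing and disambiguation models.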

Using these models generally involves a search through a space of states representing hypotheses about an input: searching through a space of phone sequences for the correct word in speech recognition, through a space of trees for the syntactic parse of an input sentence in parsing, and through a space of translation hypotheses for the correct translation of a sentence into another language in machine translation. Non-probabilistic tasks are carried out using well-known graph algorithms, while probabilistic tasks use dynamic programming and various heuristic variants, such as the Viterbi algorithm sketched below. In addition, machine learning tools like classifiers and sequence models are also employed in language processing modeling. NLP itself is a large domain with a great deal of ongoing and yet-to-be-done research in the field of artificial intelligence. An introduction to NLP, just to get the gist, takes you across the whole domain and swirls you around a lot of terms. Simply put, it is a field for enhancing human-computer communication in natural language through various models and algorithms.
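To make the idea of "searching a space of hypotheses" concrete, here is a toy Viterbi decoder, the classic dynamic-programming search over hidden-state sequences used in tagging and speech recognition. All the states and probabilities below are made up purely for illustration:

```python
import math

def viterbi(observations, states, start_p, trans_p, emit_p):
    """Return (log-probability, path) of the best hidden-state sequence.

    Works in log space to avoid numerical underflow; at each step it keeps
    only the best-scoring path ending in each state, which is what makes
    the search over all state sequences tractable.
    """
    best = [{s: (math.log(start_p[s]) + math.log(emit_p[s][observations[0]]), [s])
             for s in states}]
    for obs in observations[1:]:
        current = {}
        for s in states:
            score, path = max(
                (best[-1][prev][0] + math.log(trans_p[prev][s]) + math.log(emit_p[s][obs]),
                 best[-1][prev][1] + [s])
                for prev in states
            )
            current[s] = (score, path)
        best.append(current)
    return max(best[-1].values())

# Toy part-of-speech example with invented probabilities (illustration only).
states = ["Noun", "Verb"]
start_p = {"Noun": 0.6, "Verb": 0.4}
trans_p = {"Noun": {"Noun": 0.3, "Verb": 0.7}, "Verb": {"Noun": 0.8, "Verb": 0.2}}
emit_p = {"Noun": {"dogs": 0.7, "bark": 0.3}, "Verb": {"dogs": 0.1, "bark": 0.9}}
print(viterbi(["dogs", "bark"], states, start_p, trans_p, emit_p))
# Prints the best log-probability and the tag sequence, e.g. ['Noun', 'Verb']
```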

