DEEP LEARNING FOR NLP (NATURAL LANGUAGE PROCESSING)
He is the scientific director of Onepoint, a consulting company. A theoretical physicist, he specialises in AI, Machine Learning and NLP. He is the co-author of "The Automatic Processing of Languages", published in February 2020.
NLP is a branch of computational linguistics.
NLP is used in automatic translation and spell checking, processing and analysing document corpora, simulating conversation (chatbots), virtual assistants, automatic text generation, text simplification (making text understandable to everyone), document classification, text summarisation, and matching text against a set of criteria.
We use NLP in health care (to analyse a vast number of publications) and in public opinion research (to analyse surveys).
The elementary tasks of NLP are: language detection, summarisation, part-of-speech tagging, question answering, syntactic analysis, translation, and terminology extraction.
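As a concrete illustration of one of these elementary tasks, language detection, here is a minimal sketch using a naive stopword-counting heuristic. The tiny word lists are assumptions made for the example, not a real lexicon; production systems use far richer statistical models.

```python
# Naive language detection: count how many known stopwords of each
# language appear in the text, and pick the language with the most hits.
# The stopword sets below are tiny illustrative assumptions.
STOPWORDS = {
    "en": {"the", "and", "is", "of", "to", "in"},
    "fr": {"le", "la", "et", "est", "de", "les"},
}

def detect_language(text: str) -> str:
    """Return the language whose stopwords appear most often in the text."""
    words = text.lower().split()
    scores = {lang: sum(w in sw for w in words) for lang, sw in STOPWORDS.items()}
    return max(scores, key=scores.get)

print(detect_language("the cat is on the mat"))     # -> en
print(detect_language("le chat est sur le tapis"))  # -> fr
```

The same count-and-compare pattern, scaled up to character n-gram statistics over large corpora, is essentially how practical language identifiers work.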
A remarkable fact is that a significant part of the syntax and semantics of a language can be learned through a purely statistical approach, without using any explicit rules or abstract symbolic representations.
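A toy demonstration of this statistical approach: a bigram model learns which words tend to follow which, inducing a fragment of syntax from raw counts alone, with no grammar rules. The miniature corpus below is an assumption made for the example.

```python
# A bigram model: count, for each word, which words follow it.
# No grammar rules are given; regularities emerge from counts alone.
from collections import defaultdict, Counter

corpus = "the cat sat on the mat . the dog sat on the rug .".split()

bigrams = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    bigrams[prev][nxt] += 1

# The model has "learned" that "sat" is followed by "on" --
# a small piece of syntactic knowledge induced purely statistically.
print(bigrams["sat"].most_common(1))  # [('on', 2)]
```

Modern neural language models generalise exactly this idea: predict the next (or a hidden) word from context, at a vastly larger scale.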
Semantic ambiguities can be resolved using the attention mechanism (for example, distinguishing the two senses of "bank" in "river bank" versus "bank account" from the surrounding words).
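The mechanism can be sketched in a few lines. Below is a minimal NumPy implementation of scaled dot-product attention, the form used in the Transformer: each token's representation is rebuilt as a context-weighted mixture of all tokens, which is how an ambiguous word absorbs disambiguating context. The random toy vectors are assumptions for the example.

```python
# Scaled dot-product attention: each query attends to all keys,
# and the output is a softmax-weighted sum of the values.
import numpy as np

def attention(Q, K, V):
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                  # query/key similarity
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over keys
    return weights @ V                               # mix context into each token

# Toy self-attention: 3 tokens with 4-dimensional representations.
rng = np.random.default_rng(0)
X = rng.normal(size=(3, 4))
out = attention(X, X, X)  # each row now blends information from all 3 tokens
print(out.shape)          # (3, 4)
```

In self-attention (Q, K and V all derived from the same sequence), the output row for a word like "bank" becomes a blend dominated by whichever neighbours ("river", "account") it resembles most.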
The criteria for choosing a generic task for transfer learning are:
1/ The task to be learned should require solving hard problems of NLP, such as mastery of syntax, semantics and logical deduction; if the model can solve it, this suggests it "understands" language.
2/ The task should be one for which training data is already available in quantity and free of cost.
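Masked-word prediction is a classic example of a task meeting criterion 2: training examples can be manufactured from any raw text at no labelling cost, simply by hiding words and asking the model to recover them. A minimal sketch of this data-generation step (the masking scheme here is a toy assumption, real pipelines mask a random subset):

```python
# Generate (masked_sentence, target_word) training pairs from raw text.
# Every sentence yields one example per word, with zero manual labelling.
def make_masked_examples(sentence: str, mask_token: str = "[MASK]"):
    words = sentence.split()
    for i, target in enumerate(words):
        masked = words[:i] + [mask_token] + words[i + 1:]
        yield " ".join(masked), target

for masked, target in make_masked_examples("the cat sat on the mat"):
    print(masked, "->", target)
# e.g. "[MASK] cat sat on the mat -> the"
```

Because any web page, book or article can be turned into supervision this way, the pre-training corpus is effectively unlimited, which is precisely what criterion 2 demands.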