Natural Language Processing, or NLP, is one of the domains in which the application of Machine Learning techniques has been most successful. Nowadays, almost every NLP task can be tackled with Machine Learning: from a simple LSTM classification network to the recent wave of transformers such as BERT, GPT, or T5.
Since this article is not an introduction to or explanation of NLP, I recommend the following articles as a starting point for understanding what NLP is about:
Working in scientific research requires staying aware of the state of the art. That is exactly why I wrote this post, in which I present some of the most active academic research teams working on NLP.
Team #1 : “The Stanford NLP Group”
The first team on the list is, of course, the Stanford NLP Group. Chaired by Professor Christopher Manning, it is one of the best-known and most influential teams in the NLP domain, thanks to its invaluable contributions. The team's members have many scientific publications; however, they are best known for their NLP toolkit, CoreNLP. This tool lets users easily perform many linguistic tasks such as tokenization, part-of-speech tagging, named-entity recognition, dependency parsing, coreference resolution, sentiment analysis, and more. It also supports 6 languages: Arabic, Chinese, English, French, German, and Spanish. CoreNLP is written in Java, a widely used programming language that is nonetheless not the preferred choice for Machine Learning projects. Recently, the team published a new tool written in Python: Stanza. This project looks promising, and it can also be interfaced with the CoreNLP API.
- Organisation : Group at Stanford University
- web page :