ADVANCED SEMANTICS, Blog
26 April 2016

The future of computational linguistics

Astronomer and astrophysicist Carl Sagan once said: “You have to know the past to understand the present.” To understand the future of computational linguistics, we’re going to follow Sagan’s sage advice and look back at its origins for hints at what lies ahead for this field of study.

According to the Association for Computational Linguistics, “computational linguistics is the scientific study of language from a computational perspective.” Work in computational linguistics (CL) is concerned with modeling natural language, and draws on a variety of other disciplines, among them cognitive computing and artificial intelligence. The goal of computational linguistics is to develop software able to understand natural language, the everyday language we use to communicate.

The interaction between human and machine has always been one of the main challenges for computer scientists; the recent exponential growth of unstructured information (documents, web pages, social media, etc.) makes predicting the future of computational linguistics even more urgent. Indeed, human‐machine communication can dramatically improve the ability of organizations to use all of the information available for strategic insight and decision making.

Computational linguistics: a timeline

Let’s look at how computational linguistics has evolved:

1940s: Computational linguistics was born; initial projects were focused on machine translation

1954: First public demonstration of a Russian‐English machine translation system (even though it failed)

1962: David Hays, a famous computer scientist and member of the Automatic Language Processing Advisory Committee (ALPAC), coined the term “computational linguistics”

1964: MIT Professor Joseph Weizenbaum created ELIZA, the first chatterbot to mimic human conversation

1980s: First practical commercialization of research on CL and a return to machine translation

1990s: Great development of large corpora and the use of machine learning

2000s: Explosion of natural language processing tools and interfaces
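The 1964 ELIZA entry above is worth a closer look, because it shows how early chatterbots worked: simple pattern matching and canned reflections rather than genuine understanding. Here is a minimal sketch of that idea in Python; the rules below are invented for illustration and are not Weizenbaum’s original script:

```python
import re

# A few illustrative ELIZA-style rules (hypothetical examples, not
# Weizenbaum's originals): each pattern captures part of the user's
# input and reflects it back in a templated reply.
RULES = [
    (re.compile(r"\bi need (.*)", re.IGNORECASE), "Why do you need {0}?"),
    (re.compile(r"\bi am (.*)", re.IGNORECASE), "How long have you been {0}?"),
    (re.compile(r"\bmy (.*)", re.IGNORECASE), "Tell me more about your {0}."),
]

def respond(user_input: str) -> str:
    """Return a canned reflection for the first matching rule,
    or a generic prompt when nothing matches."""
    for pattern, template in RULES:
        match = pattern.search(user_input)
        if match:
            return template.format(match.group(1).rstrip(".!?"))
    return "Please, go on."

print(respond("I am worried about my exams"))
# → "How long have you been worried about my exams?"
```

The original ELIZA was more elaborate (it ranked keywords and swapped pronouns such as “my” to “your”), but the core mechanism was the same: surface pattern matching with no model of meaning, which is exactly why later work in CL moved toward the deeper language analysis discussed below.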

Computational linguistics trends

These milestones give us an idea of the many theories, studies, tests and applications conducted in CL, as well as the opportunities ahead and the work that will be required to achieve solid results.

There’s no doubt that, even 70 years later, the future of computational linguistics could hold more than a few surprises. In a recent interview, Expert System CTO Marco Varone said that, when it comes to text comprehension, machines will only record incremental improvements over the next 10 years. Clearly, we have a long way to go.

The growing relevance of artificial intelligence gives us hope for future improvement in CL. However, natural language processing is complex: to obtain a deep analysis, we need to implement cognitive computing systems that can really communicate with humans. These systems can replicate human understanding and, thanks to this ability, are being used for a variety of business needs, including medical diagnostics, well production optimization, asset allocation and decision making for M&A, strategies for product innovation, and consumer behavior prediction.

Whatever the future of computational linguistics may hold, its success will depend on the right combination of language interpretation and programmatic logic to achieve a precise understanding of text.
