What you should know about the definition of cognitive computing
What is cognitive computing?
Although computers are better than humans at processing data and making calculations, until recently they were unable to accomplish some of the most basic human tasks. Thanks to cognitive computing, machines are now bringing human-like intelligence to a number of business applications, including big data. Here, we will provide a cognitive computing definition that covers what it is and how it is applied.
So, what is cognitive computing? According to Forbes, “cognitive computing comes from a mashup of cognitive science—the study of the human brain and how it functions—and computer science.” However, to provide a cognitive computing definition that encompasses all of the aspects of this mashup, we have to look a little deeper:
- Cognitive computing is not a single technology: It makes use of multiple technologies and algorithms that allow it to infer, predict, understand and make sense of information. These include Artificial Intelligence and Machine Learning algorithms that train the system to recognize images, understand speech and recognize patterns, and that, through repetition and training, produce ever more accurate results over time. Through Natural Language Processing systems based on semantic technology, cognitive systems can understand the meaning and context of language, allowing a deeper, more intuitive level of discovery and even interaction with information.
- Cognitive computing systems are not programmed, they learn: The technologies above help cognitive computing systems acquire knowledge and keep learning as that knowledge evolves. Rather than being programmed with a strict set of linear or logical responses or rules, a cognitive computing system arrives at an answer or makes an informed choice in a way that resembles human reasoning. New information and interactions with people and data refine its insight and accuracy over time.
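The "learn, don't program" idea above can be sketched in miniature. The example below is a toy word-count classifier (not any specific cognitive product's method): nothing is hard-coded about what a "complaint" looks like; the system infers it from labeled examples, and adding more examples refines its answers. All class and example names are hypothetical.

```python
from collections import Counter, defaultdict

class TinyTextLearner:
    """A minimal learn-from-examples classifier: no hand-written rules,
    just word statistics gathered from labeled training text."""

    def __init__(self):
        self.word_counts = defaultdict(Counter)  # label -> word frequencies
        self.label_totals = Counter()            # label -> total words seen

    def train(self, text, label):
        words = text.lower().split()
        self.word_counts[label].update(words)
        self.label_totals[label] += len(words)

    def predict(self, text):
        words = text.lower().split()
        scores = {}
        for label, counts in self.word_counts.items():
            total = self.label_totals[label]
            # Smoothed per-word likelihood, multiplied naively
            score = 1.0
            for w in words:
                score *= (counts[w] + 1) / (total + 1)
            scores[label] = score
        return max(scores, key=scores.get)

learner = TinyTextLearner()
learner.train("refund my order late delivery", "complaint")
learner.train("great service fast shipping thanks", "praise")
learner.train("package damaged want refund", "complaint")
print(learner.predict("my delivery was late"))  # -> complaint
```

A real cognitive system uses vastly richer models, but the contrast holds: the behavior comes from data, not from an if/then rulebook.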
Still confused? Let’s look at this cognitive computing definition that compares it to traditional technologies:
| Traditional systems | Cognitive computing systems |
| --- | --- |
| Calculate and crunch numbers | Understand meaning and relationships |
| Are programmed with if-this-then-that steps or logical processes | Learn and draw conclusions from interactions with people and information |
| Rely on structured queries and predefined rules to generate responses | Understand human, non-linear communication and draw from a variety of potentially relevant information and connections |
| Cannot handle unstructured data without significant programming | Derive meaning from any kind of text and incorporate unstructured data into analysis |
| Provide data points | Can answer Who?, What?, When?, Where? and Why? |
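The second and third rows of the table can be illustrated side by side. Below, a traditional lookup only answers when the query exactly matches a predefined key, while a crude "cognitive-style" retrieval scores free-text overlap against stored passages, so non-linear human phrasing can still reach relevant information. This is only an illustrative sketch with made-up data; real cognitive systems use semantic models, not simple word overlap.

```python
# Traditional: a structured query must match a predefined key exactly.
faq_table = {"return policy": "Returns accepted within 30 days of purchase."}

def traditional_lookup(query):
    return faq_table.get(query)  # fails on any phrasing variation

# Cognitive-style stand-in: rank free-text passages by word overlap,
# so varied human phrasing can still surface relevant information.
passages = [
    "Returns accepted within 30 days of purchase",
    "Shipping takes 3 to 5 business days",
]

def fuzzy_lookup(question):
    q = set(question.lower().split())
    return max(passages, key=lambda p: len(q & set(p.lower().split())))

print(traditional_lookup("how do I return an item"))   # -> None
print(fuzzy_lookup("how many days do returns take"))   # finds the returns passage
```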
A cognitive computing definition for today includes big data
Put simply, cognitive computing brings together the best of both worlds, combining the speed, scale and power of machines with a human-like approach to take advantage of information on a scale that would otherwise be impossible for people. The ability to understand language, recognize patterns and learn from information can help companies address more meaningful and complex challenges.
Cognitive computing is a natural fit for big data, which consists in large part of unstructured information: qualitative data made up of text rather than numbers, such as email, social media posts, customer surveys, news reports and more. Where traditional systems are built to process structured data, cognitive systems can handle the textual part of unstructured information, which carries important context and meaning that go beyond basic data points.
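One concrete way to see the gap between unstructured text and the structured data points traditional systems expect is to turn the former into the latter. The sketch below converts hypothetical customer emails into small structured records (order number plus a rough tone); the email text, word lists and field names are all illustrative assumptions, not a real pipeline.

```python
import re

# Hypothetical raw "unstructured" feedback, as it might arrive via email.
emails = [
    "Order #1042 arrived two weeks late and the box was crushed.",
    "Loved order #1043 - arrived early, great packaging!",
]

# Toy sentiment lexicons (illustrative only).
NEGATIVE = {"late", "crushed", "broken", "refund"}
POSITIVE = {"loved", "early", "great"}

def to_record(text):
    """Turn free text into a structured data point that a
    traditional system could then store and aggregate."""
    words = set(re.findall(r"[a-z]+", text.lower()))
    order = re.search(r"#(\d+)", text)
    tone = ("negative" if words & NEGATIVE
            else "positive" if words & POSITIVE
            else "neutral")
    return {"order": order.group(1) if order else None, "tone": tone}

for e in emails:
    print(to_record(e))
# -> {'order': '1042', 'tone': 'negative'}
# -> {'order': '1043', 'tone': 'positive'}
```

The context a human reads effortlessly ("two weeks late", "great packaging") is exactly what rigid schemas discard and what cognitive systems aim to preserve.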
Given the number and value of recent investments in cognitive computing-related technologies and companies, IDC's prediction that, "by 2018, over 50% of developer teams will embed cognitive services in their apps" (to the tune of $60+ billion in savings by 2020), is no surprise. Cognitive computing is here to stay.