Cognitive Computing – Hype or Progress?


First, there was artificial intelligence, then the terms machine learning and deep learning followed. And now there is cognitive computing. What’s so special about this new concept? Is it just a new buzzword from the IT scene that can be exploited for marketing purposes? Or is it a new approach that brings machines one step closer to human thinking?

Cognitive Computing – what is behind it?

Are machines able to think? In some ways, yes. But this isn’t actually new; it depends on how you define thinking. Thinking in the form of logical conclusions has been simulated, more or less electronically, for almost 80 years. There are reasons why Konrad Zuse called his computer, completed in 1941, a “mechanical brain”.

The human brain, however, is capable of more than just drawing logical conclusions on an abstract level, and this is where cognitive computing comes in. Cognitive computing claims to bring machines closer to the way the human brain works.

  • Computing stands for all goal-oriented activities related to computers. It is meant to help automate as many processes as possible efficiently.
  • Cognitive (from the Latin cognoscere = to think, understand, know) is an adjective that refers to perception, thought, memory and recognition, i.e. the processes that make up the human mind.

The idea is to train computers in these human abilities. This competence consists not only of thinking in the form of logical conclusions, but also of perceiving and independently recognizing connections. But isn’t this exactly the essence of artificial intelligence and machine learning in its purest form? Yes, but cognitive computing claims to go one step further.


Complex Data as a Challenge for Machines

Every day, billions of gigabytes of data are generated and put online. At the same time, increasingly complex decisions have to be made that take as many relevant aspects as possible into account. This requires systems that structure and filter vast amounts of data and make it usable for the respective purpose. Human capabilities alone are clearly not enough, and traditional computer programs also reach their limits here.

Cognitive computing addresses the problem that an estimated 80 percent of the world’s data is unstructured. The approach uses pattern recognition and machine learning to create structure (a small sketch of this idea follows the list below). Cognitive computing goes a step further than machine learning and artificial intelligence by optimizing the communication between different AI-based systems as well as improving how humans communicate with these systems.

  • Therefore, cognitive computing stands for a principle of networking AI systems and facilitating communication between humans and computers.
  • By combining human thinking and artificial intelligence, cognitive systems ought to achieve better results.
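
To make this a little more tangible, here is a minimal Python sketch of how unstructured text can be given structure through pattern recognition and machine learning. The example sentences and the choice of scikit-learn are illustrative assumptions, not part of any particular cognitive computing product:

```python
# Illustrative sketch: letting a machine find structure in unstructured text.
# The documents and the two-cluster setup are invented for demonstration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

documents = [
    "Patient reports chest pain and shortness of breath",
    "Quarterly revenue grew by eight percent",
    "MRI scan shows no abnormalities in the lower spine",
    "Marketing budget shifted toward social media campaigns",
]

# Turn raw text into numeric feature vectors (the input for pattern recognition).
vectors = TfidfVectorizer(stop_words="english").fit_transform(documents)

# Group the documents without giving the machine any labels or categories.
model = KMeans(n_clusters=2, n_init=10, random_state=0).fit(vectors)

for doc, label in zip(documents, model.labels_):
    print(label, doc)
```

Clustering is only one of many possible techniques, but it shows the core idea: the system finds the grouping on its own, without being told which categories exist.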

Application Areas of Cognitive Computing

In practice, cognitive computing may look like this:

  • Merging data from text, audio, video and graphics, and recognizing its context.
  • Converting information from big data into a new, user-friendly format to gain new insights.
  • Automating complex procedures in order to solve problems independently.

This would enable a doctor or surgeon, for example, to obtain relevant information immediately by giving simple instructions, making it easier to decide on a specific therapy. A prerequisite, however, is that this interactive communication takes place in real time and remains (due to the principle of machine learning) open-ended. This also presupposes that cognitive systems are adaptable and recognize that information is not always unambiguous and is subject to constant change.

Cognitive Computing – a new Buzzword?

Considering how much the terms artificial intelligence, machine learning and cognitive computing overlap, one naturally suspects that cognitive computing is nothing more than a new buzzword for marketing purposes: if artificial intelligence no longer sounds exciting enough, it is sold as cognitive computing. So is cognitive computing really something new?

The term is indeed controversial in the IT scene. Gartner analyst Tom Austin considers it “marketing nonsense” – for the simple reason that machines cannot think. But even the proponents of the new term don’t claim that machines can think. They do, however, place cognitive computing in a more complex context:

  • Artificial intelligence stands for the mechanical completion of tasks that normally require human intelligence.
  • Machine learning is a specific way of programming systems that generate artificial intelligence.
  • Cognitive computing stands for a complex architecture that uses various AI subsystems. Cognitive computing takes on the task of ensuring that these different systems interlock and thus become even more efficient.

Cognitive computing is, therefore, not only perception and motor control, but also behavior that approximates cognition as it occurs in the human brain. This approximation is achieved primarily by optimizing communication between humans and machines. Cognitive computing means recognizing, understanding, reasoning and drawing conclusions at a higher level.
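
What such an architecture of interlocking subsystems could look like is sketched below, under the assumption that each subsystem can be treated as one stage in a pipeline. The stage names and their outputs are invented for illustration; they are not an existing product or API:

```python
# Illustrative sketch of a cognitive architecture that chains several
# (hypothetical) AI subsystems so that each one's output feeds the next.
from typing import Any, Callable, List

class CognitivePipeline:
    """Runs registered subsystems in order, passing results along."""

    def __init__(self) -> None:
        self.stages: List[Callable[[Any], Any]] = []

    def register(self, stage: Callable[[Any], Any]) -> None:
        self.stages.append(stage)

    def run(self, data: Any) -> Any:
        for stage in self.stages:
            data = stage(data)
        return data

# Hypothetical subsystems, stubbed out for illustration only.
def speech_to_text(audio: Any) -> str:
    return "find side effects of drug x"

def extract_intent(text: str) -> dict:
    return {"intent": "lookup_side_effects", "entity": "drug x"}

def answer(query: dict) -> str:
    return f"Known side effects of {query['entity']}: ..."

pipeline = CognitivePipeline()
for subsystem in (speech_to_text, extract_intent, answer):
    pipeline.register(subsystem)

print(pipeline.run(b"<audio frames>"))
```

The individual stages here are only stubs; in a real cognitive system each would be its own AI component, and the pipeline’s job would be to make them interlock and to keep the exchange with the human user efficient.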

Summary

Only the future will show whether cognitive computing is more than just a buzzword from the realm of artificial intelligence. In any case, the claims behind this new term are ambitious:

  • Cognitive systems will become perfect assistants to existing human expertise.
  • They will also have the ability to become experts in certain fields themselves.

The differentiation between artificial intelligence, machine learning and deep learning is not always easy. But perhaps the term cognitive computing has the potential to summarize these aspects of modern programming: the targeted use of different systems, with the side effect of making communication between humans and machines easier and more efficient.


Jan Knupper