May 03, 2016
 

Cognitive computing is among the major trends in computing today and seems destined to change how business people think about the ways in which computers can be used in business environments. “Cognitive computing” is a vague term used in a myriad of ways. Given the confusion in the market as to the nature of cognitive computing, our recent Executive Report (Part I in a two-part series) describes what we mean by cognitive computing by exploring five different perspectives on the topic: (1) rules-based expert systems, (2) big data and data mining, (3) neural networks, (4) IBM’s Watson, and (5) Google’s AlphaGo. Here is a brief description of each.

  1. Rules-Based Expert Systems
    There have been other attempts to commercialize artificial intelligence (AI) technology. The most notable was the effort to create rules-based expert systems in the 1980s. Although there were some successes, most expert systems efforts were abandoned by the end of that decade because the systems were expensive to build and nearly impossible to maintain. The format of a rules-based expert system required that any improvements be hand-tailored by human experts, and it simply proved too expensive to keep such systems up to date. (A minimal sketch of how such a system worked appears after this list.)
  2. Big Data and Data Mining
    AI techniques were also used in more modest applications, such as data mining tools that could analyze large databases and generate suggestions. Given the large databases that organizations created in the 1990s and 2000s, data mining tools have proved very valuable. These tools depend on machine learning techniques and provide the basis for many of today’s cognitive computing advances.
  3. Neural Networks
    The most important technologies being used in cognitive computing applications are called neural networks. The basic technology has been around for decades, but breakthroughs in the 1990s and 2000s allowed developers to create much more powerful and robust applications. The key new technologies are deep neural networks and reinforcement learning. These techniques have enabled major progress in natural language (NL) systems, in visual systems, and in creating decision-making systems that can examine information in databases and “learn” from them. (A toy illustration of a neural network learning from examples also appears after this list.)
  4. IBM’s Watson Plays Jeopardy!
    In 2011, IBM’s Watson demonstrated its cognitive capabilities by playing against two Jeopardy! champions and winning the game. It listened to questions, searched its huge database of information, and generated winning answers in under three seconds. People watched the performance on TV and were quite impressed. IBM proceeded to set up a new organization to commercialize Watson and suggested that cognitive computing was the future of computing — and of IBM.
  5. Google’s AlphaGo
    In March 2016, Google demonstrated that its AlphaGo application could beat the world champion of Go (the most complex strategic game humans play). AlphaGo began by beating the European Go champion in October 2015. Those who observed that match were confident that the world champion, who was considered much better than the European champion, would have no trouble beating AlphaGo. Between the October and March matches, however, AlphaGo was able to play millions of games against itself, constantly learning more about Go and getting better. It easily beat the world champion when they played head-to-head.
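
To ground perspective 1 above, here is a minimal sketch of how a rules-based expert system works: knowledge is captured as hand-written if-then rules, and a simple inference engine chains over them until no new conclusions can be drawn. The rules, facts, and function names below are invented for illustration and are not drawn from any particular 1980s product.

```python
# Minimal forward-chaining rule engine (illustrative only).
# Each rule maps a set of required facts to a new fact it asserts.
RULES = [
    ({"fever", "stiff_neck"},       "suspect_meningitis"),
    ({"fever", "cough", "fatigue"}, "suspect_influenza"),
    ({"suspect_influenza"},         "recommend_rest_and_fluids"),
    ({"suspect_meningitis"},        "recommend_immediate_hospital"),
]

def forward_chain(facts):
    """Fire every rule whose conditions are met until nothing new is derived."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in RULES:
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

if __name__ == "__main__":
    # -> includes 'suspect_meningitis' and 'recommend_immediate_hospital'
    print(forward_chain({"fever", "stiff_neck"}))
```

The maintenance problem is visible even in this miniature: adding or correcting a single conclusion means a human expert editing the RULES table by hand, which is exactly what made large systems of this kind so expensive to keep current.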

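Perspective 3 can be grounded the same way. The sketch below trains a tiny feedforward neural network on the XOR function using backpropagation; the layer sizes, learning rate, and iteration count are arbitrary choices made for this example, and today’s deep networks differ mainly in stacking many more layers and training on vastly more data.

```python
import numpy as np

# A tiny feedforward network learning XOR by backpropagation (illustrative).
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # inputs
y = np.array([[0], [1], [1], [0]], dtype=float)              # XOR targets

W1 = rng.normal(0.0, 1.0, (2, 8))   # input -> hidden weights
b1 = np.zeros((1, 8))
W2 = rng.normal(0.0, 1.0, (8, 1))   # hidden -> output weights
b2 = np.zeros((1, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for step in range(10_000):
    # Forward pass
    hidden = sigmoid(X @ W1 + b1)
    output = sigmoid(hidden @ W2 + b2)

    # Backward pass: gradients of squared error through the sigmoids
    d_output = (output - y) * output * (1 - output)
    d_hidden = (d_output @ W2.T) * hidden * (1 - hidden)

    # Gradient-descent weight updates
    W2 -= lr * hidden.T @ d_output
    b2 -= lr * d_output.sum(axis=0, keepdims=True)
    W1 -= lr * X.T @ d_hidden
    b1 -= lr * d_hidden.sum(axis=0, keepdims=True)

# With these settings the outputs usually end up close to [0, 1, 1, 0].
print(np.round(sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2), 2))
```

The contrast with the rule-based sketch is the point: nothing here is hand-authored knowledge. The weights are adjusted automatically from examples, which is what allows such systems to keep improving as more data arrives.
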
Putting It All Together

The commercial attention focused on the possibilities of AI technologies in the 1980s was stimulated by the success of two expert systems: Dendral and MYCIN. The current round of commercial interest in AI is driven by the popular successes of Watson and AlphaGo. These victories aren’t of much value in themselves, but the capabilities demonstrated in the course of these wins are hugely impressive. In the case of Watson, it’s now clear that applications can be provided with NL interfaces that can query and respond to users in more or less open-ended conversations. At the same time, Watson is capable of examining huge databases and organizing the knowledge to answer complex, open-ended questions. In the case of AlphaGo, it’s equally clear that an application capable of expert performance can continue to learn, either by examining huge online databases of journals and news stories or by playing against itself, and thereby rapidly improve at performing a task faster, better, or cheaper.
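
To make the “working against itself” point concrete, here is a toy self-play learner for the game of Nim (players alternately take one to three stones; whoever takes the last stone wins). It is emphatically not AlphaGo’s architecture, which combines deep neural networks, Monte Carlo tree search, and reinforcement learning; it is a simple tabular learner whose names and parameters are all invented for this example, but it shows how repeated self-play alone can produce steadily better play.

```python
import random
from collections import defaultdict

ACTIONS = [1, 2, 3]                      # stones a player may take per turn
ALPHA, EPSILON, EPISODES, START_PILE = 0.1, 0.1, 50_000, 12

Q = defaultdict(float)                   # Q[(pile, move)] = value for the mover

def choose(pile, explore=True):
    """Pick a move: usually the best known one, occasionally a random one."""
    moves = [a for a in ACTIONS if a <= pile]
    if explore and random.random() < EPSILON:
        return random.choice(moves)
    return max(moves, key=lambda a: Q[(pile, a)])

def play_one_game():
    """One game of self-play; both 'players' share and update the same table."""
    history, pile = [], START_PILE
    while pile > 0:
        move = choose(pile)
        history.append((pile, move))
        pile -= move
    # The player who made the last move won. Walk backwards, crediting
    # the winner's moves with +1 and the loser's moves with -1.
    reward = 1.0
    for pile, move in reversed(history):
        Q[(pile, move)] += ALPHA * (reward - Q[(pile, move)])
        reward = -reward

for _ in range(EPISODES):
    play_one_game()

for pile in range(1, START_PILE + 1):
    print(pile, "stones -> take", choose(pile, explore=False))
```

With enough self-play games, the learned move choices typically converge toward Nim’s known optimal strategy (never leaving the opponent a multiple of four stones), and no additional human-authored rules are needed for the improvement to happen.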

There have been no recent major technological breakthroughs; all the basic technologies used today have been around for at least two decades. There have, however, been minor breakthroughs, and these, in turn, have led researchers to revisit older techniques and reevaluate their power. Consequently, deep neural networks, backpropagation (feed-backward) techniques, and reinforcement learning have been combined with techniques for searching massive databases, and with the steady growth of computing power, to generate a powerful new generation of AI applications. The new applications are designed around architectures that combine many different techniques — sometimes the same technique used in multiple ways — running on multiple machines, bringing different problem-solving approaches to bear and leading to exciting new solutions.

Cognitive computing does not describe a specific technology or even a well-defined approach to computing. The term is now used to describe a broad approach to application development that combines a wide variety of different techniques. Cognitive applications combine AI and non-AI techniques in complex architectures that include not only knowledge capture and knowledge analysis capabilities, but also NL and visual front ends and large-scale database search capabilities. A significant feature of cognitive applications is their proven ability to learn rapidly and to improve on their own (in at least some circumstances). In the future, cognitive applications will link to the Internet — constantly reading journals, newsfeeds, and conference proceedings and forever improving their problem-solving capabilities.


Paul Harmon

Paul Harmon is an internationally known business consultant and technology analyst and a Senior Consultant with Cutter’s Data Analytics & Digital Technologies practice. Combining his interest in intelligent software technologies with his interest in how businesses organize their processes to maximize their success, Mr. Harmon is currently researching and writing about how organizations are using the latest cognitive techniques to maximize the efficiency of their business activities.

Discussion

  4 Responses to “Five Perspectives on Cognitive Computing”

  1. The elephant in the room is ‘bugs.’ As we make systems increasingly autonomous, we must also ensure they Do No Harm. Our body of knowledge for making systems that will Do No Harm is woefully sparse. Worse, we build systems that contain software ‘bugs’ and therefore do not do what the designer intended in the first place. How about exploring Do No Harm?

  2. Jack, I think you have raised a significant issue, and I think it’s quite complex. Would you consider the fact that a Go-playing application didn’t win every game a sign that it had bugs?

    When you build applications that function like human experts, they will make informed decisions that will not always be perfect. Would you be satisfied with a diagnosis system that was right 80% of the time, if human experts were normally only right 75% of the time?

    I will look forward to exploring some of these issues in the future.

  3. I will admit I am no expert, but when you guys mention bugs, surely there can’t be that many, so why not patch over them? Fix the problem and then there will be no issues, no bugs to deal with. How long would it take you to fix a bug?

  4. In large, very complex programs, such as Microsoft’s operating systems, it can be nearly impossible to eliminate all bugs. Studies show that you introduce new bugs as you work to eliminate existing ones. And some cognitive programs are very complex indeed, so it is a problem.
