"Cognitive computing is all about assisting you to do a better job and helping you to be the best you can be."
Dr Michelle Dickinson

Most people are familiar with the term "artificial intelligence". It was coined in the 1950s to describe computers that can seemingly think for themselves. Since then, artificial intelligence has made great Hollywood fodder, with mankind typically coming off second-best after the rise of the machines.

The term "artificial intelligence" implies that machines can actually think for themselves, and people are too quick to attribute human traits to them. This is why many researchers these days prefer the term "cognitive computing", says Dr Michelle Dickinson, nanotechnology specialist and senior lecturer at the University of Auckland.

Who is giving whom directives? Computer systems might be less independent than we think. (Image: IBM Science)

"I don't like the term 'artificial intelligence' because it assumes that computers have standalone intelligence and they can truly think for themselves, but that's not true – they're just following a complicated set of rules," Dickinson says.

"That's why I prefer the term 'cognitive computing' because it makes it clearer that we're building computers that emulate some of our cognitive abilities, rather than actually bringing machines to life."

Learned ability

The goal of cognitive computing is not to build machines that can think for themselves, Dickinson says, but rather to build machines that can analyse vast amounts of information and learn along the way. We can take advantage of their strengths to help us solve difficult problems and make the best decisions based on more knowledge than any one person could ever hope to hold in their head.

A cognitive computing system can adapt and make sense of information – even input that is unstructured, such as images or natural speech. This makes it well suited to recognising and interpreting patterns, and giving them meaning.


"It knows that the typical human face has two eyes, a nose and a mouth and that these elements appear in certain places," Dickinson says. "So when a computer sees a pattern of shapes in that configuration it knows that it's probably a face.

"When it sees something new it's able to consider whether it's seen something similar before and determine whether this new thing fits into an existing group or whether it requires a new group in its knowledge base."

Practical applications

Artificial intelligence, signal processing and machine learning are already behind many of the technologies we use today, from search engines to smartphone-based personal assistants. Combining them into cognitive computing isn't just a futuristic dream, nor is it hidden away in research labs. It's already out in the real world making a real difference to people's lives.

IBM's Watson is one example, having come to fame after winning the US game show Jeopardy! thanks to its ability to answer natural-language questions. Today IBM's cognitive computing has been put to work on all kinds of problems, from writing recipes and recommending music to treating cancer and issuing weather forecasts.

The technology is rather mature, Dickinson says, and cognitive computing's major challenges are now more about awareness and uptake than technical limitations.

"That's the problem with many technologies – it's not just the technical challenges in building it but it also needs to arrive at the right time for people to understand the technology and be ready to embrace it," she says.

"I know I'm more willing to take instructions and advice from a computer today than I was 10 years ago and it will be exciting to see how cognitive computing finds its place in the world."

Making a difference

Another strength of cognitive computing is that we can teach it new skills and, just like a person, it can improve over time as it learns from its mistakes. This capability for self-improvement is part of what makes cognitive computing such a powerful tool, already on show as IBM's Watson helps doctors determine the best treatments for cancer patients.
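That learn-from-mistakes loop can be sketched in a few lines: the system makes a prediction, compares it with the correct answer, and nudges its internal weights only when it was wrong. The sketch below uses a classic perceptron-style update purely to illustrate the principle; it says nothing about Watson's internals, and all the numbers are invented.

```python
# A generic online-learning sketch (perceptron-style update) illustrating
# the "learn from mistakes" loop; not how Watson works internally.

def predict(weights, features):
    """Binary prediction: 1 if the weighted sum is positive, else 0."""
    score = sum(w * x for w, x in zip(weights, features))
    return 1 if score > 0 else 0

def learn_from_mistake(weights, features, truth, rate=0.1):
    """Adjust the weights only when the prediction was wrong."""
    error = truth - predict(weights, features)  # 0 when correct
    return [w + rate * error * x for w, x in zip(weights, features)]

# Each example: (features, correct label); the first feature is a constant bias.
examples = [([1.0, 2.0, 1.0], 1), ([1.0, -1.5, 0.5], 0), ([1.0, 3.0, 2.0], 1)]
weights = [0.0, 0.0, 0.0]
for _ in range(10):                  # several passes over the data
    for features, truth in examples:
        weights = learn_from_mistake(weights, features, truth)
print(predict(weights, [1.0, 2.5, 1.5]))  # -> 1 on a new, similar case
```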


While cognitive computing has applications for every field, Dickinson is particularly passionate about the impact it will have on education and medicine. About 8000 clinical trials are published around the world each day, more than any one person could hope to read, but cognitive computing can act as a powerful research assistant – not only crunching the data but also looking for patterns and learning as it goes.

"If I'm an oncologist and I know that this cancer patient had a tumour, I can look around the world for similar patients with similar tumours and help determine the best treatment for my patient," Dickinson says. "As the doctor I can base my treatment decisions on the experience of millions of people, thanks to cognitive computing."

"Whatever field you're working in, cognitive computing is all about assisting you to do a better job and helping you to be the best you can be."


Welcome to the Cognitive Era

A new era of business. A new era of technology. A new era of thinking.

The Cognitive Era brings with it a fundamental change in how systems are built and interact with humans. Cognitive solutions are already unclogging city traffic, improving emergency services, making food supplies safer and improving customer engagement. But this is just the beginning. It's time to outthink what is achievable.
