Monthly Research & Market Commentary


There is Nothing Artificial About Machine Intelligence

The pursuit of Artificial Intelligence (AI) is as old as the IT industry itself. Ever since the first programmable computers were conceived, both pioneering technologists and the public at large have foreseen the inevitable competition between human and electronic capabilities.

Machine Intelligence


But reaching this stage has taken much longer than many predicted. While sophisticated, hand-crafted expert systems have long been mission-critical in a wide range of industries, developing low-cost, general-purpose machine intelligence (MI) has mostly proved elusive. Progress in fundamental AI/MI fields such as robotics, neural networks and natural language processing has generally lagged behind other areas of digital innovation, with most of the relevant research remaining academic in nature.

However, this situation is now rapidly changing. Over the next few years, we expect major progress in a wide range of AI/MI disciplines, as machines become ever-more capable of speaking, hearing, seeing, recognizing, tracking, controlling, analyzing, advising, learning, explaining, deciding and many other human capabilities. Although the terms AI and MI can still be used interchangeably, over time MI should prevail. Machine intelligence works quite differently from the human version, but it is entirely real, and in this sense not at all artificial.

There are three main reasons why we believe MI is progressing so quickly today:

  1. Technologists have learned that it is often much easier to build specific MI capabilities (such as language translation or facial recognition) by using large, specialized data sets than it is to develop general-purpose learning capabilities. Each MI capability has its own applications, business model and focused start-up companies. There are hundreds of such firms today.
     
  2. Major strides are also being made toward the traditional goal of general-purpose intelligence through so-called deep learning. For example, DeepMind, a British firm acquired for over $400 million by Google in 2014, has greatly impressed the AI community with the ability of its systems to learn Atari video games such as Pong from scratch, and then quickly teach themselves to play at super-human levels. This month, the company’s AlphaGo system will take on Lee Sedol, a world champion in the ancient Chinese board game Go, in a highly anticipated $1 million match.
     
  3. Perhaps most importantly, the two developments above matter so much today because machine intelligence is merging with cloud economics – global scale, continuous improvement, real-time Big Data acquisition, and effectively zero marginal cost. This means that each new MI capability can potentially serve literally billions of online users. Consider the way that language translation capabilities are now being bundled into Skype.
     

The merger of MI and the cloud is fundamental to the innovations of the future, but it is still not sufficiently understood. Before the cloud, most AI work was isolated and relatively high cost. But when married to the cloud, MI services such as recognizing faces or translating languages are no different from using Shazam to identify a song or Googling a search term. What makes all of these wondrous applications possible is not just powerful computers and clever software, but the availability of large, specialized and continually refreshed data sets, most of which simply did not exist in the pre-cloud era. 

As AI enthusiasts have long observed, once a smart application becomes ubiquitous, people no longer see it as AI – it becomes just another technology service. But taken together, ever-richer data sets, improved MI know-how, powerful cloud economics and lucrative MI business models are creating tremendous new market opportunities and a new Silicon Valley gold rush, even today when tech markets are down.

New words describe new realities

Over the last year, clients may have noticed that we have been using the term ‘the Matrix’ to describe today’s emerging technology landscape, with a deliberate nod to the iconic 1999-2003 film trilogy. We decided to do this because new words can help us appreciate new realities, as shown in the figure below.

[Figure: New words emerge to describe new realities]


Consider that the term internet was originally a shortened alternative to internetworking – which described the ability to link private incompatible computer networks via gateways. In the late 1980s, Tim Berners-Lee took this concept a major step further by making it easy to link not just systems, but individual pages and documents, via a web of hypertext. More recently, the use of ‘the cloud’ caught on as a way to capture the fact that networked computers were no longer just connecting systems and pages; they were also an on-demand platform that could transform computing into a utility service. While the internet, the web and the cloud are now often used interchangeably, in each case new terminology emerged to reflect a major new phase of digital innovation. 

We believe that new words will soon be needed once again. While the cloud suggests something ‘out there’, we are all now part of a vast and increasingly intelligent Matrix of systems, software and data. But whether our use of the term ‘the Matrix’ catches on or not, we think it captures the driving technology dynamic of our times: the extraordinary merger of machine intelligence and cloud economics, and (for better or worse) all that this will entail. This potent combination will be the theme of this year’s LEF Study Tour (24-30 September 2016), as well as a major focus of my own research this year. Clouds are not particularly self-aware and intelligent, but the Matrix surely is.



AUTHORS

David Moschella
Research Fellow
David Moschella, based in the United States, is a Research Fellow for Leading Edge Forum. David’s focus is on industry disruptions, machine intelligence and related business model strategies. He is the project lead for our 2017 research into Disrupting ‘The Professions’ – Scenarios for Human and Machine Expertise, and was previously Research Director of the programme. His key areas of expertise include globalization, industry restructuring, disruptive technologies, and the co-evolution of business and IT.

David is the author of multiple research reports; his most recently published include Embracing ‘the Matrix’ and the Machine Intelligence Era (March 2016) and The Myths and Realities of Digital Disruption (September 2015). An author and columnist, his second book, Customer-Driven IT: How Users Are Shaping Technology Industry Growth (Harvard Business School Press, 2003), predicted the shift from a supplier-driven to today’s customer-led IT environment. His 1997 book, Waves of Power, assessed global competition within the IT supplier community. He has written some 200 columns for Computerworld, the IT industry’s leading publication on enterprise IT, and has presented at countless industry events around the world. David previously spent 15 years with International Data Corporation, where he was IDC’s main spokesperson on global IT industry trends and was responsible for its worldwide technology, industry and market forecasts.

