November 3, 2016

Explainable AI: A Key Ingredient to Realizing Market Growth Potential

IDC recently released an update to its Cognitive/Artificial Intelligence Systems spending forecast. With this revision, IDC now predicts the market will grow from $8B in 2016 to a staggering $47B by 2020, a 55% CAGR over that period.

According to the press release, nearly half of all revenue over the forecast period will go to software, including cognitive applications and software platforms. Taking these numbers at face value, it is worth thinking through the product implications that will arise if we realize anything close to market growth of this magnitude.

In this context, the recent Wired article by Clive Thompson, Sure AI Is Powerful - But Can We Make It Accountable?, raises compelling questions that must be considered in any discussion of the growth of intelligent applications and systems. In the article, Thompson focuses on how, amid the spread of AI-based systems across an ever-wider range of use cases, the industry will deal with the lack of explainability inherent in most AI and machine learning algorithms. He writes:

“The opacity of machine learning isn’t just an academic problem. More and more places use the technology for everything from image recognition to medical diagnoses. All that decision-making is, by definition, unknowable—and that makes people uneasy. My friend Zeynep Tufekci, a sociologist, warns about “Moore’s law plus inscrutability.” Microsoft CEO Satya Nadella says we need “algorithmic accountability.”

Central to this discussion is the lack of explainability in the black-box approach employed by most AI algorithms and techniques. As Thompson points out, the stakes are raised when unexplainable system biases affect things like bank loan decisions or medical diagnoses. As AI becomes more prevalent in our personal and professional lives, it will inevitably be used in increasingly dynamic, mission-critical, and non-routine scenarios where the stakes will only get higher.

In order for AI to be confidently deployed, and ideally thrive, in these environments, explainability will be paramount. DARPA refers to this concept as Explainable AI (XAI). Below is an excerpt from a recent presentation by the agency’s David Gunning on the importance of XAI: 

“Continued advances promise to produce autonomous systems that will perceive, learn, decide, and act on their own. However, the effectiveness of these systems will be limited by the machine’s inability to explain its thoughts and actions to human users. Explainable AI will be essential, if users are to understand, trust, and effectively manage this emerging generation of artificially intelligent partners.” 

Explainability matters for consumers of AI who want confidence and trust in the decisions made by the systems they interact with every day. Explainability matters for producers of AI, the developers and technologists who are continually looking to debug, refine, and repurpose their models to produce more adaptable, dynamic, and intelligent applications and systems. If the Cognitive/AI Systems market is to have any chance of realizing the growth IDC is projecting, then explainability must increasingly be treated as a table-stakes requirement in any AI platform or toolkit.
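
To make this concrete, here is a minimal sketch, in Python with scikit-learn, of one common route to explainability: fitting a small, interpretable surrogate model to a black-box model's predictions so a developer can inspect an approximation of its decision logic. This is a generic illustration, not the Bonsai AI Engine's approach or API; the dataset, model choices, and feature names are assumptions made for the example.

# Generic illustration (not Bonsai's API): approximate a black-box model
# with a shallow, human-readable surrogate tree.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.tree import DecisionTreeClassifier, export_text

# Toy data standing in for something like loan-decision features
X, y = make_classification(n_samples=1000, n_features=6, random_state=0)
feature_names = ["feature_%d" % i for i in range(X.shape[1])]

# "Black box": accurate, but its internal decision logic is hard to inspect
black_box = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)

# Interpretable surrogate: a depth-limited tree trained to mimic the black box
surrogate = DecisionTreeClassifier(max_depth=3, random_state=0)
surrogate.fit(X, black_box.predict(X))

# The surrogate's rules are a human-readable approximation of the
# black box's behavior, a starting point for explanation and debugging
print(export_text(surrogate, feature_names=feature_names))

A surrogate like this is only an approximation, but it illustrates the kind of insight into what contributed to a decision that both consumers and producers of AI need.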

At Bonsai, with our focus on adaptive control and system optimization use cases, this is a problem we set out to address very early on. Each high-level model created within the Bonsai AI Engine lets you see a causal inference chain detailing what contributed to a prediction, identify conceptual gaps and bugs, and continually refine training. Now, with the recent introduction of our Private Beta program, we are actively working with developers and enterprises to validate the platform for specific use cases while informing future product direction. If you, or someone at your company, are interested in trying out the platform, you can sign up here and we will onboard you as soon as we can.
