Drive smarter decision making with explainable machine learning


This article is contributed by Berk Birand, CEO of Fero Labs.

Is the hype around AI finally cooling off?

That’s what some recent research would suggest. Most executives now say the technology is more hype than reality, and 65% report seeing zero value from their investments in AI and machine learning.

However, these statements often reflect a fundamental misunderstanding. Many executives fail to distinguish between generic black box AI and related technologies such as explainable machine learning. As a result, they miss a critical path to smarter and more efficient decision-making that can deliver greater business value.

Black boxes, or software programs that spew out mysterious answers without revealing how they got there, are the algorithms that power the world’s top tech companies. You have no way of knowing how a black box gets its result. Occasionally the results are funny, like when Google’s image recognition software mistakenly identifies a cat as guacamole, or when Netflix recommends a bad show. In those cases, the stakes are low. A mistake by Netflix will only cost you a few wasted minutes.

But for complex, high-stakes sectors such as healthcare, criminal justice and manufacturing, it’s a different story. If AI technology tells a steel engineer to add the wrong amount of alloys, creating a metal with the wrong density, buildings can collapse.

In areas like healthcare, where a single decision can literally make the difference between life and death, professionals can be particularly reluctant to trust the recommendations of a mysterious black-box algorithm. Or worse, they could adopt those recommendations blindly, with potentially catastrophic results.

Explainable machine learning

Unlike black box software, any AI solution that calls itself “explainable” should show how different inputs affect the output. Take autopilot software, for example: the flight-control algorithm needs to know how much the plane will tilt if a sensor detects northwestern winds of 80 miles per hour, and the user needs to be able to understand how this information affects the algorithm’s predictions. Without this capability, the software would not fulfill its intended purpose and could deliver negative value.
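One minimal way to see what “inputs affect the output” means in practice is an ordinary linear model, where each coefficient states exactly how much the prediction moves per unit of input. The sketch below is purely illustrative: the feature names and data are invented, and real explainable-ML systems use richer model classes than this.

```python
import numpy as np

# Hypothetical example: fit a linear model and read off how each input
# drives the output. The two features and their true weights are invented.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))  # columns: wind_speed, wind_angle (made up)
y = 3.0 * X[:, 0] - 1.5 * X[:, 1] + rng.normal(scale=0.1, size=200)

# Ordinary least squares with an intercept column.
coef, *_ = np.linalg.lstsq(np.c_[X, np.ones(200)], y, rcond=None)

for name, c in zip(["wind_speed", "wind_angle", "intercept"], coef):
    print(f"{name}: {c:+.2f}")
# Each coefficient answers the question a black box cannot:
# "how does this particular input change the prediction?"
```

A black box classifier trained on the same data could predict just as well, but it would have no equivalent of these coefficients for a user to inspect.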

In addition, explainable software should provide some measure of confidence alongside each prediction, enabling safe and accurate decision-making. In healthcare, for example, a doctor should not simply be told to use a certain treatment. Instead, they should be told the probability of the desired outcome, along with a confidence level. In other words, is the software very confident in its prediction, or is the prediction more of a gamble? Only with this kind of information can the doctor make informed and safe decisions.
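One generic way to attach such a confidence measure to a prediction is bootstrapping: refit the model on many resampled versions of the data and report the spread of the resulting predictions. The sketch below assumes invented data and a simple linear model; it is meant only to illustrate the idea of a prediction plus an interval, not any particular vendor's method.

```python
import numpy as np

# Hypothetical example: bootstrap a confidence interval around one
# prediction. Data and the query point x_new are invented.
rng = np.random.default_rng(1)
X = rng.normal(size=(150, 1))
y = 2.0 * X[:, 0] + rng.normal(scale=0.5, size=150)

x_new = 1.0  # the case we want a prediction for
preds = []
for _ in range(500):
    idx = rng.integers(0, len(X), len(X))  # resample rows with replacement
    coef, *_ = np.linalg.lstsq(
        np.c_[X[idx], np.ones(len(idx))], y[idx], rcond=None)
    preds.append(coef[0] * x_new + coef[1])

preds = np.array(preds)
low, high = np.percentile(preds, [2.5, 97.5])
print(f"prediction: {preds.mean():.2f}, 95% interval: [{low:.2f}, {high:.2f}]")
# A narrow interval signals a confident prediction;
# a wide one flags the prediction as more of a gamble.
```

The width of the interval is what lets a doctor, or a steel engineer, distinguish a confident recommendation from a guess.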

How can you apply explainable machine learning to drive smarter decision-making in your business?

If you want to build a tool in-house, know that it is difficult. Explainable machine learning is complex and requires deep statistical knowledge to develop. One industry that has done this well is pharmaceuticals, where companies often have dozens of PhDs doing explainable data science and analytics in-house.

If you want to buy software, you need to do some due diligence. Look at real use cases the vendor provides, not just slogans. Look at the background of the science and research team: are they proficient in explainable machine learning? What evidence do they offer that their technology works?

Most important of all? Use your judgment. The great thing about explainable machine learning is that it can be explained well. If you don’t get it, it probably won’t generate value for your business.

Berk Birand is the CEO of Fero Labs, an industrial AI software company based in New York.

