What Does Black Box AI Mean?
Black box AI is any type of artificial intelligence (AI) so complex that its decision-making process cannot be explained in a way humans can easily understand. Black box AI is the opposite of explainable AI (XAI).
Causes of black box AI include:
- Proprietary technology – the inner workings of an AI model are kept secret to protect intellectual property.
- Deep learning – deep neural networks (DNNs) and deep learning algorithms create thousands (and sometimes millions) of non-linear relationships between inputs and outputs. The complexity of these relationships makes it difficult for a human to explain which features or interactions led to a specific output.
Black box AI is undesirable for a number of reasons. When the internal workings of an AI system are not understood, it becomes harder to identify why a model is producing biased outputs and where errors in its logic occur. It also becomes difficult to determine who should be held accountable when outputs are flawed or outright dangerous.
Techopedia Explains Black Box AI
When an AI system is transparent and interpretable, it becomes easier to trust the integrity of the system and the accuracy of its outputs. Transparency and interpretability can be achieved through a variety of approaches, including:

- Designing and using algorithms that are easily understood by humans.
- Making sure that human feedback always plays a role in the decision-making process.
- Developing tools that can provide visual explanations for how an AI application arrives at a decision.
Popular tools developed to explain black box models and support responsible AI include:
- LIME (Local Interpretable Model-Agnostic Explanations)
- SHAP (SHapley Additive exPlanations)
- ELI5 (Explain Like I’m 5)
- DALEX (Descriptive mAchine Learning EXplanations)
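The tools above share a common core idea: probe the opaque model with perturbed inputs and attribute changes in its output to individual features. The sketch below illustrates that idea in plain Python. It is a simplified illustration of the technique, not the actual LIME or SHAP API, and `opaque_model` is a stand-in for any black box predictor:

```python
# A minimal sketch of perturbation-based explanation (the core idea
# behind tools like LIME and SHAP). All names here are illustrative.

def opaque_model(features):
    # Pretend black box: callers see only inputs and an output score.
    income, debt, age = features
    return 0.6 * income - 0.3 * debt + 0.1 * age

def explain(model, features, delta=1.0):
    """Estimate each feature's local influence by nudging one
    feature at a time and measuring the change in the output."""
    baseline = model(features)
    attributions = []
    for i in range(len(features)):
        perturbed = list(features)
        perturbed[i] += delta
        attributions.append(model(perturbed) - baseline)
    return attributions

# Each value estimates how strongly one feature drives the score
# near this particular input, without opening the model itself.
print(explain(opaque_model, [50.0, 20.0, 30.0]))
```

Real explainers sample many perturbations and fit a simple surrogate model around the input rather than nudging each feature once, but the principle is the same: explanations are derived from observed input-output behavior, not from inspecting the model's internals.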
Editor’s Note: According to the Apple Style Guide, writers should avoid the term black box and use the terms closed box or opaque box instead. The writer for this definition would like to propose “mystery box” as a substitute label.