Explainable systems and Black box systems
Types of AI Systems / Models
There are two types of AI systems (models) that can be defined:
Explainable systems / White box systems
Black box systems
Explainable systems / White Box systems (models)

These are AI systems or models where one can easily map the input to the output. It is clear what happens during the training phase and how the system arrives at its decisions.
Examples: Linear Regression and Decision Tree
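
To make the "easily map input to output" point concrete, here is a minimal sketch. It assumes scikit-learn and NumPy (libraries this page does not name) and a made-up toy dataset: the linear model's coefficients and the decision tree's printed rules show directly how each input drives the output.

```python
# Minimal sketch, assuming scikit-learn and NumPy plus a toy dataset.
# It illustrates why linear regression and decision trees are "white box":
# the fitted parameters can be read as an explicit input -> output mapping.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.tree import DecisionTreeRegressor, export_text

# Toy data: y = 3*x0 + 2*x1 + small noise
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = 3 * X[:, 0] + 2 * X[:, 1] + rng.normal(scale=0.1, size=200)

lin = LinearRegression().fit(X, y)
# Each coefficient states exactly how a one-unit change in that feature
# changes the prediction, so the input -> output mapping is transparent.
print("coefficients:", lin.coef_, "intercept:", lin.intercept_)

tree = DecisionTreeRegressor(max_depth=2).fit(X, y)
# The fitted tree can be printed as human-readable if/else rules.
print(export_text(tree, feature_names=["x0", "x1"]))
```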
Black Box systems

These are AI systems or models where one cannot map the input to the output. It is not clear what is happening inside the system or why it produced a particular prediction.
Examples: Deep Learning models, such as the neural networks used in Computer Vision
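
By contrast, a small neural network illustrates the black box case. The sketch below again assumes scikit-learn and the same toy data (not taken from this page): the network can fit the data well, but its learned weight matrices do not translate into a readable rule for how inputs map to outputs.

```python
# Minimal sketch, again assuming scikit-learn/NumPy and the same toy data.
# A small neural network can predict well, but its internals are just
# weight matrices with no direct per-input meaning -- the "black box" case.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = 3 * X[:, 0] + 2 * X[:, 1] + rng.normal(scale=0.1, size=200)

mlp = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=2000,
                   random_state=0).fit(X, y)
print("prediction for [1, 1]:", mlp.predict([[1.0, 1.0]]))

# Unlike the linear model's coefficients, these raw weight matrices do not
# tell you how any individual input influenced the output.
for i, w in enumerate(mlp.coefs_):
    print(f"layer {i} weight matrix shape: {w.shape}")
```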