Provides real-life examples of how biased algorithms in big data can have serious effects; for example, how the COMPAS algorithm is used by courts to predict recidivism risk despite the possibility that the algorithm is racially discriminatory.
In this scene, Charlie and Don realize that their mathematical model is wrong because it was built around the wrong factor (the criminal's home) rather than the right one (the criminal's workplace).