Companies today are always on the hunt for technology and services that people will find irresistible. They try to attract the public with unique, never-seen-before offerings that tempt them to invest or buy. These companies hire the world's best minds to devise these offerings, and in this competitive market, to gain an edge over rivals, they often advertise them falsely or inaccurately. What is at stake is public trust: to win it, they must deliver what they advertise. But should the public trust these big companies, operating out of skyscrapers, with something like crime?
Recently, a group of campaigners at Liberty raised this very trust issue: should the police and other law-enforcement agencies be allowed to use artificial intelligence, big data, and machine learning to predict which crimes people might commit? Selling products and selling opinions are entirely different things, and the latter can cripple a democracy and lead us into chaos. Letting such systems steer the criminal justice system may do more damage than good. For instance, if past practices discriminated against women or minorities, and we feed an algorithm data born of that discrimination, the algorithm will simply reproduce it, lending a veneer of justification to biased outcomes. Worse, modern machine-learning techniques are opaque even to their programmers: we can ask a police officer to testify about their reasoning, but we cannot ask a computer.
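The feedback loop described above can be sketched with a toy simulation (all numbers here are hypothetical, and the "model" is a deliberately naive stand-in for a real predictive-policing system): two neighborhoods have identical true crime rates, but one was historically patrolled twice as heavily, so it shows twice the recorded arrests. A model that allocates patrols in proportion to past arrests never corrects that imbalance.

```python
# Toy sketch of the bias feedback loop: a naive "predictive" model trained
# on historically over-policed areas keeps sending patrols back to the same
# places, and the biased starting point is frozen in. All data hypothetical.

# Identical underlying crime rates, but neighborhood B was patrolled twice
# as heavily in the past, so its recorded arrests are twice as high.
true_crime_rate = {"A": 0.10, "B": 0.10}
past_arrests = {"A": 50.0, "B": 100.0}

def allocate_patrols(arrests, total_patrols=100):
    """Naive model: assign patrols proportionally to past recorded arrests."""
    total = sum(arrests.values())
    return {area: total_patrols * count / total for area, count in arrests.items()}

def simulate_round(arrests, crime_rate):
    """More patrols in an area -> more of its (equal) crime gets recorded."""
    patrols = allocate_patrols(arrests)
    return {area: arrests[area] + patrols[area] * crime_rate[area]
            for area in arrests}

arrests = dict(past_arrests)
for _ in range(5):
    arrests = simulate_round(arrests, true_crime_rate)

share_B = arrests["B"] / (arrests["A"] + arrests["B"])
print(f"Share of recorded arrests in B after 5 rounds: {share_B:.2f}")
```

Even though both neighborhoods commit crime at the same rate, B's share of recorded arrests stays locked at two-thirds indefinitely: the historical discrimination is not merely preserved but presented back as an objective prediction.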