The state of Utah has given Banjo, an artificial intelligence company, access to state traffic cameras, CCTV and “public safety” cameras, 911 emergency systems, location data for state-owned vehicles, and other sensitive data.
Banjo says it combines this data with social media, satellite imagery, and other app data to detect real-world anomalies, with the goal of identifying crimes as they happen in real time.
The company claims that its product, "Live Time Intelligence," can solve child kidnappings in seconds, identify active-shooter situations, and send an alert the moment a traffic accident happens. This capability is an ethical double-edged sword.
On one hand, everyone wants lower crime; no one wants to live in fear at home or in public. On the other hand, do we trust the government to use this information only for good? The ethical dilemma is real.
Edward Snowden showed us that our own government was spying on us through our cell phones. Officials claimed they were collecting only metadata, but the potential to gather far more was there.
So the question becomes: what is the limit on artificial intelligence? As AI grows, is there a way to limit it? Programmatically, the answer is yes, you can limit it. Ethically, the answer is far less clear.
The questions we should be asking ourselves are:
- What is the balance between privacy and technology?
- What rights do we have in the public sphere in terms of privacy?
- Can companies collect data about their product usage?
- If companies can collect this data, can they sell that data or give it to a government entity without a warrant?
- Is metadata really as harmless and anonymous as companies and governments claim?
- Can law enforcement act in these situations without a warrant?
- What is the balance between security and ethics?
These are questions we need to answer now that the age of artificial intelligence has arrived. The ethical problems will only become more apparent as AI is deployed. We need to address them sooner rather than later; otherwise, we are going to be in a world of hurt.