12/01/2017 / By Rhonda Johansson
For David Hinojosa, Vice President of Marketing at COBAN Technologies, the company's new AI software is a “dashcam on steroids.” Such a description may elicit a smile or two, but it also calls into question just how rapidly we are transitioning to the virtual world. Many privacy and civil liberties activists believe that this new software opens greater potential for government or law enforcement agencies to secretly profile citizens or misuse the data they gather.
As explained by Marc Rotenberg, who serves as President of the Electronic Privacy Information Center (EPIC): “Some of these technologies can be helpful but there are huge privacy issues when systems are designed to capture identity and make a determination based on personal data.”
Rotenberg stresses that AI systems should be heavily regulated so that consumers’ privacy is protected. EPIC has suggested that these regulations include legal safeguards, transparency, and procedural rights.
Hinojosa led a team of engineers in developing new artificial intelligence software that identifies vehicles to law enforcement officers by their license plates and other features. The technology is meant to provide “an extra set of eyes” to human patrol officers who might get tired. This vision-based AI is supposedly built to improve public safety by analyzing large chunks of data simultaneously, in real time. Hinojosa explained that the technology would consolidate evidence from multiple sources, including six cameras and other inputs, and then transfer the data to the cloud for analysis. This should give police officers enough information on a potential criminal and their intended route.
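In rough terms, the pipeline Hinojosa describes reads frames from several cameras, looks for license plates, and forwards any hits to the cloud for analysis. The sketch below only illustrates that general pattern; the camera IDs, the detect_plate stub, and the cloud endpoint are hypothetical placeholders, not COBAN's actual software.

```python
# Hypothetical sketch of a multi-camera "extra set of eyes" pipeline.
import threading

import cv2
import requests

CAMERA_IDS = [0, 1, 2, 3, 4, 5]                     # six camera feeds, per the article
CLOUD_ENDPOINT = "https://example.com/api/plates"   # placeholder URL, not a real service


def detect_plate(frame):
    """Placeholder for a license-plate detector (e.g. an OCR or CNN model).
    Returns a plate string, or None when nothing is recognized."""
    return None  # a real implementation would run a vision model on the frame


def stream(camera_id):
    """Read frames from one camera and forward any plate hits to the cloud."""
    cap = cv2.VideoCapture(camera_id)
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        plate = detect_plate(frame)
        if plate:
            # Consolidate detections in the cloud for cross-referencing and routing.
            requests.post(CLOUD_ENDPOINT, json={"camera": camera_id, "plate": plate})
    cap.release()


if __name__ == "__main__":
    # One thread per camera so the feeds are analyzed simultaneously.
    threads = [threading.Thread(target=stream, args=(cam,)) for cam in CAMERA_IDS]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
```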
The dashcam from COBAN Technologies is set to be tested by the police force of Delaware before the end of this year.
Security groups say that improved facial recognition software can be deployed by retail stores to prevent theft, armed robbery, or similar crimes. The startup Deep Science has already piloted projects with small U.S. retailers that can detect an armed robbery in real time by identifying masked assailants. The technology enables automatic alerts, informing a predetermined group (either management or a police unit) of the ongoing crime. Deep Science co-founder Sean Huver says that the software monitors threats more efficiently and lowers the cost of, and need for, human security guards.
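The alerting step Huver describes boils down to this: when a detector is confident enough that a camera frame shows a threat, notify the predetermined contacts once, rather than on every frame. The snippet below is a minimal sketch of that step under assumed names; the threshold, cooldown, and webhook URL are illustrative and not Deep Science's product.

```python
# Minimal sketch of an automatic-alert step fed by some per-frame threat detector.
import time

import requests

ALERT_WEBHOOK = "https://example.com/alerts"   # placeholder endpoint (management or police dispatch)
THREAT_THRESHOLD = 0.9                          # assumed confidence cutoff for "armed robbery"
COOLDOWN_SECONDS = 60                           # avoid re-alerting every frame for the same incident

_last_alert_time = 0.0


def maybe_alert(threat_score, camera_id):
    """Send a single alert per incident when the detector is confident enough."""
    global _last_alert_time
    now = time.time()
    if threat_score >= THREAT_THRESHOLD and now - _last_alert_time > COOLDOWN_SECONDS:
        requests.post(ALERT_WEBHOOK, json={
            "camera": camera_id,
            "event": "possible armed robbery",
            "confidence": threat_score,
        })
        _last_alert_time = now


# Example call from inside a per-frame detection loop:
# maybe_alert(0.95, camera_id=3)
```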
“A common problem is that security guards get bored,” he said.
This, apparently, is the go-to logic of many of these software companies.
Briefcam, an Israeli startup, uses similar technology to interpret video surveillance footage. Spokespeople for the company say that human error is all too likely and can compromise quality, and that artificial intelligence is the only practical way to work through millions of pieces of data.
“People who watch video become ineffective after 10 to 20 minutes,” explained Amit Gavish, the U.S. general manager of Briefcam.
Nevertheless, the scent of “Big Brother” is tickling the noses of many watchdog groups. These technologies may have started with the innocent goal of tracking down criminals but have since expanded to serve promotional and financial purposes. For example, Walmart is planning to install facial recognition software along its checkout lanes to detect dissatisfied shoppers. The AI software would “ping” Walmart employees about a potentially unhappy customer, prompting an employee to approach that person and ask whether they need assistance.
Or we can point to a more extreme use of the software in China. Face++ is a Chinese startup whose technology scans your face, which then becomes your “currency.” Your facial structure grants you access to specific parts of buildings that use the technology, and you can even pay for certain items with just your face. While the technology has yet to reach America, there are concerns about how secure the Face++ database is.
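Under the hood, “face as currency” generally means comparing an embedding of the live face against a database of enrolled embeddings and approving the action when the match is close enough. The sketch below shows only that comparison logic under assumed names; the embedding stub, the threshold, and the identities are hypothetical and not Face++’s actual system.

```python
# Minimal sketch of face-as-credential matching against enrolled embeddings.
import numpy as np

MATCH_THRESHOLD = 0.7   # assumed cosine-similarity cutoff for accepting a match


def embed_face(image):
    """Placeholder for a face-embedding model (a real system would run a CNN)."""
    return np.random.rand(128)  # a real model returns a stable vector per person


def cosine_similarity(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def authorize(live_image, enrolled):
    """Return the matched identity if any enrolled face is similar enough, else None.

    `enrolled` maps an identity (e.g. a customer ID) to a stored embedding vector.
    """
    probe = embed_face(live_image)
    for identity, reference in enrolled.items():
        if cosine_similarity(probe, reference) >= MATCH_THRESHOLD:
            return identity   # unlock the door or approve the payment
    return None               # no match: deny access
```

The privacy concern raised in the article follows from this design: the database of enrolled embeddings is the sensitive asset, since whoever holds it can match faces wherever the cameras reach.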
Ultimately, we are faced with the choice of how much privacy we are willing to trade for certain comforts and “security.” Technology has its place in business efficiency, but it should never replace our right to privacy.