Risky Business: Privacy in the Age of AI and Big Data

The views expressed are those of the author and do not necessarily reflect the views of ASPA as an organization.

By Matthew Teal
May 9, 2019

The first four articles in this series have looked at potential benefits of using artificial intelligence (AI) and Big Data in local government environments. However, there are also very real costs and risks associated with local government adoption of AI and Big Data. This article will discuss the potential loss of privacy that citizens (including local government employees themselves) face in the age of AI and Big Data.

The idea of electronic surveillance pervading society is far from new. Most famously, George Orwell’s 1984 predicted “Big Brother” and the constant surveillance of the populace through electronic means:

“The instrument (the telescreen, it was called) could be dimmed, but there was no way of shutting it off completely. … The telescreen received and transmitted simultaneously. … There was of course no way of knowing whether you were being watched at any given moment.”

Such technology was fantasy at the time but is all too real in today’s world. In the October 2018 issue of The Atlantic, Yuval Noah Harari wrote, “As many people lose their economic value [through the rise of AI-powered automation], they might also come to lose their political power. The same technologies that make billions of people economically irrelevant might also make them easier to monitor and control.” Harari cites the “total-surveillance regime” that Israel has imposed on the Palestinian West Bank: “Already today whenever Palestinians make a phone call, post something on Facebook, or travel from one city to another, they are likely to be monitored by Israeli microphones, cameras, drones, or spy software.” Similarly, in April 2019 Paul Mozur published an article in the New York Times documenting how China is using “a vast, secret system” of AI-powered facial recognition technology “to track and control the Uighurs, a largely Muslim minority. It is the first known example of a government intentionally using artificial intelligence for racial profiling, experts said.”

Big deal, you might say. China and the West Bank are a long way from my town in the United States and have very different laws. That is true. That is also irrelevant. The technology that powers Chinese and Israeli surveillance is present in the United States as well, and it is being actively used across America today. In April 2019, Jennifer Valentino-DeVries wrote an article for the New York Times detailing how American police can now obtain search warrants for location information stored on Google’s servers. Valentino-DeVries writes that “the warrants, which draw on an enormous Google database employees call Sensorvault, turn the business of tracking cellphone users’ locations into a digital dragnet for law enforcement.” The article notes that “the practice was first used by federal agents in 2016, according to Google employees, and first publicly reported last year in North Carolina. It has since spread to local departments across the country, including in California, Florida, Minnesota and Washington. This year, one Google employee said, the company received as many as 180 requests in one week. Google declined to confirm precise numbers.”

The use of AI and Big Data in local government also carries privacy risks beyond law enforcement. For example, research by Yilun Wang and Michal Kosinski at Stanford University has demonstrated that AI-powered computer programs can infer sexual orientation just from looking at a person’s face. According to a September 2017 summary of the research published in The Economist, “Machine vision can infer sexual orientation by analyzing people’s faces. The researchers suggest the software does this by picking up on subtle differences in facial structure. With the right data sets … similar AI systems might be trained to spot other intimate traits, such as IQ or political views.” Could local government employers use such software to discriminate against LGBTQ+ applicants, employees, or citizens petitioning the government? The advocacy group Freedom for All Americans claims that “LGBTQ Americans aren’t fully protected from discrimination in 30 states.” (Though the U.S. Equal Employment Opportunity Commission claims to forbid “any employment discrimination based on gender identity or sexual orientation.”) Even if sexual orientation were not used as an explicit reason for firing an employee or denying employment to an applicant, could local governments nevertheless use such information in their policy or employment decisions? What about one day using scans to identify and discriminate against people with low IQs or people from the “wrong” political party?

So how can local governments mitigate these threats? First, local governments must be transparent. Specifically, local governments must be transparent about which technologies they use and how they use them. Additionally, local governments must articulate where, and for how long, they store any data gathered or generated by such technologies. In his Atlantic article, Harari argued that “we must regulate the ownership of data.” Finally, there may be times when local governments and their communities simply choose to value privacy over efficiency by intentionally not gathering and analyzing certain types of data. In April 2019, Ross Douthat argued that “a movement to restore privacy must be, at some level, a movement against the internet. Not a pure Luddism, but a movement for limits, for internet-free spaces, for zones of enforced pre-virtual reality.”

My opinions are my own and do not represent the University of North Carolina at Chapel Hill.

Matthew Teal, MA, MPA
Policy Analyst
University of North Carolina at Chapel Hill
Email: [email protected]
Twitter: @mwteal

