Surveillance

The Markup reviewed different examples of controversial uses of machine learning algorithms in 2020:

Every year there are myriad new examples of algorithms that were either created for a cynical purpose, functioned to reinforce racism, or spectacularly failed to fix the problems they were built to solve. We know about most of them because whistleblowers, journalists, advocates, and academics took the time to dig into a black box of computational decision-making and found some dark materials.

Politico reported on new EU regulations that would restrict the export of surveillance technologies by companies.

The update to EU rules, expected to be agreed within weeks, would set up a comprehensive EU list of technologies that governments can control through licensing. It would also increase due diligence obligations on companies to check if their goods can be used by their clients to violate human rights.

The Register reported on Apple’s announcement that it will introduce “privacy ‘nutrition labels’” (https://web.archive.org/web/20201109184222/https://www.theregister.com/2020/11/06/apple_privacy_advice/):

“For food, you have nutrition labels; you can see if it’s packed with protein or loaded with sugar, or maybe both, all before you buy it,” he said. “So we thought it would be great to have something similar for apps. We’re going to require each developer to self-report their practices.”

From the announcement by Apple:

Later this year, the App Store will help users understand an app’s privacy practices before they download the app on any Apple platform. On each app’s product page, users can learn about some of the data types the app may collect, and whether that data is linked to them or used to track them.

Apparently, a challenge was posed to designers to rethink the visual language of cybersecurity: “How might we reimagine a more compelling and relatable visual language for cybersecurity?”

Most of the images in this field are indeed overused, so it is quite interesting to see the alternatives these designers came up with. Have a look at the top contributions here.

Member of the European Parliament Sophie in ’t Veld wrote an article on about:intel about the reliance of EU domestic security on foreign, particularly US, technologies such as Palantir. Here’s a quote from her conclusions:

The European Union is overly reliant on foreign companies for digital services, both on the government and the consumer level. This reliance cannot be fixed overnight. It is however high time to make choices that at least steer the EU towards more strategic independence. Companies that don’t fall within the EU’s jurisdiction cannot be involved with matters of domestic security. This must be particularly true for companies that have strong ties to American security services, as is the case with Palantir.

An article from CNET reports on the use of “keyword warrants” in police investigations in the US:

Typically, probable cause is needed for search warrants, which are associated with a suspect or address. The demands for information are narrowly tailored to a specific individual. Keyword warrants go against that concept by giving up data on a large group of people associated with searching for certain phrases.

One of the nice things about podcasts is that you can usually still subscribe to them via RSS without being tracked. Advertisers, however, are trying to find new ways to track your listening habits and target you with ads:

Advertisers are projected to spend more than $800 million on podcasts in 2020, and companies are devising ways to provide them with data that will persuade them to spend more. The most common tactics include using IP addresses to identify users, adding tracking URLs to ads, and abandoning RSS in favor of proprietary platforms that already track their users.

Read the full article from The Markup here.
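The tracking-URL tactic mentioned in the quote often takes the form of “prefix” redirects: measurement services are chained in front of the real audio file, so each one logs the listener’s IP address before handing off to the next hop. Here is a rough sketch of what such a chain can look like and how the underlying file URL can be recovered (all domains are invented for illustration):

```python
from urllib.parse import urlparse

# Hypothetical enclosure URL from a podcast RSS feed: two made-up
# measurement services are chained as prefixes before the real file.
wrapped = ("https://tracking.example.com/m/"
           "stats.example.net/e/"
           "cdn.example.org/shows/ep42.mp3")

def final_audio_url(url: str) -> str:
    """Return the last hop of a prefix-tracking chain.

    Each hop embeds the next host and path inside its own URL path,
    so we scan the path segments for the last one that looks like a
    hostname and rebuild the URL from there.
    """
    parsed = urlparse(url)
    segments = parsed.path.strip("/").split("/")
    # The last segment containing a dot that is not an audio file
    # is treated as the origin host of the actual episode.
    last_host_idx = max(i for i, s in enumerate(segments)
                        if "." in s and not s.endswith((".mp3", ".m4a")))
    host = segments[last_host_idx]
    path = "/".join(segments[last_host_idx + 1:])
    return f"{parsed.scheme}://{host}/{path}"

print(final_audio_url(wrapped))  # https://cdn.example.org/shows/ep42.mp3
```

This is only a heuristic: real chains vary in shape, and each intermediary still sees your IP address when the episode is actually downloaded, which is exactly the data point advertisers are after.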

Another example of a controversy around surveillance practices: Adam Molina at Soundguys writes that “headphones are collecting too much personal data”. He weighs some of the conveniences that surveillance-capitalist apps bring, but he is dismayed that he cannot see how his headphones tracking his music could benefit him:

On the flip side, I don’t know what I get in return for letting my headphones know what I’m listening to. Furthermore, I can’t think of a single reason why a pair of workout earbuds need access to someone’s menstrual history. We should just call it what it is because, at that point, it doesn’t feel like a transaction anymore. It’s just spying.

In an article at MIT Technology Review, Ethan Zuckerman makes an interesting point about how years of police body cams and bystander cellphone video have not stopped police violence. He recalls how these kinds of surveillance are linked to the idea of the panopticon, and notes that one aspect of Jeremy Bentham’s hope, a completely transparent system, has never been achieved.

Bentham’s panopticon works because the warden of the prison has the power to punish you if he witnesses your misbehavior. But Bentham’s other hope for the panopticon—that the behavior of the warden would be transparent and evaluated by all who saw him—has never come to pass. Over 10 years, from 2005 to 2014, only 48 officers were charged with murder or manslaughter for use of lethal force, though more than 1,000 people a year are killed by police in the United States.

This is how Michel Foucault, in his book “Discipline and Punish”, describes Bentham’s vision:

This Panopticon, subtly arranged so that an observer may observe, at a glance, so many different individuals, also enables everyone to come and observe any of the observers. The seeing machine was once a sort of dark room into which individuals spied; it has become a transparent building in which the exercise of power may be supervised by society as a whole.

The current technocratic hype is “track and trace” apps to help contain the coronavirus. The example of South Korea is frequently given as a success story. This success is debatable, however, and needs to be put in context. An article on nature.com gives more background on the surveillance of infected people in South Korea:

South Korea’s data transparency during this outbreak has its origins in how the government handled the 2015 outbreak of MERS, which reportedly infected 186 people in South Korea and killed 36. The government at the time initially refused to identify the hospitals in which infected people were being treated, but a software programmer made a map of cases based on crowdsourced reports and anonymous tips from hospital staff. Eventually, the government relented and named the affected hospitals.

Read the full article on nature.com for more background information.