Surveillance

From A Cypherpunk’s Manifesto (Eric Hughes, 1993):

“Privacy is necessary for an open society in the electronic age. … We cannot expect governments, corporations, or other large, faceless organizations to grant us privacy … We must defend our own privacy if we expect to have any. … Cypherpunks write code. We know that someone has to write software to defend privacy, and … we’re going to write it.”

Self-described cypherpunk Harry Halpin talks with CoinDesk about the vision of his company, Nym Technologies, for privacy solutions at the network level in a world where privacy is gradually being eroded. His account of why we need privacy in order to be able to change the world is inspiring.

At Spui25 in Amsterdam for the book launch of “Secrecy and Methods in Security Research”. The panellists gave presentations about their chapters in the book and discussed how they deal with secrecy in research. One idea common to the authors in this book is to see secrecy not as an obstacle, but as a way to understand an organisation. In this way secrecy becomes productive, or a research object in itself.

Abacus, a unit of the South China Morning Post, published an article on how facial recognition technology is causing issues in China as people wear masks as a preventive measure against the new coronavirus:

For hundreds of millions of people in China, the spread of the new coronavirus has caused abrupt changes to the smallest of habits – even a gesture that most in the country are used to by now: Looking into the camera for facial recognition.

Residents donning surgical face masks while venturing outside their homes or meeting strangers have found themselves in an unfamiliar conundrum. With their faces half-covered, some are unable to unlock their phones or use mobile payments with their faces.

Read the full article from Abacus News.

Today Mark Zuckerberg announced his new vision for Facebook as a more privacy-focused company. The principal change he proposes is interoperable end-to-end encryption across all of Facebook’s apps. Although this would be an interesting improvement for protecting communications, the links with law enforcement are worrying. Who decides what patterns identify “bad actors”, and how are they shielded from government influence? It also suggests that Facebook deems it appropriate to decide who the “bad actors” are, which is equally worrying in my opinion.

We have a responsibility to work with law enforcement and to help prevent these wherever we can. We are working to improve our ability to identify and stop bad actors across our apps by detecting patterns of activity or through other means, even when we can’t see the content of the messages, and we will continue to invest in this work.

Another piece from The Intercept reminds us that this may be another form of corporate whitewashing, and that Facebook hasn’t delivered on its previous privacy promises.

Hettie O’Brien writes in The New Statesman about the fantasies of tech solutionism for the Irish border:

The logic here is peak Silicon Valley: technology vacates policies of their political intent, offering practical solutions that we can converge on regardless of political differences.

Yet this politics-free vision of the Irish border amounts to magical thinking. It’s not because the necessary technologies don’t exist; many already do. It’s that the proposed solutions, whether visible or not, would effectively monitor everyone and everything that passed across the Irish border, making it one of the most closely observed and therefore political crossings in the world.

Sidewalk Labs, a subsidiary of Google’s parent company Alphabet, announced Replica, a new tool for urban planners. Brenda McPhail, director of the Canadian Civil Liberties Association’s Privacy, Technology, and Surveillance Project, sees it as a prime example of surveillance capitalism. Read the full article from The Intercept.

“Replica is a perfect example of surveillance capitalism, profiting from information collected from and about us as we use the products that have become a part of our lives,” said Brenda McPhail, director of the Canadian Civil Liberties Association’s Privacy, Technology, and Surveillance Project. “We need to start asking, as a society, if we are going to continue to allow business models that are built around exploiting our information without meaningful consent.”

According to this article by The Intercept, some prisons in the U.S. are capturing the voices of incarcerated people to create new biometric databases of their “voice prints”. It seems like another example of new technology, deployed with the involvement of private companies, being imposed on more vulnerable groups of people, with all the usual problems of biometrics (e.g. reliability) and automated decisions (e.g. transparency, explainability).

The enrollment of incarcerated people’s voice prints allows corrections authorities to biometrically identify all prisoners’ voices on prison calls, and find past prison calls in which the same voice prints are detected. Such systems can also automatically flag “suspicious” calls, enabling investigators to review discrepancies between the incarcerated person’s ID for the call and the voice print detected. Securus did not respond to a request for comment on how it defined “suspicious.” The company’s Investigator Pro also provides a voice probability score, rating the likelihood that an incarcerated person’s voice was heard on a call.