Rich Mogull writes at TidBITS (a blog focusing on Apple products and technologies) about Apple’s latest Platform Security user guide. He notes that the emphasis for improving security is on vertical integration, which he defines as “the combination of hardware, software, and cloud-based services to build a comprehensive ecosystem.” This, of course, has other effects, as he notes:
But with great power comes great lock-in. Relying on a particular vendor’s cloud service means that if the service goes down or the vendor decides to discontinue the service, you are the proud owner of a useless hunk of plastic and electronic components. While Apple isn’t going out of business anytime soon, we saw issues earlier this year when the company’s certificate server was overwhelmed and responded slowly to integrity checks, preventing some users from being able to launch apps.
Read the full article here.
Amusing conversations in the subreddit ProgrammerHumor after someone posted a picture of Bart from The Simpsons writing “Excel is not a database” on a chalkboard as punishment.
Ben Thompson on Stratechery.com explains why he considers Tesla a “meme company”:
It turned out, though, that TSLA was itself a meme, one about a car company, but also sustainability, and most of all, about Elon Musk himself. Issuing more stock was not diluting existing shareholders; it was extending the opportunity to propagate the TSLA meme to that many more people, and while Musk’s haters multiplied, so did his fans. The Internet, after all, is about abundance, not scarcity. The end result is that instead of infrastructure leading to a movement, a movement, via the stock market, funded the building out of infrastructure.
Interesting new article from Shoshana Zuboff in The New York Times. She recaps some of her arguments on surveillance capitalism, but also links them to more recent events: the US elections and the Covid-19 pandemic.
In an information civilization, societies are defined by questions of knowledge — how it is distributed, the authority that governs its distribution and the power that protects that authority. Who knows? Who decides who knows? Who decides who decides who knows? Surveillance capitalists now hold the answers to each question, though we never elected them to govern. This is the essence of the epistemic coup. They claim the authority to decide who knows by asserting ownership rights over our personal information and defend that authority with the power to control critical information systems and infrastructures.
This Bloomberg article deals with an interesting controversy surrounding Apple’s introduction of a feature called “App Tracking Transparency,” which gives users the option to opt out of tracking in apps. The article reports on the full-page newspaper ads Facebook published in response. Mozilla has also joined the debate, publicly applauding Apple for the feature.
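For context, the feature surfaces to developers through the AppTrackingTransparency framework on iOS 14 and later: an app must ask the user’s permission before it can read the advertising identifier (IDFA). A minimal Swift sketch of that request (the function name is illustrative, not from any of the linked articles):

```swift
import AppTrackingTransparency
import AdSupport

// Requires an NSUserTrackingUsageDescription entry in Info.plist,
// which supplies the explanation shown in the system prompt.
func requestTrackingPermission() {
    ATTrackingManager.requestTrackingAuthorization { status in
        switch status {
        case .authorized:
            // User opted in: the advertising identifier is available.
            let idfa = ASIdentifierManager.shared().advertisingIdentifier
            print("Tracking authorized, IDFA: \(idfa)")
        case .denied, .restricted, .notDetermined:
            // User opted out (or hasn't decided): the IDFA reads as all zeros.
            print("Tracking not available")
        @unknown default:
            print("Unknown authorization status")
        }
    }
}
```

The “transparency” in the name comes from this prompt: tracking only proceeds if the user explicitly taps “Allow,” which is precisely the default Facebook objected to.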
The Guardian published this great piece of investigative journalism on the funding of security-technology research through EU research programmes (such as Horizon 2020) and on the involvement of industry. The following excerpt concerns how some of the funded research topics are framed:
“Often the problem is that the topic itself is unethical,” said Gemma Galdon Clavell, an independent tech ethicist who has evaluated many Horizon 2020 security research projects and worked as a partner on more than a dozen. “Some topics encourage partners to develop biometric tech that can work from afar, and so consent is not possible – this is what concerns me.” One project aiming to develop such technology refers to it as “unobtrusive person identification” that can be used on people as they cross borders. “If we’re talking about developing technology that people don’t know is being used,” said Galdon Clavell, “how can you make that ethical?”
The Markup reviewed different examples of controversial uses of machine learning algorithms in 2020:
Every year there are myriad new examples of algorithms that were either created for a cynical purpose, functioned to reinforce racism, or spectacularly failed to fix the problems they were built to solve. We know about most of them because whistleblowers, journalists, advocates, and academics took the time to dig into a black box of computational decision-making and found some dark materials.