#algorithm – @protoslacker on Tumblr

Three Good Links

@protoslacker / protoslacker.tumblr.com

I read posts online that interest, infuriate, stimulate, inspire, or otherwise move me. I'll share short snippets.

Mastodon Shuffle
When Peden covered the escalation in Gaza in 2021, the sources he was seeing in his feed were from people on the ground or credible news agencies. This weekend, he says, verified content or primary sources were virtually impossible to find on X.

David Gilbert in Wired. The Israel-Hamas War Is Drowning X in Disinformation

People who have turned to X for breaking news about the Israel-Hamas conflict are being hit with old videos, fake photos, and video game footage at a level researchers have never seen.

There’s no proof that Facebook’s changes had political intentions, but it’s not hard to imagine that the company could tweak its algorithms in the future, if it wanted to. To guard against that potential, new laws could bar changes to the algorithm in the run-up periods before elections. In the financial industry, for instance, “quiet periods” in advance of major corporate announcements seek to prevent marketing and public-relations efforts from artificially influencing stock prices. Similar protections for algorithms against corporate manipulation could help ensure that politically active, power-seeking Facebook executives – or any other company with significant control over users’ access to information – can’t use their systems to shape public opinion or voting behavior.

Jennifer Grygiel in The Conversation. Facebook algorithm changes suppressed journalism and meddled with democracy

Just by calling attention [to the fact that] a narrative frame is being established means that it becomes more entrenched. And in a digital media environment it becomes more searchable. It becomes literally Google search indexed alongside whatever particular story [is written about it].

Whitney Phillips quoted in an article by Paris Martineau at Wired. The Existential Crisis Plaguing Online Extremism Researchers

Our dreams might be the final frontier of capitalism’s march toward colonizing, quantifying, and capitalizing every aspect of everyday life. The dream may also then be the only ground left on which we can make a stand: Dreams are a space of unknowing, a space of confusion and non-reality outside the effective virtualities that render us machine-readable and wanting to be even more so. Dreams might be a space of autonomy from which we can draw inspiration to move beyond capitalist realism. But only if dreams themselves remain beyond the grasp of the computable subjectivity and its ideological machinery can they offer something other than the reductive vision of tracking, and of total availability to the technologies that intend to tell us who we really are.

Zach Kaiser in Real Life. Sleep Subjects

Corporate metrics want to extract productivity from everything — even your dreams

It seems this transformation, from physical object to vector of data, is a general and oft-repeated process in the history of technology, where new inventions begin in an early experimental phase in which they are treated and behave as singular individual things, but then evolve into vectors in a diffuse and regimented system as the technology advances and becomes standardised. In the early history of aviation, airplanes were just airplanes, and each time a plane landed or crashed was a singular event. Today, I am told by airline-industry insiders, if you are a billionaire interested in starting your own airline, it is far easier to procure leases for actual physical airplanes, than it is to obtain approval for a new flight route. Making the individual thing fly is not a problem; inserting it into the system of flight, getting its data relayed to the ATC towers and to flightaware.com, is.
It’s a system that is incomprehensible without the aid of computers, and in which the traditional relationship of authority between human and machine is inverted. (“Reducing humans to meat algorithms, useful only for their ability to move and follow orders, makes them easier to hire, fire, and abuse,” Bridle notes.) As with so much else in the book, it’s difficult not to read this as a metaphor for a much broader truth: we are all of us increasingly negotiating a world that makes sense only from the point of view of machines.

Mark O'Connell in The New Yorker. The Deliberate Awfulness of Social Media

To be alive and online in our time is to feel at once incensed and stultified by the onrush of information, helpless against the rising tide of bad news and worse opinions.

Any social scientist with a heart desperately wants to understand how to relieve inequality and create a more fair and equitable system. So of course there’s a desire to jump in and try to make sense of the data out there to make a difference in people’s lives. But to treat data analysis as a savior to a broken system is woefully naive. Doing so obfuscates the financial incentives of those who are building these services, the deterministic rhetoric that they use to justify their implementation, the opacity that results from having non-technical actors try to understand technical jiu-jitsu, and the stark reality of how technology is used as a political bludgeoning tool.

danah boyd at apophenia (originally on Medium). Beyond the Rhetoric of Algorithmic Solutionism

RoboCop stages the proverbial showdown between good cop and bad cop. ACAB (“All cops are bastards”), as a mantra, reveals this structure to always be reactionary. RoboCop is policing redeemed by the retention of the human element. But nowadays cybernetic police practices extend beyond the human and the locatable. What happens when we consider predictive forms of policing that have no face and thus cannot be personified? Perhaps there is a need for personification to arouse our moral indignation. “All police databases are bastards” makes no sense. We need new aesthetic and political practices to respond to new forms of power that circulate through technology and algorithmic regulation.

Jackie Wang interviewed by M. Buna at Los Angeles Review of Books. Carceral Capitalism: A Conversation with Jackie Wang

Another reason why we shouldn’t trust Google to provide us with credible, accurate and neutral information is that its main concern is advertising, not informing. That’s why we should be very worried. While public institutions such as universities, schools, libraries, archives and other memory spaces are losing state funding (the book focuses on the USA but Europe isn’t a paradise either in that respect), private corporations and their black-boxed information-sorting tools are taking over and gaining greater control over information and thus over the representation of cultural groups or individuals.

Regine Debatty at We Make Money Not Art reviews Algorithms of Oppression: How Search Engines Reinforce Racism by Dr. Safiya Umoja Noble

It seems as if you are never “hard core” enough for YouTube’s recommendation algorithm. It promotes, recommends and disseminates videos in a manner that appears to constantly up the stakes. Given its billion or so users, YouTube may be one of the most powerful radicalizing instruments of the 21st century.

Zeynep Tufekci in The New York Times. YouTube, the Great Radicalizer

The author sets the stage for her critique of corporate information control by debunking the many myths and illusions that surround the internet. She explains that, no, the Google search engine is neither neutral nor objective; yes, Google does bear some responsibility for its search results, which are not purely computer-generated; and no, Google is not a service or a public information resource like a library or a school.

Regine Debatty at We Make Money Not Art. Algorithms of Oppression: How Search Engines Reinforce Racism

The second is the levels of exploitation, not of children because they are children but of children because they are powerless. Automated reward systems like YouTube algorithms necessitate exploitation in the same way that capitalism necessitates exploitation, and if you’re someone who bristles at the second half of that equation then maybe this should be what convinces you of its truth. Exploitation is encoded into the systems we are building, making it harder to see, harder to think and explain, harder to counter and defend against. Not in a future of AI overlords and robots in the factories, but right here, now, on your screen, in your living room and in your pocket.

James Bridle at Medium. Something is wrong on the internet

Instead, I would argue, what’s disturbing here is what the content suggests about how things should be connected. The real risk would seem to be that children exposed to recommendation algorithms at an early age might begin to emulate them cognitively, learning how to think, reason, and associate based on inhuman leaps of machine logic.

Geoff Manaugh at BLDGBLOG. The Ghost of Cognition Past, or Thinking Like An Algorithm

As a typical millennial constantly glued to my phone, my virtual life has fully merged with my real life. There is no difference any more. Tinder is how I meet people, so this is my reality. It is a reality that is constantly being shaped by others – but good luck trying to find out how.

Judith Duportail in The Guardian. I asked Tinder for my data. It sent me 800 pages of my deepest, darkest secrets

The dating app knows me better than I do, but these reams of intimate information are just the tip of the iceberg. What if my data is hacked – or sold?

Getting your data out of Tinder is really hard – but it shouldn’t be

From now on, the distinguishing factor between those who win elections and those who lose them will be how a candidate uses that data to refine their machine learning algorithms and automated engagement tactics. Elections in 2018 and 2020 won’t be a contest of ideas, but a battle of automated behavior change.

Berit Anderson and Brett Horvath at Scout. The Rise of the Weaponized AI Propaganda Machine

There’s a new automated propaganda machine driving global politics. How it works and what it will mean for the future of democracy.

There’s no hiding behind algorithms anymore. The problems cannot be minimized. The machines have shown they are not up to the task of dealing with rare, breaking news events, and it is unlikely that they will be in the near future. More humans must be added to the decision-making process, and the sooner the better.

Alexis Madrigal at The Atlantic. Google and Facebook Failed Us

The world’s most powerful information gatekeepers neglected their duties in Las Vegas. Again.
