#facebook advertisers – @dragoni on Tumblr

DragonI

@dragoni

"Truth is not what you want it to be; it is what it is, and you must bend to its power or live a lie", Miyamoto Musashi

Facebook and Systemic Racism #NoBloodMoney

Yet when Slate tried something similar Thursday, our ad targeting “Kill Muslimic Radicals,” “Ku-Klux-Klan,” and more than a dozen other plainly hateful groups was similarly approved. In our case, it took Facebook’s system just one minute to give the green light.
This isn’t the first time ProPublica, the investigative journalism nonprofit, has exposed shady targeting options on Facebook’s ad network. Last year, it found that Facebook allowed it to exclude certain “ethnic affinities” from a housing ad, a practice that appeared to violate federal anti-discrimination laws.
Below are some of the targeting groups Facebook allowed us to use in the ad. Many were auto-suggested by the tool itself—that is, when we typed “Kill Mus,” it asked if we wanted to use “Kill Muslim radicals” as a targeting category. The following categories were among those that appeared in its autocomplete suggestions under the option to target users by “field of study:”
  • How kill jewish
  • Killing Bitches
  • Killing Hajis
  • Pillage the women and rape the village
  • Threesome Rape

Under “school,” we found “Nazi Elementary School.” A search for “fourteen words,” a slogan used by white nationalists, prompted Facebook to suggest targeting users who had listed their “employer” as “14/88,” a neo-Nazi code. Other employers suggested by Facebook’s autocomplete tool in response to our searches:
  • Kill Muslimic Radicals
  • Killing Haji
  • Ku-Klux-Klan
  • Jew Killing Weekly Magazine
  • The school of fagget murder & assassination
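
Both reports describe the same mechanism: the targeting tool builds its suggestion list from whatever users type into free-text profile fields, with no human review before those strings become advertiser-facing categories. Here is a minimal sketch of that failure mode, assuming a naive prefix-matching autocomplete; the function names and data are illustrative assumptions, not Facebook's actual pipeline.

    from collections import Counter

    # Toy model of suggestion-building from self-reported profile fields.
    # All names and data here are illustrative, not Facebook's code.

    def build_categories(profile_entries, min_users=1):
        """Aggregate free-text entries ("field of study", "employer")
        into suggestible targeting categories with audience counts."""
        counts = Counter(e.strip().lower() for e in profile_entries)
        # Note the missing step: nothing filters hateful strings here,
        # so any text enough users typed becomes a category.
        return {text: n for text, n in counts.items() if n >= min_users}

    def autocomplete(categories, typed_prefix):
        """Return categories matching the advertiser's typed prefix,
        largest estimated audience first."""
        p = typed_prefix.lower()
        hits = [(t, n) for t, n in categories.items() if t.startswith(p)]
        return sorted(hits, key=lambda pair: -pair[1])

    # Illustrative self-reported "field of study" strings.
    entries = ["Mathematics", "History", "Jew hater", "jew hater", "Music"]
    categories = build_categories(entries)
    print(autocomplete(categories, "jew h"))   # -> [('jew hater', 2)]

The takeaway from the sketch: once suggestions are derived from raw user input, any string that enough people typed will surface as a targeting option unless a moderation step filters it first.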

How much money did Facebook earn from racist ads?

Not surprised that the FB Nation has racists. Facebook needs to add humans to monitor and approve ads. #NoBloodMoney

Until this week, when we asked Facebook about it, the world’s largest social network enabled advertisers to direct their pitches to the news feeds of almost 2,300 people who expressed interest in the topics of “Jew hater,” “How to burn jews,” or, “History of ‘why jews ruin the world.’”
To test if these ad categories were real, we paid $30 to target those groups with three “promoted posts” — in which a ProPublica article or post was displayed in their news feeds. Facebook approved all three ads within 15 minutes.
After we contacted Facebook, it removed the anti-Semitic categories, which were created by an algorithm rather than by people.

We chose additional categories that popped up when we typed in “jew h”: “How to burn Jews,” and “History of ‘why jews ruin the world.’” Then we added a category that Facebook suggested when we typed in “Hitler”: a category called “Hitler did nothing wrong.” All were described as “fields of study.”
Facebook’s automated system told us that we still didn’t have a large enough audience to make a purchase. So we added “German Schutzstaffel,” commonly known as the Nazi SS, and the “Nazi Party,” which were both described to advertisers as groups of “employers.” Their audiences were larger: 3,194 for the SS and 2,449 for the Nazi Party.
Still, Facebook said we needed more, so we added people with an interest in the National Democratic Party of Germany, a far-right, ultranationalist political party, with its much larger audience of 194,600.
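
The flow ProPublica describes amounts to a simple gate: the tool blocks the purchase until the combined estimated audience clears some minimum. A toy version of that check follows; the threshold and the overlap-free sum are assumptions, since the article doesn't state Facebook's actual rules.

    # Toy audience-size gate. The threshold and the naive sum are
    # illustrative assumptions; the real cutoff isn't public.

    MIN_AUDIENCE = 10_000  # hypothetical minimum

    def estimated_reach(audiences):
        """Naively sum per-category estimates (real tools model overlap)."""
        return sum(audiences.values())

    def can_purchase(audiences, minimum=MIN_AUDIENCE):
        return estimated_reach(audiences) >= minimum

    # Audience sizes reported in the article.
    targets = {"German Schutzstaffel": 3_194, "Nazi Party": 2_449}
    print(can_purchase(targets))   # False: tool keeps asking for more
    targets["National Democratic Party of Germany"] = 194_600
    print(can_purchase(targets))   # True: purchase goes through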
[Screenshot: our ad buying process on the company’s advertising portal]

[Screenshot: one of our approved ads from Facebook]
A few days later, Facebook sent us the results of our campaigns. Our three ads reached 5,897 people, generating 101 clicks and 13 “engagements,” which could be a “like,” a “share,” or a comment on a post.
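
For scale, those numbers imply a click-through rate of roughly 1.7 percent. A quick calculation, using the standard rate definitions rather than anything specific to Facebook's reporting:

    # Rates implied by the reported results; standard definitions only.
    reach, clicks, engagements = 5_897, 101, 13

    print(f"Click-through rate: {clicks / reach:.2%}")       # 1.71%
    print(f"Engagement rate:    {engagements / reach:.2%}")  # 0.22%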

"If I don't pay for the product. I am the product."

Facebook's secretive advertising practices became a little more public on Monday, thanks to a leak out of the company's Australian office. This 23-page document, discovered by The Australian (paywall), details how Facebook executives promote advertising campaigns that exploit Facebook users' emotional states, and how these campaigns are aimed at users as young as 14 years old.

[Document image caption: Property of Facebook and Advertisers]

Teen Angst

According to the report, the selling point of this 2017 document is that Facebook's algorithms can determine, and allow advertisers to pinpoint, "moments when young people need a confidence boost." If that phrase isn't clear enough, Facebook's document offers a litany of teen emotional states that the company claims it can estimate based on how teens use the service, including: 
  • "worthless"
  • "insecure"
  • "defeated"
  • "anxious"
  • "silly"
  • "useless"
  • "stupid" 
  • "overwhelmed"
  • "stressed"
  • "a failure" 