Why are readers drawn to sensationalist stories? Why do content providers produce them? It likely has something to do with the recommender systems that direct readers' attention to certain stories more than to others.
Time magazine has the YouTube story:
YouTube Has Been 'Actively Promoting' Videos Spreading Climate Denialism, According to New Report
"YouTube has been “actively promoting” videos containing misinformation about climate change, a report released Thursday by campaign group Avaaz claims, despite recent policy changes by the platform intended to drive users away from harmful content and conspiracy theories.
...
"The “up next” feature dictates what users watch for 70% of the time they spend on YouTube. The exact make-up of the YouTube algorithm that drives recommendations, designed to keep users on the platform for as long as possible, is a closely guarded secret. Experts say the algorithm appears to have learned that radical or outrageous content is more likely to engage viewers.
"Avaaz examined 5,537 videos retrieved by the search terms “climate change,” “global warming” and “climate manipulation,” and then the videos most likely to be suggested next by YouTube’s “up next” sidebar. For each of those search terms respectively, 8%, 16% and 21% of the top 100 related videos included by YouTube in the “up-next” feature contained information that goes against the scientific consensus on climate change – such as denying climate change is taking place, or claiming that human activity is not a cause of climate change. Avaaz claims this promotion process means YouTube is helping to spread climate denialism."
**********
The NY Times has the Instagram story:
This Is the Guy Who’s Taking Away the Likes
"Likes are the social media currency undergirding an entire influencer economy, inspiring a million Kardashian wannabes and giving many of us regular people daily endorphin hits. But lately, Mr. Mosseri has been concerned about the unanticipated consequences of Instagram as approval arbiter.
...
"Mr. Mosseri knows something about dealing with dystopian tech fallout. He came to Instagram in October 2018 after years overseeing the Facebook News Feed, an unwitting engine of fake news, inflammatory rhetoric and disinformation. He wants to avoid similar pitfalls at Instagram, which is owned by Facebook.