Today, the Oversight Board selected a case referred by Facebook regarding a post in a Group claiming hydroxychloroquine and azithromycin are a cure for COVID-19 and criticizing the French government’s response to the pandemic. The six cases selected by the Oversight Board cover some of the areas Facebook has had difficulty moderating in 2020: there are two pieces of alleged hate speech and one piece of alleged COVID-19 misinformation. The comment process will be open for 10 days, and comments will be collected in an appendix for each case; the Oversight Board specifically invites public comment on each case. The board will issue a decision on Trump’s Facebook fate within 90 days of January 21, though the verdict could come sooner. Facebook has decided whether to adopt the recommendations the Oversight Board made last month when the board decided its first cases.

Photo by Caroline Brehman-Pool / Getty Images.

In each of the experiments, the tweaks did change the kind of content users saw. In one experiment, the researchers prevented Facebook users from seeing any “reshared” posts; in another, they displayed Instagram and Facebook feeds to users in reverse chronological order instead of in an order curated by Meta’s algorithm. In a third study, published in Nature, the team reduced by one-third the number of posts Facebook users saw from “like-minded” sources, that is, people who share their political leanings. Removing reshared posts made people see far less political news and less news from untrustworthy sources, but more uncivil content. Replacing the algorithm with a chronological feed led to people seeing more untrustworthy content (because Meta’s algorithm downranks sources that repeatedly share misinformation), though it cut hateful and intolerant content almost in half. Users in the experiments also ended up spending much less time on the platforms than other users, suggesting the tweaked feeds had become less compelling.

In the past week, President Biden and Facebook have been in a war of words over vaccine misinformation. Each side took an extreme position that distracted them, and us, from a deeper issue. On Thursday, the surgeon general published a new report calling on social media platforms to make new investments. Facebook is sending notifications directly to users who like, share, and comment on posts that were removed for containing COVID-19 misinformation, Fast Company reports. Facebook has also improved its hate speech moderation using many of the same techniques it’s employing toward coronavirus-related content; AI now proactively detects 88.8 percent of the hate speech the company removes.