A report from Axios (via The Verge) highlights how this works. Primarily, Facebook will change the “signals” it uses to highlight content in the News Feed. “We’ve also learned that some engagement signals can better indicate what posts people find more valuable than others. Based on that feedback, we’re gradually expanding some tests to put less emphasis on signals such as how likely someone is to comment on or share political content,” Facebook’s Product Management Director, Aastha Gupta, wrote in a blog post. The company acknowledges that the change may affect public affairs content and publishers’ site traffic. With that in mind, the blog post says Facebook will follow a “gradual and methodical rollout for these tests,” adding that more regions will be included over the coming months.
Facebook started testing reduced political content in News Feeds earlier this year
The social media juggernaut began testing a News Feed with reduced political content this February, though the test only covered Brazil, Canada, Indonesia, and the U.S. Not long after, Facebook said it would allow users to indicate what they don’t want to see in their News Feeds. The company also talked about emphasizing posts that inspire or encourage people.

Facebook has long been accused of providing a platform for the spread of misinformation, and the new measure is one of many steps the social media giant has taken to curb such content over the past year. The company claims that political content accounts for only 6% of the overall content in the News Feed, though it doesn’t specify how it defines political content. The idea here is to counter misinformation that slips through the cracks under the guise of breaking news; because such news is published in real time, fact-checking it is almost impossible. Facebook has been trying to cut down on political content in its News Feed since the 2020 U.S. Presidential Election.