Hope Corrigan
Curated from www.pcgamer.com.
Starting next week, Facebook users in the United States should notice a slight change to their doomscrolling experience. As of March 18, Meta will begin rolling out Community Notes, its fact-checking solution: a feature that lets users attach notes to posts questioning the validity of the information being shown.
According to TechCrunch, Community Notes is Meta’s version of crowdsourced debunking. It works by letting users give feedback on the accuracy of information in posts. Others can then vote on whether they found that feedback helpful and accurate, and if enough users think it’s valid, a note reflecting the community’s decision is placed under the original post.
Zuckerberg first announced the change, which is set to replace the previously implemented third-party fact-checkers, back in January. Unsurprisingly to anyone who’s ever used Facebook, those third parties just weren’t found to be effective. Not only were they too slow, they also tended to show bias. Worse yet, they were often simply incorrect, which really defeated the whole purpose.
When the fact-checker is wrong, there’s a whole new level of misinformation being spread. Rather than paying better professionals, scaling up the operation, and holding it to a rigid standard, the next best step is apparently to outsource the work to the community for free. As always, it’s worth remembering that if you’re not paying for something, you’re likely the product.
This kind of solution is by no means new. Anyone familiar with X’s version will immediately notice this looks eerily similar to what that platform implemented back in 2021, back when it was still called Twitter and I still used it. Community Notes has been X's only way of flagging and correcting misinformation, which has been helpful, but it has also had problems.
The problem with something like Community Notes is that it really depends on the community. There’s no guarantee that whoever is writing the notes actually knows what they’re talking about. Combine that with a group that shares similar views and you’re primed for an echo chamber of misinformation.
Meta has detailed some steps to minimise this, such as monitoring contributions and noting individual bias. It says it’ll put more weight on verdicts where users who’d normally have conflicting opinions agree, and that the popularity of opinion won’t default to automatically being shown as a fact.
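To make that weighting idea concrete, here's a minimal sketch, and emphatically not Meta's actual algorithm, of how a rating system can count agreement across a divide for more than agreement within one cluster. The single viewpoint score per user is a hypothetical simplification invented for illustration:

```python
def note_helpfulness(ratings):
    """Score a note from (viewpoint, helpful) pairs.

    viewpoint: a hypothetical float in [-1, 1] placing the rater on
    some opinion axis; helpful: whether they found the note helpful.
    This is an illustrative toy, not Meta's or X's real ranking.
    """
    helpful_views = [v for v, h in ratings if h]
    if not helpful_views:
        return 0.0
    # Base score: fraction of raters who found the note helpful.
    base = len(helpful_views) / len(ratings)
    # Bridging bonus: how far apart the helpful raters' viewpoints are.
    # Agreement between e.g. -0.9 and +0.8 boosts the score; unanimous
    # support from one like-minded cluster does not.
    spread = (max(helpful_views) - min(helpful_views)) / 2  # in [0, 1]
    return base * (0.5 + 0.5 * spread)

# Popularity alone isn't enough: one-sided support scores lower
# than the same number of helpful votes from across the divide.
one_sided = [(0.8, True), (0.9, True), (0.7, True)]
bridged = [(-0.9, True), (0.8, True), (0.7, True)]
```

With these toy numbers, the `bridged` note outscores the `one_sided` one despite having the same vote count, which captures the spirit of what Meta says it will do.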
Hopefully this will make for a better version of the community notes system than X’s, which is currently being evaluated by the European Commission for its effectiveness against information manipulation, especially when it comes to mitigating risks to civic discourse and electoral processes.
This could be why Meta is choosing the United States for the launch, despite the risk of experimenting in such a profitable market for the company. The US tends to be a bit more lax when it comes to these regulations, so it’s a safer place to test the waters.