Facebook has been having an especially rough time lately. Beyond the usual pressure from regulators, lawmakers, and FTC lawsuits, President Biden took things up a notch when he criticized Facebook for "killing people" by allowing misinformation about Covid-19 vaccines on its platform.
Biden later walked that statement back, clarifying that what he apparently meant was that people who are spreading misinformation are causing people to die because they aren't getting vaccinated. Facebook, understandably, took offense to that characterization. No one wants to be accused of killing people.
Of course, that doesn't change the fact that almost everyone agrees that Facebook has a real problem with misinformation, especially related to the pandemic. The only question is what Facebook can and should do about it.
In an interview this week with Casey Newton for The Vergecast, Mark Zuckerberg talked about just that, only it seemed like he was admitting that getting rid of all misinformation on Facebook is just too hard and that people should expect less. To make this point, Zuckerberg compared removing misinformation from Facebook to police fighting crime.
When you think about the integrity of a system like this, it's a little bit like fighting crime in a city. No one expects that you're ever going to fully solve crime in a city. The police department's goal is not to make it so that if there's any crime that happens, that you say that the police department is failing. That's not reasonable. I think, instead, what we generally expect is that the integrity systems, the police departments, if you will, will do a good job of helping to deter and catch the bad thing when it happens and keep it at a minimum, and keep driving the trend in a positive direction and be in front of other issues too. So we're going to do that here.
There is almost too much in that paragraph to tackle in one column, but I think it's safe to say that Zuckerberg's analogy breaks down when you consider that, generally speaking, the police aren't providing safe haven for the criminals, giving them a free tool to communicate with each other, and amplifying their efforts, helping them recruit more criminals.
Another point worth mentioning is that if any city had a crime problem on the scale of Facebook's content moderation problem, not only would the police chief be fired, but the mayor and city council would resign in disgrace. And cities have constrained resources with which to fight crime, and have to strike a balance with other needs, like, say, putting out literal fires. Facebook doesn't have that problem.
Facebook has, for all practical purposes, endless resources to devote to the problem. It's one of the most efficient profit-generating machines ever invented. One of the most effective ways to stop misinformation would be to stop amplifying so much of it.
The problem is that this is often the same type of content that keeps people engaged. For Facebook, engagement is the single most important fuel that drives the profit machine.
No doubt, it's a big problem. Almost 3 billion people use Facebook on a regular basis. I have no idea how much content they create each day, but at that scale it's probably next to impossible to prevent every bad piece of content.
If that's the case, and you find you've unleashed something into the world that causes harm, you have a responsibility for what happens next. Even if you aren't the one posting and sharing the content, you are accountable. But, honestly, if it's just too difficult to effectively moderate content on the platform you built, it's possible you've built something you shouldn't have.