Facebook spends more energy mending its broken public image than fixing what’s happening on its platform and inside the company. Take a look below to dive deeper into Facebook’s problems and our demands.
Get new leadership
When Facebook’s data scientists presented Mark Zuckerberg with “potential changes to curb the tendency of the overhauled algorithm to reward outrage and lies,” the CEO resisted the proposed fixes because “he was worried they might hurt the company’s other objective—making users engage more with Facebook.” Facebook’s leadership and its profit-over-people model have failed to keep people safe online and offline.
What we want:
- Mark Zuckerberg steps down as CEO
Prioritize data privacy
In 2020, Facebook made 98% of its revenue from advertising, and it will do whatever it takes to keep our attention, even if that means letting disinformation and hate run rampant on the platform. The longer we are glued to Facebook or Instagram, the more they can track us and build profiles to sell to advertisers.
What we want:
- The privacy of every user on Facebook and Instagram should be protected by default and by design. Allowing people to opt out of ad tracking is not enough when the option is hard to find and the onus falls on users.
- Build systems so that the pursuit of engagement does not favor hate content, conspiracies, polarization, and disinformation. Never monetize any content of this kind.
- Halt Instagram for kids immediately.
Tackle disinformation and misinformation
By the time the pro-Trump mob stormed the Capitol on January 6th, disinformation and conspiracy theories had been building for months, fueling the violent attempted coup. This is just one example of how lies go unchecked on Facebook. On everything from Covid-19 to climate to elections, Facebook is not doing enough to stop the spread of dangerous disinformation. Profit motives should never be prioritized over creating healthy environments online.
What we want:
- Turn off algorithmic manipulation so that false content is not amplified.
- Provide public reporting and transparency on research and data about how Facebook is handling disinformation.
Provide transparency on content moderation decisions
Violence starts with words. And what is written online — especially when those with power disparage those without — affects what happens offline (and vice versa). From Charlottesville to Myanmar to Palestine, Facebook has enabled atrocious acts of violence and human rights abuses to take place.
But Facebook’s content moderation rules are a mess: vague and inconsistently applied. Recent reporting from The Wall Street Journal has shown that Facebook had millions of “elite” users who were not held to its moderation policies. And other research has shown that Black users on Facebook and Instagram are 50 percent more likely to have their accounts automatically disabled by the content moderation system. On top of this, users don’t even know what is and isn’t allowed on the platform.
What we want:
- Adopt the Change the Terms model policies and enforce them.
- Stop aiding white supremacy, gender-based violence, and human rights abuses.
- Make and enforce a policy specifically against white supremacy and ban white supremacists.
- Do not aid governments that are complicit in human rights abuses; this starts with holding people in positions of power to higher content moderation standards than others on the platform.
- Create and/or modify content moderation policies that take into account social, historical, and political context.
- Remove the newsworthiness exemption — and any other secret list of VIP users — that allows elected officials and others to bypass existing policies.
- Publish regular transparency reports detailing the impact of content moderation decision-making.
- Open up content moderation decision-making and auditing to meaningful input from community organizations that represent the Black and Brown communities most affected by Facebook’s neglect and targeting.
- Take steps to repair harm already done by:
  - Allowing individuals facing severe harassment to connect with a live Facebook employee.
  - Providing reparations for harmed communities.