You see what you see on the internet because of algorithms. Facebook’s News Feed, Google’s Search Results, and the feeds of both LinkedIn and Twitter are all sorted algorithmically. If you use Gmail, even the emails you receive now get routed to your different inboxes algorithmically.
For digital marketers whose job is to get people to consume content online, managing these algorithms has become a foundational skill. It’s a particularly murky task considering that the criteria built into the algorithms generally remain concealed by their techno-monopolist creators.
So on September 22nd, when Facebook explicitly disclosed the types of content they demote with their News Feed algorithm, it was big news for marketers and publishers. And the news was significant for society at large too. What gets demoted and what gets broadly distributed on Facebook is at the very crux of the free speech and misinformation balancing act.
But that news was quickly overshadowed by a related, albeit more sensational, story involving content on Facebook: a former product manager at the company blew the whistle on what she saw as the corporation’s willingness to spread harmful misinformation, to the detriment of society and democracy, because it drives engagement and profits. Her solution: more regulatory oversight of Facebook and the content therein.
And hey, you know what? We find ourselves in a pandemic-exacerbated societal rut that we need to emerge from one way or another. If more regulation reduces misinformation and nudges us toward deliverance from COVID and our cultural malaise, all the better.
But really, something closer to the complete opposite is true.
Facebook’s new demotion standards, to say nothing of whatever stricter controls will likely come in the near future, might just blind us to that rarest of content that can change the world for the better at a time when we need it most.
With that in mind, here is a look at how Facebook’s newly disclosed content demotion standards might have interpreted some of history’s most remarkable content, had it been published today.
Demoted: news articles lacking transparent authorship
Per Facebook, “News articles that do not contain bylines or a staff directory with editorial staff information at the publisher level” will be demoted. So a named author, or at least a named editorial staff, is now required for content to generate maximum distribution on Facebook.
But since when is the quality of a piece bound inextricably to its authorship?
Perhaps the most viral piece of content in American history was published anonymously. It was the polemical pamphlet Common Sense, with its plainspoken language, that animated the colonial commoner’s spirit with a fiery sense of independence and revolution. Soon after, a pseudonymous Publius argued voluminously in The Federalist Papers in favor of binding those colonies together in strong nationhood by ratifying the Constitution.
Both pieces of content changed the world for the better. Neither had transparent authorship.
The gender-insightful Pride and Prejudice was published anonymously in 1813. Writing was not the business of women at the time, so surely a man wrote it. But of course, Jane Austen had authored the masterpiece, and were she publishing the exact same content today in serialized form as part of a broader anonymized social commentary site, it would be eligible for suppression, not promotion, on Facebook.
Demoted: links to domains and pages with a high click gap
This demotion standard applies when a publisher’s traffic comes overwhelmingly from Facebook relative to other sources. The idea is that if a publisher has experienced hyper-growth on Facebook but nowhere else, it is likely manipulating Facebook’s News Feed algorithm or its population of users in some unsavory way. Legitimate publishers, the thinking goes, have many proportionate streams of traffic.
But Facebook is confusing manipulation for mastery. If most of your website’s traffic comes from Facebook, it might just mean you’ve optimized distribution on the platform, not that you’ve gamed it.
Conservative media site The Daily Wire has been notoriously effective on Facebook since the publication’s founding in 2015. Its founders realized early on that, when it comes to digital content, the distribution medium is just as important as the message. So they tailored their content specifically to the new channels it would be discovered in. While legacy newspapers were stuck pushing slow-loading longform articles with unclear headlines on Facebook, The Daily Wire produced snappy news that was optimized for in-scroll direct responses. Consumption of their product grew precipitously as a result.
Comparably, back in the 16th century, a German priest named Martin Luther produced a premium piece of content that attracted viewership overwhelmingly from one channel. In his day, religious media had been distributed exclusively through local bishops networked with the pope. That channel had been uniquely hostile to any content that undermined the established orthodoxy. Just ask Jan Hus. So Luther, albeit probably accidentally, optimized his church-challenging content for decentralized mass printing and hand-to-hand transmission by enumerating those 95 Theses on single sheets of paper that anyone could discreetly slip to anyone else like a note in class. And so started the Reformation.
Demoted: fact-checked misinformation
This standard has been broadly publicized for some time now: content that is “False, Altered, or Partly False” gets demoted.
Of course, given the sheer volume of content on Facebook, artificial intelligence and machine learning are instrumental in enforcing this standard. The AI and ML are on the front lines of content moderation across the platform, and it is their responsibility to first flag posts that may contain false claims, for a human to then later review and make a final determination on.
You know, for a culture so deeply conflicted about the essence of truth, delegating this most foundational of tasks to robots seems a peculiar move. Like, has no one seen Ex Machina or Terminator? Or 2001: A Space Odyssey or The Matrix? Or read Minority Report? There is an entire genre of storytelling devoted to this most timely piece of wisdom: beware placing too much trust in the robots. And yet here we are.
And besides that, all of this assumes that there is, in fact, such a thing as an externally verifiable truth, which would conflict with the prevailing culture’s notion that each individual is entitled to his or her own personalized truth.
Those brain twisters aside, what this really boils down to is this: who gets to decide fact from fiction? Well, fortunately for us, we have dedicated institutions composed of experts responsible for making those very decisions on most subjects of public interest. That should cover us, right? Right?
History tells us things don’t always work out that way. Early in the 17th century, Galileo Galilei employed the telescope to produce evidence supporting Nicolaus Copernicus’s argument that the sun, not the earth, was at the center of the solar system. Galileo’s content was flagged as misinformation by an institutional functionary, which led to a committee of experts being assigned to review the matter. Those experts concluded that heliocentrism was a scientific falsity. Any content that suggested otherwise was banned.
Demoted: posts from broadly untrusted news publishers
From Facebook, “Content from news publishers that our community broadly rates as untrusted in on-platform surveys” will be demoted.
Here so much hinges on this most basic question: who is the “community” being surveyed? Is that population hand-selected by Facebook? Is it a population representative of the entire country or just Facebook users? Or is it something else altogether? Facebook disclosed no further details.
You can see how this would be fraught with complications for almost every publisher. Progressives find Fox News untrustworthy. So should its content be demoted? Conservatives find CNN untrustworthy. Should its content be demoted too?
What about the trustworthiness of the Russian black-market publications that sprang up behind the Iron Curtain and argued for free expression and individual rights at a time when coercive collectivism ruled the day? A community of communist comrades would have found state-issued media more trustworthy, of course. And even those citizens who harbored dissenting points of view would be reluctant to respond to surveys honestly for fear of some form of party-sanctioned retaliation. In either case, non-fiction essays from the likes of Aleksandr Solzhenitsyn would have been deemed untrustworthy and algorithmically suppressed as a result.
It’s not about the quality of information or the shielding of democracy. It’s about preserving the established order of things. The content we need most, that really moves things forward, is out there.
You just might not find it in your Facebook News Feed.