Andrew Laming and Angus Taylor (Images: AAP)

Liberal MPs faking engagement on Facebook prompts us to ask: (a) how much of Australia’s political Facebook is, well, fake? And (b) does Facebook care?

A recent whistleblower report out of the United States on Facebook’s failures to block fake engagement suggests that (a) it could be quite a bit and (b) Facebook cares about as much as the squeaking of local political wheels requires of it.

Just last February, Treasurer Josh Frydenberg was eager to engage with Facebook at the most senior levels over how much money Facebook would pay News Corp and Nine (not much, compared to Facebook revenues, it turned out). A year on, there’s notably less enthusiasm to clean up the pollution of political debate where it most directly affects Australians: in their local communities.

What does this fake engagement look like? Think: Andrew Laming’s 30-odd Facebook pages and profiles disguised as community groups in suburban Brisbane; the dozens of community pages set up by Craig Kelly’s staffer Frank Zumbo in southern Sydney; or the hilarious self-own by Angus Taylor commenting on his own page under his own name: “Fantastic. Great move. Well done Angus.”

Fake engagement through shares, likes and comments is designed to shape political discussion by tricking the algorithm into promoting inauthentic posts, while astroturfed pages like Zumbo’s “Destination Sutherland Shire” corral local community groups inside the Liberal Party information network.

Facebook calls it “coordinated inauthentic behaviour”: “Coordinated efforts to manipulate public debate for a strategic goal where fake accounts are central to the operation.”

The platform is shocked: “We apply the broadest enforcement measures including the removal of every on-platform property connected to the operation itself and the people and organisations behind it.”

Well, not that shocked: in a series of reports in The Guardian last week, former Facebook data scientist Sophie Zhang (hired by the company in 2018 to address fake engagement) said that political actors were using multiple pages to game the system through a loophole — although users can only have one personal (and authentic) account, they can create multiple pages and use them to fake engagement. (Zhang called it “page abuse”.)

How the mighty is weak

Facebook’s response to criticism goes something like this: we’ve built an algorithm that’s powerful enough to micro-target you with, say, those shoes you just can’t resist buying, but too weak to identify and weed out fake news, sock-puppets and hate speech.

Instead, the platform relies on poorly paid moderators to scoop the froth off the polluted waters of social media.

Facebook’s response is complaint-driven rather than built into its algorithm. According to the whistleblower, complaints are prioritised thus: is the country too small to worry about? Is the political party involved too significant to be challenged?

The fight over the news bargaining code showed Australia sits in an odd position: not small enough to be ignored, not big enough to be truly threatening. Look instead at how Facebook responds to the ruling Bharatiya Janata Party (BJP) in India.

Last year, The Wall Street Journal reported that Facebook deliberately ran soft on the BJP, including allowing anti-Muslim posts by a BJP MP even though they had been flagged for promoting violence. Last week, The Guardian similarly reported that the company dragged its feet over dealing with a network of fake accounts used to puff a BJP leader.

In the week before Easter, Facebook’s VP for Global Affairs and Communications (and former UK deputy prime minister) Nick Clegg took sloughing off responsibility a step further: it’s not us, he wrote in a TL;DR Medium post, it’s you. “Content ranking is a dynamic partnership between people and algorithms. On Facebook, it takes two to tango.”

Facebook is handing off responsibility to its Oversight Board, first proposed in late 2018 in response to complaints from the US right that too much (and, specifically, too much conservative) content was being excluded. (Its decision on whether former president Trump should be allowed back on the platform is expected any day.)

Now, the board’s remit has been extended to complaints about material that hasn’t been blocked, but perhaps should be. Further, the board wants to review the algorithm itself.

But it’s engagement — fake and real — ginned up by the algorithm that drives Facebook’s increasing advertising revenues. What chance of risking that?