
If you go on Facebook, Instagram or Messenger, you might come across advertisements selling illegal drugs, counterfeit Australian banknotes, cloned credit cards, guns or, in one case, a monkey.

And Meta, the owner of these platforms, makes money each time it shows these ads to you — despite its own rules prohibiting them.

The company’s public database of advertisements, the Meta Ad Library, shows dozens of these ads from the past month that advertisers have paid to promote to Australian Facebook, Instagram and Messenger users.

Several ads promoting the sale of what appear to be illicit drugs (Image: META AD LIBRARY)

Meta’s advertising platform allows advertisers to pay to target the company’s billions of users with messages, using demographics and other user data to “find the people most likely to click”. Meta’s rules prohibit using the company’s advertising services to sell illegal products or services. The company’s website says advertisements are reviewed “primarily” by automated computer systems before they run.

These advertisements, first reported by 404 Media, openly offer illegal goods for sale. They use text and images that clearly depict their wares, such as what appear to be guns and drugs, alongside captions mentioning “counterfeit banknotes” or “mushrooms”.

One advertisement even offers a capuchin monkey “ready to be taken by any interested family” and links to another social media account explicitly offering the monkey for sale.


While some advertisements appear to be global, or at least not shown solely to Australian users, others clearly target Australians, with account names like “Mushrooms Store Australia” or images of purportedly fake Australian currency.

Many of the advertising accounts appear to have few or no followers, suggesting they have been created just to use Meta’s advertising platform.

The advertisements direct users to Telegram, a messaging and social media platform with almost no moderation. The linked Telegram channels include prices and additional pictures, and offer users a way to contact the seller directly.

Images shared on a Telegram channel promoted on Meta’s platforms (Image: TELEGRAM)

Crikey is unable to confirm whether these sellers are legitimate or are scamming users. One Telegram user responded to Crikey’s initial inquiries by confirming they were selling to people in Australia. The people behind these accounts frequently post screenshots purporting to show happy customers as proof but, again, it is unclear whether these are real.

A purported chat log shared by one account promoted on Meta (Image: Supplied)

After Crikey sent three examples of the advertisements to Meta, a Meta spokesperson said the company had removed the ads and deleted the Facebook accounts behind them.

“We strongly encourage people to report items that may breach our rules so we can review and take the appropriate action,” they said.

Advertisements that Crikey did not send to Meta remain active, and new ones have been approved since Crikey first raised the problem with the company.

Deakin University senior lecturer Dr James Martin, who has researched the illicit online drug trade and cybercrime, said he expected at least some of the sellers to be legitimate. 

“I’d be sceptical of some of these advertisements, like those selling guns, but people have been selling illicit drugs on clear web platforms like Facebook since there have been clear web platforms,” he told Crikey.

Even if some are scams, Martin said, the existence of these advertisements on Meta’s platforms shows that the company is failing to protect its users.

While social media platforms like Facebook, Instagram and Messenger, as well as Snapchat and Grindr, have long been used to advertise drug sales, the use of Meta’s advertising platform, and the money advertisers pay the company, means Meta bears a greater responsibility, Martin said.

“These companies say that artificial intelligence can’t protect us from bad actors or criminals. No, it’s your business model that isn’t protecting us. If you had humans doing it, it wouldn’t be a problem,” he said.