Instagram Stories shows X-rated content to users not searching for it

Instagram's Stories function has made it easier for users to stumble accidentally on inappropriate content through search.

Instagram Stories: automatic suggestions can show inappropriate content

Campaign spotted X-rated Stories appearing as suggestions on Instagram’s search page last week. 

Tapping the magnifying search icon on Instagram’s home feed takes the user into a new feed of suggested content, called Explore.

This shows a variety of suggested images, videos and, now, Live Stories from accounts the user does not follow.

In this instance, the majority of the videos and images displayed were appropriate. But the suggested Stories came exclusively from X-rated accounts.

While pornographic content is an unavoidable aspect of the internet, the difference with Stories is how easy it is to come across inappropriate images even when the user is not looking for them.

Other users have also reported issues.


Instagram introduced Stories in August as an answer to a popular Snapchat feature that is also called Stories.

Instagram’s version allows users to weave together video or a series of images into a narrative of their day. Like Snapchat’s version, this disappears after 24 hours. And if a user’s profile is public, that Story can appear as a suggestion in search.

It is this latter part that means X-rated content might be harder to find on Snapchat. A user can view a person’s Snapchat Stories only if they are already following them, meaning it is more difficult to stumble on inappropriate images by accident.

Campaign understands Live Stories are suggested based on a user’s interests, and what is already popular on the service. It will only show users content from accounts they do not already follow.

When it comes to inappropriate content, Instagram uses a mix of manual and algorithmic filtering to screen out spam, pornographic images and pictures portraying violence and self-harm, among other things. The service also largely relies on its community to report inappropriate content for removal. Campaign found that the X-rated content was eventually removed from Instagram’s search page.

In this particular instance, the X-rated images did not technically violate Instagram’s terms, since they did not show actual nudity. There is also no suggestion that inappropriate Stories would show up against brand advertising.

Instagram declined to comment.
