Instagram Stories shows X-rated content to users not searching for it

Instagram's Stories function has made it easier for users to stumble accidentally on inappropriate content through search.

Instagram Stories: automatic suggestions can show inappropriate content

Campaign spotted X-rated Stories appearing as suggestions on Instagram’s search page last week. 

Tapping the magnifying search icon on Instagram’s home feed takes the user into a new feed of suggested content, called Explore.

This shows a variety of suggested images, videos and, now, Live Stories from accounts which the user does not follow.

In this instance, the majority of videos and images being displayed were appropriate. But the suggested Stories consisted solely of X-rated accounts.

While pornographic content is an unavoidable aspect of the internet, the difference with Stories is how easy it is to come across inappropriate images even when the user is not looking for them.

Other users have also reported issues.

[Embedded tweet from @tomn94, 18 October 2016]

Instagram introduced Stories in August as an answer to a popular Snapchat feature that is also called Stories.

Instagram’s version allows users to weave together video or a series of images into a narrative of their day. Like Snapchat’s version, this disappears after 24 hours. And if a user’s profile is public, that Story can appear as a suggestion in search.

It is this latter part that means X-rated content might be harder to find on Snapchat. A user can view a person’s Snapchat Stories only if they are already following them, meaning it is more difficult to stumble on inappropriate images by accident.

Campaign understands Live Stories are suggested based on a user’s interests, and what is already popular on the service. It will only show users content from accounts they do not already follow.

When it comes to inappropriate content, Instagram uses a mix of manual and algorithmic filtering to screen out spam, pornographic images and pictures portraying violence and self-harm, among other things. The service also largely relies on its community to report inappropriate content for removal. Campaign found that the X-rated content was eventually removed from Instagram’s search page.

In this particular instance, the X-rated images did not technically violate Instagram’s terms, since they did not show actual nudity. There is also no suggestion that inappropriate Stories would show up against brand advertising.

Instagram declined to comment.