Instagram Apologizes After IGTV Recommends Horrific Videos

Instagram is an incredibly strict platform that will track you down near instantly if you show a nipple.

The new IGTV is… not quite living up to that standard at the moment.

As part of an investigative report, Business Insider monitored the new video service’s For You and Popular tabs for nearly three weeks and found the algorithm was recommending some really messed up vids — even recommending them to teens!


One, entitled “Hot Girl Follow Me,” featured a young girl reporters estimated was only 11 or 12 years old “moving to take her top off” before the video ended.

Another video featuring a scantily clad “clearly underage” girl was recommended to a dummy account set up by reporters — with a listed age of 13!

Multiple comments asked why Instagram was recommending this kind of thing, while other comments said simply, “Nice.” *shudder*

The National Society for the Prevention of Cruelty to Children reported the videos to the police as potentially illegal.

When Business Insider reported the vids to IGTV, which is owned by Facebook btw, they had to report them as “nudity or pornography” as there was no “illegal” option. The vids stayed online for 5 days and weren’t removed until Business Insider contacted IG’s press office. Ugh.


The accounts which posted the videos remained active.

Another painful video featured a penis being freed from a lug nut with an electric saw. Another showed a baby being held down by a monkey.

After encountering several other disturbing videos, BI reached out to Instagram about enforcing its standards.

IG refused to explain its recommendation algorithm, but a spokeswoman did say:

“We care deeply about keeping all of Instagram — including IGTV — a safe place for young people to get closer to the people and interests they care about.

“We have community guidelines in place to protect everyone using Instagram and have zero tolerance for anyone sharing explicit images or images of child abuse. We have removed the videos reported to us and apologize to anyone who may have seen them.

“We take measures to proactively monitor potential violations of our Community Guidelines and just like on the rest of Instagram, we encourage our community to report content that concerns them. We have a trained team of reviewers who work 24/7 to remove anything which violates our terms.”

It sounds like those measures aren’t working if the videos not only exist but are being PROMOTED by the site.
