YouTube Has A Massive Child Exploitation Problem. How Humans Train Its Search AI Is Partly Why


By Davey Alba for BuzzFeed (https://www.buzzfeed.com/news), 12/28/2017


YouTube says those content moderation responsibilities fall to other groups working across Google and YouTube. But according to Bart Selman, a Cornell University professor of artificial intelligence, although these reviewers do not directly determine what is allowed or not on YouTube, they still have considerable impact on the content users see. “Since the raters [make assessments about quality], they effectively change the ‘algorithmic reach’ of the videos,” he told BuzzFeed News.

"Even if a video is disturbing or violent, we can flag it but still have to say it’s high quality.”

“It is well-known that users rarely look beyond the first few search results and almost never look at the next page of search results,” Selman continued. “By giving a low rating to a video, a rater is effectively ‘blocking’ the video.”

“Think of it this way,” said Jana Eggers, CEO of the AI startup Nara Logics. “If a search result exists, but no one sees it, does it still exist? It's today's Schrödinger's cat. … [Ratings] impact the sort order, and that impacts how many people will see the video.”
