ANALYSIS | YOUTUBE KIDS
Child’s play?

YouTube is the top source of video content for kids of all ages. So why is there so little control over what they watch, asks Chris Baraniuk

LOGAN PAUL is a broadcaster with an audience of almost 16 million people, but until recently you had probably never heard of him. Paul sparked controversy earlier this month after uploading a YouTube video of himself gawking at a body in Japan’s Aokigahara forest, which is known for the high number of suicides that have occurred there. The video was viewed millions of times before it was taken down. Paul has since apologised, and YouTube has cut some of its business ties with the star, who is one of the most popular personalities on the site. It has also changed the way it monetises some videos.

Paul’s large audience of children and teenagers made the incident all the more worrying. “Logan, you’re still my hero,” said one young fan at the end of a much-viewed reaction video. But this is not the first time YouTube has faced criticism for the level of oversight it applies to content aimed at young people. Should we expect the world’s largest video platform to do more?

It is a question increasingly in need of an answer, considering that YouTube is now the number one place where kids of all ages consume video content. Nearly half of 3- to 4-year-olds in the UK watch content on the site, according to the media regulator Ofcom. Viewing figures rise with age, and 90 per cent of 12- to 15-year-olds watch YouTube videos. Among this age group, it is a better-known brand than Netflix or the UK’s largest traditional broadcasters, the BBC and ITV.

Three years ago, YouTube decided to target this growing audience more directly with a new app, YouTube Kids, offering a selection of content automatically picked from the main site. “The app makes it safer and easier for children to find videos on topics they want to explore,” declared a YouTube blog post about the app.
It hasn’t quite worked out that way. YouTube Kids is riddled with material that would never be shown on children’s TV, says UK presenter Ed Petrie, who has worked for the children’s channels Nickelodeon and CBBC. He points to a video on YouTube Kids in which a man throws a dish of boiling water into the air on a very cold day, turning it into snow. “There’s absolutely no way Nickelodeon or CBBC would ever show someone doing that without an awful lot of caveats and someone explaining it’s dangerous,” he says. This sort of video, with no profanity, nudity or other obvious red lines, is exactly the kind that an algorithm might fail to pick up.
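To see why, it helps to sketch how crude such a filter can be. The snippet below is a hypothetical illustration, not YouTube’s actual system: the blocklist, function name and example text are all invented. A video like the boiling-water stunt clears every obvious red line while remaining dangerous to imitate.

```python
# Hypothetical "red lines" check, invented for illustration only.
# It scans a video's title and transcript for obviously banned terms.
RED_LINES = {"nudity", "gore", "nsfw", "explicit"}  # stand-in blocklist

def passes_red_line_filter(title: str, transcript: str) -> bool:
    """Return True if no banned term appears in the video's text."""
    text = f"{title} {transcript}".lower()
    return not any(term in text for term in RED_LINES)

# The boiling-water stunt sails through: none of its words are on the
# list, even though a child copying it could be badly scalded.
print(passes_red_line_filter(
    "Turning boiling water into snow!",
    "watch me throw this pan of boiling water into the freezing air",
))  # True - the video would slip into YouTube Kids
```

Real moderation pipelines typically use machine-learned classifiers over video, audio and metadata rather than keyword lists, but the failure mode is the same: if nothing in the signal matches what the system was built to catch, the video passes.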
Watch the algorithm

But some have found more disturbing content on YouTube Kids in the past. In November, a widely shared essay by the writer and artist James Bridle highlighted the presence of upsetting and violent parodies of children’s cartoons on the app.

Petrie believes YouTube has neglected its responsibilities here. He argues that the site should radically change the way it monitors YouTube Kids and only provide videos that have been checked by a human moderator. “I don’t care if that means less content,” he says. “It’s just not appropriate to have software decide what’s OK for kids.”

YouTube Kids warns parents that it is not flawless. “It’s possible your child may find something that you don’t want them to watch,” says a message during the app set-up. But given we have no idea how these videos are chosen, it is hard for parents to know how much they should worry (see “What should parents do?”, below).

Google, which owns YouTube, declined to answer specific questions put to it by New Scientist about how its algorithms select content for children, or how it responds to videos that have been flagged as inappropriate.
“We work to make the videos in YouTube Kids as family-friendly as possible and take feedback very seriously,” says a Google spokesperson. “Flagged videos are manually reviewed 24/7 and any videos that don’t belong in the app are removed.”

Opaque, algorithmic decision-making is concerning wherever it occurs, but Google’s lack of transparency when it comes to children is particularly problematic, says Sonja Jutte of the NSPCC, a UK children’s charity. “[Self-regulation] has clearly failed to protect children from inappropriate content and behaviours,” she says.

YouTube’s challenge is to balance an unprecedented volume of content with reasonable standards. More than 400 hours of footage is uploaded to the main site every minute. Last month, Google announced it would hire thousands more human moderators in an effort to better police hate speech, misinformation and content that might be harmful to children. “We have welcomed that,” says Jutte. “However, that isn’t a substitute for transparency.”

Most people in the industry agree that some form of automation is the only way to moderate large sites, with algorithms acting as gatekeepers that flag unsuitable content to be checked by humans. But if we are putting the algorithms in charge, we should know how they work.

“The safest thing in the world would be to have someone look at every single thing that comes through, but in reality that wouldn’t be possible,” says Peter Maude at content-moderating firm Crisp Thinking, which works with the likes of Disney, Coca-Cola and the BBC.

A more sophisticated approach, he says, is to monitor the comments posted beneath a video. The footage itself may be deemed innocuous by an algorithm, but if people are responding with outraged comments, that could be a sign that it needs to be checked by a human, even if no-one has gone to the trouble of flagging the video as unsuitable.
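Maude’s comment-based signal is easy to sketch. The snippet below is a rough illustration under invented assumptions, not Crisp Thinking’s product: the outrage markers, threshold and function are hypothetical, and a production system would use a trained sentiment classifier rather than keyword matching.

```python
# Hypothetical sketch of comment-based flagging: if enough comments under a
# video read as outraged, queue the video for human review. The marker list
# and threshold are invented for illustration.
OUTRAGE_MARKERS = {"disgusting", "report this", "not for kids", "horrible"}

def needs_human_review(comments: list[str], threshold: float = 0.2) -> bool:
    """Flag a video when a sizeable share of its comments look outraged."""
    if not comments:
        return False  # no signal either way
    outraged = sum(
        any(marker in comment.lower() for marker in OUTRAGE_MARKERS)
        for comment in comments
    )
    return outraged / len(comments) >= threshold

comments = [
    "so funny lol",
    "This is disgusting, why is it on the kids app?",
    "Report this video, it is not for kids",
    "my daughter was really upset by this, horrible",
]
print(needs_human_review(comments))  # True -> send to a human moderator
```

The appeal of the approach is that no-one has to press the report button: the audience’s reaction is itself the flag.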
WHAT SHOULD PARENTS DO?

Parents and guardians should not be expected to police all the content their child might encounter online. “I’m getting quite tired of the parent-blaming going on, saying parents should be there every minute of the day – we never said that for comics or reading books,” says Sonia Livingstone, a psychologist at the London School of Economics who studies children’s use of digital media.

But given the disturbing videos that can be found on the web, parents may feel they have no choice. On YouTube Kids (see main article), parents can block videos or channels they don’t want children to see, disable search, or set timers for maximum allowed screen time.
In general, though, Livingstone says parental controls on such services are “very limited”. Many parents simply hope that content-blocking software will keep their children safe. But one 2017 study of more than 500 children aged 12 to 15 found no evidence that such attempts to filter material decreased the likelihood of having negative experiences online.

Livingstone says that parents shouldn’t get upset if they see their child watching something inappropriate. Instead, she suggests that it is much more productive to discuss the material and explore their child’s response to it when they are ready.
Joshua Buxbaum at WebPurify, another content-filtering firm, says that videos can be broken into a selection of frames for checking by humans. They can also be sped up to help human moderators process content more quickly. Despite this, he says the cost of checking every video safely and properly on a large site would be “astronomical”.

He’s not wrong. Paying humans just $10 an hour to review new YouTube videos at regular speed would cost more than $2 billion a year – almost half the 2017 profits of Google’s parent firm, Alphabet (see the back-of-envelope check at the end of this article).

But do kids really need access to so much content? Publishers like Netflix and the BBC manage to produce hundreds of hours of safe, child-friendly videos, more than any individual could possibly watch. YouTube, of course, doesn’t even have to pay production costs: most of its creators get paid through ad revenue sharing after the video has been made. So why can’t it ensure that YouTube Kids contains only safe content, just as its rivals do?

New Scientist put that question to Google, but it declined to comment. The answer probably lies in Google’s long-held position that it is a platform, not a publisher, and thus merely provides an opportunity for others to distribute content. “Our mission is to give everyone a voice and show them the world,” says YouTube’s about page. To define itself as a publisher might give Google an entirely new problem: accusations of censorship, because it would need to determine what content is and isn’t allowed on its services. In the messy world of adults, that’s not desirable. But for children, perhaps editorial control is exactly what is required.

For Petrie, an extra-safe approach, no matter the expense, will always be the only acceptable one when it comes to providing content for young viewers. “Someone like me sounds a bit like a fuddy-duddy,” he says. “But it’s just a case of caring about the people who consume your stuff.” ■
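For readers who want to check the $2 billion figure quoted above, the back-of-envelope arithmetic uses only numbers from this article: 400 hours of footage uploaded per minute, and reviewers paid $10 an hour while watching at normal speed.

```python
# Back-of-envelope check on the "$2 billion a year" review-cost estimate.
# The input figures come from the article; everything else is arithmetic.
hours_uploaded_per_minute = 400
minutes_per_year = 60 * 24 * 365          # 525,600 minutes in a year
hourly_wage_usd = 10

review_hours_per_year = hours_uploaded_per_minute * minutes_per_year
annual_cost_usd = review_hours_per_year * hourly_wage_usd

print(f"{review_hours_per_year:,} hours of review a year")  # 210,240,000
print(f"${annual_cost_usd / 1e9:.1f} billion a year")       # $2.1 billion
```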