By Rob Pegoraro

February 8, 2020

Will YouTube’s New Privacy Rules Actually Protect Children?

Someone finally thought of the children.

By deciding to treat all viewers of videos classified as “made for kids” as children under the age of 13, YouTube should certainly stop some illegal tracking of under-13 viewers by advertisers. But the shift, which Google’s video-hosting subsidiary announced was underway January 6, may also upend the business models of many YouTube creators, even those who say they aren’t trying to appeal to kids.

The risk isn’t just the shame-by-proximity of having clips appear next to “Barney & Friends.” Creators whose videos get classified as made for kids will also be cut off from targeted advertisements, the most profitable kind, as well as from basic YouTube community-cultivation features.

And if creators don’t correctly tag their content, YouTube will backstop the new rule with machine-learning video-sorting algorithms.

“The issue here is what ‘child directed’ really means,” emailed Daisy Soderberg-Rivkin, a resident fellow on children and technology at the R Street Institute, a free-market-minded think tank in Washington, D.C. “Animated clips, crafting content, videos about antique toys, there's a lot of stuff that kids might watch that was not necessarily created for them.”

But YouTube didn’t take this step all on its own, even if its September post previewing these changes might suggest that nobody else had a hand in them. That month, YouTube agreed to pay a $170 million fine to the Federal Trade Commission and New York’s attorney general for illegally collecting personal data from children without their parents’ consent. As part of that settlement, YouTube committed to stepping up its enforcement of a law called COPPA, short for the Children’s Online Privacy Protection Act.

The 1998 statute essentially creates a parallel privacy universe for kids under 13

In contrast to the general absence of rules about the collection and use of everybody else’s data, COPPA requires sites and online services to refrain from collecting those kids’ data without parental permission. (HBO’s comedy series “Silicon Valley” provided an excellent recap of COPPA in the season 4 episode “Terms of Service,” when a video-chat startup blew off this law and found itself facing a potential $21 billion fine.)

Back in the late 1990s, the primary fear driving COPPA was not that kids would get targeted by ads but that their personal info would be scooped up by unscrupulous if not predatory sites.

A contemporary story by Larry Magid in the Los Angeles Times emphasizes that safety angle and doesn’t even mention advertising. But in 2012, the FTC revised its regulations to, among other things, ban sites from tracking children using “persistent identifiers that can recognize users over time and across different websites or online services, such as IP addresses and mobile device IDs.”

YouTube’s violation of that persistent-identifiers rule led to the FTC action and settlement in September, and then to the crackdown

"If content creators have child-directed channels on YouTube, they must comply with COPPA,” said FTC commissioner Christine Wilson at the Family Online Safety Institute’s annual conference in Washington in November. But, she added: "I am aware that there is a lot of consternation on this topic." The immediate concern among YouTube videographers is that they’ll see their incomes shrivel.

“We asked creators to go into Creator Studio and disable personalized ads (called ‘Interest-Based Ads’, under the ‘Advanced’ tab) for a few days,” Jonathan Katz and Victoria Fener wrote on TubeFilter in November. “Based on our initial testing, a video not running personalized ads sees a loss in revenue somewhere between 60% to 90%.”

A 2019 study of the economic effects of disabling ad tracking at news sites found a much smaller shortfall: only 4% when browsers such as Apple’s Safari blocked ad tracking and left sites to match ads with their context instead of with behavioral data. But YouTube is a different market than news sites, and its own widely used apps don’t block ad tracking.

Jeremy Johnston, co-host of the YouTube channel J House Vlogs, said in an email that this potential shortfall represents his primary concern for his channel, in which he and his wife Kendra broadcast episodes of their life in Puerto Rico with their five kids to 1.99 million subscribers. “I love that viewers enjoy our content for free on YouTube, and we're able to make a living creating the content,” he said. “YouTube's use of personalized ads have been the mechanism that makes this transaction possible.”

But he’s also worried about being locked out of YouTube’s community features. Yes, even the comments.

“Losing comments is a big deal,” Johnston wrote. “We have spent hours responding to comments and engaging with viewers.”

YouTube’s new rules for content that appears to be made for kids will also disable recommendation tools and polls on Johnston’s channel. Google publicist Ivy Choi pointed to a video YouTube posted Dec. 17, 2019, to answer creators’ questions, in which YouTube head of family partnerships Lauren Glaubach explained that the site had to deactivate those options on made-for-kids videos because “these features rely on user data.” Johnston added that he doesn’t know how many of his viewers might be under 13, because YouTube’s analytics only cover users above that age.

At a January event in Washington hosted by the libertarian group TechFreedom, multiple YouTube creators complained about being held accountable for the collection of children’s data when it’s Google that scoops up that data and then shares only vague outlines with them.

“We're practically just getting pie charts,” said Jackie NerdEcrafter, who hosts a YouTube channel about craft projects from Quebec. “We don't have any deep-dive information or personal information on anyone at all.” She also complained about the vagueness of YouTube’s criteria for what makes a video kid-directed. In an email afterward, she added that YouTube’s algorithm had erred in classifying her own output. “YouTube had designated 2 of my videos as ‘directed to kids’ even though both of these videos were reviews for parents/people with buying power to make informed decisions,” she wrote. “The reason these videos were flagged, I believe, is probably due to the brand name Crayola being the focus of the review.”

As with many social platforms, the lingering problem can be the difficulty of knowing how to navigate the new rules

As NerdEcrafter put it: “As creators, we are not told WHY a video is flagged, so we are left guessing.”

Google’s clearest guidance may be in a video posted Nov. 12, in which Glaubach outlines a list of factors that could make a video “made for kids” (for example, “it uses language that is meant for children to understand” and “it includes activities that appeal to children”) and suggests that creators in doubt should “talk to a lawyer.” Glaubach also notes that YouTube’s machine-learning classification systems may override creators’ designations of videos as not made for kids “if we detect error or abuse,” in which case creators should avail themselves of YouTube’s send-feedback button.

That challenge may only get tougher if COPPA’s age limit gets raised from 13 to 16, a key provision of a bill introduced Jan. 9 and a move that Rep. Jan Schakowsky (D-Ill.) defended at the State of the Net policy conference in Washington Jan. 28, saying, “I want to shift the burden from the consumer.” Soderberg-Rivkin and Johnston argued that the more effective remedy for the problem of kids being tracked as adults is Google’s YouTube Kids app, which blocks that tracking and features only family-safe content (although its human and machine filters have sometimes struggled to keep that content family-safe).

Johnston said he also pays for YouTube Premium, an ad-free option, “for convenience and time saving,” not out of concern over ad tracking, which he said he accepts as part of getting free entertainment online. “We don't ever let our kids just watch YouTube unsupervised or without a specific purpose,” he said.

The head of a Washington kids-advocacy non-profit that backs the FTC’s actions and YouTube’s crackdown made the same point.

"The fact that there's no longer the comments or the chats or some of the other things that can really veer kids in the wrong direction is a positive step,” said James Steyer, founder, and CEO of Common Sense, in an email sent by a publicist.

“But parents should still monitor their kids' usage of YouTube, both the amount of time they spend on it and also what they watch on it.”

The FTC, however, can’t mandate that parents stop letting their kids browse YouTube at random. It can’t issue orders to customers, only to companies. So that’s what it’s doing.