At Dubit, we sometimes refer to YouTube as “Kid Google.” When young people seek information or want to learn new skills, they turn to the video platform in lieu of a text-based search.
The US Federal Trade Commission’s recent settlement with Google over YouTube COPPA violations commands the service to “develop, implement, and maintain a system that permits channel owners to identify their child-directed content on the YouTube platform so that YouTube can ensure it is complying with COPPA.” As evidenced by young people’s search habits, though, child-directed content and child-viewed content aren’t the same thing. How the agreement affects kids’ and families’ practices, and content creators’ viability, rests substantially on that distinction.
Taken at face value, the settlement is suitable and important. It’s the world’s worst-kept secret that kids under 13 are on YouTube. According to Dubit’s global Trends survey, in the US over 50% of 6- to 12-year-olds view YouTube videos daily; three-quarters do so at least weekly. The figures are similar worldwide. We can’t turn our backs and pretend that the data-mining and advertising practices that drive the platform – acceptable for ostensibly media-literate adults – are also right for kids.
The settlement affects three interdependent parties – YouTube/Google, content creators and users (kids, parents, educators, etc.). Accommodating the FTC’s terms may be easiest for Google. YouTube has already created a web-based version of its Kids app, so it’s accessible beyond mobile devices. It’s a solvable coding problem to enable those posting videos to identify child-directed content as such, turning off behaviorally targeted advertising (as the agreement demands). The platform will supplement creator disclosures with AI-driven checks.
Google will incur costs and sacrifice revenue but will take a much smaller hit than YouTubers and those who operate channels on the platform. Decreased revenue, plus the restrictions on notifications and comments, means there will likely be a winnowing of smaller creators and those less able to spend big on off-platform marketing. Content may be king, but without discoverability it’s a lonely kingdom, and notifications and comments are substantial drivers of discovery.
When it began, YouTube was hailed as a democratic and social platform, a force for diversity of voices and visions. An individual or small business with a great idea – anywhere in the world – could access an audience directly, no longer needing to jump through commissioners’ hoops to be granted one of TV’s very limited schedule slots. The internet has unlimited shelf space, so YouTube could be anything from a vanity platform to a proving ground where unusual ideas could find their communities and demonstrate appeal (some titles were picked up by “mainstream media,” like Yo Gabba Gabba, Storybots and Pancake Mountain).
But YouTube – especially once embedded in Google’s massive reach – grew at such a clip (no pun intended) that it outstripped the capacity to manage it with adequate nuance. Companies large and small rushed in with new and profitable ways to use the platform. The burgeoning size made it difficult for users to find what they wanted. Algorithms and behavioral tracking became more intense and complex, serving the dual masters of discovery and profit.
This is where the difference between child-directed and child-viewed content comes into play.
Kids, like adults, aren’t a monolithic audience. They have unique passions and personas. With YouTube, it didn’t matter if there wasn’t a cable channel for your particular interest: there was “Channel Me.” Skateboard stunts, animal videos, game cheats, make-and-do, homework help – you could and can (for now) find it all on YouTube.
Was it quality? Not all, certainly, but “quality” depends heavily on the needs of the child.
Was it “child targeted”? Well, fans don’t wait until their 13th birthday to develop enthusiasms. Kids go outside the walled garden of YouTube Kids because no gardener can anticipate and curate the infinite varieties of content that will spark the individual child. A fan will go wherever s/he needs to in pursuit of a passion.
Unquestionably, kids ignore content labeling and learn how to “game” age gates (often with parental help). In her blog post on the practice changes resulting from the settlement, YouTube CEO Susan Wojcicki spoke of working with families as well as creators to build a supportive environment in which parents can help their kids follow the rules, stay safe, and have a productive experience. One hopes that they’ll also find ways to support teachers who use YouTube videos and create playlists for the classroom.
Change has been sorely needed, but at some point, we also have to ask what we want children’s media to be – how to sustain development, production, distribution and promotion of high-quality, diverse content.
Digital media offerings (not just video) must be transparent, fair and ethical for kids and parents. At the same time, we need sustainable options for paying content creators to make the videos, games, apps and more that – clearly – kids love and parents allow. The marketplace went very quickly from willingly paying $50 for a console game to complaining about a $1.99 app; kids and parents frequently choose “freemium” versions, ad-supported options, or tolerate embedded marketing. Subscriptions are equitable, but with so many on offer, parents are quickly hitting budget overload.
The three-year, $100 million fund to support original kids’ programming will help, but there are questions. How will the creative priorities help sustain the incredible diversity that defines the platform? And is there a plan for grantees to become self-sustaining if they prove popular?
We need new, deeper thought about data gathering, as well. Some data we want to be taken and kept, because it enables personalization, customization, and setting an appropriate learning level. Some data we want applied to improving user experience, but then immediately deleted. Then, there’s data that should never be collected or shared. Parsing these categories is in equal parts a regulatory need, a design challenge, a parental responsibility, and an argument for lifelong media literacy education.
As was true with TV, children are a small part of big tech companies’ market, and so need special protection from being swept up in practices that may work for grown-ups but are inappropriate for youth. The battleship of government turns too slowly to keep pace with the speedboats of industry (one need only hear lawmakers’ questions during hearings to know this). Parents want protections available, but seldom use them. Kids are facile with emerging technologies, but we too often mistake facility for critical literacy.
So, where will we find ethical solutions that will also reward innovation and independent ideas? Without them, we risk throttling the potential of new technologies and platforms, and returning to the old days when just a few massive companies controlled the flow of content.