YouTube has updated the application of its live streaming policy to prevent young children from going live unless they are supervised by an adult.
In response to widespread criticism that it has not done enough to curb inappropriate content involving young children, YouTube is taking further steps to limit what children can do on its platform.
In addition to limiting the recommendation of videos that show “minors at risk,” YouTube has also updated its policies relating to minors, restricting live streaming functionality for younger children.
From now on, unless they are “clearly accompanied by an adult,” children will not be able to stream live on YouTube. The company also explained last week that “channels that do not comply with this policy may lose their ability to broadcast live.”
In addition, YouTube has announced the launch of “new classifiers (machine learning tools that help […] identify specific types of content) on […] live products” to find and remove more of this content.
The announcement comes after a New York Times report that found YouTube’s recommendation system “suggested videos of ‘prepubescent and partially clad children’ to users who had watched sexually-themed content.” According to YouTube, it has now applied several new restrictions to its algorithmic recommendation system, limiting recommendations for “tens of millions of videos” featuring minors.
YouTube says a total ban on videos with children “would hurt creators who rely on the recommendation engine to generate views.”
“Responsibility is our number one priority, and the main focus of our work is the protection of minors and families,” the company explains in the blog post. “With this update, we will be able to better identify videos that may put minors at risk and apply our protections.”
In addition, YouTube explains that most videos featuring minors “do not violate […] policies and are displayed innocently.” However, a large number of videos still do: the company claims that in the first quarter of 2019 alone, it removed more than “800,000 videos for violations of […] child safety policies,” with the majority of them removed before they had ten views.
Going further, YouTube also announced that it was already working with “law enforcement agencies to investigate crimes against children.” As the company explains in the blog post, “reports sent to the National Center for Missing and Exploited Children have prompted more than 6,000 such investigations in the past two years.”