YouTube content management
What more could YouTube be doing to protect children from unsuitable content?
YouTube is the leader in the online video market and the second most visited website worldwide. It has a content management system that incorporates a feedback system, an algorithm for prioritizing content, and a recommendation system. Content management and filtering play a significant role when the audience includes vulnerable groups such as children - and YouTube has a lot of content for children. Parents and supervisors make the initial choice of content, after which YouTube's tools determine the list of videos shown. Google, which owns the YouTube platform, does not disclose how its management and recommendation algorithms work, yet parents are effectively delegating content control to the service.
Sergey Ananyev, a postgraduate student in Applied Management at our Auckland International Campus, undertook research into how well Google manages content for children. Supervised by Edwin Rajah, Sergey sought to identify weaknesses and opportunities for improvement, drawing on publicly available information from YouTube and other analytics services. This is what he found:
- YouTube mostly promotes non-professional content: videos created specifically for the platform and distributed exclusively or mainly on YouTube by independent studios or individual bloggers. It does not prioritize safe children's content from professional producers - videos that meet television-quality standards.
- Unlike TV, DVD or theatrical movies, YouTube has no system of classification labels or warnings to guide viewers and their supervisors on the age appropriateness of content (apart from an 18+ option, whose criteria are also unclear). Such a system would be useful: for example, what a 10-year-old can watch is not always appropriate for children aged 2 to 5. A possible filtering approach is sketched after this list.
- YouTube currently offers no option for manual moderation of recommended content, which means parents and other supervisors cannot set preferences to prioritise certain types of content or specific channels (this is also reflected in the first sketch below).
- Negative feedback from users and violations of platform standards (overlong titles, misleading or controversial thumbnails) do not significantly affect a video's popularity or whether YouTube actively promotes it. Giving positive and negative feedback - likes, comments, user complaints - more weight in the main elements of the recommendation system would help keep children safe online (see the second sketch below).
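
To make the classification-label and parental-preference ideas above more concrete, here is a minimal sketch in Python. The Video structure, the age-rating labels and the approved_channels list are illustrative assumptions, not part of YouTube's actual API; the sketch simply shows how a recommendation pipeline could filter candidate videos against a supervisor-defined profile before anything is shown to a child.

```python
from dataclasses import dataclass

# Illustrative age-rating labels, modelled loosely on TV-style classifications.
# YouTube currently exposes nothing finer-grained than an 18+ flag.
AGE_LABELS = {"2+": 2, "6+": 6, "10+": 10, "13+": 13, "18+": 18}

@dataclass
class Video:
    title: str
    channel: str
    age_label: str  # e.g. "6+" - assumed to be assigned by the producer or a reviewer

@dataclass
class ParentalProfile:
    child_age: int
    approved_channels: set[str]  # channels the supervisor has explicitly allowed

def filter_candidates(candidates: list[Video], profile: ParentalProfile) -> list[Video]:
    """Keep only videos whose age label fits the child and whose channel is approved."""
    suitable = []
    for video in candidates:
        min_age = AGE_LABELS.get(video.age_label, 18)  # unknown label -> treat as most restrictive
        if min_age <= profile.child_age and video.channel in profile.approved_channels:
            suitable.append(video)
    return suitable

# Example: a 4-year-old's profile excludes the "10+" video and the unapproved channel.
profile = ParentalProfile(child_age=4, approved_channels={"Kids Songs Studio", "Storytime TV"})
candidates = [
    Video("Counting to ten", "Kids Songs Studio", "2+"),
    Video("Science experiments", "Storytime TV", "10+"),
    Video("Unboxing compilation", "Random Channel", "2+"),
]
print([v.title for v in filter_candidates(candidates, profile)])  # ['Counting to ten']
```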
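
The last point, about giving feedback more weight, could work along the lines of the toy scoring function below. The weights and the formula are assumptions chosen purely for illustration, not a description of YouTube's actual ranking; the point is that complaints and dislikes can actively demote a video rather than being ignored.

```python
def ranking_score(likes: int, dislikes: int, comments: int, complaints: int,
                  views: int) -> float:
    """Toy ranking score in which negative signals actively demote a video.

    The weights are illustrative: complaints are penalised far more heavily
    than dislikes, so a video that violates platform standards drops out of
    the recommendations even if it attracts many views.
    """
    if views == 0:
        return 0.0
    positive = 1.0 * likes + 0.5 * comments
    negative = 2.0 * dislikes + 10.0 * complaints
    return (positive - negative) / views

# A heavily viewed video with many complaints scores below a modest but clean one.
print(ranking_score(likes=5_000, dislikes=2_000, comments=1_000, complaints=800, views=1_000_000))
print(ranking_score(likes=900, dislikes=20, comments=300, complaints=0, views=50_000))
```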