Tumblr has announced it will remove adult content from its platform from 17 December 2018, a move that eChildhood applauds.
As reported in the Guardian, this move will alter how the social network is used and signals a growing willingness among major media companies to restrict the content that appears on their websites.
Attempts to provide a “safe mode” for younger users have sometimes caused problems. Last month the Tumblr app was removed from Apple’s app store after its filtering system inadvertently allowed some child pornography to appear on the social network.
Tumblr’s chief executive, Jeff D’Onofrio, said that as the site has grown, its responsibilities to “different age groups, demographics, cultures, and mindsets” have increased and it has now decided to remove all adult material from 17 December.
“We spent considerable time weighing the pros and cons of expression in the community that includes adult content. In doing so, it became clear that without this content we have the opportunity to create a place where more people feel comfortable expressing themselves,” he said.
“Bottom line: There are no shortage of sites on the internet that feature adult content. We will leave it to them and focus our efforts on creating the most welcoming environment possible for our community.”
In our updated Public Health Approach, Safety by Design is an integral component of ensuring online safety for children.
With Tumblr on the front foot, we should expect to see more responses like this to children's safety from social media platforms and other tech companies. The Office of the eSafety Commissioner reports that it aims to be a global leader in developing a set of principles that embed user safety into the heart of products and services. The principles and subsequent resources will provide a blueprint for organisations, setting out ongoing commitments to drive forward real change. Along with other key influencers around the globe, there is a rising expectation on industry to prioritise the protection of users by building safety into the design of all online platforms and services, including new technologies before they are deployed.
The bottom line is that if social media sites, apps and other online platforms claim their product or service is suitable for children, they have an ethical responsibility to ensure that kids are not harmed by the content available through it. To show your support for moves such as these, join the movement and #LetKidsBeKids.