Digital Solutions form one component of four within our eChildhood Public Health Approach.
No single solution will, on its own, reduce children's and young people's access to online pornography; rather, it is a combination of multiple solutions that will significantly improve online safety. The online environment is complex; to ensure a positive and protected experience for children, an effective Public Health Approach requires a broad range of digital solutions to be in place nationally.
To create an understanding of digital solutions, eChildhood has adopted the term Digital Child Protection Buffers to refer to the various ways in which technology can prevent pornography harms.
The implementation of Digital Child Protection Buffers that enable flexibility and innovation, and that place responsibility in the correct hands, is essential for the best interests of child safety, health and wellbeing. A key component of this measure is to ensure that commercial providers of harmful content, the pornography industry, secure their platforms so that materials classified as prohibited are not easily accessible by children. This is already the law 'offline', and in the best interests of child protection, online legislation must be updated to reflect community expectations. With an estimated one-third of internet users under the age of 18, it is imperative to focus on the safety of online spaces.
The following is an ecosystem of Digital Child Protection Buffers that eChildhood sees as pivotal to significantly reducing minors' access to pornography:
Key stakeholder actions that support the implementation of Digital Child Protection Buffers are of vital importance.
Other essential conversations include the regulation and monitoring of other platforms (technology firms) to provide minimum standards that safeguard children (including but not limited to social media, gaming platforms, apps, live streaming and broadcasting services, and online advertising for adult content).
Age Verification is a child protection measure underpinned by legislation, designed to create a safer online experience for children and young people. Adults accessing pornographic services are required to verify that they are 18 or over using an Age Verification solution. If a child stumbles upon a pornographic website or service with Age Verification controls in place, they will not be able to see pornographic content. Age Verification solutions are provided by third-party companies, so there is no need to share personal information directly with a pornographic website. An effective Age Verification ecosystem requires user safety, security and privacy to be placed at its core. Penalties must apply to non-compliant pornography companies.
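As a loose illustration of how third-party Age Verification keeps personal data away from the pornographic website itself, the sketch below models a provider that checks a user's age and hands back only an anonymous signed token; the site validates the token but never sees the user's identity. All names here (AgeVerificationProvider, issue_token, and the year-based age check) are hypothetical simplifications, not any real AV vendor's API.

```python
import hashlib
import hmac
import secrets
from typing import Optional

class AgeVerificationProvider:
    """Hypothetical third-party Age Verification provider.

    It checks a user's age (here crudely, by birth year) and issues an
    anonymous, signed "over 18" token. The adult site only ever sees
    the token, never the user's identity.
    """

    def __init__(self):
        self._key = secrets.token_bytes(32)  # provider's signing secret

    def issue_token(self, birth_year: int, current_year: int) -> Optional[str]:
        if current_year - birth_year < 18:
            return None  # under 18: no token is issued
        nonce = secrets.token_hex(8)
        sig = hmac.new(self._key, nonce.encode(), hashlib.sha256).hexdigest()
        return f"{nonce}.{sig}"  # contains no personal information

    def token_is_valid(self, token: str) -> bool:
        try:
            nonce, sig = token.split(".")
        except ValueError:
            return False
        expected = hmac.new(self._key, nonce.encode(), hashlib.sha256).hexdigest()
        return hmac.compare_digest(sig, expected)

def site_allows_access(provider: AgeVerificationProvider,
                       token: Optional[str]) -> bool:
    """The site asks the provider to validate the token before serving content."""
    return token is not None and provider.token_is_valid(token)
```

The design point mirrors the text: because the token carries no personal information, privacy sits with the independent provider rather than the pornography company.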
Parental Controls are currently the most widely known and readily available response to prevent access to pornography. The eSafety Commissioner describes Parental Controls as software tools that allow parents to monitor and limit what children see and do online. Parental Controls can be set up to:
block children from accessing specific websites, apps or functions (such as a device's camera, or the ability to buy things);
filter different kinds of content, such as 'adult' or sexual content, or content that may promote self-harm, eating disorders, violence, drugs, gambling, racism and terrorism;
monitor children's use of connected devices, with reports on the sites they visit and the apps they use, how often and for how long; and
set time limits, blocking access after a set time.
Parental Controls also include Child Safe Settings: options developed by technology platforms (e.g. social media, search engines, tube sites) that enable parents and carers to 'soft' filter content on the respective platforms. Generally targeted at parents of children aged 0-12, these platform settings reduce access to pornography and other 'adult' content without requiring a third-party filter. When parents are aware of this feature, Child Safe Settings can be a useful option and are available online globally. More can be learned on the eSafety Commissioner website and in the 2019 eChildhood Report.
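The monitoring-and-limiting functions described above amount to a rule set evaluated against each request a child makes. A minimal sketch follows, with invented category names, site names and limits; it reflects the general shape of such tools, not any real parental-control product.

```python
from datetime import time

# Hypothetical parental-control profile: blocked categories, blocked
# sites, and a daily cut-off time, in the spirit of the capabilities
# the eSafety Commissioner describes.
PROFILE = {
    "blocked_categories": {"adult", "gambling", "self-harm"},
    "blocked_sites": {"example-tube-site.com"},
    "cutoff": time(20, 30),  # no access after 8:30 pm
}

def request_allowed(site: str, category: str, now: time,
                    profile: dict = PROFILE) -> bool:
    """Return True if a child's request passes the parental-control rules."""
    if now >= profile["cutoff"]:
        return False                      # time limit reached
    if site in profile["blocked_sites"]:
        return False                      # explicit site block
    if category in profile["blocked_categories"]:
        return False                      # category filter
    return True
```

In practice such tools also log each decision to produce the usage reports mentioned above; the rule evaluation itself stays this simple.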
Public Friendly WiFi is a helpful buffer that ensures safety standards for venues or precincts offering free public WiFi. Eligible entities may include public libraries and council spaces; chain stores; individual businesses; workplaces; transport companies; and ISPs and secondary providers of WiFi products and services. Through an accreditation process, WiFi providers must provide evidence that the service they offer is safe for use within a public space. Upon accreditation, entities receive a 'stamp of approval' to display for public consumer confidence. Public Friendly WiFi is available in Australia, with the accreditation process undertaken by an independent accreditor; Digital Friendly WiFi is available in partnership with eChildhood.
Safety by Design Principles are an initiative by the eSafety Commissioner that places the safety and rights of users, with a focus on children and young people as consumers, at the centre of the design, development and deployment of online products and services. Developed in 2019 and gaining international recognition, Safety by Design is a set of principles: guidelines that provide a model to assess, review and embed user safety into online services. Because SbD is neither regulated nor legislated, developers may choose not to implement the principles. An opportunity exists for the eSafety Commissioner to elevate the SbD principles into Codes of Practice, and for consumers to be made aware of the companies that underpin their products and services with SbD best practice.
Network Restrictions incorporate filtering and device restrictions applied at the network level: for example, Internet Service Provider (ISP) level filters, mobile device restrictions, child-safe SIM cards and child-safe phones. Network Restrictions are likely to innovate and adapt to shifting consumer needs.
Opt-In: allows individual users to 'Opt-In' to ISP-level filtering of harmful websites, including pornography (prohibited content).
Opt-Out: provides blocking by default, while allowing individual age-verified users to 'Opt-Out' of blocking in order to access content that is classified as illegal for minors under the Classification Scheme.
Forced Blocks: with Age Verification implemented, forced blocks can be enacted when porn companies fail to implement AV mechanisms. Forced Blocks differ from Opt-Out or Opt-In filters in that they are not a system-wide approach; instead, individual non-compliant websites are targeted at the ISP level.
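The three ISP-level mechanisms above differ mainly in their defaults and who may override them. Loosely, as a sketch (the function, flag names and example domains are all invented for illustration):

```python
# Hypothetical ISP-level access decision for one subscriber and one site,
# covering the three mechanisms from the text:
#   Opt-In:       filtering off by default; the user requests it.
#   Opt-Out:      filtering on by default; an age-verified adult may disable it.
#   Forced Block: a non-compliant porn site is blocked for every subscriber.

FORCED_BLOCKLIST = {"non-compliant-porn.example"}  # sites without AV in place

def isp_allows(site: str, site_is_prohibited: bool, mode: str,
               user_opted_in: bool = False,
               user_opted_out: bool = False,
               user_age_verified: bool = False) -> bool:
    if site in FORCED_BLOCKLIST:
        return False  # forced block applies regardless of user settings
    if not site_is_prohibited:
        return True   # ordinary content is never filtered
    if mode == "opt-in":
        return not user_opted_in                     # blocked only on request
    if mode == "opt-out":
        return user_age_verified and user_opted_out  # blocked by default
    return False      # unknown mode: fail closed
```

Note the asymmetry the text describes: under Opt-In a household must act to gain protection, while under Opt-Out protection is the default and only an age-verified adult can remove it.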
Mobile Device Restrictions: this buffer requires every mobile phone number to be an account in its own right, irrespective of the device or handset the SIM is installed in. Every account is assumed to belong to a child; therefore, access to content on the prohibited URL list is restricted unless and until the account holder completes an Age Verification process. This measure is implemented in the UK, where it relies upon a self-regulated Mobile Network Code of Practice; it is not available in Australia and would be best supported through legislation.
The implementation of an Age Verification Ecosystem underpins the success of a number of different methods. In conjunction with appropriate initiatives such as parental controls, education programs, and therapeutic solutions, Age Verification is an imperative “cog in the wheel” for creating a truly comprehensive ecosystem of public health protection for Australian children. Implementation of an Age Verification ecosystem should sit within robust updated legislation, underpinned by safety, security and privacy. eChildhood recommends that good government policy include the implementation of such a system, no matter where content is hosted, to restrict minors’ access to all commercial porn websites. This measure will result in the vast majority of children being protected from online pornography harms because only adults will be able to view this content.
That said, it is recognised that children are exposed to pornography on other platforms (technology firms) including but not limited to social media, gaming platforms, apps, live streaming and broadcasting services, and online advertising for adult content. These areas must also be addressed.
If technology is a component of facilitating this problem, then it must also be a component of solving this problem.
Detailed in the 2019 eChildhood Report, KIDS AND PORNOGRAPHY IN AUSTRALIA: Mobilising a Public Health Response, the eChildhood Public Health Approach presents a positive framework to be enacted in consultation with key stakeholders and supporters.
To see Digital Solutions implemented in Australia, we invite you to join eChildhood to build a connected community.
ORGANISATIONS: Please join the eChildhood Protection Coalition.
COMMUNITY MEMBERS: Please join the movement and take a stand.
We also invite you to consider how you can support our work financially. All gifts are tax-deductible.
every child in Australia deserves a porn-free childhood