Prevention of online child sexual exploitation (OCSE) is an enormous task that requires time, effort and, very importantly, resources. Yet the organisations tasked with running prevention and intervention programmes, and with navigating the challenging environments created by providers of digital platforms, are often the ones lacking that time and those resources. One of the first steps in prevention is enhancing the knowledge of both adults and children on matters related to child sexual exploitation. This covers a broad range of topics: knowing and understanding one’s own body, setting personal boundaries, recognising the reasons behind secret keeping, identifying manipulative techniques that perpetrators may use, and knowing which trusted adults can assist in an ambiguous situation. Such knowledge builds a foundation that then has to be contextualised for online environments. This presents a separate challenge, because most deliberations, choices and actions have to be made through the lens of the unique affordances and limitations imposed by each digital platform.
Digital skills
As the digital environment becomes more complex, children and young people need to critically understand the digital world in which they are increasingly immersed. There are several dimensions of digital skills to be obtained, each serving a different purpose and offering unique benefits. However, findings from ASEAN countries including Indonesia, the Philippines, Thailand and Vietnam show that while most young people view digital literacy as important for their future, they do not believe they possess good digital skills. When asked whether learning digital skills is part of the school curriculum, anywhere from 24% to 39% answered no. In addition, there is a gender gap in the attainment of digital skills, with UNICEF estimating that on average only 65 female youths have digital skills for every 100 male youths.
Parents and guardians play a vital role in the attainment of digital skills, especially the aspects related to processing information and to communication and interaction skills. However, in the Asia Pacific region it is common for children to be under the care of older relatives. Research from several countries shows that about a quarter of these caregivers, half of them aged 50 and above, have never used the internet and may need extra support and knowledge to help children navigate online spaces. As a result, they often resort to prohibitive tactics, trying to keep children off the internet altogether, which can prevent children from attaining valuable skills or erode trust. Parents and guardians may also feel that parental controls, often marketed as a fail-safe solution, add the desired safety buffer, despite the lack of evidence for the efficacy of such tools.
Without appropriate support from the school curriculum or from parents and guardians, children in the region are at greater risk of not obtaining the full range of relevant digital skills. As a consequence, they may lack the ability to recognise abusive interactions or the illegality of certain actions, to recover from negative online experiences, or to use the various safety and privacy features available. Inevitably, this contributes to the risk of victimisation, especially in the context of sextortion or the non-consensual sharing of intimate images.
While effort has been put into proactive detection of risky interactions, this is not the kind of issue AI alone can solve, nor can AI replace investment in accessible, localised, region-specific educational and support resources designed for different audiences.
Availability of resources
Nowadays, there is a plethora of ‘how-to’ resources offered by NGOs, governments and providers of educational services. Yet, it can be argued that nobody knows the full spectrum of tools and materials better than the companies that make online products. This places the onus on creators and owners of digital platforms to not just invest in OCSE detection capabilities but also in educational and support resources and regional partnerships.
Undoubtedly, companies have come a long way in the last 5-7 years, building up their Safety Centres, making their policies and community guidelines available to the public and providing instructions on how to use their products. Some have done a remarkable job creating tailored resources for youth, parents and educators. However, a number of aspects still stand out as needing improvement.
Resources are not always easy to find
Help and support centres are often created in iterations, by successive generations of Trust & Safety teams. As a result, they tend to be convoluted and in need of better structure. As platforms mature, build out new features and acquire new products, these pages grow bulkier with new and updated instructions on how to report or operate those features. This creates a maze of resources covering a variety of topics: navigating different parts of the products, adjusting settings, policy explanations, filing reports, preserving one’s data, and setting parental controls or time limits. Users have to sift through multiple pages to find the information relevant to them and to understand the implications of the various actions they may take while using a platform.
Resources are long and may be difficult to understand
Many of the articles, whether covering policy clauses or product features, and even the ones supposedly designed for younger audiences, are exceptionally long. While they lack the complexity of a typical Terms of Service document, they do not account for the varying comprehension abilities of adults and of children in different age groups. In addition, there is little alignment on key terms across different products and services, making it difficult to search for relevant information or to rely on knowledge obtained elsewhere.
Resources are not localised
It’s understandable that priorities for growing a global business vary by company and that strategic investment decisions such as translation and localisation can depend heavily on those priorities. Yet, while India is likely one of the biggest regional markets for many of these companies, only two platforms appear to acknowledge through their localisation efforts that the country speaks multiple languages. Meta is a remarkable outlier, with its Family Centre available in nine languages spoken in India besides English: Bengali, Gujarati, Hindi, Kannada, Malayalam, Marathi, Punjabi, Tamil and Telugu. Other platforms rarely cover the main language of a region, let alone multiple languages from the same country. Bahasa Indonesia, Thai, Filipino and Vietnamese are among those commonly absent from the list of supported languages.
Finally, if Help Centre resources include any references to external organisations or partners, these are usually global or US-based organisations such as the National Center for Missing and Exploited Children (NCMEC) or the International Association of Internet Hotlines (INHOPE). While INHOPE does offer a list of hotlines by country, it takes multiple clicks to reach that list, which is once again only available in English. This means that in a situation where safety resources need to be found fast, a user has to look through several pages of text in a foreign language that may still not provide any useful contacts.
Chapter and series conclusion
In the era of AI solutions, which are often presented as a ‘silver bullet’ despite having limitations and issues of their own, it becomes harder to petition for human resources or budgets for time-consuming projects with a murky return on investment, such as the production or localisation of help centre resources.
While this issue affects all customers of digital platforms, it puts those who are not native English speakers at a further disadvantage.
With an increasing amount of OCSE content originating from South and Southeast Asia, and evidence pointing to sustained growth of the problem, investment in preventive efforts, among other things, becomes more critical than ever.
Undoubtedly, phenomenal progress has been made in content detection, age verification and broader customer safety in the past several years. However, there remains a risk that, in the rush to implement new and allegedly more powerful AI-driven solutions, regional nuances that affect the performance of algorithms or signal vulnerabilities in ubiquitous AI implementation will be forgotten, as companies attempt to reduce investment in responsible product design and safety resources, garner good publicity or appease regulators.
Companies cannot afford to rely on a single solution, however ingenious it may seem. Bad actors behave differently and constantly invent new ways to reach their victims while remaining undetected for as long as possible. It is important to remember that CSE is a very complex societal issue, so we have to employ all tools at our disposal, from the seemingly rudimentary to the most ingenious. Only then will we stand a chance.
See other chapters in the "Combatting OCSE in Asia Pacific" series: