
Our 2024 Trust and Safety Predictions 

trust and safety

Trust and safety professionals experienced their fair share (and then some) of new challenges in 2023—from the hype around generative AI to the moderation scandals and governance changes that online platform leaders faced. By joining forces, the trust and safety community overcame some of them, making important improvements to online safety and raising awareness among platforms and the general public alike. Other challenges have had severe impacts, including a recalibration of the extent to which content is moderated.

Based on our observation of the industry, we identified four trends that should shape the new year. More than ever, the trust and safety community will play a huge part in supporting civility online as we navigate these challenges.

1. Governments Will Regulate the Industry

Regulations are not new to the trust and safety industry. For a few years now, regional and national governments have been working on regulating the internet and the content that circulates on it. 2024 should see a momentous increase in these initiatives. No matter their size and sector, platforms should prepare to modify their policies to ensure compliance.

Very Large Online Platforms (VLOPs) have already been subject to new EU requirements. Since April 2023, the 17 platforms designated by the European Commission as VLOPs must comply with the Digital Services Act (DSA), a European regulation that enhances users’ protection online. Among other things, it requires platforms to develop solutions that ensure the publication of appropriate user-generated content (UGC) and the respect of all users’ rights. Beginning in February 2024, all online platforms operating in the EU will also be subject to DSA obligations.

Additional regulations continue to be activated, like the Online Safety Act in the UK. To achieve compliance, platforms will need to be prepared, innovative, and resilient. Legal, tech, and operational advisors have emerged to help platforms decrypt the new rules that will apply to them—and these advisors may be the most cost-efficient way to get there.

2. Platforms Will Reevaluate the Role of Trust and Safety

Online platforms currently face a demanding legal and social context. While governments ask platforms to take responsibility for UGC, users themselves expect to be increasingly safe online. Businesses, and more specifically VLOPs, must reevaluate the scope of the trust and safety initiatives required of them.

The responses to this context are varied—and sometimes radical. Many online leaders demonstrate a commitment to meeting expectations by reinforcing safety measures and content review. At the same time, cost pressures and a reassessment of which elements of trust and safety are a core responsibility of platforms have been impacting budgets in the industry, often leading to a reduction in these initiatives.

Whatever direction they ultimately take, VLOPs’ decisions about their investment in trust and safety will set the tone for online players in general, either favoring the development of safe spaces or pushing environments towards increasingly unsafe interactions between users.

3. Global Events Will Impact Online Spaces

2024 will be a historic and important political year, with regional conflicts and national elections putting additional pressure on platforms. This year should be the biggest election year in history, with an estimated 76 elections being held in countries including the US, UK, India, Taiwan, and Russia. Strong convictions often lead to intense, potentially violent, reactions. Online platforms should expect a surge in emotive and reactive content being posted this year, including propaganda, images of violence, discriminatory content, and threats.

To face such a context and protect users, the only solution is to be well prepared. An increase in content posted online requires platforms to mobilize greater moderation capacity to identify and remove harmful and problematic content. Because this content is culturally complex, elements of this work will still fall to human moderation. Performing this work will also require additional skills, including sufficient international awareness to identify and evaluate the gravity of politically charged content all over the world. The way platforms react to this expected wave of content will have an impact on users’ safety online, and ultimately on each platform’s reputation.

Beyond public opinion, as we see governments take an active interest in how this is managed, we can expect regulators to take serious decisions in the event of failure to manage this situation. Online platforms will need to prepare for an increasing need for trust and safety resources. Finding a high number of moderators fit for the role in a short period, however, is not an easy task. Having a content moderation and trust and safety services partner can prove particularly useful in this instance.

4. Generative AI Will Continue to Develop

Last year, all eyes were on generative AI and its capacity to produce human-like content. While the hype seems to have died down, the presence and effects of generative AI online are still going strong. Users slowly integrate and accept AI-produced content, reducing their efforts to identify it as such. As a result, content moderation teams must take over in differentiating authentic content from an increasing volume of AI-generated images, videos, and text.

Part of this content is automatically identified by trained models, relieving human moderators of some of the work. However, the hardest cases will still need to be reviewed by moderators, increasing the complexity of the work they undertake. To allow moderation teams to be resilient enough to provide quality reviews in the long run, scaling and supporting them will be more critical than ever. This involves a focus on employee wellbeing from businesses, in terms of both work environments and psychological support, among other things. Protecting users will no longer be sufficient. As content to be reviewed gets increasingly complex (with AI doing more heavy lifting and human moderation needed to enforce the most complex policies), additional focus will be on platforms and their processes to ensure the care of their moderators in 2024.

Online businesses will also be under intense pressure this year. To make their platforms safe places for users, businesses can rely on the trust and safety communities that emerged all over the world in 2023 and will continue to develop in 2024. Such communities are crucial for this new industry, which craves innovation, experienced knowledge sharing, and feedback. The key to facing the challenges and uncertainties is to uplift and support the trust and safety community, to ensure it’s ready to face down everything 2024 has in store. One thing’s for sure: experts, tech providers, operational providers, and platforms all have a role to play in shaping safer online communities.

To learn more about how platforms of all sizes need to adapt to changes in the trust and safety ecosystem, watch our on-demand webinar.

Paul Danter

Head of Trust & Safety

Concentrix
