For community platforms, the DSA introduces numerous changes and new obligations designed to better protect users against harmful behaviour and illegal content. In this blog post, we provide an overview of the most important DSA provisions - from the perspective of a community platform.
Reporting and remediation procedures (Articles 12 and 16)
One of the most significant changes is the obligation to establish user-friendly reporting and redress procedures. Platforms must provide mechanisms for users to easily and effectively report illegal content. This requires platforms to invest in intuitive reporting tools and ensure timely processing of reports.
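As a rough illustration, a notice-and-action intake can start out as little more than a structured report plus a review queue. The TypeScript sketch below is our own simplification; the field names are assumptions, loosely following the elements Article 16 expects a notice to contain (the location of the content, an explanation, contact details and a good-faith confirmation).

```typescript
// Hypothetical notice-and-action intake; field names are our own simplification.
interface IllegalContentNotice {
  contentUrl: string;          // exact electronic location of the reported content
  explanation: string;         // why the notifier considers the content illegal
  notifierEmail?: string;      // contact details (not always required)
  goodFaithConfirmed: boolean; // notifier confirms the report is accurate and made in good faith
  submittedAt: Date;
}

const reviewQueue: IllegalContentNotice[] = [];

// Validate the minimum fields and queue the notice for moderation.
function submitNotice(notice: IllegalContentNotice): { accepted: boolean; reason?: string } {
  if (!notice.contentUrl || !notice.explanation) {
    return { accepted: false, reason: "Content URL and explanation are required" };
  }
  reviewQueue.push(notice);
  return { accepted: true };
}
```

In practice, the platform would also confirm receipt to the notifier and inform them of the decision taken on their report.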
Obligations to remove illegal content (Article 14)
Hosting services must act expeditiously to remove illegal content or disable access to it once they become aware of it. This requires efficient moderation tools and processes that enable the rapid identification and removal of such content.
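What "efficient processes" look like will differ from platform to platform, but one common building block is an auditable takedown step that records why a piece of content was acted on, so the decision can later be explained and contested. The sketch below is a minimal, assumed data model, not a prescribed implementation.

```typescript
// Minimal, assumed takedown workflow: every action is logged together with its ground.
type ModerationAction = "remove" | "disable_access";

interface ModerationDecision {
  contentId: string;
  action: ModerationAction;
  ground: string;              // law or terms-of-service clause relied on
  decidedAt: Date;
}

const decisionLog: ModerationDecision[] = [];

function takeDown(contentId: string, action: ModerationAction, ground: string): ModerationDecision {
  const decision: ModerationDecision = { contentId, action, ground, decidedAt: new Date() };
  decisionLog.push(decision);
  // In a real system, this is where the content would actually be removed or blocked.
  return decision;
}
```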
Notification obligations (Article 17)
When content is removed or blocked, the affected users must be informed. This notification must be transparent and comprehensible, and it must give users the opportunity to contest the decision. This effectively rules out shadow banning - a method many platforms have been keen to use up to now.
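A sketch of what such a "statement of reasons" might look like in code is shown below. The structure is an assumption on our part, but it mirrors the kind of information Article 17 expects: what was decided, on what factual and legal (or contractual) grounds, whether automation was involved, and which redress options are available.

```typescript
// Assumed shape of a statement of reasons sent to the affected user.
interface StatementOfReasons {
  contentId: string;
  decision: "removed" | "access_disabled" | "account_suspended";
  factsAndCircumstances: string; // what was found and why it was actionable
  ground: string;                // legal provision or terms-of-service clause relied on
  automatedMeansUsed: boolean;   // whether automated detection or decision-making was involved
  redressOptions: string[];      // e.g. internal complaint, out-of-court dispute settlement, court
}

// Assemble the notification text; delivery (e-mail, in-app message) is left out here.
function buildUserNotification(statement: StatementOfReasons): string {
  return [
    `Your content ${statement.contentId} was ${statement.decision.replace("_", " ")}.`,
    `Facts and circumstances: ${statement.factsAndCircumstances}`,
    `Ground for the decision: ${statement.ground}`,
    `Automated means used: ${statement.automatedMeansUsed ? "yes" : "no"}`,
    `You can contest this decision via: ${statement.redressOptions.join(", ")}`,
  ].join("\n");
}
```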
The following points concern very large online platforms (VLOPs) and very large online search engines (VLOSEs) in particular:
Creating an Internal Complaints System (Article 20)
In addition, users must have access to an effective internal complaint-handling system once a moderation decision has been made. This gives users a direct and easy way to lodge complaints against decisions such as the removal of content for violating the platform's terms and conditions.
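As a minimal sketch (with our own naming, not a prescribed API), such a complaint-handling flow boils down to letting a user contest a specific decision and having a qualified reviewer uphold or reverse it:

```typescript
import { randomUUID } from "node:crypto";

interface Complaint {
  complaintId: string;
  decisionId: string;   // the moderation decision being contested
  userId: string;
  argument: string;     // why the user considers the decision wrong
  status: "open" | "upheld" | "reversed";
}

const complaints = new Map<string, Complaint>();

function lodgeComplaint(decisionId: string, userId: string, argument: string): Complaint {
  const complaint: Complaint = {
    complaintId: randomUUID(),
    decisionId,
    userId,
    argument,
    status: "open",
  };
  complaints.set(complaint.complaintId, complaint);
  return complaint;
}

// Article 20 expects complaints to be handled under the supervision of qualified staff
// and not solely by automated means, so resolution is modelled as a human action here.
function resolveComplaint(complaintId: string, outcome: "upheld" | "reversed"): void {
  const complaint = complaints.get(complaintId);
  if (!complaint) throw new Error(`Unknown complaint ${complaintId}`);
  complaint.status = outcome;
}
```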
Transparency obligations (Article 24)
Very large community platforms are now required to disclose their moderation practices. This includes publishing yearly transparency reports detailing the measures taken against illegal content and harmful behaviour. In addition, the statements of reasons for individual moderation decisions must be submitted to the DSA Transparency Database, where they are publicly accessible. This transparency allows anyone to see when and why content was removed from a platform.
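The Commission defines the actual submission format and API of the DSA Transparency Database; the sketch below only illustrates the kind of internal record a platform might keep per decision before mapping it onto that official schema. All field names here are our own assumptions.

```typescript
// Assumed internal record per moderation decision, to be mapped later onto the
// official DSA Transparency Database schema (which is defined by the Commission).
interface DecisionRecord {
  decisionId: string;
  decisionType: "removal" | "visibility_restriction" | "account_suspension";
  ground: "illegal_content" | "terms_of_service";
  category: string;            // e.g. "hate speech", "scam", "IP infringement"
  automatedDetection: boolean;
  automatedDecision: boolean;
  territorialScope: string[];  // affected member states, e.g. ["AT", "DE"]
  decidedAt: string;           // ISO 8601 timestamp
}

// Export the day's decisions as JSON for a bulk submission or an internal dashboard.
function exportDailyBatch(records: DecisionRecord[]): string {
  return JSON.stringify(records, null, 2);
}
```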
Prioritisation of trusted flaggers (Article 22)
A trusted flagger is an organisation that has been recognised as particularly reliable and competent at reporting illegal content online. Because trusted flaggers have proven experience and expertise in their field and submit accurate, well-founded reports, platforms must process their reports with priority so that illegal content can be removed or blocked more efficiently. This helps take harmful content down more quickly and makes the online community safer.
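In practice, prioritisation can be as simple as letting trusted-flagger reports jump the review queue. The sketch below is our own simplification; how a platform actually models trusted flaggers and orders its queue depends on its architecture.

```typescript
// Assumed report model: trusted-flagger notices are reviewed first, oldest first within each group.
interface Report {
  reportId: string;
  contentUrl: string;
  trustedFlagger: boolean;  // submitted by a recognised trusted flagger
  receivedAt: number;       // epoch milliseconds
}

function nextReportToReview(queue: Report[]): Report | undefined {
  return [...queue].sort((a, b) => {
    if (a.trustedFlagger !== b.trustedFlagger) {
      return a.trustedFlagger ? -1 : 1; // trusted flaggers take priority
    }
    return a.receivedAt - b.receivedAt; // otherwise first come, first served
  })[0];
}
```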
Mitigating Systemic Risks (Articles 34 and 35)
VLOPs must assess and mitigate systemic risks, including the spread of illegal content and manipulation through fake accounts. They must develop and implement specific measures to minimise these risks, such as algorithms to detect fake accounts, improved security protocols and advanced moderation technologies.
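A real fake-account detection system is far more involved, but the toy heuristic below gives an idea of the signal-scoring approach such measures often start from. The signals, weights and threshold are purely illustrative assumptions.

```typescript
// Toy risk score for fake-account detection; signals and weights are illustrative only.
interface AccountSignals {
  accountAgeDays: number;
  postsPerDay: number;
  verifiedEmail: boolean;
  duplicateContentRatio: number; // 0..1, share of posts duplicating other accounts
}

function fakeAccountRiskScore(s: AccountSignals): number {
  let score = 0;
  if (s.accountAgeDays < 7) score += 0.3;   // very new accounts are riskier
  if (s.postsPerDay > 50) score += 0.3;     // inhuman posting frequency
  if (!s.verifiedEmail) score += 0.1;
  score += 0.3 * s.duplicateContentRatio;   // copy-paste spam behaviour
  return Math.min(score, 1);
}

// Accounts above a threshold would be queued for human review rather than banned automatically.
const suspicious = fakeAccountRiskScore({
  accountAgeDays: 2,
  postsPerDay: 120,
  verifiedEmail: false,
  duplicateContentRatio: 0.8,
}) > 0.6;
```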
Penalties for Non-Compliance
Article 52 of the DSA sets out the penalties for digital service providers that fail to comply. Member states must provide for fines of up to 6% of a provider's annual worldwide turnover; a provider with an annual worldwide turnover of €200 million could therefore face a fine of up to €12 million. This substantial penalty gives the national Digital Services Coordinators and the EU Commission the means to enforce the DSA effectively.
In short: Essential actions for community platforms
In order to comply with the new obligations of the DSA, all community platforms must take these technological and organisational measures:
- User-friendly reporting and complaints mechanisms: Intuitive and accessible reporting tools that make it easy for users to report problematic content, plus an effective internal complaint-handling system for contesting moderation decisions.
- Advanced moderation tools: Development and implementation of moderation tools that detect and respond to harmful behaviour and illegal content, so that illegal content is removed or blocked as quickly as possible.
VLOPs and VLOSEs can also be fined under Article 52 and have to take further steps:
- Transparency dashboards: Creation of dashboards (including submission of moderation decisions to the DSA Transparency Database) that provide users and regulators with insight into moderation activities and actions taken.
- Risk management systems: Implementation of comprehensive risk management systems that continuously analyse threats (for example manipulation by fake accounts, bots and trolls) and suggest countermeasures.
Minimise the additional workload caused by the DSA
Many provisions of the DSA require measures that will dramatically increase your overall moderation effort, such as justifying every moderation decision, running an internal complaints system and handling prioritised reports. To keep this effort to a minimum, it is crucial to reduce hate and disinformation from the outset. Trusted Accounts offers all the necessary tools to block bots and trolls that spread disinformation and hate while retaining real users on your platform, thereby saving you a substantial amount of moderation work.
The Digital Services Act presents community platforms with new challenges, but also offers the opportunity to strengthen user trust and ensure platform integrity through improved security and transparency measures. Ultimately, embracing DSA guidelines can position your platform as a responsible digital entity that prioritises user safety and regulatory compliance in the evolving digital landscape.
Links
https://wikimedia.brussels/the-eus-new-content-moderation-rules-community-driven-platforms/
https://www.weforum.org/agenda/2022/12/how-companies-prepare-digital-services-act/