On February 17, 2024, the Digital Services Act (DSA) took full effect, bringing new obligations for smaller platforms and hosting providers, among others. Under the DSA, services with more than 45 million active EU users must address risks associated with electoral processes, and must do so in a way that protects fundamental freedoms, particularly the right to free speech. The new law also has consequences for some online store owners.
The core of the Digital Services Act
The EU Digital Services Act (DSA) has begun to apply in European Union member states. The new regulations are expected to improve the safety of users online, support the fight against illegal content on the Internet, and clarify the rules for content moderation.
The Digital Services Act primarily concerns intermediary service providers and online platforms, including sales platforms, social networks, content-sharing platforms, app stores, and online travel and accommodation platforms. The DSA, enacted to modernize the digital space, protect consumers, and establish a transparent and secure online environment, also has implications for online store owners.
Which online stores are affected by the Digital Services Act (DSA)?
If you run an online store that allows users to add content such as comments or product reviews, the DSA applies to you.
Under the DSA, if you allow users to publish their own content in your store, you are considered a hosting provider, and the new regulations impose a number of obligations on such entities.
Are online store owners facing a revolution?
Absolutely not! In many cases the new regulations will require only minor changes to the Terms & Conditions, or creating them from scratch where none exist. Many existing stores already have procedures for reporting and moderating unauthorized content. If your store lacks tailored rules or a mechanism for reporting content that violates them, it's a call to action.
Important to note: the Digital Services Act protects online stores from liability for user-generated content, on the condition that they take appropriate action upon receiving notifications about forbidden content. Stores are not obliged to proactively monitor user content, but they are required to address and remove infringing content once it is reported to them.
Implications for online stores: what should be done?
If you own an online store that allows customers to publish their own content in the form of reviews or comments, here's what you can do to make sure your store meets DSA standards. Review and update your terms of service, privacy policy, and content moderation practices:
- Define the concept of unauthorized content and specify precisely what content is considered to be in violation of your rules.
- Set up reporting tools: develop accessible tools that allow users to report illegal content, goods, or services easily. This includes clear guidelines on how to report and what constitutes illegal content.
- Specify the rules for content moderation and the procedure to follow if you deem the reported content to be unauthorized.
- Specify the rules for handling notifications: indicate the deadline for issuing a decision on the notified content, and develop a procedure for appealing your decision on unauthorized content (a minimal sketch of this workflow follows the list).
- Run regular compliance checks: periodically review your practices and policies to ensure ongoing compliance with the DSA.
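To make the notification-handling points more concrete, below is a minimal sketch of how such a workflow could be modeled in a store's backend. It is written in TypeScript purely as an illustration: the type names, the 14-day decision deadline, and the appeal window are hypothetical choices a store might write into its terms, not values prescribed by the DSA.

```typescript
// A minimal, illustrative data model for handling notifications about unauthorized
// content. Type names, field names, and deadline values are hypothetical store
// choices; the DSA does not prescribe a schema or these exact time limits.

import { randomUUID } from "node:crypto";

type ReportStatus = "received" | "under_review" | "decided" | "appealed";

type DecisionOutcome =
  | "content_removed"     // taken down as unauthorized
  | "content_restricted"  // e.g. hidden pending clarification
  | "content_kept";       // report rejected, content deemed compliant

interface ModerationDecision {
  outcome: DecisionOutcome;
  reasons: string;        // plain-language statement of reasons sent to both parties
  decidedAt: Date;
  appealDeadline: Date;   // until when the reporter or the content's author may appeal
}

interface ContentReport {
  id: string;
  contentUrl: string;     // direct link to the reported review or comment
  category: string;       // one of the store's predefined report categories
  reporterEmail?: string;
  receivedAt: Date;
  decisionDueBy: Date;    // the deadline the store commits to in its terms
  status: ReportStatus;
  decision?: ModerationDecision;
}

// Example policy values chosen by the store (not figures taken from the DSA).
const DECISION_DEADLINE_DAYS = 14;
const APPEAL_WINDOW_DAYS = 14;

function addDays(date: Date, days: number): Date {
  return new Date(date.getTime() + days * 24 * 60 * 60 * 1000);
}

// Register an incoming notification and stamp it with the promised decision deadline.
function openReport(contentUrl: string, category: string, reporterEmail?: string): ContentReport {
  const now = new Date();
  return {
    id: randomUUID(),
    contentUrl,
    category,
    reporterEmail,
    receivedAt: now,
    decisionDueBy: addDays(now, DECISION_DEADLINE_DAYS),
    status: "received",
  };
}

// Record the moderation decision together with its reasons and the appeal window.
function recordDecision(report: ContentReport, outcome: DecisionOutcome, reasons: string): ContentReport {
  const now = new Date();
  return {
    ...report,
    status: "decided",
    decision: { outcome, reasons, decidedAt: now, appealDeadline: addDays(now, APPEAL_WINDOW_DAYS) },
  };
}
```

The point of the sketch is simply that each report carries its own decision deadline, and that every decision records its reasons and an appeal window, so the commitments made in your terms can actually be tracked.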
How to create a mechanism for reporting unauthorized content?
Creating a DSA-compliant mechanism for reporting prohibited content involves establishing a clear, accessible, and efficient system that enables users to notify you of any illegal content, goods, or services they encounter. Here's a step-by-step guide to developing such a mechanism:
- Verify your terms of service and definitions of prohibited content.
- Make these definitions easily accessible to users. Clear guidelines help users understand what can be reported and ensure the reporting tool (e.g., a report form) is used appropriately.
- Ensure that each report can be verified: add a field to the form for a link leading directly to the reported content.
- Provide predefined categories for reports (e.g., copyright infringement, hate speech) to streamline the process and facilitate quicker review by you or your team.
- Develop clear procedures for evaluating reported content and deciding on the appropriate action, such as removal, restriction, or maintaining the content if it's deemed compliant.
- Inform users about the outcome of their reports, enhancing transparency and trust in the reporting mechanism.
- Provide appeal mechanisms. Ensure that both the reporters of content and the affected parties (e.g., content creators, sellers) have the opportunity to appeal decisions made about reported content.
By following these steps, you can create a DSA-compliant mechanism that not only meets regulatory requirements but also supports a safer and more trustworthy platform for your users.
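For the intake side, here is a similarly hedged sketch of how a submitted report form could be validated before a report record is created. The category list, field names, and the example store hostname are hypothetical; the DSA requires that reports make it possible to locate and assess the content, but it does not prescribe a particular schema.

```typescript
// Illustrative validation of a user-submitted report form.
// Categories, field names, and the hostname are examples chosen by the store,
// not values mandated by the DSA.

const REPORT_CATEGORIES = [
  "copyright_infringement",
  "hate_speech",
  "counterfeit_product",
  "spam_or_scam",
  "other_illegal_content",
] as const;

type ReportCategory = (typeof REPORT_CATEGORIES)[number];

interface ReportFormInput {
  contentUrl: string;   // link leading directly to the reported content
  category: string;     // value selected from the predefined list
  explanation: string;  // why the reporter considers the content unauthorized
  reporterEmail?: string;
}

const STORE_HOSTNAME = "www.example-store.com"; // hypothetical store domain

function validateReport(input: ReportFormInput): string[] {
  const errors: string[] = [];

  // The link must point at content hosted in this store, so it can be located and reviewed.
  try {
    const url = new URL(input.contentUrl);
    if (url.hostname !== STORE_HOSTNAME) {
      errors.push("The link must point to content published in this store.");
    }
  } catch {
    errors.push("The content link is not a valid URL.");
  }

  if (!REPORT_CATEGORIES.includes(input.category as ReportCategory)) {
    errors.push("Please choose one of the predefined report categories.");
  }

  if (input.explanation.trim().length < 20) {
    errors.push("Please explain why you consider this content unauthorized (at least 20 characters).");
  }

  return errors; // an empty array means the report can be registered and queued for review
}
```

Returning a list of human-readable errors keeps the form usable for non-technical customers while still ensuring that every accepted report can be verified against the linked content.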
Conclusion
The Digital Services Act ushers in a new era of digital commerce, marked by increased responsibility, transparency, and protection for consumers. For online store owners, navigating the DSA's requirements will be crucial for compliance and for leveraging the opportunities it presents. Although the new law is not a revolution, and existing store policies in many cases already cover much of what the new regulations require, some owners of e-commerce platforms need to examine their terms of service and the procedures they have in place for managing customer-published content to ensure alignment with the updated regulations.