
European Parliament Approves the AI Act

What will EU regulation of AI-based products and services look like?

As the hype around artificial intelligence continues to grow, the European Union has taken a crucial step in the world’s first attempt at AI regulation. The European Parliament has approved the current draft of the legislation known as the AI Act.

This gives us a first look at what regulation of AI-based products and services could involve, though there are details yet to be worked out. Among the headline provisions are outright bans on the use of live facial recognition technology and predictive policing.

Facial recognition software remains legal in most of the world, with just two countries (Belgium and Luxembourg) ever having banned it. Its use is often associated with China, where it represents a critical piece of the country’s larger “social credit” project. Predictive policing is also currently being used in the United States, the United Kingdom, Denmark, Japan, China, and elsewhere, and the list looks likely to grow. It’s an approach that uses personal data (such as past convictions, location, and group affiliations) to predict future behaviour. The AI Act’s ban on these technologies was welcomed by groups such as Amnesty International but didn’t come without a fight.

Another focus of the AI Act is generative AI applications such as ChatGPT. The growing popularity of this technology was clearly front of mind for legislators. The approved draft text places obligations on these applications, including the labelling of AI-generated media and disclosure of what copyrighted material was used to train the underlying models. There is also a general requirement that every step in training an AI model abide by European law. Contravening these provisions risks deletion of the offending application or a fine of up to 7% of revenue.

This legislation arrives at a time when the conversation around AI is split between its innovative potential and the need for regulation. Lawmakers in much of the rest of the world are still unsure how such a legal framework should come together. The EU, on the other hand, has progressed at a pace that even those who welcomed regulation are wary of.

Some of those concerns are reportedly technical, with the copyrighted-materials requirement in particular standing out as “impossible” to comply with. Civil society organisations are also concerned about some of the AI Act’s current blind spots pertaining to human rights.

What’s Next?

The draft legislation is now set to be negotiated further with the European Commission and the Council, which includes the heads of state and government of European countries. Though the bloc hopes to reach an agreement by the end of the year, there’s a lot to figure out. Questions include, but are certainly not limited to:

  • How, specifically, will the AI Act be enforced?
  • What will the final list of High-Risk Systems look like?
  • What are the rules governing the interplay between the AI Act and, say, GDPR?
  • How will individuals seek redress, as the AI Act currently provides, if they determine they were harmed by in-scope services? And how would they even know they had been harmed in the first place?
