EU lawmakers approve law to limit use of AI

Companies such as OpenAI that produce powerful, complex and widely used AI models will also be subject to new disclosure requirements under the law…reports Asian Lite News

European Union lawmakers on Wednesday gave final approval to a landmark law governing artificial intelligence (AI), limiting its use by businesses and organisations in Europe in areas ranging from health care decisions to policing.

The first-of-its-kind law imposes blanket bans on some “unacceptable” uses of the technology while enacting strict guardrails for other applications deemed “high-risk.”

The EU AI Act outlaws social scoring systems powered by AI and any biometric-based tools used to guess a person’s race, political leanings or sexual orientation.

It also bans the use of AI to interpret the emotions of people in schools and workplaces, as well as some types of automated profiling intended to predict a person’s likelihood of committing future crimes.

The law further outlines a separate category of “high-risk” uses of AI, particularly in education, hiring and access to government services, and imposes transparency and other obligations on them.

Companies such as OpenAI that produce powerful, complex and widely used AI models will also be subject to new disclosure requirements under the law.

It also requires all AI-generated deepfakes to be clearly labelled, targeting concerns about manipulated media that could lead to disinformation and election meddling.

The sweeping legislation, which is set to take effect in roughly two years, highlights the speed with which EU policymakers have responded to the exploding popularity of tools such as OpenAI’s ChatGPT.

The legislation approved by a plenary vote in the European Parliament this week is the result of a proposal that was first introduced in 2021, which gave lawmakers a head start when the release of ChatGPT spurred an investment boom and public frenzy. (ANI)
