A whole host of creators have filed suit in the U.S. alleging that AI companies improperly used the creators’ content to train AI programs (if you need to catch up on these lawsuits, we recommend our video blog here).  In most cases, the creators don’t know for sure whether the AI companies copied their works, although they allege that copying can be inferred based on the AI programs’ outputs.  But a new law in the EU may soon provide creators with a mechanism to find out if their works have been copied, and may provide those creators with greater protections than those afforded to creators in the U.S.

On March 13, 2024, the European Parliament approved the Artificial Intelligence Act, known as the AI Act.  Formal adoption of the AI Act is expected in early Summer 2024, with implementation spearheaded by the newly formed European AI Office.  The AI Act is one of the first major legislative frameworks in the world to emerge in response to the spread and seeming ubiquity of the relatively new generative AI technologies.  The Act aims to ensure safety and compliance with certain individual and property rights, including IP rights.

The AI Act will regulate AI programs based on the level of risk they present.  Generative AI programs that are capable of generating text, images, and other content, and that may perform any number of functions with general or specific purposes, are classified as “high risk.”  The AI Act refers to these programs as “General Purpose AI” or GPAI.  The AI Act places the most stringent obligations on developers and deployers of high-risk AI systems that are put to use in the EU, even if the developer or deployer is not actually based in the EU.  This is partly because the EU believes that application to non-EU companies whose programs will be used in the EU is “necessary to ensure a level playing field among providers of [GPAI] models where no provider should be able to gain a competitive advantage in the EU market by applying lower copyright standards than those provided in the [EU].”

Under the AI Act, GPAI model providers must:

  1. provide technical documentation, including training, testing processes, and results of evaluations;
  2. provide information and documentation to downstream providers that intend to integrate the GPAI model into their own AI systems, so that those providers understand the model’s capabilities and limitations and are able to comply with the AI Act’s requirements;
  3. establish a policy to abide by the EU Copyright Directive; and
  4. publish a detailed summary about the content used for training the GPAI model.

One goal of these provisions is to ensure that AI developers disclose whether they used material subject to copyright protection to train their AI programs.  There are some exceptions, however, including for “open license” AI models, which only have to provide disclosures if their AI programs present a “systemic risk.”

Companies required to make disclosures must prepare sufficiently technical summaries to enable IP rightsholders and others with legitimate interests to exercise and enforce their rights in the EU.  The EU AI Office will not conduct a “work-by-work” assessment to ensure that GPAI providers are abiding by copyright laws.  Instead, the Office places the onus on GPAI providers to satisfactorily educate rightsholders about their enforcement rights and responsibilities vis-à-vis the use and incorporation of their content by GPAI.

Companies must also “establish a policy to abide by the EU Copyright Directive,” which “requires the authorization of the rightholder concerned” before using any copyright-protected content “unless relevant copyright exceptions and limitations apply.”  The Directive does have some exceptions to this requirement, such as allowing reproductions of works for the purposes of text and data mining in certain limited scenarios.  Unless that text/data mining is for scientific purposes, however, the rightsholder can opt out.

If a rightsholder opts out, then under the terms of the Copyright Directive and the AI Act, companies must “identify and respect the reservations of rights expressed by rightsholders pursuant to Article 4(3) of Directive (EU) 2019/790.”  Accordingly, a GPAI provider would have to obtain express permission from a rightsholder who has opted out before proceeding with text/data mining activities that would access or utilize that rightsholder’s protected content.

The AI Act includes a carve-out for small GPAI providers, such as start-ups, to promote innovation even for those with fewer resources than large corporations.  The carve-out provides “simplified ways” for smaller providers to comply with the AI Act.  The idea is that compliance with the AI Act should not “represent an excessive cost” or “discourage the use of [GPAI] models.”

It remains to be seen what will happen once the AI Act is officially enacted, but we expect an uptick in copyright litigation and related counseling (as well as costs) in the EU as copyright owners and other rightsholders assert their rights.  We also expect that the number of potential stakeholders will increase by virtue of the ubiquity of GPAI and the ease and accessibility of content via the Internet and connected devices.  With this may come niche legal practices and novel legal issues, which will likely result in changes to the AI Act or its interpretation.  It will be crucial for IP owners and AI companies to understand their respective rights and obligations under the AI Act; otherwise, IP holders risk unfettered and unauthorized use of their creative content, while AI companies run the risk of being sued.