
Quick Hits

  • For the first time, a French court has ruled on the implementation of AI processes within a company, emphasizing the necessity of works council consultation even during experimental phases.
  • The Nanterre Court of Justice determined that the deployment of AI applications in a pilot phase required prior consultation with the works council, leading to the suspension of the project and a fine for the company.
  • The ruling highlights the importance for employers of carefully assessing the scope of any experimentation with AI tools to ensure compliance with consultation obligations and avoid legal penalties.

More specifically, the Nanterre Court of Justice was called upon to determine the prerogatives of the works council when AI technologies are introduced into the workplace.

In this case, a company had presented to its works council, in January 2024, a project to deploy new computer applications using artificial intelligence processes.

The works council had asked to be consulted on the matter and had formally demanded that the company open the consultation and suspend the implementation of the new tools.

The company eventually initiated the works council consultation, even though it considered that mere experimentation with AI tools did not fall within the scope of the works council consultation process.

However, the works council considered that it had not had enough time to study the project and that it had not received sufficient information about it. Because the AI applications submitted for its consultation had been implemented without waiting for its opinion, it took legal action to obtain an extension of the consultation period and the suspension of the project under penalty of a fine of €50,000 per day and per offense, as well as €10,000 in damages for infringement of its prerogatives.

On this point, it should be noted that in France, the works council, which is an elected body representing the company’s staff, has prerogatives that in some cases oblige the employer not only to inform it but also to consult it before making a final decision. The consultation process means that the works council renders an opinion on the project before any implementation. This opinion is not binding, which means the employer can deploy the project even if the works council renders a negative opinion.

However, in the absence of consultation prior to the implementation of the project, the works council may take legal action to request the opening of the consultation and the suspension of the implementation of the project under penalty. The works council may also take the position that the failure to consult it obstructs its proper functioning, which is a criminal offense.

Indeed, in application of Article L.2312-15 of the French Labor Code,

[t]he social and economic committee issues opinions and recommendations in the exercise of its consultative powers. To this end, it has sufficient time for examination and precise, written information transmitted or made available by the employer, and the employer’s reasoned response to its own observations. […] If the committee considers that it does not have sufficient information, it may refer the matter to the president of the court, who will rule on the merits of the case in an expedited procedure, so that he may order the employer to provide the missing information.

Within the area of new technologies, the prerogatives relating to consultation of the works council are numerous and varied, as it is stipulated that in companies with at least fifty employees, the works council must be:

  • informed and consulted, particularly when introducing new technologies and any significant change affecting health and safety or working conditions (Article L.2312-8 of the Labor Code);
  • informed, prior to their introduction into the company, about automated personnel management processes and any changes to them, and consulted, prior to the decision to implement them in the company, about the means or techniques enabling the monitoring of employees’ activity (Article L.2312-38 of the Labor Code); and
  • consulted where a type of processing, in particular one using new technologies, and taking into account the nature, scope, context, and purposes of the processing, is likely to result in a high risk to the rights and freedoms of natural persons, requiring the controller to carry out, prior to the processing, an assessment of the impact of the envisaged processing operations on the protection of personal data (Article 35(9) of the European Union’s General Data Protection Regulation (GDPR)).

In addition, regarding AI applications, it is worth noting that the EU’s regulation of June 13, 2024, on AI (Regulation (EU) 2024/1689) provides in its Recital 92 that in certain cases the

Regulation is without prejudice to obligations for employers to inform or to inform and consult workers or their representatives under Union or national law and practice, including Directive 2002/14/EC of the European Parliament and of the Council, on decisions to put into service or use AI systems. It remains necessary to ensure information of workers and their representatives on the planned deployment of high-risk AI systems at the workplace where the conditions for those information or information and consultation obligations in other legal instruments are not fulfilled. Moreover, such an information right is ancillary and necessary to the objective of protecting fundamental rights that underlies this Regulation. Therefore, an information requirement to that effect should be laid down in this Regulation, without affecting any existing rights of workers.

In the case at hand, the company considered that the works council consultation was irrelevant as the AI tools were in the process of being tested and had not yet been implemented within the company.

However, the Nanterre Court of Justice, in a decision of February 14, 2025 (N° RG 24/01457), found that the AI applications had been deployed in a pilot phase for several months, during which all of the employees concerned had used the AI tools, at least partially.

To reach this conclusion, the court relied on the fact that certain software programs, such as Finovox, had been made available to all employees reporting to the chief operating officer (COO) and that the employees of the communications department had all been trained in the Synthesia software program. As such, the employer could not validly claim that such an implementation was experimental since so many employees had been trained and allowed to use AI tools.

The court, therefore, considered that the pilot phase could not be regarded as a simple experiment but should instead be analyzed as an initial implementation of the AI applications subject to the prior consultation of the works council.

The court therefore ordered:

  • the suspension of the project until the end of the works council consultation period, subject to a penalty of €1,000 per day per violation observed for ninety days; and
  • the payment of damages amounting to €5,000 to the works council.

Key Takeaways

In light of the Nanterre Court of Justice’s ruling, employers in France may want to remain cautious before deploying AI tools, even if it is worth noting that:

  • the ruling is only a summary decision, i.e., an emergency measure pending a decision on the merits of the case; and
  • this decision confirms that an experimental implementation of AI might be feasible, provided that it is followed by informing and consulting the works council prior to a complete deployment of the AI tools. However, the range and scope of any such experimentation should be assessed with care, because a court might consider that the experiment actually demonstrates that a decision to implement AI had already been irrevocably made.

Ogletree Deakins’ Cybersecurity and Privacy Practice Group will continue to monitor developments and update the Cross-Border, Cybersecurity and Privacy, and Technology blogs as additional information becomes available.
