
What Legal Risks Employers Face When Using Artificial Intelligence Transcription Tools

Client Alerts | November 22, 2024

Artificial intelligence transcription tools are changing how internal and external meetings are recorded and notes are shared. These tools generate real-time transcripts of meetings, letting participants focus on the substance of the call rather than their notes. However, legal and compliance professionals must weigh added convenience against increased legal risk. Using these tools without careful consideration could result in confidentiality breaches and employment disputes.

Four Key Risk Areas of AI Transcription Tools for Employers

AI transcription tools, such as Otter.ai, introduce risks in four key areas:

  • Consent under state recording laws.
     
  • Waiver of attorney-client privilege.
     
  • Inadvertent disclosure of confidential or proprietary information.
     
  • Disclosure of employee review transcripts.
     

Participant Consent: Navigating the Patchwork of Recording Laws

Consent requirements for recording conversations vary across jurisdictions. Some states, such as California, require the consent of all participants, while others, like North Carolina, allow recording with the consent of just one party. When discussions span multiple states or countries, and the location of participants may be unknown to the meeting host, the legal landscape becomes even more complex.

To ensure compliance, organizations should put participants on notice at the outset of a conversation with a disclaimer that the meeting is being recorded and transcribed. Some meeting platforms have a built-in notification that the meeting is being recorded, and that notice can often be customized, including to prohibit recording devices other than the one used by the meeting host.

It is worth noting that certain AI tools circumvent detection and can record and transcribe a meeting without participants’ knowledge. Moreover, in some instances AI tools that appeared to have been removed from a meeting have still transcribed the call and distributed notes. Providing a disclaimer and notice therefore reduces the associated legal risk, even where AI tools are believed to have been removed.

How Companies Can Protect Confidential Information

Transcripts of meetings often contain confidential or proprietary information. Storing this information in cloud-based systems introduces the risk of unauthorized access, breaches, or inadvertent sharing. Otter.ai’s collaborative features, while useful, make it easy for users to share transcripts without considering the sensitivity of the information. As a result, confidential information is not only discussed on the call but also captured in a written form that participants can easily forward. When third-party tools are used, the information is also in the hands of those parties and subject to their terms and conditions. While various doomsday scenarios exist, the easiest to imagine is a breach of a contract’s confidentiality clause, which may not be subject to a limitation of liability.

Legal and compliance teams should develop clear internal policies on the acceptable use of transcription tools. These policies should emphasize proper classification, storage, and sharing of transcripts. Training employees on these protocols minimizes risk and promotes good data hygiene. Additionally, organizations should carefully review AI tools’ data security and privacy policies to ensure they meet the company’s standards.

Attorney-Client Privilege: The Hidden Cost of Convenience

Using AI transcription tools for conversations involving legal advice introduces the risk of inadvertently waiving attorney-client privilege. Many transcription services process and store data on third-party servers or use the information to train the model. Chief Justice John Roberts echoed these concerns in his 2023 end-of-year report, saying, "some legal scholars have raised concerns about whether entering confidential information into an AI tool might compromise later attempts to invoke legal privileges."

To mitigate the risk of waiving privilege, organizations should:

  • Deploy only enterprise versions of transcription tools that offer robust encryption and user access controls.
     
  • Prohibit AI transcription tools for sensitive discussions where privilege is paramount.
     
  • Train employees on the potential privilege implications of recording a meeting.
     

Employment Implications: Managing Auto-Shared Transcripts and Summaries

Otter.ai’s feature for automatically sharing meeting notes can be a double-edged sword. While it promotes collaboration, it can also create issues in employment contexts. For example, sharing notes from a performance review or human resources meeting without proper vetting could expose sensitive managerial discussions or lead to misinterpretations. In a legal dispute, the transcription could undermine the organization’s position or contradict formal documentation. Summaries are often forwarded to every invitee, including those who were asked to leave early or did not attend the meeting at all.

Organizations can avoid these risks by disabling auto-sharing features for sensitive meetings and requiring a review process before distributing transcripts. It is also critical to align shared transcripts with HR and legal documentation practices to ensure consistency.

Moving Forward

AI transcription tools like Otter.ai can provide significant benefits but require careful management to avoid serious legal risks. By addressing participant consent, safeguarding privilege, securing confidential information, and managing employment-related implications, organizations can balance leveraging technology with protecting their legal interests.

The key is to implement policies to limit or prohibit the use of AI transcription tools and to train personnel on the risks and appropriate uses.

For more information, please contact us or your regular Parker Poe contact. You can also subscribe to our latest alerts and insights here.