AI Represents Potential Erosion of Attorney-Client Privilege


Introduction

Artificial intelligence (“AI”) tools have rapidly proliferated across the modern workplace, fundamentally changing how attorneys and other professionals conduct business and communicate with each other. AI-enhanced notetaking applications are now commonplace in corporate offices and law firms alike. While these technologies may offer efficiencies, many organizations have adopted them without fully comprehending the significant legal risks they present.

For attorneys and their clients, perhaps the most pressing concern is the potential erosion of attorney-client privilege. When AI tools process, store, or transmit what would otherwise be privileged communications between attorneys and clients (especially when disclosed to third-party cloud providers), they may inadvertently waive the confidentiality protections that shield those communications from discovery. Independent of communications with counsel, client use of AI notetaking tools also creates vast amounts of potentially discoverable business records, exposing candid conversations that historically went unrecorded and increasing the complexity of discovery through the sheer volume of data. Use of AI notetaking applications also implicates ethical considerations: attorneys must ensure that their use of AI tools, including notetaking applications, does not conflict with core ethical obligations, including the duties of competence, independent judgment, loyalty, and confidentiality.

Recent state bar opinions and court decisions have brought into focus the risks associated with using AI notetaking tools in legal settings. As this technology continues to evolve and become further embedded in daily operations, attorneys must remain vigilant in understanding how these tools function, where client data travels, what safeguards exist to protect privileged and confidential information, and how to advise clients regarding use of such tools. 

State Bar Opinion on Ethical Issues of Recording Client Conversations

On December 22, 2025, the New York City Bar Association released a formal opinion regarding a lawyer’s ethical obligations under the New York Rules of Professional Conduct when using AI notetaking applications (the “Opinion”). The Opinion considers ethical issues fundamental to attorneys’ professional responsibilities regardless of jurisdiction.

The attorney’s duty of loyalty requires that lawyers not engage in conduct involving dishonesty, fraud, deceit, or misrepresentation. The New York City Bar Association has interpreted this duty to require that a lawyer obtain a client’s consent before using recording devices, even in a state that requires only one party’s consent. Clients have an expectation of confidentiality when speaking with their attorneys, and they may choose their words more carefully when they know they are being recorded. When the lawyer does not disclose the recording, the client is deprived of that choice.

Attorneys must also adhere to their duty of confidentiality when using AI notetaking programs and retaining the information those programs produce. Attorneys must consider the privacy and security safeguards the tool provides, including what data is stored, the duration of retention, rights of deletion, and whether user data is used to train AI models. Advising clients of the risks of losing confidentiality and attorney-client privilege protection is important to meeting these ethical obligations.

The New York City Bar Association also emphasizes that the duty of competence prohibits attorneys from relying solely on AI work product; all transcripts and summaries must be reviewed for accuracy. As a general matter, attorneys must also understand the technical features, limitations, and security of the tools they and their clients use, and be aware of the ethical issues that arise from such use. Training and supervision of subordinate attorneys and other employees should reinforce these same practices to ensure that conduct consistent with attorney ethical obligations is applied uniformly throughout the firm.

Relatedly, the Opinion acknowledges the problems inherent in representing clients who use their own AI tools. When clients independently use these programs, attorneys have little control over the recordings or summaries. To mitigate negative consequences, the Opinion recommends that attorneys address the possibility of such use at the outset of the relationship. If client AI tools are used, the New York City Bar Association recommends that attorneys ask that conversations between the attorney and client not be recorded, include terms in their engagement agreements providing that AI work product will not be dispositive or binding unless it has been reviewed by the firm for accuracy, and provide clear and consistent disclosure of the risks of losing confidentiality and privilege.

Separately, attorneys must consider not only the ethical duties owed to the client but also compliance with statutes implicated by these AI tools. Recording a meeting without first obtaining the consent of all participants could violate wiretap laws. Consent requirements vary, but some states require the consent of every participant before a meeting can be recorded; California, Connecticut, Florida, Illinois, Maryland, Massachusetts, Nevada, New Hampshire, and Pennsylvania are among these two-party consent states. AI notetaking applications may not provide the notice settings required in two-party consent states where all parties must consent to recordings. Additionally, some states have laws protecting biometric data, including “voiceprints” derived from the audio files these tools use to generate transcripts and summaries. AI tools alone cannot reliably determine the legality of recording in all jurisdictions, so attorneys must remain aware of the applicable law to ensure compliance.

In re Otter.AI Privacy Litigation

In re Otter.AI Privacy Litigation, No. 5:25-cv-06911 (N.D. Cal. Aug. 15, 2025), showcases the confidentiality risks that arise when attorneys and clients alike use notetaker applications. The case is a California federal court action against Otter.AI, maker of an AI notetaker application, in which the plaintiff alleges the application records and transcribes conversations without obtaining proper consent from all meeting participants.

The Otter.AI privacy policy pushes responsibility for obtaining consent onto users. Additionally, the privacy policy at issue stated that Otter.AI would share the user’s personal information with third parties including “Cloud service providers…Data labeling service providers …Providers of integrated third-party programs, apps or platforms” and that when users “connect third party platforms to [its] Services, [users] authorize [it] to share designated information and data created and/or uploaded by [users] to [its] servers with these third-party programs on your behalf.” Use of AI notetaking and transcription tools with similar policies is likely to nullify the protection of attorney-client privilege and create ethical issues for failing to maintain confidentiality.

United States v. Heppner

United States v. Heppner, No. 25-cr-00503-JSR (S.D.N.Y. Oct. 28, 2025), underscores the concerns addressed in the New York City Bar Association Opinion when clients use AI for their legal matters without the direction or control of counsel. Heppner addressed whether documents a criminal defendant created using an AI platform were protected by attorney-client privilege and the work product doctrine. The defendant, indicted on securities fraud and related charges, independently used Claude, Anthropic’s generative AI tool, without attorney direction to prepare reports outlining potential litigation defense strategies, which he later shared with counsel.

The Court rejected the privilege and work product claims, echoing the New York City Bar Association’s concerns: the documents were not confidential, given that Anthropic’s privacy policy expressly permits data collection and disclosure to third parties, including government authorities. This decision carries significant implications for the use of AI platforms with similar disclosure policies, suggesting that entering privileged information into such systems may erode attorney-client privilege and work product protection and effectively waive confidentiality.

Although the Heppner decision concerned a third-party tool whose public policies indicated insufficient protection of output confidentiality, the Court did not decide whether using enterprise or in-house AI tools with appropriate confidentiality safeguards, under counsel’s direction, could fall within the scope of attorney-client privilege. Accordingly, it is important to review whether an AI notetaking tool provides sufficient protections to be treated as an attorney-client communication tool that can preserve privilege.

Practical Considerations

Attorneys must approach AI notetaker applications with deliberate caution and comprehensive safeguards:

  • Obtain client consent for any attorney use of AI tools: Attorneys should inform clients that AI notetaker or transcription tools may be used during the course of an engagement, explain what diligence was performed to protect confidentiality, and address any concerns the client expresses prior to use. Practically speaking, this notification is often provided in the engagement letter. In addition, some of these tools can provide notice and seek consent at the start of each meeting in which an AI tool will be used; it is best practice to use such a notification when available.
  • Review all AI tool output: All notes and summaries should be reviewed by attorneys for accuracy. Attorneys cannot deliver competent representation if they rely solely on the output of AI tools without review.
  • Make clients aware of the risks of independent use of AI notetaking tools: Attorneys should advise clients that independent use of third-party AI notetaking tools is likely to result in records that are not protected by privilege or the work product doctrine, making such records discoverable in litigation. Advise clients to avoid using AI notetaking tools without attorney direction in any capacity that touches on active litigation matters, such as legal strategies or damage calculations.
  • Establish clear internal policies and vetting procedures: Firms and clients should develop policies governing AI notetaker use, including verification procedures to ensure accuracy. Organizations should carefully vet AI tools to confirm closed-system implementation, no third-party data distribution, and no use of client data to train underlying models. In-house attorneys should consider internal policies limiting the types of information that can be recorded by AI notetaking apps to prohibit recording discussions of legal or sensitive confidential matters. 
  • Establish cross-functional governance: A team composed of technology, compliance, and legal experts should review and approve AI tools, including those with recording capabilities, to strengthen safeguards.
  • Prepare for discoverability: Attorneys should advise clients on retention procedures with respect to transcripts and summaries developed by AI notetaking tools, including what data may be retained, for how long, and development of procedures to ensure legal holds extend to such information when appropriate.
