By Anne Lockner
Recently, during a virtual meeting of a board-of-directors committee, the chair noticed that one of the absent committee members had suddenly appeared in attendance. On closer review, the chair realized it was not the human board member who had joined, but rather the member’s “AI assistant,” which automatically joins any meeting listed on the member’s calendar, summarizes meeting content, and then circulates notes to everyone included on the meeting appointment. This is a scary scenario for many organizations, but especially for boards and board committees, whose meetings are the backbone of corporate governance.
Artificial intelligence (AI) tools are rapidly transforming the way organizations conduct business, manage information, and communicate. Among them are AI-powered note-taking tools, which promise increased efficiency by transcribing meetings, summarizing discussions, and generating actionable insights. While these tools offer convenience and utility, they also introduce significant governance risks, particularly for corporate boards, where confidentiality, privilege, and trust are paramount.
Boards must tread carefully when incorporating AI tools into their workflows. AI note-taking tools, if not implemented thoughtfully, can inadvertently compromise the confidentiality of board discussions, including privileged communications and sensitive executive-session deliberations. This article explores the governance challenges posed by AI note-taking tools and offers strategies to mitigate these risks while harnessing the benefits of AI responsibly.
AI note-taking tools use advanced natural language processing to transcribe conversations in real time, summarize key points, and even suggest follow-up actions. For boards, these tools promise to streamline processes, improve documentation accuracy, and free up directors and corporate secretaries to focus on strategic decision-making rather than minutes-taking.
The sensitive nature of board discussions, however, makes the use of such tools far more complex than their adoption in routine business contexts. Boardrooms are the epicenter of organizational strategy, risk management, and governance, often dealing with highly confidential and legally sensitive matters. As such, the use of AI in these spaces cannot be undertaken lightly.
Key Governance Risks
- Confidentiality Breaches
AI note-taking tools process data in ways that can expose sensitive boardroom information to unintended parties. Many tools rely on cloud-based platforms for transcription and analysis, creating the risk of unauthorized access or data breaches. If confidential information about mergers, acquisitions, strategic plans, or litigation is compromised, the organization could face legal, reputational, and competitive harm.
- Loss of Privileged Communications
Certain board discussions are protected by attorney-client privilege. Introducing an AI note-taking tool into these discussions can inadvertently waive privilege if the tool is not adequately secured or its use is not carefully controlled. The loss of privilege can expose sensitive information in future legal proceedings or be used to competitively disadvantage the company.
- Compromising Executive-Session Privacy
Executive sessions allow boards to deliberate privately without the presence of management or external advisors. These sessions are critical for candid discussions, particularly regarding performance evaluations, succession planning, or sensitive governance matters. Recording or transcribing these discussions with AI tools risks eroding the sanctity of the executive session and may deter directors from speaking openly.
- Regulatory and Legal Risks
Depending on the jurisdiction and industry, boards may face regulatory obligations to safeguard certain types of information. For example, healthcare and financial services boards may handle data protected under HIPAA or the Gramm-Leach-Bliley Act (GLBA). The use of AI tools that do not comply with these requirements can expose the organization to penalties and litigation.
- Data Ownership and Vendor Risks
Many AI tools are provided by third-party vendors who may assert ownership or access rights over the data processed by their platforms. Boards must carefully assess whether their chosen tool’s terms of service adequately protect the organization’s intellectual property and sensitive information.
Best Practices for Responsible Use of AI Note-Taking Tools
Boards seeking to leverage AI tools while safeguarding governance integrity should adopt the following strategies:
- Assess Necessity and Scope
Before adopting an AI note-taking tool, boards should critically assess whether the tool is necessary and appropriate for their context. In some cases, traditional note-taking methods may be more suitable for preserving confidentiality and privilege.
- Choose the Right Tool
Not all AI tools are created equal. Boards should conduct thorough due diligence to select tools designed with robust security features, such as end-to-end encryption, data localization, and compliance with relevant regulations. Tools that allow local data storage rather than cloud processing may offer enhanced security.
- Establish Clear Usage Policies
Boards should develop and enforce policies governing the use of AI tools, including:
- Prohibiting the use of AI tools during executive sessions or privileged discussions.
- Restricting access to AI-generated transcripts and summaries to authorized personnel only.
- Implementing protocols for an attendee or the corporate secretary to review and revise AI-generated notes for accuracy and then delete any AI-generated data not required for preservation.
- Engage Legal Counsel
Legal counsel should be consulted to ensure the use of AI tools does not compromise attorney-client privilege or violate regulatory requirements. Counsel can also advise on contractual terms with vendors to protect the organization’s interests.
- Provide Training for Directors and Staff
Directors and board staff must understand the risks associated with AI tools and how to use them responsibly. Training should include guidance on identifying sensitive discussions where AI tools should not be used.
- Monitor and Review
The board should periodically review the use of AI tools to ensure they remain appropriate and secure. This includes monitoring for updates to vendor terms of service, changes in regulatory landscapes, and advancements in AI technology that may introduce new risks or opportunities.
AI note-taking tools have the potential to enhance board efficiency and governance, but only if they are implemented thoughtfully and cautiously. Boards must balance the benefits of these tools against the need to preserve confidentiality, privilege, and trust. By adopting a proactive and informed approach, boards can navigate the risks of AI responsibly while reaping its advantages.
The boardroom is a unique environment where the stakes are high, and the margin for error is small. As AI tools become increasingly integrated into corporate governance, board members must remain vigilant about the risks and ensure their use aligns with best practices in governance and compliance.
AI tools are not inherently problematic, but their misuse can have profound consequences. For board members, the guiding principle should be clear: Technology must serve governance, not undermine it. By prioritizing confidentiality, privilege, and the integrity of executive discussions, boards can embrace innovation without compromising their core responsibilities.