
AI and the Board of Directors

In this episode of “Waking Up With AI,” Katherine Forrest and Anna Gressel look at some of the governance, regulatory and technological questions that boards of directors may want to consider amid a rapidly evolving AI landscape.


Katherine Forrest: All right, hello everyone and welcome to another episode of “Waking Up With AI,” a Paul, Weiss podcast. I'm Katherine Forrest.

Anna Gressel: And I'm Anna Gressel.

Katherine Forrest: And Anna, it’s become autumn.

Anna Gressel: I know that happened so, so fast.

Katherine Forrest: I mean, I feel like when I walk outside now, there's that back to school feel everywhere you go.

Anna Gressel: Mmhmm. I have to say I love the fall.

Katherine Forrest: Well, I do too. And when I was little, before the first day of school, I literally could not fall asleep. And I'd get so excited — you have your new clothes all laid out and all of that.

Anna Gressel: For me, it's the moment when the pumpkins start appearing in the delis. I love that moment. I love seeing them on the stoops everywhere in New York. It's just such a festive season here.

Katherine Forrest: Yeah, you know, you just placed yourself in New York City with, like, the pumpkins in the deli. All right, so today's topic for our audience is AI and the board of directors.

Anna Gressel: Yep, and as AI has become a key topic for so many industries, there are a ton of companies…

Katherine Forrest: Yeah, and so many regulators.

Anna Gressel: And it's really important to make sure that your board knows what the company is doing so they're kept appropriately informed and, you know, just that they're educated about the key things they need to know regarding AI.

Katherine Forrest: Right, and our audience is really a mixture of people, some of whom deal with boards on a regular basis and some who don't. So let's start with the question of why: why it's important to keep boards informed about the AI usage at their companies, and then let's go on to the “what” and the “how.”

Anna Gressel: Yep, I think it's always good to start with the “why.” So let's dig into that a little bit. Boards of directors are responsible for the ultimate oversight of a corporation. They have obligations to the company and to its owners, which can be the shareholders or investors. And that includes ensuring that appropriate governance is in place.

Katherine Forrest: Right, and now in the area of AI, we have regulators at both the state and federal levels, across a number of different geographies and a variety of industries, and it seems like they've all promulgated rules and principles that they expect companies to be aware of and/or in compliance with.

Anna Gressel: So I think it would make sense today, Katherine, to talk through a few things a board member would want to understand about a company. And those could be questions they might ask of the company in their capacity as a member of the board. So the first question is to get a report on how AI is actually even being used across the company.

Katherine Forrest: It's really so important. Knowing the “how,” or even whether AI is being used at all, is an important starting point, because from that you can look at the issues, the risks, and the responsibilities that might follow.

Anna Gressel: Yeah, and we see companies brief boards at all levels of maturity. I mean, some are really not using AI. Some are using a ton of AI. And I do think that's very critical context for the board to have, because part of the board's understanding of the company's usage of AI is to understand how AI actually fits in with the company's strategic plan and how it aligns with the value of its tech investments more broadly.

Katherine Forrest: Absolutely. And the second question that I talk to boards about is, assuming that there is some AI use, what structure is in place to ensure that the AI use is being appropriately overseen?

Anna Gressel: Yeah, I think that phrase is really key, “appropriate oversight.” And the kind of oversight any company needs to have for AI depends on a number of factors. How is AI being used? Where does the company do business? Is it US only? Is it a company with reach into the EU or other jurisdictions that have their own regulations and principles? And is the company using AI in ways that could be seen to carry different kinds of risk? That could be regulatory risk, operational risk, all kinds of risk, depending on the jurisdiction.

Katherine Forrest: Right, and there really is no one-size-fits-all for a board. So a member of the board could, within the context of a company that they're associated with, determine what the uses of AI are and what structures are in place to ensure appropriate oversight. And then the board member might want to ask what systems are in place to test the AI tools for accuracy, fairness, and efficiency. They don't have to become, by any means, experts in the engineering, but ensuring that there are appropriate systems in place to confirm that the AI tools are accurate, fair, and efficient is, I think, an important additional consideration.

Anna Gressel: Yeah, definitely. And a third issue for boards to get their arms around is just a basic understanding of the regulatory frameworks applicable to AI in their particular industry of operation, but also, again, within the geographies in which the company conducts business.

Katherine Forrest: Right, and it's important to emphasize that a board member doesn't have to be, as I've said, an expert in the engineering, nor do they have to be an expert in the granular nuances of the White House Executive Order, the EU AI Act, the California AI bills that we'll talk about in another episode, the Colorado laws, what's happening in other far-flung jurisdictions. They don't have to be expert in all of that, but they need to have a sense of where their company's use of AI fits into the most important applicable regulatory schemas for their company.

Anna Gressel: Yeah, and I think it's also important to recognize there are differences in how companies are organized and in the legal frameworks that will apply to them based on their size and whether they're publicly traded. So the board of a publicly traded company would want to have a general understanding of the kinds of questions to ask and the information that's relevant to compliance with SEC rules and regulations, and that might include disclosure requirements.

Katherine Forrest: Yeah, and in pharma, of course, there's the FDA, and so many companies have particular regulatory schemas that they have to keep abreast of.

Anna Gressel: Yeah, and now AI is really developing its own type of regulatory scheme.

So, the fourth area for boards to understand and ask questions about is AI risk. And if AI is being used, how are risks being assessed by the company? How are they actually being addressed? I mean, if you think about our episode we just did on mitigations, that's a key part of that question. And if there are mitigations or risk reduction actions that are being taken, are they effective, and is there appropriate personnel and resource coverage for those initiatives?

Katherine Forrest: Alright, so let's just take two examples. Let's assume we've got a company that's had AI usage for years, such as a financial services company, which may have been using various forms of narrow AI for almost a couple of decades and now has new uses with generative AI tools. And with that new usage come new regulatory requirements, and there's a lot of proposed rulemaking. A board member would want to get to a baseline of how AI is being used generally, what regulatory schemas are applicable and what the company has put in place to ensure compliance. The board member may also want to understand the level of risk that some of the tools may expose the company to.

Anna Gressel: Katherine, why don't we take this as a hypothetical? What if a financial services company is using a tool to assess real-time changes in macro and microeconomic conditions, and then that's fed into some, let's say, real-time trading that's occurring? What would you think a board member would want to ask?

Katherine Forrest: In that case, model error could have real impacts. The board member doesn't need to understand the technological details of all of this, just what type and level of risk there might be. If it's potentially minor risk, that's one thing. If a catastrophic event could occur from certain kinds of error, that's something else. And so the board member, in each case, might want to calibrate their follow-up questions accordingly.

But then I said I'd give a second example, and the second example is one where the company is simply using AI really just for operational office efficiencies. For instance, some ChatGPT-type usage to assist with drafting emails and drafting some marketing materials — that's an entirely different level of risk.

Anna Gressel: Yeah, and I think the risk mitigation measures that need to be put in place there are just, they're so different. And that's really what the company, the legal department, the AI council should be thinking about and advising on. And then the board can ask for information just, again, about how AI risks are being mitigated and then the resources that are being devoted to the effort to make sure that they're sufficient.

Katherine Forrest: Right, and then another question, I think, for board members is what use cases are out there in the industry in which their company sits, and trying to understand what the competition is doing with AI and why, just to ensure that they've got a sense of the overall environment, where they fit, how their strategic plan may be implicated, and so on.

Anna Gressel: And another question that we don't always talk about as lawyers, but I think is really critical for boards and for companies generally, is what is the impact of AI usage on the workforce? Is it going to be a job eliminator? Does there need to be re-skilling for personnel or hiring initiatives? I mean, AI is a huge driver of transformation, but sometimes that also means workforce change. And we've seen real efforts in that area by some companies that are being forward thinking.

Katherine Forrest: That's right. And really these are seismic shifts that are happening in the business landscape all around us with AI. And it's just so important that boards ask the right questions to educate themselves and keep abreast of everything.

Anna Gressel: Yep, and boards don't have to become AI engineers. That is not at all what we're saying. But with increased regulatory scrutiny, they're going to want to ask enough questions to understand the basic corporate implications of these important technologies.

Katherine Forrest: And I get asked sometimes, Anna: well, how do we ensure that our board does ask the right questions? And one recommendation is that companies have some form of board education that's calibrated to the board level, not calibrated, again, to a level of detail that they don't have to get involved with, but to what they need to do to ensure that they're complying with their fiduciary obligations.

Anna Gressel: Yeah, and I'd add that because the landscape is changing so quickly, they'll need periodic updates on that education as well.

Katherine Forrest: Right, you know, updates on topics that we've raised, like how the company is using AI, the regulatory developments, the risk exposure, and what's coming up on the horizon.

Anna Gressel: Yeah, I mean, all of this has a big financial impact as well. So it'll help the board understand how and why certain money is being budgeted and invested by the company.

Katherine Forrest: Alright folks, well we're going to certainly, I think, dig into these topics a little bit more deeply in future episodes. But that's all we've got time for today. I'm Katherine Forrest.

Anna Gressel: And I'm Anna Gressel. See you all next week.

