
How AI is Upping the Ante in Vendor Management

“Every now and again, it’s actually nice to get a student essay that is so confused and rambling there is no chance they used AI.”  
–Corina Roche-Baron, my College English Teacher Daughter 

Gonzo readers, nobody disputes the changes, benefits, or future impact that AI will bring about. Almost every financial institution has people with knowledge of, and possibly experience with, AI tools such as Copilot, ChatGPT, and Azure. Many have internal projects or research underway for processes and tasks that will use these tools. 

However much bankers are being impacted internally by AI, it pales in comparison to what is happening with their vendors.   

Consider these questions:  

1. How familiar are you with these names?

  • Anthropic Claude 
  • jasper.ai 
  • Zest 
  • copy.ai 
  • Einstein 
  • Larky 
  • Posh  
  • H2O.ai 
  • SageMaker 
  • DataRobot 

This is just a sampling of tools technology vendors use in lending, delivery, marketing, fraud, and other systems that banks use. Some access customer data. Some use algorithms for credit decisioning, pricing, risk ratings, fraud mitigation, trading/investments, and other things that directly affect a bank’s client base.  

2. How much does your institution know about vibe coding?

In its simplest terms, this is using AI to create initial code that can be modified by a programmer. This is no small matter. A recent Wired magazine article quoted a Checkmarx survey of chief information security officers who said that more than 60% of their organization’s code was generated by AI in 2024. At the same time, only 18% said that their organizations have a list of approved tools for vibe coding. And for the most part, there is no deep knowledge about the specifics of how the code is created.  

Here’s the point of these questions. We can see that AI will have a huge impact on delivery, risk management, growth, profitability, staffing, and other areas. But the greatest impact on financial institutions will not come from AI bought and deployed internally. It will come through the AI embedded in systems purchased from vendors. This means that an area that will need a lot of attention sooner rather than later is vendor management.  

3. How much will we need to know about what vendors are doing with AI? 

First, where will AI initially be deployed? Earlier this year, Cornerstone surveyed 300 bank and credit union executives on where they want/expect AI to be deployed. These were their answers: 

*(Chart: AI deployment areas, ranked by survey respondents.)*

Note that four of the top six areas use customer data and have a direct impact on customer delivery.  

Understandably, regulation lags deployment in the AI arena. But experience tells us that, given where AI will be used, this will change.  

This all falls under the concept of “Know Your AI,” a broad construct created by regulators and the Government Accountability Office that identifies areas where banks need to monitor and judge the trustworthiness of their AI tools and deployment.

 (To be clear, “Know Your AI” has nothing to do with the growing number of people who say they have a romantic relationship with AI. These “AI-lationships” are incomprehensible, non-Gonzo, and will never be discussed again in these pages. Like, ever.) 

Areas deemed monitor-worthy by regulators and the GAO include: 

  • Accountability and Transparency – Relevant parties are responsible for negative outcomes, and information about the AI models is available to users. 
  • Validity and Reliability – AI models perform as intended, and outputs are accurate and robust. 
  • Safety – Human life, health, and property are safe. 
  • Security and Resilience – Attacks can be avoided or mitigated, and normal operations can be maintained. 
  • Explainability and Interpretability – Financial institutions understand how and why an AI algorithm produced decisions, predictions, or recommendations. 
  • Privacy – Customer information and other sensitive data are safe from unauthorized access or use. 
  • Fairness – Bias and discrimination are mitigated. 

So what? Bankers need to know a lot about what their vendors are doing with AI — and be able to explain it to their regulators. 

Here is a vendor management checklist to help you and your team get started: 

1. Read the relevant regulatory publications, starting with the GAO’s AI accountability framework noted above.  
2. If you do not have an Acceptable Use Policy, create one. Use publicly available examples as a starting point. Your team needs to know what they can and can’t use AI for, the tools that can be utilized, how data must be protected, etc. 

3. Create an inventory of AI tools used by your vendors and the models (algorithms) they have produced. This can be complicated because tools and models keep changing. Cornerstone’s Vendor Management team, who are obsessed with this kind of stuff, are in the process of identifying and classifying this ecosystem. Boy, is it complex!  

4. Obtain and monitor vendor AI policies that may be part of your exams. In the longer term, determine what must be added to vendor contracts to ensure safety and compliance. 

5. Assign internal accountability for understanding models/algorithms. For example, who is responsible for understanding a lending system vendor’s scoring model? Somebody in lending? IT? A focused AI position? This may take some thought.  

6. Start linking major AI initiatives, whether a vendor’s or your own, to measurable outcomes. Some AI initiatives will be pure R&D, and they should be. But in the end, AI needs to measurably improve efficiency, reduce risk/loss, enhance the customer experience, or in some other way increase the bottom line. 

Understanding and managing new technology and capabilities is part of a banking executive’s job description. But AI will be the most rapid and fundamental game-changer we have ever seen. Smarter banks will move fast to understand, manage, and apply it. 


Terence Roche is a founding partner at Cornerstone Advisors. Follow him on LinkedIn and X.