Cybersecurity has ranked, yet again, as the top concern this year among community bankers surveyed by the Conference of State Bank Supervisors – with “technology and related costs” second.
Against that backdrop, Federal Reserve Gov. Michael Barr outlined some of the concerns and promises that artificial intelligence and burgeoning technology hold for community bankers in a speech Wednesday at a conference in St. Louis.
Here are four takeaways:
1. More affordable fraud detection tools
AI’s use in fraud may have become more sophisticated as time has worn on. But as the threat has proliferated, so, too, has the number of firms offering products to fight it. That has made fraud-fighting tools less expensive, Barr told attendees of the Community Banking Research Conference.
“It is my understanding that automated fraud detection, in particular, hasn’t been cost effective for some community banks, which haven’t been able to compete with larger banks that have the economies of scale to deploy these services,” said Barr, who, until February, served as the Fed’s vice chair for supervision. “But the huge buildout in AI capacity now underway, along with the explosion in the number of firms seeking to get into AI-based services, may have the potential for driving down costs enough to make AI-based fraud detection more feasible for community banks.”
2. Lean into new AI-related business
“Large investments underway in data centers – many of them in remote or rural areas where electricity is less expensive – could bring new business for community banks,” Barr said Wednesday.
The Fed governor said he “tend[s] to be an optimist about the potential for AI to make workers more productive, raise living standards and create more jobs in new industries,” but added he is “realistic … that it could cause considerable dislocation of workers and businesses.”
“Communities dependent on a small number of employers or a single industry that is significantly affected by AI could experience these dislocations,” Barr said.
However, community banks that choose to follow the money on upcoming data-, energy- and AI-related projects could see short- and long-term benefits, he said.
3. Differentiate through relationship banking
“Beyond the numbers on a loan application, what community bankers understand about their neighbors is ‘soft information’ that can be used to make better credit decisions,” Barr said.
Personal relationships with customers can give community banks an edge that their regional or national competitors may find difficult to match, Barr said. But knowing the customer can also tie back to fraud prevention.
“Community banks have earned the trust and the business of people in their communities because of an understanding that the banks’ successes depend on their communities succeeding,” Barr said.
4. Once more, on deepfakes
In his speech Wednesday, Barr cited a report from the International Monetary Fund indicating that the direct and indirect costs of cybercrime now amount to 10% of global gross domestic product. Another source he cited noted a 20-fold increase in deepfake attacks over the past three years.
Deepfakes use generative AI to re-create people’s voices and images in order to induce an intended victim to act.
“Using only a brief sample of audio and access to information about individuals on the internet, criminals employing GenAI can impersonate a close relative in a crisis or a high-value bank client seeking to complete a transaction at their bank,” Barr said. “What was once the stuff of movies is now a reality, and a lot is at stake.”
Wednesday’s comments are hardly Barr’s first on deepfakes. At a New York Fed event in April, the central bank governor urged banks to evolve their AI defenses, using facial recognition, voice analysis and behavioral biometrics to fight deepfakes. Technical solutions, Barr added, can detect subtle inconsistencies in audio or video that point to the use of AI and that human observers may miss.
“In the past, a skilled forger could pass a bad check by replicating a person’s signature. Now, advances in AI can do much more damage by replicating a person’s entire identity,” Barr said at the time. “If this technology becomes cheaper and more broadly available to criminals – and fraud detection technology does not keep pace – we are all vulnerable to a deepfake attack.”