Saturday, March 21, 2026

Compliance costs risk widening the AI gap


AI might be a boon — if a company can absorb the indirect “compliance tax.”

In a follow-up to the latest InformationWeek Podcast, panelists Ameya Kanitkar, CTO at Larridin, and Eddie Taliaferro, director of enterprise governance, risk and compliance and data protection officer at NetSPI, described how the cost of regulatory compliance could stymie some AI plans.

Policies meant to set guardrails specifically on AI are still under debate in many jurisdictions. The Trump administration finally issued a national legislative framework on March 20. Meanwhile, data privacy regulations such as the European Union's GDPR already intersect with the technology. Kanitkar said costs from GDPR compliance may widen the divide between deep-pocketed larger companies that can afford to pay and companies still working toward profitability and growth. Together, these overlapping and changing rules are creating a compliance landscape that's costly and uneven.


“You actually end up making the companies that are already powerful … even more powerful,” he said. 

The compliance challenge for AI is different, and more volatile, than traditional mandates, Kanitkar said, because of the pace of the technology and the risks it raises. Regulations, while necessary, could slow companies down instead of letting them innovate.

“At least we understand what privacy is. With AI, when things are changing so quickly, any well-intentioned compliance laws can still backfire,” he said. 

At the same time, the lack of clear rules creates its own uncertainty, leaving companies unsure of how aggressively to invest in or deploy AI. 

Part of the problem is a fundamental mindset difference between policymakers, who may work on laws over multiple years, versus fast-moving startups that change gears within weeks. “We are in that week-stage for all of AI. So, by design, there’s so much gap between the two,” Kanitkar said.

 


Companies may already be wary of breaching policies such as GDPR, which carries potential fines of up to 4% of global revenue for data privacy violations. Adding AI to the mix could mean a new layer of headaches. "Companies just tend to be far more conservative in terms of dealing with it, which means everything just slows down, everything becomes bureaucratic, everything requires approvals," Kanitkar said.

The pace of change with AI models and their capabilities makes it unclear what will be regulated, he said. Kanitkar argued that laws grounded in principles rather than language that specifically targets AI could be more effective. “You can have a law that says, ‘Okay, no mass surveillance. Protect privacy.’ Something like that is true no matter the law, no matter the technology,” he said.


On Friday, the United States got its first look at the framework issued by the White House, which seeks to supersede state laws on AI but still requires Congress to draft actual legislation. The effort reflects the pressure, particularly from the tech giants, to establish a national standard and preempt the patchwork of stricter state-level rules.

In the meantime, Taliaferro noted that state-level regulations for AI are already in the offing and, in some cases, already in effect. “If you’re a U.S. company and you’re doing business with customers in California, Texas, Michigan, New York, they’re going to have their own set of AI governance regulations. And you’re going to have to learn how to adapt to that,” he said. 

More AI policy may be on the way in overseas jurisdictions, as Brazil, China, and the United Arab Emirates are also developing their own regulations and requirements, he said.

Viewed from financial and risk management perspectives, compliance costs for disaster recovery, security, and other required coverage can affect companies beyond the technology resources they must put in place, Taliaferro said. "Let's say that from an administrative perspective, you don't have the management in place. Or maybe you don't have a particular person in charge of information security. Those are additional costs that you would have to incur to comply with those regulations."


As updates to GDPR and other regulations account for AI risks, such as hallucinations and the sources of training data, the policies may feel a bit familiar. "When you're talking about AI governance and the risk associated with using AI, you're really thinking about data privacy," Taliaferro said.

Despite that potential familiarity with the intent of compliance, some companies may still grouse about additional expenses as they explore different AI tools and training. “They don’t quite know what direction they want to go in. They know that they have to. They know that AI is hot. It’s here … but they lack the proper direction on how to proceed,” he said.


