Across boardrooms, enterprise AI has become the biggest line item in the innovation budget, yet it has also become the biggest source of anxiety. Andrew Frawley, CEO of Data Axle, believes the real problem begins before a single line of code is written. “The real issue isn’t the technology itself, but the foundation,” he told me. “Companies are obsessing over models while neglecting or under-nurturing the one thing those models rely on: data.”

Fragmented records and siloed systems have become the default condition in most enterprises, and AI only exposes those fractures faster and at scale. “Some brands, blinded by AI’s possibilities and potential, rush for immediate deployment while bypassing the crucial, foundational work of establishing a data infrastructure,” he explained. “The most critical steps — which include establishing data ownership, building governance into workflows and enforcing quality standards — often get pushed aside in the interest of speed.” That, according to Frawley, inevitably results in misfires that damage trust.

Udo Foerster, CEO of German consultancy Advan Team, sees similar dysfunction among the businesses he advises. For all the talk of algorithms, it’s the invisible plumbing beneath AI that’s doing the damage.

Ken Mahoney, CEO of Mahoney Asset Management, flagged another overlooked bottleneck: the physical limits of AI’s appetite for energy and infrastructure.

For his part, Frawley argues that without a clear strategy and clean data, models confidently push the wrong action. “Deploying AI on fragmented or inaccurate data is an act of self-sabotage,” he said. “It will amplify existing flaws, erode the quality of analytics and introduce a false sense of confidence in misinformed decisions. With fragmented or inaccurate data, they amplify errors and bias at speed, autonomously executing actions, pushing a business further in the wrong direction before the problem can be detected.”