I recently had the opportunity to participate in a TDWI expert panel discussion on "Integrating Your Data and AI Platforms" alongside Dave Stoddard from TDWI, Asad Mahmood from HSO, and Sami Akbey from Insight Software. The conversation reinforced something I've been saying for a while: If your data program rates low on the maturity curve, your AI initiatives will rate even lower. You can catch the full discussion in the on-demand recording, but I wanted to share some key perspectives that emerged from our panel.
When asked about the biggest barriers to building and deploying AI models, the poll results during the webinar were telling: 34% of respondents cited "lack of integration between data analytics and AI platforms" as their top concern, and another 32% pointed to insufficient data governance and security for AI expansion. These aren't just technical concerns — they're strategic imperatives that organizations can't ignore.
Your data foundation determines your AI ceiling
Here's the reality that many organizations are discovering the hard way: AI without quality data is ineffective, and data without AI is underutilized. These strategies cannot exist in separate silos because they fundamentally depend on each other.
I've seen this pattern repeatedly across industries. Organizations that rate themselves relatively low on data maturity inevitably struggle with AI initiatives. It doesn't matter if you're talking about generative AI, traditional machine learning, or data warehousing — the foundation remains the same. If your data program has significant gaps, those gaps will amplify when you layer AI on top.
Why business integration matters more than you think
One of the most crucial insights from our panel discussion was that implementing GenAI applications without involving business units is like implementing an ERP system without bringing the business into the conversation. You would never do that with SAP, right? The technology works, but the real value comes from how it applies to your specific business processes.
The same principle applies to AI initiatives. There's tremendous value waiting to be unlocked within your business operations, but that value can only be realized when business teams are actively involved in defining use cases, validating outputs, and integrating AI capabilities into their daily workflows.
This is where I see many organizations stumbling. They approach AI as a pure technology play, expecting IT teams to deliver business value without sufficient business context. The differentiation comes from your data combined with your unique business processes, not from the AI models themselves, which are increasingly commoditized.
Stop making users become prompt engineers
Here's something that's been bothering me about the current AI implementation landscape: We're asking business users to become incredible prompt engineers instead of building applications that work for them.
Instead of training everyone to craft perfect prompts, we should be thinking about GenAI applications more like traditional business applications. Users should be able to click a button and get the answer they need, with sophisticated agents, prompts, and models working in the background to deliver that experience or result.
Let me give you a concrete example that illustrates this point. During my 10 years at Oracle, I did 40 quarterly business reviews (QBRs). Each one took at least 4-6 hours to prepare, and that's probably a conservative estimate. Ask any salesperson how long QBR preparation takes, and they'll confirm this time drain. It's not just the salesperson's time — sales managers spend equal amounts of time reviewing and validating the QBR.
Now, imagine if you could automate this process. What if data from CRM systems, support ticketing platforms, and product systems automatically fed into an agent-based application or combined with a well-designed system prompt that generated standardized, informative (not fluffed up) QBRs at the click of a button? Suddenly, salespeople get 4-5 hours of selling time back, and sales managers get that same time for coaching instead of QBR review.
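To make that concrete, here is a minimal sketch of what such a "one-click" QBR generator could look like. This is not a reference implementation; the table names, the fetch helper, the placeholder syntax, and the model choice are illustrative assumptions, and it presumes your CRM and support data have already been synced into a warehouse you can reach over a standard database connection and an OpenAI-compatible chat API.

```python
# Minimal sketch of a "one-click" QBR generator. Table names, fields, and
# the model are illustrative assumptions, not a reference implementation.
import json
from openai import OpenAI  # assumes an OpenAI-compatible chat completions API

client = OpenAI()  # reads OPENAI_API_KEY from the environment

SYSTEM_PROMPT = """You are a sales operations assistant. Using only the JSON
context provided, write a standardized quarterly business review with these
sections: Account Summary, Pipeline & Revenue, Support Health, Product
Adoption, Risks, and Next-Quarter Plan. Be factual and concise; do not
invent numbers that are not in the context."""

def fetch_crm_opportunities(conn, account_id: str) -> list[dict]:
    """Hypothetical query against a CRM table synced into the warehouse.
    Placeholder syntax (%s vs ?) depends on your database driver."""
    cur = conn.cursor()
    cur.execute(
        "SELECT name, stage, amount, close_date "
        "FROM crm.opportunities WHERE account_id = %s",
        (account_id,),
    )
    cols = [c[0] for c in cur.description]
    return [dict(zip(cols, row)) for row in cur.fetchall()]

def generate_qbr(conn, account_id: str) -> str:
    context = {
        "opportunities": fetch_crm_opportunities(conn, account_id),
        # Support tickets and product usage would be queried the same way
        # from their own synced tables (e.g. support.tickets, product.usage).
    }
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder; any capable chat model works
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": json.dumps(context, default=str)},
        ],
    )
    return response.choices[0].message.content
```

The shape matters more than the specifics: the salesperson supplies an account, and the queried context plus a well-designed system prompt do the rest.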
This is the kind of practical AI application that delivers immediate business value. It's not flashy, but it solves a real problem that consumes significant time and resources across virtually every sales organization.
Focus on micro applications that solve real problems
The QBR example highlights a broader principle: The biggest AI wins often come from focusing on specific, repetitive tasks that people don't enjoy doing. Rather than trying to build one massive AI system that does everything, consider developing targeted micro GenAI applications that address particular pain points.
This approach has several advantages. First, you can use smaller, private, structured datasets, which are easier to manage and validate. Second, you can create very specific system prompts or agents that consistently deliver the output format and quality you need. Third, you can iterate quickly and measure success clearly.
Organizations are achieving remarkable results by pairing structured datasets with well-designed agents or models. When you can give an AI system good context about what the output should look like, and then query your data to feed that context, results that would have taken hours to produce manually arrive in 30-45 seconds.
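As a rough illustration of that pattern, the sketch below specifies the expected output up front and feeds queried rows in as context. The renewal-risk use case, the schema, and the model name are assumptions made for the sake of the example; the point is that the output contract lives in the application, not in the user's prompt.

```python
# Sketch of the pattern: describe the expected output, then feed queried
# rows as context. Use case and schema are illustrative assumptions.
import json
from openai import OpenAI

client = OpenAI()

OUTPUT_SPEC = """Return JSON with exactly these keys:
  "risk_level": one of "low", "medium", "high"
  "drivers": a list of short strings citing only the data provided
  "recommended_action": one sentence"""

def summarize_renewal_risk(account_rows: list[dict]) -> dict:
    response = client.chat.completions.create(
        model="gpt-4o-mini",                      # placeholder model name
        response_format={"type": "json_object"},  # keep output machine-readable
        messages=[
            {"role": "system", "content": "You assess renewal risk.\n" + OUTPUT_SPEC},
            {"role": "user", "content": json.dumps(account_rows, default=str)},
        ],
    )
    return json.loads(response.choices[0].message.content)
```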
The possibilities here are infinite once you start thinking in terms of these focused applications rather than trying to solve everything at once.
Let cloud providers handle infrastructure so you can focus on outcomes
Nobody is asking me for infrastructure these days. Nobody says, "Kelly, get me into the infrastructure game." And frankly, no one is going to out-datacenter Google, AWS, or Microsoft. So why try?
If you can get scalability, reliability, optionality, security, and governance within platforms and services that are provided for you, that's exactly what an AI-ready data stack should deliver. When you don't have to spin up infrastructure, you can focus on those downstream data outcomes and business applications that actually drive value.
This shift in thinking is crucial for AI readiness. Instead of spending time and resources on the build-it-yourself approach to infrastructure and infrastructure management, you can concentrate on building the micro GenAI applications and agentic processes that solve real business problems.
Modern cloud data platforms increasingly provide storage, compute, and frameworks for any data workload. You no longer need to move data between different platforms to do AI versus traditional analytics. The data and AI strategies are converging at the infrastructure level, which makes integration significantly easier.
The multi-model reality
Here's something interesting I’m seeing in the current AI landscape: Solutions increasingly involve multiple specialized models working together rather than relying on a single large language model. The days of thinking, "We'll use a single model for everything," are giving way to more specialized choices.
In a recent project of my own, three models worked together in one end-to-end application: one provided solution context, another generated synthetic datasets, and a third created a data application from those datasets. Each model was optimized for its specific task, and together they delivered a compelling outcome that no single model could have achieved.
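Here is a hedged sketch of what that kind of multi-model chain can look like in code. The model names, prompts, and stages below are placeholders rather than the specific stack I used; the takeaway is that each stage is just another call, so swapping in a model that is better at a given step is cheap.

```python
# Sketch of chaining three specialized models. Model names and prompts are
# placeholders, not the specific stack described above.
from openai import OpenAI

client = OpenAI()

def ask(model: str, system: str, user: str) -> str:
    """One chat-completion call; each stage can target a different model."""
    resp = client.chat.completions.create(
        model=model,
        messages=[{"role": "system", "content": system},
                  {"role": "user", "content": user}],
    )
    return resp.choices[0].message.content

def build_demo(business_problem: str) -> str:
    # Stage 1: a reasoning-oriented model frames the solution context.
    context = ask("model-a", "Describe the data needed to solve this problem.",
                  business_problem)
    # Stage 2: a second model generates a synthetic dataset matching that context.
    dataset_csv = ask("model-b", "Generate a small synthetic CSV dataset.", context)
    # Stage 3: a code-oriented model turns the dataset into a data application.
    app_code = ask("model-c", "Write a Streamlit app that visualizes this CSV.",
                   dataset_csv)
    return app_code
```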
This trend toward model specialization reinforces the importance of having flexible, well-integrated data infrastructure. When you're orchestrating multiple models, each with different data requirements and output formats, you need platforms that can handle this complexity without forcing you to build custom integration layers.
Making the strategic shift
If you're a data leader wondering how to navigate this integration challenge, here are the key principles I'd recommend:
- Start with high-impact, low-risk business problems. Make sure your data is available, trusted, high-quality, and usable for these initial use cases. If those fundamentals aren't in place, step back and address them first.
- Think building blocks, not monoliths. Your data strategy should account for enterprise-wide governance and compliance, but implementation should happen incrementally. Identify specific agents or applications you want to build, ensure your data foundation supports them, then build on that success.
- Focus on business outcomes, not technology capabilities. The question isn't "what can this AI model do?" but rather "what business problem are we trying to solve, and how can AI help us solve it more effectively?"
- Embrace the platform approach. Use managed services that handle the infrastructure complexity so your team can focus on building applications that drive business value.
The foundation matters
The reality is straightforward: Every successful AI initiative starts with high-quality, accessible, and trusted data. Whether you're building customer-facing AI applications, internal productivity tools, or advanced analytics capabilities, the data foundation determines what's possible.
This is where Fivetran becomes critical. When you can reliably move data from source systems to your cloud data platform or data lake, you create the foundation for everything else. Your data engineers can focus on building AI applications instead of troubleshooting data pipelines. Your data scientists and AI engineers can spend time on modeling and building instead of data acquisition and preparation.
The organizations winning with AI aren't necessarily the ones with the most advanced algorithms — they're the ones with the most reliable, comprehensive, and accessible data. They've solved the fundamental integration challenges that allow them to iterate quickly on AI applications while maintaining data quality and governance standards.
As we look ahead, the integration of data and AI platforms will only become more critical. The organizations that recognize this connection and build accordingly will have significant advantages over those that continue to treat data and AI as separate initiatives.
The question isn't whether you should integrate your data and AI strategies — it's how quickly you can make it happen. The foundation you build today will determine what's possible tomorrow.
[CTA_MODULE]