We caught up with Analytics Engineer and writer Madison Schott for a no-fluff conversation on what “AI-ready data” really means in the real world. From hidden blockers and AI hype to how data teams are actually using these tools—Madison doesn’t hold back.
Introduce yourself!
I’m Madison Schott, an Analytics Engineer at Kit and the writer of the Learn Analytics Engineering newsletter. I share weekly tips and resources on analytics engineering. I also post daily on LinkedIn. I’ve been in this field for almost five years—started out as a data engineer at Capital One. I studied business in college, so analytics engineering is kind of the perfect intersection between business context and the technical depth I gained as a former data engineer.
Are you using LLMs or natural language interfaces on top of your data stack?
Honestly, no. Most companies I’ve worked with aren’t even close to ready for that. They’re still trying to nail the basics, like getting metric logic out of BI tools and into consistent data models. AI can’t help if your foundation is a mess.
What’s the biggest blocker to implementing AI?
Foundational work. At smaller companies, teams often skip the groundwork. There’s a lot of logic written ad hoc in BI tools with very little vetting. But at some point, as these companies begin to scale and the pressure to be data-driven increases, that foundational gap becomes a real blocker. You can’t use AI on data that lacks context.
What does “AI-ready data” mean to you?
Three things:
- Core models – Well-built, validated, and cleaned.
- Documentation – Not just technical, but understandable by business users.
- Processes – Business users must know how to use the data properly.
If any of those are missing, AI is just going to guess, and that’s risky.
Where do teams underestimate the prep work?
Everyone underestimates how long it takes to build solid, trustworthy data models. People think it’ll take weeks. It takes months. You have to understand what’s being used, what’s valuable, and gather all the business context. That means talking to stakeholders, figuring out their real needs, and translating that into reliable models. It’s not fast, but it’s necessary.
Should all metrics live in dbt?
Not always. If you don’t have a semantic layer, some logic will live in BI tools. But your dbt models should make it easy to reuse logic without rewriting it every time. Don’t let analysts guess which field to use.
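To make that concrete, here is a minimal sketch of the kind of reusable dbt model she's describing; the model, source, and column names (fct_orders, stg_orders, net_revenue) are hypothetical, not from any specific stack:

```sql
-- models/marts/fct_orders.sql (illustrative names)
-- One vetted definition of "net revenue" that every dashboard reuses,
-- instead of each analyst re-deriving it inside the BI tool.
with orders as (

    select * from {{ ref('stg_orders') }}

),

final as (

    select
        order_id,
        customer_id,
        order_date,
        -- the agreed-upon revenue definition lives here, not in BI formulas
        gross_amount - discount_amount - refund_amount as net_revenue
    from orders

)

select * from final
```

Analysts then select net_revenue from this one model rather than guessing which raw field to aggregate.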
When, then, should logic move from the BI tool to a governed model?
Start by checking dashboard usage. No point in modeling unused reports. Focus on:
- Most-used dashboards – logic that shows up in the dashboards people use most is a strong signal it's worth moving upstream into a governed model.
- Repeated logic across dashboards
- Complexity – if something requires a 15-table join every time, move that to dbt.
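For the complexity case in particular, here is a sketch of what "move that to dbt" can look like; the table and column names (stg_subscriptions, dim_customers, stg_payments) are hypothetical:

```sql
-- models/marts/fct_subscription_activity.sql (illustrative names)
-- A join that analysts previously rebuilt in every dashboard,
-- moved upstream so BI tools query one governed table instead.
with subscriptions as (
    select * from {{ ref('stg_subscriptions') }}
),

customers as (
    select * from {{ ref('dim_customers') }}
),

payments as (
    select * from {{ ref('stg_payments') }}
)

select
    s.subscription_id,
    c.customer_id,
    c.customer_segment,
    s.plan_name,
    s.started_at,
    sum(p.amount) as lifetime_payments
from subscriptions as s
inner join customers as c
    on s.customer_id = c.customer_id
left join payments as p
    on p.subscription_id = s.subscription_id
group by 1, 2, 3, 4, 5
```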
One common AI use case is dashboard search, but AI needs to know which dashboards are actually trusted. What makes a dashboard “trusted”?
If it’s not good enough for AI, it shouldn’t be in your stack at all.
Data teams need to stop hoarding dashboards. Keep the ones that are used, maintained, and based on governed definitions. And if there’s already a lot of clutter, that’s a strong sign it’s time to run a decluttering initiative. Don’t throw AI on top of chaos. Clean up, enforce ownership, and remove dead dashboards. Only then should you give AI access. Otherwise, you’re just training it to trust bad data.
What’s your approach to managing dashboard sprawl?
If no one touches a dashboard in a month? Delete it. If someone complains, great—you just found out it’s still needed. Otherwise, good riddance.
AI-ready buzzword that’s more hype than help?
“AI Engineer.” I saw a job post recently and realized it was just… a software engineer. Is the expectation that they’ll just prompt ChatGPT all day? Feels like the “Analytics Engineer” hype cycle from five years ago, just with a shinier label.
How do you actually use AI today?
As a tool, not a replacement. I use it to:
- Write dbt model templates (see the sketch at the end of this answer)
- Help with PR reviews (especially explaining someone else’s SQL)
- Look up syntax or logic patterns quickly
- Document source models by pulling API docs
But if AI can write your entire documentation? Your docs probably aren’t good enough. Real documentation should reflect business context—not just schema descriptions.
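On the "dbt model templates" point above, here is a minimal sketch of the kind of staging template AI can draft quickly; the source and column names (a stripe source with a charges table) are hypothetical:

```sql
-- models/staging/stripe/stg_stripe__charges.sql (hypothetical source)
-- A typical staging template: rename, cast, and lightly clean a raw source
-- so downstream marts never touch the raw table directly.
with source as (

    select * from {{ source('stripe', 'charges') }}

),

renamed as (

    select
        id as charge_id,
        customer as customer_id,
        amount / 100.0 as amount_usd,
        status,
        created as created_at
    from source

)

select * from renamed
```

The business context (why a charge is excluded from revenue, how refunds are handled) still has to come from the team, which is exactly her point about documentation.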
Want to get your data AI-ready?
If you’re facing cluttered data environments, duplicated logic, and inconsistent definitions, now is the time to act. Euno helps data teams declutter their environment, create a source of truth for metrics, and certify trusted dashboards, so when AI enters the picture, it’s pulling from reliable, governed data. Deploy AI today with Euno and watch it get smarter over time.
Learn more here.