Key Takeaways
- Snowflake Intelligence performs well out of the box, but a few strategic moves can take it over the top.
- Start with data readiness: clean, label, and structure your data before adding AI.
- Treat your AI tool like a product with a roadmap, phases, and a vision.
- Invest heavily in knowledge management. It’s what makes an AI system sound like your business and what builds user trust.
- Use your users’ questions to guide continuous improvement; their interactions reveal knowledge gaps.
Starting With Data Readiness
The first step wasn’t about dashboards or interfaces. It was about data quality.
We ran a complete AI readiness assessment to understand how well our data was labeled, formatted, and cleansed of PII. From there, we built semantic models across major business units and adjusted parts of our data infrastructure to make it easier for AI to interpret.
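To make that concrete, here is a minimal sketch of the kind of check such an assessment involves: scan the catalog for undocumented columns and flag likely PII by name. The connection details, schema name, and regex below are illustrative, not our actual setup.

```python
# Readiness audit sketch: find undocumented columns and flag likely PII
# by column name. All names and patterns here are illustrative.
import re
import snowflake.connector

PII_PATTERN = re.compile(r"email|phone|ssn|birth|address", re.IGNORECASE)

conn = snowflake.connector.connect(
    account="your_account", user="your_user", password="your_password",
    database="ANALYTICS",
)
cur = conn.cursor()
cur.execute(
    """
    SELECT table_name, column_name, comment
    FROM information_schema.columns
    WHERE table_schema = 'SALES'
    """
)
for table, column, comment in cur.fetchall():
    if not comment:
        print(f"{table}.{column}: no description; the AI has nothing to go on")
    if PII_PATTERN.search(column):
        print(f"{table}.{column}: possible PII; review before exposing to the agent")
```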
It wasn’t glamorous work. But that investment in structured, well-defined data made everything downstream smoother. When we started testing Cortex, it immediately understood our business context — because we had set it up to.
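For reference, a semantic model in this setup is just a structured description of tables, columns, and metrics that the agent reads. Here is a minimal sketch, loosely following the shape of Snowflake's Cortex Analyst semantic model spec; the field names and business terms are illustrative, so check the current spec before relying on them.

```python
# Semantic model sketch: describe a table, its dimensions, and its
# measures so the AI can map business language onto SQL. Illustrative
# only; consult Snowflake's semantic model spec for the exact schema.
import yaml  # PyYAML

semantic_model = {
    "name": "sales",
    "tables": [{
        "name": "orders",
        "description": "One row per customer order.",
        "base_table": {"database": "ANALYTICS", "schema": "SALES", "table": "ORDERS"},
        "dimensions": [{
            "name": "region",
            "expr": "REGION",
            "data_type": "varchar",
            "synonyms": ["territory", "market"],  # business terms the AI should recognize
        }],
        "measures": [{
            "name": "total_revenue",
            "expr": "REVENUE",
            "data_type": "number",
            "default_aggregation": "sum",
        }],
    }],
}

print(yaml.safe_dump(semantic_model, sort_keys=False))
```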
Treating It Like a Product
We created a clear vision and roadmap with three phases: structured intelligence (data and models), unstructured intelligence (documents and transcripts), and agentic intelligence (automations and actions). Communicating that roadmap early helped us set expectations and secure long-term buy-in.
We got leadership buy-in during the planning phase, which gave the business a say in the design choices, communications management, and overall rollout. This made user adoption easy because it wasn’t just the Data team pushing the tool; business leaders were also urging their teams to take advantage of it.
Designing for Clarity and Transparency
By default, Snowflake Intelligence shows users the reasoning steps behind its answers. It’s a great feature, but only if you understand SQL.
Most business users don’t — so we customized how the agent explains its thinking. We instructed it to speak like a human analyst: clearly state which filters were applied, explain its assumptions in plain English, and describe how it interpreted the question.
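The fix itself was mostly prose, not code. Something like the following response instructions, supplied when configuring the agent, is all it takes; the wording here is a sketch, not our production prompt.

```python
# Illustrative agent response instructions. The work is done by the
# plain-English text itself; this wording is a sketch, not our
# production prompt.
RESPONSE_INSTRUCTIONS = """
Explain your reasoning like a human analyst, not a SQL engine:
- State which filters you applied (date ranges, segments, regions) in plain English.
- Call out any assumptions you made about ambiguous terms.
- Describe how you interpreted the question before presenting numbers.
- Do not show raw SQL unless the user explicitly asks for it.
"""
```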
This change had an immediate effect. People trusted the system faster because they could see how it worked and correct it if something was off.
Developing and Integrating Knowledge Management
We quickly realized that even the most advanced AI tools are only as good as the context they have. So we treated knowledge management as part of the core build — not an afterthought.
We started by collecting scattered institutional knowledge and turning it into structured, searchable assets. The most transformative was a Sales Playbook that outlined our selling strategies, pricing structures, customer segments, and objection handling. Once Cortex had access to that, it began generating insights and recommendations that actually sounded like us.
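As a sketch of what "searchable" means in practice: once a document like the playbook is chunked into a table, a Cortex Search service can index it for the agent to retrieve. The table, warehouse, and service names below are ours for illustration, and the DDL should be verified against current Snowflake docs.

```python
# Sketch: index Sales Playbook chunks with a Cortex Search service so
# the agent can retrieve them. Table, warehouse, and service names are
# illustrative; verify the DDL against current Snowflake docs.
import snowflake.connector

conn = snowflake.connector.connect(
    account="your_account", user="your_user", password="your_password",
)
conn.cursor().execute(
    """
    CREATE OR REPLACE CORTEX SEARCH SERVICE knowledge.docs.sales_playbook_search
      ON chunk
      ATTRIBUTES section
      WAREHOUSE = compute_wh
      TARGET_LAG = '1 day'
      AS (
        SELECT chunk, section, source_document
        FROM knowledge.docs.sales_playbook_chunks
      )
    """
)
```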
Over time, we turned this into a full integration process. We took real user questions, analyzed where the AI struggled, and then asked it directly: “What knowledge gaps caused this?” Its responses guided us toward missing documentation or unclear definitions. From there, we either found existing resources or created new ones — FAQs, how-to guides, taxonomy sheets — to fill the gaps.
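That step is easy to automate. A rough sketch, assuming a table where missed questions get logged (the feedback table, its columns, and the model name are all illustrative):

```python
# Gap-analysis loop sketch: replay questions the agent handled poorly
# and ask an LLM what knowledge was missing. The feedback table and
# model name are illustrative.
import snowflake.connector

conn = snowflake.connector.connect(
    account="your_account", user="your_user", password="your_password",
)
cur = conn.cursor()
cur.execute("SELECT question, answer FROM ai_feedback.public.missed_questions")

for question, answer in cur.fetchall():
    prompt = (
        "A data assistant answered this question poorly.\n"
        f"Question: {question}\nAnswer: {answer}\n"
        "What missing documentation or unclear definitions most likely "
        "caused this? Answer in two sentences."
    )
    cur.execute("SELECT SNOWFLAKE.CORTEX.COMPLETE('mistral-large', %s)", (prompt,))
    print(cur.fetchone()[0])
```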
That loop became self-reinforcing: users asked better questions, the AI produced better answers, and our documentation grew more complete. What started as a data initiative quietly became a company-wide knowledge improvement engine.
Looking Back
Building Snowflake Intelligence at Puffco wasn’t just about keeping up with the hype; it was about future-proofing how we use data.
If you’re thinking about building something similar, start simple: get your data clean, document what your business knows, and treat AI like a product that needs iteration, not immediate perfection.
Listen to your users. Their questions are your roadmap. Every time someone asks something the AI can’t answer, it’s not project failure — it’s feedback.