
Context is the product. And now it’s portable.

Mar 6, 2026
Jim Brown

I've spent my career in analytics engineering, and the most important aspect of the work I do isn't what most people think. It's not the dbt models, dashboards, or even the metrics themselves. The most important thing I build is business context around the data.

That means understanding the language of the business, learning the shape of the data, building relationships across departments, and absorbing each team's unique needs. Once I have all of that context, I can start to make an impact by shaping the data and the ways people interact with it to meet the needs of the business. That's the "work" of analytics engineering.

The problem, though, is all of that context has always been perishable. It lives in your head, in a Slack thread you'll never find again, and in the assumptions baked into a query that nobody documented. And when you switch projects, go on vacation, or hand something off to a teammate, you have to re-gather all the relevant information, retrace your steps, and rebuild the mental model you already had. Every analytics team knows this pain.

This “context re-gathering tax” quietly shapes a lot of analytics work and is often the reason certain improvements never make it out of the backlog. AI-native workflows are starting to change that.

Recently, I ran into an example in my own workflow of how that dynamic is changing.

A familiar scenario

We had a metric calculated in a one-off query in Hex. It worked fine for its original purpose, but then came a few too many urgent requests to slice it by a dimension that wasn't in the original query. Time to move it to the semantic layer.
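A toy sketch of why the move matters: a one-off query hard-codes its dimensions, while a single shared definition can be sliced by any dimension on demand. The rows, field names, and numbers below are invented for illustration, not the actual metric.

```python
# Hypothetical order rows standing in for the warehouse table the
# original Hex query ran against.
orders = [
    {"region": "us", "channel": "web", "amount": 120.0},
    {"region": "us", "channel": "sales", "amount": 300.0},
    {"region": "eu", "channel": "web", "amount": 80.0},
]

def revenue_by(dimension, rows=orders):
    """One metric definition, sliceable by any dimension in the data."""
    totals = {}
    for row in rows:
        key = row[dimension]
        totals[key] = totals.get(key, 0.0) + row["amount"]
    return totals

print(revenue_by("region"))   # the original slice
print(revenue_by("channel"))  # the urgent new slice, no query rewrite needed
```

In a real semantic layer the definition lives in the modeling layer rather than in application code, but the payoff is the same: define the metric once, and new slicing requests stop requiring new queries.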

I fed all the context — what the metric represented, how it was calculated, why it mattered — to our agent orchestration tool, Intent. I linked it to the Hex dashboard where the query lived, connected via MCP, and instructed it to incorporate the metric into the semantic layer.

While testing, I noticed some small deltas between the two calculations, enough to matter for a highly visible metric. I could see the root cause, but I decided not to make a call on the solution in a silo. Instead, I told the agent to create a Linear issue, also connected via MCP, so I could bring it to the team.
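The delta check itself is simple to sketch. Here is a minimal, hypothetical version: compare the metric's values from the old one-off query against the new semantic-layer values, and flag any dimension value where the relative difference exceeds a tolerance. The data and the threshold are illustrative, not from the actual incident.

```python
TOLERANCE = 0.005  # hypothetical alert line: 0.5% relative difference

def find_deltas(old, new, tolerance=TOLERANCE):
    """Return dimension values where the two calculations disagree."""
    deltas = {}
    for key in old.keys() | new.keys():
        a, b = old.get(key), new.get(key)
        if a is None or b is None:
            deltas[key] = (a, b)  # present in one source only
        elif abs(a - b) / max(abs(a), 1e-9) > tolerance:
            deltas[key] = (a, b)
    return deltas

# Toy values: the metric per region from each source
hex_query = {"us": 1200.0, "eu": 950.0, "apac": 430.0}
semantic_layer = {"us": 1200.0, "eu": 941.0, "apac": 430.0}

print(find_deltas(hex_query, semantic_layer))  # eu disagrees by ~0.9%
```

The useful part isn't the arithmetic; it's that the flagged keys and both values go straight into the ticket, so the evidence travels with the issue.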

Nothing revolutionary so far. Standard analytics engineering workflow, just with a different tool in the middle.

The part that changed everything

The team met the next day. We agreed on the path forward, but we almost backlogged it. Verification felt like a nightmare. Avoiding breaking changes on a metric this visible would require careful work, and we all had other priorities pulling at us.

A few months ago, I probably would have let it go to the backlog too. We had a solution that was functional, after all. Spending my time to see it through would have been a tough sell.

But unlike in my old workflow, now all I had to do was point a new agent to that Linear issue. The issue already contained the full context from my original working session, including the metric definition, the delta I found, and the root cause analysis. It had been updated with notes on how to proceed based on our team conversation. The agent coordinated all the work, and after some additional verification, it was done.

And that's when it clicked for me. The context survived. There was no "let me re-gather all the context and remember what I was thinking" tax.

It survived the handoff from my working session to a ticket. It survived the gap between Tuesday's discovery and Wednesday's team discussion. It survived the transition from one agent session to another. Anyone on our team could have picked it up, and any amount of time could have passed.

Context as a durable asset

For as long as I've worked in analytics, context has been the most valuable and most fragile thing on the team. Senior people carry enormous amounts of it. When they leave, it walks out the door. When they context-switch, it decays. We've tried to solve this with documentation, but documentation is a snapshot. It goes stale the moment you write it.

What I experienced in this scenario felt different. The context wasn't captured in a static doc that someone would need to interpret. It was embedded in a chain — working session to ticket to agent to resolution — and it stayed alive through each link. That chain became a transferable artifact, not just a record. And the pattern performed so well that we’ve started incorporating it into our regular workflow (Intent is in public beta if you want to try it for yourself).


Context becomes a portable artifact that survives across surfaces.

I still have the IDE open, but it's not the primary surface for interacting with our projects anymore. The ability to offload the execution gives me time and brain power to think further ahead, to focus on where to spend my expertise rather than on the mechanics of spending it.

What else belongs on the backlog?

That near-miss with the backlog keeps nagging at me. We almost shelved important work because the cost felt too high relative to the payoff. It’s a rational decision that analytics teams make every week. But the cost calculation was wrong.

We were estimating based on the old model, where someone would need to re-absorb all the context and do the careful, tedious verification work by hand. If context is portable and agents can coordinate the execution, a lot of those cost estimates need to be revisited.

How many data quality fixes, metric refinements, and schema improvements are sitting on your backlog right now — not because they aren't important, but because the person with the context moved on to something else?

I think the backlog is about to get a lot shorter. And I think that matters more than any individual productivity gain.

Written by

Jim Brown

Analytics Engineer

As a data and analytics engineer at Augment Code, Jim builds analytics systems and AI-native workflows that make business context easier to capture, share, and act on. He focuses on the intersection of analytics engineering, semantic modeling, and agent-driven tooling to explore how durable context can improve how teams build and use data. He studied at Columbia University.
