During their first experiments with LLMs and agentic AI, Notion’s software engineers applied advanced code generation, complex schemas, and heavy instructions.
But trial and error quickly taught the team that they could drop all of that complicated data modeling. Ryan Nystrom, Notion’s head of AI engineering, and his team turned instead to simple prompts, human-readable representations, minimal abstraction, and familiar markdown formats. The result was a considerable improvement in model performance.
Applying this revamped approach, the AI-native company released V3 of its productivity software in September. Its standout feature: customizable AI agents, which quickly became Notion’s most powerful AI tool to date. Based on usage patterns compared with previous versions, Nystrom calls this far more than an “incremental feature improvement.”
“It’s that feeling of having the product pulled away from you rather than having to push it,” Nystrom explains on the VB podcast Beyond the Pilot. “We knew from that moment, very early on, that we had something. Now we’re asking ourselves, ‘How could I use Notion without this feature?’”
As a traditional software engineer, Nystrom was accustomed to “extremely deterministic” systems. But a lightbulb moment came when a colleague advised him to simply write his AI prompts the way he would describe the task to a human, rather than codifying rules for how agents should behave in every scenario. The rationale: LLMs are designed to understand, “see,” and reason about content much the way humans do.
“Now, every time I work with AI, I reread the prompts and tool descriptions and [ask myself], is this something I could give to a person without context and they could understand what’s going on?” Nystrom said on the podcast. “Otherwise, it will do a bad job.”
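To make that test concrete, here is a minimal Python sketch (not Notion’s actual prompts; the config keys and prompt text are invented for illustration) contrasting behavior codified as structured rules with the same intent written the way you would brief a colleague:

```python
# A minimal sketch (not Notion's actual prompts): the same intent expressed as
# structured rules versus a brief you could hand to a colleague.
import json

# Harder for the model: behavior codified as nested rule objects.
RULE_BASED_CONFIG = {
    "on_ambiguous_request": {"action": "ask_clarifying_question", "max_follow_ups": 2},
    "on_page_edit": {"require_confirmation": True, "diff_format": "json_patch"},
}

# Easier for the model: the same intent, written for a human reader.
HUMAN_READABLE_PROMPT = """\
You help people edit their Notion pages.
If a request is ambiguous, ask at most two clarifying questions before acting.
Before changing a page, show the user what you plan to change and wait for an OK.
"""

def readability_check(prompt: str) -> bool:
    """Rough proxy for Nystrom's test: could a person with no context follow this?"""
    return not any(token in prompt for token in ("{", "}", "<", ">"))

print(readability_check(HUMAN_READABLE_PROMPT))          # True
print(readability_check(json.dumps(RULE_BASED_CONFIG)))  # False
```

The check is only a crude proxy, but it captures the spirit of rereading every prompt and tool description as if a stranger had to act on it.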
Moving away from a “pretty complicated rendering” of data in Notion (like JSON or XML), Nystrom and his team represented Notion pages as markdown, the popular, device-agnostic markup language that conveys structure and meaning in plain text without HTML tags or formal editors. This lets the model read, search, and make changes to pages as if they were ordinary text files.
Ultimately, this required Notion to rewire its systems, with Nystrom’s team focusing largely on the middleware translation layer.
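As a rough illustration of what such a translation layer does, here is a short Python sketch that flattens a made-up block structure (not Notion’s real data model) into markdown the model can read and edit directly:

```python
# A rough sketch with a made-up block schema (not Notion's real data model):
# flatten a page into markdown the model can read, search, and edit as plain text.

page = {
    "title": "Q3 Launch Plan",
    "blocks": [
        {"type": "heading", "level": 2, "text": "Goals"},
        {"type": "bullet", "text": "Ship agents to every workspace"},
        {"type": "paragraph", "text": "Owner: launch team."},
    ],
}

def to_markdown(page: dict) -> str:
    """Render a block tree as markdown: a title heading, then headings, bullets, paragraphs."""
    lines = [f"# {page['title']}", ""]
    for block in page["blocks"]:
        if block["type"] == "heading":
            lines.append("#" * block["level"] + " " + block["text"])
        elif block["type"] == "bullet":
            lines.append(f"- {block['text']}")
        else:
            lines.append(block["text"])
    return "\n".join(lines)

print(to_markdown(page))
```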
They also identified early on the importance of exercising restraint with context. It’s tempting to load as much information as possible into a model, but doing so can slow things down and confuse the model. For Notion, Nystrom described 100,000 to 150,000 tokens as the “sweet spot.”
“There are cases where you can load tons and tons of content into your context window and the model will struggle,” he said. “The more you put in the context window, the more you see degradation in performance, latency and also accuracy.”
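A simple way to enforce that restraint is a hard context budget. The sketch below is an assumption-laden illustration: the four-characters-per-token estimate is a crude heuristic, and the 120,000-token cap is just one value inside the 100,000-to-150,000 range Nystrom cites:

```python
# A minimal sketch of enforcing a context budget; numbers and heuristics are illustrative.

MAX_TOKENS = 120_000

def estimate_tokens(text: str) -> int:
    """Very rough heuristic: roughly four characters per token for English prose."""
    return len(text) // 4

def pack_context(chunks: list[str], budget: int = MAX_TOKENS) -> list[str]:
    """Add chunks (assumed pre-ranked by relevance) until the budget is hit,
    instead of stuffing everything in and degrading latency and accuracy."""
    selected: list[str] = []
    used = 0
    for chunk in chunks:
        cost = estimate_tokens(chunk)
        if used + cost > budget:
            break
        selected.append(chunk)
        used += cost
    return selected
```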
A spartan approach also matters for tooling; it can help teams avoid the “slippery slope” of infinite features, Nystrom advised. Notion focuses on a “curated menu” of tools rather than a sprawling, Cheesecake Factory-style menu that creates a paradox of choice for users.
“When people ask for new features, we can simply add a tool to the model or agent,” he said. But “the more tools we add, the more decisions the model has to make.”
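One way to keep that menu curated is to make the tool registry itself enforce a cap, so adding a tool is an explicit trade-off rather than a free action. The sketch below is hypothetical; the tool names and the limit of eight are not Notion’s:

```python
# A hypothetical curated tool menu: adding a tool is cheap for engineers but adds
# a decision for the model, so cap the menu instead of letting it grow without bound.
from typing import Callable

TOOLS: dict[str, Callable[..., str]] = {}

def register_tool(name: str, fn: Callable[..., str], limit: int = 8) -> None:
    """Register a tool, refusing once the menu is full so the team must retire one first."""
    if len(TOOLS) >= limit:
        raise ValueError(f"Tool menu is full ({limit}); retire a tool before adding {name!r}.")
    TOOLS[name] = fn

register_tool("search_pages", lambda query: f"results for {query!r}")
register_tool("edit_page", lambda page_id, markdown: f"updated {page_id}")
```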
The bottom line: Channel the model. Use APIs as they are intended to be used. Don’t try to be fancy, don’t try to overcomplicate things. Use simple English.
Listen to the full podcast to learn more about:
Why AI is still in its pre-BlackBerry, pre-iPhone era;
The importance of dogfooding in product development;
Why you shouldn’t worry about the profitability of your AI functionality in the early stages – this can be optimized later;
How engineering teams can keep tools minimal in the MCP era;
The evolution of Notion, from wikis to full-fledged AI assistants.
Subscribe to Beyond the Pilot on Apple Podcasts, Spotify, and YouTube.