Developing a Product Process for Notion
This is my final assignment from Stanford GSB's iconic Product/Market Fit (PMF) class, taught by Andy Rachleff. The course challenged us to rethink how great products are built - not by chasing flawless execution, but by obsessively hunting for a market that's truly desperate for what you're offering. Here, I lay out my personal product development process.
For this assignment, I am assuming the role of Chief Product Officer at a consumer software company similar to Notion - a place I would love to join post-GSB. To ground the process, I will walk through the launch of a hypothetical product, "Notion Knowledge Agents," which lets users turn parts of their Notion workspace into AI-driven knowledge bases that others can pay to access.
1. Idea Generation & Prioritization
A. Authentic Sourcing
We begin by ensuring every proposed insight is championed by someone with a tangible, authentic connection to the problem space. Authenticity means:
- Personal Experience: The champion has directly felt the pain point or repeatedly seen it in their environment.
- Domain Expertise: They understand the relevant technology and user personas deeply, which often hints they have an "unfair" perspective.
- Genuine Enthusiasm: They are mission-driven rather than mercenary. This passion helps sustain momentum through inevitable challenges.
Suppose the PM lead from the Templates team has observed that many creators already sell specialized Notion templates. Inspired by recent breakthroughs in LLMs, she proposes: "What if we enabled these creators to train AI on their personal knowledge, then monetize access to it?" Her authenticity stems from seeing creators attempt (and struggle) to package and sell unique expertise beyond just static templates.
B. Tech Inflection (Ideally Non-Consensus)
This draws on Howard Marks, who argues that being non-consensus and right is the only way to generate superior returns in investing.
We focus on technology-based inflection points because these shifts tend to create brand-new product possibilities.
- If the champion's proposal is purely reliant on a social or behavioral trend without underlying technological change, I'm cautious about devoting resources to a fully new product.
- Conversely, if the technology (e.g., powerful LLMs) is at a tipping point, we see an opening to create something truly distinctive.
C. Consensus Filtering & Disruption Check
As CPO, I gather one engineering leader, one business leader, and one design leader for quick one-on-one discussions rather than a group meeting (to minimize groupthink).
- If all three rave about it, the idea may be too consensus to offer outsized upside. If at least one or two are intrigued while another is dubious, that's typically a green light for deeper testing.
Time Check: This initial vetting usually spans 1–2 weeks of discussions.
We also give special weight to ideas showing signs of disruptive potential (new-market or low-end). According to Clayton Christensen, disruptive innovation describes a process by which a product or service takes root in simple applications at the bottom of the market - typically by being less expensive and more accessible - and then relentlessly moves upmarket, eventually displacing established competitors. If it's possible to serve an underserved or non-consuming audience in a simpler, cheaper, or more convenient way, that's a significant differentiator.
2. Small, Focused Team Formation
Once the idea clears the initial filters, I form a lean, cross-functional "insight team." Typically, it includes:
- The Champion (who has authenticity and passion),
- One Engineer (capable of quick prototypes and feasibility checks),
- One Business/Go-to-Market lead (to shape pricing models and market experiments).
This team's mandate is to learn swiftly rather than execute a predefined plan.
We invite the PM Lead (original insight generator), a senior engineer familiar with LLM integration, and a Notion marketing lead who understands how to position "Knowledge Agents" to the existing user base.
3. Articulating the Value Hypothesis
A. Building a Short, Structured Brief
At this stage, the team documents a concise 1–2 page overview:
- What? Identify the core job to be done and how the product addresses it.
- Who? Pinpoint a narrow initial user segment. In a consumer context, this might involve demographics or a specific user persona. Importantly, aim for early adopters who are desperate enough to try incomplete solutions. Need is insufficient; desperation is key.
- How? Propose a working business model that might serve as the initial monetization approach. The question is: "How do we charge?" - subscription, transaction fee, usage fee, or otherwise?
B. Leap-of-Faith Assumption
They then clarify the single biggest assumption that has to prove true for the product to succeed. Often, it's something about user willingness to pay, or consistent usage patterns. If this assumption fails, the entire idea collapses.
- Brief: "Let advanced creators train an AI agent on their Notion workspace to package and sell specialized knowledge."
- Who: We start with top-tier template sellers who already have paying customers. They are likely early adopters open to new revenue streams.
- How: A subscription model where an external user pays $10 per month to query the AI. Part of that revenue goes to the creator and part to Notion (a rough split calculation is sketched after this list).
- Leap-of-Faith: "People will pay real money on a recurring basis to tap into curated AI-driven knowledge from recognized experts."
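To make the subscription math concrete, here is a minimal sketch of how a monthly revenue split might be computed. Only the $10 price comes from the brief above; the 80/20 creator/Notion split, the payment-processing fee, and the subscriber count are illustrative assumptions.

```python
# Hypothetical revenue-split math for a $10/month Knowledge Agent subscription.
# The split percentage, processing fee, and subscriber count are assumptions
# for illustration only; the $10 price is the one from the brief.

MONTHLY_PRICE = 10.00          # price an external user pays per month
CREATOR_SHARE = 0.80           # assumed: creator keeps 80% of net revenue
PROCESSING_FEE_RATE = 0.03     # assumed: payment-processing cost per transaction

def monthly_split(subscribers: int) -> dict:
    """Return gross revenue and the creator/Notion split for one agent."""
    gross = subscribers * MONTHLY_PRICE
    net = gross * (1 - PROCESSING_FEE_RATE)
    creator = net * CREATOR_SHARE
    notion = net - creator
    return {"gross": gross, "creator": round(creator, 2), "notion": round(notion, 2)}

# Example: a creator with 150 paying subscribers.
print(monthly_split(150))
# -> {'gross': 1500.0, 'creator': 1164.0, 'notion': 291.0}
```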
4. Concept Test vs. Implementation Phase
At this point, the team needs to decide whether to build a quick MVP or start with a concept test - depending on feasibility and time/cost constraints.
A. The Concept Test (If MVP Is Too Costly or Time-Consuming)
- Typically consists of a landing page or a short video pitch. We use signups, waitlists, or pre-order deposit requests to gauge real interest.
- If user engagement is weak, we might change the "who" or refine the messaging. This phase often lasts about two weeks.
B. The Implementation Phase
- This might involve building out a "concierge" approach or partial functionality that manually simulates some parts of the product to gather data.
- If our engineer says, "We can spin up a rudimentary system in under a month," we proceed with a short Implementation Phase.
- Alternatively, if it's more complex, we do a quick concept test: a demo video showing how "Knowledge Agents" might look and a waitlist sign-up. We share it with advanced creators, monitor conversions (see the go/no-go sketch after this list), and watch for early excitement or apathy.
- If the signups or feedback are abysmal, we either pivot the target segment or kill the project.
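As a rough illustration of how we might turn concept-test results into a go/no-go decision, the sketch below compares waitlist conversion against a pre-agreed threshold. The 5% conversion bar and the visitor/signup numbers are assumptions chosen for the example, not validated benchmarks.

```python
# Hypothetical go/no-go check for the "Knowledge Agents" concept test.
# The 5% signup-conversion threshold is an assumed bar agreed on before the
# test starts, not an established benchmark.

SIGNUP_CONVERSION_BAR = 0.05   # assumed minimum: 5% of visitors join the waitlist

def concept_test_verdict(visitors: int, signups: int) -> str:
    """Compare landing-page conversion against the pre-agreed threshold."""
    if visitors == 0:
        return "no data - keep the test running"
    conversion = signups / visitors
    if conversion >= SIGNUP_CONVERSION_BAR:
        return f"proceed to MVP ({conversion:.1%} conversion)"
    return f"pivot the 'who' or messaging, or kill ({conversion:.1%} conversion)"

# Example: 2,000 advanced creators saw the demo video, 40 joined the waitlist.
print(concept_test_verdict(2000, 40))   # -> pivot/kill verdict at 2.0% conversion
```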
5. MVP Build & Early Rollout
A. Building a Bare-Bones MVP
If the concept resonates, we proceed to a bare-bones MVP that delivers only the critical features needed to validate the leap-of-faith assumption. We keep scope tight to avoid wasted effort. The champion and engineer collaborate to prioritize must-have functionality, focusing on the key job to be done.
B. Targeted Rollout & Real Usage
We launch the MVP to a small set of potential early adopters - ideally, a curated group that's predisposed to test new features. We then observe behavioral metrics (a simple retention check is sketched after the list below):
- Retention: Do testers continue to use or pay for the product once initial novelty wears off?
- Referrals: Are early adopters spontaneously recommending it to peers (organic word-of-mouth)?
- Willingness to Pay: Do they subscribe, and do they keep paying beyond the first billing cycle?
- The MVP includes: (1) a minimal interface to upload or tag certain Notion pages for AI training, (2) a basic subscription payment flow, and (3) a barebones query UI.
- We select a handful of top-tier creators from our waitlist. They integrate their content, set a price, and share the link. We watch if actual consumers pay or if it all fizzles out.
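Here is a minimal sketch of the kind of week-over-week retention check we would run on the pilot cohort. The (user, week) event format and the sample data are assumptions for illustration; in practice the inputs would come from real usage logs.

```python
# Minimal week-over-week retention check for the MVP pilot.
# The (user_id, week_index) event format and sample data are assumptions;
# real inputs would come from product usage logs.

from collections import defaultdict

def weekly_retention(events: list[tuple[str, int]]) -> dict[int, float]:
    """Share of week-0 users who are still active in each later week."""
    active_by_week = defaultdict(set)
    for user, week in events:
        active_by_week[week].add(user)
    cohort = active_by_week[0]
    if not cohort:
        return {}
    return {week: len(users & cohort) / len(cohort)
            for week, users in sorted(active_by_week.items())}

# Example: three pilot consumers; one churns after week 1, another after week 2.
sample = [("a", 0), ("b", 0), ("c", 0), ("a", 1), ("b", 1), ("a", 2)]
print(weekly_retention(sample))   # -> {0: 1.0, 1: 0.67, 2: 0.33} (approximately)
```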
6. Implementation & Iteration
A. Daily Tracking & User Interviews
- The insight team checks usage logs daily, looking for drop-off points (a sample drop-off scan is sketched after this list).
- They also conduct short user interviews or "follow me home" sessions to understand friction. They might incorporate the "5 Whys" method to dig deeper into user complaints or confusion.
- Savoring the Surprise: Actively look for unexpected pockets of strong usage or novel use cases. If a user group you never considered loves the product, double down on that audience instead of trying to fix what isn't working or chasing an indifferent group. This "surprise" often reveals the real desperation or best fit - embrace it even if it changes initial assumptions.
- Detailed Logging: Keep records of user comments, usage spikes, or friction episodes. Surprises typically show up as anomalies or unusual patterns in user behavior.
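A sketch of what that daily drop-off scan might look like is below. The funnel step names and counts are hypothetical; the real funnel would be defined by the product's analytics events.

```python
# Sketch of a daily drop-off scan over a hypothetical Knowledge Agents funnel.
# The step names and counts are illustrative assumptions, not real analytics.

FUNNEL_STEPS = ["opened_agent_setup", "selected_pages", "published_agent", "first_paid_query"]

def drop_off_report(step_counts: dict[str, int]) -> None:
    """Print how many users continue between consecutive funnel steps."""
    for prev, curr in zip(FUNNEL_STEPS, FUNNEL_STEPS[1:]):
        started, finished = step_counts.get(prev, 0), step_counts.get(curr, 0)
        rate = finished / started if started else 0.0
        print(f"{prev} -> {curr}: {finished}/{started} ({rate:.0%} continue)")

# Example day: most creators stall between selecting pages and publishing,
# which points the user interviews at the AI training/configuration step.
drop_off_report({
    "opened_agent_setup": 120,
    "selected_pages": 95,
    "published_agent": 30,
    "first_paid_query": 12,
})
```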
B. Adapting the "Who"
- If advanced creators turn out to be lukewarm, we might pivot to a different user group (e.g., coaches, domain experts, or small businesses).
- We do not rewrite the entire product's "what" at this point; we just reexamine the target segment or distribution strategy.
- After hearing complaints about the training process, we spend a few weeks making it easier for a creator to configure the AI.
- If we notice only a niche group of professional consultants adopting it, we might pivot the "who" to consultants specifically, offering deeper customization for them rather than chasing Notion template creators.
7. Determining Product-Market Fit
We give each new initiative a 3-month total runway to demonstrate meaningful traction:
A. The Hunt for Exponential Organic Growth
The clearest sign of product-market fit in a consumer or prosumer context is explosive growth driven by word-of-mouth and retention. No amount of paid acquisition can replicate the signal that customers themselves find the product valuable enough to tell others about it.
B. Behavioral & Revenue Metrics
We review:
- Retention: Are the same users returning consistently over multiple weeks or months?
- Referrals: How many net new users came from existing users? (A simple referral-ratio check is sketched after this list.)
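As one way to quantify "growth driven by word-of-mouth," the sketch below computes the share of a period's new users who were referred by existing users. The sample numbers and the 50% bar are illustrative assumptions, not official targets.

```python
# Rough word-of-mouth check: what share of this period's new users were
# referred by existing users? The sample numbers and the 50% bar are
# illustrative assumptions, not official targets.

def referral_ratio(new_users: int, referred_users: int) -> float:
    """Fraction of net new users who arrived via existing users."""
    return referred_users / new_users if new_users else 0.0

# Example month for the Knowledge Agents pilot.
ratio = referral_ratio(new_users=400, referred_users=260)
print(f"{ratio:.0%} of new users arrived via word of mouth")
if ratio >= 0.50:   # assumed bar for "word of mouth is carrying growth"
    print("organic growth signal - keep investing")
```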
C. Pivot or Stop
- If, after repeated iteration on the "who" or distribution model, usage remains tepid and there is no surprising bright spot, we shut down the project.
- If we see an enthusiastic, sticky group with high word-of-mouth, we double down on that market and expand with a bowling pin strategy. (In the technology adoption life cycle, the "bowling pin" strategy refers to targeting a specific market segment - the "head pin" - to gain traction, then leveraging that success to attract more customers - the other pins.)
- If usage among our pilot group remains strong, and some creators earn real revenue from their "Knowledge Agents," we see that as a product-market fit signal.
- If adoption remains flat despite pivoting to multiple audiences, we accept defeat, kill or pause the project, and document lessons learned.