Hey everyone, it’s Kai! I’m the third and final co-founder of Prompt, here to take over the blog this week.
We’ve been talking to a lot of people about our startup over the past few weeks, but something is missing. Too many conversations end with people nodding in agreement to our pitch, and telling us, “I think you’re onto something!” Yet too few of these conversations result in real learnings where we feel like we’ve made some degree of progress.
I think a lot of it has to do with ego. As part of Next 36, a summer-long startup accelerator full of young, ambitious founders, we feel a lot of pressure to make the most of our summer. Internally, we feel like we are making big sacrifices to be here, so "it had better work."
So of course, when we talk to people about what we’re working on, it’s easy to fall into the routine of aiming to convince them that what we’re building will be a big thing. This pitfall might make more people like us, but it certainly isn’t leading to real learnings that test the weaknesses of our model of reality.
The conversations that lead to real learnings are the ones that we can only have if we’re not emotionally attached to our current solution. The ones where we don’t make an attempt to pitch why our solution is the best fit for a problem that people are having, but where we instead attempt to learn a lot about human behaviour.
Instead of asking someone if they can picture themselves using our product, we can focus on figuring out what they do by default (unprompted) when they experience a particular set of circumstances, and whether the different elements of our solution make sense given the constraints of those default behaviours.
The scariest thing about doing this is that it means we might learn that the idea we have in mind is actually not worth pursuing. It takes guts to constantly refine the model of reality that your startup depends on, because it involves putting yourself in situations where learning you're horribly wrong is a real potential outcome.
But it's worth it. Because once you arrive at an accurate model of reality, you get a shot at making a real impact on real people. And is that not the entire point of this whole journey?