Intuition Before Data
One comes before the other
Intuition is often misunderstood as simply “feeling” something out. This is largely a problem of definition. Poorly defined intuition is indeed “feeling” something out, but well-defined intuition is closer to a highly tuned theory. At its best, well-defined intuition can be articulated and evaluated; it is the culmination of theories we latently hold from previous experience and draw on to solve problems.
And just as theory must come before an experiment, intuition must come before the data. Otherwise, interpreting that data is useless at best and misleading at worst. Let’s ignore the epistemology squabbles — I don’t want to get into the problem of induction or how knowledge is created — and focus on the practical nature of developing products. One of the hardest things for a product manager to develop is product intuition. There’s a reason it’s such a common interview category: having theories about why one product works is evidence you can develop working solutions elsewhere.
Looking at data should be, in the simplest sense, an exercise in invalidation. Invalidation of what? Your theory!
I had to learn this lesson myself the hard way.
And I learned it at one of the most data-driven companies out there. At the time, I was leading a very technical initiative (one of those “AI” things) and was looking for opportunities to move the needle for a particular model. I discovered our ads product. I played around with it briefly and chatted with the product and engineering teams. I learned the product was relatively new, but demand was growing dramatically and we were supply constrained. I tried running a few ads myself. It was simple enough: you selected criteria from a set of available fields to build up a target audience, created your ad, and set your budget. Done. I could instantly see how expanding coverage here could be a money-maker.
I then pulled data on popular fields and bingo — I found my target: a field that had everything I was looking for — extremely high usage (the second most utilized field) and a limited baseline model (covering only 40% of eligible supply). I staffed up a team, quickly wrote a product document, and a quarter later reviewed the very nice results from the team’s latest model: same accuracy, dramatically improved coverage. We were ready to test.
Needless to say, I expected this model to crush it. Imagine my surprise when a week into the A/B test, I got back the most negative A/B test of my professional career. Every single metric was red. Statistically significant.
I stopped the test and asked the lead engineer to dive into the results. I suspected a bug, so we started there. Nothing — everything was working as intended. Weird. Maybe something about how the data was used, then? We sat down with the ads team. They couldn’t think of anything specific but gave us access to their logs. A few days later, the lead engineer flagged me down as I walked by: he had found the issue. I immediately cancelled the meeting I was headed to and sat down. What was it? A cohort we didn’t account for? A bug in the ads system we fed into, where we had no visibility?
No. It turns out the field I had selected was the most popular negative attribute. That is, when advertisers built an audience, they selected for supply NOT in this field. By increasing the field’s coverage, I hadn’t increased the supply; I had dramatically reduced it!
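The mechanics are easy to sketch. Here is a toy illustration (all names and numbers are invented, not the real ads system) of why expanding a field’s coverage grows a positively targeted audience but shrinks a negatively targeted one:

```python
# Toy illustration of positive vs. negative targeting.
# Every name and number here is made up for this sketch.

supply = set(range(1000))        # 1,000 items of eligible ad supply

def audience(field_members, negative=True):
    # Negative targeting: the advertiser wants supply NOT in the field.
    # Positive targeting: the advertiser wants supply IN the field.
    return supply - field_members if negative else supply & field_members

baseline = set(range(400))       # old model labels 40% of supply
expanded = set(range(900))       # new model labels 90% of supply

# Positive targeting: more coverage, bigger audience.
print(len(audience(baseline, negative=False)))  # 400
print(len(audience(expanded, negative=False)))  # 900

# Negative targeting: more coverage, *smaller* audience.
print(len(audience(baseline)))  # 600
print(len(audience(expanded)))  # 100
```

Under negative targeting, every item the model newly labels is an item subtracted from the advertiser’s audience, so better coverage directly cannibalizes supply.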
Of course. I had gone into the entire project without bothering to learn how advertisers actually built audiences. Yes, I played around with the UI, talked to experts, and ran the data, but I never understood that both positive and negative targeting were essential. I had “validated” the investment with data without actually developing the appropriate user intuition. I made all the right motions, but failed at the one thing that mattered: truly understanding the problem space upfront. Had I understood it, a simple cut in my data analysis segmenting field usage by positive versus negative targeting would have led to better decisions.
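That kind of cut is cheap to run. A minimal sketch, assuming the audience builder logs each field along with whether it was used to include or exclude supply (the log format and field names below are hypothetical):

```python
# Hypothetical diagnostic cut: tally each field's usage split by whether
# advertisers used it as a positive (include) or negative (exclude) criterion.
# The log rows and field names are invented for illustration.
from collections import Counter

audience_logs = [
    {"field": "interest", "mode": "negative"},
    {"field": "interest", "mode": "negative"},
    {"field": "interest", "mode": "positive"},
    {"field": "region",   "mode": "positive"},
    {"field": "region",   "mode": "positive"},
]

usage = Counter((row["field"], row["mode"]) for row in audience_logs)
for field in sorted({row["field"] for row in audience_logs}):
    pos = usage[(field, "positive")]
    neg = usage[(field, "negative")]
    print(f"{field}: {pos} positive, {neg} negative")
# prints:
# interest: 1 positive, 2 negative
# region: 2 positive, 0 negative
```

A field used mostly as an exclusion is a field where increasing model coverage shrinks advertisers’ audiences, which is exactly the signal this project needed upfront.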
Remember that order matters: intuition must come before data.