Discovery is Not a Task
This week, I am back on AI. It can complete tasks, but are we sometimes cheating ourselves by taking this shortcut?
I was having a conversation about buyer personas the other day. It wasn’t with someone I had worked with, so I had no discovery to draw on, but I was asked, so, on the spot and from experience, I sketched out a rough, standard B2B buying group and their influencers, with the heavy “it depends” caveat of any seasoned professional.
We’d need to do some discovery.
I then got a document to review, on the surface a fully fleshed-out set of personas, laced with em dashes (yes, I know humans love an em dash too), but more tellingly, written in moments, clearly by an LLM.
The document was…
fiiiiiiine…
Arguably, the task of describing a set of B2B personas could be considered complete, but was that the task?
It felt a bit hollowed out. Not that this was cheating, but the process had been cheated. And being asked to review work that had required so little effort was weird - did it matter less than if I were reviewing a person’s carefully crafted baby?
I wondered whether this document therefore had less value to them and their business. Let’s face it, discovery documents are often slung into a digital drawer, never to be seen again. Still, I wondered: who would read it, and feel it, knowing it was the work of generative AI?
In the same way that we’ve discussed that the process of writing is one of learning (like I did in this post last year), it’s the same with discovery.
If a group of people collaborate on hammering something out - in this example, a set of personas - the feeling of inclusion in the process and ownership of the outcome is the task. We have a roomful of people who feel the work and are immersed in it.
That’s something you don’t get if we assign one member of the team to enter a prompt and then ask everyone to read the output. Yes, it’s more efficient, but discovery is not a simple tickbox task, and gaining knowledge is not one either.
I saw something float through my LinkedIn feed that maybe you’ve seen too (apologies, I can’t attribute it as I’ve lost it), which compares working with AI to baking a cake. It seems relevant here:
- You can create the recipe yourself and then make the cake yourself
- You can ask AI for the recipe and make the cake yourself
- You can ask AI for the recipe and then have the AI help you make the cake
- You can ask AI for a cake
And depending on the choices we make, when we then show up in the world and say, “here I made this cake” (to misquote a bit of Godin there) - what does that now mean, to the cake maker and the person being presented with the cake?
And to stay on the cooking theme: when the microwave was invented, restaurants didn’t chuck away all their cookers, fryers, and burners just because here was a quicker, more efficient, futuristic way to heat food.
Microwaves became a tool to be used, and yes, they are used more at the lower end of the market, where consumers want cheap and efficient, but I imagine every kitchen has one, however fancy.
I’m a frequent user of ChatGPT, so I am not anti-microwave, and I think we are all still figuring this out.
We are all choosing how synthetic we want our work to be.
We are all choosing which steps to skip in the process.
Should discovery and the knowledge we gain be one of them?