A constant frustration for many in healthcare is the cognitive dissonance between the elegant, highly anticipated promise of technology solutions and the messy, lived complexity of clinical practice.
In this context, I was fascinated by–and feel compelled to share–this unexpectedly revealing excerpt from a recent (and, as always, captivating) a16z podcast, featuring a conversation between a16z co-founder Marc Andreessen and board partner Balaji Srinivasan, recorded at Stanford.
Following an extensive discussion of the factors associated with startup and VC success, as well as of emerging (or repeatedly re-emerging) trends such as artificial intelligence (AI), an audience member asked whether AI might select investments better than actual VCs–a VC version of the “will computers replace doctors?” gauntlet that tech VCs have thrown down before the medical establishment.
Andreessen’s response (at around the 40-minute mark) speaks for itself–but also, I’d argue, for most in healthcare (emphasis added):
The computer scientist in me and engineer in me would like to believe this is possible, and I’d like to be able to figure this out–frankly, I’d like us to figure it out.
The thing I keep running up against–the cognitive dissonance in my head I keep struggling with–is what I keep seeing in practice (the gap between theory and practice). In theory, you should be able to get the signals–founder background, progress against goals, customer satisfaction, whatever–you should be able to measure all these things.
What we find–what we deal with every day–is not numbers, is nothing you can quantify; it’s the idiosyncrasies of people, and under the pressure of a startup, people’s idiosyncrasies get magnified a thousandfold. People become the most extreme versions of themselves under the pressure of a startup, and that’s either to the good or to the bad, or both.
People have their own issues; there are interpersonal conflicts; so much of the day job is dealing with people that you’d have to have an AI bot that could, like, sit down and do founder therapy.
My guess is we’re still a ways off.
Who knew that developing data-driven tech solutions could be challenging in a profession that, at its core, is focused on human idiosyncrasies–especially under conditions of stress?