“Only the shallow know themselves.” — Oscar Wilde
Human instrumentation is booming. FitBit can track the number of steps you take a day, how many miles you’ve walked, calories burned, your minutes asleep, and the number of times you woke up during the night. BodyMedia’s armbands are similar, as is the Philips DirectLife device. You can track your running habits with RunKeeper, your weight with a WiFi Withings scale that will tweet to your friends, your moods on MoodJam, or what makes you happy on TrackYourHappiness. Get even more obsessive about your sleep with Zeo, or about your baby’s sleep (or other biological) habits with TrixieTracker. Track your web browsing, your electricity use, your spending, your driving, how much you discard or recycle, your movements and location, your pulse, your illness symptoms, what music you listen to, your meditations, your tweeting patterns. And, of course, publish it all — plus anything else you care to track manually (or on your smartphone) — on Daytum or mycrocosm or me-trics or elsewhere.
There are names for this craze or movement. Gary Wolf and Kevin Kelly call it the “quantified self” (see Wolf’s must-watch recent TED talk and Wired articles on the subject) and have begun an international organization to connect self-quantifiers. The trend is related to physiological computing, personal informatics, and life logging.
There are all sorts of legal implications to these developments. We have already incorporated sensors into the penal system (e.g., ankle bracelets & alcohol monitors in cars). How will sensors and self-tracking integrate into other legal domains and doctrines? Proving an alibi becomes easier if you’re real-time streaming your GPS-tracked location to your friends. Will we someday subpoena emotion or mood data, pulse, or other sensor-provided information to challenge claims and defenses about emotional state, intentions, mens rea? Will we evolve contexts in which there is an obligation to track personal information — to prove one’s parenting abilities, for example?
And what of privacy? It may not seem that an individual’s choice to use these technologies has privacy implications — so what if you decide to use FitBit to track your health and exercise? In a forthcoming piece titled “Unraveling Privacy: The Personal Prospectus and the Threat of a Full Disclosure Future,” however, I argue that self-tracking — particularly through electronic sensors — poses a threat to privacy for a somewhat unintuitive reason.
I do not worry that sensor data will be hacked (although it could be), nor that the firms creating such sensors or web-driven tracking systems will share it underhandedly (although they could), nor that their privacy policies are weak (although they probably are). Instead, I argue that these sensors and tracking systems are creating vast amounts of high-quality data about people that has previously been unavailable, and that we are already seeing ways in which sharing such data with others can be economically rewarding. For example, car insurance companies are now offering discounts if you install an electronic monitor in your car that tells the insurer your driving habits, and employers can use DirectLife devices to incentivize employees to participate in fitness programs (thereby reducing health insurance costs).
Such quantified, sensor-driven data become part of what I call the “Personal Prospectus.” The Personal Prospectus is a metaphor for the increasing array of verified personal information that we can share about ourselves electronically. Want to price my health insurance premium? Let me share with you my FitBit data. Want to price my car rental or car insurance? Let me share with you my regular car’s “black box” data to prove I am a safe driver. Want me to prove I will be a diligent, responsible employee? Let me share with you my real-time blood alcohol content, how carefully I manage my diabetes, or my lifelong productivity records.
All of this seems like merely (quirky) personal choice at first, particularly for those with “good” information who begin the trend by self-quantifying and then using that data to personal advantage (through discounts, etc.). But personal choice begets privacy issues if these information markets begin to unravel. Unraveling occurs because when a few people with “good” information can verifiably measure, track, and share information, everyone (even those with “bad” information) may ultimately find they have little choice but to follow suit. If all candidates for a job are willing to wear a blood alcohol monitor and you’re not, the negative inference drawn about you is obvious. If all the safe drivers quickly sign up for “discounts” that require electronic monitoring of their driving, those who refuse will quickly find themselves paying what amounts to a penalty. (For my recent post on unraveling as corporate strategy, see here.)
There are harms here beyond the pressure to consent. If you were somewhat horrified by the first paragraphs of this post — if you thought “why would anyone want to track so much data about themselves?” — the unraveling threat may particularly bother you. As Anand Giridharadas recently asked in a (short and worth watching) discussion of the quantified self movement, taken together these devices “imply an approach to life that may be something different than what we want life to be about … Because we have these things we’re just doing them, without thinking about whether we want to become the kind of people who do them.”
Your choice to quantify your self (for personal preference or profit) thus has deep implications if it necessitates my “choice” to quantify my self under the pressure of unraveling. What if I just wasn’t the sort of person who wanted to know all of this real-time data about myself, but we evolve an economy that requires such measurement? What if quantification is anathema to my aesthetic or psychological makeup; what if it conflicts with the internal architecture around which I have constructed my identity and way of knowing? Is “knowing thyself” at this level, and in this way (through these modalities), autonomy-enhancing or destroying, and for whom? What sorts of people — artists? academics? writers? — will be most denuded or excluded by such a metric-based world?
For anyone who has read Gary Shteyngart’s Super Sad True Love Story, it’s not hard to see a future in which obsessive measurement — of ourselves, others, everything — may leave some feeling reduced immeasurably by the hegemony of the measurable. Because of the unraveling effect, these reluctant late adopters may not have a choice; as many choose to quantify the self, all may have no real choice but to follow …
Scott Peppet is Associate Professor of Law at the University of Colorado Law School. His most recent scholarship focuses on informational privacy and structural changes to information architecture. This post originally appeared in the Concurring Opinions blog.
I say write your own individual story (a personal narrative, which is as unique as your personal genome) and make it part of your official medical record.
It will be very useful but very difficult to quantify, which makes me, for one, very pleased.
“…an assumption of a weak legal regime in which private entities are allowed to discriminate in ways that should not be permitted”
Assumption? Do you think the way health insurance is sold in this country is nondiscriminatory? You can legislate whatever you want; discrimination is alive and well everywhere, covertly, of course.
“Once we make negative discrimination based on information ….. illegal……”
What does this sentence even mean? What else would discrimination be based on? Divine intervention? Would positive discrimination be OK? What’s the difference?
And yes, Scott’s point about the shifting “Overton” window, as it were, for norms of information sharing is well taken, but I think that the danger he worries about results from an assumption of a weak legal regime in which private entities are allowed to discriminate in ways that should not be permitted.
We do need anti-discrimination and anti-underwriting laws in areas like employment and health insurance. We need them now, and perhaps the availability of new kinds of information will even highlight our need for such laws in the future.
Once we make negative discrimination based on information (or on unwillingness to provide it) illegal, we can encourage the positive uses (from self awareness to health promotion to health monitoring for intervention purposes). That’s a little simplistic, but I think it’s reasonable, and I don’t feel like writing a dissertation right now!
I love autohacking. First of all, I have a massively parallel wet computer between my ears and it’s always doing interesting things. I like to observe it, modify it, play with it. Then I have some excellent oxygen/carbon dioxide exchangers in my chest, a solid biofluids pump in reasonably good shape, and a whole range of other filtering and related life support systems whose performance I like to optimize over time. I’m a veritable biological spaceship or submarine, a self-contained evo-engineered bio-unit, assisted increasingly by an all-glass cockpit (aka my iPhone). And, yes, as a health information technology geek, I like to keep track of the data that the vehicle generates. It’s a hell of a lot more sophisticated than my car, and a lot more interesting and relevant too. I’m not into mods personally, but I do use some classics, such as footwear and corrective lenses. I love the spirit of your article and am surprised (well, not surprised but predictably annoyed) by the skepticism of Margalit and Dr. Angry. Hey guys, lighten up, Scott’s talking about the future here, maybe not for everyone, but for those who want it. It’s just another fun way to experience being alive. I find it hard to imagine that everyone will have to play the game.
It will only happen in the world of bored and stupid people.
Very nicely written!
The way I see it, this is a mixture of narcissism and egotism, which is paradoxically diminishing the value of an individual, which is now just the sum of the quantifiable measures of one’s self, and no more.
The sci-fi writers have it all wrong. We will not be creating a race of machines. We will be transforming ourselves into those machines. Just so many drones, all functioning within normal parameters, as defined by central command.
Or maybe this is just a fleeting infatuation of people too rich, too comfortable and with nothing of any significance to fill their pampered lives, much like the pet rocks were, while most humanity lives in squalor, hunger and cruelty.