The John Hancock company has announced a program offering discounts on life insurance to customers with good health habits, as registered on their Fitbit monitors — wearable computers that automatically upload data on their physical state. The most physically active customers may earn as much as 15 percent off their premiums, according to a New York Times report.
Good corporate marketing, this. Treating two groups of people differently, psychologists tell us, is much more acceptable if it’s presented as offering something extra to one group rather than as penalizing the other group — even when the two policies are identical in practice.
And if having your body and your activities monitored directly by your insurance company sounds a little intrusive, or even creepy, don’t worry, because you have a choice in the matter. “You do not have to send us any data you are not comfortable with,” a company spokesman reassuringly told the Times. This sort of choice deserves a hard look. Over time, decisions to share personal information may not be reversible or easy to deny.
Think of your tax returns. These documents detail enormous amounts of revealing personal data. Compiled under government compulsion of accuracy, tax returns originally were to be shared only with the IRS. But just try seeking a college scholarship for your child, a mortgage or a job without supplying copies of your returns. Given the legal necessity of filing with the IRS, there's no denying the existence of this record; the very fact that authoritative personal data is known to exist generates pressure to disclose it.
Similar pressures are evident in countless other areas of life. There's a push to computerize all records of Americans' medical encounters and centralize them in a single database — every visit to a doctor, every note he or she takes, every diagnosis, every treatment. The stated aim is to make such data available wherever and whenever someone makes contact with the system, a potentially life-saving as well as cost-saving innovation, no doubt. But the effect will be to make it impossible to keep any medical information off one's record, and at least difficult to keep it from any party with a "legitimate" right to know. That category could come to include opposing counsel in divorce cases, employers (of pilots, for example) or the Department of Homeland Security. The choice to refuse such access may become about as meaningful as a choice over sharing one's tax returns.
Now consider other new targets for comprehensive monitoring. Drivers of “smart” vehicles are generating detailed data on their driving strikingly parallel to that collected by Fitbit on the human body, also for use by insurance companies. Law-enforcement agencies will undoubtedly develop an interest in these same data.
Computerized lessons are generating precise records of what pupils do and don't understand, how they learn and what they are interested in — potentially yielding insights concerning their aspirations and fears, their political inclinations and their susceptibilities as consumers. Cellphone users — nearly all of us — are creating a database detailing where we've been and with whom we are communicating.
You don’t need to be a conspiracy theorist to foresee a Faustian bargain — consent to a totally monitored world — emerging from these trends. Our greatest concern should not be unauthorized access to our data, but access by interests rightfully entitled to exploit any data known to exist.
The best hope to forestall such a bargain is to avoid recording data in the first place or to delete it quickly once the original purpose is served. One can also imagine an absolute prohibition against disclosing personal data outside the immediate context of its collection — the IRS, the local school, the cellphone- or internet-service provider or your health-care provider.
But measures like these would trigger howls of protest from interested parties, including the courts, employers and the national-security establishment — interests accustomed to getting what they want. And their strongest argument would be that the data they seek would benefit those with “good” records. That is, those with nothing to hide.
But who has nothing to hide? Next time an organization offers to monitor your personal data on behalf of some seemingly unexceptionable purpose, think twice. In the long run, it could prove to be an offer you can’t refuse.
James B. Rule is a researcher at the Center for the Study of Law and Society at UC Berkeley.