by Ethics Newsline editor Carl Hausman
As Facebook readies itself to become a publicly traded corporation, it faces the prospect of unrelenting pressure to turn a quarterly profit. At the same time, it confronts close scrutiny from privacy groups and politicians over how it uses the massive troves of data that it collects from its 845 million users, as the New York Times reports this week.
Google, which already has gone public, is facing a backlash over its plans to combine data from its various services in order to wring more money out of user information. Essentially, Google wants to begin merging information from its search engine, YouTube views, and keywords identified in email to create more specific and profitable profiles. Attorneys general from a number of states have taken issue with this plan, and many privacy watchdog groups are crying foul.
From an ethical point of view, these types of privacy dilemmas are interesting because the issues created by advancing technology usually aren’t centered so much on the invasion of privacy per se but rather on the “repurposing” of information.
Most of us are willing, for example, to post that we “like” a particular product or service. But it has raised eyebrows — and some hackles — when that “like” is not only used to steer appropriate ads our way but repurposed into advertising itself, with our nod of approval and our photograph inserted into an ad circulated to our virtual “friends.”
Repurposed information has always made us a little uneasy, even before the widespread availability of computer databases. When I volunteered at a museum in the early 1980s, I was surprised to learn that the museum sold its member list to companies that were advertising products that, for whatever reason, were deemed to be attractive to museum members. (I learned this from an angry member who had deduced the source of some unwanted sales calls.) The museum also sold its list to other museums, apparently because a member of one is more likely to be sold on a membership to another.
The member’s complaint, I thought, was valid. Paraphrased: I joined a museum. I didn’t sign up to have my name and address sold to salespeople. I do, though, remember a verbatim quote from his harangue: “I want to deal with people I can trust.”
As my full-time job was, and has always been, journalism, I repurposed that complaint into articles, books, and presentations that became something of a cottage industry about information ethics. It’s a durable field because the advent of the personal computer, along with readily available programs to mix and match data, reliably generates controversy.
In the 1990s, when computers and database software went mainstream, I began to see how repurposed information could be put to unexpectedly sinister uses. For example, an enterprising database programmer discovered that he could easily collect public-domain data on anyone who had ever sued a landlord. And for a fee — paid by a landlord — he would search his database and see if a prospective tenant was on it. While some landlords might have appreciated the repurposed information, it should be noted that there was no mention of the disposition of the suits or whether they were justified.
A decade or so later, the techniques became a little more cringe-inducing. Databases were established that identified people who had filed malpractice suits; these databases were intended to be used by doctors to screen prospective patients. While this may have had minimal impact on people living in urban areas, it had the potential to create considerable hardship for, say, a woman in a rural area who needed an obstetrician.
Both practices eventually were declared illegal, at least in the states where the incidents achieved their first notoriety, but it took many years for the law to catch up to the technology.
The lag between abuse and judicial remedy can be long and even indefinite. We still don’t have definitive case law on conundrums such as:
- How your E-ZPass data can be used to monitor your movements
- The extent to which your driver’s license can be used as an instrument of social control (denying renewal of a driver’s license to someone who owes child support or library fines, for example)
- Whether and how your movements, as tracked by the GPS in your smartphone, can be used in advertising to attract you to the coffee shop a block away
Today, unease about privacy has reached a level unequalled in the three decades or so that I’ve been covering the issue. Regulation could be accelerated, and it could be heavy-handed, making it a double-edged sword. As much as I cherish my privacy, I also like the power of my Gmail account, the stream of news tailored to my personal surfing habits, the facility to compensate for having a terrible sense of direction by seeing my destination on Google Street View, and the ability to reconnect with 240 high school Facebook “friends,” which is probably about twice the number of physical friends I actually had back then.
Legislation may not even be required to tamp down the positive aspects of new media. Consumers may soon vote with their feet, fleeing a cyberspace Wild West where they fear ambush from beyond the next hill.
Why could regulation and public disapproval stanch our electronic lifelines? There are two reasons. One is obvious: People are worried about repurposed and recombined data stripping their privacy and being fused into a fissionable mass of deeply personal information that turns their private lives into public commodities.
But the other and perhaps larger reason is a question of ethics: Companies trading in personal information, particularly Facebook, have subscribed for too long to the old maxim that it’s easier to apologize than to ask permission. Virtually every week we see stories (many of them covered in Newsline) about major online firms that have been forced, usually under the threat of legislation, legal action, or consumer outrage, to back away from some subterranean overreach centered on user information.
So while I hope regulation doesn’t cut my electronic tentacles, and while I hope I don’t have to seriously consider cutting back, or ending altogether, my activities on various Google services and in social media, either is a possibility.
I just want to deal with people I can trust.