The Washington Times - Saturday, March 5, 2005

The revelation that the private data of hundreds of thousands of individuals were exposed in the ChoicePoint and Bank of America scandals has led to an outcry for new privacy regulation.

It’s easy to understand the frustration of the many people at risk of identity theft and other misuses of their data, and why they might be amenable to a new wave of privacy rules. But before rushing to new laws, it might be worthwhile to pause, take a deep breath and consider whether this will best serve our needs.

The first thing to consider about any new regulation is the cost. A 2001 study, for instance, estimated that new privacy regulations would add $9 billion to $36 billion a year in costs to the U.S. economy. For a specific case of the costly burden of privacy regulations, one need look no further than the Health Insurance Portability and Accountability Act (HIPAA). Its final privacy rule, effective in 2003 and of encyclopedic dimension, exhaustively details how hospitals and doctors’ offices must overhaul their practices to avoid the privacy police. Office workers must fret over how to hang charts on the walls, guard the fax machine from unwanted visitors and speak in hushed tones on the telephone.

Some health providers have had to remodel offices so that lingering patients cannot glance at unauthorized information. The rules are so misguided that if a patient mistakenly chooses a “do not announce” status, the hospital may be unable to let a family member know the patient was admitted.

What does the HIPAA regulatory leviathan cost? A much-criticized HHS study put the cost at $17.6 billion over 10 years. A study commissioned by the American Hospital Association found that just one portion of the regulations could cost hospitals alone $22.5 billion over five years. A BlueCross BlueShield-funded study put the five-year cost at $42.9 billion.

In California, known for aggressive privacy regulations, companies and institutions have felt the economic effect. For example, when hackers broke into San Diego State University servers, university officials had to notify 207,000 students, as California’s law requires, that data such as Social Security numbers may have been compromised, even though there was no evidence this had happened. The effort cost the university $200,000, in addition to all the negative publicity.

Even if we set aside the extraordinary costs of privacy regulation, we should ask ourselves whether we can even protect individuals’ personal data effectively. Evidence accumulating daily suggests such protection is becoming all but impossible.

The ChoicePoint case generated significant publicity, largely because of the sheer number of people affected and the fact that the company had been in the crosshairs of privacy advocates for months. ChoicePoint had even been the central antagonist in Robert O’Harrow’s book, “No Place to Hide.”

But this kind of security violation occurs regularly. We just don’t hear about it because companies outside California have little incentive to report such breaches.

Securing databases is becoming, in many ways, a futile exercise. Because the information economy demands it, our personal data are spread across hundreds if not thousands of databases, giving thieves a nearly unlimited number of targets. Evidence indicates just about any technology can be cracked by a sufficiently persistent criminal. In the last few weeks, we learned that both TiVo’s and Napster’s copy-protection technologies were hacked. Just ask Paris Hilton how little security databases offer. And criminals can avoid security protections altogether by using social engineering, posing as legitimate businesses to obtain data, as was done in the ChoicePoint case.

So if privacy regulations impose a burdensome cost on companies that probably can’t protect their data anyway, why would we want to go down a path that has negative consequences for the economy and little chance of success? I would suggest it’s because old habits die hard.

Perhaps the time has come to revisit how we think about personal data. This would call for a paradigm shift that says information is not the problem; how people use it is. Instead of focusing on the increasingly difficult task of trying to lock down data behind a dike that has more holes than Swiss cheese, we should focus on making sure data are used appropriately. For instance, if banks required applicants for credit to apply in person and use a biometric such as a digital picture or fingerprint, the data stolen from a ChoicePoint database would be useless to criminals. Combined with a secure driver’s license that organizations can trust, this would cause a major funding source for identity thieves to dry up quicker than a river in the Mojave Desert. This approach shifts the responsibility for verifying identity to companies that use personal data, rather than forcing consumers to clean up identity theft after the fact. Or, if the data could threaten one’s position in the workplace, as a criminal record could, Congress could follow one of HIPAA’s better provisions, which prevents discrimination based on certain types of information (in HIPAA’s case, medical information).

In an open society where information wants to be free, a more enlightened approach is not to plug the dike but let the river flow and make sure it isn’t channeled in the wrong direction. Now we just need to find out how to spend all the money that would be freed by fewer privacy regulations and smaller investments in security.

Dennis Bailey is the chief operating officer of the technology firm Comter Systems. He is the author of “The Open Society Paradox: Why the 21st Century Calls for More Openness Not Less,” recently published by Potomac Books.

