Author Archive

Openness as a Privacy Protection Strategy

Tuesday, October 26th, 2004

Daniel J. Weitzner has an editorial in this week’s Computerworld online called “Openness as a Privacy Protection Strategy.” At first it seems like a contradictory statement, and he references David Brin’s seminal work, The Transparent Society.

But just as Brin argues that the loss of privacy from surveillance isn’t Orwellian as long as _everybody_ has access to the surveillance, Weitzner seems to argue that customers needn’t necessarily fear companies collecting large amounts of information about them as long as this activity is “transparent.”

As Weitzner puts it:

Is the transparent enterprise destined to be the engine of the elimination of privacy? Has the analytic power and data-gathering reach of today’s information networks rendered privacy a disappearing artifact of simpler, less-networked times? I don’t believe so, but in order to retain the dignity, control and occasional solitude that are at the heart of privacy, we have to start designing systems differently.

First, we should embrace transparency as a design philosophy that can help people ensure that information about them isn’t used in a way that’s contrary to legally permissible purposes or in violation of agreements under which it was collected. Our design goal should be to provide active transparency to users. In many cases, people are comfortable about information collection, provided they know that it’s happening, understand the purpose of it and can check that it’s not being used inappropriately.

While I still think there is a very strong case to be made for actively working to minimize data collection, just as I believe there is a very strong case for actively working to minimize governmental surveillance, I think Weitzner’s point is valid. Transparency of data handling, i.e., making customers aware of what data is being collected about them and how it will be used, is a perfectly valid design goal. Software engineers need to be thinking about how this goal would affect their system designs.
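To make the design goal concrete, here is a minimal sketch of what “active transparency” might look like in code. Everything here is hypothetical (the class and method names are mine, not Weitzner’s): personal data is stored together with the purposes declared at collection time, every use is checked against those purposes, and every access attempt lands in an audit log the data subject could inspect.

```python
# Hypothetical sketch of "active transparency": every use of personal data
# is checked against the purposes declared at collection time, and every
# access attempt is logged so the data subject can review it later.

class TransparentStore:
    def __init__(self):
        self._records = {}   # subject_id -> (data, allowed_purposes)
        self.audit_log = []  # visible to the data subject

    def collect(self, subject_id, data, purposes):
        """Store data along with the purposes the subject consented to."""
        self._records[subject_id] = (data, frozenset(purposes))

    def use(self, subject_id, purpose):
        """Return data only for a declared purpose; log every attempt."""
        data, allowed = self._records[subject_id]
        permitted = purpose in allowed
        self.audit_log.append((subject_id, purpose, permitted))
        if not permitted:
            raise PermissionError(f"{purpose!r} not among declared purposes")
        return data

store = TransparentStore()
store.collect("cust-42", {"email": "a@example.com"}, ["order-fulfillment"])
store.use("cust-42", "order-fulfillment")   # permitted: purpose was declared
try:
    store.use("cust-42", "marketing")       # never declared, so refused
except PermissionError:
    pass
```

The point of the sketch is that transparency is enforced in the data path itself, not bolted on afterward: a department that tries to reuse data for an undeclared purpose is both blocked and recorded.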

EWeek Editorial on Data Governance

Friday, October 22nd, 2004

EWeek has an excellent editorial titled “The Governance Edge” in the current edition that draws the connection between controls in IT infrastructure and corporate ethics. As they put it:

“Without information management, there can be no corporate governance of any kind, good or bad.”

and later in the same editorial:

IT people will have to take part in the good governance of their own companies, through helping to implement Sarbanes-Oxley, the Patriot Act, SEC 17a-4 and HIPAA compliance solutions. The tools that vendors are offering IT managers to meet these compliance guidelines give IT managers the power to preserve data and audit it when need be. IT managers need to harness this technology to make good governance a day-to-day practice within their companies.

The problem we face as an industry is that we have to work governance issues into software engineering practices, and eventually good governance principles need to be “baked in” to the products and services that are offered to customers.

What does this have to do with privacy management? Everything. Privacy management, financial data controls (SOX), HIPAA (medical privacy), and COPPA (child protection) are all about placing controls on how and when data can be used, all of which fall under the umbrella term “data governance.”

Massive Data Breach at University of California, Berkeley

Wednesday, October 20th, 2004

SecurityFocus News is reporting that data for about 1.4 million Californians was put at risk due to a security breach at a computer system that contained data for California’s In-Home Supportive Services program.

It’s interesting to note that investigators are not sure whether or not the personal information was actually extracted from the system. But California’s recently passed anti-identity theft law, SB1386, requires that all 1.4 million people whose data was on that system be notified so that they can take appropriate measures to protect their identity by calling the credit reporting agencies, etc.

Imagine having to write a letter on your university letterhead to 1.4 million citizens of your state telling them that you were not protecting their information from theft and that an incident has occurred in which each citizen’s personal information, including social security number, has been downloaded by an unknown person.

Legal XML

Tuesday, October 12th, 2004

I just got back from IBM’s Security and Privacy Leadership conference and was thoroughly impressed at the depth of discussions. At events like this three years ago, we were talking about subjects like “is there really a difference between privacy and security?” Today, everyone is comparing notes on their Sarbanes-Oxley compliance efforts or sharing the pain of HIPAA compliance.

One of the keynote speakers mentioned in passing a project that should be on the radar screen of anyone developing privacy enhancing technologies. It’s a relatively new OASIS working group called “Legal XML.”

Their website describes the working group as follows:

LegalXML brings legal and technical experts together to create standards for the electronic exchange of legal data.

LegalXML is a member section within OASIS, the not-for-profit, global consortium that drives the development, convergence and adoption of e-business standards. Members themselves set the LegalXML agenda, using the open OASIS technical process expressly designed to promote industry consensus and unite disparate efforts. LegalXML produces standards for electronic court filing, court documents, legal citations, transcripts, criminal justice intelligence systems, and others.

OASIS members participating in LegalXML include lawyers, developers, application vendors, government agencies and members of academia.

I’ve run several workshops in which we’ve analyzed privacy legislation and expressed the requirements in XML so that they can be related to access controls and, believe me, it was tough. Legislators are all about principles and (frankly) ambiguity. All too often they want to express goals and leave the interpretation of how to achieve those goals to the courts. On the other hand, IT people need very precise, actionable items to follow. So bridging the gap between the legal world and the IT world is no small task.
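To illustrate the kind of bridge those workshops aimed at, here is a toy example of turning a legislative-style rule, expressed in XML, into an actionable access check. The XML schema here is entirely hypothetical (it is not LegalXML’s actual format); the point is only that once a rule is machine-readable, an access-control layer can evaluate it mechanically.

```python
# Toy illustration (schema entirely hypothetical, not LegalXML's) of mapping
# an XML-encoded privacy rule to a concrete access-control decision.
import xml.etree.ElementTree as ET

RULE_XML = """
<rule id="medical-use-limits">
  <dataCategory>medical-record</dataCategory>
  <allowedPurpose>treatment</allowedPurpose>
  <allowedPurpose>payment</allowedPurpose>
</rule>
"""

def load_rule(xml_text):
    """Parse one rule into the data category and its permitted purposes."""
    root = ET.fromstring(xml_text)
    return {
        "category": root.findtext("dataCategory"),
        "purposes": {e.text for e in root.findall("allowedPurpose")},
    }

def access_permitted(rule, data_category, purpose):
    """An access is allowed only for a purpose the rule enumerates."""
    return data_category == rule["category"] and purpose in rule["purposes"]

rule = load_rule(RULE_XML)
print(access_permitted(rule, "medical-record", "treatment"))  # True
print(access_permitted(rule, "medical-record", "marketing"))  # False
```

The hard part, of course, is everything this sketch elides: a real statute rarely enumerates purposes this cleanly, which is exactly why the legal-to-IT translation is so difficult.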

But because privacy management is rooted in social expectations, I personally believe work efforts like Legal XML are going to be an extremely important component of future privacy enhancing technologies.

Compliance Oriented Architecture

Monday, October 4th, 2004

Wow. Stephen O’Grady, from the analyst firm RedMonk, is on the Board of Advisors for The Privacy Place and yet is humble enough not to have mentioned his recent paper “SOA Meets Compliance: Compliance Oriented Architecture.” But I happened to stumble across it as I was doing Google searches on compliance technology.

The opening teaser in O’Grady’s paper states:

Leveraging IT to enhance business processes with transactional transparency is a necessary response to corporate governance scandals. Building the “real time enterprise” is fast becoming the preferred method for reducing fraud, and, in more and more cases, it is a mandated one.

I believe the key phrase here is “transactional transparency.” In one deft phrase, O’Grady has captured the industry’s trend of melding together IT (“transactional”) and business requirements (“transparency,” as in auditable activity). He goes on to build a case for building transactional transparency into an IT environment using service-oriented architecture, yielding what he calls “Compliance Oriented Architecture.”

While O’Grady focuses on legal compliance issues such as Sarbanes-Oxley, it’s clear that a compliance oriented architecture is also key to privacy management. This paper is a must-read for anyone who cares about privacy enhancing technologies for the enterprise.

–Calvin Powers

60 Minutes, Data Provenance, and Meta Data Technology

Friday, September 17th, 2004

I’ve been fascinated by the scandal over the past week regarding a story on 60 Minutes in which Dan Rather showed some supposedly newly discovered documents that were damaging to Bush’s service record while he served in the National Guard. Over the past few days there has been a growing body of evidence to suggest that the documents used in the 60 Minutes story were forged. While it hasn’t been proven 100% to anyone’s satisfaction, enough questions have been raised about the authenticity of these documents that CBS is being widely criticized for basing its news story on them.

There are aspects of this emerging story that are relevant to all of us who care about privacy and technology, regardless of how we feel about Dan Rather and CBS and regardless of how we feel about the upcoming elections.

It’s amazing to see how quickly the facts surrounding these documents were investigated and how quickly serious questions about these documents were raised by web loggers and other interested people on the Internet. For a good summary of the chronology of events immediately following the story see

A Taxonomy for Customer Loyalty Cards

Friday, September 10th, 2004

If you’re like me, you carry around a pack of “customer loyalty cards” everywhere you go. I have one for my favorite Chinese restaurant (buy 7 buffet dinners, get the 8th for free), a used CD store (buy 15 DVDs, get the 16th free), and no fewer than three customer loyalty cards for coffee shops (Border’s Cafe, Weaver Street Market, and Cup-A-Joe).

Most of these loyalty cards are simple and benign. They are just business card sized pieces of paper which the sales clerk at the store marks when you make a purchase. Because they have no personal information associated with them, I like to think of them as anonymous customer loyalty cards. Their simplicity is a tremendous advantage to stores. For a fairly low cost, the issuers of these cards can build repeat business.

I can’t help but note that their simplicity is accompanied by a complete and utter lack of security. The only thing that keeps these loyalty cards from being subject to spoofing attacks (i.e., customers falsely marking purchases on their own cards) is the honor system, as far as I can see. But that’s a subject for a different blog.

There is another class of consumer loyalty cards, which I call profile cards, that aren’t nearly so benign. Most commonly these are found in grocery stores. A customer fills out a customer profile survey and is issued a card with a unique identifier. The Harris-Teeter where I shop calls them “Very Important Customer” cards, and I’m proud to report that I am Very Important Customer #4-098911769-1. Every time I buy something at my local Harris-Teeter, I flash my VIC card and get discounts on various items. You can bet your bottom dollar that Harris-Teeter keeps track of all purchases I make with my VIC card and uses that information for various marketing and business analytics purposes.

On the whole, is this such a bad thing? If Harris-Teeter knows that I am going to buy four 2-liter bottles of Diet Dr. Pepper every Friday afternoon on my way home from work, maybe they’ll make sure it’s on the shelf when I get there. And if they give me 10 cents off the price just for associating the purchase with my name, why not?

The problem is that profile customer loyalty cards are getting stores into trouble in multiple ways. First, profile information such as your personal buying habits is increasingly becoming subject to subpoena requests in various criminal, and in some cases even civil, court cases. If John Doe bought a 12-pack of Rolling Rock beer 90 minutes before he was involved in a late-night car crash, it could potentially be used as evidence to support charges of drunk driving.

As far as I can tell, the situation hasn’t gotten that bad yet. So far stores have been reasonably successful in refusing to divulge such information. But in an era where government officials can demand that libraries hand over your book checkout records (thank you, PATRIOT Act), one can only imagine that the pressure to hand over customer profile buying habits will increase, not decrease.

In addition to being criticized for handing over customer data in legal proceedings, these stores are also feeling pressure to use the information they collect in ways that they don’t want to. Imagine this scenario: a batch of ground beef is found to be tainted, and the FDA issues a recall on the meat. The grocery store pulls the meat off the shelf, but some of it has already been sold. Does the grocery store have an obligation to search through its customer profiles, find the people who bought the meat, and notify them? This scenario is the subject of at least one class-action suit from customers.

So these profile customer loyalty cards, as valuable as they are to the stores that issue them, can have significant negative consequences. At a minimum, the stores that issue them can find themselves keeping several lawyers busy in court fending off lawsuits relating to how they use or don’t use the data, not to mention the uneasiness that an increasing number of shoppers feel about the potential privacy invasion. Some stores have abandoned their customer loyalty programs for these reasons.

Dropping the customer loyalty program altogether is not the only option, however. Customer loyalty card programs can be saved, and can continue to be valuable to both customers and the stores that issue them, without raising the sticky privacy issues that currently plague profile-based customer loyalty cards.

I think the problem stems from the fact that our society as a whole is infatuated with computers and databases, and in the IT industry, customer relationship management is one of the hottest areas. In my opinion, the problem is that companies take a default position of collecting all the information they can about their customers without thinking of the legal obligations or troubles it might cause. We have an “all or nothing” approach to customer loyalty cards.

But there’s a third way. It’s entirely possible for companies to get most of the benefits from their customer loyalty cards while incurring none of the legal headaches. All they have to do is stop associating their customer loyalty cards with individuals.

The problem with my VIC card is not that it has a unique number on it; the problem is that it has my name and address associated with it. So imagine a third kind of customer loyalty card, let’s call it the “deidentified customer loyalty card.” Suppose my Harris-Teeter simply gave away cards with unique numbers on them but did not collect the names and addresses of the people who held the cards. This would still allow Harris-Teeter to build some fairly sophisticated profiles about my buying habits and would enable them to market to me with instant coupons issued at the cash register, etc. It would still allow them to sell trend data to consumer goods manufacturers, which they could analyze for business trends. But gone would be those pesky subpoenas to hand over data about a person’s buying habits, because they simply would not have the information. Likewise, the “failure to notify” lawsuits would go away for the same reason. And finally, Harris-Teeter could truly say that they are protecting the privacy of their customers because all of the information is deidentified from the start.

To muddy the waters a bit, Harris-Teeter could collect some general demographic data while still keeping the information deidentified. I’m not likely to care if Harris-Teeter knows that card number #4-098911769-1 is associated with a white male living in zip code 27517 and in the 30-40 age group. And such information would not be targetable by lawyers.
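The record layout for such a card is simple enough to sketch. The field names below are my own invention for illustration: a unique card number plus coarse demographics, but no name, address, or anything else that ties the purchase history back to an individual.

```python
# Hypothetical sketch of a deidentified loyalty-card record: a unique card
# number plus generalized demographics, but no name or address, so purchase
# history cannot be tied back to an individual.
from dataclasses import dataclass, field

@dataclass
class DeidentifiedCard:
    card_number: str                    # unique, but not linked to a person
    zip_code: str                       # coarse location only
    age_band: str                       # e.g. "30-40", never a birth date
    purchases: list = field(default_factory=list)

    def record_purchase(self, item):
        """Log a purchase against the card, not against a person."""
        self.purchases.append(item)

card = DeidentifiedCard("4-098911769-1", "27517", "30-40")
card.record_purchase("2L Diet Dr. Pepper")
# Trend analysis over card numbers still works; a subpoena for "John Doe's
# purchases" does not, because no field identifies John Doe.
```

The business analytics run over card numbers and demographic bands, so the store keeps most of the marketing value while holding nothing a subpoena could usefully target.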

To summarize: anonymous customer loyalty cards are great for generating repeat business but can’t be used for business analytics because there is literally no data associated with them. Profile-based customer loyalty cards give value to the customer and help generate repeat business while also generating a wealth of customer profile information that can be analyzed and used for various business purposes. However, profile-based cards are a minefield of potential lawsuits and consumer fears. Deidentified customer loyalty cards could still be valuable to customers, generate repeat business, and produce profile information almost as valuable as that of profile-based cards, while removing the many thorny legal issues and reassuring customers about their privacy.

In this case, the innovation is not a new technology. Sometimes innovation comes from giving careful thought to what you are doing. Sometimes innovation comes from collecting less data, not more.

–Calvin Powers

Security Hacker Tips from “I, Robot”

Tuesday, August 24th, 2004

I finally got a chance to see I, Robot last week. Frankly, I had many qualms about going to see this film. For one thing, every movie with Will Smith seems to end up being a Will Smith love-fest rather than, you know, an actual story. My main objection, at least before seeing the movie, was that they didn’t use Harlan Ellison’s screenplay. But I’m a sucker for a summer sci-fi action flick, so I decided to go anyway. Little did I know I’d be professionally insulted as well as underwhelmed.

I’ll try not to spoil the story too much, for those of you who might want to go see the show. A key location in the movie is the world headquarters of “U.S. Robotics,” the maker of commercial domestic robots. The entire building is secured by an artificial intelligence system called Vicki. (I forget the cute acronym.) They make a Big Deal out of the fact that Vicki is constantly monitoring everything in the U.S. Robotics building for security. OK, fine. I can suspend my disbelief in AI for the sake of a movie. The sentient artificial intelligence is a time-honored trope in science fiction.

But as things start to get tense in the movie, Our Hero, Detective Del Spooner (Will Smith), and his geek robot psychologist, Dr. Susan Calvin (Bridget Moynahan), find themselves desperately needing to get into the U.S. Robotics building. Unfortunately, it’s surrounded by thousands of NS5 robots who have inexplicably turned into bad guys.

As Del and Susan hide just outside the reach of the Evil Robots, they ponder how the heck they are going to get into the building, past Vicki’s perimeter security. Finally, Dr. Calvin comes up with a brilliant plan. I don’t remember the exact dialogue, but it was something like, “I know, we’ll sneak in through the service tunnel. It’s not monitored by Vicki because it’s only used for service!!” And that’s what they do: they pry open a conveniently located manhole, hop into the service tunnel, and sneak into the building. So there ya go, a security hacker lesson from I, Robot.

Um, OK. In other words, the screenwriter wrote himself into a corner he couldn’t get out of, so he wrote a plot hole that violates the most fundamental tenet of security: YOU PLAN YOUR SECURITY ON WHAT COULD HAPPEN, NOT WHAT USUALLY HAPPENS.

Any security consultant who would suggest that a service tunnel doesn’t need to be monitored because it isn’t usually used by humans to get into the building would be laughed out of business. It’s like a police officer saying that you don’t need locks on your doors because most of the time burglars don’t try to walk in your front door. It’s like telling a company they don’t need firewalls protecting their intranet because most people interact with the company’s web site.

So we can all get a chuckle at the screenwriter’s sloppy plotting and feel smug about the mature computer security industry. Even the most technologically phobic executive understands the basic needs of physical and network security in their company’s environment. We’ve got a rich industry of firewalls, authentication systems, authorization systems, intrusion detection systems, etc.

It occurs to me that the IT industry hasn’t yet adopted the same rigor in its thinking about privacy management. Ask any IT professional in a company about the tools they use to protect the privacy of the personal information they are entrusted with, and they’re likely to mumble something about having a privacy notice on their web site. Maybe they’ll talk about using SSL when transferring data from a browser to a server. And the really forward-thinking folks may be able to articulate a strategy for encrypting personal information when it’s stored.

All of these things are good, and I’d not speak against any of them. But do they really protect the privacy of customers? How do the stewards of personal information know that they aren’t using data in ways that directly violate the promises they make to their customers? As the good folks at Hooked On Phonics found out the hard way, the FTC is starting to crack down on companies that violate the privacy promises they make.

It’s most likely that the folks at Hooked On Phonics were not deliberately being malicious. It was just a case where one department in a company used sensitive personal information without any prior knowledge of the promises made by other departments. All too often, the only preventative measure companies have in place is to circulate a memo reminding people of the company privacy policy. In other words, the typical privacy management strategy in a company is based on what usually happens, not what could happen, which is just as big a hole in their IT infrastructure plans as the plot holes in I, Robot.

- Calvin Powers

Finally Someone Remembers The Privacy Act

Friday, June 25th, 2004

The Chicago Sun-Times reported in an article on June 24th, 2004, on the airline industry’s disclosure of passenger information records to the Transportation Security Administration and its contractors.

This is hardly a new story. It originally broke in September 2003 when JetBlue admitted it had handed over the information of over 5 million passengers, in direct violation of its stated privacy policy. Since then, more and more airlines have sheepishly admitted to having done the same. According to the Sun-Times article, four major airlines and two major reservation systems have admitted to doing so.

The thing that’s new about this story has to do with the claim that this activity is in direct violation of Federal law.

Up to this point, discussion about this fiasco has been limited to the fact that these disclosures are in violation of the airlines’ publicly stated privacy policies. As a result, the FTC has been investigating these acts as “deceptive trade practices,” based on complaints from the Electronic Privacy Information Center and others.

Check out the full complaint at:
http://www.epic.org/privacy/airtravel/jetblue/ftccomplaint.html

The Sun-Times article quotes Senator Joe Lieberman of Connecticut, the top Democrat on the Senate Governmental Affairs Committee, as saying that the Transportation Security Administration “may have violated” the “Privacy Act.”

The Sun-Times article is not specific about which “Privacy Act” Senator Lieberman was referring to. But I believe it’s a safe bet that he was referring to The Privacy Act of 1974 (http://www.usdoj.gov/foia/privstat.htm).

Senator Lieberman specifically mentions a failure to notify the data subjects that their information had been collected. But in my opinion, the more interesting requirement in the Privacy Act of 1974 is that agencies that collect information about individuals must:

“publish in the Federal Register upon establishment or revision a notice of the existence and character of the system of records, which notice shall include–

(A) the name and location of the system;
(B) the categories of individuals on whom records are maintained in the system;
(C) the categories of records maintained in the system;
(D) each routine use of the records contained in the system, including the categories of users and the purpose of such use;
(E) the policies and practices of the agency regarding storage, retrievability, access controls, retention, and disposal of the records;
(F) the title and business address of the agency official who is responsible for the system of records;
(G) the agency procedures whereby an individual can be notified at his request if the system of records contains a record pertaining to him;
(H) the agency procedures whereby an individual can be notified at his request how he can gain access to any record pertaining to him contained in the system of records, and how he can contest its content; and
(I) the categories of sources of records in the system;”

One can’t help but wonder if the TSA, and by extension, the Homeland Security Department, has even considered these requirements, much less complied with them. The Sun-Times article reports that an official from the Homeland Security Department said that the agency is investigating.

I think this will be an interesting test of the Homeland Security Department’s commitment to privacy, and I look forward to seeing how it responds to Lieberman’s challenge.