The ECPA and Personal Health Record Systems

December 11th, 2008

Yesterday, William Yasnoff discussed whether the Electronic Communications Privacy Act (ECPA) provides federal privacy protection for Personal Health Record (PHR) systems. Here at The Privacy Place, we have previously focused on whether the Health Insurance Portability and Accountability Act (HIPAA) applies to PHRs (short answer: no), but today I would like to take a moment to talk about the ECPA. If you are interested in our coverage of HIPAA and PHRs, I would point you to our post on Microsoft’s HealthVault and our post on Google’s Google Health project.

Let’s start with some background on the ECPA. The ECPA was passed in 1986 as an amendment to the Wiretap Act of 1968 and primarily deals with electronic surveillance. The purpose of the Wiretap Act was to make it illegal for any person to intercept wire and oral communications, such as telephone calls. The first title of the ECPA extends the original Wiretap Act to prevent the interception of electronic communications. The second title of the ECPA (commonly called the Stored Communications Act) adds protection for stored communications and prevents people from intentionally accessing stored electronic communications without authorization. The ECPA has been amended three times since it was passed. First, it was amended by the Communications Assistance for Law Enforcement Act (CALEA) in 1994. Second, it was amended by the USA PATRIOT Act in 2001. Third, it was amended by the USA PATRIOT Act reauthorization acts in 2006.

Now, Yasnoff makes several claims in his post, which I will discuss in order.  First, he claims that there are no exceptions in the ECPA and that this means whichever organization holds your information must get your permission to release it.  This is categorically not true.  There are many exceptions in the ECPA, but for the sake of simplicity, I will limit this discussion to the two main exceptions of the original Wiretap Act, both of which were retained by the ECPA.

The first exception allows interception when one of the parties has given prior consent.  This could mean that the government can legally access your communications if your PHR service provider consents prior to the communication.  Thus, Yasnoff’s strong statement that PHRs “MUST GET YOUR PERMISSION” (emphasis from original statement) is simply incorrect.

The second exception allows interceptions if they are done in the ordinary course of business.  This could mean that your data would be accessible by third parties such as an information technology vendor that maintains the software.  Effectively, this is a somewhat broader exception than the exception found in HIPAA for Treatment, Payment, and Operations, which Yasnoff found to be wholly unacceptable for protecting patient privacy.

Second, Yasnoff claims that the ECPA “is not long or complicated – I urge you to read it yourself if you have any doubts.” This statement, too, is untrue. Paul Ohm, who was previously an attorney for the Department of Justice and is currently an Associate Professor of Law at the University of Colorado Law School, has publicly challenged tax law experts with the claim that the ECPA is more complicated than the U.S. Tax Code.

Bruce Boyden, an Assistant Professor of Law at the Marquette University Law School, wrote a chapter in Proskauer on Privacy discussing electronic communications and the ECPA. In it he details many of the nuanced aspects of the ECPA, including the three subsequent amendments to the ECPA. With regard to the first title (Interception) he says:

To “intercept” a communication means, under the act, “the aural or other acquisition of the contents of any wire, electronic, or oral communications through the use of any electronic, mechanical, or other device.” The application of this definition to electronic communications has at times been particularly difficult, and courts have struggled with a number of questions: What exactly qualifies as the acquisition of the contents of a communication, and how is it different from obtaining a communication while in electronic storage under the Stored Communications Act? Does using deception to pose as someone else constitute an interception? Does using a person’s own device to see messages intended for them qualify?

Boyden later talks about limitations to the second title (Stored Communications):

[T]here are two key limitations in section 2701 [of the ECPA].  First, it does not apply to access of any stored communication, but only those communications stored on an electronic communications service facility as defined under the act.  Second, the definition of “electronic storage” in the act does not encompass all stored communications, but only those in “temporary, intermediate storage” by the electronic communication service or those stored for backup protection.

These seem like rather important exceptions which continue to refute Yasnoff’s claim that there are no exceptions in the ECPA, but to his second point, this seems pretty complicated.  At least, it certainly doesn’t seem as simple as just finding some information that has been communicated to and stored by a PHR service provider, which was Yasnoff’s implication.

Boyden has also discussed whether automated computer access to communications is a violation of the ECPA.  The discussion is more complicated than it may appear at first and there’s an interesting discussion of it over on Concurring Opinions.

Broadly, several organizations feel that current US privacy law, including the ECPA, is discombobulated. The Electronic Frontier Foundation believes that fixing the ECPA is one of the top five priorities in their privacy agenda for the new administration. The Center for Democracy and Technology would like to see the new administration pass consumer privacy legislation and a “comprehensive privacy and security framework for electronic personal health information.” The ACLU would like to see the new administration “harmonize privacy rules.” I submit that these organizations do not feel that the ECPA provides clear and adequate privacy protections for PHR systems.

Yasnoff’s third claim is that PHRs which are “publicly available” receive stronger protections under the ECPA than those that are “private.”  In fact, Yasnoff says:

Only those that are “publicly-available” are included. While this clearly would apply to generally available web-based PHRs, systems provided only to specific individuals by employers, insurers, and even healthcare providers are less likely to be considered “publicly-available.” Therefore, ECPA protection is limited. So you are only covered if you use a PHR that is available to anyone.

This statement is either completely backwards as it relates to the ECPA or, perhaps more likely, describes something that is not a factor for ECPA protection at all. The EFF’s Internet Law Treatise has an article describing the differences between public communications and private communications:

“[T]he legislative history of the ECPA suggests that Congress wanted to protect electronic communications that are configured to be private, such as email and private electronic bulletin boards,” as opposed to publicly-accessible communications. See Konop, 302 F.3d at 875, citing S. Rep. No. 99-541, at 35-36, reprinted in 1986 U.S.C.C.A.N. 3555, 3599.

Thus, the public accessibility of the PHR service is not important. The pressing concern is whether the communication itself was meant to be public or private. If it was public, then the ECPA simply doesn’t apply. If it was private, then whatever protections the ECPA does afford would apply.

By now it must be clear that I disagree with William Yasnoff’s assessment of the ECPA’s application to PHRs.  I did, however, want to point out one interesting privacy protection that the ECPA offers which HIPAA does not: a private right of action. 

Basically, a private right of action allows citizens to file civil lawsuits in an attempt to recover losses caused by violations of a law. The ECPA has a private right of action clause, while HIPAA does not. HIPAA’s lack of a private right of action has drawn some criticism. On the other hand, the ECPA’s private right of action has also been criticized as unnecessary and wasteful. Perhaps it is a stretch, but this was the only possible improvement in privacy protection that I was able to find to support Yasnoff’s argument regarding the use of the ECPA to provide privacy protections for PHRs.

I would like to conclude by saying as directly as possible that the ECPA does NOT provide clear or adequate privacy protection for personal health information given to PHR systems. Privacy in general and healthcare privacy in particular are hotly debated current concerns for many organizations. I believe it is likely that the Obama administration and the next session of Congress will attempt to address the privacy concerns raised by organizations like the EFF, the CDT, and the ACLU. In the meantime, however, do not use a PHR service under the assumption that the ECPA protects the privacy of your medical records.

Camera phones and our privacy

October 4th, 2008

By Jessica Young and Aaron Massey

This season’s premiere of Grey’s Anatomy showed interns using camera phones to take pictures of their resident’s injury. The episode aired only days after news broke of an incident at the University of New Mexico Hospital, where two employees had used their cell phones to take pictures of patients and then posted those pictures online. Both employees were fired for violating the hospital’s policy.

The University of New Mexico Hospital is not the first hospital to experience problems with cell phone cameras. In March 2008, Resnick Neuropsychiatric Hospital at UCLA banned cell phones to protect the rights of its patients after past incidents in the hospital. San Diego’s Rady Children’s Hospital banned cell phones in patient areas after pictures of children were found on an employee’s phone and computer. Other hospitals have also experienced problems with employees using camera phones in ways that violate patient privacy. Although policies are in place, enforcement is difficult.

Privacy law in the United States is historically tied to innovations in cameras. Warren and Brandeis wrote their famous article, “The Right to Privacy,” in response to the invention of portable “instantaneous photography.” Those fears have been reborn now that most people carry cell phones with them at all times, and a majority of these phones have cameras in them.

Newer phones are capable of easily sharing pictures and videos with others – regardless of location. As a result, candid pictures can be taken at unexpected times and in someone’s worst moments. For example, a customer at a grocery store recently had an embarrassing picture taken in a moment of anger after the store couldn’t process his credit card. Within moments, the picture was online and generating comments. In the article linked above, Harmon discusses the use of the candid camera phone:

“In recent weeks the devices have been banned from some federal buildings, Hollywood movie screenings, health club locker rooms and corporate offices. But the more potent threat posed by the phonecams, privacy experts say, may not be in the settings where people are already protective of their privacy but in those where they have never thought to care.”

The recent incidents with cell phone cameras at hospitals are troubling examples of why people should be concerned about privacy in places they previously “never thought to care.” Hopefully people will become more aware of cell phone use and capabilities as they relate to individuals’ privacy—not just in hospitals but everywhere.

2008 Privacy Values Survey Completed

September 29th, 2008

Our 2008 Privacy Values Survey ended this morning at 12:01 am on September 29, 2008. Thank you to the more than 2,000 survey respondents over the course of the survey.

Thank you for your interest! Please check back in a few months to see the survey results.

Previous survey results can be found in the following publications:

J.B. Earp, A.I. Antón, L. Aiman-Smith, and W.H. Stufflebeam, “Examining Internet privacy policies within the context of user privacy values,” IEEE Transactions on Engineering Management, vol. 52, no. 2, pp. 227-237, May 2005.

Carlos Jensen, Colin Potts, and Christian Jensen, “Privacy practices of internet users: Self-reports versus observed behavior,” International Journal of Human-Computer Studies, vol. 63, no. 1-2, pp. 203-227, 2005.

M.W. Vail, J.B. Earp, and A.I. Antón, “An Empirical Study of Consumer Perceptions and Comprehension of Web Site Privacy Policies,” IEEE Transactions on Engineering Management, vol. 55, no. 3, pp. 442-454, Aug. 2008.

ThePrivacyPlace.Org Privacy Survey

September 23rd, 2008

ThePrivacyPlace.Org Privacy Survey is Underway!

Researchers at ThePrivacyPlace.Org are conducting an online survey about privacy policies and user values. The survey is supported by an NSF ITR grant (National Science Foundation Information Technology Research) and was first offered in 2002. We are offering the survey again in 2008 to reveal how user values have changed over the intervening years. The survey results will help organizations ensure their website privacy practices are aligned with current consumer values.

We need to attract several thousand respondents, and would be most appreciative if you would consider helping us get the word out about the survey, which takes about 5 to 10 minutes to complete. The results will be made available via our project website (http://www.theprivacyplace.org/).
Prizes include $100 Amazon.com gift certificates sponsored by Intel Co. and gifts from IBM and Blue Cross and Blue Shield of North Carolina.
On behalf of the research staff at ThePrivacyPlace.Org, thank you!

More at Stake Than Just Your Password

September 23rd, 2008

By Jeremy Maxwell and Dr. Annie I. Antón

Hackers recently broke into Governor Palin’s personal Yahoo email account and, subsequently, several of her personal emails and family photos were posted on the internet [See: BBC Article].
This recent case reminds us that we must be careful with the information we divulge online, as well as the information that is requested of us online. Consider that the hacker responsible was able to guess Governor Palin’s answers to the security questions that Yahoo used by doing some simple Internet searching [See: PCWorld].

This attack could be considered a social engineering attack [See: Social Engineering Fundamentals]. Social engineering attacks are not technical attacks; instead, they aim to trick the victim into divulging personal information. Phishing and trojan horses are also examples of social engineering attacks. The Governor Palin attack, however, is similar to the attack described by Herbert Thompson, in which an attacker gains access to user accounts simply by using information available on the internet, usually through a password-reset service that asks personal questions to validate the identity of the user. If this private information is well known, then anyone could impersonate the victim. Sources of such information include public records such as driving or court records, blogs, social networking websites, personal websites, and so on. The lesson here is to avoid posting private information in a public setting. Most people would not post their Social Security number or the password to their email account on their blog, but the information they do post might be enough.
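
The gap between a strong password and a guessable security answer can be made concrete with a little arithmetic. The Python sketch below compares the search space of some common recovery questions against a random password; the pool sizes are rough, assumed figures for illustration, not measurements:

```python
# Sketch: why security-question "authentication" is weak.
# The answer-pool sizes below are illustrative assumptions.
import math

# A random 10-character password drawn from 62 symbols (a-z, A-Z, 0-9):
password_entropy_bits = 10 * math.log2(62)   # roughly 59.5 bits

# Pools an attacker could plausibly enumerate from public records
# or a victim's blog and social profiles (assumed figures):
guessable_answers = {
    "birth date": 365 * 80,            # any day in an 80-year span
    "high school attended": 25_000,    # rough count of US high schools
    "first pet's name": 1_000,         # common names cover most cases
}

for question, pool in guessable_answers.items():
    print(f"{question}: ~{math.log2(pool):.1f} bits")
print(f"random 10-char password: ~{password_entropy_bits:.1f} bits")
```

Even the largest of these pools is searchable in seconds, and simple web research often narrows it to a handful of candidates, which is essentially what happened in the Palin case.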

So before you post the name of your first pet on Facebook or MySpace or on your blog, think about whether it can be used to fraudulently impersonate you at a later date.

[Update: Fixed minor grammar error]

Readability of Internet Privacy Policies

September 5th, 2008

By Dr. Annie I. Antón and Gurleen Kaur

Erik Sherman’s September 4, 2008, BNET blog post, Privacy Policies are Great — for PhDs, analyzes the readability of common Internet privacy policies, including those of Google, Microsoft, and Yahoo. His study supports the findings published by ThePrivacyPlace.Org researchers in IEEE Security & Privacy. Our studies showed that privacy policies are inaccessible to the very end-users they are intended to inform.

Our first study, published in 2004, analyzed 40 online privacy policy documents from nine financial institutions to examine their clarity and readability.  Our findings revealed that compliance with existing legislation was, at best, questionable.

Our second study, published in 2007, analyzed 24 healthcare privacy policy documents from nine healthcare Web sites both pre- and post-HIPAA (Health Insurance Portability and Accountability Act).  Our findings revealed that HIPAA’s introduction has led to more descriptive privacy policies, but many remain difficult to read.

Last month, ThePrivacyPlace published an empirical study in IEEE Transactions on Engineering Management that reveals that users perceive traditional, paragraph-form policies to be more secure than other policy representations, but that user comprehension of paragraph-form policies is poor in comparison to other policy representations.

Google’s New Browser: Chrome

September 2nd, 2008

Google recently announced their new open source browser, called Chrome, via a comic book. Although slated for release sometime today, the link mentioned in the comic book (http://www.google.com/chrome) was initially down but is now up! The 38-page comic book is surprisingly informative, mildly entertaining, and certainly a unique way to release a new product, but don’t let the playfulness of the announcement fool you. Chrome has many important features, including a privacy-enhancing feature called “Incognito.”

Incognito is a user-visible feature that enables a private browsing mode. Private browsing is a relatively simple concept with tangible benefits to privacy. Under normal operation, a browser will store information about a user’s browsing history. Stored information could include sites visited, data downloaded, searches conducted, or even personal information entered. Under private browsing mode, that same browser simply doesn’t store this type of information. Essentially, a browser has no memory of what users do when private browsing is enabled.
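
The “no memory” idea above can be sketched in a few lines of Python. The class and method names here are made up for illustration, this is not Chrome’s actual design, but the essential behavior is the same: a session flagged incognito simply declines to record anything:

```python
# Minimal sketch of private browsing: the incognito flag suppresses
# all persistence. Names are illustrative, not any real browser's API.

class BrowserSession:
    def __init__(self, incognito=False):
        self.incognito = incognito
        self.history = []      # persisted under normal operation
        self.cookies = {}      # likewise

    def visit(self, url, set_cookie=None):
        # The page still loads either way; only record-keeping differs.
        if not self.incognito:
            self.history.append(url)
            if set_cookie:
                self.cookies.update(set_cookie)

normal = BrowserSession()
private = BrowserSession(incognito=True)
for session in (normal, private):
    session.visit("https://example.org", set_cookie={"sid": "abc123"})

print(normal.history)    # ['https://example.org']
print(private.history)   # []
```

A real implementation is far harder because history, cookies, caches, form autofill, and downloads each persist through separate code paths, which is the point made below about everything the browser does being affected.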

Although private browsing is conceptually simple, it is not easy to implement because everything the browser does is affected by it. Apple’s Safari browser has had a private browsing mode since version 2.0 (April 2005). Currently at version 3.1.2, Safari is still the only major browser with a built-in private browsing mode. However, Safari’s private browsing mode isn’t perfect.

Private browsing was a planned feature for Firefox 3.0, but was dropped before the release because the developers “didn’t want to put something in that was half baked.” The Mozilla Wiki describes the current state of this feature and provides a link to a Firefox plugin called Stealther, which provides some private browsing features.

Microsoft has announced that they will include a private browsing feature, called InPrivate, in their next version of Internet Explorer. Microsoft’s effort seems to be even more ambitious than simply not storing data locally. For example, a Microsoft blog post describes a feature, called InPrivate Blocking, that would add the ability to block browsing information that would normally flow to third party sites.

Clearly, private browsing mode is not a trivial engineering task, but Chrome has some fundamental advantages over the “big three” that may make real private browsing easier to implement and maintain. Since Chrome will have Incognito in its first release, there is less code that needs to be re-engineered to respect a private browsing mode. Also, Chrome uses a separate process for each tab, whereas a traditional browser has a single process for all of its tabs. Multiple processes make it easier to sandbox tabs. As a result of this strict separation, Chrome could allow individual tabs to go “Incognito” while others act normally.
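
The process-per-tab idea can be illustrated with Python’s multiprocessing module standing in for a real rendering engine (the function and URL names are hypothetical). A failure in one “tab” process cannot corrupt the memory of another, which is the isolation property that makes sandboxing easier:

```python
# Illustrative sketch of process-per-tab isolation, with Python
# processes standing in for browser renderer processes.
import multiprocessing as mp

def render_tab(url, crash=False):
    """Pretend renderer: isolated in its own process."""
    if crash:
        raise RuntimeError(f"renderer for {url} crashed")
    # ... page rendering would happen here ...

if __name__ == "__main__":
    tabs = [
        mp.Process(target=render_tab, args=("https://example.org",)),
        mp.Process(target=render_tab, args=("https://bad.example", True)),
    ]
    for tab in tabs:
        tab.start()
    for tab in tabs:
        tab.join()
    # The second tab's process died with a nonzero exit code,
    # but the first tab finished cleanly, e.g. exit codes [0, 1].
    print([tab.exitcode for tab in tabs])
```

In a single-process browser, the equivalent of that `RuntimeError` takes every tab down with it; per-tab processes turn a whole-browser crash into a single dead tab.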

It is difficult to predict what sort of impact Chrome will have on the browser market, web application development, or Internet privacy, but if Chrome will have any impact, then it must compete with the “big three.” They are big for a reason, and a comic book isn’t going to solve that problem.

[ Update: Google has officially released Chrome at the following URL: http://www.google.com/chrome ]

Privacy Survey

August 11th, 2008

ThePrivacyPlace.Org Privacy Survey is Underway!

Researchers at ThePrivacyPlace.Org are conducting an online survey about privacy policies and user values. The survey is supported by an NSF ITR grant (National Science Foundation Information Technology Research) and was first offered in 2002. We are offering the survey again in 2008 to reveal how user values have changed over the intervening years. The survey results will help organizations ensure their website privacy practices are aligned with current consumer values.

We need to attract several thousand respondents, and would be most appreciative if you would consider helping us get the word out about the survey, which takes about 5 to 10 minutes to complete. The results will be made available via our project website (http://www.theprivacyplace.org/).

Prizes include $100 Amazon.com gift certificates sponsored by Intel Co. and gifts from IBM and Blue Cross and Blue Shield of North Carolina.

On behalf of the research staff at ThePrivacyPlace.Org, thank you!

Are Google Health’s Privacy Practices Healthy?

June 20th, 2008

by Jessica Young and Annie I. Antón

On May 19, 2008, Google launched Google Health [1], a new Personal Health Record (PHR) web portal that allows patients to gather and organize their medical records while keeping their physicians up to date about their health condition. As with other PHRs, like Microsoft’s HealthVault, Google Health does not appear to be covered by federal or state health privacy laws. According to the Google Health Terms of Service, Google is not a “covered entity” as defined in the Health Insurance Portability and Accountability Act (HIPAA); as such, “HIPAA does not apply to the transmission of health information by Google to any third party” [2].

Researchers at ThePrivacyPlace.org have evaluated privacy policies and privacy breaches since the project’s founding in 2001. In particular, The Privacy Place researchers are addressing the extent to which information is protected in financial and health care systems that must comply with relevant laws and regulations.

Google Health is not a covered entity as defined in HIPAA. Thus, any personal health data that you submit to Google Health will not be afforded the same legal protections required of health care providers under HIPAA. Until state and federal agencies establish and enforce laws to protect the privacy of personal health records maintained by non-covered entities, individuals should carefully consider the risks involved in submitting sensitive health information to Google Health and other PHRs such as HealthVault. PHRs are not subject to the same privacy and security laws to which traditional medical records are subject in the United States.

As with our analysis of Microsoft’s HealthVault [3] in October 2007, we encourage patients to carefully consider and question the privacy practices as articulated in the Google Health privacy policy, terms of service, and frequently asked questions. To further explore this new service, we analyzed and evaluated the protections and vulnerabilities involved in using Google Health. The Google Health Help Center provides a link and U.S. Postal Address—but no other contact method, such as an email address or phone number—for users to submit a “Question About Privacy” [4]; clicking on this link displays a web page with a form for inquiries that states, “[w]e hope this information [Privacy Policy] will help you make an informed decision about sharing your personal information with us” [5]. Unfortunately, Google has been unresponsive to our questions regarding its Google Health privacy policy.

We sent four questions regarding Google Health’s privacy practices via the Google Health Help Center [5] on May 23, 2008. On June 4, 2008, we submitted the same four questions but this time via Google’s Web Search Help Center [6], where users are invited to submit questions specifically about Google’s privacy practices. It has been over three weeks since our first inquiry and we have yet to receive a response of any kind to any of our questions. Patients are concerned about the privacy of their health information [7]. A lack of prompt replies to questions regarding health privacy is disconcerting and suggests that privacy is not a priority for those managing Google Health or manning the Google Help Center.

We focused on three questions of Microsoft’s HealthVault in our previous analysis [3]. Here we examine these same three questions within the context of Google Health.

Will your health information be stored in other countries without appropriate legal oversight, skirting many of the protections afforded by the HIPAA?

The three Google Health privacy-related documents provide no insights about where personal health information will be stored. As we received no answers to our inquiries to the Google Health Help Center, we turned to the general Google Privacy Policy, which states, “Google processes personal information on our servers in the United States of America and in other countries” [8]. Users should always be concerned about the location of their data because different countries have different data protection standards and laws. If your data is breached in some way, the physical location of the server on which it was stored will affect the recourses that will be available to you.

Will your health care records be merged with other personal information about you that was previously collected within the context of non-health related services?

No, according to the documents we reviewed. Google Health’s Privacy Policy states: “The [record] log information will be used to operate and improve service and will not be correlated with your use of other Google services” [9]. This is also addressed in the FAQ by the statement that “no personal or medical information in your Google Health profile is used to customize your Google.com search results or used for advertising” [10]. At this point in time, it appears that Google Health information will not be merged with information from other Google services without your consent.

Are the access controls to your health records based not only on your consent, but also on the principle of least privilege?

Google Health allows users to grant read/write access to their information to other third-party sites and/or individuals. The Google Health Privacy Policy states: “you [as a Google Health user] can revoke sharing privileges at any time. When you revoke someone’s ability to read your health information, that party will no longer be able to read your information, but may have already seen or may retain a copy of the information” [9]. Thus, users should determine their access control rules as soon as possible when setting up their accounts. Access control rules and the ability to change these rules become immaterial once private health information reaches an unauthorized or unintended agent. Google is not clearly implementing the principle of least privilege, because it appears that others may be able to grant read/write access to your health information, leaving the door open for data breaches.
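
The revocation limitation quoted above, that revoking access stops future reads but cannot recall copies already made, can be sketched as a toy access-control list. All names are illustrative; this is in no way Google Health’s implementation:

```python
# Sketch of why revocation cannot undo disclosure: once a reader
# holds a copy, the access-control list no longer governs it.

class HealthRecord:
    def __init__(self, owner, contents):
        self.owner = owner
        self.contents = contents
        self.readers = {owner}        # access control list

    def grant(self, party):
        self.readers.add(party)

    def revoke(self, party):
        self.readers.discard(party)   # blocks only *future* reads

    def read(self, party):
        if party not in self.readers:
            raise PermissionError(f"{party} has no access")
        return self.contents          # the reader may retain this copy

record = HealthRecord("alice", "blood type: O-")
record.grant("dr_bob")
bobs_copy = record.read("dr_bob")     # succeeds while access is granted
record.revoke("dr_bob")
# record.read("dr_bob") would now raise PermissionError, yet the
# earlier copy is beyond Alice's control:
assert bobs_copy == "blood type: O-"
```

This is why the post advises setting access control rules as soon as possible: the ACL is a gate on future reads, not a leash on data that has already passed through it.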

References

[1] Google Health.

[2] Google Health Terms of Service, April 28, 2008

[3] A.I. Antón. Is That Vault Really Protecting Your Privacy?, ThePrivacyPlace.org Blog, October 9, 2007.

[4] Google Health Help Center Contacting Support page.

[5] Google Health Help Center Contact Us [Question about privacy] page.

[6] Google Web Search Help Center.

[7] National Consumer Health Privacy Survey, California Health Care Foundation, 2005.

[8] Google Privacy Policy, October 14, 2005.

[9] Google Health Privacy Policy, no date provided.

[10] Google Health Frequently Asked Questions, no date provided.

ThePrivacyPlace Construction

May 22nd, 2008

ThePrivacyPlace is pleased to announce that we are moving to a new hosting provider and will be revamping our site to make it more informative. Please bear with us as we work to provide better service!