Archive for 'HIPAA'

The ECPA and Personal Health Record Systems

Thursday, December 11th, 2008

Yesterday, William Yasnoff discussed whether the Electronic Communications Privacy Act (ECPA) provides federal privacy protection for Personal Health Record (PHR) systems. Here at The Privacy Place, we have previously focused on whether the Health Insurance Portability and Accountability Act (HIPAA) applies to PHRs (short answer: no), but today I would like to take a moment to talk about the ECPA. If you are interested in our coverage of HIPAA and PHRs, I would point you to our post on Microsoft’s HealthVault and our post on Google’s Google Health project.

Let’s start with some background on the ECPA. The ECPA was passed in 1986 as an amendment to the Wiretap Act of 1968 and primarily deals with electronic surveillance. The purpose of the Wiretap Act was to make it illegal for any person to intercept wire and oral communications, such as telephone calls and in-person conversations. The first title of the ECPA extends the original Wiretap Act to cover the interception of electronic communications. The second title of the ECPA (commonly called the Stored Communications Act) adds protection for stored communications and prohibits intentionally accessing stored electronic communications without authorization. The ECPA has been amended three times since it was passed: first by the Communications Assistance for Law Enforcement Act (CALEA) in 1994, second by the USA PATRIOT Act in 2001, and third by the USA PATRIOT Act reauthorization acts in 2006.

Now, Yasnoff makes several claims in his post, which I will discuss in order.  First, he claims that there are no exceptions in the ECPA and that this means whichever organization holds your information must get your permission to release it.  This is categorically not true.  There are many exceptions in the ECPA, but for the sake of simplicity, I will limit this discussion to the two main exceptions of the original Wiretap Act, both of which were retained by the ECPA.

The first exception allows interception when one of the parties has given prior consent.  This could mean that the government can legally access your communications if your PHR service provider consents prior to the communication.  Thus, Yasnoff’s strong statement that PHRs “MUST GET YOUR PERMISSION” (emphasis from original statement) is simply incorrect.

The second exception allows interceptions if they are done in the ordinary course of business.  This could mean that your data would be accessible by third parties such as an information technology vendor that maintains the software.  Effectively, this is a somewhat broader exception than the exception found in HIPAA for Treatment, Payment, and Operations, which Yasnoff found to be wholly unacceptable for protecting patient privacy.

Second, Yasnoff claims that the ECPA “is not long or complicated – I urge you to read it yourself if you have any doubts.” This statement, too, is untrue. Paul Ohm, previously an attorney for the Department of Justice and currently an Associate Professor of Law at the University of Colorado Law School, has publicly challenged tax law experts with his claim that the ECPA is more complicated than the U.S. Tax Code.

Bruce Boyden, an Assistant Professor of Law at the Marquette University Law School, wrote a chapter in Proskauer on Privacy discussing electronic communications and the ECPA. In it, he details many of the nuanced aspects of the ECPA, including the three subsequent amendments to it. With regard to the first title (Interception), he says:

To “intercept” a communication means, under the act, “the aural or other acquisition of the contents of any wire, electronic, or oral communications through the use of any electronic, mechanical, or other device.” The application of this definition to electronic communications has at times been particularly difficult, and courts have struggled with a number of questions: What exactly qualifies as the acquisition of the contents of a communication, and how is it different from obtaining a communication while in electronic storage under the Stored Communications Act? Does using deception to pose as someone else constitute an interception? Does using a person’s own device to see messages intended for them qualify?

Boyden later talks about limitations to the second title (Stored Communications):

[T]here are two key limitations in section 2701 [of the ECPA].  First, it does not apply to access of any stored communication, but only those communications stored on an electronic communications service facility as defined under the act.  Second, the definition of “electronic storage” in the act does not encompass all stored communications, but only those in “temporary, intermediate storage” by the electronic communication service or those stored for backup protection.

These seem like rather important limitations, which further undercut Yasnoff’s claim that there are no exceptions in the ECPA. And as to his second point, this seems pretty complicated. At least, it certainly doesn’t seem as simple as finding some information that has been communicated to and stored by a PHR service provider, which was Yasnoff’s implication.

Boyden has also discussed whether automated computer access to communications is a violation of the ECPA.  The discussion is more complicated than it may appear at first and there’s an interesting discussion of it over on Concurring Opinions.

Broadly, several organizations feel that current US privacy law, including the ECPA, is discombobulated. The Electronic Frontier Foundation believes that fixing the ECPA is one of the top five priorities in their privacy agenda for the new administration. The Center for Democracy and Technology would like to see the new administration pass consumer privacy legislation and a “comprehensive privacy and security framework for electronic personal health information.” The ACLU would like to see the new administration “harmonize privacy rules.” I submit that these organizations do not feel that the ECPA provides clear and adequate privacy protections for PHR systems.

Yasnoff’s third claim is that PHRs which are “publicly available” receive stronger protections under the ECPA than those that are “private.”  In fact, Yasnoff says:

Only those that are “publicly-available” are included. While this clearly would apply to generally available web-based PHRs, systems provided only to specific individuals by employers, insurers, and even healthcare providers are less likely to be considered “publicly-available.” Therefore, ECPA protection is limited. So you are only covered if you use a PHR that is available to anyone.

This statement is either completely backwards as it relates to the ECPA or, perhaps more likely, not a factor for ECPA protection at all.  The EFF’s Internet Law Treatise has an article describing the differences in public communications versus private communications:

“[T]he legislative history of the ECPA suggests that Congress wanted to protect electronic communications that are configured to be private, such as email and private electronic bulletin boards,” as opposed to publicly-accessible communications. See Konop, 302 F.3d at 875, citing S. Rep. No. 99-541, at 35-36, reprinted in 1986 U.S.C.C.A.N. 3555, 3599.

Thus, the public accessibility of the PHR service is not important. The pressing concern is whether the communication itself was meant to be public or private. If it was public, then the ECPA simply doesn’t apply. If it was private, then whatever protections the ECPA does afford would apply.

By now it must be clear that I disagree with William Yasnoff’s assessment of the ECPA’s application to PHRs.  I did, however, want to point out one interesting privacy protection that the ECPA offers which HIPAA does not: a private right of action. 

Basically, a private right of action allows citizens to file civil lawsuits in an attempt to recover losses caused by violations of a law. The ECPA has a private right of action clause, while HIPAA does not. HIPAA’s lack of a private right of action has drawn some criticism. On the other hand, the ECPA’s private right of action has also been criticized as unnecessary and wasteful. Perhaps it is a stretch, but this was the only possible improvement in privacy protection that I was able to find to support Yasnoff’s argument regarding the use of the ECPA to provide privacy protections for PHRs.

I would like to conclude by saying as directly as possible that the ECPA does NOT provide clear or adequate privacy protection for personal health information given to PHR systems. Privacy in general and healthcare privacy in particular are hotly debated current concerns for many organizations. I believe it is likely that the Obama administration and the next session of Congress will attempt to address the privacy concerns raised by organizations like the EFF, the CDT, and the ACLU. In the meantime, however, do not use a PHR service under the assumption that the ECPA protects the privacy of your medical records.

Camera phones and our privacy

Saturday, October 4th, 2008

By Jessica Young and Aaron Massey

This season’s premiere of Grey’s Anatomy showed interns using camera phones to take pictures of their resident’s injury. The episode aired only days after a story broke about an incident at the University of New Mexico Hospital, where two employees had used their cell phones to take pictures of patients and then posted the pictures online. Both employees were fired because their actions violated hospital policy.

The University of New Mexico Hospital is not the first hospital to experience problems with cell phone cameras. In March 2008, the Resnick Neuropsychiatric Hospital at UCLA banned cell phones to protect the rights of its patients after past incidents there. San Diego’s Rady Children’s Hospital has banned cell phones in patient areas after pictures of children were found on an employee’s phone and computer. Other hospitals have also experienced problems with employees using camera phones in ways that violate patient privacy. Although policies are in place, enforcement is difficult.

Privacy law in the United States is historically tied to innovations in cameras. Warren and Brandeis wrote their famous article, “The Right to Privacy,” in response to the invention of portable cameras and “instantaneous photography.” Those fears have been reborn now that most people carry cell phones with them at all times and a majority of these phones have built-in cameras.

Newer phones are capable of easily sharing pictures and videos with others – regardless of location. As a result, candid pictures can be taken at unexpected times and in someone’s worst moments. For example, a customer at a grocery store recently had an embarrassing picture taken in a moment of anger after the store couldn’t process his credit card. Within moments, the picture was online and generating comments. In the article linked above, Harmon discusses the use of the candid camera phone:

“In recent weeks the devices have been banned from some federal buildings, Hollywood movie screenings, health club locker rooms and corporate offices. But the more potent threat posed by the phonecams, privacy experts say, may not be in the settings where people are already protective of their privacy but in those where they have never thought to care.”

The recent incidents with cell phone cameras at hospitals are troubling examples of why people should be concerned about privacy in places they previously “never thought to care.” Hopefully, people will become more aware of cell phone use and capabilities as they relate to individuals’ privacy—not just in hospitals but everywhere.

VentureBeat and ZDNet comment on HealthVault

Tuesday, October 23rd, 2007

Our recent coverage of HealthVault has received some attention from other news outlets.

VentureBeat author David P. Hamilton has been covering HealthVault. He began with an attempt to review HealthVault that ended in frustration while trying to register a password. His next post was a review of HealthVault itself. Recently he posted his thoughts regarding our coverage of HealthVault.

Our comments also received some attention from Dana Blankenhorn at ZDNet. Robin Harris, another ZDNet author, believes that HealthVault is a sick joke. ZDNet also has some screenshots of HealthVault in action for those who may not have the time to play around with the site themselves. ZDNet also has a news article about Microsoft’s efforts to get health records online.

All of the articles are well worth reading if you are concerned about the privacy implications of electronic health records.

Is That Vault Really Protecting Your Privacy?

Tuesday, October 9th, 2007

Last week, Microsoft announced a new Personal Health Record (PHR) system called HealthVault. HealthVault is a web-based portal that enables end-users to upload their health records to the web. Unfortunately, what many people don’t realize is that HealthVault and similar PHR systems are not governed by federal health privacy law. When the Health Insurance Portability and Accountability Act (HIPAA) was enacted, we did not envision that private software firms would eventually want to create databases for our health records. As a result, HealthVault and other PHR systems are not subject to the same privacy and security laws that govern traditional medical records in the United States, because they are not “covered entities” as specified in HIPAA.

Over the course of the past seven years, researchers at ThePrivacyPlace.org have evaluated over 100 privacy statements for financial and healthcare web portals. In addition, we focus on evaluating the extent to which the privacy of sensitive information is protected in these systems, as well as the extent to which systems comply with relevant regulations.

Even though physicians and the press are excited about the introduction of these new PHR systems [1], there are questions that I urge the public to ask before entrusting their sensitive health records to any PHR system. My concerns are based on a careful evaluation of the HealthVault privacy statements [2, 3]. Microsoft appears to have sought the counsel of physicians who believe that patient consent is the best indicator of privacy protections. Unfortunately, most physicians do not understand the subtleties buried within healthcare privacy statements within the context of the software that implements those statements. For this reason, I now list three primary questions that one should ask before entrusting their health records to HealthVault or any other PHR system:

Will your health information be stored in other countries without appropriate legal oversight, skirting many of the protections afforded by HIPAA?

The HealthVault privacy statement explicitly states that your health records may be off-shored to countries that do not afford the same privacy protections for sensitive information that we do in the United States. In particular, if information is disclosed or altered, do you have any legal recourse or remedy?

Will your health care records be merged with other personal information about you that was previously collected within the context of non-health related services?

Within the context of HealthVault, the answer to this question is yes. Microsoft explicitly states that they will merge the information they have previously collected from you via non-health related services with your HealthVault information. Moreover, it is unclear what information Microsoft already has about us other than our names and contact information and precisely what information third parties may access. Furthermore, we don’t know if that information is accurate or complete. Thus, use of the merged information may not be what we expect.

Are the access controls to your health records based not only on your consent, but also on the principle of least privilege?

Although HealthVault requires patient consent for any access to or sharing of your health records, its access controls leave the door wide open for data breaches. HealthVault enables individuals to grant access to other people and programs, which can in turn grant read/write access to your health record. The only safeguard is a history mechanism that provides an accounting of accesses if you suspect, after the fact, that your information has been breached. A better approach would be for Microsoft to proactively enforce contractual obligations via audits and monitoring mechanisms.
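
To make the contrast concrete, here is a minimal, hypothetical sketch of access control based on the principle of least privilege with a built-in accounting of accesses. It is written in Python purely for illustration and is not based on HealthVault’s actual implementation; the names (Grant, RecordStore, “clinic-app”, “rec-42”) are invented. The idea is that every grant is scoped to a specific record, a specific set of actions, and an expiration date, and that every access attempt is checked against those grants before any data is released, rather than merely being reviewable afterward.

    from dataclasses import dataclass
    from datetime import datetime, timedelta

    @dataclass
    class Grant:
        """A narrowly scoped permission: who may do what, to which record, until when."""
        grantee: str
        record_id: str
        actions: frozenset          # e.g. frozenset({"read"}) rather than blanket read/write
        expires: datetime
        may_delegate: bool = False  # least privilege: delegation is off by default

    class RecordStore:
        def __init__(self):
            self.grants = []
            self.audit_log = []

        def add_grant(self, grant):
            self.grants.append(grant)

        def access(self, actor, record_id, action):
            """Check the request against explicit grants before releasing any data,
            and record the attempt either way."""
            now = datetime.now()
            allowed = any(
                g.grantee == actor
                and g.record_id == record_id
                and action in g.actions
                and g.expires > now
                for g in self.grants
            )
            self.audit_log.append((now, actor, record_id, action, allowed))
            return allowed

    # Usage: the patient grants a clinic application read-only access to one
    # record for 30 days; a later write attempt is refused and logged.
    store = RecordStore()
    store.add_grant(Grant("clinic-app", "rec-42", frozenset({"read"}),
                          datetime.now() + timedelta(days=30)))
    print(store.access("clinic-app", "rec-42", "read"))   # True
    print(store.access("clinic-app", "rec-42", "write"))  # False

In a design like this, delegation is off by default and broad read/write grants must be created explicitly, which is roughly the opposite of the open-ended delegation described above.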

The hype surrounding HealthVault’s privacy protections among those in the medical community must be balanced with the reality of the information security and privacy practices expressed in its public privacy statements. It is critical to address these privacy concerns in the design of PHR systems before we deploy them with vulnerabilities that will ultimately lead to yet another rash of data breaches.

References

[1] Steve Lohr. Microsoft Rolls Out Personal Health Records, New York Times, 4 October 2007.

[2] Microsoft HealthVault Search and HealthVault.com Beta Version Privacy Statement, October 2007.

[3] Microsoft HealthVault Beta Version Privacy Statement, October 2007.

Transparency: The Forgotten Tool

Monday, May 8th, 2006

Those of us who come to the privacy management arena from a computer security background tend to take an extremely narrow and focused view of how technology can protect privacy. We love to debate each other on esoteric subjects: cryptographic key strengths, the merits of strong two-factor authentication, trust models in networked systems, and all sorts of deep technologies. Having worked in public key infrastructure technologies for several years, and in firewall technology before that, I am as big a fan of emerging security technology as anyone. These are all good and useful topics to be discussing, and these sorts of technologies are important foundations of a networked world.

Traditionally, we think of privacy-enhancing technologies as tools for hiding, obfuscating, and controlling disclosure. But in terms of an overall approach to privacy management, we should also think about how technology can be used to create visibility and awareness of information security practices.
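
As a purely hypothetical illustration of this idea (the Python sketch below is invented for this post and does not describe any particular vendor’s product), a transparency-oriented tool might simply record every disclosure of a person’s data at the moment it happens and expose that ledger back to the person, answering the question “who has seen my data, and why?” rather than trying to prevent the disclosure in the first place.

    from datetime import datetime

    class DisclosureLedger:
        """Records every disclosure of a user's data and makes it visible to that user."""

        def __init__(self):
            self._events = []

        def record(self, user, recipient, purpose, fields):
            # Log the disclosure as it happens, not after a complaint arrives.
            self._events.append({
                "when": datetime.now().isoformat(),
                "user": user,
                "recipient": recipient,
                "purpose": purpose,
                "fields": sorted(fields),
            })

        def report_for(self, user):
            """What the user would see on a 'who has seen my data?' page."""
            return [e for e in self._events if e["user"] == user]

    # Usage: the system notes a disclosure, and the user can later review it.
    ledger = DisclosureLedger()
    ledger.record("alice", "billing-vendor", "payment processing", ["name", "address"])
    for event in ledger.report_for("alice"):
        print(event)

Nothing in such a ledger hides or encrypts anything; its value lies entirely in the visibility it creates.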

This point was made quite well recently by Harriet Pearson, VP of corporate affairs and Chief Privacy Officer for IBM, in an interview with Computerworld.


U.S. deployment of electronic medical records “disappointingly slow”

Monday, August 1st, 2005

President Bush called for the adoption of electronic health records in his 2004 State of the Union and extolled their benefits, saying that “by computerizing health records, we can avoid dangerous medical mistakes, reduce costs, and improve care.” In 2005’s State of the Union, he reiterated his desire to seek “improved information technology to prevent medical error and needless costs.”

Recently, a congressional health subcommittee has been reviewing efforts to deploy electronic medical records (EMRs) and has found progress to be “disappointingly slow” (according to Rep. Johnson, R-Conn.). The hearing (detailed in this CNET article) reviewed the pros and cons of moving toward paperless systems and included testimony from several health and privacy experts. Particular concern was noted over HIPAA’s lack of coverage of non-insurance health transactions, which means that medical records used in those transactions are not governed by HIPAA’s strict security and privacy requirements. It was also noted that the lack of a uniform federal privacy standard may make the deployment of an EMR system much more complicated.

Insurer Goes off SSN-Based IDs

Tuesday, March 29th, 2005

I do not carry my insurance card with me every day because my Social Security Number is printed on it. If I lost my wallet, all of my personal information (including my name, SSN, date of birth, and home address, which is more than enough for identity theft) would be available to whoever found it. I cannot afford that risk. But there is good news for New York State residents: Excellus Blue Cross Blue Shield of New York State has begun issuing new alphanumeric identification numbers to its policyholders, replacing their old Social Security number-based policy IDs. The switch is scheduled to be completed by the end of May. That’s great news. I hope Blue Cross Blue Shield of North Carolina can do the same thing for its customers, so that I can carry my insurance card with me without worrying about what would happen if I lost it.

Review states that HIPAA interferes with patient care

Saturday, October 9th, 2004

An article at timesreporter.com reviews the various effects that the Health Insurance Portability and Accountability Act (HIPAA) has had on the medical industry and on patients. While patients appreciate the stronger privacy protection, the medical community has found that compliance with the new law can interfere with patient care.

Personally, I can relate to these findings. A few months ago, a good friend of mine had a heart attack and was hospitalized for several days. I was visiting him on the third day of his hospital stay when a hospital administrator approached him. She asked him several HIPAA-related questions and asked him to sign various waivers. For example: Can the hospital disclose his personal information to friends and family members? Can the doctors discuss his medical treatments and condition with family members and friends? Keep in mind that he had been on morphine and various other drugs since he was admitted and had just finished an angioplasty procedure. While I thought it was terribly inappropriate to ask him these questions when he was in no way cognizant enough to make such decisions, I asked myself: when would be an appropriate time? Should they stop treating his pain long enough to let him sign the HIPAA waivers? Would that be humane? How else would he reach an informed decision? And where does the line for protection of personal privacy become unrealistic and/or ridiculous?

As a privacy and security researcher, I cannot agree with the hospital’s actions in this matter. I realize that there will be continued resistance, compromises, and inconvenience in the pursuit of protecting our individual privacy; but if we don’t persist, we surely cannot progress.

HIPAA prohibits researchers from reviewing medical records

Thursday, September 30th, 2004

Researchers who used to search medical records for potential participants in clinical trials of new medications or medical treatments must now rely on doctors for patient referrals. As a researcher, I fully understand how this can be viewed as a hindrance by medical researchers. However, as a citizen, I’m happy to see that HIPAA is having an impact on those trying to access my sensitive medical information without my knowledge. ThePrivacyPlace.Org recently released an Analysis of Web Site Privacy Policy Evolution in the Presence of HIPAA that you may find interesting.

For more information on HIPAA prohibiting researchers from reviewing medical records, see: Privacy rule builds biomedical research bottleneck.

— aia

Medical Privacy

Wednesday, September 22nd, 2004

A few months back, I received an email from a person who said they had my medical reports from Blue Cross Blue Shield of NC. This was a person with the exact same name as mine. I was shocked by the carelessness shown on the part of the employees at the Student Health Center at my university. On checking with them, I found out that my Social Security number had been transferred to the other person’s records and vice versa. It seems to me that in many places, privacy and general “best practices” are not being given the regard people expect.
On a similar note, an Everett, Washington hospital employee recently and mistakenly faxed confidential patient data to the city’s newspaper after transposing the numbers for two physicians with the same last name. More information on this case can be found at:
Hospital works to cut number of fax problems

— Neha Jain