Author Archive

In Defense of Smart Phone Security by Default

Sunday, October 19th, 2014

Apple’s iOS 8 and the latest version of Google’s Android claim to establish landmark privacy protections by enabling encryption by default. According to Apple and Google, they will be unable to “open” the phone for anyone, not even law enforcement. These new measures have been sharply criticized by the Director of the FBI and the Attorney General. As a software engineering professor, I’ve devoted my career to teaching students how to develop (a) secure, (b) privacy-preserving, and (c) legally compliant software systems. I’m not qualified to debate whether or not this move by Apple and Google is lawful or constitutional. However, as a technologist I can assert that applying security best practices yields a system that can withstand intrusions and denial-of-service attacks and that limits access to authenticated and authorized users.
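The claim that the vendor cannot “open” the phone follows from where the key comes from. A minimal conceptual sketch (this is not Apple’s or Google’s actual scheme; the function and parameter names are illustrative): the encryption key is derived from the user’s passcode plus a per-device secret, so no key the vendor could hand over ever exists.

```python
import hashlib
import os

def derive_key(passcode: str, device_salt: bytes) -> bytes:
    """Derive an encryption key from the user's passcode.

    PBKDF2 stretches the passcode to slow down guessing; real devices
    additionally entangle a hardware-bound secret that never leaves
    the device, so the derivation can't even be attempted off-device.
    """
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), device_salt, 100_000)

device_salt = os.urandom(16)          # fixed per device at setup time
key = derive_key("1234", device_salt)

# The same passcode always yields the same key...
assert key == derive_key("1234", device_salt)
# ...but without the passcode, there is no key to surrender.
assert key != derive_key("0000", device_salt)
```

Because the vendor stores neither the passcode nor the derived key, a “back door” would require deliberately weakening this derivation, which is exactly the design decision under debate.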

The recent “encryption by default” design decision by Apple and Google is currently being discussed in software engineering and security classes across our nation, and perhaps across the globe. By and large, privacy and security researchers, technologists and activists applaud this decision because it raises the bar for truly implementing security best practices. It’s a bitter pill for professors who teach students to develop secure, privacy-preserving, and legally compliant software to have those students told on the job, “Oh, that stuff you learned about security back in school? We only want you to secure the system part way, not all the way. So, leave in a back door.” Such a position undermines the academic institutions seeking to prepare tomorrow’s security and privacy workforce in an ever-changing world where sophisticated criminals are getting smarter and their offensive techniques threaten to surpass our ability to stay ahead.

From my experience working with government agencies, I thoroughly understand the desire to “catch the bad guys” and value the ability to prevent malicious criminal activity by individuals or nation states. I want our government, Department of Homeland Security, Department of Defense and Intelligence Community to protect us from the unfathomable. I find myself wondering why the very institutions who promote security and privacy best practices (via, for example, centers of excellence at our nation’s top universities) are so vehemently opposed to industry actually implementing best practices. My analysis yields two observations:

  1. Taking the Easy Way Out. For law enforcement to expect companies to provide the government with back-door access (even when required by law) seems to me the lazy approach. Reading between the lines, one could infer that the government lacks the incentives and/or the will to innovate and improve the state of the art in cyber offense. Where’s the spirit of the scientists and engineers who enabled man to walk on the moon? Where’s the American will to innovate, to surpass the state of the art, and to be the best? Why let other nations beat us at our own game? The only way we can get better at offense is by facing the best possible defense. At a time when other nation states are growing so sophisticated, we risk stunting our own capabilities if we rely on an easy back door rather than honing our own skills. We keep ourselves sharp by learning to confront state-of-the-art systems. If we rely on back doors instead of staying ahead of the curve, other countries’ intelligence services will have both the reason and the opportunity to develop capabilities that surpass those of our own agencies.
  2. Creating a Back Door for Use in Other Countries. If the United States expects companies to provide a back door into systems and the data that resides in them, then other governments will, too. We can hardly expect Apple or Google to provide a back door to the U.S. but not to China or Russia. At least in the United States, we have a legal framework that requires search warrants and similar safeguards to gain access via such a back door. Many other countries lack these safeguards and would require the phone companies to enable snooping within their borders with no legal protections comparable to the U.S. system. As security engineers have learned in many other systems, you can’t build a vulnerability that is used only by the good guys and not by others.

I certainly empathize with law enforcement’s desire to gain evidence for critical investigations. But Congress and the White House have agreed that cybersecurity should be funded as a national priority. As professors of computer security, we can’t teach the importance of building secure systems and then explain to our students that we will leave tens of millions of devices insecure.

Dr. Annie I. Antón is a Professor in and Chair of the School of Interactive Computing at the Georgia Institute of Technology in Atlanta. She has served the national defense and intelligence communities in a number of roles since being selected for the IDA/DARPA Defense Science Study Group in 2005-2006.

Is That Vault Really Protecting Your Privacy?

Tuesday, October 9th, 2007

Last week, Microsoft announced a new personal health record (PHR) system called HealthVault. HealthVault is a web-based portal that enables end-users to upload their health records to the web. Unfortunately, what people don’t realize is that HealthVault and similar PHR systems are not governed by U.S. health privacy law. When the Health Insurance Portability and Accountability Act (HIPAA) was enacted, lawmakers did not envision that private software firms would eventually want to create databases for our health records. As a result, HealthVault and other PHR systems are not subject to the same privacy and security laws that govern traditional medical records in the United States, because these firms are not “covered entities” as specified in HIPAA.

Over the course of the past seven years, researchers at ThePrivacyPlace.org have evaluated over 100 privacy statements for financial and healthcare web portals. In addition, we evaluate the extent to which these systems protect the privacy of sensitive information and comply with relevant regulations.

Even though physicians and the press are excited about the introduction of these new PHR systems [1], there are questions that I urge the public to ask before entrusting their sensitive health records to any PHR system. My concerns are based on a careful evaluation of the HealthVault privacy statements [2, 3]. Microsoft appears to have sought the counsel of physicians who believe that patient consent is the best indicator of privacy protections. Unfortunately, most physicians do not understand the subtleties buried within healthcare privacy statements within the context of the software that implements those statements. For this reason, I now list three primary questions that one should ask before entrusting their health records to HealthVault or any other PHR system:

Will your health information be stored in other countries without appropriate legal oversight, skirting many of the protections afforded by HIPAA?

The HealthVault privacy statement explicitly states that your health records may be off-shored to countries that do not afford the same privacy protections for sensitive information that we do in the United States. If your information is disclosed or altered in such a country, do you have any legal recourse or remedy?

Will your health care records be merged with other personal information about you that was previously collected within the context of non-health related services?

Within the context of HealthVault, the answer to this question is yes. Microsoft explicitly states that they will merge the information they have previously collected from you via non-health related services with your HealthVault information. Moreover, it is unclear what information Microsoft already has about us other than our names and contact information and precisely what information third parties may access. Furthermore, we don’t know if that information is accurate or complete. Thus, use of the merged information may not be what we expect.

Are the access controls to your health records based not only on your consent, but also on the principle of least privilege?

Although HealthVault requires patient consent for any access to or sharing of your health records, its access controls leave the door wide open for data breaches. HealthVault enables individuals to grant access to other people and programs, which can in turn grant read/write access to your health record. The only safeguard is a history mechanism that provides an accounting of accesses after the fact, if you suspect that your information has been breached. A better approach would be for Microsoft to proactively enforce contractual obligations via audits and monitoring mechanisms.
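The difference between consent-based sharing and least privilege can be made concrete. The sketch below is a hypothetical model, not HealthVault’s actual design: every grant carries only the rights the grantor holds, and re-delegation is off unless explicitly enabled, so a read-only grantee cannot quietly pass your record onward.

```python
from dataclasses import dataclass

@dataclass
class Grant:
    grantee: str
    rights: frozenset          # e.g. {"read"} or {"read", "write"}
    delegable: bool = False    # may the grantee pass access onward?

class Record:
    def __init__(self, owner: str):
        self.owner = owner
        self.grants = {owner: Grant(owner, frozenset({"read", "write"}), True)}
        self.audit_log = []    # the after-the-fact history mechanism

    def share(self, grantor: str, grantee: str, rights, delegable: bool = False):
        g = self.grants.get(grantor)
        # Least privilege: a grantor may pass on only rights it holds,
        # and only if its own grant is marked delegable.
        if g is None or not g.delegable or not set(rights) <= g.rights:
            raise PermissionError(f"{grantor} cannot delegate {sorted(rights)}")
        self.grants[grantee] = Grant(grantee, frozenset(rights), delegable)
        self.audit_log.append((grantor, "shared", grantee, tuple(sorted(rights))))

rec = Record("alice")
rec.share("alice", "dr_bob", {"read"})            # read-only, non-delegable
try:
    rec.share("dr_bob", "third_party", {"read"})  # blocked at grant time
except PermissionError as e:
    print(e)
```

The key design choice is that delegation is checked when access is granted, not merely logged afterwards; the audit log remains, but it is a supplement rather than the only safeguard.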

The hype surrounding HealthVault’s privacy protections among those in the medical community must be balanced with the reality of the information security and privacy practices expressed in its public privacy statements. It is critical to address these privacy concerns in the design of PHR systems before we deploy them with vulnerabilities that will ultimately lead to yet another rash of data breaches.

References

[1] Steve Lohr. Microsoft Rolls Out Personal Health Records, New York Times, 4 October 2007.

[2] Microsoft HealthVault Search and HealthVault.com Beta Version Privacy Statement, October 2007.

[3] Microsoft HealthVault Beta Version Privacy Statement, October 2007.

Video Surveillance Should Be Included in Privacy Policies

Friday, October 7th, 2005

Video surveillance is very prominent these days and given ThePrivacyPlace.Org’s extensive analyses of privacy documents at financial institutions (see: Financial Privacy Policies and the Need for Standardization), I decided to examine whether financial institutions now mention the collection of information via video surveillance in their policy documents. I checked all nine institutions examined in our 2004 IEEE Security & Privacy paper. None of the nine institutions mention video surveillance in any of their privacy, security or legal statements. Given that banks, for example, collect video of all ATM transactions and of patrons that enter their institutions, it seems only natural that we, as patrons, have a right to expect that these practices be included in their policy statements. Video surveillance impacts one’s sense of autonomy because the knowledge that one’s actions are being observed may alter one’s behavior, thus resulting in a loss of privacy. There are certainly merits to video surveillance at ATMs (public safety, deterrence of crime, etc.). However, the loss of autonomy results in a potential invasion of privacy about which patrons need to be informed. This raises the questions: Why are financial institutions not including their video surveillance practices in their policy statements? Are patrons not entitled to know how this video is used and whether it is aggregated with other kinds of information about them?

HIPAA Prohibits Researchers from Reviewing Medical Records

Thursday, September 30th, 2004

Researchers who used to search medical records for potential participants in their clinical trials of new medications or medical treatments must now rely on doctors for patient referrals. As a researcher, I fully understand how this can be viewed as a hindrance by medical researchers. However, as a public citizen I’m happy to see that HIPAA is having an impact on those trying to access my sensitive medical information without my knowledge. ThePrivacyPlace.Org recently released an Analysis of Web Site Privacy Policy Evolution in the Presence of HIPAA that you may find interesting.

For more information on HIPAA prohibiting researchers from reviewing medical records, see: Privacy rule builds biomedical research bottleneck.

— aia

Airline Passenger Data To Be Handed over to TSA in November

Wednesday, September 22nd, 2004

Passenger information for those who flew in June of 2004 will be turned over to the government to evaluate a new system designed to help identify terrorists. This data is certain to have anomalies that could lead to innocent citizens being erroneously labeled as terrorists and placed on a perpetual watch list. Additionally, the fact that the government is collecting data that includes, for example, special food requests opens the door for individuals to unfairly infer things about passengers.

The TSA has posted a Privacy Impact Assessment (PIA) for the testing phase of the Secure Flight program. See the Yahoo News Article for more information.

U.S. Senate Requires Privacy Impact Reports

Tuesday, September 21st, 2004

The U.S. Senate has unanimously approved an amendment to the 2005 Homeland Security Department spending bill. The amendment requires all federal agencies that use data-mining technologies to submit a privacy impact report to Congress. For more information, see: Senate votes for privacy study on agencies’ data-mining use.

jetBlue & Northwest Disclosures of Passenger Travel Records

Monday, September 20th, 2004

Last October, a few of us at ThePrivacyPlace.Org examined the JetBlue Airways’ policy in an attempt to better understand the revelation that JetBlue had violated its public privacy policy when it gave the travel records of five million JetBlue customers to Torch Concepts, a private contractor to the Department of Defense (DoD). This paper is scheduled to appear in IEEE Security & Privacy and is entitled, “The Complexity Underlying JetBlue’s Privacy Policy Violations.” If you don’t want to wait for the paper to appear in print, the technical report is currently available here: The Complexity Underlying JetBlue’s Privacy Policy Violations.

The Department of Homeland Security (DHS) Privacy Office investigated jetBlue to determine whether the DoD had violated any laws. The DHS Privacy Office released a Report to the Public on Events Surrounding jetBlue Data Transfer on February 20, 2004. This report asserts that there is no evidence that jetBlue provided passenger data directly to the Transportation Security Administration (TSA) or the U.S. Department of Transportation (DOT); instead, jetBlue provided the information to Torch Concepts through its contractor (Acxiom). The objective of this investigation was to determine whether government agencies had played a role in the privacy violation. The report states that no TSA employee violated the Privacy Act; however, TSA employees were involved in the data transfer and failed to consider the privacy policy impacts of this transfer: “The TSA employees involved acted without appropriate regard for individual privacy interests or the spirit of the Privacy Act of 1974.” The DHS report makes specific recommendations, including the need for comprehensive privacy training for employees and the establishment of data sharing guidelines.

It was later revealed that Northwest Airlines had also disclosed the travel records of its customers. This privacy violation likewise prompted a number of complaints, including one by the Electronic Privacy Information Center (EPIC). See: Northwest Airlines’ Disclosure of Passenger Data to Federal Agencies.

On September 15th, the Department of Transportation dismissed the privacy complaint filed by EPIC against Northwest (see: Transportation Department dismisses privacy complaint against Northwest).

We at ThePrivacyPlace.Org will continue to investigate methods and tools that can be developed to help stop sensitive information from being disclosed when such disclosures are not in compliance with governing policies and laws. For a sample of some of our efforts, check out the reports that are available on our publications page.

— Annie Antón

NSF funds automatic chatroom spies

Friday, September 17th, 2004

The NSF (National Science Foundation) is funding a project entitled “Surveillance, Analysis and Modeling of Chatroom Communities”.

From the award abstract, it appears as though the researchers intend to develop an automated surveillance system that will collect data in Internet chatrooms to discover hidden groups in which possible terrorist activities might be discussed. The system would automatically determine who is chatting with whom, as well as the specific topics being discussed by specific chatroom participants. Unfortunately, the abstract does not mention how the PIs will investigate the social impact of such technologies; nor does it mention how this technology may or may not violate the privacy of innocent chatroom participants.

As researchers it is critical for us to consider the broader impacts of our work on society, especially when creating technologies that can further erode what little remaining privacy public citizens can still claim.