
In Defense of Smart Phone Security by Default

Sunday, October 19th, 2014

Apple’s iOS 8 and the latest version of Google’s Android claim to establish landmark privacy protections by turning on device encryption by default. According to Apple and Google, they will be unable to “open” a phone for anyone, not even law enforcement. These new measures have been sharply criticized by the Director of the FBI and the Attorney General. As a software engineering professor, I’ve devoted my career to teaching students how to develop (a) secure, (b) privacy-preserving, and (c) legally compliant software systems. I’m not qualified to debate whether or not this move by Apple and Google is lawful or constitutional. However, as a technologist I can assert that applying security best practices yields a system that withstands intrusions and denial-of-service attacks, limits access to authenticated and authorized users, and so on.
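To make the technical point concrete, here is a minimal sketch of what “encryption by default” can mean in practice: the data-protection key is derived from the user’s passcode, so the vendor never holds a copy and cannot decrypt the device for anyone. This is an illustration only, not Apple’s or Google’s actual design; it assumes the Python “cryptography” package, and the passcode string, salt handling, and iteration count are hypothetical.

    import os, base64
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC
    from cryptography.fernet import Fernet

    def derive_key(passcode: str, salt: bytes) -> bytes:
        # Stretch the user's passcode into a 32-byte key. The vendor never
        # sees the passcode, so it cannot reproduce this key.
        kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32,
                         salt=salt, iterations=200_000)
        return base64.urlsafe_b64encode(kdf.derive(passcode.encode()))

    salt = os.urandom(16)                      # stored on the device in the clear
    key = derive_key("user-passcode", salt)    # hypothetical passcode, for illustration

    ciphertext = Fernet(key).encrypt(b"contacts, messages, photos")

    # Decryption requires re-deriving the key from the same passcode; without it,
    # neither the manufacturer nor anyone else can recover the data.
    plaintext = Fernet(derive_key("user-passcode", salt)).decrypt(ciphertext)
    assert plaintext == b"contacts, messages, photos"

On real handsets the passcode is also entangled with a device-unique hardware key, so the derived key cannot be brute-forced off the device, but the essential point is the same: no passcode, no key, no plaintext.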

The recent “encryption by default” design decision by Apple and Google is currently being discussed in software engineering and security classes across our nation, and perhaps across the globe. By and large, privacy and security researchers, technologists, and activists applaud the decision because it raises the bar for truly implementing security best practices. It’s a bitter pill for professors who teach students to develop secure, privacy-preserving, and legally compliant software to have those students told on the job, “Oh, that stuff you learned about security back in school? We only want you to secure the system part way, not all the way. So, leave in a back door.” Such a position undermines the academic institutions seeking to prepare tomorrow’s security and privacy workforce in an ever-changing world where sophisticated criminals are getting smarter and their offensive techniques are outpacing our ability to stay ahead.

From my experience working with government agencies, I thoroughly understand the desire to “catch the bad guys” and value the ability to prevent malicious criminal activity by individuals or nation states. I want our government, the Department of Homeland Security, the Department of Defense, and the Intelligence Community to protect us from the unfathomable. I find myself wondering why the very institutions that promote security and privacy best practices (via, for example, centers of excellence at our nation’s top universities) are so vehemently opposed to industry actually implementing those practices. My analysis yields two observations:

  1. Taking the Easy Way Out. For law enforcement to expect companies to provide the government with back-door access (even when required by law) seems to me to be the lazy approach. Reading between the lines, one could infer that the government lacks the incentives and/or the will to innovate and improve the state of the art in cyber offense. Where’s the spirit of the scientists and engineers who enabled man to walk on the moon? Where’s the American will to innovate, to surpass the state of the art, and to be the best? Why let other nations beat us at our own game? The only way we can get better at offense is by facing the best possible defense. At a time when other nation states are growing so sophisticated, we risk stunting our own capabilities if we rely on an easy back door rather than honing our own skills. We keep ourselves sharp by learning to confront state-of-the-art systems. If we rely on mandated back doors instead of staying ahead of the curve, other countries and their intelligence services will have every reason to develop capabilities that surpass our own.
  2. Creating a Back Door for Use in Other Countries. If the United States expects companies to provide a back door into their systems and the data that resides in them, then other governments will, too. We can’t very well expect Apple or Google to provide a back door to the U.S. but not to China or Russia. At least in the United States, we have a legal framework that requires search warrants and the like to gain access via such a back door. Many other countries lack these safeguards, yet they would require the phone makers to enable snooping within their borders with no legal protections comparable to the U.S. system’s. As security engineers have learned from many other systems, you cannot build a vulnerability that only the good guys can use.

I certainly empathize with law enforcement’s desire to gain evidence for critical investigations. But Congress and the White House have agreed that cybersecurity should be funded as a national priority. As professors of computer security, we can’t teach the importance of building secure systems and then tell our students that we will deliberately leave tens of millions of devices insecure.

Dr. Annie I. Antón is a Professor in and Chair of the School of Interactive Computing at the Georgia Institute of Technology in Atlanta. She has served the national defense and intelligence communities in a number of roles since being selected for the IDA/DARPA Defense Science Study Group in 2005-2006.