Building Glass Houses in a World Full of Rocks

June 6, 2011
Software engineering has undergone a radical transformation over the past 20 years. The imperative for creating secure code has never been greater.

Software engineering has undergone a radical transformation over the past 20 years. Gone are the days of rigid design and process-driven development practices. Today's software development lifecycle works more like a fast-food restaurant than a 30-course meal served at Alinea in Chicago. The advent of new development practices with names like XP, Agile, and Scrum has given rise to constant tinkering with software, where projects are never really done but remain in a constant state of modification. Today's "build it as fast as possible" mentality has driven the popularity of many complex rapid-application-development frameworks, both open-source and commercial, designed to make application developers more productive. In the rush to be the most rapid and adaptive to the market, security is often the lone person at the ball without a dance partner.

The ongoing epidemic of data-breach notifications forced by today's disclosure laws has painfully highlighted the insecurity of many of these applications. How can organizations ensure that their applications are secure and avoid the cost, public-relations fallout, and stock-price downturn that come with issuing numerous security patches? No executive wants to explain to consumers and regulators that code defects allowed attackers to steal sensitive, and perhaps regulated, information.

The imperative for creating secure code has never been greater, given the widespread adoption of new technologies such as web services and rich Internet applications, and the need to ensure the integrity of legacy, production, and mid-development applications in a network-oriented world. Companies also continue to integrate their systems with those of business partners to speed the exchange of information. For all of these reasons, organizations must ensure that code is secure to protect data privacy, preserve customer loyalty, safeguard sensitive information, and maintain operational integrity.

A single software flaw can lead to a massive data breach. One of the most recent (and, some would argue, the largest single breach ever) hit Sony Network Entertainment America in April 2011, leaving the personal data of 77 million users at risk. The ramifications of this type of breach will be felt for a long time, and estimates put the costs at a conservative $50 million (C. Edwards and M. Riley, "Sony Data Breach Exposes Users to Years of Identity-Theft Risk," Bloomberg, 3 May 2011).

Numerous studies have found that catching and fixing code flaws earlier in the software development lifecycle costs significantly less than discovering them after deployment. Tracing the cost of just one bug that leads to a data breach quickly demonstrates the compounding cost of missing that single vulnerability. Studies in the quality space have supported this reality for many years, and the impact of security flaws multiplies the cost of the damage.

For many software products, quality is one of the most important ingredients of a profitable and sustainable solution (though I have certainly used many successful applications of dubious quality). The pursuit of software quality has given rise to a whole cottage industry built around improving the applications we write. One can't expect high-quality applications without first establishing the means of measuring, testing, and managing that quality, as well as the robustness of the process used to build it. You will similarly make little progress with security without addressing it in the same manner.

Seriously addressing secure application development requires three essential elements:

  • Consistency: When developing code, developers must follow consistent processes and policies within a culture of improved security. It is not enough to have a set of policies without a way to ensure they are applied consistently.
  • Completeness: When it comes to dangerous vulnerabilities, large-scale design flaws typically cause more damage than individual coding errors. Fixing individual vulnerabilities has little effect if confidential data is left unencrypted, if authentication is weak, or if open backdoors exist in an application.
  • Priority: When reviewing existing code, developers must identify the vulnerabilities in the code and remediate the greatest risks first. It does little for the overall security of an application to fix the hardest-to-find issues that are least likely to be exploited; focus first on those that are easiest to find and most likely to occur. The best way to do this is to prioritize the issues by the likelihood and criticality of compromise, as in the sketch following this list.

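To make that prioritization concrete, here is a minimal sketch in Python that ranks findings by a simple likelihood-times-criticality score. The finding names and the 1-to-5 scales are hypothetical, chosen only for illustration; a real program would feed these scores from its own threat model or a scoring scheme such as CVSS.

```python
# Minimal sketch of risk-based prioritization (illustrative only; the
# finding names and the 1-5 scoring scale are hypothetical).
from dataclasses import dataclass

@dataclass
class Finding:
    name: str
    likelihood: int   # 1 (rare) .. 5 (almost certain)
    criticality: int  # 1 (minor) .. 5 (catastrophic)

    @property
    def risk(self) -> int:
        # Simple risk score: likelihood x criticality
        return self.likelihood * self.criticality

findings = [
    Finding("SQL injection in login form", likelihood=5, criticality=5),
    Finding("Verbose error message leaks stack trace", likelihood=4, criticality=2),
    Finding("Theoretical timing side channel", likelihood=1, criticality=3),
]

# Remediate the highest-risk, most-likely-to-be-exploited issues first.
for f in sorted(findings, key=lambda f: f.risk, reverse=True):
    print(f"{f.risk:2d}  {f.name}")
```
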
Spotting these errors is not simply a matter of better defining the need for security in the development process; it requires looking at all the places in the code where design flaws might exist. These places are typically far more numerous than even advanced developers realize. Properly analyzing source code will often take testers to places they did not expect to go.

Commonly used approaches include manual code reviews and ethical hacking. While both are useful, neither is sufficient to cope with the breadth of existing and potential design errors, and neither can ensure that the code is secure.
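
As one concrete example of what a reviewer can skim past, the hypothetical snippet below builds a SQL query through string interpolation, the classic injection pattern. An automated source-code analyzer flags this construction reliably; a human reading thousands of lines may not. The table, columns, and helper functions are invented purely for illustration.

```python
import sqlite3

def find_user_unsafe(conn: sqlite3.Connection, username: str):
    # Vulnerable: the username is interpolated directly into the SQL text,
    # so input such as "' OR '1'='1" changes the meaning of the query.
    query = f"SELECT id, email FROM users WHERE name = '{username}'"
    return conn.execute(query).fetchall()

def find_user_safe(conn: sqlite3.Connection, username: str):
    # Safer: a parameterized query keeps the input as data, not SQL.
    return conn.execute(
        "SELECT id, email FROM users WHERE name = ?", (username,)
    ).fetchall()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (id INTEGER, name TEXT, email TEXT)")
    conn.execute("INSERT INTO users VALUES (1, 'alice', 'alice@example.com')")
    # The classic injection payload returns every row from the unsafe version
    # and nothing from the parameterized one.
    print(find_user_unsafe(conn, "' OR '1'='1"))
    print(find_user_safe(conn, "' OR '1'='1"))
```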

The most effective approach for companies developing their own software is to automate as much of the secure development process as possible, not only to identify existing vulnerabilities but, more importantly, to ensure consistency and prevent new issues from being introduced, in accordance with the policies and security requirements of the business.
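
What that automation might look like in miniature: the sketch below is a toy pre-commit gate that scans the files passed to it for a couple of banned patterns and fails when it finds one. It illustrates only the principle of encoding policy as an automated check; real static-analysis tools are vastly more capable, and the patterns and policy names here are hypothetical.

```python
import re
import sys
from pathlib import Path

# Toy policy: patterns the organization has decided must never ship.
# (Illustrative only; a real gate would invoke a proper static analyzer.)
BANNED = {
    "hard-coded password": re.compile(r"password\s*=\s*['\"].+['\"]", re.IGNORECASE),
    "string-built SQL": re.compile(r"['\"]SELECT\b.*['\"]\s*\+", re.IGNORECASE),
}

def scan(paths):
    """Return (path, policy label) pairs for every banned pattern found."""
    failures = []
    for path in paths:
        text = Path(path).read_text(errors="ignore")
        for label, pattern in BANNED.items():
            if pattern.search(text):
                failures.append((path, label))
    return failures

if __name__ == "__main__":
    problems = scan(sys.argv[1:])
    for path, label in problems:
        print(f"{path}: {label}")
    # Non-zero exit blocks the commit or build when a violation is found.
    sys.exit(1 if problems else 0)
```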

Software development teams must treat every application they commission, create, or assess with security skepticism. This attitude represents a marked shift away from the traditional development approach, which is to analyze an application based on its speed, feature set, or ease of use (Biscick-Lockwood, Bar, "The Benefits of Adopting IEEE P1074-2005," April 2, 2006).

Ensuring consistency, completeness, and priority requires people, processes and technology. Source code vulnerability testing tools alone do not make software secure, but such tools help developers and code reviewers assess applications, even those with many millions of lines of code, to identify, prioritize and remediate the most damaging vulnerabilities.

Creating secure applications demands that organizations produce secure code and follow up with ongoing vulnerability testing. Make secure code an exit requirement for any application before allowing it to be released. The swift upsurge of targeted threats should make mandatory application-security vulnerability testing a primary focus within every enterprise.
