
Thursday 6 December 2012

Find the Weakest Software Link Before Hackers Do

By Judy Selby

You're way ahead of the game, right? After all, your company has established policies and procedures mandating how to handle internal and third-party confidential data. You've kept track of all regulatory changes and updated your policies accordingly. You require encryption, strong passwords, and the use of firewalls. You conduct regular training sessions with your employees and — using the latest front-page stories of data breaches and their resultant business interruptions, lawsuits, fines, and reputational damage — you have sufficiently terrified your employees into compliance. But have you thought of everything? Not if your risk management plan doesn't include vendor-supplied software and applications.
Many businesses would be hard-pressed even to count the vendors from whom they have purchased software; the number is usually in the hundreds, and for very large enterprises it can exceed 20,000. Software is used for everything from payroll and accounting to email, human resources, records, and document management. Approximately 65 percent of enterprise applications are sourced externally, and 70 percent of applications developed in-house contain components licensed from vendors. Unfortunately, as noted in PwC's 2012 Security Report, up to 80 percent of vendor-supplied software and applications fail basic tests for security compliance. And the most commonly identified security vulnerabilities are among the most dangerous. Veracode Inc.'s November 2012 State of Software Security Report notes that four of the top five flaw categories detected in third-party web applications appear on the Open Web Application Security Project (OWASP) Top 10 list of most dangerous flaws, and that SQL injection, the vulnerability exploited in many of the most prominent data breaches, including those at LinkedIn and Yahoo, was found in 40 percent of those applications. Businesses that use or incorporate vendor-supplied software are therefore taking the bad with the good. Every outsourced application represents a data breach risk because of the potential for serious security flaws.
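To make the SQL injection risk concrete, the minimal sketch below shows, in simplified form, how unsanitized input can rewrite a database query and how a parameterized query closes that hole. It is an illustrative example only; the table, credentials, and login functions are hypothetical and are not drawn from any of the breaches mentioned above.
```python
import sqlite3

# Hypothetical in-memory database for illustration only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (username TEXT, password TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'secret')")

def login_vulnerable(username, password):
    # UNSAFE: user input is concatenated directly into the SQL statement.
    # Supplying  ' OR '1'='1  as the password makes the WHERE clause
    # always true and bypasses the password check entirely.
    query = ("SELECT * FROM users WHERE username = '" + username +
             "' AND password = '" + password + "'")
    return conn.execute(query).fetchone() is not None

def login_safe(username, password):
    # SAFER: a parameterized query; the driver treats input strictly as data.
    query = "SELECT * FROM users WHERE username = ? AND password = ?"
    return conn.execute(query, (username, password)).fetchone() is not None

print(login_vulnerable("alice", "' OR '1'='1"))  # True  -- injection succeeds
print(login_safe("alice", "' OR '1'='1"))        # False -- injection blocked
```
Automated, code-level assessments of the kind discussed later in this article are designed to flag exactly this class of flaw before the software is put into production.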
For software producers, security concerns may take a back seat to creating the next great cutting-edge product and getting it to market ahead of the competition. Programming mistakes that create security vulnerabilities may not be exposed until after the software or application is purchased and put to use within a business enterprise. The prevalence of "bring your own device," or BYOD, policies at work further compounds the risk. More and more employees want the convenience of using their favorite personal device to conduct company business. Those same devices are also used to download myriad apps from unknown sources, many of which are insecure at best or malicious at worst, giving hackers easy points of access to your confidential company data. Given this environment, is it any surprise that a large number of data breaches originate from vendor-supplied software?
Companies can and often do employ procedures such as penetration testing to determine whether security holes stem from vendor-supplied software or other sources. Such testing, although required under the Payment Card Industry Data Security Standard, is somewhat subjective in the selection of specific exploits to run, which can lead to an ad hoc approach and a false sense of confidence in the results. And while companies tend to focus their testing on their most important systems, hackers tend to attack the weakest link in the chain. Penetration testing may also lack the efficiency and scalability needed to cover the sheer number of software products and applications in most corporate inventories; across-the-board assessments simply may not be practical.
Ideally, your company's software procurement procedures should require vendors to submit to code-level application assessments, both before and after purchase, to ensure compliance with company policies and procedures. Surprisingly few companies address this issue with potential suppliers, even though that is probably when the company has the most leverage to ensure compliance. Verification of software security is also often overlooked during merger and acquisition due diligence. Requiring vendors and other third parties to submit to meaningful security testing before your company acquires their software, and building ongoing testing requirements into your purchase agreement for the software's entire lifecycle, should be seriously considered.
But what about software your company is already using? Post-acquisition requests for vendors to verify compliance with company policies are surprisingly rare; it is estimated that fewer than one in five businesses request code-level security tests from even one software vendor. When companies do raise security compliance issues with their vendors, they often rely on self-reporting techniques such as vendor surveys, and for obvious reasons vendors may be reluctant to expose security weaknesses in their own code. Effective security testing of vendor-sourced software by your company's IT department is also unlikely, given frequently strained IT budgets and resources and the department's lack of access to vendor source code.
Consequently, independent verification of software security throughout your business supply chain may be the safest course of action, and establishing a formal, automated security testing program for vendor-supplied software should be considered. The first step is to establish your company's policy, including the type and frequency of testing to be used and the goals you want to achieve. Once your policy is in place, you will need to communicate your requirements to your vendors and mandate their cooperation. Providing a detailed explanation of the program and working closely with your software vendors throughout the process should help ensure cooperation and expedite achievement of your company's goals.
Companies can also consider using a security specialty firm to handle the vendor software security testing process for them. Veracode, for example, has developed a cloud-based Vendor Application Security Testing program to test security compliance for both traditional and web-based applications. The program validates the quality of the software and provides high-level summary results to the company, a detailed report to the software vendor (including guidance on how to remediate any deficiencies discovered), and confirmation of the security of the tested software. According to Chris Eng, vice president of research at Veracode, the most important goal at the outset is to establish a practice of automated security testing across a business' software portfolio, and then to ramp up the rigor of the testing in a pragmatic way as the company and its vendors grow more comfortable with the process.
As both federal and state regulations concerning the protection of confidential data continue to proliferate and evolve (see Baker & Hostetler's State Data Breach Statute Form), it has become increasingly important to ensure that your business' software supply chain is secure. Notably, in the event of post-breach litigation, a business' failure to verify the security of its vendor-supplied software could call into question whether it employed reasonable measures to secure confidential data. Further, certain laws will continue to impose liability on the owner of the lost data, even when the vulnerability originated in vendor-supplied software.
As with most problems, awareness of the security issues with vendor-supplied software is the first step toward correction. Putting in place a policy to conduct automated pre- and post-acquisition security testing of outside software should be seriously considered as part of every business' data security policies and procedures.

Judy Selby is a partner at Baker & Hostetler. Email: jselby@bakerlaw.com.
