Developers have been rapped in some circles for writing code with security flaws, but is such criticism justified?
Where is security on developers’ priority list?
Programmers certainly have a lot on their plates, and while security has been a burning issue in recent years, it hasn’t been a top priority for developers.
A survey of more than 200 developers conducted a few years ago identified half a dozen priorities. In order of importance, they were: functions and features as specified or envisioned, performance, usability, uptime, maintainability and, at the bottom of the list, security.
Less help from the quality assurance department
Without a doubt, security can get in the way of some of those priorities, which is why developers grumble about it. However, developers have had to adjust to similar shifts in the past. For example, Quality Assurance used to generate debates about the right ratio of developers to test engineers. Today that’s less of an issue because every developer takes on responsibility for testing and builds unit tests whenever they add new features and functionality. QA testers haven’t totally disappeared, but there are fewer of them than there used to be, with most of them performing manual tests that are difficult to automate. The same thing has to happen with application security: it needs to be embedded into the workflow, where in the long run it can help productivity, not hurt it.
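To make the testing analogy concrete, here is a minimal sketch of the kind of unit test a developer would add alongside a new feature; the feature itself (a hypothetical username-normalizing function) is invented for illustration:

```python
import unittest


def normalize_username(raw: str) -> str:
    """Hypothetical new feature: trim whitespace and lowercase a username."""
    return raw.strip().lower()


class TestNormalizeUsername(unittest.TestCase):
    # The test ships in the same change as the feature, so regressions
    # are caught by the developer rather than a separate QA team.
    def test_strips_whitespace_and_lowercases(self):
        self.assertEqual(normalize_username("  Alice "), "alice")

    def test_already_normalized_input_is_unchanged(self):
        self.assertEqual(normalize_username("bob"), "bob")


if __name__ == "__main__":
    unittest.main()
```

The point of the article’s argument is the workflow, not the test itself: just as tests moved into the developer’s change set, security checks can move there too.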
Most developers don’t know what secure code looks like
Although there may be some resistance from developers to expanding their roles in securing software, most want to write secure code, but many don’t know what that means. They know some basics: validating input, checking for buffer overflows, encrypting data in transit and limiting privileges. But many aren’t equipped to address more advanced problems: authentication weaknesses, application logic flaws and advanced input validation.
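As an illustration of one of those basics, here is a hedged sketch of allow-list input validation; the field, pattern and length limits are assumptions chosen for the example, not a universal rule:

```python
import re

# Allow-list validation: accept only the inputs we explicitly expect,
# rather than trying to enumerate every dangerous input (a deny-list).
# Pattern and 3-32 length bounds are illustrative assumptions.
USERNAME_RE = re.compile(r"[A-Za-z0-9_]{3,32}")


def is_valid_username(value: str) -> bool:
    """Return True only for 3-32 characters of letters, digits or underscore."""
    return bool(USERNAME_RE.fullmatch(value))
```

A quick check shows why the allow-list approach matters: `is_valid_username("alice_01")` passes, while inputs such as `"alice; DROP TABLE users"` are rejected without the code ever needing to anticipate that specific attack.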
Security is a marginal topic in higher-ed curricula
That knowledge gap is not surprising because developers haven’t received a lot of training about writing secure code. Just as security isn’t high on a developer’s list of priorities, teaching students how to incorporate security thinking and awareness into code design, development and testing hasn’t been high on the priority list of universities either. In a recent study of cybersecurity education at the top 36 computer science programs in the United States, researchers found that none of the top 10 programs require a cybersecurity course for graduation and three of them don’t even offer an elective course in cybersecurity.
Security tools are a major frustration point for developers
If a code warrior doesn’t know they’re introducing security flaws into their code, there’s a tendency to believe that what they’re producing is secure. That’s especially true when reports from the security team contain vulnerabilities that aren’t vulnerabilities at all but false positives. So if an organization expects its developers to buy in to taking greater responsibility for security, it needs to make sure it has good tools: vulnerability scanners, source code analyzers and savvy application security SMEs to educate developers, without prejudice, that their code needs improving.
Tools can be another pain point for developers who want to produce secure code. It’s not uncommon to hear developers complain that the tools they have lack the sophistication needed to identify security risks and fix them. On the other hand, many developers aren’t willing to spend the time necessary to tweak those tools to get more out of them with less pain.
Mixed messages received by developers
Developers have also been sent mixed signals about their role in producing secure code. The security industry often touts new products as alternatives to secure coding. That was the case with Web Application Firewalls (WAFs) and Runtime Application Self-Protection (RASP). The pitch for WAFs was that they could stop attackers before they could exploit flaws in an application’s code. Theoretically, that diminishes the risk created by insecure code and relieves the pressure on developers to write flawless code. In reality, though, hackers found ways to defeat WAFs, so producing secure code remains important.
In the same vein, RASP is being sold as the answer to flawed programming. It’s designed to see into applications and shut them down if they misbehave. That’s fine as a temporary fix, but to get the app running again, whatever’s wrong with it still needs to be fixed. RASP can reduce the risks created by insecure code, though it’s limited in the classes of vulnerabilities it can protect against, and it won’t make an application as secure as it could be if its code were written with security in mind during the design and build phases.
No matter how sophisticated the tools get, they will not run themselves; they need engineers with expert security skills to run them.
Developers can also be sent mixed signals about writing secure code by their own organizations. Executive buy-in to secure coding is as important as getting the developers themselves to embrace the concept. Unless management understands the value of secure coding and conveys its support through things like training and the purchase of state-of-the-art tools, any effort to improve the security practices of coders is unlikely to gain traction.