
Who Is Responsible for Your Application's Security?

Justin Blanchard

With the advent of DevOps and cloud infrastructure, developers are front and center where security is concerned. But should they be?

The dividing line between developers and IT operations used to be distinct. Developers were responsible for adding new features securely, but it was IT operations who had responsibility for infrastructure and network security. For the most part, developers didn't have to think too much about the wider security context.

With the advent of the cloud and of DevOps, things changed radically. Now developers are quite capable of deploying cloud infrastructure at will, and the tools exist to make infrastructure deployment programmatic. These changes mean that developers do have to think about security, both within the scope of the code they are responsible for and within the wider organizational scope.

Developers face a different set of pressures than IT operations teams do. For the IT department, security and availability are core goals, and IT measures its success by how well it achieves them.

The situation is different for developers. Although developers now have responsibility for security, in many organizations their goals and incentives are not aligned with security best practices.

A developer may be tasked with creating a feature for an application, and even with overseeing testing, staging, and production infrastructure to support that feature. Operational security falls to the developer, but if pressure is on the developer to get the feature to market as quickly as possible, rather than as securely as possible, there's clear scope for problems to arise.

Forbes contributor Tom Gillis sees the solution in removing security from developers' remit altogether:

"As I see the world evolving, I believe IT needs will drive us back to a paradigm where security controls are independent of developer activity. There's a strong appetite on the part of customers to have a set of controls that are managed independently of developers and operations. I think that's a good thing."

Developers would be free to operate within a set of constraints: they can deploy servers, but the data on those servers is always encrypted, for example — the developer doesn't have to care about specific security implementations because that is handled for her.
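This "guardrails" pattern is often implemented as centrally managed policy that developers cannot override. As a minimal sketch, assuming AWS S3 (the bucket name here is hypothetical), a bucket policy like the following denies any object upload that does not request server-side encryption, so the encryption constraint is enforced for developers rather than by them:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "DenyUnencryptedUploads",
      "Effect": "Deny",
      "Principal": "*",
      "Action": "s3:PutObject",
      "Resource": "arn:aws:s3:::example-app-bucket/*",
      "Condition": {
        "Null": { "s3:x-amz-server-side-encryption": "true" }
      }
    }
  ]
}
```

The `Null` condition matches when the encryption header is absent from the request, so unencrypted uploads are rejected regardless of which team or pipeline performs them; a security group can manage such policies independently of day-to-day development work.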

An alternative approach would be to recognize that developers are now part of the security process. A business's security process and incentives should be designed such that developers are incentivized to put security first, not as a secondary concern.

Of course, such a thoroughgoing change necessitates management buy-in and organizational changes to implement incentives and development processes that optimize for security, rather than speed of development.

As Mandy Huth points out:

"senior leadership needs to be asking the question, "what security gaps exist in our product and how do we proactively address them, so we don't have to deal with redress after the fact?"

Whichever approach eventually comes to dominate, there's no denying that the relationship of developers to application and infrastructure security has changed, and the organizations that employ them need to change too.

By Justin Blanchard, Chief Marketing Officer at Server Mania

Comments

Management incentives as well Todd Knarr  –  Jun 14, 2016 12:01 PM PST

I think you also have to look at the goals and incentives of management. Most often, management doesn't prioritize security because the organization faces no real penalty for security failures.

In the US, when a data breach exposes credit-card information, for example, the majority of the time the only liability for the company is paying for a year of identity monitoring for those customers whose card numbers were exposed, and (with one exception that I know of) the most it'll be liable for is the amount of purchases that can be proven to be fraudulent, with the burden of proof on the cardholder.

I don't think we'll see the management buy-in and attitude change needed until organizations face having to pay the full costs (direct and indirect) of all parties affected by the breach (the cardholders, the merchants, and the banks), unless the organization can show that all best practices were employed in the most effective way. For example, full-database encryption is a best practice, but it doesn't protect against a compromise of the application server itself, so its use alone shouldn't get an organization out of liability when the application server is compromised.

