Finally, the Killer PKI Application

Web Services as an application - and a challenge

Enterprise PKI has a bad name. Complex, costly, difficult to deploy and maintain - all these criticisms have dogged this technology since it first appeared. To the dismay of so many CIOs, few applications have stepped up to make effective use of PKI. But this may soon change: Web services promotes a security model that demands the flexibility that an enterprise PKI deployment can offer.

The Trend Away from Channel-Level Security
If you lumped all the existing, production-level Web services applications together, and categorized their security models, you would probably discover some interesting trends. First, an awful lot of these don't address security at all, which probably owes more to the relative immaturity of Web services technology than to a conscious choice on the part of developers. The bulk of the remainder will simply delegate security entirely to SSL - or in some cases, a VPN connection.

SSL isn't a bad choice. It provides confidentiality and integrity. Automatic sequence numbering stands guard against replay attacks. Servers are always authenticated using a certificate that binds the server's DNS name to the Subject, a strategy to defeat man-in-the-middle and impersonation attacks. This does rely heavily on the integrity of the DNS system, but by and large it is viewed as an acceptable risk. SSL even offers optional client-side certificate authentication, which is powerful, though in practice rarely implemented.
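As a point of reference, here is a minimal sketch of that server-authentication step from a client's perspective, using Python's standard library; the host name is only a placeholder.

```python
import socket
import ssl

# A default context loads the platform's trust store and enables both
# certificate verification and hostname checking.
context = ssl.create_default_context()
context.check_hostname = True
context.verify_mode = ssl.CERT_REQUIRED

with socket.create_connection(("www.example.com", 443)) as sock:
    with context.wrap_socket(sock, server_hostname="www.example.com") as tls:
        # The handshake succeeds only if the certificate chains to a trusted
        # root and its subject matches the DNS name we asked for.
        print(tls.version())
        print(tls.getpeercert()["subject"])
```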

Probably the most unheralded quality of SSL is channel continuity. Once a session is set up - and once the client and server mutually authenticate (with the client using a certificate under SSL, through HTTP authentication, or an application-level means such as forms) - a level of trust is established on the open socket so that it is available for multiple transactions without repeating this lengthy process each time. There is great value in a transparently maintained security context, and it is easy to take for granted.
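As a rough illustration of that continuity, the sketch below (hypothetical host and paths) issues several requests over a single TLS connection: the handshake and server authentication happen once, and every subsequent request rides on the already-established security context.

```python
import http.client

# One TLS handshake, then the authenticated, encrypted channel is reused
# for as many requests as we like.
conn = http.client.HTTPSConnection("www.example.com")
for path in ("/service/orders", "/service/invoices"):
    conn.request("GET", path)
    response = conn.getresponse()
    body = response.read()  # drain the response before reusing the connection
    print(path, response.status, len(body))
conn.close()
```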

Of course, one of the reasons behind SSL's success on the Web was that, although it utilizes public key cryptography, it doesn't need full-blown PKI. Most SSL-enabled Web servers use certs issued by the "browser cartel," those CAs fortunate enough to have their root certificates automatically installed within the trust store of the most popular browsers. And with the exception of a few early consumer banking products - which have largely been abandoned - almost nobody steps up to the baroque logistics of client-side certificates on the Web. The ability to delegate PKI to a third party greatly simplified security on the Web; this was one of the reasons SSL became good enough for most online transactions, even when challenged in the early days by technically elegant, though complex, solutions like SET (Secure Electronic Transaction).

But SSL's greatest weakness is that it is oriented toward synchronous transactions, requiring a direct connection between participants. It's like an encrypted telephone conversation, which is probably something alien to you and me, but I suppose that James Bond uses it regularly. Both parties need to be available, multiple passes are necessary to set up a secure context, and all of the information - the critical points alongside the mundane ("how's the weather in London?") - is encrypted wholesale, which can be a costly processor burden.

This is why SSL is an insufficient security model for Web services. Despite the name - an unfortunate one that is probably one of the great misnomers in the history of technology - Web services isn't really about the Web. In one realization, it does use existing Web infrastructure, including HTTP transport, Web application servers, etc. However, Web services is fundamentally a one-way messaging paradigm for computer communications, composed around a simple XML message structure with an extensible header model.

Web service messages may not piggyback on HTTP at all. They might flow across a message-oriented middleware (MOM) such as IBM's MQSeries, or be carried asynchronously by that other ubiquitous infrastructure, SMTP. SOAP messages are designed to flow through a network of intermediaries, not unlike IP packets being passed between routers. Intermediaries may be required to view header information to make processing decisions based on the application-level protocol. A channel-based security model, one that encrypts everything and requires synchronous responses from a receiver, simply isn't appropriate in such a Web services architecture.

Security in the Message
The solution to this problem, as put forth in standards by OASIS and the W3C, is to absorb security into the message itself. That is, provide a means of authentication, integrity, and confidentiality that is integral to the message, and completely decoupled from transport channels. Thus, the message security remains consistent and trustworthy whether the message flows over regular HTTP, crosses a P2P network using proprietary protocols, is persisted to a file, or is even printed onto a piece of paper. Ironically, it's closer to the time-honored cryptographic tradition of writing encrypted information into a message and sending it via a messenger than it is to Mr. Bond's fancy, synchronous encrypting telephone. This may strike you as lower tech, but a security model that supports asynchronous messaging has great architectural advantages.

In the OASIS Web Services Security (WSS) standard, each SOAP message stands alone and can have security applied uniquely. The standard includes mechanisms for encrypting any content in the message at a very fine-grained level. For example, rather than applying a cipher to the entire message, only those parts that genuinely need cryptographic protection, such as a credit card number, are encrypted. This means that public parts of a message, such as header fields that might be relevant to an intermediary making a routing decision, can be left in the clear.
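As a rough sketch of that granularity - plain Python standing in for the XML Encryption constructs WSS actually uses, with invented field names - only the sensitive element is enciphered, while the parts an intermediary needs remain readable:

```python
from cryptography.fernet import Fernet  # simple symmetric cipher, for illustration only

message = {
    "header": {"routeTo": "payments.example.com", "messageId": "urn:uuid:1234"},
    "body": {"amount": "99.95", "creditCardNumber": "4111111111111111"},
}

key = Fernet.generate_key()
cipher = Fernet(key)

# Encrypt only the element that needs protection; the routing header and the
# non-sensitive body fields remain in the clear for intermediaries to read.
message["body"]["creditCardNumber"] = cipher.encrypt(
    message["body"]["creditCardNumber"].encode()
).decode()

print(message["header"]["routeTo"])         # still readable
print(message["body"]["creditCardNumber"])  # opaque ciphertext
```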

Of course, any part of a SOAP message is subject to modification by an attacker as it traverses potentially hostile networks. To address this, WSS provides a mechanism to sign message content, with a granularity identical to encryption. Thus, not only can a message author encrypt the credit card element, they can sign it to ensure that no substitution in transit goes undetected. The same protection can be extended to unencrypted, public elements, such as timestamps inserted into the header.

A Role for PKI
WSS goes to great lengths to remain flexible and not to specify a particular encryption/signing technology. It's certainly possible to build a WSS-compliant system based on shared secrets that are exchanged out-of-band of the WSS specification (though this disqualifies a transaction from any claims to nonrepudiation, as well as subjecting it to a toxic list of potential security flaws). Nevertheless, you would be hard-pressed to find a vendor's WSS implementation that isn't based on public key infrastructure. Furthermore, everyone is building systems predicated on having key pairs on both sides of a transaction: at the message producer (client), and the message consumer (server). So PKI is back.

This is good news if you spent a lot of money a few years back on a large, enterprise-wide PKI rollout. It was painful, and the value probably went unrecognized, but now the investment may finally pay off. If you avoided PKI until now, Web services may be the application that forces your organization to swallow this often bitter pill.

The Typical Pattern
To understand why PKI is so essential to the typical WSS implementation, it helps to examine a common interaction model (see Figure 1). A single message is secured for transmission between two parties. This is a sessionless scenario, meaning that there is no prenegotiated, temporary security token shared between the parties. In other words, there is no shared secret between the producer and the consumer, such as a key used for symmetric encryption and HMAC signing. Emerging standards, such as WS-SecureConversation and WS-Trust, provide for negotiated security tokens and define well-known key derivation mechanisms similar to SSL's session key scheme. In this instance, we are illustrating how a message can be secured directly using only the key pair and certificate held by the producer, and the certificate for the consumer.
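To make the setup concrete, the sketch below generates the kind of PKI material the pattern assumes: an RSA key pair and a certificate for each party. It uses Python's cryptography package and self-signed certificates purely for illustration; in a real deployment the certificates would be issued by an enterprise or commercial CA.

```python
import datetime
from cryptography import x509
from cryptography.x509.oid import NameOID
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa

def make_credentials(common_name: str):
    """Generate a key pair and a self-signed certificate for one party."""
    key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    name = x509.Name([x509.NameAttribute(NameOID.COMMON_NAME, common_name)])
    now = datetime.datetime.utcnow()
    cert = (
        x509.CertificateBuilder()
        .subject_name(name)
        .issuer_name(name)                       # self-signed for illustration
        .public_key(key.public_key())
        .serial_number(x509.random_serial_number())
        .not_valid_before(now)
        .not_valid_after(now + datetime.timedelta(days=365))
        .sign(key, hashes.SHA256())
    )
    return key, cert

producer_key, producer_cert = make_credentials("message-producer.example.com")
consumer_key, consumer_cert = make_credentials("message-consumer.example.com")
```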

Figure 2 shows a map of the message exchanged between the producer and the consumer. In this simplified message, the body has been encrypted using a two-step process described in WSS. First, the producer generates a random symmetric key and uses it to encrypt the body content with a symmetric algorithm such as Triple DES or AES. A specialized security header describes the exact algorithm and key length. Note the contrast here with SSL, which supports negotiation of cipher suites and key lengths, largely to accommodate a diversity of clients, any of whom may be subject to cryptography export restrictions. Here we assume a prior, out-of-band agreement on cipher capabilities.

So how does this shared secret become, well, shared? It's pretty simple. The producer encrypts this symmetric key with the consumer's public key, ensuring that only that party can decrypt the message. This encrypted key is then embedded in the security header, with a reference to the key pair needed to unlock it (often, this is implemented using the subject key identifier field from the receiver's certificate). In the security header, anyone can read the encrypted key, but only the designated receiver can decrypt it, and use it to further decipher the message content. Thus, no complex, multipass protocol is required to negotiate a security session key. Each message stands alone.
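A condensed sketch of that two-step mechanism follows, again using Python's cryptography package rather than the XML Encryption syntax WSS specifies; the key sizes, algorithm choices, and inline key generation are illustrative assumptions, not a prescription.

```python
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Stand-in for the consumer's certificate: in practice the producer would
# take this public key from the consumer's X.509 certificate.
consumer_private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
consumer_public_key = consumer_private_key.public_key()

# Step 1: the producer encrypts the body with a freshly generated symmetric key.
body = b"<CreditCardNumber>4111111111111111</CreditCardNumber>"
content_key = AESGCM.generate_key(bit_length=256)
nonce = os.urandom(12)
encrypted_body = AESGCM(content_key).encrypt(nonce, body, None)

# Step 2: the producer wraps the symmetric key with the consumer's public key,
# so only the intended receiver can recover it from the security header.
oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)
wrapped_key = consumer_public_key.encrypt(content_key, oaep)

# Consumer side: unwrap the key, then decrypt the body.
recovered_key = consumer_private_key.decrypt(wrapped_key, oaep)
assert AESGCM(recovered_key).decrypt(nonce, encrypted_body, None) == body
```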

Encryption, however, is only one component of the security story, albeit an important one. As it stands, the encrypted message body is subject to substitution by a malicious party, as are critical header fields such as the timestamp, which is necessary for servers to uniquely identify messages and apply an effective replay defense. Furthermore, the consumer has no means to authenticate the message producer: encryption for a particular receiver does not identify the author - the message could have come from anyone.

To address this shortcoming, our message producer calculates digests of the encrypted message body and the critical header fields, and places these into yet another block in the security header. It then signs this block, aggregating the digested components into a single, simultaneous integrity/origin-authentication statement. The producer includes its certificate (or a reference to it) in the security header so that the receiver can validate the signature and follow the certificate chain to a trust anchor. Now the consumer can have confidence that a specific producer authored the message, that it was not altered in transit, and, most importantly, that it was designated specifically for this consumer.
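The signing step might look something like the following sketch, with the cryptography package standing in for XML Signature; the producer's key pair is generated inline here rather than loaded from a key store, and the digested parts are simple byte strings rather than canonicalized XML.

```python
import hashlib
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

# Producer's key pair; in practice the private key and certificate would come
# from the producer's key store.
producer_private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
producer_public_key = producer_private_key.public_key()

encrypted_body = b"...ciphertext from the encryption step..."
timestamp_header = b"2005-06-01T12:00:00Z"

# Digest each protected part, then sign the aggregated digest block once.
digest_block = (
    hashlib.sha256(encrypted_body).digest() + hashlib.sha256(timestamp_header).digest()
)
pss = padding.PSS(mgf=padding.MGF1(hashes.SHA256()), salt_length=padding.PSS.MAX_LENGTH)
signature = producer_private_key.sign(digest_block, pss, hashes.SHA256())

# Consumer side: recompute the digests and verify the signature with the public
# key taken from the producer's certificate. verify() raises if anything changed.
producer_public_key.verify(signature, digest_block, pss, hashes.SHA256())
```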

What is important to recognize here is that all parties in this transaction have key pairs and certificates. Without PKI, the model doesn't work.

Conclusion
Clearly, Web services is a great opportunity for PKI, but it's also a great challenge. Most vendors' toolkits have a deliberately vague coupling to commercial PKI systems. As always, it's what the standards loosely describe that becomes the source of problems. Interfacing with a particular key store type or location, or coercing servers to check CRLs or use OCSP, can be troublesome. It's best to start proactively, rolling out your PKI system before its services are demanded by a Web services application. And the demand will come. SSL is sufficient for Web-like, client/server applications, but large enterprise computing is built on asynchronous messaging; this is where Web services will shine, and where PKI will become essential.

More Stories By Scott Morrison

K. Scott Morrison is the Chief Technology Officer and Chief Architect at Layer 7 Technologies, where he is leading a team developing the next generation of security infrastructure for cloud computing and SOA. An architect and developer of highly scalable enterprise systems for over 20 years, Scott has extensive experience across industry sectors as diverse as health, travel and transportation, and financial services. He has been a Director of Architecture and Technology at Infowave Software, a leading maker of wireless security and acceleration software for mobile devices, and was a senior architect at IBM. Before shifting to the private sector, Scott was with the world-renowned medical research program of the University of British Columbia, studying neurodegenerative disorders using medical imaging technology.

Scott is a dynamic, entertaining and highly sought-after speaker. His quotes appear regularly in the media, from the New York Times to the Huffington Post and the Register. Scott has published over 50 book chapters, magazine articles, and papers in medical, physics, and engineering journals. His work has been acknowledged in the New England Journal of Medicine, and he has published in journals as diverse as the IEEE Transactions on Nuclear Science, the Journal of Cerebral Blood Flow, and Neurology. He is the co-author of the graduate text Cloud Computing: Principles, Systems and Applications, published by Springer, and is on the editorial board of Springer's new Journal of Cloud Computing: Advances, Systems and Applications (JoCCASA). He co-authored both Java Web Services Unleashed and Professional JMS. Scott is an editor of the WS-I Basic Security Profile (BSP), and is co-author of the original WS-Federation specification. He is a recent co-author of the Cloud Security Alliance's Security Guidance for Critical Areas of Focus in Cloud Computing, and an author of that organization's Top Threats to Cloud Computing research. Scott was recently a featured speaker for the Privacy Commissioner of Canada's public consultation into the privacy implications of cloud computing. He has even lent his expertise to the film and television industry, consulting on a number of features including The X-Files. Scott's current interests are in cloud computing, Web services security, enterprise architecture and secure mobile computing - and of course, his wife and two great kids.

Layer 7 Technologies: http://www.layer7tech.com
Scott's LinkedIn profile.
Twitter: @KScottMorrison
Syscon blog: http://scottmorrison.sys-con.com
