OAuth Client Broker Tooling

OAuth handshakes are geared towards a client application driven by a user

In terms of enterprise OAuth tooling, a lot of focus is given to OAuth-enabling APIs exposed by the enterprise itself. Naturally, the demand for this reflects today’s reality, in which the enterprise increasingly plays the role of API provider. However, many enterprise integration use cases involving cloud-based services put the enterprise in the role of API consumer, rather than provider. And as both the number of enterprise applications consuming these external APIs and the number of such external APIs grow, point-to-point OAuth handshakes become problematic.

Another challenge in consuming these external APIs is that OAuth handshakes are geared towards a client application driven by a user. The protocol involves redirecting that user to the API provider in order to authenticate and express authorization. Many enterprise integration (EI) applications do not function this way. Instead, they follow a machine-to-machine transaction pattern, operating at runtime without being driven by a user. Wouldn’t it be great if these EI apps could benefit from the OAuth capabilities of the APIs while still operating in headless mode? The so-called ‘two-legged’ OAuth pattern provides a workaround for this challenge, but it requires the client app to hold resource owner credentials, which is problematic, especially when those credentials are replicated across every client app.
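To make that problem concrete, here is a minimal sketch of one common headless variant, the OAuth 2.0 resource owner password credentials grant, in which the client trades the user’s credentials directly for a token. The token URL, the credentials, and the use of Python’s requests library are illustrative assumptions, not details from the original setup:

```python
# A minimal sketch of a headless, 'two-legged' style token request via the
# OAuth 2.0 resource owner password credentials grant. All URLs and values
# below are hypothetical placeholders.
import requests

TOKEN_URL = "https://login.example.com/oauth2/token"  # hypothetical endpoint

resp = requests.post(TOKEN_URL, data={
    "grant_type": "password",
    "client_id": "CONSUMER_KEY",
    "client_secret": "CONSUMER_SECRET",
    "username": "resource.owner@example.com",  # resource owner credentials...
    "password": "resource-owner-password",     # ...replicated in every EI app
})
resp.raise_for_status()
token = resp.json()["access_token"]
```

The snippet shows exactly where the pain lives: every EI app carries its own copy of the resource owner’s credentials, which is the replication problem described above.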

To illustrate how an enterprise API management solution can help manage this challenge, I demonstrate OAuth tooling geared towards brokering a client-side OAuth session with the Salesforce API using the Layer 7 Gateway. By proxying the Salesforce API at the perimeter using the Layer 7 Gateway, my EI apps do not have to worry about the API provider’s OAuth handshake. Instead, these EI apps can be authenticated and authorized locally using the enterprise identity solution of choice, and the Layer 7 Gateway manages the OAuth session on their behalf. The benefits of this outbound API proxy are numerous. First, the OAuth handshake is completely abstracted out of the EI apps. In addition, the enterprise now has an easy way to control which applications and enterprise identities can consume the external API, to limit rates of consumption, and to monitor usage over time. The API itself can also be abstracted: the proxy can transform API calls at runtime to shield the consuming apps from version changes on the hosted API side.
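From the EI app’s point of view, consuming the proxied API then reduces to a plain, locally authenticated HTTP call. A minimal sketch, assuming a hypothetical gateway host and HTTP Basic authentication against the enterprise identity store fronted by the gateway:

```python
# Sketch: the EI app calls the gateway's proxy endpoint with its local
# enterprise credentials; no OAuth handshake happens in the app itself.
# Host name, path, and credentials are hypothetical.
import requests

resp = requests.get(
    "https://gateway.example.corp/salesforce/services/data/",  # gateway proxy
    auth=("ei-app-account", "ei-app-password"),  # enterprise identity, not OAuth
)
resp.raise_for_status()
print(resp.json())
```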

To set this up on the Layer 7 Gateway, you first need to register a Remote Access application in your Salesforce instance. Log into Salesforce and navigate to Setup -> App Setup -> Develop -> Remote Access, then define your remote access application. The callback URL must match the URL configured by the administrator on the Layer 7 Gateway at setup time. Make sure you note the Consumer Key and Consumer Secret; these values will be needed by your Layer 7 OAuth broker setup policy during the OAuth handshake.
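For reference, the authorization request the broker eventually issues looks roughly like the following, assuming the standard OAuth 2.0 web server flow at login.salesforce.com; the client_id is the Consumer Key, and the redirect_uri must match the callback URL registered above (the gateway host shown is hypothetical):

```python
# Sketch of the authorization URL a broker would redirect the administrator
# to, assuming the OAuth 2.0 web server flow. Values are placeholders.
from urllib.parse import urlencode

params = {
    "response_type": "code",
    "client_id": "CONSUMER_KEY_FROM_REMOTE_ACCESS",  # Consumer Key
    "redirect_uri": "https://gateway.example.corp/oauth/callback",  # must match
}
auth_url = ("https://login.salesforce.com/services/oauth2/authorize?"
            + urlencode(params))
print(auth_url)  # the browser is redirected here to authenticate and authorize
```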

Using the Layer 7 Policy Manager, you publish your broker setup policies to manage the OAuth handshake between the Gateway and your Salesforce instance. Note that the OAuth callback handler must listen at a URL matching the one defined in Salesforce. These policies use the Consumer Key and Consumer Secret associated with the registered remote access in your Salesforce instance; the secret should be stored in the Gateway’s secure password store for added security. Templates from Layer 7 simplify the process of setting up these policies.
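The callback-handling policy’s core job is to exchange the authorization code returned by Salesforce for an access token. Here is a minimal sketch of that exchange against Salesforce’s standard token endpoint; in the actual policy the Consumer Secret would come from the secure password store rather than being hard-coded:

```python
# Sketch of the authorization-code-for-token exchange a broker performs.
# The redirect_uri must match the callback URL registered in Salesforce.
import requests

def handle_callback(code: str) -> dict:
    resp = requests.post("https://login.salesforce.com/services/oauth2/token", data={
        "grant_type": "authorization_code",
        "code": code,
        "client_id": "CONSUMER_KEY",
        "client_secret": "CONSUMER_SECRET",  # from the secure password store
        "redirect_uri": "https://gateway.example.corp/oauth/callback",
    })
    resp.raise_for_status()
    return resp.json()  # contains the access token (and a refresh token)
```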

Once these two policies are in place, you are ready to initiate the OAuth handshake between the Layer 7 Gateway and the Salesforce instance. Using your favorite browser, navigate to the entry point defined in the admin policy above and click the ‘Reset Handshake’ button. This redirects you to your Salesforce instance. If you do not have a session in place in this browser, you are asked to authenticate to the instance, then to authorize the client app (in this case, your Layer 7 Gateway). Finally, you are redirected back to the Layer 7 Gateway admin policy, which now shows the current OAuth handshake in place. The admin policy stores the OAuth access token so that it can be used by the API proxy at runtime.
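Salesforce access tokens eventually expire, so a broker would typically also use the refresh token returned during the handshake to renew the session without repeating the browser-driven steps. The article does not detail refresh handling, so treat the following as an assumption about how the stored session stays valid:

```python
# Sketch: renewing the stored access token with the refresh token, using
# Salesforce's standard token endpoint. Credentials are placeholders.
import requests

def refresh_access_token(refresh_token: str) -> str:
    resp = requests.post("https://login.salesforce.com/services/oauth2/token", data={
        "grant_type": "refresh_token",
        "refresh_token": refresh_token,
        "client_id": "CONSUMER_KEY",
        "client_secret": "CONSUMER_SECRET",  # from the secure password store
    })
    resp.raise_for_status()
    return resp.json()["access_token"]  # replaces the stored token
```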

Your Layer 7 Gateway is now ready to act as an OAuth broker for your EI apps consuming the Salesforce API. You can publish a simple policy to act as this proxy. This policy should authenticate and authorize the EI app and inject the stored OAuth access token on the way out. Note that this policy can be enhanced to perform additional tasks such as transformation, rate limiting, caching, etc.
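As a rough illustration of what such a proxy policy does (not the Layer 7 policy language itself), here is a toy Flask stand-in: it authenticates the EI app locally, then forwards the call with the stored OAuth token injected. The instance URL, credentials, and in-memory token store are all placeholders:

```python
# Toy stand-in for the gateway's proxy policy: local authN/authZ on the way
# in, OAuth token injection on the way out. Illustrative only.
import requests
from flask import Flask, Response, abort, request

app = Flask(__name__)
STORED_ACCESS_TOKEN = "token-from-broker-handshake"  # lives in the gateway
EI_APP_CREDENTIALS = {"ei-app-account": "ei-app-password"}

@app.route("/salesforce/<path:path>", methods=["GET", "POST"])
def proxy(path):
    auth = request.authorization
    if not auth or EI_APP_CREDENTIALS.get(auth.username) != auth.password:
        abort(401)  # authenticate the EI app against local credentials
    upstream = requests.request(
        request.method,
        f"https://na1.salesforce.com/{path}",  # instance URL varies
        headers={"Authorization": f"Bearer {STORED_ACCESS_TOKEN}"},
        data=request.get_data(),
    )
    return Response(upstream.content, status=upstream.status_code)
```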

Although this use case focuses on the Salesforce API, the pattern is generally applicable to any external API you consume. You can maintain an OAuth session for each API you want to proxy through this Gateway, as well as perform identity mapping for other external access control mechanisms, such as AWS HMAC signatures.
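For instance, the proxy could re-sign an outbound call with an AWS-style HMAC signature on behalf of a locally authenticated app. The sketch below follows the general shape of AWS Signature Version 2 (a base64-encoded HMAC-SHA256 over a canonical string to sign); a production implementation would follow the target service’s exact canonicalization rules:

```python
# Sketch of AWS-style HMAC request signing: base64(HMAC-SHA256(secret, msg)).
# The string-to-sign shown is a simplified placeholder.
import base64
import hashlib
import hmac

def hmac_signature(secret_key: str, string_to_sign: str) -> str:
    digest = hmac.new(secret_key.encode(), string_to_sign.encode(),
                      hashlib.sha256).digest()
    return base64.b64encode(digest).decode()

print(hmac_signature("AWS_SECRET_KEY", "GET\nhost.example.com\n/\nparams"))
```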

More Stories By Francois Lascelles

As Layer 7’s Chief Architect, Francois Lascelles guides the solutions architecture team and aligns product evolution with field trends. Francois joined Layer 7 in the company’s infancy – contributing as the first developer and designing the foundation of Layer 7’s Gateway technology. Now in a field-facing role, Francois helps enterprise architects apply the latest standards and patterns. Francois is a regular blogger and speaker and is also co-author of Service-Oriented Infrastructure: On-Premise and in the Cloud, published by Prentice Hall. Francois holds a Bachelor of Engineering degree from Ecole Polytechnique de Montreal and a black belt in OAuth. Follow Francois on Twitter: @flascelles
