Why bundle encryption and authorization?

Ionic Machina™ is an authorization platform that has been co-engineered with a key management system (KMS). At Ionic, we view encryption as simply one form of authorization enforcement. It’s certainly not the only one, and depending on the use case, perhaps not even the most important one. But bundling encryption and authorization simplifies both development and overall architecture. Authorization decisions backed by a KMS also inherit the confidentiality, integrity, availability, and sheer scale natively enabled by that technology.

For example, in a default installation, Machina manages up to a trillion Data Encryption Keys (DEKs), so you can encrypt each separate piece of data without risking running out of keys. If one of those DEKs is compromised, the blast radius of the breach is very limited, since that DEK was used to encrypt only a very small amount of data.
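As a rough sketch of that pattern (using the open-source Python cryptography package rather than the Machina SDK, with helper names invented for illustration), each record gets its own freshly generated DEK, so a leaked key exposes only the one record it protected:

    # Illustrative only: one DEK per record, using the open-source "cryptography" package.
    # In a Machina deployment, DEKs would be created, wrapped, and managed by the platform
    # rather than generated and held locally like this.
    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    def protect_record(plaintext: bytes) -> dict:
        dek = AESGCM.generate_key(bit_length=256)   # a unique DEK for this record only
        nonce = os.urandom(12)
        ciphertext = AESGCM(dek).encrypt(nonce, plaintext, None)
        # If this single DEK leaks, only this one record is exposed.
        return {"dek": dek, "nonce": nonce, "ciphertext": ciphertext}

    records = [protect_record(r) for r in (b"record-1", b"record-2", b"record-3")]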

This design simplifies fine-grained authorization decisions. When encryption is the enforcement point, an authorization grant means you receive the DEK; if you don’t receive the DEK, you are already locked out of the data, so enforcement is automatic.

It also makes it easy for developers to implement cryptography correctly, even without a crypto background. Instead of going through a series of steps (making an authorization decision in one place, tying that into a KMS lookup to retrieve the DEK, and then decrypting the ciphertext), a developer in a Machina-enabled environment issues a single Machina SDK call:

ciphertext.decrypt()

This call extracts a lookup record for the DEK from the ciphertext, issues an access request for that DEK, and, assuming access is granted, hands the key material to the SDK, which decrypts the ciphertext for the application. Metadata and other attributes relevant to the authorization decision are already attached to the DEK on the server side and are evaluated as part of that decision. Also note that there is no key object the developer has to deal with in this simple example, though for more involved use cases the developer can add, delete, or modify mutable attributes on the DEK, as well as exercise granular control over crypto algorithms and modes.
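For a sense of what that single call is hiding, here is a minimal, self-contained sketch of the same flow. Everything in it (the toy key store, the policy_allows check, the Envelope type) is an invented stand-in, not the Machina SDK API:

    # Self-contained sketch of the flow a single decrypt() call wraps.
    # All names here are invented stand-ins, not the Machina SDK API.
    import os
    from dataclasses import dataclass
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    KEY_STORE = {}  # toy key service: key ID -> key material

    def policy_allows(key_id: str, requester: str) -> bool:
        # Stand-in for Machina's server-side attribute and policy evaluation.
        return requester == "alice"

    def request_key(key_id: str, requester: str) -> bytes:
        if not policy_allows(key_id, requester):
            raise PermissionError("access denied: DEK not released")
        return KEY_STORE[key_id]

    @dataclass
    class Envelope:
        key_id: str       # lookup record carried alongside the ciphertext
        nonce: bytes
        ciphertext: bytes

    def encrypt(plaintext: bytes) -> Envelope:
        key_id = os.urandom(8).hex()
        dek = AESGCM.generate_key(bit_length=256)
        KEY_STORE[key_id] = dek
        nonce = os.urandom(12)
        return Envelope(key_id, nonce, AESGCM(dek).encrypt(nonce, plaintext, None))

    def decrypt(envelope: Envelope, requester: str) -> bytes:
        dek = request_key(envelope.key_id, requester)   # the authorization decision happens here
        return AESGCM(dek).decrypt(envelope.nonce, envelope.ciphertext, None)

    env = encrypt(b"sensitive data")
    print(decrypt(env, "alice"))     # granted: the DEK is released and the data decrypts
    # decrypt(env, "mallory")        # denied: PermissionError -- no DEK, no data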

External Authorization Management (EAM) workflow (diagram): a Client sends an authenticated request (1) to a Machina-integrated Service; the Service exchanges request context (2) with Machina, whose Encryption Key Management, Attribute-Based Access Controls, and Dynamic Policies drive the authorization decision; the Service then fetches the protected Resource, local or remote (4), receives the resource response (5), and returns the resource response to the Client (6).

You can already see some of the architectural simplicity inherent in this approach, since a significant portion of authorization decisions involve access to sensitive data. Architecting encryption and authorization separately (not to mention building in an attribute store to manage attribute-based authorization decisions) exponentially increases the number of calls and layers for decisions where encryption is required to meet risk or compliance mandates. It can also degrade performance.

But what if your use case doesn’t involve encryption at all? In that case you won’t use the DEKs to encrypt anything, but they are still used internally in the decision-making process. For every resource you protect, a DEK is assigned to guard access to it. Metadata and other attributes sit on this DEK, and those attributes are reasoned over by the policy engine that ultimately returns an allow/deny answer to an authorization request.

Let’s say that user Alice logs into a web application that uses Machina as its External Authorization Manager. Alice then requests access to a protected resource, and the web app calls out to Machina for an authorization decision. At this point, Machina has the following pieces of information to make a decision:

  • Environmental attributes (e.g., date and time, IP and/or location the request is coming from, user agent, version, OS)
  • Alice’s user attributes (e.g., risk score from a SIEM or UEBA, group memberships, roles, last time her PIV card was read)
  • Attributes on the protected resource (e.g., metadata, when was it protected, by whom)

The resource attributes are tied to the DEK, so that object is still used internally. The key material itself will likely never be used in this case, but the auxiliary information on the same object will be.
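To make that concrete, here is a minimal, self-contained sketch of an attribute-based allow/deny decision that combines the three attribute sets above. The attribute names and rules are invented for illustration; this is not Machina’s policy language or engine:

    # Illustrative ABAC-style check, not Machina's actual policy engine or schema.
    from datetime import datetime, timezone

    def authorize(env: dict, user: dict, resource: dict) -> bool:
        # Environmental attributes: e.g., the request must arrive during business hours (UTC).
        in_hours = 8 <= env["time"].hour < 18
        # Subject attributes: e.g., a low risk score and membership in an allowed group.
        trusted_user = user["risk_score"] < 50 and "finance" in user["groups"]
        # Resource attributes: carried on the DEK object that guards the resource.
        unrestricted = resource["classification"] != "restricted"
        return in_hours and trusted_user and unrestricted

    decision = authorize(
        env={"time": datetime.now(timezone.utc), "ip": "203.0.113.7"},
        user={"risk_score": 12, "groups": ["finance"], "roles": ["analyst"]},
        resource={"classification": "internal", "protected_by": "bob", "protected_at": "2020-01-15"},
    )
    print("allow" if decision else "deny")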

Even though the key material is not being used for encryption in this example, the transaction is still backed by the confidentiality, integrity, and availability of a highly scalable key management infrastructure. DEKs are stored as AES-GCM ciphertext at rest for confidentiality and replicated synchronously and asynchronously so that no key is ever lost. 

The keys themselves are wrapped and delivered using AES-GCM, and at each authorization point a check can be done to make sure the message received is the one that was sent. If any part of the secured envelope has been modified, the AES-GCM authentication tag fails to verify, indicating tampering; this verification is what provides the integrity guarantee.
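As a small illustration of that property (using the open-source Python cryptography package, not Machina’s wrapping code), modifying even a single bit of an AES-GCM envelope makes decryption fail with an authentication error:

    # Tamper detection with AES-GCM: any modification to the envelope fails verification.
    # Uses the open-source "cryptography" package purely for illustration.
    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM
    from cryptography.exceptions import InvalidTag

    key = AESGCM.generate_key(bit_length=256)
    nonce = os.urandom(12)
    envelope = AESGCM(key).encrypt(nonce, b"wrapped key material", None)

    tampered = bytearray(envelope)
    tampered[0] ^= 0x01                      # flip a single bit in transit

    try:
        AESGCM(key).decrypt(nonce, bytes(tampered), None)
    except InvalidTag:
        print("integrity check failed: envelope was modified")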

Machina offers a 99.99% uptime availability guarantee for Enterprise-tier customers and is trusted for key creation and retrieval with low latency and linear scalability at global scale (trillions of keys across millions of tenants, in thousands of locations).

In conclusion, Machina applies lessons learned from the identity and access management space to centralize and simplify authorization to resources. Identity protocols like OAuth bring a strong sense of subject identity and attendant capabilities like multi-factor authentication, which are valuable even though they are not required in every scenario. A solution like Machina brings a strong sense of resource identity to authorization decisions, along with additional capabilities like encryption that are required in some cases but not all. A solution that bundles encryption with authorization avoids the complexity inherent in sensitive-data workflows while still supporting pure authorization decisions.