Zero-Trust Architecture: Identity Is the New Perimeter
Zero-Trust is not a single standard, product, or component that can be implemented to solve a security problem. A quick Google search on the term "Zero Trust" returns 699M results, with content ranging from security companies of all sizes to major cloud providers, log management vendors, and systems integrators. It's a popular topic and has been used as a buzzword for some time. However, I think the Zero-Trust Architecture (ZTA) concept is gaining momentum, and organizations are considering how to approach the topic practically.
The National Institute of Standards and Technology (NIST) published a paper titled "Implementing a Zero Trust Architecture" that defines Zero-Trust as follows:
"Zero trust is a set of cybersecurity principles used to create a strategy that focuses on moving network defenses from wide, static network perimeters to focusing more narrowly on subjects, enterprise assets (i.e., devices, infrastructure components, applications, virtual and cloud components), and individual or small groups of resources. A ZTA uses zero trust principles to plan and protect an enterprise infrastructure and workflows. By design, a ZTA environment embraces the notion of no implicit trust toward assets and subjects, regardless of their physical or network locations (i.e., local area networks versus the internet). Hence, a ZTA never grants access to resources until a subject, asset, or workload are verified by reliable authentication and authorization."
That last part there, "ZTA never grants access to resources until a subject, asset, or workload are verified by reliable authentication and authorization," means that no access should be granted unless the identity of the requestor has been verified. The requestor's identity must be known, whether it's a user, device, or system.
The components responsible for identity, context, and policy verification are referred to as Zero Trust Network Access (ZTNA) products. They create a boundary around the application or service itself rather than the network. This keeps the application or service from being publicly discoverable, and access is granted only after fine-grained verification of identity, context, and access policies.
If the traditional model was to tightly control access at the perimeter, the new approach, the Zero-Trust approach, is to tightly control access based on identity. The identity is the new perimeter.
A Zero-Trust approach puts emphasis on the authentication and identity verification of users, systems, and devices. There's much more to this than just a username and password. Though passwords and API keys are widely used, they are frequently compromised, lost, or stolen. There's the sticky note on the keyboard or the accidental commit of an API key to GitHub. Low-tech mistake or high-tech exploit, it's all the same: if the static secret that identifies a user or a system is lost, we're in trouble.
For ZTA to work, we need robust mechanisms to verify the identity, and we require strong authentication mechanisms. If there's any level of suspicion, there must be procedures that can kick in to strengthen the trust in the identity.
A natural step towards stronger trust of the identity is to apply Multi-Factor Authentication (MFA). It's a common approach these days but not always trivial for organizations to implement.
A solution like the Curity Identity Server offers many different options for authentication out-of-the-box and also provides a workflow approach to the Authentication flow. This makes it easy to implement MFA where multiple authentication options can be chained.
Username and password-based authentication degrades the user experience with frequent verifications. Instead, look for passwordless authentication options such as WebAuthn with a YubiKey or similar hardware key, or Duo Authenticator, where the user receives a push notification in a mobile app. There are even biometric options.
An interesting approach to determine a stronger trust in the identity is to leverage geo-location information. This doesn't necessarily have to be done to disallow access based on a specific location (although that's also certainly possible) but could be used more to detect anomalies in the user behavior and trigger stronger authentication options. This could be due to the user authenticating from a country they have never authenticated from before. Or it could be that the user is authenticating from a location that would have been impossible to travel to based on their previous location, i.e., they performed an impossible journey.
Geo-location checks don't have to deny access. Instead, they could enforce extra identity verifications by triggering additional forms of authentication. Simultaneously, the system could log geo-location data and send alerts when necessary.
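To make the "impossible journey" idea concrete, here is a minimal sketch of such a check. The data shapes, the speed threshold, and the step-up decision are illustrative assumptions, not any particular product's API:

```python
# Sketch of an "impossible journey" check: if the implied travel speed
# between two logins is implausible, require step-up authentication
# instead of denying access outright. Threshold is an assumption.
from dataclasses import dataclass
from math import radians, sin, cos, asin, sqrt

MAX_PLAUSIBLE_SPEED_KMH = 900  # roughly airliner cruising speed

@dataclass
class LoginEvent:
    lat: float
    lon: float
    timestamp: float  # seconds since epoch

def haversine_km(a: LoginEvent, b: LoginEvent) -> float:
    """Great-circle distance between two login locations, in km."""
    lat1, lon1, lat2, lon2 = map(radians, (a.lat, a.lon, b.lat, b.lon))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    h = sin(dlat / 2) ** 2 + cos(lat1) * cos(lat2) * sin(dlon / 2) ** 2
    return 6371 * 2 * asin(sqrt(h))

def requires_step_up(previous: LoginEvent, current: LoginEvent) -> bool:
    """True if the journey between the two logins is implausibly fast."""
    hours = (current.timestamp - previous.timestamp) / 3600
    if hours <= 0:
        return True  # same instant, different place: definitely suspicious
    return haversine_km(previous, current) / hours > MAX_PLAUSIBLE_SPEED_KMH

# Stockholm at noon, then Sydney one hour later: an impossible journey.
stockholm = LoginEvent(59.33, 18.07, 1_700_000_000)
sydney = LoginEvent(-33.87, 151.21, 1_700_000_000 + 3600)
```

Note that a positive result feeds the authentication workflow (trigger MFA, log, alert) rather than producing a hard denial, in line with the approach above.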
Similar to using geo-location data, date/time information could be used as input information in the authentication process. Again, not to deny access but more to proactively create a stronger trust in the identity.
Would a user really be logging in at 3 AM? Probably not, but obviously, it could happen. If a login outside of regular hours does occur, perhaps there should be additional verifications.
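The time-of-day signal can be handled the same way. The sketch below assumes an illustrative "regular hours" window and action names; the point is that an off-hours login triggers extra verification and alerting, never a flat denial:

```python
# Sketch: use login time as a risk signal. The hours window and action
# names are assumptions for the example, not a real product's policy.
from datetime import datetime, timezone

REGULAR_HOURS = range(7, 20)  # 07:00-19:59, an assumed working window

def login_risk_actions(login_time: datetime) -> list[str]:
    """Return follow-up actions for a login; never a hard denial."""
    actions = ["log_event"]
    if login_time.hour not in REGULAR_HOURS:
        actions += ["require_mfa", "alert_security_team"]
    return actions

# A 3 AM login still proceeds, but with step-up MFA and an alert.
late_night = datetime(2024, 5, 1, 3, 0, tzinfo=timezone.utc)
```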
An exciting area to watch around the topic of identity verification is Continuous Authentication. This is the act of constantly verifying the user. There are different technologies applied in this area. One way is to leverage the unique way a user types on a keyboard — the time in between certain characters is measured and becomes associated with a profile. If a deviation of this profile is detected, it is highly probable that a different person is accessing the system. In such a case, the user should probably be prompted for extra authentication to further validate the identity.
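A toy version of that keystroke-dynamics idea can be sketched as follows. Real continuous-authentication products use far richer behavioral models; the simple z-score test and threshold here are assumptions for illustration only:

```python
# Toy keystroke-dynamics check: compare the current typing rhythm
# (inter-key intervals) against a stored profile and flag deviations.
from statistics import mean, stdev

def is_typing_anomalous(profile_ms: list[float], sample_ms: list[float],
                        z_threshold: float = 3.0) -> bool:
    """Flag the session if the sample's mean interval deviates strongly
    from the stored profile (simplified single-feature z-score test)."""
    mu, sigma = mean(profile_ms), stdev(profile_ms)
    z = abs(mean(sample_ms) - mu) / sigma
    return z > z_threshold

# Profile: this user typically types with ~120 ms between keystrokes.
profile = [118, 122, 119, 121, 120, 123, 117, 120]
# A much slower rhythm suggests a different person at the keyboard,
# so the user should be prompted for extra authentication.
suspect_sample = [250, 260, 240, 255]
```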
Authentication is at the center of a ZTA. But what's next? Once the identity of the user or system is verified, there must be a mechanism to determine precisely what is allowed. How much access should the given identity have? This is where authorization comes in.
Access control rules are needed to apply a least-privilege approach. However, this is difficult to implement using roles alone. A more fine-grained approach is required, one that allows any type of attribute as an input to an access decision. A decoupled authorization engine that leverages a centrally defined policy used by all applications and services is a great approach. Consider something like Open Policy Agent (OPA), with its declarative policy language and broad integration support for enforcing authorization decisions.
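With a decoupled engine, the application only builds an "input" document describing the request and asks the policy engine for a decision; OPA exposes this through its Data API (POST /v1/data/&lt;policy-path&gt;). The policy path and attribute names below are illustrative assumptions:

```python
# Sketch of decoupled authorization: the application serializes the
# authorization question and would POST it to OPA's Data API, keeping
# the policy itself centrally defined. Policy path is an assumption.
import json

OPA_URL = "http://localhost:8181/v1/data/httpapi/authz/allow"  # assumed path

def build_opa_query(subject: dict, action: str, resource: str) -> str:
    """Serialize the authorization question for the policy engine."""
    return json.dumps({
        "input": {
            "subject": subject,    # attributes/claims of the verified identity
            "action": action,      # e.g. "read", "delete"
            "resource": resource,  # e.g. "/payments/42"
        }
    })

payload = build_opa_query(
    subject={"sub": "alice", "department": "finance"},
    action="read",
    resource="/payments/42",
)
# The payload would be POSTed to OPA_URL; OPA evaluates the central
# policy against the input and replies with a result such as
# {"result": true} or {"result": false}.
```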
The use of roles probably won't go away entirely, but it will be challenging to express corner cases using roles alone. Those corner cases can lead to role explosion, which is very hard to manage. The attributes that define an identity are available anyway and can be leveraged in conjunction with roles.
The future will move from directly managing roles and attributes for users towards token-based management, where tokens hold claims that are the building blocks for authorization decisions. Tokens are presented when access is needed, and a policy-based authorization engine (like OPA) can leverage the claims in the tokens (and other attributes) to make access decisions. A Zero-Trust Architecture is a Token-Based Architecture.
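A minimal sketch of such a claims-based decision is below. The claim names and the rule are assumptions for illustration; in practice the token would be a signature-verified JWT and the rule would live in the central policy engine rather than application code:

```python
# Sketch of a claims-based access decision in a token-based
# architecture: the decision is driven by claims carried in the
# presented token, not by directly managed per-user roles.

def can_access(claims: dict, resource_owner: str, required_scope: str) -> bool:
    """Grant access if the token carries the required scope, or if the
    token's subject owns the resource (illustrative rule)."""
    scopes = claims.get("scope", "").split()
    return required_scope in scopes or claims.get("sub") == resource_owner

# Claims as they might appear in an already-verified access token.
claims = {"sub": "alice", "scope": "payments:read profile"}
```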
What's a Good Approach?
A ZTA is not something that's architected, configured, deployed, and then left alone. It's not necessarily a moving target per se, but a ZTA must evolve and change. There will always be new apps, new use cases, and new business cases to adapt to. Don't paint yourself into a corner with application-native code that is difficult to manage and maintain.
Consider leveraging industry standards to future-proof the overall architecture. Look for highly flexible and configurable solutions that can handle a wide range of scenarios and offer extensibility with minimal effort.
Start small and grow. It's impossible to ZTA-ify your entire environment in one go. Start by identifying a set of critical applications or APIs with high-security requirements. Onboard those to use a centralized token-based architecture leveraging an OAuth/OIDC server. Once that's in place, it will be easier to onboard additional applications and APIs.