    Duende Software Blog

      Essential Moments in the OAuth and OpenID Connect Timeline

Joe DeCock, Principal Software Engineer at Duende Software; Khalid Abuhakmeh, Customer Success Engineer at Duende Software; Brett Hazen, Engineer at Duende Software

      published on May 20, 2025

Like all technological achievements, the current state of application security is the result of individuals’ cumulative efforts and contributions: many talented people who take security seriously came together to build technology from which we all benefit. Without them, we’d likely all be building bespoke security solutions with varying levels of vulnerability and interoperability. Thank goodness for standards!

      In this post, we take a trip down memory lane and explore some of the standards created over the last 15 years, both in the IETF and the OpenID Foundation, that make OAuth and OpenID Connect what they are today.

      The OAuth 2.0 and OpenID Connect Timeline

While the Internet Engineering Task Force (IETF) and OpenID Foundation working groups have created many standards, we picked a few that have significantly impacted security implementations in application development, specifically around OAuth and OpenID Connect.

[Figure: the OAuth 2.0 and OpenID Connect timeline]

      OAuth 2.0 - April 2010 - October 2012 (RFC 6749)

In early 2010, contributors from Microsoft, Yahoo!, and Google proposed a specification called OAuth Web Resource Authorization Profiles (OAuth WRAP) to replace OAuth 1.0, which developers found too hard to use because of its requirement to canonicalize HTTP messages. OAuth WRAP used bearer tokens rather than proof-of-possession tokens, another simplification over OAuth 1.0. It was contributed to the IETF OAuth working group and became OAuth 2.0 in October 2012. The OAuth 2.0 Authorization Framework enables third-party applications to obtain limited access to resources at an HTTP endpoint. The specification enables resource owners to approve interactions between themselves, the third-party application, and the protected resource.

      You’ve seen the consent screen when you’ve logged into services authenticated using a social login. For developers, tooling for services like GitHub or GitLab uses OAuth to access secured elements such as repositories, issue trackers, and other resources.
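To make the flow concrete, here is a minimal sketch of how a client assembles the front-channel authorization code request described in RFC 6749, section 4.1.1. The endpoint, client ID, and redirect URI are hypothetical values chosen for illustration:

```python
from urllib.parse import urlencode

# Hypothetical authorization endpoint for illustration.
AUTHORIZE_ENDPOINT = "https://auth.example.com/authorize"

def build_authorization_url(client_id: str, redirect_uri: str,
                            scope: str, state: str) -> str:
    """Assemble an OAuth 2.0 authorization code request (RFC 6749, 4.1.1)."""
    params = {
        "response_type": "code",   # authorization code grant
        "client_id": client_id,
        "redirect_uri": redirect_uri,
        "scope": scope,
        "state": state,            # opaque value echoed back, protects against CSRF
    }
    return f"{AUTHORIZE_ENDPOINT}?{urlencode(params)}"

url = build_authorization_url(
    "my-app", "https://app.example.com/callback", "repo read:user", "xyz123")
print(url)
```

The user agent is redirected to this URL, the resource owner approves the consent screen, and the server redirects back with a `code` parameter that the client exchanges at the token endpoint.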

      OpenID Connect - March 2010 - February 2014

      The OpenID Connect standard is the most widely used login protocol today. It spans use cases from mobile to Web to enterprise to cloud, and can be deployed at high security levels. It’s the protocol everyone uses every day without knowing it, because it’s plumbing - not a consumer brand.

As you can see from the timelines, it was built alongside OAuth 2.0, JWT, JWS, JWE, JWK, JWA, and other protocols and data formats, all informing one another during standards development. It’s hard to imagine an online security world without OpenID Connect!

      JSON Web Token (JWT) - October 2010 - May 2015 (RFC 7519)

It’s hard to envision a time in web-based security when JSON Web Tokens (JWTs) weren’t present, but that time did exist. The JSON Web Token is almost synonymous with modern security practices, and, amazingly, it turned 10 years old in May 2025. Development of the JWT, JWS, JWE, JWK, and JWA specs began in October 2010. The format allows parties to exchange secure JSON payloads that are encoded and signed using a JSON Web Signature (JWS) and/or encrypted using JSON Web Encryption (JWE). It offers a simple, straightforward secured claims representation that has helped make security interoperability a reality.
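To illustrate the compact JWS serialization (header.payload.signature), here is a minimal HS256 sketch using only the Python standard library. Production code should use a vetted JOSE library; the secret key and claims here are purely illustrative:

```python
import base64
import hashlib
import hmac
import json

def b64url(data: bytes) -> str:
    # Base64url without padding, as used throughout the JOSE specs.
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign_jwt(claims: dict, key: bytes) -> str:
    """Produce a compact-serialized JWS with an HMAC-SHA256 signature."""
    header = {"alg": "HS256", "typ": "JWT"}
    signing_input = (f"{b64url(json.dumps(header).encode())}."
                     f"{b64url(json.dumps(claims).encode())}")
    sig = hmac.new(key, signing_input.encode(), hashlib.sha256).digest()
    return f"{signing_input}.{b64url(sig)}"

def verify_jwt(token: str, key: bytes) -> dict:
    """Check the signature and return the decoded claims."""
    signing_input, _, sig = token.rpartition(".")
    expected = b64url(hmac.new(key, signing_input.encode(), hashlib.sha256).digest())
    if not hmac.compare_digest(sig, expected):
        raise ValueError("bad signature")
    payload = signing_input.split(".")[1]
    return json.loads(base64.urlsafe_b64decode(payload + "=" * (-len(payload) % 4)))

token = sign_jwt({"sub": "alice", "iss": "https://idp.example.com"}, b"secret")
print(verify_jwt(token, b"secret"))
```

The same three-part shape carries asymmetric signatures (e.g. RS256, ES256) in real deployments, where only the issuer holds the signing key.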

      “We designed JWT to be a simple, general JSON-based token format. Of course, we kept our identity use cases in mind - particularly ID Tokens. Little did we know at the time that JWTs would end up being used for all kinds of other things. For instance, I learned in 2020 that JWTs are helping combat fraudulent and unwanted telephone calls by securing Caller ID information. One sign of a successful standard is that it is used for things that the inventors never envisioned!”
      – Mike Jones, author of several widely-used OpenID, OAuth, JOSE, COSE, ACE, and SecEvent specs

      OpenID Certification Program - July 2014 - April 2015

The OpenID Foundation began developing a certification program for OpenID Connect implementations in July 2014, soon after the specification was completed in February 2014, and launched the program in April 2015. Certification is critical both for interoperability and security. It improves interoperability by ensuring that certified implementations faithfully follow the specification, and it improves security by ensuring that implementations correctly implement the features needed to secure deployments.

      The certification program has since expanded to cover FAPI 1.0, FAPI 2.0, and several other groups of OpenID specifications.

      Proof Key for Code Exchange or PKCE - July 2013 - September 2015 (RFC 7636)

As OAuth gained popularity, it unfortunately attracted the attention of malicious actors. In July 2013, Nat Sakimura and several other authors proposed Proof Key for Code Exchange (PKCE) to protect against authorization code interception attacks. In short, if an attacker obtains the authorization code, they can redeem it for an access token.

      PKCE solves this problem by introducing a cryptographically generated random key. We’ll let the authors explain.

      To mitigate this attack, this extension utilizes a dynamically created cryptographically random key called “code verifier”. A unique code verifier is created for every authorization request, and its transformed value, called “code challenge”, is sent to the authorization server to obtain the authorization code. The authorization code obtained is then sent to the token endpoint with the “code verifier”, and the server compares it with the previously received request code so that it can perform the proof of possession of the “code verifier” by the client. This works as the mitigation since the attacker would not know this one-time key since it is sent over TLS and cannot be intercepted.
      – RFC 7636

      In short, even if an attacker were to intercept the authorization code, they would not be able to redeem it for an access token because they would not have the original secret.
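The verifier/challenge derivation is simple enough to sketch in a few lines of Python; the S256 transformation below matches the test vector in RFC 7636, Appendix B:

```python
import base64
import hashlib
import secrets

def make_code_verifier() -> str:
    # RFC 7636 requires 43-128 characters; 32 random bytes
    # base64url-encode to exactly 43.
    return base64.urlsafe_b64encode(secrets.token_bytes(32)).rstrip(b"=").decode()

def make_code_challenge(verifier: str) -> str:
    # S256 method: BASE64URL(SHA256(ASCII(code_verifier))), without padding.
    digest = hashlib.sha256(verifier.encode("ascii")).digest()
    return base64.urlsafe_b64encode(digest).rstrip(b"=").decode()

# The verifier stays with the client; only the challenge travels in the
# front-channel authorization request. The verifier is later sent to the
# token endpoint over TLS when redeeming the code.
verifier = make_code_verifier()
challenge = make_code_challenge(verifier)
```

Because SHA-256 is one-way, an attacker who sees the challenge in the front channel still cannot produce the verifier required at the token endpoint.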

      OAuth 2.0 Authorization Server Metadata - November 2015 - June 2018 (RFC 8414)

      The ability to publish authorization server metadata significantly increased the ease of interoperability among OAuth components produced by different parties. Rather than having to obtain AS configuration information, such as endpoint URLs, from developer documentation, it can be obtained programmatically from the published metadata. This is less error-prone and more convenient.
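A client typically fetches this document from the issuer’s well-known URI (`https://{issuer}/.well-known/oauth-authorization-server`) and reads the endpoints from it. Here is a sketch against a trimmed, illustrative RFC 8414 metadata document; the URLs are hypothetical:

```python
import json

# A trimmed example of an RFC 8414 metadata document (values are illustrative).
metadata_json = """{
  "issuer": "https://auth.example.com",
  "authorization_endpoint": "https://auth.example.com/authorize",
  "token_endpoint": "https://auth.example.com/token",
  "jwks_uri": "https://auth.example.com/.well-known/jwks.json",
  "code_challenge_methods_supported": ["S256"]
}"""

def load_metadata(raw: str) -> dict:
    meta = json.loads(raw)
    # RFC 8414 requires the issuer field; clients should also verify that it
    # matches the URL the metadata was fetched from (minus the well-known path).
    if "issuer" not in meta:
        raise ValueError("metadata missing required 'issuer' field")
    return meta

meta = load_metadata(metadata_json)
print(meta["token_endpoint"])
```

With this in place, a client needs only the issuer URL; every other endpoint and capability flag is discovered at runtime.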

      Mutual-TLS Client Authentication and Certificate-Bound Access Tokens - October 2016 - February 2020 (RFC 8705)

      Some organizations value the security and peace of mind that certificates provide. While certificates are not as easy to manage as string-based secrets, the infrastructure surrounding them makes it very difficult to replicate a fraudulent version.

      In October 2016, several authors introduced the concept of certificate-bound access and refresh tokens using mutual Transport Layer Security authentication with X.509 certificates. Using certificates enforces the trusted relationship between client and server, as the authorization server only issues tokens to the certificate-owning client.

      Resource Indicators for OAuth 2.0 - March 2016 - February 2020 (RFC 8707)

      As OAuth developers’ needs evolve, so do specifications. In March 2016, an enhancement to OAuth 2.0 was proposed to allow clients to specify the resource server where an issued access token will be utilized. This additional information allows the authorization server to make more informed decisions about which policies to apply, which can lead to changes to the token and its contents.

      While a seemingly minor change to the specification, this change gives those implementing authorization servers more options to deliver secure solutions to a growing number of consumers.
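In practice this is just one more (repeatable) parameter on the request. A sketch of an RFC 8707-style token request body, with illustrative values:

```python
from urllib.parse import urlencode

# RFC 8707: the client names the resource server(s) where the issued token
# will be used, so the authorization server can tailor audience and claims.
token_request = urlencode([
    ("grant_type", "authorization_code"),
    ("code", "SplxlOBeZQQYbYS6WxSbIA"),                 # illustrative code value
    ("redirect_uri", "https://app.example.com/cb"),
    ("resource", "https://api.example.com/orders"),
    ("resource", "https://api.example.com/invoices"),   # may appear more than once
])
print(token_request)
```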

      JWT-Secured Authorization Request (JAR) - February 2014 - August 2021 (RFC 9101)

      The final OpenID Connect Core specification standardized signed Request Objects in February 2014. Before that, information passed in an authorization request was in parameters encoded in the URI. As you can imagine, this had limits and security implications, as URIs have size limitations, and auditing systems can accidentally log sensitive information that attackers can exfiltrate. RFC 9101 copied the signed Request Object definition from OpenID Connect Core into an OAuth specification, making signed requests also available when not using OpenID Connect.

      This specification sends request parameters by value in JWT format in the HTTP request body or by reference using a URL referencing the JWT containing the request parameters. These JWTs can be signed and optionally encrypted, and they are not subject to size limits when passed by reference.
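A sketch of the two delivery options, with placeholder values; building and signing the actual request object JWT is left to a JOSE library:

```python
from urllib.parse import urlencode

# RFC 9101: the authorization parameters travel inside a signed (and optionally
# encrypted) JWT. The token below is a placeholder, not a real request object.
request_object_jwt = "<signed-request-object-jwt>"

# By value: the whole JWT rides in the `request` parameter.
by_value = urlencode({"client_id": "my-app", "request": request_object_jwt})

# By reference: only a URL pointing at the JWT is sent, which sidesteps
# URI size limits entirely.
by_reference = urlencode({"client_id": "my-app",
                          "request_uri": "https://app.example.com/jar/7b2e"})
```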

      While some older implementations still exist, many authorization servers have adopted the JWT-secured authorization request due to its advantages.

      Pushed Authorization Requests or PAR - September 2019 - September 2021 (RFC 9126)

      Until this RFC was proposed, communication between authorization servers and clients always occurred through stateless communication via a user agent, such as a browser. While this is still secure, there are limitations.

      This Pushed Authorization Requests (PAR) RFC created a mechanism for a client to push request information to an authorization server using a backchannel mechanism. This requires the server to maintain the state of the pending requests.

      PAR allows the authorization server to authenticate the client before any user interaction happens. The increased confidence in the identity of the client during the authorization process allows the authorization server to refuse illegitimate requests much earlier in the process, which can prevent attempts to spoof clients or otherwise tamper with or misuse an authorization request.
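A sketch of the two steps, with illustrative client values (the `request_uri` shown is the example from RFC 9126):

```python
from urllib.parse import urlencode

# Step 1 (backchannel): the client POSTs its authorization parameters to the
# PAR endpoint, authenticating the same way it would at the token endpoint.
par_body = urlencode({
    "response_type": "code",
    "client_id": "my-app",
    "redirect_uri": "https://app.example.com/cb",
    "scope": "openid profile",
})

# Step 2: the server stores the pending request and returns a short-lived,
# one-time-use request_uri; the front-channel request then carries only that.
request_uri = "urn:ietf:params:oauth:request_uri:6esc_11ACC5bwc014ltc14eY22c"
authorize_url = "https://auth.example.com/authorize?" + urlencode({
    "client_id": "my-app",
    "request_uri": request_uri,
})
print(authorize_url)
```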

For ASP.NET Core developers, our very own Joe DeCock contributed the .NET implementation of PAR.

      Demonstrating Proof of Possession (DPoP) - March 2019 - September 2023 (RFC 9449)

      Replay attacks are a common vector for attackers. If an attacker can access a valid token, they can impersonate the original caller. Demonstrating Proof of Possession (DPoP) defines a proof-of-possession mechanism that protects all involved parties from replay attacks targeting access tokens and, in some cases, refresh tokens.

      Demonstrating Proof of Possession (DPoP) is an application-level mechanism for sender-constraining OAuth [RFC6749] access and refresh tokens. It enables a client to prove the possession of a public/private key pair by including a DPoP header in an HTTP request. The value of the header is a JSON Web Token (JWT) [RFC7519] that enables the authorization server to bind issued tokens to the public part of a client’s key pair. Recipients of such tokens are then able to verify the binding of the token to the key pair that the client has demonstrated that it holds via the DPoP header, thereby providing some assurance that the client presenting the token also possesses the private key. In other words, the legitimate presenter of the token is constrained to be the sender that holds and proves possession of the private part of the key pair.

By binding tokens to cryptographic keys whose private parts are never disclosed by the protocol, attackers cannot use any intercepted tokens, because they do not possess the private keys securing them. The client always retains a private key that is never transmitted to the server; only proof of its possession is sent, in the form of a signature. This gives us a higher-security option than the bearer tokens (RFC 6750) that were previously always used with OAuth 2.0.
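As a sketch, here is how the unsigned portion of a DPoP proof might be assembled; completing the proof requires signing this input with the private key (e.g. ES256), which needs a crypto library and is omitted here. The JWK values and request URL are placeholders:

```python
import base64
import json
import time
import uuid

def b64url(data: bytes) -> str:
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def dpop_proof_signing_input(public_jwk: dict, method: str, url: str) -> str:
    """Build the header.payload portion of a DPoP proof JWT (RFC 9449).

    A real proof is completed by signing this input with the private key
    matching `public_jwk`; that step is omitted from this sketch."""
    header = {"typ": "dpop+jwt", "alg": "ES256", "jwk": public_jwk}
    payload = {
        "jti": str(uuid.uuid4()),  # unique ID so the server can reject replays
        "htm": method,             # HTTP method of the protected request
        "htu": url,                # HTTP URI of the request, minus query/fragment
        "iat": int(time.time()),   # issued-at, kept fresh per request
    }
    return (f"{b64url(json.dumps(header).encode())}."
            f"{b64url(json.dumps(payload).encode())}")

proof_input = dpop_proof_signing_input(
    {"kty": "EC", "crv": "P-256", "x": "placeholder-x", "y": "placeholder-y"},
    "GET",
    "https://api.example.com/orders",
)
```

Because the proof embeds the method, URI, a fresh `jti`, and a timestamp, a captured proof cannot be replayed against a different endpoint or reused later.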

      FAPI 2.0 - March 2021 - February 2025

      FAPI 2.0 is a security profile covering several standards that organizations should adopt to develop a security posture appropriate to protecting high-value resources. These resources are typically found in financial services, e-health and e-government, but are not limited to those fields. If you work with sensitive data or want additional security, FAPI 2.0 is something you should explore. The FAPI 2.0 approach comes with extra overhead and maintenance costs that may be inappropriate for some security implementations. FAPI 2.0 became final in February 2025. It builds on the lessons learned building and deploying FAPI 1.0, which was completed in March 2021. (FAPI used to stand for Financial-grade API, but for 2.0, it became simply “FAPI”, no longer being an acronym.)

      Read more about the FAPI 2.0 Security Profile if you’re interested in implementing it as your security profile. Duende IdentityServer already supports all these features, so Duende customers only need to take a few steps to enable it in their instances.

      Conclusion

The IETF’s and the OIDF’s hard work has improved the security world, with OAuth and OpenID Connect underpinning much of modern development’s security. As Duende employees who attend these meetings, we thought back on all the meaningful conversations we’ve engaged in and chose some of our favorite and most impactful specifications from the last 15 years. We’d love to hear about yours.

If you have a favorite OpenID standard or RFC (and let’s face it, who doesn’t?), please let us know in the comments. As always, we’ll watch for the latest from the OIDF and IETF and actively participate in the working groups to bring the latest and greatest security practices to .NET.
