The Juniper Backdoor: a Summary


On December 17, 2015, Juniper Networks released an “out-of-cycle security advisory” notifying users of their ScreenOS software that two serious issues needed immediate patching.

ScreenOS is used in Juniper’s NetScreen enterprise firewalls, which are touted as having comprehensive and integrated security features right out of the box. Juniper is a major player in this segment, and their products are particularly favored by government IT departments and related businesses. Any security notice was thus bound to get a lot of attention — especially given the nature of the issues described.

One of these, Juniper stated, “allows unauthorized remote administrative access” to ScreenOS devices; the other, an independent issue, “may allow a knowledgeable attacker who can monitor VPN traffic to decrypt that traffic”.

Unauthorized remote access is bad, but “administrative access” is worse — admins have the run of the system, and the implications of this are very serious indeed. Juniper included some helpful clues to look for in logfiles to see if this had occurred on a given machine, though sophisticated attackers would certainly erase or modify those logs.

Different but also dire is any compromise of VPNs — virtual private networks. ScreenOS uses these to let separate devices and sites talk securely to each other, so the decryption of VPN traffic crossing them is clearly a bad thing. Alarmingly, the advisory also stated that “there is no way to detect that this vulnerability was exploited”.

The cryptographic community and interested observers have been trying to wrap their heads around what happened and what this means for us all.

This article is an attempt to walk through the major issues and occurrences in the still-unfolding Juniper backdoor story, and touch on the reasons it is well worth examining.


The US National Security Agency has two related but opposite tasks: to help build and strengthen information security for US citizens, and to defeat that same sort of security when deemed necessary to support the government. They’ve been involved with creation of security standards from the internet’s earliest days — not always without controversy. (In the early 1990s they were notably unsuccessful in promulgating their Clipper chip — a hardware encryption solution with a built-in “backdoor”.)

By the early 2000s the NSA had followed the general trend from hardware to software cryptography and designed a cryptographic algorithm called Dual_EC_DRBG, which they promoted as a new, trustworthy and secure random number generator (RNG).

Cryptography relies on several chunks of math to work, and trustworthy random numbers are very close to the bedrock. There are many, many ways to build RNGs. Some, like Dual_EC_DRBG, rely on fixed constant values to generate random numbers — and constants like these are usually generated transparently, tested openly and proven to be sound before being put into use.

Dual_EC_DRBG was different, in that some of the crucial mathematical constants it uses are of obscure origin.

Obscurity has its uses in security, but not here — in crypto, hiding important data like this is poor form indeed. It is the equivalent of providing a flowchart which filters everything through a box marked “Trust me” — and a very good rule of thumb in cryptography has always been to trust nothing which can’t be proven independently.

Dual_EC_DRBG was thus something of a black box when examined closely by experts in the field, and as far back as 2007, shortly after it entered general release, it was flagged as cryptographically unsound. Those NSA-created constants were found to be suspiciously weak, allowing a potential backdoor into the design. It was surmised they would produce a random-seeming but not-actually-random ‘nonce’ — the ur-random number which is the basic seed all other cryptographic functions rely on. Someone knowing how the constants were originally derived would find guessing the nonce vastly easier, and knowing the nonce defeats any cryptography built upon it.

Some researchers theorized, as a side note, that the Dual_EC_DRBG nonce was particularly vulnerable if a 32-byte nonce size was selected. (Most RNGs can be set to produce a smaller nonce — 20 bytes is quite common, and would be exponentially more difficult to crack than a 32-byte one.)

All in all, researchers found Dual_EC_DRBG to be vulnerable to passive theft of information — a textbook example of a “kleptographic backdoor” — and one which would raise no warning flags whatsoever.
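To make the kleptographic idea concrete, here is a toy illustration in Python on a tiny invented elliptic curve. All parameters below are made up for the demo; the real generator uses the NIST P-256 curve and truncates its output, both of which we skip for brevity. Whoever knows the secret scalar d linking the two public constants P and Q can turn a single observed output into the generator’s next internal state:

```python
from math import gcd

p, A, B = 1019, 2, 3                      # tiny demo curve: y^2 = x^3 + 2x + 3 mod p

def add(P1, P2):
    """Affine point addition; None represents the point at infinity."""
    if P1 is None: return P2
    if P2 is None: return P1
    (x1, y1), (x2, y2) = P1, P2
    if x1 == x2 and (y1 + y2) % p == 0:
        return None
    if P1 == P2:
        m = (3 * x1 * x1 + A) * pow(2 * y1, -1, p) % p
    else:
        m = (y2 - y1) * pow(x2 - x1, -1, p) % p
    x3 = (m * m - x1 - x2) % p
    return (x3, (m * (x1 - x3) - y1) % p)

def mul(k, pt):
    """Double-and-add scalar multiplication."""
    acc = None
    while k:
        if k & 1:
            acc = add(acc, pt)
        pt = add(pt, pt)
        k >>= 1
    return acc

def try_lift(x):
    """Recover some curve point with this x-coordinate (brute force; tiny p)."""
    for y in range(p):
        if (y * y) % p == (x ** 3 + A * x + B) % p:
            return (x, y)
    return None

def order(pt):
    n, T = 1, pt
    while T is not None:
        T = add(T, pt)
        n += 1
    return n

# Base point P of reasonably large order, plus the backdoored constant Q.
P = next(pt for pt in map(try_lift, range(1, p)) if pt and order(pt) > 100)
n = order(P)
d = next(k for k in range(2, n) if gcd(k, n) == 1)   # the secret trapdoor
Q = mul(pow(d, -1, n), P)                            # published constant: d*Q == P

def dual_ec_step(s):
    """One generator step: returns (visible output, next internal state)."""
    r = mul(s, P)[0]
    return mul(r, Q)[0], mul(r, P)[0]

def backdoor_recover(output):
    """Lift the output to R = r*Q; since d*Q = P, x(d*R) = x(r*P) = next state."""
    return mul(d, try_lift(output))[0]

for seed in range(2, 60):        # skip the rare degenerate seeds on a tiny curve
    try:
        out1, state = dual_ec_step(seed)
        out2, _ = dual_ec_step(state)
        predicted = dual_ec_step(backdoor_recover(out1))[0]
        break
    except TypeError:            # landed on the point at infinity; try another seed
        continue

assert predicted == out2         # the attacker predicts the next "random" output
```

On the real curve the same algebra holds; roughly speaking, the more raw generator output an attacker can observe, the fewer bits they must brute-force, which is why a 32-byte nonce is so much friendlier to an attacker than a 20-byte one.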


Dual_EC_DRBG was thus from its introduction a flawed design with a potential backdoor, but the only party who could take advantage of the backdoor (at this point) would be the folks who knew how those constants were derived — the NSA. As a design exercise it seemed like an interesting approach, but for actual use it looked like a non-starter. Security expert Bruce Schneier predicted that no one would ever actually utilize Dual_EC_DRBG for real-world applications. Between the suspected backdoor and the algorithm’s speed issues (“three orders of magnitude slower than its peers,” as he put it), Schneier considered it insurmountably flawed. (He did note that the backdoor could at least be closed by replacing the NSA constants with newly generated and openly published ones.)

It was rather a surprise when Dual_EC_DRBG was adopted by the National Institute of Standards and Technology (NIST) as one of their four officially approved random number generation standards. NIST approval is crucial, since it leads to what is known as Federal Information Processing Standards (FIPS) validation, and thus approval for use by government agencies and related contractors. Another surprise was that NIST mandated use of those suspicious, NSA-created constants. Wary security folks could certainly swap out the NSA constants if they wished — but this would break FIPS compliance.

Despite that official approval, Schneier’s predictions for Dual_EC_DRBG uptake were fairly well borne out — it served as an available but very rarely implemented option in most software of the era. The only notable exception was RSA Security, who embraced Dual_EC_DRBG in their products as early as 2004, even before NIST acceptance.

(There were also questions raised years later, due to the Edward Snowden revelations, regarding an alleged $10 million payment from the NSA to RSA Security — but we’re getting ahead of ourselves.)


ScreenOS used a perfectly acceptable, FIPS-compliant RNG (ANSI X9.31) in versions up to 6.1. For reasons very much open to speculation, Juniper Networks chose to implement Dual_EC_DRBG starting in version 6.2.0r1 of their ScreenOS software in 2008. However, ScreenOS used it in a very, very peculiar way:

  • ScreenOS was reworked to use two RNGs: the first, Dual_EC_DRBG, was supposed to generate a nonce to ‘seed’ a second, separate RNG (ANSI X9.31).
  • Juniper also selected different constants, replacing the worrisome NSA defaults — unlike the option Schneier suggested, however, these were never published, and Juniper has never explained where these new constants originated or how they were derived.
  • At the same time, Juniper deliberately increased the size of the nonce Dual_EC_DRBG feeds to ANSI X9.31 — from 20 bytes to 32, which is both larger than necessary and exactly matches the optimal size for exploitation arrived at in 2007.
  • Finally, the nonce is apparently generated before it is actually required and stored in a “pregeneration table” — which is bound to make access easier than to a nonce generated on the fly.
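In sketch form, the intended two-stage data flow looks something like this (the function names are hypothetical stand-ins; ScreenOS is C code, and the real generators are Dual_EC_DRBG and ANSI X9.31 rather than the hash-based placeholders used here):

```python
import hashlib

def dual_ec_nonce():
    """Stand-in for Dual_EC_DRBG producing ScreenOS's 32-byte seed."""
    return hashlib.sha256(b"entropy goes here").digest()       # 32 bytes

def ansi_x931_expand(seed, nbytes):
    """Stand-in for ANSI X9.31: expand the seed into output bytes."""
    out, block = b"", 0
    while len(out) < nbytes:
        out += hashlib.sha256(seed + block.to_bytes(4, "big")).digest()
        block += 1
    return out[:nbytes]

# Intended flow: Dual_EC output is used ONLY as a seed; what callers
# actually consume is the second generator's output.
seed = dual_ec_nonce()
random_bytes = ansi_x931_expand(seed, 32)
```

Had the cascade operated as intended, even a compromised first stage would have been laundered through ANSI X9.31, and the Dual_EC weakness would have been very hard to reach in practice.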

Thus, Juniper replaced a straightforward, working, FIPS-approved RNG with a much more convoluted and opaque design — one whose choices amount, if not to fully-fledged attack surfaces, then to some very troublesome weak spots. For instance, swapping out one set of opaque constants for another leaves any (suspected) backdoor functionality in place, merely shifting accessibility from one actor (the NSA) to another (Juniper themselves).

(Close readers will also note that changing out the NSA constants made Dual_EC_DRBG non-FIPS-compliant. Piping output directly to ANSI X9.31 would technically restore compliance.)

If this all sounds unnecessarily complex, you’re not alone — “very bizarre” is how Professor Stephen Checkoway described it. Despite this, ScreenOS 6.2 can still be described with a straight face as both cryptographically secure and FIPS-compliant, as long as ANSI X9.31 is actually working as the active RNG.

Yeah, about that — well, add one more item to the bullet list above:

  • In the same 6.2.0r1 update, a bug was introduced which, it turns out, completely bypasses ANSI X9.31. Instead of the ANSI generator handling the cryptographic heavy lifting, ScreenOS defaults to a backup RNG.

The backup RNG is Dual_EC_DRBG.

Dual_EC_DRBG which, as implemented by Juniper, now uses their own mystery constants and a nonce size perfectly suited for exploitation.
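Published analyses of the ScreenOS code describe a bypass of roughly this shape. The sketch below is Python with hypothetical names (the real code is C): the reseed routine writes raw Dual_EC output straight into the output buffer and, as a side effect, leaves a shared loop counter at its end value, so the ANSI X9.31 loop body never executes.

```python
OUTPUT_LEN = 32
index = 0                         # shared, module-level loop counter
buf = bytearray(OUTPUT_LEN)

def dual_ec_generate():
    return bytes(range(OUTPUT_LEN))           # stand-in for raw Dual_EC bytes

def ansi_x931_block(state, offset):
    return b"\x00" * 8                        # stand-in for the second stage

def reseed():
    global index
    buf[:] = dual_ec_generate()   # raw Dual_EC bytes land in the OUTPUT buffer...
    index = OUTPUT_LEN            # ...and the shared counter is left "finished"

def prng_generate():
    global index
    index = 0
    reseed()                      # runs on essentially every call
    while index < OUTPUT_LEN:     # never entered: reseed() already set index = 32
        buf[index:index + 8] = ansi_x931_block(bytes(buf), index)
        index += 8
    return bytes(buf)

# The second stage never touches the buffer: callers get raw Dual_EC output.
leaked = prng_generate()
```

A bug of this kind is invisible to casual testing: the function still returns 32 random-looking bytes on every call, and nothing crashes or logs an error.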


So, as of 2008, due to a complex design and an astoundingly on-target bug, Juniper devices using the updated ScreenOS were (and, as of this writing, remain) vulnerable to compromise by anyone with access to the Juniper-assigned constants.

This is far from an ideal situation, but at least all the decisions and flaws so far appear to have emanated from Juniper themselves. As best researchers can determine, this remained the case through September 2012, when yet another peculiar occurrence can be added to our list.

In a ScreenOS update (6.2.0r15), those Juniper-created constants were replaced by yet another set. This was, apparently, the act of a third party, and was, apparently, incorporated into the official ScreenOS update without Juniper’s awareness, let alone approval.

Changing just the constants changes the entire nature of the backdoor. A third party who knew the new constants could now defeat Juniper’s security. Because of Juniper’s questionable design decisions, very little work was even required: they merely had to re-key the existing backdoor’s lock — everything else was already pre-configured for their use. Unless someone noticed that the new constants were in place, the backdoor would continue to function silently, without any overt sign of trouble to users.

Notice, it appears, was not taken. The change definitely seems to have passed under Juniper’s radar.


This was the (deeply insecure) state of ScreenOS in 2013, when (as you may recall) Edward Snowden decided to shine a bit of light into some shadowy corners of the information security world.

One of the classified programs Snowden revealed is the SIGINT Enabling Project, whose goal is to “insert vulnerabilities into commercial encryption systems”. Another fun disclosure was a claim that “eventually, N.S.A. became the sole editor” of the Dual_EC_DRBG standard, while another document specifically listed Juniper systems, including NetScreen firewalls, as exploitable by the British spy agency GCHQ.

(That $10 million transfer from the NSA to RSA Security was also part of these disclosures.)

Snowden’s leaks only deepened the cryptographic community’s existing distrust of anything tarred at any point with the NSA’s brush. This accounts for why NIST reversed course in September 2013, strongly recommending against any further use of Dual_EC_DRBG — and ultimately removing it from the list of approved standards altogether.

In response, Juniper published a very subfusc notice in the same month, to the effect that their only product using Dual_EC_DRBG (ScreenOS) doesn’t use it “in a way that should be vulnerable to the possible issues described by NIST”.

As we now know, at the time this item was published Juniper’s own constants had been replaced and the bug that bypassed ANSI X9.31 was not yet discovered or described. Their press release was arguably correct — the precise issue that led to NIST dropping the hammer on Dual_EC_DRBG was not a factor, due to Juniper’s changes. What was not realized was that those same changes had made ScreenOS much more vulnerable, and to unknown actors instead of the NSA.


ScreenOS was thus already compromised as of the September 2013 notice, but under conditions which an attacker might find vexing:

  • The traffic has to be captured, including the 32-byte nonce which is the key to unlocking the encrypted messages.
  • That captured data must then be processed — although some research has shown this might be a fairly speedy process, it wouldn’t quite be doable in real time.

Perhaps the third party which swapped out the constants found these requirements too darned frustrating and retraced their steps to plant a new hack in ScreenOS. Alternatively, yet another (fourth) party might have independently discovered the same security hole — we really don’t know yet.

What is certain is that starting with ScreenOS update 6.3.0r17, a new change was made in the official Juniper updates. This one introduced a much more direct backdoor — a hard-coded password which made access via Secure Shell (SSH) to any ScreenOS device trivial.

Just to be clear, this is a completely different problem from the swapped constants. SSH creates an encrypted terminal session, giving a user a secure remote line into the targeted box to run commands as if directly connected to the machine. The new backdoor lets an attacker use SSH to log in with root privileges on the target system. This is as terrible as it gets — there is virtually nothing on the target system that can’t be done in this scenario.
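How does a hard-coded password hide in plain sight? Researchers reported that the ScreenOS backdoor string was crafted to resemble a debug format string, so it would survive a casual code review. A minimal illustration follows (the password string is the one researchers reported; the surrounding function is invented for this sketch):

```python
# The string below is the hard-coded ScreenOS backdoor password reported
# by researchers; it was chosen to look like a harmless debug format string.
BACKDOOR = "<<< %s(un='%s') = %u"

def authenticate(username, password, user_db):
    """Invented sketch of a login check hiding a master-password backdoor."""
    if password == BACKDOOR:          # matches for ANY username: the backdoor
        return True
    return user_db.get(username) == password

# A nonexistent user presenting the magic string is waved straight through:
assert authenticate("ghost", BACKDOOR, {}) is True
```

Because the check sits inside the normal login path, every other login behaves exactly as expected, which is part of why the backdoor went unnoticed for so long.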

Thus, starting in April 2014, updates to ScreenOS (versions 6.3.0r17 and later) included both the compromised constants and the direct SSH backdoor.

Words cannot really describe how dangerous this is.


So this was the state of things on December 17, 2015, when Juniper released their security advisory.

In it, they refer to two separate issues: one (CVE-2015-7755) covers the SSH backdoor, while the other (CVE-2015-7756) addresses the “encryption implementation” (i.e. the swapped constants).

Dropping an alert like this must have seemed like an early Christmas present for cryptographic researchers — the well-regarded Matthew Green tweeted that he was “really invested in the idea that this Juniper encryption vulnerability is going to be amazing.”

As the story unspooled, and the full scope of the security failures became apparent, Green’s wish came true in spades. Many knowledgeable researchers dug very deeply into this wildly convoluted series of events and detailed the story we’ve tried to sum up above. (All of which we deeply appreciate — see the bibliography for some sources and suggested further reading.)

And we suspect this story still has plenty of twists and turns yet to come. We can see some of what happened, but are far from clear as to why. As this is the United States, where class action lawsuits follow security failures as flowers follow rain, we expect to be reading depositions and pretrial motions related to this for the next several years.

In the short term, Juniper is certainly on the hot seat. The same security researchers who dissected ScreenOS’s many flaws couldn’t help but notice that Juniper’s security patch removed the proximate problems (the SSH backdoor and third-party constants) but not the overarching ones (the ANSI X9.31 bypass bug, the Juniper constants, et cetera). Bending to pressure, Juniper announced on January 8, 2016 plans to “enhance the robustness of the ScreenOS random number generation subsystem” — i.e., abandon Dual_EC_DRBG — and replace it with the same RNG system employed in their Junos OS products. (Of course, GCHQ was targeting Junos for exploitation back in 2011 — just noted in passing.)

In the larger sense, though, Juniper’s catastrophic security failure provides a very clear lesson, one which cannot be repeated often enough.


Wishing this wasn’t true doesn’t matter. Good intentions don’t matter. Passing laws that run counter to math, common sense and the common good? Doesn’t matter. Every backdoor built into a cryptographic system will, inevitably, be exploited. Evidence suggests it won’t be by the good guys it was put in place for, and experience suggests it will be years before the exploitation is even discovered — if it’s discovered at all.

For years, unknown parties have been able to secretly decrypt traffic from Juniper devices and log directly into them with root privileges. This includes an unknown number of devices used by the US and foreign governments, major corporations and many other organizations. This is a security breach of vast and serious scope. The full repercussions will not be known for years, if ever — and it was only able to occur because of decisions made by Juniper.

Juniper chose to use an algorithm with a serious potential backdoor. Every design decision they made rendered their products more insecure and paved a smooth, fast road for their attackers — paved it so well, in fact, that further examination will raise serious questions regarding intent, liability and legality. However, the algorithm Juniper selected was originally designed by the NSA.

The NSA knowingly produced a flawed and exploitable algorithm, then strongly promoted its use despite reasoned and prescient warnings about issues with their design.

And the NSA is funded by citizens of the United States of America — who depend on that agency to keep them and their information as safe as humanly possible.


We’re living in a scary time, sure, and we understand the impulse behind the call for encryption backdoors — half of what happens to make the internet work seems like magic, so why not add a golden key to let us read only the bad guys’ mail? Unfortunately, the math just doesn’t work like that — encryption defends us all, and there are no runes we can scrawl on our routers to protect only some of the traffic going across them. The safety of everyone using the internet — your kids, your mom, yourself — depends on using the best, strongest encryption possible, whenever possible.

Juniper should remind us all what can happen when we don’t. We will follow this story with interest. Check back with us, and remember – a safer internet is a better internet.

Sources and further reading

Out of Cycle Security Bulletin: ScreenOS: Multiple Security issues with ScreenOS (CVE-2015-7755, CVE-2015-7756)
Secret Documents Reveal N.S.A. Campaign Against Encryption
On the Juniper Backdoor
Juniper’s Backdoor
Notes from Juniper presentation at RWC
New Discovery Around Juniper Backdoor Raises More Questions About the Company
Advancing the Security of Juniper Products
Juniper drops NSA-developed code following new backdoor revelations