Eva: Please note that this page displays only the opinion of the author. I do not think that CAcert can or should make such claims, especially in the arbitration area. Speaking from my experience as an arbitrator: it does not work like this.

I - Threat Modelling the Secret Cell - Deeper Story leading to Assumptions

Why CAs are targeted

CAcert runs a Certification Authority (CA). Being a CA means we are a provider of security tools to the people. As a provider of security, this also means there is some danger that people will use that security in combination with other dangerous things.

As Dan Geer puts it, all security tools are dual-use munitions (Geer). This is of course true of all CAs, and is also true of many other things: someone can buy a hammer and attack a person with it, or use pen and paper to mail an extortion demand.

For right or for wrong, CAs are targeted as legitimate subjects of espionage, and hammers and pens are not. The foundation for this statement probably has to do with cryptography, and the fact that the certificates issued are, more or less by browser fiat, essential to permit you to conduct private and secure communications. This places CAs in a very powerful position -- browser vendors have dictated that the CA and only the CA may give you permission to conduct your browsing affairs in private.

Which means that, to the espionage agencies of the world, the CA is one of the crown jewels over which they dearly aspire to gain control (NSA). The CA can open up the communications of many.

More Background

Some other providers of security, such as builders of steel windows and doors, are not that interesting; but cryptography has a special place in history.

The important time for us is the Second World War. In WWII the Allies developed a war-winning advantage in reading the Axis's codes, which makes for interesting history: Bletchley Park, the Bombe as a forerunner of the computer, Alan Turing and all that (Harris).

After the war, the USA government ("USG") felt that this balance-tipping weapon had to be preserved at all costs. Amongst other things, the Americans banned exports of cryptography as a 'dual-purpose munition'. Most Western countries followed along ("Wassenaar"), but they paid little more than lip service to the concept, as it amounted on the one hand to banning mathematics, and on the other to a rather futile barrier against anyone who really cared.

And then came the Internet. For various historical reasons, the CA found itself in the position of providing intermediated access to cryptography -- permission -- to much of ecommerce. Although benign on the surface, the keys and permissions became the crux that allowed the USG to maintain some perception of security over the wild west of the early Internet. Much effort was spent in ensuring the success of the CA concept.

CAs therefore deal in providing cryptographic security. Although slightly different in approach (CAs do signing and identity whereas their customers do the encryption), this is not enough to change the basic situation. As a result, we can assume that:

For this reason, CAs are also typically nationally-focused.

How this affected CAcert - the early days

As a tiny, ineffective "open" player, we might have slipped under the radar. But we did not, and we saw several approaches. These approaches fell into two sorts:

First, the interested sysadms. In the early days, it became clear that CAcert couldn't just let anyone touch the systems. So the original team instituted a background check in order to filter out unsuitable people.

This background check discovered some interesting things. A rather high proportion of applicants were connected to national government cryptographic endeavours. We could read this connection in two ways: either an interest in cryptography at work carried over into their spare time, or these were deliberate attempts to get inside.

Are National Security Employees Helpful or Harmful?

Which was it? It was of course hard to tell for sure, but luckily we came across a logical argument that settled the question. It goes like this: all government cryptographic workers are subject to security clearances and security policies (Anderson). In those processes, their loyalty to their employer, first and foremost, is established, and the policies contain a line that states words to the effect

From that, we could conclude: even if a person was interested in CAcert purely for the sake of the crypto, they still had to report up the chain of command. By law, and by custom -- typically the people who work in these spheres are very loyal and unquestioning in their adoption of patriotism. By contrast, your average bureaucrat from the Dept. of Fisheries is likely more liberal-minded, believing in civil society, whereas anyone who works in the military, intelligence, etc., is likely an unquestioning servant of the state.

Things might be changing as Gen Y takes hold (Stross), but we have to look at the wider picture: even if they were free & interested only at the beginning, these people would always be owned, and could be manipulated at leisure later on.

We therefore had to act as if anyone with such a conflict of interest would eventually fall to that conflict. Therefore, they should be excluded from any critical roles, however benevolent they seemed at the beginning.

As an example of the dilemma, we did have some people who said they had not reported to their security officer, and would not. The problem with this is that we cannot then proceed to entrap them into what are effectively crimes against their own employer. As we care for our people (Principles), it's not a wise choice to attempt to play double agent games. Not to mention, we're up against the experts here…

At this point, a simple background check might have sufficed, but we hit another roadblock. The original check was conducted by the existing team, who asked all the questions they could think of. Some of them were quite good, and some were problematic -- even illegal in some contexts.

For example, having read a few spy novels and understood how honey traps were conducted, the original team explored these aspects. This led them to ask questions about sexual exposure, and specifically whether applicants were homosexual or otherwise vulnerable. From history, this is a very valid question, but even in background checks it is controversial -- as the history of Alan Turing testifies.

On reflection, we put a stop to the old system, and a new system was designed. This time around, however, we had a lot more information, and not just from readings of the literature of cold war espionage. So let's discuss that.

The standard approach to build the secret cell

In our research into threats to CAs, we discovered the preferred way to breach a company, and especially a CA. The story goes like this:

How is it that the government can afford to supply such gifted people, so easily? The trick is this: many governments offer a 20-year pension program. After this period, people are ready to retire; they have worked for military, intelligence or other government programs for a couple of decades, and have had enough (and so have their families). They are ready for retirement on the pension. But they also have an opportunity for another 20 years of fun, productive work! Further, as they have their pension, and they are used to government wages, they aren't necessarily asking top dollar.

It's win-win all the way, seemingly, which brings out:

Variants

There are variants of course, but the above is the story we discovered from our research, so it became our template.

It is a valid question then to ask whether the story was reality or fantasy. We were pretty convinced of it for several reasons: corroboration, historical examples, and our own experience.

From these experiences, a typical approach might go like this: a techie approaches the team, and is background checked. During that background check, his conflict of interest ("CoI") is revealed: "I once worked for an intelligence agency." Good! But the person then asks that his CoI be kept hush-hush because it is a privacy issue, or he has to keep his day job, or he does not want anyone to cause trouble for him at work. On the surface, this is a reasonable request.

The CoI is then discussed within CAcert's team leaders, and is analysed with the benefit of cold hard daylight. At that point, it is generally realised that we see the signs of an approach here, and the decision is made that the person can never work in a critical capacity.

This approach is real, and it leads us to a very important point:

The intelligence agencies hate sunlight (Applebaum)! This is actually a lot easier to ensure than it appears at first blush. In the Western world, we love privacy, we adore hush-hush skunkworks, and we give high credibility to NDAs, even ones with illegal clauses in them. We love being party to secret deals! Hold on to these thoughts, and remember them when someone asks you to join a conspiracy.

We therefore had pretty convincing evidence that the approach known as the secret cell existed, and was being used against us, and happily we can add recent revelations. Until Edward Snowden blew the whistle on the activities of the NSA, it was too hard to talk about. Now however, we even have names for it: the secret cell or the cooperative endpoint.

And it's worth talking about it because that helps our people to defend against it. Tally forth...

What have we learnt about the construction of secret cells?

We can extract some more general principles here. Firstly, the CEO has to go along with it. This means a lot -- it means that the company has given permission for this to go ahead. That means from the government's point of view that this is not espionage, not black-ops, but rather it is a voluntary arrangement from the company.

If the company does not go along with it, then it would almost certainly be a crime: espionage, or at a minimum criminal hacking, deception, extortion, etc. Courts might become involved. That means publicity and damages, and that does not suit the overall surveillance mission.

However, legally, once permission is granted, there is no liability comeback! If the breach is later discovered, the company can do nothing, because it permitted the process. The fact that 99.99% of employees were fooled, and that insiders lied on behalf of the government to customers and others such as auditors or shareholders, is not at issue. Employees are totally out of legal options, because they aren't the company.

Worse, the ones who are in on the deal will keep mute, because their jobs and reputations will depend on it. Employees can't even name the secret cell because that would breach privacy, their own NDAs, and put them up against the as yet unexposed insiders who allowed it to happen. They likely can't even research who is involved.

The result is that the shareholders, customers and other stakeholders are totally screwed. There isn't even a need for the government to apologise; it can simply shrug and say the company permitted it, so it must have been the right thing! Every company does its Patriotic Duty, right??? Governments cannot be pinned even for civil liability here, for many reasons, and going after the government for crime is tough -- it is only the government that can prosecute crime.

Where does that leave the cooperating company? Hung out to dry -- it has been caught in many crimes and deceptions, and it has no comeback.

This is a beautiful deal for the government, it cannot be sweeter! It is so enticing that it leads us to another principle of the secret cell:

Willingly or otherwise, wittingly or otherwise, coerced or otherwise. Somewhere, somehow, someone must have known and acquiesced to the approach (Hagelin).

As mentioned above, it also has to be kept secret. This is obviously important because the intelligence product that has been acquired (email, keys, etc.) is far more valuable if it is secret. It also preserves the value of the entire arrangement if all secret cells are secret -- remember, we are the conspiracy theorists, right? If there is no confirming evidence, we are all fruitcakes.

However, keeping it secret is tricky in a large corporation, so it is up to the CEO to protect that secret somehow. Secrecy can be preserved in many ways: skunk works, lying and deception. Or it can be done by a variant of the shell game -- the work can be shifted to a division that has little contact with anyone else, but is otherwise trusted (Shell Game).

Finally, the secret cell consists of multiple people. Typically in a security conscious department, the risky components will be spread across several people. This technique of good governance is called "separation of roles." For example in CAcert, the sysadms cannot visit the machines in the rack without an access engineer present.

So, for the secret cell to function, several people have to be inserted, and they have to cover all roles. The conspiracy has to be complete.
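The dual-control rule above can be sketched as a simple access check. This is a hypothetical illustration only: the role names, the `Person` type, and the function are assumptions for the sketch, not CAcert's actual tooling. The point it demonstrates is why a lone insider cannot defeat separation of roles -- even someone holding both roles is refused on their own, so a functioning secret cell needs multiple inserted people.

```python
# Hypothetical sketch of a two-person ("separation of roles") check,
# modelled loosely on the rule that a sysadm may only visit the rack
# with an access engineer present. Names are illustrative.

from dataclasses import dataclass

@dataclass(frozen=True)
class Person:
    name: str
    roles: frozenset  # e.g. {"sysadm"} or {"access-engineer"}

def rack_access_allowed(present: list) -> bool:
    """Grant physical access only when at least two distinct people
    are present and the group covers both required roles."""
    if len({p.name for p in present}) < 2:
        return False  # one person alone never satisfies dual control
    roles_covered = set().union(*(p.roles for p in present))
    return {"sysadm", "access-engineer"} <= roles_covered

alice = Person("alice", frozenset({"sysadm"}))
bob = Person("bob", frozenset({"access-engineer"}))
carol = Person("carol", frozenset({"sysadm", "access-engineer"}))

print(rack_access_allowed([alice, bob]))  # True: two people, both roles
print(rack_access_allowed([carol]))       # False: both roles, but alone
print(rack_access_allowed([alice]))       # False: alone, one role
```

Note the design choice: the check demands two distinct *people*, not merely two roles, which is exactly the property an attacker must defeat by inserting several cooperating insiders.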


References & Footnotes

(Anderson) "Meeting Snowden in Princeton," Ross Anderson 2015 https://www.lightbluetouchpaper.org/2015/05/02/meeting-snowden-in-princeton/. Anderson writes: "Interference with crypto in academia and industry is longstanding. People who intern with a clearance get a “lifetime obligation” when they go through indoctrination (yes, that’s what it’s called), .... There are also specific programmes to recruit cryptographers, with a view to having friendly insiders in companies that might use or deploy crypto."

(Geer) "Tradeoffs in Cyber Security," Dan Geer 2013 http://geer.tinho.net/geer.uncc.9x13.txt

(NSA) "NSA's SIGINT Strategy 2012-2016," NSA 2012 http://cryptome.org/2013/11/nsa-sigint-strategy-2012-2016.pdf, states:

(Harris) The best way to get a feel for those times is to read fiction. Try _Enigma_ by Robert Harris or _Cryptonomicon_ by Neal Stephenson. The films _Enigma_ and the more recent _The Imitation Game_ are also good.

(Stross) "Snowden leaks: the real take-home," Charlie Stross 2013 http://www.antipope.org/charlie/blog-static/2013/08/snowden-leaks-the-real-take-ho.html

(Principles) "Principles of the CAcert Community," CAcert 2007 (http://svn.cacert.org/CAcert/principles.html)

(Applebaum) 30c3 talk by Jacob Applebaum To Protect And Infect:

(Hagelin) "How NSA and GCHQ spied on the Cold War world," Gordon Corera, BBC 2015-06-28.

(Shell Game) The shell game is a confidence trick or short con using three cups, upturned, one covering a pea, and mixed around rapidly. A skilled operator extracts the pea and places it in any cup or not at all, encouraging a mark or victim to bet in which cup it is. Shills, or accomplices, surround the mark and assist in the fraud.

Related Reading (The Intercept) "Core Secrets," The Intercept https://firstlook.org/theintercept/2014/10/10/core-secrets/

Risks/SecretCells/ThreatsAndAssumptions (last edited 2015-08-01 02:04:55 by SunTzuMelange)