-----BEGIN PGP SIGNED MESSAGE-----
Hash: SHA512

   Undercover communication

   It should be obvious by now, that the only way to communicate
   stealthily and securely is to avoid raising suspicion to the
   level at which the authorities might consider it worthwhile
   to put you under active surveillance (e.g., park a van with
   TEMPEST equipment by your apartment).

It has long been my view that, if the authorities have enough information
on you to enable them to park a surveillance van outside your home, then
you have failed utterly, and the battle is already lost.

Notwithstanding that, I still had to laugh at the story posted on Slashdot
the other day about the kid who was being investigated, who noticed that,
when searching for WiFi connections, one of the SSIDs was "FBI SURVEILLANCE
VAN".

   Moreover, the medium for such a communication must be the Internet,
   since it is the only publicly available medium that has seen
   any serious development of anonymous and/or secure communication.

Agreed.

   Let's go over some specific methods of clandestine information
   exchange over the net:

   Encrypted e-mail

   Although apparently secure, this method puts the communicating
   parties at great risk of detection. E-mail servers are centralized,
   and accounts are easily associated with message transmission times
   and locations. Once a single member of the communication network
   becomes suspect, the whole network is immediately exposed. This
   holds for all similar server-dependent protocols.

Encrypted email does not prevent traffic analysis; it merely prevents anyone
from trivially discovering the message contents. If you can be located, you
can be compelled to decrypt your messages, whether through legal threats or
the authorities simply beating the passphrase out of you -- so-called
"rubber-hose cryptanalysis".

The only way this can be avoided is to periodically change one's encryption
sub-key. If the old encryption sub-keys are securely destroyed, then the
previous message traffic encrypted with those keys is not recoverable. Keys
can be changed according to one's level of paranoia -- weekly, bi-weekly,
monthly, quarterly, or randomly.
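
The forward-secrecy effect of destroying old sub-keys can be illustrated with
a toy sketch. This is emphatically /not/ real cryptography -- actual traffic
would use PGP sub-keys, and the SHA-256 keystream cipher below is purely an
illustrative stand-in:

```python
# Toy illustration: why destroying old encryption sub-keys makes
# previously captured traffic unrecoverable. Demo cipher only.
import hashlib
import secrets

def keystream_xor(key, data):
    """Encrypt/decrypt by XOR with a SHA-256-based keystream (demo only)."""
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

# Week 1: traffic is encrypted under the current sub-key.
week1_key = secrets.token_bytes(32)
ciphertext = keystream_xor(week1_key, b"meet at the usual place")

# An eavesdropper records `ciphertext` off the wire.
# Week 2: the sub-key is rotated and the old one securely destroyed.
week1_key = None  # old key gone; the recorded ciphertext is now unrecoverable

# New traffic continues under a fresh key; compelled disclosure of the
# *current* key cannot expose the earlier messages.
week2_key = secrets.token_bytes(32)
```

The point is that compelled decryption (legal or rubber-hose) can only reach
keys that still exist; a destroyed sub-key takes its ciphertexts with it.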

   E-mail accessed exclusively over onion routing

   This is a much better approach than just e-mail, but it is still
   susceptible to traffic analysis, and to control of the communication
   channel by an external party.

Agreed.

   Usenet posts

   This is a good approach to clandestine communication. Since Usenet
   is a distributed system, traffic analysis is non-trivial, and
   messages can be steganographically hidden inside innocent-looking
   posts (e.g., SPAM) in some high-traffic unmoderated group. Many users
   will read the message, oblivious to its true contents -- thus protecting
   the message recipient from scrutiny.

While I agree that the distributed nature of Usenet makes traffic analysis
non-trivial, I completely disagree with the use of steganography to protect
your traffic. For starters, the authorities are not unaware of the existence
of steganography -- as such, it is really suitable only for hiding your
message traffic from the greater public. Also, by disguising it as
spam, it may be filtered out by some news providers.

If you're going to secure your messages, then the best way to do so is to
use strong encryption. The best way to hide strongly-encrypted messages is
to post them to a newsgroup where strongly encrypted messages make up
virtually all of the traffic in the group. If you're looking for such a
secure, high-traffic group, you really need look no further than
alt.anonymous.messages (a.a.m.) -- it was designed for this very purpose.

Furthermore, as I relate in the example case below, any real volume of PGP-
encrypted traffic in newsgroups other than alt.anonymous.messages /will/ be
noticed.

One of the most frequent uses for alt.anonymous.messages is as the target of
nymserver reply-blocks. Use of such reply-blocks renders any nymserver email
address untraceable, as the encrypted mail can be picked up from any news
server that carries alt.anonymous.messages. Furthermore, there are utilities
(e.g., aamfetch, available from SourceForge) that can be used to fetch all
of one's messages from alt.anonymous.messages, making it impossible to
determine precisely which messages are being retrieved.
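
The fetch-everything pattern that tools like aamfetch rely on can be sketched
as follows. This is a hypothetical, stdlib-only illustration: the hashed-subject
("hsub"-style) matching is a simplified stand-in for the addressing scheme
actually in use, and the subject list is stubbed rather than pulled over NNTP:

```python
# Sketch of "fetch everything, filter locally": the client retrieves
# EVERY subject in the group, so the news server sees an identical
# access pattern no matter which messages are really ours.
import hashlib
import os

def make_hsub(secret, nonce=None):
    """Build an unlinkable subject: random nonce + hash(nonce || secret)."""
    nonce = nonce or os.urandom(8)
    digest = hashlib.sha256(nonce + secret).digest()[:24]
    return (nonce + digest).hex()

def matches_hsub(subject, secret):
    """Only a holder of `secret` can recognise a matching subject line."""
    try:
        raw = bytes.fromhex(subject)
    except ValueError:
        return False
    if len(raw) != 32:
        return False
    nonce, digest = raw[:8], raw[8:]
    return hashlib.sha256(nonce + secret).digest()[:24] == digest

# Stub subject list standing in for a full download of the group.
secret = b"reply-block secret"
all_subjects = [os.urandom(32).hex(), make_hsub(secret), "not-even-hex"]

# Filter locally; an observer only ever saw us fetch the whole group.
mine = [s for s in all_subjects if matches_hsub(s, secret)]
```

Because every reader downloads the same superset, server logs cannot reveal
which subjects any particular reader actually cared about.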

Nymserver accounts are set up and maintained by sending specially-constructed
email messages to the nymserver. If these messages are sent via a chain of
Mixmaster remailers, even the nymserver operator cannot determine who owns
a particular nymserver account, even if they were to start keeping logs,
perhaps at the insistence of the authorities. If one uses a randomly-chosen
chain of Mixmaster remailers, then it is not possible for the authorities to
compromise the remailers you are using -- in order to trace you, they would
have to effectively compromise the entire Mixmaster network.
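
The reason a random chain forces the authorities to compromise every hop can
be seen in how layered ("onion") encryption works. A minimal conceptual
sketch, using a throwaway XOR keystream in place of Mixmaster's real
cryptography (the names `wrap` and `peel` are illustrative, not actual
Mixmaster interfaces):

```python
# Conceptual remailer chaining: the sender wraps the message in one
# encryption layer per remailer; each remailer strips exactly one
# layer and learns only the next hop.
import hashlib
import json

def xor_cipher(key, data):
    """Toy symmetric cipher (XOR with a SHA-256 keystream) -- demo only."""
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

def wrap(message, chain):
    """Sender side: one encryption layer per remailer, innermost first.
    chain is [(remailer_name, remailer_key), ...] in forwarding order."""
    payload, next_hop = message, "RECIPIENT"
    for name, key in reversed(chain):
        layer = json.dumps({"next": next_hop, "body": payload.hex()}).encode()
        payload, next_hop = xor_cipher(key, layer), name
    return next_hop, payload  # first hop to contact, fully wrapped packet

def peel(key, packet):
    """One remailer's job: strip a single layer, learning only the next hop."""
    layer = json.loads(xor_cipher(key, packet))
    return layer["next"], bytes.fromhex(layer["body"])

# A three-hop chain: no single remailer sees both sender and recipient.
chain = [("mix-a", b"A" * 32), ("mix-b", b"B" * 32), ("mix-c", b"C" * 32)]
hop, packet = wrap(b"meet at the dead drop", chain)
for name, key in chain:        # each hop peels exactly one layer
    hop, packet = peel(key, packet)
```

Compromising mix-a alone reveals only that the packet went on to mix-b;
linking sender to recipient requires every key in the chain, which is why a
randomly chosen chain defeats partial compromise.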

Accordingly, the only way a nymserver account holder can then be traced is
through the reply-block associated with the account. If the reply-block
points to alt.anonymous.messages (a.a.m.), then the authorities will reach
a dead end. They will not be able to trace the nym account owner, nor will
they be able to read their message traffic.

Now, you might ask: "Just how secure is this setup?"

You'd be surprised at just how effective it is -- it was enough to thwart a
combined investigation by the FBI, the Australian Federal Police (AFP),
the Queensland Police Service (QPS), Europol, Interpol, the Department of
Internal Affairs of New Zealand, and the Toronto Police Service.

Let me tell you a little story....

In just a few days, it will be exactly 3 1/2 years since the American
Federal Bureau of Investigation (FBI), the Australian Federal Police (AFP)
and the Australian Queensland Police Service announced the existence of
"Operation Achilles", which led to the breakup of what they claimed was one
of the largest child pornography rings uncovered up to that time.

The individuals comprising this pedophile ring called themselves "the group",
and they believed themselves untouchable, beyond the reach of the police.
(For many of them -- one-half to two-thirds, depending on which affidavit
you believe -- this did, indeed, turn out to be the case. This includes the
ringleader, who is known by the handle Yardbird.)

The number of persons reportedly involved varied -- one affidavit stated
that there were 61 persons involved, another 45, and yet another 48. All in
all, 22 persons were arrested: 2 in the UK, 4 in Germany, 2 in
Australia, and 14 in the U.S.

The FBI podcast "Inside the FBI" states that the number of persons involved
was 60, of whom 22 were positively identified. You can listen to the podcast
and read the transcript at the following URL:

https://www.fbi.gov/news/podcasts/inside/operation-achilles.mp3/view

Another superb source of information is the so-called "Castleman Affidavit" --
this affidavit was used to justify the arrest of group member Daniel Castleman.
The Castleman affidavit explains the group's methodology (or modus operandi)
in detail.

It can be seen at: http://www.rep-am.com/newsdocuments/affidavit.pdf

Another good source of information is:

http://www.policyb.org/downloads/Operation_Achilles.pdf

Depending on which affidavit you believe, only about one-third to one-half
of the alleged members of this pedophile ring were ever identified and
apprehended.

As I said earlier, the alleged leader of this ring used the nick "Yardbird".
Yardbird made a reappearance on Usenet in both 2009 and 2010, on the dates
corresponding to the first and second anniversaries of the busts in 2008.
His intent was to show that he was still free, and to answer people's
questions.

One of the most important things Yardbird stated was that everyone in the
group who used Tor and remailers remained free, while those who relied on
services such as Privacy.LI were arrested and convicted.

Yardbird further commented that several members of the group, including his
second-in-command Christopher Stubbings (Helen) and Gary Lakey (Eggplant),
were Privacy.LI users -- in fact, he stated that they used it for everything.
(Helen is currently serving a 25-year sentence in the UK, while Eggplant is
serving life in an Arizona prison.)

Eggplant became notorious for his constant promotion of
Privacy.LI -- he continually boasted that he could not be caught because
Privacy.LI did not keep logs and was located outside of U.S.
jurisdiction.

I pointed out to anyone who would listen that services such as Privacy.LI
were for /privacy/ -- not for anonymity. Ideally, one needs to be both
private and anonymous. Essentially, what Privacy.LI
supplied was a type of VPN service, providing an encrypted tunnel for data
to travel between two endpoints -- the customer's computer being one endpoint,
while the Privacy.LI servers provided the other. While there was a degree of
privacy, there was NO anonymity at all -- so it really didn't come as a
surprise that Privacy.LI's customers were among those arrested. It is also
worthy of note that Privacy.LI earned a 2005 entry in cryptographer Bruce
Schneier's "doghouse", as I pointed out more than once.

See: http://www.schneier.com/blog/archives/2005/07/the_doghouse_pr.html

As I pointed out repeatedly, NO service operator is going to go to prison to
protect the identity of his customers -- every last one of them will roll
over on you, given the opportunity.

You might ask, "How was the existence of 'the group' discovered?"

Simple. Through one of the oldest investigative techniques of all -- the
informer. The Australian police arrested a man on totally unrelated child
pornography charges -- presumably as part of a plea deal, he revealed the
existence of 'the group' and handed over a PGP public/private keypair and
its passphrase.

Now, it is worthy of note that the Department of Internal Affairs of New
Zealand had earlier informed the Australian police of the existence of PGP-
encrypted traffic in a number of Usenet newsgroups.

These messages, from users with handles like "Big Block" and Subject: lines
like "New Car Contracts", were rather odd, to say the least. I also noticed
some of them -- it was quite clear that there was a group of people
communicating in private, but obviously there was no way to determine /who/
was communicating, or /what/ they were communicating about.

If the Australian police had not had a lucky break, arresting one of the
members of the group on totally unrelated child pornography charges, they
would, in all likelihood, /still/ be in the dark about what was going on.

Having acquired the group's current PGP public/private keypair and its
passphrase from the informer, the police could assume this group
member's identity and, furthermore, read all the encrypted traffic posted by
members of the group.

So it was that Constable Brenden Power of the Queensland Police Service used
this assumed identity from August 31, 2006 through December 15, 2007.
Constable Power spent almost 18 months working out of FBI HQ in Washington,
DC while working on this case.

In many ways, this case was unprecedented. No similar pedophile ring had
ever previously employed the types of security measures that this group did;
also unprecedented was the information provided by the informant, who gave
the police the tools needed to infiltrate the group -- without the informant's
help, they could _never_ have succeeded.

Once the group was penetrated, the police were able to take advantage of a
few factors:

1) They had the informant's computer, with all its email, PGP keys and the
   like. This provided a history, which made it easier to continue the
   impersonation.

2) By the time it was penetrated, the group had been operating for about 5
   years. By then, the group had jelled into a community -- people were
   familiar with each other, they often let their guards down, and would
   sometimes reveal tidbits of personal information. This was especially the
   case when they thought their messages were secure, and beyond the ability
   of the police to intercept -- they would say things that they would *never*
   say in the open.

So, as you can see, the group was pretty much an open book to the police;
they were completely and thoroughly penetrated. Despite that, however, the
majority of the group were _still_ able to remain at large, and were neither
positively identified nor arrested.

This is due to the privacy tools (i.e., Tor, nymservers, remailers) that were
employed. Even with everything else being an open book, those using these
tools still managed to elude capture.

By now, you're probably thinking, "Why is he going on about pedophiles?
Pedophiles are disgusting! They should all be shot!"

Leaving aside my personal feelings about pedophiles, I brought up this case
as an example for several reasons:

1) Child pornography is a serious crime in virtually every jurisdiction.
   As this example demonstrates, police will work together, even across
   national boundaries, to investigate these crimes. They are willing
   to invest considerable time, manpower and money in pursuit of these
   suspects. The only other crimes which usually merit this type of
   approach are drug/gun-running or terrorism. The level of effort
   expended in pursuing this group can be seen in the fact that even FBI
   executive assistant director J. Stephen Tidwell was involved.

   Normally one would not expect FBI personnel that highly placed
   to be involved -- this shows the level of importance placed on
   this particular investigation. (A year or so after the busts,
   Yardbird himself expressed astonishment that the FBI would
   consider his group such a priority.)

2) This case is the only one that I'm aware of where suspects were
   using sophisticated tools like PGP, Tor, anonymous remailers and
   nymservers.

3) This case underscores the effectiveness of these tools even against
   well-funded, powerful opponents like the FBI, Europol, and Interpol.
   (N.B.: FWIW, those who were caught used either inappropriate and/or
   ineffective tools and techniques to protect themselves.)

4) I fully understand most people's disgust at the types of crimes/
   criminals being discussed here. That said, it is important to
   remember that one simply cannot design a system that provides
   protection for one class of people, but denies it to another.
   You can't, for example, deploy a system that provides privacy/
   anonymity for political dissidents, or whistleblowers, and yet
   denies it to pedophiles -- either *everyone* is safe, or NO ONE
   is safe. This may not be palatable, but these are the facts.


Final Thoughts
==============

While this case shows the strengths of the current technologies, it
nevertheless underscores that the human element cannot be disregarded. It
must continually be borne in mind that the weakest element in /any/ security
system is the human element. This has been true since before Sun Tzu wrote
his immortal treatise, The Art of War, about 2,500 years ago. It is, in fact,
for this reason that Sun Tzu is still studied in military academies to this
very day. It is not for nothing that Sun Tzu devoted an entire chapter of
his seminal work to the use of spies.

As we have seen, infiltration is still a highly effective tactic. The group
was particularly susceptible to this, as the members were unknown to each
other, by deliberate design. If someone were apprehended, they could
be forced to turn over PGP private keys, passphrases, etc. These could then be
used by the authorities to PGP-sign messages, which would normally be taken
as proof that the messages in question were genuine and untampered-with. This
is likely what happened in the case of the group.

Traditionally, espionage cells have been made up of only a handful of persons,
each known to the others -- the idea behind this was to limit the damage in
the case of the cell being either penetrated or exposed.

The only types of organizations that cannot be penetrated by the authorities
are those that are close-knit, bound by blood or other kinship ties. The only
possible recourse for the authorities in these cases is to try to turn someone
on the inside against his fellows.

Baal <Baal@nym.mixmin.net>
PGP Key: http://wwwkeys.pgp.net:11371/pks/lookup?op=get&search=0x1E92C0E8
PGP Key Fingerprint: 40E4 E9BB D084 22D5 3DE9  66B8 08E3 638C 1E92 C0E8
Retired Lecturer, Encryption and Data Security, Pedo U, Usenet Campus
- --

"Sed quis custodiet ipsos Custodes?"    --    "Who will watch the Watchmen?"
                                 -- Juvenal, Satires, VI, 347, circa 128 AD

If you accept that freedom of speech is important, then you are going to
have to defend the indefensible.                             -- Neil Gaiman

He that would make his own liberty secure must guard even his enemy from
oppression.
                                                            -- Thomas Paine

-----BEGIN PGP SIGNATURE-----

iQEcBAEBCgAGBQJOVwOSAAoJEAjjY4weksDowfgH/0YD0y+/rb8yeDemIgHiVKob
Jz8PX9njZKADBxAREMwqGjwZ2tfOr7HDouB/moHE0ZtBvjYmON3LJZFueb661DuA
8AP5tFfJgHx95JKbt/4WWwsKzs534izVnjrL1IW1GdOuVDuooWvBJK50+b9n58p1
o3Pq8N00vGwRAOXwX5ltMJ98zUzDlkVXNMPbs19u8lFdqQNoTVSYYm9rvxcVtqrK
MJ/T4oozZz1/RryiOC8wGyEvl5GMAFr0pcFUegIIpjIpMpxXM2d8cqp3yPxXYU6+
ZWmLQbkdgyhkRAOOIMPFWXC0+WKcy6A+xuK0bEyb7ZaJz0ibKAeo0BOgD+IqwlQ=
=/sG0
-----END PGP SIGNATURE-----