[Dailydave] The old speak: Wassenaar, Google, and why Spender is right

Bas Alberts bas at collarchoke.org
Sat Aug 1 19:52:48 EDT 2015
Next message: [Dailydave] The old speak: Wassenaar, Google, and why Spender is right
Messages sorted by: [ date ] [ thread ] [ subject ] [ author ]

This will be a long and ranty one, as well as the first DD post I've
made in a non-Immunity capacity (I think).

So anyone who knows me on any personal level knows that I'm a
non-disclosure kind of guy. Now I could get into the why and how, but
what it really boils down to is that I subscribe to a fairly peculiar
belief system in which freedom and security are, generally speaking,
mutually exclusive.

I think that in an effort to "secure" the internet, most so-called
privacy advocates and full disclosure zealots are actually contributing
to a power structure that promotes totalitarian levels of control.

A secure internet is, by definition, a controlled internet. If you're
talking about software security, anyways.

I've made that point in various forms before on this mailing list, but
it has become ever so relevant in recent times due to the proposed US
Wassenaar implementation.

I thought it was interesting, if not telling, that the USG aligned
themselves with what is essentially a full disclosure policy. The
proverbial get-out-of-fail-free card in the first round of proposed
Wassenaar export control legislation was essentially "as long as you
tell all, you are okay".

As long as you tell all, you are in the green. How on earth does that
sentiment align with any privacy advocacy? It is absurd. Yet we see
many a self-confessed full disclosure zealot and privacy advocate froth
with an almost sadistic glee at the idea of government-enforced full
disclosure. Finally all those scumbag xdevs are forced to show their
cards. Finally it will all be out in the open.

Because that is what privacy is about, right? Forcing things into the
open through government control? When I see someone like Chris Evans
essentially cheerleading Full Disclosure as law on his tweeter, it
fundamentally rubs me, as an American (HA!), the wrong way.

Then you have people like Chris Soghoian, whose entire pro-Wassenaar
argument was based on non-US companies. Lest we forget that HackingTeam
was actually fully Wassenaar compliant under even the strictest
interpretations. Which demonstrates exactly why and how it is a moot
endeavor.

I also think it's interesting how the HackingTeam thing was performed
in the blackhat tradition of dumping mailspools. What is WikiLeaks if
not a crowdsourced big-data analytic version of ~el8 at this point?

I think you took a wrong turn somewhere, team privacy. But that's just
me, I suppose.

Anyways, both sides of the disclosure fence suffer from one fatal
flaw. A flaw that Brad Spengler, AKA Spender, has been incessantly
pointing out for years: bugs don't matter. Bugs are irrelevant. Yet
our industry is fatally focused on what is essentially vulnerability
masturbation.

I keep up with the Google Project Zero blog because I think it's
hilarious to see them fawn over bugs like they're actually hacking
with them.

"This is the perfect bug", "this exploit is beautiful", and many other
such paraphrases are rife in a lot of the Project Zero publications.

I suppose that's what happens when you spend a couple of million
dollars on tricking out a team of vulndev mercenaries, most of whom
were playing on the other side of the fence for many years before
stock options and bonus plans took precedence over actually hacking
(or facilitating such).

I'm sure there are some true believers at Google. Ben Hawkes, Chris
Evans, Tavis Ormandy. They are ride-or-die full disclosure zealots
(AFAIK), and I may not agree with them in principle, but I do
appreciate and even respect the strength of their conviction.

Having said that, if you gave me a billion dollars today, what
percentage of the Google security team could I employ tomorrow?

It's an interesting question, I think. From an adversarial
perspective, that is. Say, e.g., the NSA or whoever actually cared
about someone fixing "hundreds!" of bugs in desktop software, and the
real Internet wasn't a facsimile of an early-'90s LAN party. Say that
was the case.

If "they" got _real_ budget to buy out all the "top researchers" in
the industry, do you honestly think it wouldn't cripple Google's
effort overnight?

And that's essentially the crux of the problem. You can't fight
religious wars with mercenaries. You need martyrs. When your team is
for sale, it's very hard to align yourself with any sort of ethical,
moral, or even altruistic high ground.

And hey, again, not judging. It's a job for most. Myself included.
I'm 35, and I couldn't give less of a fuck about whether or not my
homies from whatever Scandinavian country are keeping down their roots
this week. Which, btw, I'm sure they are.

Anyhoo, back to the actual ranting. Ben Hawkes stated that "attack
research in the public domain" is the way forward for security.

The problem with that is that the majority of his team got skilled in
the non-public domain. Attack research doesn't get good in the public
domain; it gets good because it is used to, you know, attack. It has
to jump through hoops and quirks and work over sat hops and against
thousands of targets and do all sorts of weird things that would never
come up in a lab environment.

This whole modern game of public exploit vs. mitigation is a circle
jerk based on a seed that came from the dark, and people forget that.
A lot of people currently making their bones killing bugs for Google
(or whoever) got good because they spent time on teams doing actual
attack research for actual attacks. Hell, some of them are near and
dear friends of mine. I suppose it's the elephant in the room that no
one wants to talk about.

You got your ex-vupen, ex-teso, ex-adm, ex ... well, you get the idea.

Anyhoo, back to why we're all wrong and Spender is right.

At the end of the day my team, Google's team, and lots of people's
teams are rooted in a culture of vulnerability masturbation. We fawn
over "beautiful" bugs and OMGWOW primitives and can wax endlessly
about how we understand such-and-such allocator, to the point where
you could play a game of goddamn Minecraft with nothing but a heap
visualizer and your allocation/deallocation primitives. 30-page
dissertations on over-indexing an array, and hell, we'll even hold
court about it at whatevercon for 60 minutes ... autographs at the
bar.

And it's all bullshit. If you care about security, that is.
  138. "But to stop exploitation you have to understand it!". Sure. But here's
  139. an inconvenient truth. You are not going to stop exploitation. Ever.
  140.  
  141. You might stop my exploitation. You might stop my entire generation's
  142. exploitation. But somewhere the dark is seeding away methodologies you
  143. don't know about, and will never know about. Because somewhere hackers
  144. are hacking, and they've got shit to do. None of which includes telling
  145. you about it at blackhat or anywhere else.
  146.  
  147. That is empirically the truth.

So if you truly, deeply, honestly care about security, step away from
exploit development. All you're doing is ducking punches that you knew
were coming. It is moot. It is not going to stop anyone from getting
into anything; it's just closing off a singular route. One of many
that ultimately fall through to the proverbial 5-dollar wrench,
pending motivation, time, and available resources.

If someone _REALLY_ wants your shit, they can take a bat to your head
and take it. End of exploit.

I say this as someone who's made a career out of exploit development.
It's been my life for 20 years. But I make no mistake about it being a
labor of love, a function of an OCD-like addiction to solving puzzles.
And even though these days I spend most of my time filling out
spreadsheets, I still love me a good 30-page dissertation on
world-shattering font bugs ... even though, and trust me when I tell
you most of team Google damn well knows this, many people have sat on
the exact same dissertation for many years.

But if you care about systemic security, the kind where you don't give
two flying fucks if Bob's desktop gets clientsided or Jane's Android
doesn't know how big an mpeg4 frame oughta be, then you will stop
circle jerking over individual vulnerabilities and listen to what
Spender has been saying for years.

Which is: you don't chase and fix vulnerabilities; you design a system
around fundamentally stopping routes of impact. For Spender, that
means eradicating entire bug classes in his grsecurity project. For
network engineers, it means understanding each and every exfiltration
path on your network and segmenting accordingly.

Containment is the name of the game. Not prevention. The compromise is
inevitable and the routes are legion. It is going to happen.
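
The network-engineering half of that advice can be sketched in a few
lines: enumerate every flow you intend to allow, and default-deny the
rest. This is purely my illustration, not anything from the post; the
segment names, ports, and the egress_allowed helper are all invented
for the example.

```python
# Hypothetical sketch of "know every exfiltration path and segment
# accordingly": the allowed flows are enumerated as data, and anything
# not on the list is denied by default.
ALLOWED_FLOWS = {
    # (source segment, destination segment, destination port)
    ("app", "db", 5432),     # app servers may talk to the database
    ("app", "proxy", 3128),  # ... and to an audited outbound proxy
    ("mgmt", "app", 22),     # management segment may SSH into app
}

def egress_allowed(src: str, dst: str, port: int) -> bool:
    """Default-deny: a flow passes only if explicitly enumerated."""
    return (src, dst, port) in ALLOWED_FLOWS

# A compromised app server can still do its job ...
assert egress_allowed("app", "db", 5432)
# ... but has no direct route out, regardless of which bug got it popped.
assert not egress_allowed("app", "internet", 443)
```

The shape of the policy is the point: compromise of any one segment is
assumed, and the question shifts from "which bug got them in" to
"which routes out exist".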

Now as far as a way forward for YOUR security ... well, I play on team
offense. I'm allowed to fawn over vulnerabilities, and I think xdev
and art often intersect. Pretty polly payload, bro.

But if you're supposedly my adversary (i.e. on team defense) and yet
you're sitting right alongside me going "oooh" and "aaah" at whichever
software vulnerability, then you're probably in the wrong place, or
... I suppose ... maybe just in the wrong time.

Love,
Bas

--
PGP Pub Key: https://www.collarchoke.org/0xBED727DF.asc
Fingerprint: 5C1A 3641 8542 7DFA F871  441A 03B9 A274 BED7 27DF