We Are All Intelligence Officers Now
Dan Geer, 28 February 2014, RSA/San Francisco

Good morning.  Thank you for the invitation to speak with you today,
which, let me be clear, is me speaking for myself, not for anyone
or anything else.  As you know, I work the cyber security trade,
that is to say that my occupation is cyber security.  Note that I
said "occupation" rather than "profession."  Last September, the
U.S. National Academy of Sciences concluded that cyber security
should be seen as an occupation and not a profession because the
rate of change is simply too great to consider professionalization.[NAS]
You may well agree that that rate of change is paramount, and, if
so, you may also agree that cyber security is the most intellectually
demanding occupation on the planet.

The goal of the occupation called cyber security grows more demanding
with time, which I need tell no one here.  That growth is like a
river with many tributaries.  Part of the rising difficulty flows
from rising complexity, part of it from accelerating speed, and
part of it from the unanswered question of what exactly we would
do if this or that digital facility were to fail entirely -- which
is to say, from our increasing dependence on all things digital.
You are at risk when something you depend upon is at risk.  Risk
is, in other words, transitive.  If X is at risk and I depend on
X, then I, too, am at risk from whatever puts X at risk.  Risk is
almost like inheritance in a programming language.

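Since risk inherits along dependence, the inheritance can be made
mechanical.  A minimal sketch of that transitivity in Python, with
invented names throughout: dependence is a set of graph edges, and
anything reachable from an at-risk node is itself at risk.

    # Toy model of transitive risk: walk the dependence graph and
    # report every at-risk thing you transitively depend on.
    from collections import deque

    # depends_on[X] = the set of things X directly depends on
    depends_on = {
        "my_business": {"payment_gateway", "dns"},
        "payment_gateway": {"cloud_provider"},
        "dns": {"registrar"},
    }

    def inherited_risks(entity, at_risk):
        """Return all at-risk things `entity` transitively depends on."""
        seen, queue = set(), deque([entity])
        while queue:
            for dep in depends_on.get(queue.popleft(), ()):
                if dep not in seen:
                    seen.add(dep)
                    queue.append(dep)
        return seen & at_risk

    print(inherited_risks("my_business", {"cloud_provider", "registrar"}))
    # -> {'cloud_provider', 'registrar'}: their risk is my risk
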
I am particularly fond of the late Peter Bernstein's definition of
risk: "More things can happen than will."[PB]  I like that definition
not because it tells me what to do, but rather because it tells me
what comes with any new expansion of possibilities.  Put differently,
it tells me that with the new, the realm of the possible expands
and, as we know, when the realm of the possible expands, prediction
is somewhere between difficult and undoable.  The dynamic is that
we now regularly and quickly expand our dependence on new things,
and that added dependence matters because the way in which we each
and severally add risk to our portfolio is by depending on things
whose very newness makes their risk neither estimable nor, therefore,
manageable.

The Gordian Knot of such tradeoffs -- our tradeoffs -- is this: As
society becomes more technologic, even the mundane comes to depend
on distant digital perfection.  Our food pipeline contains less
than a week's supply, just to take one example, and that pipeline
depends on digital services for everything from GPS-driven tractors
to robot vegetable-sorting machinery to coast-to-coast logistics
to RFID-tagged livestock.  Is all the technologic dependency, and
the data that fuels it, making us more resilient or more fragile?

In the cybersecurity occupation, in which most of us here work, we
certainly seem to be getting better and better.  We have better
tools, we have better understood practices, and we have more and
better colleagues.  That's the plus side.  But from the point of
view of prediction, what matters is the ratio of skill to challenge;
as far as I can estimate, we are expanding the society-wide attack
surface faster than we are expanding our collection of tools,
practices, and colleagues.  If your society is growing more food,
that's great.  If your population is growing faster than your food
production can keep up, that's bad.  So it is with cyber risk
management: Whether in detection, control, or prevention, we are
notching personal bests, but all the while the opposition is setting
world records.  As with most decision making under uncertainty,
statistics have a role, particularly ratio statistics that magnify
trends so that the feedback from policy changes becomes clear more
quickly.  Yet statistics, of course, require data, to which I will
return in a moment.

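To illustrate why a ratio statistic is the right lens, a sketch in
Python with invented index numbers: both sides improve in absolute
terms, but the ratio shows the trend, and its direction, at a glance.

    # Hypothetical annual index numbers: both grow, which looks like
    # progress, but the ratio shows who is actually gaining ground.
    surface = [100, 130, 170, 225, 300]   # society-wide attack surface
    defense = [100, 115, 130, 150, 170]   # tools, practices, colleagues

    for year, (s, d) in enumerate(zip(surface, defense), start=2010):
        print(f"{year}: surface/defense = {s / d:.2f}")
    # Rises from 1.00 to 1.76: personal bests, but world records.
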
In medicine, we have well established rules about medical privacy.
Those rules are helpful; when you check into the hospital there is
a licensure-enforced, accountability-based, need-to-know regime
that governs the handling of your data.[PHI]  Most days, anyway.
But if you check in with Bubonic Plague or Typhus or Anthrax, you
will have zero privacy, as those are "reportable conditions"
variously mandated by public health law in all fifty States.  So
let me ask you, would it make sense, in a public-health-of-the-Internet
way, to have a mandatory reporting regime for cybersecurity failures?
Do you favor having to report cyber penetrations of your firm or
of your household to the government?  Should you face criminal
charges if you fail to make such a report?  Forty-eight States
vigorously penalize failure to report sexual molestation of
children.[SMC]  The (US) Computer Fraud and Abuse Act[CFAA] defines
a number of felonies related to computer penetrations, and the U.S.
Code says that it is a crime to fail to report a felony of which
you have knowledge.[USC]  Is cybersecurity event data the kind of
data around which you want to enforce mandatory reporting?  Forty-six
States require mandatory reporting of cyber failures in the form
of their data breach laws, while the Verizon Data Breach Investigations
Report[VDB] found, and the Index of Cyber Security[ICS] confirmed,
that 70-80% of data breaches are discovered by unrelated third
parties.  If you discover a data breach, do you have an ethical
obligation to report it?  Should the law mandate that you fulfill
such an obligation?

Almost everyone here has some form of ingress filtering in place
by whatever name -- firewall, intrusion detection, whitelisting,
and so forth and so on.  Some of you have egress filtering because
being in a botnet, that is to say being an accessory to crime, is
bad for business.  Suppose you discover that you are in a botnet;
do you have an obligation to report it?  Do you have an obligation
to report the traffic that led you to conclude that you had a
problem?  Do you even have an obligation to bother to look and, if
you don't have or want an obligation to bother to look, do you want
your government to require the ISPs to do your looking for you, to
notify you when your outbound traffic marks you as an accomplice
to crime, whether witting or unwitting?  Do you want to lay on the
ISPs the duty to guarantee a safe Internet?  They own the pipes,
and if you want clean pipes, then they are the ones to do it.  Does
deep packet inspection of your traffic by your ISP as a public
health measure have your support?  Would you want an ISP to deny
access to a host, which might be your host, that is doing something
bad on their networks?  Who gets to define what is "bad"?

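To make the egress side concrete, a toy filter in Python; the flow
records and the "known command-and-control" list are wholly invented
(documentation-range addresses), but this is the shape of the
observation that tells you that you are in a botnet.

    # Toy egress filter: flag outbound flows whose destination is
    # on a known-bad list.
    KNOWN_C2 = {"203.0.113.7", "198.51.100.23"}   # invented C2 list

    flows = [
        {"src": "10.0.0.5", "dst": "93.184.216.34", "port": 443},
        {"src": "10.0.0.9", "dst": "203.0.113.7",  "port": 6667},
    ]

    for f in flows:
        if f["dst"] in KNOWN_C2:
            print(f"ALERT: {f['src']} -> {f['dst']}:{f['port']} matches C2 list")
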
If you are saying to yourself, "This is beginning to sound like
surveillance" or something similar, then you're paying attention.
Every one of you who lives in a community that has a neighborhood
watch already has these kinds of decisions to make.  Let's say that
you are patrolling your street, alone, and there have been break-ins
lately, there have been thefts lately, there has been vandalism
lately.  You've lived there for ten years and been on that neighborhood
watch for five.  You are on duty and you see someone you've never
seen before crossing the street first from one side, then the other,
putting a hand on every garden gate.  What do you do?  Confront them
the way a polite neighbor would?  Challenge them the way a security
guard would?  Run home to lock your own doors and draw your drapes?
Resign from the neighborhood watch because you are really not ready
to do anything strenuous?

Returning to the digital sphere, we are increasing what it is that
can be observed, what is observable.  Instrumentation has never
been cheaper.  Computing to fiddle with what has been observed has
never been more available.  As someone who sees a lot of fresh
business plans, I can tell you that these days Step Six is never
"Then we build a data center."  Step Six, or whatever, is universally
now "Then we buy some cloud time and some advertising."  This means
that those to whom these outsourcing contracts go are in a position
to observe, and observe a lot.  Doubtless some of what they observe
will be problematic, whether on legal or moral grounds.  Should a
vendor of X-as-a-Service be obliged to observe what their customers
are doing?  And if they are obliged to observe, should they be
obliged to act on what they observe, be that to report, to deploy
countermeasures, or both?

As what is observable expands so, naturally, does what has been
observed.  Dave Aitel says "There's no reason a company in this day
and age can't have their own Splunk or ElasticSearch engine that
allows them to search and sort a complete history of every program
anyone in the company has ever executed."[DA]  Sometime in the last
five to ten years we passed the point on the curve where it became
much cheaper to keep everything than to do selective deletion.  When
you read the Federal Rules of Civil Procedure with respect to
so-called e-discovery, you can certainly conclude that total retention
of observed data is a prudent legal strategy.  What is less clear
is whether you have a duty to observe given that you have the
capacity to do so.  All of which also applies to what others can
observe about you.

This is not, however, about you personally.  Even Julian Assange,
in his book _Cypherpunks_, said "Individual targeting is not the
threat."  It is about a culture where personal data is increasingly
public data, assembled en masse.  All we have to go on now is the
hopeful phrase "a reasonable expectation of privacy," but what is
reasonable when one-inch block letters can be read from orbit?
What is reasonable when all of your financial or medical life is
digitized and available primarily over the Internet?  Do you want
ISPs to retain e-mails in which you are asking your doctor a medical
question (or, for that matter, do you want those e-mails to become
part of your Electronic Health Record)?  Who owns your medical data
anyway?  Until the 1970s, it was the patient, but regulations then
made it the provider.  With an Electronic Health Record, it is
likely to revert to patient ownership, but if the EHR belongs to
you, do you get to surveil the use that is made of it by medical
providers and by those to whom they recursively outsource?  And if
not, why not?

Observability is fast extending to devices.  Some of it has already
appeared, such as the fact that any newish car is broadcasting four
unique radio IDs, one from the pressure sensor in each tire's valve
stem.  Some of it is in a daily progression, such as training our
youngsters to accept surveillance by stuffing a locator beacon in
their backpack as soon as they go off to Kindergarten.  Some of it
is newly technologic, like through-the-wall imaging, and some of it
is simply that we are now surrounded by cameras we can't even see,
where no one camera is important but, in the aggregate, their fused
data is.  Anything, and I mean anything, that has "wireless" in its
name creates the certainty of traffic analysis.

As an example relevant to rooms such as this, you should assume
that all public facilities will soon convert their lighting fixtures
to LEDs, LEDs that are not just lights but also have an embedded,
chip-based operating system, a camera, sensors for CO/CO2/pollutant
emissions, seismic activity, humidity & UV radiation, a microphone,
wifi and/or cellular interfaces, an extensible API, an IPv4 or v6
address per LED, a capacity for disconnected "decision making on
the pole," cloud-based remote management, and, of course, bragging
rights for how green you are, which you can then monetize in the
form of tax credits.[S]  I ask again, do you or we or they have a
duty to observe now that we have an ability to do so?  It is, as
you know, a long-established norm for authorities to seize the video
stored in surveillance cameras, whether the issue at hand is a smash
and grab or the collapse of an Interstate highway bridge.[M]  What
does that mean when data retention is permanent and recording devices
are omnipresent?  Does that make you the observed or the observer?
Do we have an answer to "Who watches the watchmen?"[J]

By now it is obvious that we humans can design systems more complex
than we can then operate.  The financial sector's "flash crashes"
are the most recent proof-by-demonstration of that claim; it would
hardly surprise anyone were the fifty interlocked insurance exchanges
for Obamacare to soon be another.  Above some threshold of system
complexity, it is no longer possible to test; it is only possible
to react to emergent behavior.  Even the lowliest Internet user is
involved -- one web page can easily touch scores of different
domains.  As I was writing this, the top-level page at cnn.com had
400 out-references to 85 unique domains, each of which is likely to
be similarly constructed and all of which move data one way or
another.  If you leave those pages up, then because many such pages
have an auto-refresh, moving to a new subnet signals to every one
of the advertising networks that you have done so.  How is this
different from having a surveillance camera in the entry vestibule
of your home?

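That out-reference count is a measurement anyone can re-run.  A
minimal sketch in Python; point it at any page you like, and expect
numbers different from those above, since pages change daily and
some sites refuse plain scripted fetches.

    # Count the distinct domains referenced by one page's HTML.
    import re
    import urllib.request
    from urllib.parse import urlparse

    url = "https://www.cnn.com/"
    html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "replace")

    refs = re.findall(r'(?:href|src)\s*=\s*["\'](https?://[^"\']+)', html)
    domains = {urlparse(r).netloc for r in refs}

    print(f"{len(refs)} out-references to {len(domains)} unique domains")
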
We know, and have known for some time, that traffic analysis is
more powerful than content analysis.  If I know everything about
with whom you communicate -- when, where, with what inter-message
latency, in what order, at what length, and by what protocol -- then
I know you.  If all I have is the undated, unaddressed text of your
messages, then I am an archaeologist, not a case officer.  The
soothing mendacity of proxies for the President saying "It's only
metadata" relies on the ignorance of the listener.  Surely no one
here is convinced by "It's only metadata," but let me be clear: you
are providing that metadata and, in the evolving definition of the
word "public," there is no fault in its being observed and retained
indefinitely.  Harvard Law professor Jonathan Zittrain famously
noted that if you preferentially use online services that are free,
"You are not the customer, you're the product."  Why?  Because what
is observable is observed, what is observed is sold, and users are
always observable, even when they are anonymous.

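A toy demonstration that metadata alone suffices, with entirely
invented traffic: hide all content, and two communicants can still
be matched purely on whom they contact and when.

    # Match an unlabeled trace against a known user's habits using
    # only (recipient, hour-of-day) pairs -- no content at all.
    from collections import Counter

    alice   = [("bob", 9), ("bob", 9), ("carol", 13), ("bob", 22)]
    mystery = [("bob", 9), ("carol", 13), ("bob", 21)]

    def profile(events):
        """Normalize (recipient, hour) counts into a frequency profile."""
        c = Counter(events)
        total = sum(c.values())
        return {k: v / total for k, v in c.items()}

    def similarity(p, q):
        """Overlap between two profiles (1.0 means identical habits)."""
        return sum(min(p.get(k, 0), q.get(k, 0)) for k in set(p) | set(q))

    print(f"match score: {similarity(profile(alice), profile(mystery)):.2f}")
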
Let me be clear, this is not an attack on the business of intelligence.
The Intelligence Community is operating under the rules it knows,
most of which you, too, know, and the goal states it has been tasked
to achieve.  The center of gravity for policy is that of goal states,
not methods.

Throughout the 1990s, the commercial sector essentially caught up
with the intelligence sector in the application of cryptography --
not the creation of cyphers, but their use.  (Intelligence needs
new cyphers on a regular basis, whereas commercial entities would
rather not have to roll their cypher suites at all, much less
regularly.)  In like manner, commercial firms are today fast catching
up with the intelligence sector in traffic analysis.  The marketing
world is leading the way because its form of traffic analysis is
behavior-aware and full of data fusion innovation -- everything
from Amazon's "people who bought this later bought that," to 1-meter
accuracy on where you are in the shopping mall so that advertisements
and coupons can appear on your smartphone for the very store whose
window you are looking in, to combining location awareness with
what your car and your bedroom thermostat had to say about you this
morning.  More relevant to this audience, every cutting-edge data
protection scheme now has some kind of behavioral component, which
simply means collecting enough data on what is happening that
subsequently highlighting anomalies has a false positive rate low
enough to be worth following up.

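A minimal sketch of such a behavioral component, in Python with
invented counts: learn a baseline, flag what deviates from it, and
note that the threshold is precisely the knob that trades detection
against the false positive rate just mentioned.

    # Baseline-and-deviate anomaly detection on daily event counts.
    import statistics

    baseline = [42, 38, 45, 40, 41, 39, 44, 43, 40, 42]  # invented
    mu = statistics.mean(baseline)
    sigma = statistics.stdev(baseline)

    def is_anomalous(count, k=3.0):
        """Flag counts more than k standard deviations from baseline."""
        return abs(count - mu) > k * sigma

    for today in (41, 97):
        print(today, "anomalous" if is_anomalous(today) else "normal")
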
If you decide, in some broad sense, to opt out, you will find that
it is not simple.  Speaking personally, I choose not to share
CallerID data automatically by default.  Amusingly, when members
of my friends and family get calls from an unknown caller, they
assume it is me, because I am the only person they know who does
this.  I can't imagine a better illustration of how, in a linear
equation, there are only N-1 degrees of freedom.  Along those same
lines, I've only owned one camera in my life, and it was a film
camera.  Ergo, I've never uploaded any photos that I took.  That
doesn't mean that there are no digital photos of me out there.
There are 3+ billion new photos online each month, so even if you've
never uploaded photos of yourself, someone else has.  And tagged
them.  In other words, you can personally opt out, but that doesn't
mean that other folks around you haven't effectively countermanded
your intent.

In short, we are becoming a society of informants.  In short, I
have nowhere to hide from you.

As I said before and will now say again, the controlling factor,
the root cause, of risk is dependence, particularly dependence on
the expectation of stable system state.  Yet the more technologic
the society becomes, the greater the dynamic range of possible
failures.  When you live in a cave, starvation, predators, disease,
and lightning are about the full range of failures that end life
as you know it, and you are well familiar with each of them.  When
you live in a technologic society where everybody and everything
is optimized in some way akin to just-in-time delivery, the dynamic
range of failures is incomprehensibly larger and largely
incomprehensible.  The wider the dynamic range of failure, the more
prevention is the watchword.  Cadres of people charged with defending
masses of other people must focus on prevention, and prevention is
all about proving negatives.  Therefore, and inescapably so, there
is only one conclusion: as technologic society grows more
interconnected, it becomes more interdependent within itself, and
the more interdependent it becomes, the more it must rely on
prediction based on data collected in broad ways, not in targeted
ways.  That is surveillance.  That is intelligence practiced not
by intelligence agencies but by anyone or anything with a sensor
network.

Spoken of in this manner, official intelligence agencies that hoover
up everything are simply obeying the Presidential Directive that
"Never again" comes true.  And the more complex the society they
are charged with protecting becomes, the more they must surveil,
the more they must analyze, the more data fusion becomes their only
focus.  In that, there is no operational difference between government
acquisition of observable data and private sector acquisition of
observable data, beyond the minor detail of consent.

David Brin was the first to suggest that if you lose control over
what data can be collected on you, the only freedom-preserving
alternative is that everyone else does, too.[DB1]  If the government
or the corporation or your neighbor can surveil you without asking,
then the balance of power is preserved when you can surveil them
without asking.  Bruce Schneier countered that preserving the balance
of power doesn't mean much if the effect of new information is
non-linear, that is to say, if new information is the exponent in
an equation, not one more term in a linear sum.[DB2]  Settling
that debate requires that you have a strong opinion on what data
fusion means operationally to you, to others, to society.  If,
indeed, as Schneier suggested, the power of data fusion is an
equation where new data items are exponents, then the entity that
can amass data that is bigger by a little will win the field by a
lot.  That small advantages can have big outcome effects is exactly
what fuels this or any other arms race.

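A worked toy comparison of the two readings, in Python; the growth
laws are illustrative assumptions, not measurements of anything real.

    # If fused value grows linearly, a small data advantage stays
    # small; if each new item compounds (here, doubles) the value,
    # a small advantage wins the field by a lot.
    def linear_value(n_items):
        return n_items            # value ~ a linear sum of items

    def nonlinear_value(n_items):
        return 2 ** n_items       # each item compounds with the rest

    a, b = 40, 44                 # B holds only 10% more data than A
    print("linear advantage:    ", linear_value(b) / linear_value(a))
    print("non-linear advantage:", nonlinear_value(b) / nonlinear_value(a))
    # 1.1x versus 16x for the same 10% edge in data.
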
Contradicting what I said earlier, there may actually be a difference
between the public and the private sector because the private sector
will collect data only so long as increased collection can be
monetized, whereas government will collect data only so long as
increased collection can be stored.  With storage prices falling
faster than Moore's Law, government's stopping rule may thus never
be triggered.

In the Wikipedia article about Brin, there is this sentence: "It
will be tempting to pass laws that restrict the power of surveillance
to authorities, entrusting them to protect our privacy -- or a
comforting illusion thereof."[W]  I agree with one of the possible
readings of that sentence, namely that it is "tempting" in the sense
of being delusional.  Demonstrating exactly the kind of good
intentions with which the road to Hell is paved, we have codified
rules that permit our lawmakers zero privacy; we give them zero
ability to have a private moment or to speak to others without
quotation, without attribution, without their game face on.  In the
evolutionary sense of the word "select," we select for people who
are without expectation of authentic privacy or who jettisoned it
long before they stood for office.  Looking in their direction for
salvation is absurd.  And delusional.

I am, however, hardly arguing that "you" are powerless or that
"they" have taken all control.  It is categorically true that
technology is today far more democratically available than it was
yesterday and less than it will be tomorrow.  3D printing, the whole
"maker" community, DIY biology, micro-drones, search, constant
contact with whomever you choose to be in constant contact with --
these are all examples of democratizing technology.  This is perhaps
our last fundamental tradeoff before the Singularity occurs: Do we,
as a society, want the comfort and convenience of increasingly
technologic, invisible digital integration enough to pay for those
benefits with the liberties that must be given up to be protected
from the downsides of that integration?  If risk is that more things
can happen than will, then what is the ratio of things that can now
happen that are good to things that can now happen that are bad?
Is the good fraction growing faster than the bad fraction, or the
other way around?  Is there a threshold of interdependence beyond
which either good or bad overwhelmingly dominates?

We are all data collectors, data keepers, data analysts.  Some
citizens do it explicitly; some citizens have it done for them by
robots.  To be clear, we are not just a society of informants, we
are becoming an intelligence community of a second sort.  Some of
it is almost surely innocuous, like festooning a house with wireless
sensors for home automation purposes.  Some of it is driven by cost
effectiveness, like measuring photosynthesis in a corn field by
flying an array of measurement devices over it on a drone.  I could
go on, and so could you, because in a very real sense I am telling
you nothing you don't already know.  Everyone in this and other
audiences knows everything that I have to say, even if they weren't
aware that they knew it.

The question is, why is this so?  Is this majority rule, the
intelligence function being one the majority very much wants done
to themselves and to others?  Is this a question of speed and
complexity, such that citizen decision making is crippled not because
facts are hidden but because compound facts are too hard to
understand?  Is this a question of the kind of wishful thinking
that can't tell the difference between a utopian fantasy, a social
justice movement, and a business opportunity?  Is this nowhere near
as big a deal as I think it is, because every day that goes by
without a cascade failure only adds evidence that such possibilities
are becoming ever less likely?  Is the admonition to "Take care of
yourself" the core of a future where the guarantee of a good outcome
for all is the very fact that no one can hide?  Is Nassim Taleb's
idea that we are easily fooled by randomness[TF] at play here, too?
If the level of observability to which you are subject is an asset
to you, then what is your hedge against that asset?

This is not a Chicken Little talk; it is an attempt to preserve,
if not make, a choice while choice is still relevant.  As The
Economist in its January 18 issue so clearly lays out,[TE] we are
ever more a service economy, but every time an existing service
disappears into the cloud, our vulnerability to its absence increases,
as does the probability of monopoly power.  Every time we ask the
government to provide goodnesses that can only be done with more
data, we are asking government to collect more data.

Let me ask a yesterday question: How do you feel about traffic jam
detection based on the handoff rate between cell towers of the
cell phones in use in cars on the road?  Let me ask a today question:
How do you feel about auto insurance that is priced from a daily
readout of your automobile's black box?  Let me ask a tomorrow
question: In what calendar year will compulsory auto insurance be
more expensive for the driver who insists on driving their car
themselves rather than letting a robot do it?  How do you feel about
public health surveillance done by requiring Google and Bing to
report on searches for cold remedies and the like?  How do you feel
about a Smart Grid that reduces your power costs and greens the
atmosphere but reports minute-by-minute what is on and what is off
in your home?  Have you, or would you, install that toilet that
does a urinalysis with every use and forwards the results to your
clinician?

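Why minute-by-minute metering is so revealing deserves one concrete
illustration.  A toy sketch in Python, with rough, invented wattage
signatures: individual appliances betray themselves as step changes
in the household load.

    # Infer appliance on/off events from per-minute meter readings.
    SIGNATURES = {1500: "kettle", 800: "microwave", 150: "refrigerator"}

    readings = [200, 200, 1700, 1700, 200, 1000, 200]  # watts/minute

    for t in range(1, len(readings)):
        delta = readings[t] - readings[t - 1]
        if abs(delta) in SIGNATURES:
            state = "ON" if delta > 0 else "OFF"
            print(f"minute {t}: {SIGNATURES[abs(delta)]} switched {state}")
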
How do you feel about using standoff biometrics as a solution to
authentication?  At this moment in time, facial recognition is
possible at 500 meters, iris recognition is possible at 50 meters,
and heartbeat recognition is possible at 5 meters.  Your dog can
identify you by smell; so, too, can an electronic dog's nose.  Your
cell phone's accelerometer is plenty sensitive enough to identify
you by gait analysis.  The list goes on.  All of these are data
dependent, cheap, convenient, and none of them reveal anything that
is a secret as we currently understand the term "secret" -- yet the
sum of them is greater than the parts.  A lot greater.  It might
even be a polynomial, as Schneier suggested.  Time will tell, but
by then the game will be over.

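To see how little it takes, a toy gait matcher in Python; the
enrolled profiles and the trace are invented, and real systems use
far richer features, but the identifying power comes from the same
place: reduce an accelerometer trace to a crude step period and
swing amplitude, then pick the nearest enrolled profile.

    import math

    # Enrolled gait profiles: (step period in seconds, swing amplitude)
    enrolled = {"alice": (0.52, 1.9), "bob": (0.61, 2.4)}

    def features(samples, hz=50):
        """Crude features: step period via zero crossings, plus range."""
        mean = sum(samples) / len(samples)
        crossings = sum(1 for a, b in zip(samples, samples[1:])
                        if (a - mean) * (b - mean) < 0)
        period = 2 * len(samples) / hz / max(crossings, 1)
        return period, max(samples) - min(samples)

    def identify(samples):
        f = features(samples)
        return min(enrolled, key=lambda name: math.dist(enrolled[name], f))

    trace = [0.95 * math.sin(2 * math.pi * t / 26) for t in range(200)]
    print(identify(trace))   # -> alice (0.52 s steps, 1.9 swing)
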
Harvard Business School Prof. Shoshana Zuboff has had much to say
on these topics since the 1980s, especially her Three Laws:[ZS]

.  Everything that can be automated will be automated

.  Everything that can be informated will be informated

.  Every digital application that can be used for surveillance and
   control will be used for surveillance and control

I think she is right, but the implication that this is all outside
the control of the citizen is not yet true.  It may get to be true,
but in so many words that is why I am standing here.  There are a
million choices the individual person, or for that matter the
free-standing enterprise, can make, and I do not just mean converting
all your browsing over to Tor.

Take something mundane like e-mail:  One might suggest never sending
the same message twice.  Why?  Because sending it twice, even if
encrypted, allows a kind of analysis by correlation that cannot
otherwise happen.  Maybe that's too paranoid, so let's back off a
little.  One might suggest that the individual or the enterprise
that outsources its e-mail to a third party thereby creates, by
itself and for itself, the risk of silent subpoenas delivered to
its outsourcer.  If, instead, the individual or the enterprise
insources its e-mail, then at the very least it knows when its data
assets are being sought, because the subpoena comes to it.  Maybe
insourcing your e-mail is too much work, but need I remind you that
plaintext e-mail cannot be web-bugged, so why would anyone ever
render HTML e-mail at all?

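A sketch of correlation-by-repetition, in Python with invented
intercepts: an observer who sees only ciphertext sizes and times,
never any plaintext, can still notice that the same message went
to two places.

    # Correlate intercepts on size and near-identical timing alone.
    intercepts = [
        {"to": "mailhost-a", "t": 100.0, "size": 3172},
        {"to": "mailhost-b", "t": 100.4, "size": 3172},
        {"to": "mailhost-c", "t": 512.9, "size": 1044},
    ]

    for i, x in enumerate(intercepts):
        for y in intercepts[i + 1:]:
            if x["size"] == y["size"] and abs(x["t"] - y["t"]) < 5.0:
                print(f"likely same message: {x['to']} and {y['to']}")
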
Take software updates:  There is a valid argument to make software
auto-update the norm.  As always, a push model has to know where
to push.  On the other hand, a pull model must be invoked by the
end user.  Both models generate information for somebody, but a
pull model leaves the time and place decisions to the end user.

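A minimal pull-model updater might look like the sketch below (the
URL and version scheme are hypothetical stand-ins): nothing happens,
and nothing is disclosed to anyone, until the end user chooses when
and from where to ask.

    import json
    import urllib.request

    VERSION_URL = "https://updates.example.com/myapp/latest.json"  # hypothetical
    INSTALLED = "1.4.2"

    def check_for_update():
        """Ask the chosen endpoint, at a time of the user's choosing."""
        with urllib.request.urlopen(VERSION_URL, timeout=10) as r:
            latest = json.load(r)["version"]
        print("update available:" if latest != INSTALLED else "up to date:",
              latest)

    # Invoked only when the end user says so, e.g. from a cron job
    # the user scheduled: check_for_update()
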
Take cybersecurity technology:  I've become convinced that all of
it is dual use.  While I am not sure whether dual use is a trend
or a realization of an unchanging fact of nature, the obviousness
of dual use seems greatest in the latest technologies, so I am
calling it a trend in the sense that the ready accessibility of
the dual-use characteristics of new technology is itself growing.
Leading cybersecurity products promise total surveillance over the
enterprise and are, to my mind, offensive strategies used for
defensive purposes.  A fair number of those products not only watch
your machine, but take just about everything that is going on at
your end and copy it to their end.  The argument for doing so is
well thought out -- by combining observational data from a lot of
places, the probability of detection can be raised and the latency
of countermeasure can be reduced.  Of course, there is no reason
such systems couldn't be looking for patterns of content in
human-readable documents just as easily as looking for patterns of
content in machine-readable documents.

Take communications technology:  Whether we are talking about
triangulating the smartphone using the cell towers, geocoding the
Internet, or forwarding the GPS coordinates from onboard equipment
to external services like OnStar, everyone knows that there is a
whole lot of location tracking going on.  What can you do to opt
out of that?  That is not so easy, because now we are talking not
about a mode of operation, like whether to insource or outsource
your e-mail, but about a real opt-in versus opt-out decision: do
you accept the tracking, or do you refuse the service?  Paraphrasing
Zittrain's remark about being a customer or being a product, the
greater the market penetration of mobile communications, the more
the individual is either a data source or a suspect.

Take wearable computing:  Google Glass is only the most famous.
There've been people working on such things for a long time now.
Folks who are outfitted with wearable computing are pretty much
identifiable today, but this brief instant will soon pass.  You
will be under passive surveillance by your peers and contacts or,
to be personal, some of you will be surveilling me because you will
be adopters of this kind of technology.  I would prefer you didn't.
I am in favor neither of cyborgs nor chimeras; I consider our place
in the natural world too great a gift to mock in those ways.

When it comes to ranking programs for how well they can observe
their surroundings and act on what they see without further
instructions, Stuxnet is the reigning world heavyweight champion.
Unless there is something better already out there.  Putting aside
the business of wrecking centrifuges, just consider the observational
part.  Look at other malware that seems to have a shopping list
that isn't composed of filenames or keywords but is instead an
algorithm for rank-ordering what to look for, exfiltrating documents
in priority order.  As with other democratizations of technology,
what happens when that kind of improvisation, that kind of adaptation,
can be automated?  What happens when such things can be scripted?

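The shape of that algorithm is worth seeing, so here is a deliberately
benign sketch of it in Python, with invented weights and files,
framed for the defender deciding what to protect first: score
documents by a weight function rather than a fixed shopping list,
then handle them in priority order.

    # Rank documents by a scoring function instead of a fixed list.
    WEIGHTS = {"merger": 10, "password": 8, "design": 5, "lunch": -3}

    corpus = {
        "q3_menu.txt": "lunch lunch lunch",
        "acq_plan.txt": "merger design merger",
        "wiki_dump.txt": "design design",
    }

    def score(text):
        return sum(WEIGHTS.get(word, 0) for word in text.split())

    for name in sorted(corpus, key=lambda n: score(corpus[n]), reverse=True):
        print(f"{score(corpus[name]):>4}  {name}")
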
For those with less gray hair, once upon a time a firewall was
something that created a corporate perimeter.  Then it was something
that created a perimeter around a department.  Then around a given
computer.  Then around a given datum.  In the natural world,
perimeters shrink as risk grows -- think of a circle of wildebeest
with their horns pointed outward, the calves on the inside, and the
hyenas closing in.  So it has been with perimeters in the digital
space, a steady shrinking of the defensible perimeter down to the
individual datum.

There are so many technologies now that power observation and
identification of the individual at a distance.  They may not yet
be in your pocket or on your dashboard or embedded in all your smoke
detectors, but that is only a matter of time.  Your digital exhaust
is unique, hence it identifies.  Pooling everyone's digital exhaust
also characterizes how you differ from normal.  Suppose observed
data does kill both privacy as impossible-to-observe and privacy
as impossible-to-identify; what, then, might be an alternative?  If
you are an optimist or an apparatchik, then your answer will tend
toward rules of procedure administered by a government you trust
or control.  If you are a pessimist or a hacker/maker, then your
answer will tend toward the operational, and your definition of a
state of privacy will be my definition: the effective capacity to
misrepresent yourself.

Misrepresentation is using disinformation to frustrate data fusion
on the part of whomever it is that is watching you.  Some of it can
be low-tech, such as misrepresentation by paying your therapist in
cash under an assumed name.  Misrepresentation means arming yourself
not at Walmart but in living rooms.  Misrepresentation means swapping
affinity cards at random with like-minded folks.  Misrepresentation
means keeping an inventory of misconfigured webservers to proxy
through.  Misrepresentation means putting a motor-generator between
you and the Smart Grid.  Misrepresentation means using Tor for no
reason at all.  Misrepresentation means hiding in plain sight when
there is nowhere else to hide.  Misrepresentation means having not
one digital identity that you cherish, burnish, and protect, but
having as many as you can.  Your identity is not a question unless
you work to make it one.  Lest you think that this is a problem
statement for the random paranoid individual alone, let me tell you
that in the big-I Intelligence trade, crafting good cover is getting
harder and harder, and for the same reasons: misrepresentation is
getting harder and harder.  If I were running field operations, I
would not try to fabricate a complete digital identity; I'd "borrow"
the identity of someone who had the characteristics that I needed
for the case at hand.

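One operational reading of "using Tor for no reason at all" is chaff:
decoy activity that buries real interest in noise, in the spirit of
tools like TrackMeNot.  A sketch in Python; the decoy list and the
search endpoint are illustrative stand-ins only.

    import random
    import time
    import urllib.parse
    import urllib.request

    DECOYS = ["weather radar", "banana bread recipe", "used pickup trucks",
              "spanish verb conjugation", "local movie times"]

    def emit_chaff(rounds=3):
        """Issue decoy queries at human-ish random intervals."""
        for _ in range(rounds):
            q = urllib.parse.quote(random.choice(DECOYS))
            url = f"https://www.example.com/search?q={q}"  # stand-in
            try:
                urllib.request.urlopen(url, timeout=5)
            except OSError:
                pass                    # chaff is best-effort by design
            time.sleep(random.uniform(1, 30))

    emit_chaff()
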
The Obama administration's issuance of a National Strategy for
Trusted Identities in Cyberspace[NS] is a case in point; it "calls
for the development of interoperable technology standards and
policies -- an 'Identity Ecosystem' -- where individuals, organizations,
and underlying infrastructure -- such as routers and servers -- can
be authoritatively authenticated."  If you can trust a digital
identity, that is because it can't be faked.  Why does the government
care about this?  It cares because it wants to digitally deliver
government services and it wants attribution.  Is having a non-fake-able
digital identity for government services worth the registration of
your remaining secrets with that government?  Is there any real
difference between a system that permits easy, secure, identity-based
services and a surveillance system?  Do you trust those who hold
surveillance data on you over the long haul, by which I mean the
indefinite retention of transactional data between government
services and you, the individual required to proffer a non-fake-able
identity to engage in those transactions?  Assuming this spreads
well beyond the public sector, which is its designers' intent, do
you want this everywhere?  If you are building authentication systems
today, then you are already playing ball in this league.  If you
are using authentication systems today, then you are subject to the
pending design decisions of people who are themselves playing ball
in this league.

And how can you tell whether the code you are running is collecting
on you or, for that matter, whether it is collecting on somebody
else?  If your life is lived inside the digital envelope, how do
you know that this isn't The Matrix or The Truman Show?  Code is
certainly getting bigger and bigger.  A nameless colleague who does
world-class static analysis said that he "regularly sees apps that
are over 2 GB of code" and sees "functions with over 16K variables."
As he observes, functions like that are machine written.  If the
code is machine written, does anyone know what's in it?  The answer
is "of course not," and even if they did, malware techniques such
as return-oriented programming can add features after the
whitelist-mediated application launch.  But I'm not talking here
about malware; I am talking about code that you run that you meant
to run and which, in one way or another, is instrumented to record
what you do with it.  Nancy Pelosi's famous remark[NP] about her
miserable, thousand-page piece of legislation, "We have to pass the
bill so that you can find out what is in it," can be just as easily
applied to code: it has become "We have to run the code so that you
can find out what is in it."

That is not going to change; small may be beautiful but big is
inevitable.[BI]  A colleague notes that, with the cloud, all pretense
of trying to keep programs small and economical has gone out the
window -- just link to everything because it doesn't matter if you
make even one call to a huge library since the Elastic Cloud (or
whatever) charges you no penalty for bloat.  As such, it is likely
that any weird machine[SB] within the bloated program is ever more
robust.

It was Mitja Kolsek who made me aware of just how much the client
has become the server's server.  Take Javascript, which is to say
servers sending clients programs to execute; the HTTP Archive says
that the average web page now makes out-references to 16 different
domains as well as making 17 Javascript requests per page, and the
Javascript byte count is five times the HTML byte count.[HT]  A lot
of that Javascript is about analytics, which is to say surveillance
of the user experience (and we're not even talking about the Bitcoin
mining, done in Javascript, that you can embed in your website[BJ]).

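Those per-page averages are another measurement anyone can re-run.
A sketch in Python; use whatever page you like, and expect numbers
different from the averages quoted above.

    # Count a page's script requests and compare Javascript bytes
    # fetched to HTML bytes fetched.
    import re
    import urllib.request
    from urllib.parse import urljoin, urlparse

    page = "https://www.example.com/"   # substitute any page
    html = urllib.request.urlopen(page, timeout=10).read()
    srcs = re.findall(r'<script[^>]+src\s*=\s*["\']([^"\']+)',
                      html.decode("utf-8", "replace"))

    js_bytes, domains = 0, set()
    for src in srcs:
        full = urljoin(page, src)
        domains.add(urlparse(full).netloc)
        try:
            js_bytes += len(urllib.request.urlopen(full, timeout=10).read())
        except OSError:
            pass

    print(f"{len(srcs)} script requests from {len(domains)} domains")
    print(f"Javascript/HTML byte ratio: {js_bytes / len(html):.1f}")
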
So suppose everybody is both giving and getting surveillance, both
being surveilled and doing surveillance.  Does that make you an
intelligence agent?  A spreading of technology from the few to the
many is just the way the world works.  There are a hundred different
articles, from high-brow to low-, showing that the interval between
market introduction and widespread adoption of technology has gotten
shorter as technology has gotten more advanced.  That means that
technologies that were available only to the few become available
to the many in a shorter timeframe, i.e., that any given technology
advantage the few have has a shorter shelf-life.  That would mean
that the technologies that only national laboratories had fifteen
years ago might be present among us soon, in the spirit of William
Gibson's famous remark that the future is already present, just
unevenly distributed.  Or maybe it is only ten years now.  Maybe
the youngest of you in this room will end up in a world where what
a national lab has today is something you can look forward to having
in only five years' time.  Regardless of whether the time constant
is five or ten or even fifteen years, this is far, far faster than
any natural mixing will arrange for even distribution across all
people.  The disparities of knowledge that beget power will each
be shorter-lived in their respective particulars, but will form a
much steeper curve in the aggregate.

Richard Clarke's novel _Breakpoint_ centered on the observation
that with fast enough advances in genetic engineering, not only will
the elite think that they are better than the rest, they will be.[RC]
I suggest that with fast enough advances in surveillance and the
inferences to be drawn from surveillance, a different elite will
not just think that it knows better, it will know better.  Those
advances come both from Moore's law and from Zuboff's laws, but
more importantly they rest upon the extraordinarily popular delusion
that you can have freedom, security, and convenience when, at best,
you can have two out of three.

At the same time, it is said that the rightful role of government
is to hold a monopoly on the use of force.  Is it possible that in
a fully digital world it will come to pass that everyone can see
what once only a Director of National Intelligence could see?  Might
a monopoly of force resting solely with government become harder
to maintain as the technology that bulwarks such a monopoly becomes
democratized ever faster?  Might reserving force to government
become itself an anachronism?  That is almost surely not something
to hope for, even for those of us who agree with Thomas Jefferson
that the government that governs best is the government that governs
least.  If knowledge is power, then increasing the store of knowledge
must increase the store of power; increasing the rate of knowledge
acquisition must increase the rate of power growth.  All power tends
to corrupt, and absolute power corrupts absolutely,[LA] so sending
vast amounts of knowledge upstream will corrupt absolutely, regardless
of whether the data sources are reimbursed with some pittance of
convenience.  Every tax system in the world has proven this time
and again with money.  We are about to prove it again with data,
which has become a better store of value than fiat currency in any
case.

Again, that power has to go somewhere.  If you are part of the
surveillance fabric, then you are part of creating that power, some
of which is reflected back on you as conveniences that actually
double as a form of control.  Very nearly everyone at this conference
is explicitly and voluntarily part of the surveillance fabric because
it comes with the tools you use, with what Steve Jobs would call
your digital life.  With enough instrumentation carried by those
who opt in, the person who opts out hasn't really opted out.  If
what those of you who opt in get for your role in the surveillance
fabric is "security," then you had better be damnably sure that
when you say "security" you all have close agreement on precisely
what you mean by that term.

And this is as good a place as any to pass on Joel Brenner's
insight:[JB]

   During the Cold War, our enemies were few and we knew who they
   were.  The technologies used by Soviet military and intelligence
   agencies were invented by those agencies.  Today, our adversaries
   are less awesomely powerful than the Soviet Union, but they are
   many and often hidden.  That means we must find them before we
   can listen to them.  Equally important, virtually every government
   on Earth, including our own, has abandoned the practice of relying
   on government-developed technologies.  Instead they rely on
   commercial off-the-shelf, or COTS, technologies.  They do it
   because no government can compete with the head-spinning advances
   emerging from the private sector, and no government can afford
   to try.  When NSA wanted to collect intelligence on the Soviet
   government and military, the agency had to steal or break the
   encryption used by them and nobody else.  The migration to COTS
   changed that.  If NSA now wants to collect against a foreign
   general's or terrorist's communications, it must break the same
   encryption you and I use on our own devices...  That's why NSA
   would want to break the encryption used on every one of those
   media.  If it couldn't, any terrorist in Chicago, Kabul, or
   Cologne would simply use a Blackberry or send messages on Yahoo!
   But therein lies a policy dilemma, because NSA could decrypt
   almost any private conversation.  The distinction between
   capabilities and actual practices is more critical than ever...
   Like it or not, the dilemma can be resolved only through oversight
   mechanisms that are publicly understood and trusted -- but are
   not themselves ... transparent.

At the same time, for-profit and not-for-profit entities are
collecting on each other.  They have to, even though private
intelligence doubtless leads directly to private law.  On the 6th
of this month, the Harvard Kennedy School held a conference on this
very subject; let me read just the first paragraph:[HKS]

   In today's world, businesses are facing increasingly complex
   threats to infrastructure, finances, and information.  The
   government is sometimes unable to share classified information
   about these threats.  As a result, business leaders are creating
   their own intelligence capabilities within their companies.

In a closely related development, the international export-control
accord known as the Wassenaar Arrangement was just amended to classify
"Intrusion Software" and "Network Surveillance Systems" as weapons.[WA]

So whom do you trust?  Paul Wouters makes a telling point when he
says that "You cannot avoid trust.  Making it hierarchical gives
the least trust to parties.  You monitor those you have to trust
more, and more closely."[PW]  As I've done with privacy and security,
I should now state my definition of trust, which is that trust is
where I drop my guard, which is to say that I only trust someone
against whom I have effective recourse.  Does that mean I can only
trust those upon whom I can collect?  At the nation state level
that is largely the case.  Is this the way Brin's vision will work
itself out, that as the technology of collection democratizes, we
will trust those we can collect against but within the context of
whatever hierarchy is evolutionarily selected by such a dynamic?

It is said that the price of anything is the foregone alternative.
The price of dependence is risk.  The price of total dependence is
total risk.  Standing in his shuttered factory, made redundant by
coolie labor in China, Tom McGregor said that "American consumers
want to buy things at a price that is cheaper than they would be
willing to be paid to make them."  A century and a half before Tom,
English polymath John Ruskin said that "There is nothing in the
world that some man cannot make a little worse and sell a little
cheaper, and he who considers price only is that man's lawful prey."
Invoking Zittrain yet again, the user of free services is not the
customer, he's the product.  Let me then say that if you are going
to be a data collector, if you are bound and determined to instrument
your life and those about you, if you are going to "sell" data to
get data, then I ask that you not work so cheaply that you collectively
drive to zero the habitat, the lebensraum, of those of us who opt
out.  If you remain cheap, then I daresay that opting out will soon
require bravery and not just the quiet tolerance to do without
digital bread and circuses.

To close with Thomas Jefferson:

   I predict future happiness for Americans, if they can prevent
   the government from wasting the labors of the people under the
   pretense of taking care of them.


There is never enough time.  Thank you for yours.

-------------

[NAS] "Professionalizing the Nation's Cyber Workforce?"
 www.nap.edu/openbook.php?record_id=18446

[PB] _Against the Gods_ and this 13:22 video at
 www.mckinsey.com/insights/risk_management/peter_l_bernstein_on_risk

[PHI] Protected Health Information, abbreviated PHI

[SMC] "Penalties for failure to report and false reporting of child
abuse and neglect," US Dept of Health and Human Services, Children's
Bureau, Child Welfare Information Gateway

[CFAA] U.S. Code, Title 18, Part I, Chapter 47, Section 1030
 www.law.cornell.edu/uscode/text/18/1030

[USC] U.S. Code, Title 18, Part I, Chapter 1, Section 4
 www.law.cornell.edu/uscode/text/18/4

[VDB] Verizon Data Breach Investigations Report
 www.verizonenterprise.com/DBIR

[ICS] Index of Cyber Security
 www.cybersecurityindex.org

[DA] "What is the next step?," Dave Aitel, 18 February 2014
 seclists.org/dailydave/2014/q1/28

[S] Sensity's NetSense product, to take one (only) example
 www.sensity.com/our-platform/our-platform-netsense

[M] For example, the 2007 collapse of the I-35W bridge in Minneapolis.

[J] "Quis custodiet ipsos custodes?," Juvenal, Satire VI, ll. 347-348

[DB1] _The Transparent Society_, David Brin, Perseus, 1998

[DB2] "The Myth of the 'Transparent Society'," Bruce Schneier
 www.wired.com/politics/security/commentary/securitymatters/2008/03/securitymatters_0306

[DB3] "Rebuttal," David Brin
 www.wired.com/politics/security/news/2008/03/brin_rebuttal

[W] minor quotation from
 en.wikipedia.org/wiki/The_Transparent_Society

[TF] _Fooled by Randomness_, Nassim Taleb, Random House, 2001

[TE] "Coming to an office near you," The Economist, 18 January 2014,
 cover/lead article, print edition

[ZS] "Be the friction -- Our Response to the New Lords of the Ring," 6 Jun 2013
 www.faz.net/aktuell/feuilleton/the-surveillance-paradigm-be-the-friction-our-response-to-the-new-lords-of-the-ring-12241996.html

[NS] National Strategy for Trusted Identities in Cyberspace, 2011
 www.nist.gov/nstic

[NP] 2010 Legislative Conf. for the National Association of Counties

[BI] "Small Is Beautiful, Big Is Inevitable," IEEE S&P, Nov/Dec 2011
 geer.tinho.net/ieee/ieee.sp.geer.1111.pdf

[SB] LANGSEC: Language-theoretic Security
 www.cs.dartmouth.edu/~sergey/langsec/

[HT] Trends, HTTP Archive
 www.httparchive.org/trends.php

[BJ] Bitcoin Miner for Websites
 www.bitcoinplus.com/miner/embeddable

[RC] _Breakpoint_, Richard Clarke, Putnam's, 2007

[LA] "All power tends to corrupt and absolute power corrupts
absolutely.  Great men are almost always bad men, even when they
exercise influence and not authority: still more when you superadd
the tendency or the certainty of corruption by authority."
-- Lord John Dalberg Acton to Bishop Mandell Creighton, 1887

[JB] "NSA: Not (So) Secret Anymore," 10 December 2013
 joelbrenner.com/blog

[HKS] Defense and Intelligence: Future of Intelligence Seminars
 belfercenter.ksg.harvard.edu/events/6230/intelligence_in_the_private_sector

[WA] "International Agreement Reached Controlling Export of Mass
and Intrusive Surveillance," 9 December 2013
 oti.newamerica.net/blogposts/2013/international_agreement_reached_controlling_export_of_mass_and_intrusive_surveillance

[PW] "You Can't P2P the DNS and Have It, Too," Paul Wouters, 9 Apr 2012
 nohats.ca/wordpress/blog/2012/04/09/you-cant-p2p-the-dns-and-have-it-too


=====
this and other material on file under geer.tinho.net/pubs