{"id":267,"date":"2013-06-19T16:08:28","date_gmt":"2013-06-19T21:08:28","guid":{"rendered":"http:\/\/www.bitquill.net\/blog\/?p=267"},"modified":"2016-05-12T14:40:58","modified_gmt":"2016-05-12T19:40:58","slug":"expectations-of-privacy","status":"publish","type":"post","link":"https:\/\/bitquill.net\/blog\/expectations-of-privacy\/","title":{"rendered":"Expectations of privacy"},"content":{"rendered":"<p>I have stopped worrying <em>what<\/em> can be inferred about me, because I&#8217;ve accepted the simple fact that, <strong>given enough time (data) and resources, <em>anything<\/em> can be inferred.<\/strong> Consider, as an example, &#8220;location privacy.&#8221; A number of approaches rely on adaptively coarsening the detail of reported location (using all sorts of criteria to decide the level of detail, from mobility patterns to spatial query workload characteristics). For example, instead of revealing my exact location, I can reveal my location at a city-block level. In an area like NYC, this would conflate me with hundreds of other people who happen to be on the same block, but a block-level location is still accurate enough to be useful (e.g., for finding nearby shops and restaurants). This might work if I&#8217;m reporting my location just once. However, if I travel from home to work, then my trajectory over a few days, even at a city-block granularity, is likely sufficient to distinguish me from other people. I could perhaps counter this by revealing my location at a city level or state level. Then a few days&#8217; worth of data might not be enough to identify me. However, I often travel, and data over a period of, say, a year would likely be enough to identify me even if location detail is quite coarse. Of course, I could take things to the extreme and just reveal that &#8220;I am on planet Earth&#8221;. 
But that&#8217;s the same as not publishing my location, since this fact is true for everyone.<\/p>\n<p><!--more--><\/p>\n<p>If it&#8217;s technically possible to infer my identity (given a long enough period of observation, and enough resources and time to piece the various, possibly inaccurate, pieces of information together), someone (with enough patience and resources) will likely do it. Therefore, as the amount of data about me tends to infinity (which, on the Internet, it probably does), the fraction that I have to hide in order to maintain my privacy tends to one: <strong>you have long-term privacy only if you never reveal anything<\/strong>. There are various ways of not revealing anything. One is to <a title=\"Google's Schmidt Roasted for Privacy Comments - PCWorld\" href=\"http:\/\/www.pcworld.com\/article\/184446\/googles_schmidt_roasted_for_privacy_comments.html\" target=\"_blank\">simply not do it<\/a>. Another might be to keep it to yourself and never put it in any digital media. Yet another might be encrypting the information.<\/p>\n<p>However, not revealing anything isn&#8217;t really a solution (if a tree falls in the forest and nobody hears it&#8230; the tree has privacy, I guess). <strong>There is an alternative, of course: precise access control.<\/strong> Your privacy can be safeguarded by a <strong>centralized, trusted gatekeeper that controls all access to data<\/strong>. This leads to <strong>something of a paradox<\/strong>: guaranteeing privacy (access control) implies zero privacy from the trusted gatekeeper: they (have to) know and control everything. Many people are still confused about this. 
For example, a form of this dichotomy can be seen in people&#8217;s reactions towards Facebook: on one hand, people complain about giving Facebook complete control and ownership of their data, but they also complain when Facebook essentially gives up that control by making something &#8220;public&#8221; in one way or another. [Note: there is the valid issue of Facebook changing its promises here, but that&#8217;s not my point: people post certain information on Facebook and not on, say, Twitter or the &#8220;open web&#8221; precisely because they believe that Facebook guarantees them access control, which, by the way, is a very tall order, leading to confusion on all sides, as I hope to convince you.]<\/p>\n<p>Although I learned not to worry about <em>what<\/em> can be inferred about me, I am perhaps somewhat worried about knowing <em>who<\/em> is accessing my data (and making inferences), and <em>how<\/em> they are using it, particularly if this is done by parties that have far more resources and determination than I do. However, who uses my information and how is also another piece of information (data) itself. Although everything is information, there seems to be an asymmetry: when my information is revealed and used, it may be called &#8220;intelligence&#8221;, but when the information that it was used is revealed, it may be called &#8220;whistleblowing&#8221; or even &#8220;<a title=\"Edward Snowden hailed as hero, accused of treason - The Guardian\" href=\"http:\/\/www.guardian.co.uk\/world\/blog\/2013\/jun\/10\/edward-snowden-revealed-as-nsa-whistleblower-reaction-live\" target=\"_blank\">treason<\/a>&#8221;. 
This asymmetry does not seem to have any technical grounding: one might make valid arguments on political, legal, or moral grounds, but not on technical ones. Seen in this context, <a title=\"Mark Zuckerberg - &quot;I want to respond personally...&quot;\" href=\"https:\/\/www.facebook.com\/zuck\/posts\/10100828955847631\" target=\"_blank\">Zuckerberg&#8217;s calls for &#8220;more transparency&#8221;<\/a> make perfect sense: he&#8217;s calling for less asymmetry.<\/p>\n<p>More generally, <strong>privacy does not really seem to be a technical problem<\/strong>, much like <a title=\"Digital rights management - Wikipedia\" href=\"http:\/\/en.wikipedia.org\/wiki\/Digital_rights_management\">DRM<\/a> isn&#8217;t really a technical problem. That privacy can be guaranteed by technical means seems to be a delusion and, perhaps, a dangerous one, because it gives a false sense of security. Privacy is, for the most part, a social, political, and legal problem about <em>how<\/em> data can be used (any and all data!) and by <em>whom<\/em>. The apparent technical infeasibility of privacy has led me to believe that people will, eventually, <a title=\"Sun on Privacy - Wired (1999)\" href=\"http:\/\/www.wired.com\/politics\/law\/news\/1999\/01\/17538\" target=\"_blank\">get over the idea<\/a>. After all, privacy is a 200-300 year-old concept (at least in the Western world; interestingly, Greek did not have a corresponding word until very recently). I may have missed something obvious, however: if privacy is attainable via a centralized, trusted gatekeeper, then <strong>perhaps privacy is the &#8220;killer app&#8221; for centralization and &#8220;walled gardens&#8221;<\/strong>. 
<strong>&#8220;I want full control over your data&#8221; is tougher to sell than &#8220;I want to protect your privacy&#8221;<\/strong>, which is why <a title=\"Google's Eric Schmidt Explains Why The Internet Needs A 'Delete Button' - Business Insider\" href=\"http:\/\/www.businessinsider.com\/schmidt-internet-needs-a-delete-button-2013-5\" target=\"_blank\">Eric Schmidt&#8217;s recent backpedaling<\/a> is somewhat worrying, even if the goal is noble (and there currently isn&#8217;t any evidence to believe otherwise).<\/p>\n<p>I don&#8217;t think there are any (technical) solutions to privacy. Also, enforcing transparency is perhaps almost as hard as enforcing privacy, although I have slightly more hope for the former, but that&#8217;s a separate discussion. <strong>Privacy is a cat-and-mouse game, much like &#8220;piracy&#8221; and <a title=\"Digital Rights Management - Shortcomings - Wikipedia\" href=\"http:\/\/en.wikipedia.org\/wiki\/Digital_rights_management#Shortcomings\">DRM<\/a>.<\/strong> However, our expectations should be tempered by the reality of near-zero-cost transmission, collection, and storage of &#8220;infinitely&#8221; growing amounts of information, and we should perhaps re-examine existing notions of privacy in this light. I find that many non-technical people are still surprised when I explain the simple example in the opening paragraph, even though they consider it obvious in retrospect.<\/p>\n<p>Personally, I find it safer to just assume that I have no privacy. Saves me the aggravation.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>I have stopped worrying what can be inferred about me, because I&#8217;ve accepted the simple fact that, given enough time (data) and resources, anything can be inferred. 
Consider, as an example, &#8220;location privacy.&#8221; A number of approaches rely on adaptively coarsening the detail of reported location (using all sorts of criteria to decide detail, from [&hellip;]<\/p>\n","protected":false},"author":2,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_monsterinsights_skip_tracking":false,"_monsterinsights_sitenote_active":false,"_monsterinsights_sitenote_note":"","_monsterinsights_sitenote_category":0,"jetpack_post_was_ever_published":false,"_jetpack_newsletter_access":"","_jetpack_dont_email_post_to_subs":false,"_jetpack_newsletter_tier_id":0,"_jetpack_memberships_contains_paywalled_content":false,"_jetpack_memberships_contains_paid_content":false,"footnotes":"","jetpack_publicize_message":"","jetpack_publicize_feature_enabled":true,"jetpack_social_post_already_shared":false,"jetpack_social_options":{"image_generator_settings":{"template":"highway","default_image_id":0,"font":"","enabled":false},"version":2}},"categories":[66,45],"tags":[34,58,16],"class_list":["post-267","post","type-post","status-publish","format-standard","hentry","category-other","category-scitech","tag-government","tag-opinion","tag-privacy"],"jetpack_publicize_connections":[],"jetpack_featured_media_url":"","jetpack_shortlink":"https:\/\/wp.me\/p7x9xm-4j","jetpack-related-posts":[],"jetpack_sharing_enabled":true,"_links":{"self":[{"href":"https:\/\/bitquill.net\/blog\/wp-json\/wp\/v2\/posts\/267","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/bitquill.net\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/bitquill.net\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/bitquill.net\/blog\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/bitquill.net\/blog\/wp-json\/wp\/v2\/comments?post=267"}],"version-history":[{"count":27,"href":"https:\/\/bitquill.net\/blog\/wp-json\/wp\/v2\/
posts\/267\/revisions"}],"predecessor-version":[{"id":705,"href":"https:\/\/bitquill.net\/blog\/wp-json\/wp\/v2\/posts\/267\/revisions\/705"}],"wp:attachment":[{"href":"https:\/\/bitquill.net\/blog\/wp-json\/wp\/v2\/media?parent=267"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/bitquill.net\/blog\/wp-json\/wp\/v2\/categories?post=267"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/bitquill.net\/blog\/wp-json\/wp\/v2\/tags?post=267"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}