Expectations of privacy

I have stopped worrying about what can be inferred about me, because I’ve accepted the simple fact that, given enough time (data) and resources, anything can be inferred. Consider, as an example, “location privacy.” A number of approaches rely on adaptively coarsening the detail of reported locations, using all sorts of criteria to decide how much detail to keep, from mobility patterns to the characteristics of the spatial query workload. For example, instead of revealing my exact location, I can reveal my location at a city-block level. In an area like NYC, this would conflate me with hundreds of other people who happen to be on the same block, but a block-level location is still accurate enough to be useful (e.g., for finding nearby shops and restaurants).

This might work if I’m reporting my location just once. However, if I travel from home to work, then my trajectory over a few days, even at city-block granularity, is likely sufficient to distinguish me from other people. I could perhaps counter this by revealing my location at the city or state level; then a few days’ worth of data might not be enough to identify me. However, I travel often, and data over a period of, say, a year would likely be enough to identify me even if the location detail is quite coarse. Of course, I could take things to the extreme and reveal only that “I am on planet Earth.” But that’s the same as not publishing my location at all, since this fact is true for everyone.
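To make the coarsening idea concrete, here is a minimal sketch. It assumes a simple fixed-grid scheme (not any particular adaptive method from the literature): a latitude/longitude pair is snapped to the corner of a grid cell, and the cell size is the granularity knob. The cell sizes and coordinates below are illustrative assumptions, not values from any real system.

```python
import math

def coarsen_location(lat, lon, cell_size_deg):
    """Snap a latitude/longitude to the south-west corner of a grid cell.

    cell_size_deg controls the granularity: roughly 0.001 degrees is on the
    order of a city block, 0.01 is a kilometer-scale neighborhood, and 1.0
    or more covers a city or larger region (illustrative values only).
    """
    coarse_lat = math.floor(lat / cell_size_deg) * cell_size_deg
    coarse_lon = math.floor(lon / cell_size_deg) * cell_size_deg
    return round(coarse_lat, 6), round(coarse_lon, 6)

# A single coarse report hides the exact position behind a block-sized cell...
print(coarsen_location(40.7484, -73.9857, 0.001))

# ...but a repeated home-to-work trajectory, even when coarsened the same way,
# becomes a distinctive sequence of cells over a few days.
trajectory = [(40.7484, -73.9857), (40.7527, -73.9772), (40.7580, -73.9855)]
print([coarsen_location(lat, lon, 0.001) for lat, lon in trajectory])
```

The point the sketch illustrates is that the privacy gained from a coarser cell applies to each individual report; it does not prevent a sequence of reports from forming a pattern that is unique to one person.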

