On March 26, I spoke to the Augmented Reality Community meeting held in conjunction with the Open Geospatial Consortium (OGC) Technical and Planning Committee meeting held in Arlington, Virginia. Quite understandably, the subject of trust and Augmented Reality (A/R) quickly turned to privacy.
I had a hard time coming up with what to say on this subject. First, because Augmented Reality can and will involve many diverse technologies and applications. Second, because concerns over A/R technologies often drown out discussions regarding the value of many A/R applications.
So I began my talk at the beginning - by breaking down the elements of A/R. Augmented Reality is defined on Wikipedia as "a live, direct or indirect view of a physical, real-world environment whose elements are augmented (or supplemented) by computer-generated sensory input such as sound, video, graphics or GPS data."
Breaking it down, what are the privacy concerns in "reality"? I would suggest there are two primary issues for the A/R community to consider in this area:
1. Historically there have been different expectations of privacy when a person is in public than when they are in a private place. However, those expectations are beginning to change; more and more courts, regulators, policymakers, academics, privacy advocates and even technologists are redefining what expectations of privacy in public should be reasonable given new technological capabilities.
2. What are the privacy expectations with respect to an object? There have been growing expectations of privacy with certain objects, such as mobile phones. How will this translate to other objects, such as automobiles, or the outside of homes?
Next, what are the privacy concerns associated with augmentation - "the elements are augmented (or supplemented) by computer-generated sensory input such as sound, video, graphics or GPS data"? I would suggest there are two primary issues for the A/R community to consider with respect to augmentation.
1. Are you augmenting with public input (data) or private input (data)? Obviously there is a greater privacy concern associated with private data. However, there are increasing concerns with public data as well. Consider, for example, the New York newspaper that posted a controversial interactive map of publicly available names and addresses of registered gun owners.
2. What is the definition of "public" data? Social media is pushing the limits of what is considered public and what is private. However, do people appreciate how available their posted information will become and how it might be used? Are some types of social media more "public" than others?
Based upon this analysis, I came up with three questions that I believe the A/R community should consider when building applications and use cases. These questions can help define the framework in which to determine the potential impact of A/R in a market/jurisdiction. Also, we should expect that the answers will change over time, and in some cases will be "individual"-specific, such as when a minor is involved.
1. If/when does the display of augmented public data of someone who is in public violate that individual's privacy?
2. If/when does the use or sharing of augmented public data of someone who is in public violate that individual's privacy?
3. Is there ever a time when the display and/or use of augmented private data of someone who is in private is worth the potential/perceived violation of that individual's privacy? If the answer is yes, when is it appropriate, and by whom?