Talk:ListenLog: Difference between revisions
Revision as of 11:33, 2 February 2009
Users Might Object
A potential concern with ListenLog is that end-users will perceive it as yet another mechanism for aggregating attention data that will (or at least could) be used by an unwarranted third party, or used in an unwarranted fashion. This concern has four aspects, depending on how the LL concept is presented and implemented:
- Knee-jerk negative reaction to the concept of personal/attention data even being monitored in the first place. This reaction is prior to any consideration given to why and for whom the data might be used.
- Assumption that whenever personal data is captured, it is captured ultimately for a vendor's benefit, even if we are told otherwise. The default assumption is that the data is stored and used by the vendor who controls the current user experience, e.g. whoever distributes the software application or hosts the website. Users might assume the data is used by an unwarranted vendor, or that even if explicit permission is given, the data might be used in an unwarranted fashion (e.g. when you provide your phone number to a vendor who later uses it to solicit you).
- Concern that even if the data is owned or controlled by us, it is captured and stored and might eventually be compromised, e.g. by a malicious third party, an oppressive government, or a legal body. Since we intend to store the data remotely, this aggravates the concern.
- Why would I choose to do this? Since LL will likely offer the ability to opt in or opt out, why would someone choose to collect their own data? What's in it for them? There will be built-in resistance without a compelling use case.
What might we do to address these concerns? Here are some suggestions I propose:
- Clear, careful presentation of the concept. This is unlike anything people will be familiar with and they will likely resist, misunderstand, or compare it to historic privacy violations. A primary emphasis on new, previously unavailable user functionality might be a good approach.
- Store the data on neutral ground, somewhere other than the application or vendor servers (I recommend Berkman servers)
- Send data to servers over secure sockets
- Automatically capture and store the data in fully encrypted form ("locked"). As an alternative to opt-in, users could unlock the data from their device (which would decrypt it wholesale) in order to get access to functionality or to share. Users could lock, unlock, and delete at will.
- Perhaps lock and unlock different subsets of data, e.g. location
- Don't build any ability to share data into the first version
- Build and focus on a single piece of previously unavailable end-user functionality
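To make the locked/unlocked proposal above concrete, here is a minimal sketch of what the client-side model could look like. Everything here is hypothetical (the `ListenLog` class, entry fields, and method names are all invented for illustration), and base64 merely stands in for real encryption; a production version would use an authenticated cipher such as AES-GCM keyed from the user's device.

```python
import base64
import json

class ListenLog:
    """Sketch of the proposed lock/unlock model.

    Entries are captured automatically and held only in "locked" form.
    NOTE: base64 is a placeholder, not encryption; a real implementation
    would encrypt with a key that never leaves the user's device.
    """

    def __init__(self):
        self._locked_entries = []   # data is always stored locked
        self.unlocked = False       # the user must explicitly unlock

    def capture(self, entry: dict):
        # Automatic capture: lock immediately, never persist plaintext.
        blob = base64.b64encode(json.dumps(entry).encode())
        self._locked_entries.append(blob)

    def unlock(self):
        # User-initiated, from their device; enables data-reading features.
        self.unlocked = True

    def lock(self):
        self.unlocked = False

    def read(self):
        if not self.unlocked:
            raise PermissionError("log is locked; unlock it on your device first")
        return [json.loads(base64.b64decode(b)) for b in self._locked_entries]

    def delete_all(self):
        # Users can delete at will, whether locked or unlocked.
        self._locked_entries.clear()
```

The per-subset locking suggested above (e.g. location) could follow the same pattern, with a separate locked store and flag per data category.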
Khopper 15:31, 2 February 2009 (UTC)