"More than machinery, we need humanity."
There is a certain tradeoff inherent in using the technological tools of a modern society. Though people may chafe over the more onerous aspects of what they are expected to give up (and for many, just reading a user agreement counts as “onerous”), many more seem relatively unaware of, and not particularly bothered by, the required trades.
Privacy, in particular, is something that people have an odd relationship with giving up (or allowing some modification of) in order to enjoy the benefits of certain devices and applications. It is hard to put a price on privacy, and moreover we frequently do not realize how much of it we’ve given up.
Applications that run on mobile devices (smartphones, tablets, etc.) have a tricky relationship with privacy, and by “tricky relationship with privacy” I mean that they trick users into thinking that the applications have much interest in privacy. Many Apps connect with other Apps, make use of a person’s contacts and location information, and otherwise go into areas that a person may not realize the App is going to go (whether or not they read the user agreement). In still other cases, the App (and App maker) retains much of this harvested data even after a person has stopped using or deleted the App (more on big data [a lot more]).
While consumer and privacy advocacy groups can push for reform (and they do), a point comes at which we must ask ourselves: do we trust the companies violating our privacy when they say, “oh, okay, we’ll stop doing that”? Or is more required? More, as in legal recourse? Enter US Representative Hank Johnson (D – GA [the fourth district]), who has introduced the (well titled) Application Privacy, Protection and Security (APPS) Act (H.R. 1913). The act:
“would require app developers maintain privacy policies, obtain consent from consumers before collecting data, and securely maintain the data they collect.”
In speaking about the Act, Representative Johnson said of the citizens who helped build the legislation:
“citizens also wanted simple controls over privacy on devices, security to prevent data breaches, and notice and information about data collection on the device. The Apps Act answers the call.”
The Act seems to have the support of a variety of consumer and privacy advocates (and ThinkProgress gave the legislation a supportive nod), and a read-through of Representative Johnson’s statements and a look at the bill provide reasons for some measured optimism. Legislation to better protect citizens’ privacy rights is an important step when telecoms routinely show that they do not understand privacy in the way that citizens think they should. Indeed, the APPS Act is exactly the type of useful and sensible legislation that stands (pretty much) zero chance of being passed by the House of Representatives.
But without delving into the inability of the US Congress to do much of anything, there are still reasons to remain chary when thinking about the APPS Act (which, admittedly, is still in a “draft” stage).
The “Discussion Draft” (which is available at the bottom of this link) provides several interesting points. In “Notice and Consent” it notes that App makers would be required to inform users of the type of data being collected, why it is being collected, and whether or not this information is being shared with any third parties. It would also mandate that App makers inform users of the company’s data retention policy (with instructions on how to delete the data). The APPS Act (this is in the “Security” section) would also put the onus on developers to set up solid security measures to protect users’ information. Those are good aspects; however, others remain that seem a bit more worrisome.
Consider that in the “Opting Out of Data Collection and Deleting Data” section it notes that companies will be required to provide users with a way to quit an App and pull their data out with them:
“At the consumer’s election, the developer would either delete any personal data collected to the extent practicable, or cease collecting data altogether.”
These “Opt out” points need to be read in the context of the “Promoting Responsible Self-Regulatory Practices” section. While that section focuses on the difference between “personal and de-identified data,” it nevertheless begins by noting that part of what the bill does is encourage companies to be responsible without actually using “federal regulation.” Again: “we trust you.”
So, the problems: it seems that an easy workaround to “informing” users is simply to do what is done now: put it all in a lengthy user agreement that most people won’t read and then say, “Well, we put it there; they just didn’t read it.” As for the “Opt out,” I find the words “to the extent practicable” to be an easy way out for companies, as they can take refuge in the claim that it would not be “practicable” for them to delete the information.
After all, once the bits of information are gathered (Big data!), companies can argue that it would simply be too difficult to pull out the particulars, and they could even set up their data harvesting in such a way as to make it less “practicable” to remove the information later. The contrast between “personal and de-identified data” may also be a bit of a distraction, as a group armed with enough anonymous information (Big data!) can de-anonymize it. As for “Self Regulation,” I had been under the impression that much of the reason the APPS Act is needed is that these companies have demonstrated that they cannot be trusted to self-regulate.
Lastly, and this is not a strike against the APPS Act but a comment on the US Congress: this bill, at its most basic, is a form of regulation. In a rational discussion, citizens and companies can understand the need for regulations (especially in cases where citizens’ privacy is being systematically violated), but it requires no imagination to picture this bill being scuttled on the floor of the House after being labeled “business killing regulation (blah blah blah)!”
Alas, something truly needs to be done to better protect the privacy rights of citizens in our increasingly wired world. While the APPS Act has many promising aspects, it seems (again, it’s still in draft stage) like rather weak tea when the opposition is (literally) pounding energy drinks. It is too easy to imagine this bill (if it gets a chance to advance at all [which is pretty unlikely]) being so watered down and full of loopholes as to make it nothing more than a new mouse hooked up to an ancient computer.
Or worse…legislation like the APPS Act could easily become exactly the type of smiling cover needed to sneak in the provisions of many less popular (with good reason) pieces of electronic legislation. The House of Representatives that the APPS Act must clear is the same House that recently passed CISPA (again) and where bills like SOPA have found comfortable homes. Many members of the House would still love to see those bills passed (more on that here). The APPS Act, with its reasonable and desired regulations, is the perfect place in which far more regressive regulations can be hidden (more on that here).
Furthermore, what this legislation does not (and probably could not) truly challenge is the way that most people use their app-filled devices. Regardless of whether or not the APPS Act becomes law, it remains likely that most people will simply continue using their Apps, thinking that the privacy tradeoff is worthwhile…or simply unaware of the tradeoff they have made so that they can share their Angry Birds scores with their friends.
So, a measured and wary round of applause for the APPS Act: it could turn into something great…or it could mutate into something quite foul. It would be a shame if, in several months’ time, the Internet erupts in a flurry of panic about the terrible provisions hidden in the APPS Act.
Granted, at least we could call that scenario: APPSendicitis.
[Note – the unofficial soundtrack for this post is the song “Appendix Gone” by Gas Huffer]