Choice, Coercion, and Predation

This week’s readings, along with my viewing of the short film “A Model Employee,” have shed further light on the predatory nature of online data collection. Even when companies offer people a choice about data collection, they still attempt to manipulate that choice in their own favor. Alongside this stark view of reality, however, I see potential for improvement.

Do We Have a Choice?

“(Un)informed Consent: Studying GDPR Consent Notices in the Field” makes it clear that, in some cases, users do have a choice about what data is collected about them online. For instance, the General Data Protection Regulation in Europe aims to make cookies “opt-in,” giving users the choice of whether to allow cookies to collect their data. However, the article also shows that not every website displayed to EU users properly complies with this law. And even on websites that do comply, people often don’t realize they genuinely have a choice. Many seem to suspect that refusing cookies will lock them out of a website. Furthermore, many of these websites do not make privacy the default; instead, they require people to opt in to data protection. Users who are unaware of this are likely to dismiss privacy notices and continue having their data harvested, all while believing that by ignoring the notice they are actually protecting their data.

Coercion and the Manipulation of Choice

“Dark patterns” in web services manipulate the choices that users make. “Nudging” in a cookie notification encourages people to say “yes” to cookies when they otherwise might not have. Sometimes users are not given a choice at all, with only an “agree” option presented. Sure, they agreed to the use of cookies, but only because they had no other option. These workarounds seem to allow websites to comply with the GDPR on paper while still offering no additional privacy to users.

“A Model Employee” struck me as a great example of coercion, too. Neeta is not given clear information about how her data will be used, yet she is encouraged to wear the tracking device at all times. What’s the encouragement? Money, and keeping her job. Her boss tells her the device will help her improve, an incentive offered right after Neeta fears she is about to be fired. The next incentive? A bonus. Her boss promises that if she wears the device at all times, she will receive a bonus, time off, and other tempting rewards. So she agrees to wear it, but her choice seems heavily coerced.

Choice and Obfuscation

Stephanie Vie’s article about Candy Crush, together with the GDPR article, reminded me of our previous discussions about privacy and consent forms, as well as the idea of surveillance capitalism. The makers of Candy Crush and games like it collect user data and profit from it. Vie tells us that with games like Candy Crush, “your game play data is sold back to you,” sometimes within the current game and sometimes in newly created “freemium” games. Because data mining in social games is so seamless, users are often unaware of the terms they are “agreeing” to by playing. They don’t know they have a choice, if they even have one in the first place.

Vie mentions “dark patterns” as well, explaining that the terms and conditions of online games are often written to boost the company’s short-term bottom line. They do not favor the user. Sometimes these privacy policies are not accessible at all; when they are, they are often overly long and difficult for their audience to understand. This is a recurring problem with consent forms!

Silver Lining

Despite documenting the unfortunate state of things today, these readings also gave me some hope. They showed me that research is being done on privacy policies and consent forms. For instance, “(Un)informed Consent…” does great work showing what a privacy opt-in that actually functions for the user should look like. Getting companies to implement such a thing is another matter entirely, but the GDPR feels like a step in the right direction. If the next iteration of the law incorporates research like this, perhaps it could actually work.

Vie’s article highlights possibilities, too. She describes how Tumblr posted proposed changes to its terms publicly, allowing users to make suggestions and edits. The result was a privacy policy written in plain language and with humor, and people actually paid attention to the changes presented this way. If we focus more on user-centered design in these documents, perhaps things can change for the better. Maybe privacy policies can give people meaningful choices about their internet experiences. Laws like the GDPR need to become more widespread and to prevent the kind of coercion that undermines genuine opt-in consent. If that happens, I think we’d be taking a small but meaningful step toward a better internet.
