Managing privacy settings is a key aspect of cybersecurity and arguably the first line of defense against cyberthreats. Many online services offer privacy controls in some form, but users don’t bother with them, don’t understand them, or don’t find them sufficient, leaving few satisfied with the status quo. Why is it so hard to get privacy right? In this post, we look at industry-wide practices that may be hampering consumers’ ability to manage their privacy the way they want.

The psychology of deceptive design

It’s no secret that everyone, especially tech companies, wants your data. While rumors like Facebook eavesdropping through your microphone are widely believed, the true “conspiracy” is far subtler. The user interfaces of modern apps are addictive by design and engineered to make it difficult, if not outright impossible, for users to behave in ways that run counter to a company’s wishes. For example, it’s pretty common for apps to hide the “delete account” function behind a nest of obscure menus, which is why we linked to the option directly in our post about deleting Facebook. In a more recent example, MoviePass got into trouble in August for misleadingly driving users to “accept” new terms and conditions that coincidentally reactivated canceled users’ accounts.

Back in 2010, Harry Brignull, a user experience (UX) consultant, noted the growth of these types of design choices and called them “dark patterns.” They’re subtle (and sometimes not-so-subtle) ways of encouraging customers to use a service in the way the developer prefers. Dark patterns are nothing new; since the advent of advertising, companies have been working with psychologists to motivate consumers in ways that benefit the bottom line. Offline, these are what behavioral psychologists call nudges, and they’re everywhere. The modern supermarket alone provides a plethora of examples, with everything from the placement of cereal boxes to the location of the entrance and the size of shopping carts craftily pushing families to spend more than they otherwise would. It’s worth noting that patterns and nudges, online and offline alike, don’t have to be dark: good, ethical design can motivate users and customers to improve their lives. Still, that doesn’t make deceptive design any less ubiquitous.

How dark patterns define users’ experience

Dark patterns are no less relevant than they were back in 2010 when Brignull started documenting them. Behind some of this year’s biggest tech scandals are examples of dark patterns gone wickedly right. Below, we illustrate how dark patterns were put to use this year:

Dumb defaults

Psychologists have long known that people are biased toward accepting defaults. For example, countries where citizens must opt out of organ donation after death have higher donation rates than countries where citizens must opt in. In tech services, defaults tend to lead to less altruistic outcomes: in many apps, privacy defaults skew toward sharing the largest amount of data possible, often while obscuring how to change the settings. Venmo provided an example of this when it was discovered that the app made all transactions visible by default, and users had to know beforehand to opt out. Those who tried to opt out of the broad, public default settings were greeted with a pop-up dissuading them from doing so, which is yet another psychological trick.
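To make the pattern concrete, here’s a minimal sketch (in Python, using hypothetical field names rather than any real app’s code) of how a developer’s choice of defaults quietly decides the outcome for the majority of users, who never open the settings screen at all:

```python
from dataclasses import dataclass

@dataclass
class DataMaximizingSettings:
    # Whatever the developer writes after "=" is what most users live with,
    # because very few people ever visit the settings screen.
    transactions_public: bool = True
    share_with_partners: bool = True
    location_tracking: bool = True

@dataclass
class PrivateByDefaultSettings:
    # The same product, inverted: sharing requires an explicit opt-in.
    transactions_public: bool = False
    share_with_partners: bool = False
    location_tracking: bool = False

# Both versions technically offer "full control," but default bias means
# the first shares nearly everyone's data and the second almost no one's.
print(DataMaximizingSettings().transactions_public)    # True unless changed
print(PrivateByDefaultSettings().transactions_public)  # False unless changed
```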

Piecemeal privacy

With privacy issues making headlines recently, more users are taking responsibility for their own privacy. However well-intentioned, these users can run into roadblocks created by the very privacy controls they’ve been given. Earlier this year, the fitness-tracking service Strava got into hot water when it was revealed that the company had created a heat map detailing the activity of all its users. The service, which views itself as a social network, shares all data by default. Additionally, its privacy controls are so complex that a 1,000-word blog post was required to describe them. That post seems to have been written after a tech journalist described her frustrations with Strava last year: a whole lexicon of ambiguous terms like “enhanced privacy” and “privacy zones” effectively obscured the full number of steps she needed to take to restrict sharing to just her friends.
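As an illustration of the piecemeal problem, consider this hypothetical sketch (the toggle names are invented for the example, not Strava’s actual settings): the outcome a user wants, sharing only with friends, depends on several scattered and ambiguously named controls all being set correctly at once.

```python
# Hypothetical piecemeal privacy controls: each toggle lives in a different
# menu, and missing any one of them leaks data.
settings = {
    "enhanced_privacy": False,       # vague name: enhanced how, exactly?
    "activity_visibility": "everyone",
    "privacy_zones": [],             # empty list = start/end locations exposed
    "partner_data_sharing": True,    # buried in an unrelated menu category
}

def shares_only_with_friends(s: dict) -> bool:
    """True only if every scattered toggle happens to be set the 'right' way."""
    return (
        s["enhanced_privacy"]
        and s["activity_visibility"] == "friends"
        and len(s["privacy_zones"]) > 0
        and not s["partner_data_sharing"]
    )

print(shares_only_with_friends(settings))  # False: one missed step undoes the rest
```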

The illusion of control

Similar to Strava, Google got into trouble in August when an Associated Press investigation found that users who chose to “pause” their Location History still had their geodata collected and used by Google applications. The company defended itself by stating that users have a variety of privacy controls available and that it hasn’t hidden any of them. The pause feature, it turns out, was designed only to stop tracked locations from appearing in the user’s own Google account feed, an option that is obviously less comprehensive and arguably pointless. While Google didn’t technically hide any controls, the prominence of the Location History feature, as well as the language surrounding it, is at best misleading, with the instructions previously stating, “With Location History off, the places you go are no longer stored. When you turn off Location History for your Google Account, it’s off for all devices associated with that Google Account.”
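Here’s a minimal sketch of that pattern (purely illustrative, in the same vein as the examples above, and not Google’s actual implementation): the switch the user flips only changes what they are shown, while collection continues underneath.

```python
class LocationService:
    """Illustrative only: a 'pause' switch that governs display, not collection."""

    def __init__(self):
        self.history_paused = False   # the switch the user sees
        self._stored_points = []      # what the service keeps regardless

    def record(self, point: tuple) -> None:
        # Note that collection never consults history_paused.
        self._stored_points.append(point)

    def user_visible_timeline(self) -> list:
        # Pausing only hides the data from the user's own view.
        return [] if self.history_paused else list(self._stored_points)

svc = LocationService()
svc.history_paused = True             # user believes tracking has stopped
svc.record((52.52, 13.40))
print(svc.user_visible_timeline())    # [] -- looks private...
print(len(svc._stored_points))        # 1 -- ...but the point was kept anyway
```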

What’s more, back in 2015 the company unveiled its streamlined privacy and security settings, where “Location History” appears as a self-contained category; however, the option to completely disable the sharing of location data with Google services lives in a different category. This again illustrates that through opaque, piecemeal controls and ambiguous language, companies can make even tech-savvy users feel empowered without actually putting them in control.

Can you actually control your privacy?

Completely reclaiming your privacy is a tall order, perhaps one that can’t be fully met, but keeping the following in mind will at least help you hold on to some of it.

1. Don’t trust defaults. If the examples in this post have made anything clear, it’s that in many cases default settings aren’t your friend. When signing up for anything, online or not, make sure you read the fine print and check your settings.

2. Everyone uses dark patterns, not just the big guys. The list of dark patterns you’ll encounter on Facebook and the other tech giants is almost endless, but that doesn’t mean they’re the only ones doing this. You can find dark patterns pretty much anywhere. In fact, some are so common that they’re simply accepted as facts of life. For example, we all take out our credit cards when signing up for a free trial with an online service, knowing that we likely won’t be reminded to cancel before we’re enrolled in a paid subscription. As depressing as it sounds, whenever you join a service, watch for such practices.

3. Read up on dark patterns and psychology. The best way to avoid falling for these tactics is to study them so that you can learn to recognize them. Harry Brignull’s dark patterns website is a good place to start. A basic understanding of the cognitive biases that companies exploit can be helpful, too.

4. Tell your network about dark patterns. Even if you’re on your privacy A-game and have studied dark patterns, make sure your friends and associates are also in the know, because privacy depends on all of us. What our friends and family share about us can hurt our privacy, regardless of our own settings.

Want to learn more about protecting your privacy? Follow our privacy blog to find out how you can protect your privacy online and offline.