Posted on 2nd April 2024

Examining one of Facebook's dark patterns

It just so happens that after many years of having no account on Facebook, I decided to join up again to interact with a few specific groups. Now I could discuss how my account got suspended, my suspicions about the levels of information Facebook are harvesting on us without our consent, or the fact that my account suspension seemed to coincide with me locking down my privacy settings.

But instead I want to discuss something much simpler and more self-contained: a user experience choice designed specifically to steer the user away from proactively managing their privacy.

I did manage to get my account reinstated. For how long I don't know, nor is it of great concern to me. While I still have access to it, though, I took the opportunity to grab some screenshots of a simple 'dark pattern' that exemplifies the falsehood of Facebook's privacy claims.

On the mobile web version of Facebook your privacy settings are several screens deep, in a place that even I, as a tech-savvy individual, took several minutes to find. But that's not what I want to look at, since there's a lot of nuance in the information architecture of a product like this.

Instead let's look at what happens if you attempt to change one of the privacy settings:

tbd
Facebook's privacy settings. Black redaction and red highlight added by me.

If I tap on one of the settings, here are the options I'm presented with:

tbd
Changing a privacy setting. Red highlight added by me.

Only if I tap on 'see more' do I get the following option set:

tbd
'Only me' will only appear if you select 'see more'. Red highlight added by me.

There is no reason to hide the more restrictive privacy settings such as 'only me'. It is not necessary for responsive design on a mobile device, and it is much worse for accessibility. There is plenty of screen real estate to present the option. And yet it's hidden under a 'see more' link.

There is no nuance here, no case to be argued. Through a simple design decision they have implemented a practice that is not only bad UX but a deliberate dark pattern designed specifically to discourage users from sharing less data.
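To make the pattern concrete, here is a hypothetical sketch of the kind of logic involved. This is not Facebook's actual code, and the option labels and the number of visible options are illustrative assumptions; the point is simply that truncating the list before the most restrictive choice forces an extra tap to reach it.

```typescript
// Illustrative sketch only — not Facebook's actual implementation.
// An audience picker that truncates its options behind a 'see more'
// control, leaving the most restrictive choice hidden by default.
const AUDIENCE_OPTIONS: string[] = [
  "Public",
  "Friends",
  "Specific friends",
  "Only me", // the most private setting sits at the end of the list
];

// Assumed number of options shown before the 'see more' link.
const VISIBLE_COUNT = 3;

function visibleOptions(expanded: boolean): string[] {
  // Until the user taps 'see more', only the first few options render.
  return expanded ? AUDIENCE_OPTIONS : AUDIENCE_OPTIONS.slice(0, VISIBLE_COUNT);
}

console.log(visibleOptions(false).includes("Only me")); // false — hidden by default
console.log(visibleOptions(true).includes("Only me")); // true — only after an extra tap
```

Nothing about this is technically necessary: rendering one more row costs nothing. The hiding is purely a design decision.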

Now I know what some people might say about this: "it's one extra tap, what's the big deal?" While many tech industry veterans and ordinary users alike will recognise that one extra tap is in fact a big deal, the issue goes beyond the usability of the product.

Others might take the view that Facebook is merely "protecting us" from inadvertently sharing less than we would like to. "Things are better when we are open and social" is a stance Meta themselves would probably take. However, user interfaces should not be opinionated. They can guide us to help us find the settings and features we want to use, but they should never obfuscate our options.

To illustrate the above point, let's take a more traditional online transaction - an e-commerce site, for example. Obviously the aim of the site is to get you to make an informed purchase of items that you as the consumer will be satisfied with. At no point do any e-commerce sites I use ask me how much I want to pay for the product while deliberately hiding the options to pay less. They may offer alternative items that are more expensive, and promote those accordingly. But it would be highly ethically dubious if they started putting cheaper variations of the product under a 'see more' link in a drop-down, or if the cheaper delivery / shipping options were hidden behind a 'see more' link.

Since, in the case of Facebook, the transactional nature of our interaction with the product is much less clear and involves our personal information and our privacy, I would argue it is even more ethically concerning.

At the heart of this is the conundrum Facebook have created by claiming to care about privacy while implementing technology that doesn't take users' privacy seriously. It is a conundrum compounded by the fact that we all now know the unwritten contract that exists in using these products: we are selling our data in return for the product.

The deception is in the pretence that privacy is valued by Meta. I'm OK with the value proposition of data in exchange for access to a product, as long as it's explicit (which it obviously isn't). What I'm not OK with are contradictory practices such as this, which raise not only ethical questions but legal ones too. That is something the designers, developers, lawyers and executives at Meta have to live with, and something they are not immune from scrutiny over. How about you, are you OK with this?