Disclosure: I work as a Software Development Engineer for Amazon (Alexa). I also exist as a living human person outside of my job. The views and opinions in this post are entirely my own.

I joined Facebook back in 2006, during my first few days as an undergraduate. Over the next few years, it would play an increasingly major role in my life. Facebook became central to connecting with friends, group chats, planning events; the epicenter of pretty much all communication taking place online. Not long after, membership was relaxed to allow anyone to sign up, and it became a central hub even for connecting with family.

On paper, Facebook sounds wholesome — it’s hard to find any fault in a platform based on connecting people — but about one year ago I deactivated my account and haven’t looked back since. The cost was just too high.

What cost? Quite simply: my time, and the freedom to form my own opinions. The longer you spend on Facebook, the more opportunities they have to target you with ads. The very purpose of these ads is to alter your mindset about something, be it the necessity of a Fitbit purchase or your core political beliefs. In the age of Big Data, Facebook is able to take a systematic approach to maximizing their effectiveness on both counts.

The average amount of time spent on social media increased by 52 minutes between 2012 and 2018, to nearly 2.5 hours per day¹. Over the same period, the average leisure time for Americans remained steady at around 5 hours per day². This means that nearly half of all leisure time is now being spent on social media.

Personally (and I’m sure I’m not alone), I’ve always felt like I did not have enough leisure time. If you asked me to plan out how I’d like to spend 5 hours, I would certainly not dedicate half to social media. So when I learned about this increase in screen time, I began tracking my own usage. Mine was a little below the average, but I still spent about two hours per day on social media. Was it worth it? It certainly didn’t make me feel relaxed, fulfilled, or even connected to my communities. Yet it seemed no amount of self-restraint could keep me from checking my news feed every morning, night, elevator ride, or lunch break, or any time I was a little distracted.

That’s not surprising though, as the incentives for Facebook are clear. To make money, they must keep us actively engaging with their platform. And Facebook has been built specifically to addict us, using techniques from a field called “Persuasive Technology”. It sounds insidious, but there’s a whole department at Stanford dedicated to exactly this (which brazenly flaunts all the ways you can be manipulated right on its homepage)³.

This addiction is built using what is called the Hook Cycle, which consists of four stages: trigger, action, reward, and investment. The trigger initially begins as external, such as hearing about a product from a friend and deciding to try it for yourself. This leads to the action, such as updating your status. Next comes the reward, in the form of likes, shares, and comments. Then comes investment: the more friends you have, the greater your next reward can be. Finally, you are re-engaged by new notifications (photo likes, new posts from friends, etc.), and the cycle repeats. As the cycle repeats and your investment grows, you become officially hooked.
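The loop above can be sketched as a toy simulation. Everything here is invented for illustration (the probabilities, the increments, and the idea that “investment” is just a friend count); it only shows the shape of the feedback loop, not any real model Facebook uses:

```python
# Toy simulation of the Hook Cycle: trigger -> action -> reward -> investment.
# All numbers are made up; "investment" is reduced to a friend count.
import random

def hook_cycle(days, check_prob=0.3):
    """Return habit strength (chance an internal trigger fires) after `days`."""
    investment = 10          # e.g. number of friends
    habit_strength = 0.1     # starts low: you rarely open the app unprompted
    for _ in range(days):
        external = random.random() < check_prob      # a friend mentions the app
        internal = random.random() < habit_strength  # boredom, a notification
        if external or internal:                     # trigger fires
            reward = random.randint(0, investment)   # action -> variable reward (likes)
            investment += 1                          # investment: network grows
            # each rewarded visit strengthens the habit a little
            habit_strength = min(0.95, habit_strength + 0.002 * reward)
    return habit_strength

print(hook_cycle(30))   # habit strength after a month of casual exposure
print(hook_cycle(365))  # after a year, the internal trigger dominates
```

The key dynamic is that investment (friends) raises the size of the next reward, which in turn raises the chance of an internal trigger, so the external trigger stops being necessary.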

So, are you hooked? What do you look at while drinking your morning coffee?

Facebook also makes use of variable rewards, which behavioural psychology has shown to be the most effective schedule for behaviour modification⁵. When we know a reward is coming, dopamine rises in our brains (which is very enjoyable), but consistent rewards soon become too predictable. If the rewards are randomized, the dopamine response remains strong. This is seen in animals as well as humans.
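A crude way to see why randomized rewards stay compelling is to model the “surprise” signal (a rough stand-in for reward-prediction error) as we learn to expect a reward. This is a toy model with invented numbers, not neuroscience: it only shows that a fixed reward becomes predictable while a randomized one with the same average does not:

```python
# Toy model: predictable rewards produce a shrinking "surprise" signal,
# while randomized rewards with the same mean keep surprise high.
import random

def average_surprise(reward_fn, trials=10_000, seed=0):
    rng = random.Random(seed)
    predicted = 0.0
    total = 0.0
    for _ in range(trials):
        actual = reward_fn(rng)
        total += abs(actual - predicted)         # surprise = prediction error
        predicted += 0.1 * (actual - predicted)  # we learn to expect the reward
    return total / trials

fixed = average_surprise(lambda rng: 1.0)                    # same reward every time
variable = average_surprise(lambda rng: rng.choice([0, 2]))  # random, same mean
print(fixed, variable)  # the fixed schedule's surprise decays toward zero
```

With a fixed reward, the prediction converges and surprise vanishes; with the randomized schedule, the prediction hovers near the mean but each individual reward still misses it, so the surprise never decays.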

Once social media is established as part of your ritual, Facebook’s next goal is to keep you engaged for as long as possible, so that they can sell access to your attention. They do this through features like the timeline (which met with a lot of backlash when it was first added): an endlessly scrolling news feed of content selected by an algorithm. Natural breaks in an interaction, such as “next page” buttons, give users the opportunity to pause and re-evaluate their interest. Like Netflix’s auto-play, Facebook’s endless scroll eliminates that chance to reconsider. In fact, there have even been proposals in the US to ban endless scrolling and auto-play altogether due to their addictive power⁴.

Endless scrolling has its downsides, and so does the move to algorithmic selection of content. The feed used to be chronological, so it was obvious to us as users why we were seeing a friend’s update. But with the timeline, Facebook gained the ability to re-order or even suppress updates. If your friends find you boring, Facebook has the power to simply hide your contributions.

The choice of what content appears in your timeline also has the power to shape your view of the world according to Facebook’s desires. This was demonstrated in a study Facebook conducted on its own users (without their consent) to see whether the psychological phenomenon of “emotional contagion” could take place over text-based communication. Emotional contagion is a behaviour in which one individual’s emotions are directly triggered by another’s simply through interaction. In the study, Facebook altered the feed content of 689,003 users and observed their behaviour over a period of two weeks. When negative content increased, the use of negative language increased and the use of positive language decreased. The inverse was also true: when positive content increased, the use of positive language increased and negative language decreased. In other words, Facebook can and did alter its timeline algorithm to make 689,003 users feel happier or sadder, without their knowledge.

While my discomfort with Facebook had been simmering for a while, the final straw was the Cambridge Analytica scandal. If you haven’t heard of it, Cambridge Analytica was a political consulting firm that combined results from a personality quiz taken by 270,000 users with harvested data from 50 million non-consenting Facebook users. By correlating quiz results with Facebook activity in the smaller data set, they were able to use machine learning to build personality profiles for all 50 million users. Combining those psychological profiles with Facebook’s ability to target ads at specific groups created the opportunity for very insidious targeting.
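The profiling step is worth making concrete. The sketch below uses entirely made-up page names and scores, and a deliberately naive method (averaging per-page trait scores from the labelled quiz-takers); the real pipeline used far richer features and models. It only illustrates the structure of the trick: learn from the small labelled group, then score everyone else:

```python
# Naive sketch of bootstrapping personality scores from a small labelled set.
# Data and page names are invented; scores are a made-up 0..1 "neuroticism".

labelled = [  # (pages a quiz-taker liked, their quiz-derived trait score)
    ({"TrueCrimeDaily", "InsomniaMemes"}, 0.9),
    ({"TrueCrimeDaily", "HomeSecurityTips"}, 0.8),
    ({"SunnyRecipes", "TrailRunning"}, 0.2),
    ({"TrailRunning", "GardenClub"}, 0.1),
]

# Learn a per-page weight: the average trait score of quiz-takers who liked it.
page_scores = {}
for pages, score in labelled:
    for page in pages:
        page_scores.setdefault(page, []).append(score)
weights = {page: sum(s) / len(s) for page, s in page_scores.items()}

def predict(pages):
    """Score a non-quiz-taker by averaging the learned weights of their likes."""
    known = [weights[p] for p in pages if p in weights]
    return sum(known) / len(known) if known else 0.5  # 0.5 = no information

print(predict({"TrueCrimeDaily", "GardenClub"}))  # falls between the two groups
```

The unsettling part is that the 50 million profiled users never took the quiz; a few hundred thousand volunteers were enough to turn everyone else’s likes into a psychological score.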

This targeted advertising allowed political groups to select vulnerable individuals and lobby them with aggressive campaigns. In one case, the pro-gun lobby targeted single mothers with tendencies toward neurosis and showed them ads featuring break-ins in the middle of the night⁶. Variations of these ads (wording, font size, pictures) could be tested on a small number of single mothers, and whichever version drew the most engagement could then be broadcast to everyone. Facebook serves its users up to advertisers like lab rats in a perfect propaganda lab. Unlike TV and newspaper ads, Facebook ads also provide granular feedback about who clicked and how long they spent on the external page. On top of that, your Facebook ads are completely personalized. Nobody else sees what you see, so the balancing act of other people fact-checking or debating an inaccurate or false ad is non-existent.
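The variant-testing loop described above is just a standard A/B test. Here is a minimal sketch of its shape; the variants, click rates, and “simulated” engagement numbers are all invented (a real campaign would read real engagement back from the ad platform’s reporting tools):

```python
# Sketch of the ad variant-testing loop: try variations on a small audience,
# keep whichever gets the most engagement. Engagement is simulated here.
import random

rng = random.Random(1)

variants = [
    {"headline": "Protect your family tonight", "font_size": 14},
    {"headline": "Protect your family tonight", "font_size": 18},
    {"headline": "Would you hear a break-in?",  "font_size": 18},
]

def simulated_click_rate(variant, audience_size=500):
    # Stand-in for real per-variant feedback (clicks / impressions).
    # Invented effect: bigger font nudges the underlying click probability.
    base = 0.02 + 0.01 * (variant["font_size"] == 18)
    clicks = sum(rng.random() < base for _ in range(audience_size))
    return clicks / audience_size

results = {f"{v['headline']} @{v['font_size']}pt": simulated_click_rate(v)
           for v in variants}
winner = max(results, key=results.get)
print(winner)  # this variant would then be broadcast to the full audience
```

The point is how cheap the loop is: each probe costs a few hundred impressions, and the per-user feedback lets the advertiser iterate until the message is maximally persuasive before anyone outside the target group ever sees it.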

After learning about Cambridge Analytica I finally pulled the plug on my Facebook account.

As a Software Developer with experience in Big Data and Machine Learning, it finally clicked for me just how ripe a sandbox Facebook had become for perfecting our addictions and reshaping our views of the world. It all felt too creepy. Call me melodramatic, but I would rather risk social ostracism than expose myself to that level of manipulation, and I also didn’t want my own data to be used to manipulate other people who are similar to me.

About three minutes after I deactivated, I signed up again, by accident. Checking Facebook was such an ingrained habit that my brain auto-piloted me back to the homepage; my password auto-filled, and I clicked “log in” without thinking. It was enlightening to see just how deep the habit ran. I quickly deactivated again (and haven’t been back since).

Since deactivating, I have realized just how superficial my social interactions on Facebook had become. I do miss the occasional interaction with distant friends, but sadly what I missed most was the dopamine rush of getting a lot of likes. Luckily, quitting didn’t seem to change how much time I spend with close friends and family, and it’s now more exciting to catch up and hear about their lives first hand. It also became clear how addicted I had become to other platforms, so I have begun monitoring all of my internet usage with apps like Moment and Space to see what else has its hooks in me. It seems the whole internet has degraded into a competition for our attention, falling into many of the same dark patterns as Facebook. At the center of this trend is the consumer belief that software and internet services should be free. As the saying goes: if you’re not paying, then you are the product.

Why is it we are OK with paying $4 for a coffee, but scoff at the idea of paying to access an app? We marvel at how cheap it is to send a piece of paper across the world in a few days, yet willingly hand over all our personal email to Google rather than pay a few dollars a month. Since deactivating, I have been gravitating toward platforms whose business case is obvious and doesn’t rely on selling their users’ data.

In summary: I deactivated my account because I just don’t trust Facebook. They try to turn you into an addict, sell access to your data and attention, and have shown through their own experiments that they can and will manipulate your mood, political views, and purchasing choices at will.

My advice to you is this: deactivate Facebook. It’s easy to do, and you can rejoin by literally logging in again. How long do you think you can last? It will probably require a little willpower in the first few days, so take the opportunity to examine the motives drawing you to sign back in. Are you coming back for the human connection, or for the likes?

If you enjoyed this please feel free to share, but please don’t go on Facebook to do so! I also plan to follow up with a post featuring my ideal future for the internet. In the meantime, I strongly recommend checking out some material that really resonated with me:
You Are Not a Gadget by Jaron Lanier
Hello World: How Algorithms Will Define Our Future and Why We Should Learn to Live with It by Hannah Fry
The Center for Humane Technology, started by Tristan Harris.

1. How Much Time Do You Spend on Social Media?
2. American Time Use Survey Tables (2012, 2018)
3. Stanford Persuasive Tech Lab Homepage
4. Hooked: How To Make Habit-Forming Products, And When To Stop Flapping
5. Why Behavioral Psychology Makes Apps So Addictive
6. Computers tell us who to date, who to jail: But should they?

I’m a Software Engineer who practices Human Centred Computing. I’m also a new dad, husband, coffee roaster, mountain biker, who can’t be constrained to 160 char