what can’t be changed

Many outlets have reported in recent days that Facebook is testing the removal of Likes from posts. Let’s say they do eventually implement this feature. If so, then the first question is whether it will be opt-in or opt-out. That is, will Likes show unless you choose not to show them, or will they be hidden unless you choose to show them?

My guess is that, in either case, and despite the many calls by tech journalists for changing the way Facebook works, most Facebook users will want to keep counting Likes. We need to remember that there are over two billion Facebook users, and hundreds of millions of them have undergone years of operant conditioning: they have been trained to seek Likes, to rejoice in Likes, to be made miserable by the absence of Likes. Many of them have returned thousands and thousands of times to Facebook to check their count of Likes, refreshing their browser tab even when they don’t need to. The habit of measuring their personal value by Likes is so deeply ingrained that it is difficult to see how a significant number of them will break it, or even really want to.

I don’t think we reckon with this phenomenon often enough, or seriously enough. For the past decade, the major social-media companies have been conducting an implementation of B. F. Skinner’s principles more massive than anything we can truly imagine. They have found ways to get billions of people to volunteer for the experiments and to devote sometimes hours a day to pursuing them. Operant conditioning at this level works. And its effects are difficult to undo.

One way to see this: often when people get sick of Facebook or Twitter or Instagram and find some other online venue, they simply bring with them to their new location the habits they learned in the previous ones: the snark of Twitter, the rants of Facebook, the posturing of Instagram. It’s like the old line about travel: wherever you go, there you are. It’s hard enough for people to leave Facebook or Instagram or Twitter behind; what’s almost impossible to leave behind is the person that those sites’ algorithmic behaviorism turned you into.

Facebook wants to matter

Siva Vaidhyanathan:

There’s no one to punish Facebook if Facebook fails. Facebook’s trying to head off regulation by doing this, this transparency effort. But ultimately, transparency doesn’t matter. The same practices will occur. Facebook is committed to being a factor in global politics. And it not only wants to make money off it, more importantly, it wants to matter. Facebook wants to be the place where we conduct our politics.

Here’s what I don’t understand: Vaidhyanathan says this and much worse about Facebook, but also, in the same interview and elsewhere, discourages people from leaving Facebook. “By removing yourself from Facebook, you remove yourself from the concern. If you are active on Facebook and you watch how people relate to each other and how it affects you, you can be sensitive to the larger condition.” I don’t see how this follows. I don’t see why you have to be on Facebook to critique Facebook, or to see the damage it does. I ditched Facebook in 2007, but that hasn’t “removed me from concern” about the social and political damage it’s doing. Is Facebook really going to be responsive to people who stay on it no matter how foul it becomes?