Stagger onward rejoicing

Tag: Facebook

decline and fall

TikTok and the Fall of the Social-Media Giants: A very interesting post by Cal Newport. His thesis is, essentially, as follows: TikTok’s popularity has alarmed Facebook — a company that has a history of forgetting what it does well in order to chase immediate relevance — and as a result Facebook is neglecting to consolidate its advantage in the “social graph.” The result will inevitably be a further and more precipitous decline in Facebook’s influence — but it is also unlikely that TikTok itself will remain as dominant as it is. 

As Newport says in an accompanying blog post, “If platforms like Facebook and Instagram abandon their social graphs to pursue this cybernetic TikTok model, they’ll lose their competitive advantage. Subject, all at once, to the fierce competitive pressures of the mobile attention economy, it’s unclear whether they can survive without this protection.” Thus: “If TikTok acts as the poison pill that finally cripples the digital dictators that for so long subjugated the web 2.0 revolution, we just might be left with more breathing room for smaller, more authentic, more human online engagements.” 

Well, let’s hope so. I’d love to see a future in which the algorithmic social-media domination of our online lives ends, and we return to online life at a more human scale. But how likely is that? We know that the venture capitalists and angel investors don’t want moderate successes — they want The Next Enormous Thing. Will they get it? I think it all hinges on how strongly people respond to VR environments. 

Nicholas Carr:

It’s revealing that, before the arrival of the net, people didn’t talk about “authenticity” as we do today. They didn’t have to. They understood, implicitly, that there was something solid behind whatever show they might put on for public consumption. The show was not everything. The anxiety of the deep fake had not yet taken hold of the subconscious. The reason we talk so much about authenticity now is because authenticity is no longer available to us. At best, we simulate authenticity: we imbue our deep fakeness with the qualities that people associate with the authentic. We assemble a self that fits the pattern of authenticity, and the ever-present audience applauds the pattern as “authentic.” The likes roll in, the views accumulate. Our production is validated. If we’re lucky, we rise to the level of influencer. What is an influencer but the perfection of the deep-fake self? 

This is usefully provocative of reflection — Nick specializes in that — but I don’t think he’s right about authenticity. See Lionel Trilling’s Sincerity and Authenticity, based on lectures given in 1970. He already saw the shift from an ethos of sincerity to one rooted in an ever-elusive quest for authenticity. See also Charles Taylor’s The Ethics of Authenticity (1992), where the fourth chapter is a brilliant capsule history of the various meanings of “authenticity.” It’s especially interesting that Taylor believes that the debased and trivial way that authenticity is defined today is not the only one — that there is a nobler understanding of authenticity as a “moral ideal” that is worth defending. 

UPDATE: Also, see the recent Hedgehog Review issue on authenticity.

How Facebook and Google fund global misinformation | MIT Technology Review:

Over the last few weeks, the revelations from the Facebook Papers, a collection of internal documents provided to Congress and a consortium of news organizations by whistleblower Frances Haugen, have reaffirmed what civil society groups have been saying for years: Facebook’s algorithmic amplification of inflammatory content, combined with its failure to prioritize content moderation outside the US and Europe, has fueled the spread of hate speech and misinformation, dangerously destabilizing countries around the world.

But there’s a crucial piece missing from the story. Facebook isn’t just amplifying misinformation.

The company is also funding it.

FiveThirtyEight:

I asked each of the interviewees what they would do if I waved a magic wand and gave them total control over Facebook. “I would turn off Facebook and apologize to the people of the world,” said Cathy O’Neil, a data scientist and algorithmic auditor. “They can’t actually solve their problems.”

two quotations on the metaverse

Nick Carr:

Facebook, it’s now widely accepted, has been a calamity for the world. The obvious solution, most people would agree, is to get rid of Facebook. Mark Zuckerberg has a different idea: Get rid of the world. […] 

His goal with the metaverse is not just to create a virtual world that is more encompassing, more totalizing, than what we experience today with social media and videogames. It’s to turn reality itself into a product. In the metaverse, nothing happens that is not computable. That also means that, assuming the computers doing the computing are in private hands, nothing happens that is not a market transaction, a moment of monetization, either directly through an exchange of money or indirectly through the capture of data. With the metaverse, capital subsumes reality. It’s money all the way down.

Clive Thompson:

The truth is, a thriving metaverse already exists. It’s incredibly high-functioning, with millions of people immersed in it for hours a day. In this metaverse, people have built uncountable custom worlds, and generated god knows how many profitable businesses and six-figure careers. Yet this terrain looks absolutely nothing like the one Zuckerberg showed off.

It’s Minecraft, of course. 

(I am not returning to blogging as such, not until 2022 at the earliest, but it occurred to me this morning that I still need a place to put quotes I want to remember and use later, so I’ll keep using the blog for that.)

When I read this story, I had one immediate thought: Facebook will close the loophole that allows Oculus purchasers to get customer service. They really are the worst big corporation in American history. The damage that they have done to our social order is incalculable, and they compound that by their absolute contempt for their users — because, of course, as has often been said, the users are not the customers. It’s usually said that the users are the product, but they aren’t even that: they’re the conduit for the product, which is data. Facebook’s attitude towards users who have problems is precisely the same as the attitude of a factory dairy farm towards cows that cease to give milk.

it’s time

I read stories like this almost every day: banned from Twitter for no good reason; banned from Facebook for no good reason; banned from Facebook supposedly by accident, but come on, we know what’s going on here.

I don’t for an instant think Bret Weinstein’s Facebook account was flagged by an algorithm: someone there wanted to silence him and hoped to get away with it. But most of the time these bans happen because the sheer scale of these platforms makes meaningful moderation impossible. Facebook and Twitter would have to hire ten times the number of moderators they currently employ to make rational judgments in these matters, and they won’t voluntarily cut into their profits. They’ll continue to rely on the algorithms and on instantaneous denials of appeals.

Here’s your semi-regular reminder: You don’t have to be there. You can quit Twitter and Facebook and never go back. You can set up social-media shop in a more humane environment, like micro.blog, or you can send emails to your friends — with photos of your cats attached! If you’re a person with a significant social-media following, you can start a newsletter; heck, you can do that if you just want to stay in touch with five or six friends. All of the big social-media platforms are way past their sell-by date. The stench of their rottenness fills the room, and the worst smells of all come from Facebook and Twitter.

In your heart you know I’m right: It’s time to go.


P.S. Of course, I’ve been singing this song for a long time. I return to it now simply because the election-as-mediated-through-social-media seems to be exacerbating the misery of millions and millions of people. I’ll try to sing a different song from now on.

what can’t be changed

Many outlets have reported in recent days that Facebook is testing the removal of Likes from posts. Let’s say they do eventually implement this feature. If so, then the first question is whether it will be opt-in or opt-out. That is, will Likes show unless you choose not to show them, or will they be hidden unless you choose to show them?

My guess is that, in either case, and despite the many calls by tech journalists for changing the way Facebook works, most Facebook users will want to keep counting Likes. We need to remember that there are over two billion Facebook users, and hundreds of millions of them have undergone years of operant conditioning: they have been trained to seek Likes, to rejoice in Likes, to be made miserable by the absence of Likes. Many of them have returned thousands and thousands of times to Facebook to check their count of Likes, refreshing their browser tab even when they don’t need to. The habit of measuring their personal value by Likes is so deeply ingrained that it is difficult to see how a significant number of them will break it. Or even really want to.

I don’t think we reckon with this phenomenon often enough, or seriously enough. The major social-media companies have been conducting for the past decade an implementation of B. F. Skinner’s principles more massive than anything we can truly imagine. They have found ways to get billions of people to volunteer for the experiments and devote sometimes hours a day to pursuing them. Operant conditioning at this level works. And its effects are difficult to undo.

One way to see this: often when people get sick of Facebook or Twitter or Instagram and find some other online venue, they simply bring with them to their new location the habits they learned in the previous ones: the snark of Twitter, the rants of Facebook, the posturing of Instagram. It’s like the old line about travel: wherever you go, there you are. It’s hard enough for people to leave Facebook or Instagram or Twitter behind; what’s almost impossible to leave behind is the person that those sites’ algorithmic behaviorism turned you into.

Facebook wants to matter

Siva Vaidhyanathan:

There’s no one to punish Facebook if Facebook fails. Facebook’s trying to head off regulation by doing this, this transparency effort. But ultimately, transparency doesn’t matter. The same practices will occur. Facebook is committed to being a factor in global politics. And it not only wants to make money off it, more importantly, it wants to matter. Facebook wants to be the place where we conduct our politics.

Here’s what I don’t understand: Vaidhyanathan says this and much worse about Facebook, but also, in the same interview and elsewhere, discourages people from leaving Facebook. “By removing yourself from Facebook, you remove yourself from the concern. If you are active on Facebook and you watch how people relate to each other and how it affects you, you can be sensitive to the larger condition.” I don’t see how this follows. I don’t see why you have to be on Facebook to critique Facebook, or to see the damage it does. I ditched Facebook in 2007 but that hasn’t “removed me from concern” about the social and political damage it’s doing. Is Facebook really going to be responsive to people who stay on it no matter how foul it becomes? 
