Social Media is Ruining Us

Over at the Atlantic, Jonathan Haidt has a piece on Why the Past 10 Years of American Life Have Been Uniquely Stupid. I don’t agree with everything in it and feel like some of the examples are a bit of a reach, but the central premise – namely, that social media (and especially algorithmic social media) is rapidly eroding the social fabric and making us collectively dumber – feels pretty spot on, and he cites research on the matter to back it up.

But gradually, social-media users became more comfortable sharing intimate details of their lives with strangers and corporations. As I wrote in a 2019 Atlantic article with Tobias Rose-Stockwell, they became more adept at putting on performances and managing their personal brand—activities that might impress others but that do not deepen friendships in the way that a private phone conversation will.

Once social-media platforms had trained users to spend more time performing and less time connecting, the stage was set for the major transformation, which began in 2009: the intensification of viral dynamics.

Further:

By 2013, social media had become a new game, with dynamics unlike those in 2008. If you were skillful or lucky, you might create a post that would “go viral” and make you “internet famous” for a few days. If you blundered, you could find yourself buried in hateful comments. Your posts rode to fame or ignominy based on the clicks of thousands of strangers, and you in turn contributed thousands of clicks to the game.

This new game encouraged dishonesty and mob dynamics: Users were guided not just by their true preferences but by their past experiences of reward and punishment, and their prediction of how others would react to each new action. One of the engineers at Twitter who had worked on the “Retweet” button later revealed that he regretted his contribution because it had made Twitter a nastier place. As he watched Twitter mobs forming through the use of the new tool, he thought to himself, “We might have just handed a 4-year-old a loaded weapon.”

As a social psychologist who studies emotion, morality, and politics, I saw this happening too. The newly tweaked platforms were almost perfectly designed to bring out our most moralistic and least reflective selves. The volume of outrage was shocking.

And let’s not forget what current trends in AI end up allowing:

Now, however, artificial intelligence is close to enabling the limitless spread of highly believable disinformation. The AI program GPT-3 is already so good that you can give it a topic and a tone and it will spit out as many essays as you like, typically with perfect grammar and a surprising level of coherence. In a year or two, when the program is upgraded to GPT-4, it will become far more capable. In a 2020 essay titled “The Supply of Disinformation Will Soon Be Infinite,” Renée DiResta, the research manager at the Stanford Internet Observatory, explained that spreading falsehoods—whether through text, images, or deep-fake videos—will quickly become inconceivably easy. (She co-wrote the essay with GPT-3.)

It’s a bit of a long read, and it’s okay if you don’t 100% agree with his points, but there’s a lot there worth considering. He does end the piece on a somewhat hopeful note, offering some suggestions for things that could be done to help the situation. I’m a bit less optimistic that we’ll be able to implement any of those reforms, and unfortunately I have no better ideas for how to come back from our current state.

Lost Touch

In an article from The Guardian that’s been sitting in my backlog for a while (it was published back in January 2021), Eleanor Morgan discusses “Lost touch: how a year without hugs affects our mental health“. It’s not a particularly long read, but it does have some good links to research and other information about the impact that not having enough human contact in your life can have.

As adults, we may not comprehend the importance of touch even when it disappears. “We might begin to realise that something is missing, but we won’t always know that it’s touch,” says Prof Francis McGlone, a neuroscientist based at Liverpool John Moores University and a leader in the field of affective touch. “But when we talk about the problem of loneliness, we often ignore the obvious: what lonely people aren’t getting is touch.”

Certainly rings true to me.

“Touch is a modulator that can temper the effects of stress and pain, physical and emotional. We have seen in our research that a lack of touch is associated with greater anxiety,” says [Dr. Katerina] Fotopoulou. “In times of high stress – the loss of a job, or a bereavement, for example – having more touch from others helps us cope better, particularly in calming the effects of [the stress hormone] cortisol.” Even if we’re used to not being touched a lot, after a while the need can feel very physical – sometimes described as “skin hunger” or “touch hunger”.

I also thought it was interesting to hear about CTs – nerves we have that are keyed for gentle contact:

The two square metres of skin that contain us are teeming with nerve fibres that recognise temperature, texture and itch, etc. One set of fibres exists purely to register gentle, stroking touch: the C tactile afferents (CTs). [Professor Francis] McGlone has been studying this since 1995, when it was discovered in humans. “These neurons, in the skin of all social mammals, transmit slow electrical signals to the emotional processing parts of the brain. They play a critical role in developing the social brain and our ability to withstand stress.”

I’ve discussed loneliness and depression on here before – it’s a bit of a recurring topic. I’ve seen and read other articles on the need for physical contact (and could have sworn I’d linked to some on here, though they seem to be escaping my search at the moment), and it’s something I’ve definitely given a lot of thought to. I’ve gone through various periods in my life with very little physical contact, and I know from experience what a difference it can make to my general health, happiness, and wellbeing.

Efficiency != Effectiveness

Over on fs.blog, an article discussing how Efficiency is the Enemy. It’s got some solid observations, mostly gleaned from a book on the subject by Tom DeMarco called Slack: Getting Past Burnout, Busywork, and the Myth of Total Efficiency. For instance:

It’s possible to make an organization more efficient without making it better. That’s what happens when you drive out slack. It’s also possible to make an organization a little less efficient and improve it enormously. In order to do that, you need to reintroduce enough slack to allow the organization to breathe, reinvent itself, and make necessary change.

Tom DeMarco

There have been some articles already about the idea of leaving room for flexibility and inspiration in creative endeavors, but I do think it applies to other kinds of work as well. I feel like the urge for “efficiency” is driven by the same Puritan-work-ethic mentality that loads children up with hours of homework every night – namely, the idea that idleness is inherently wasteful, rather than an essential part of healthy productivity and learning.

Digital Nomad Timeline

Found via Kottke, here’s an excellent timeline of the idea of remote work and digital nomadism. The idea has been around for a long time – the timeline starts in 1964, with Arthur C. Clarke predicting it well before it was broadly feasible (it’s sort of remarkable how much those ’50s and ’60s futurists managed to get right). It’s been a long time coming, and while it’s not for everyone, the pandemic certainly gave many more people the chance to try it out. (It’s an imperfect trial, since being in quarantine, with many places in lockdown, isn’t necessarily indicative of what it would be like without that restriction and background stress.)

It’s unsurprising but sad that many companies are already insisting people come back to the office, despite it: a) arguably being too early given vaccination rates, new infections, and variants; b) not being necessary, based on general productivity gains and losses compared to in-office; c) not being what their employees want, many of whom seem to prefer either remote or a hybrid of in-office and remote. (Personally, I’m quite happy working remotely 90% of the time, but recognize that it’s useful to get some real face time, too. Anecdotally, I seem to do best when I’m remote most of the time, then go into the office maybe once or twice a week. I’d be interested in trying out something like being primarily remote and then coming to work from the office for a week or two maybe once a quarter or a few times a year.)

Anyway, definitely some food for thought, and interesting to see the sort of evolution and adoption of digital nomad lifestyles across the past few decades.

The Outrage Machine

Over at Garbage Day, Ryan Broderick discusses A Unified Theory of Online Anger, noting how algorithmic social media has effectively been weaponized (notably by the right, but let’s be honest, not just by them). They’re not wrong.

As these trending main characters go viral on Twitter, hundreds of online outlets race to turn this into content. And there’s a real financial incentive for covering these stories. As most people working at various content mines can tell you, the thing Facebook readers love the most is getting mad about stuff that’s happening on Twitter.

Ryan Broderick

Tracking Overnight Camping

If you are like me and love the National Parks, but hate the crowds, you might appreciate A Night Under the Stars. It’s a look at when people are spending the night the most in each national park. It’s pretty interesting to see when each park is most popular (and by extension, which periods should be low population but still nice to visit). Found via the inimitable Kottke.

How Life Became an Endless, Terrible Competition

Over at the Atlantic, Daniel Markovits discusses how Meritocracy harms everyone. While many of the examples he cites are centered around the wealthy elite, that’s kind of the point – it’s already clear to people not in that caste that the notion of meritocracy, while good in concept, is a sham in practice. As soon as the criteria for “merit” become understood, they are gamed by those in a position to do so, further entrenching the already wealthy (though with a slightly more palatable veneer than the up-front nepotism of previous systems).

Hardworking outsiders no longer enjoy genuine opportunity. According to one study, only one out of every 100 children born into the poorest fifth of households, and fewer than one out of every 50 children born into the middle fifth, will join the top 5 percent. Absolute economic mobility is also declining—the odds that a middle-class child will outearn his parents have fallen by more than half since mid-century—and the drop is greater among the middle class than among the poor. Meritocracy frames this exclusion as a failure to measure up, adding a moral insult to economic injury.

Daniel Markovits

I agree that it’s a problem, and I don’t really know a good solution. It sounds like the author has some ideas (some of which sound a little… optimistic to me). It’s clear that things are coming to a head, though, and something is going to have to change.

Night Owls For Life

Over at Vox: If you’re just not a morning person, science says you may never be. This is from a few years ago, but I only came across it relatively recently: there’s increasing evidence that whether you’re a night person or a morning person has a genetic component, and that fighting it can have significant impacts on your health. A salient bit:

Even people who are slightly more oriented to the evening — people who would like to sleep between 1 am and 9 am, say — may be faced with a difficult choice: Listen to your body, or force it to match the sleep habits of most everyone else?

Research has been gaining insight on that question. It turns out our internal clocks are influenced by genes and are incredibly difficult to change. If you’re just not a morning person, it’s likely you’ll never be, at least until the effects of aging kick in.

And what’s more, if we try to live out of sync with these clocks, our health likely suffers. The mismatch between internal time and real-world time has been linked to heart disease, obesity, and depression.

Brian Resnick, Vox

For the record, I’ve always been a night owl. When left to my own devices (extended periods where I was setting my own schedule), I tend to go to bed around 2am and then get up around 10am. That’s still just 8 hours, but because it’s offset from the schedule most of society runs at, it still comes across as oversleeping. (These days, a mixture of work and the dog keep me getting up earlier… but I also find myself feeling like I need a nap more often, as well.)

It reminds me of something from The Devil’s Dictionary:

DAWN, n. The time when men of reason go to bed. Certain old men prefer to rise at about that time, taking a cold bath and a long walk with an empty stomach, and otherwise mortifying the flesh. They then point with pride to these practices as the cause of their sturdy health and ripe years; the truth being that they are hearty and old, not because of their habits, but in spite of them. The reason we find only robust persons doing this thing is that it has killed all the others who have tried it.

Ambrose Bierce, The Devil’s Dictionary

Instagram Influenced Architecture

From the Guardian: Snapping point: how the world’s leading architects fell under the Instagram spell. We’ve always had architecture-as-spectacle – if anything, Instagram is just the latest in a series of driving forces. But it’s still worth thinking about. Something the author notes:

Configuring buildings and public spaces as selfie sets may well work for tourism promotion and the buzz of a launch, but once the novelty factor has worn off, the whimsy can grate and the flimsiness become all too apparent. The urge for quick, affordable spectacle often leads to stick-on, paper-thin cladding materials that look good in photographs, but weather terribly. The stained, peeling facades of the last decade stand as a grim testament to prioritising photographability over function.

Oliver Wainwright

Art is wonderful, and public art is essential. It enhances and enriches the spaces we inhabit. But when we insert an overtly capitalist motivation of cashing in on a craze, we sacrifice the care and consideration in how we craft this art.

Something this also gets me thinking about is replication – that is, our penchant for seeing an interesting photo and striving to replicate it. This isn’t new, and in many cases it’s by design (consider Yosemite, where you round a corner and get a reveal of El Capitan or Half Dome; you stop, say “Wow,” and take a picture – an experience designed over a century ago, and part of why we have 400 million nearly identical shots of these locations). It just bums me out, for some vague reason I can’t quite put my finger on, that rather than being inspired by an interesting photo, there’s this impetus to simply replicate it. I should probably chew on that a bit.