Sinister Obligations

The economics behind our obsession with glowing screens.

//mark jay
 

A young woman is sitting alone in the center of a harshly lit room. She is staring at a large screen projecting a surrealist blur of images and sounds. A tangle of electrodes on her scalp is connected to a computer which maps her brain activity. The computer is surrounded by a team of neurologists who study the rhythms of her alpha and beta waves.

Later on, as the doctors pore over the results of the electroencephalogram (EEG), they are not searching for signs of epilepsy or a tumor; rather, they are looking for signs of receptiveness to an advertisement for the newly proposed Taco Bell menu item: The Supreme-Cheezy-Tex-Mex-Chex-Mix-Gordita-Baja.

© 2015 Megan LaCroix, "Untitled"

Scenes slightly less absurd than this one are becoming commonplace. Companies such as Coca-Cola, Yahoo, and Frito-Lay all use “neural marketing” techniques that make a science of capturing human attention. This involves mapping test subjects’ brains to determine whether the regions that deal with emotion (the amygdala) or memory (the hippocampus) are triggered when the subjects are exposed to new product designs, commercials, or catchphrases. Basically, in Freudian terms, companies like NeuroFocus craft ads that are chemically proven to seduce the id in the milliseconds before the ego has time to come in and mess everything up by asking questions like: do I really want to buy this shit?

It would be easy to write this off as a fad, as the comical, ill-fated union of 1984 and Mad Men. But I think the emergence of neural marketing has a lot to tell us about where we are as a society, and where it seems we’re headed.

****************

The economy has to grow. Economists on the left, right, and center agree that anything less than three percent growth in the global economy spells trouble. When there’s no growth, we get an economic crisis.

But the economy doesn’t just have to grow – it has to grow at a compound rate. To show what this means, David Harvey has referred to the apocryphal meeting of an old king and the inventor of chess. The king was eager to reward the man for his invention, and told him that he could have anything he’d like. The inventor, in turn, said that all he wanted was for the king to put a single grain of rice on the first chess square, and double it on every subsequent square. The king, impressed by the man’s humility, quickly agreed. But it didn’t take him long to realize that he’d been duped. By the 21st square, more than a million grains of rice were required. After the 41st square, there wasn’t enough rice in all the world to meet the request.

In one version of the story, the king, once he figured out the trick, had the inventor beheaded on the spot.  [1]

In 1970, 3% compound growth meant that there needed to be $6 billion of new profitable investment in the global economy. In 2012, $2 trillion more needed to find a profitable outlet. By 2030, the number will be $3 trillion. [2]
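
To make the arithmetic behind these figures explicit, here is a back-of-the-envelope sketch; its only inputs are the doubling rule from the parable and the three percent rate cited above, so nothing here goes beyond what the text already claims:

$$\text{grains on square } n = 2^{\,n-1}, \qquad 2^{20} = 1{,}048{,}576 \text{ grains on the 21st square}$$

$$\text{GDP}_t = \text{GDP}_0\,(1+r)^t, \qquad \text{new profitable investment needed in year } t \approx r \cdot \text{GDP}_{t-1}$$

At r = 0.03 the rate never changes, but it is always three percent of an ever-larger base; that is why the absolute sum that has to find a profitable outlet climbs from billions toward trillions even though the headline growth rate stays flat.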

In the US — where consumer activity accounts for 70% of national GDP (as compared to China, where consumerism is only 35% of GDP) — compound growth necessitates our buying more and more products and services. [3] This non-stop consumption is administered by what Eric Schmidt, who would go on to become the CEO of Google, termed the “attention economy.” In the late ’90s, Schmidt declared that the dominant global corporations would be the ones that maximized the number of “eyeballs” they could perpetually engage, control, and direct. [4] The primary site of accumulation in this “attention economy” is the human brain, and the laptops, phones, televisions, iPads, and other glowing screens that we spend most of our lives staring at have turned each waking hour into an opportunity for corporations to vie for our psychic attention.

As Columbia professor Jonathan Crary writes in his recent book 24/7:

Within the space of barely fifteen years, there was a mass relocation of populations into extended states of relative immobilization. Hundreds of millions of individuals precipitously began spending many hours of every day and night sitting, more or less stationary, in close proximity to flickering, light-emitting objects.
— [5]

And the psychological effects of our perpetual engagement with these glowing screens have been dramatic. For starters, this technology is addictive. But unlike smoking cigarettes or drinking coffee, watching gives us no rush. In a seminal study from the late ’80s, test subjects reported that watching TV for extended periods of time made them feel worse than when they didn’t watch, and yet when they couldn’t watch TV they experienced withdrawal and depression. [6] Today, the average American spends more than five hours a day watching video content. [7]

But addiction to television was just the tip of the iceberg. Results from a few recent surveys lay bare our even greater dependence on smartphones:

  • Half of Americans check their phones when they wake up in the middle of the night.
  • 40% of Americans habitually use their phones while on the toilet.
  • A quarter of Americans admit to regularly texting and driving.
  • The average American uses their phone upwards of three hours per day.

As distressing, or numbing, as these facts are, it’s crucial to understand that a reconstruction of our psycho-social makeup is underway that goes well beyond addiction. Indeed, the way we exist in this world has fundamentally changed. As Professor Crary writes, “Instead of a formulaic sequence of places and events associated with family, work, and relationships, the main thread of one’s life story now is the electronic commodities and media services through which all experience has been filtered, recorded, or constructed.” [8]

Now that the majority of Americans no longer work in production or agriculture, the distinction between work, leisure, entertainment, and relaxation can often be reduced to: where, and in what clothes, do we stare at our glowing screens.

Psychoanalysts have started to show how our technology-obsessed world – in which the majority of American babies now learn more words from machines than from their parents — is becoming incompatible with the development of traits such as compassion, patience, and empathy. [9]

The atrophy of these traits is blatant in the political arena – more and more it seems that we respond to problems like poverty and homelessness in the same way we respond to annoying pop-up ads and computer viruses: we act like these problems are inconveniences that we shouldn’t have to deal with in the first place. Consider the 71 US cities that have passed or are trying to pass laws against feeding the needy, or the civic architectural phenomena of curved benches and spiked ledges that ensure that people with nowhere else to sleep cannot sleep in public spaces.

Once it becomes obvious that these problems do exist, we try to make the needy and displaced less of a public nuisance. The market then takes over, allowing the suffering to continue, so long as it’s removed from the public eye. Hence the 300,000 Americans who are currently behind bars in for-profit prisons, the hundreds of thousands of immigrants currently being detained in the US — in conditions condemned by the UN — by for-profit contractors, and the privately administered wars the US perpetually wages in the poorest parts of the world, which have created a $50-plus billion mercenary market. [10]

Finally, when these social antagonisms insist on flaunting themselves despite our best efforts to avoid them, the last resort is public spectacles that punish those morally repugnant folks who are causing all of our problems. Hence, in the last instance, our answer to problems like urban crime takes the form of police-raids/media-extravaganzas, which allow us to blame the victims of our compassionless society while alleviating our own sense of guilt in the process (so we can return to our glowing screens with peace of mind, vindicated that we are free to enjoy our customized entertainment because we are not like them).

Professor Crary writes that “because of the infinity of content accessible 24/7, there will always be something online more informative, surprising, funny, diverting, impressive than anything in one’s immediate actual circumstances.” [11] And even though it might seem appealing in each individual moment, a lifetime spent searching for ephemeral gratification is less likely to make us satisfied and politically engaged than it is to create a society of isolated, narcissistic, and anxious individuals.

The crucial point is that this is actually a good thing from the standpoint of the economy.

If you’re depressed, then join the more than 30 million Americans currently on antidepressant medication, helping to create a $10-plus billion antidepressant drug market.

If you’re tired, then join the more than 50 million Americans currently taking sleeping pills. North Americans now sleep, on average, 3.5 hours less than they did a century ago: from 10 hours in the early 1900s, to 8 hours a generation ago, to 6.5 hours today. Whether or not this sleep reduction is symptomatic of our nonstop engagement with glowing screens, one thing is certain: the $55 billion sleeping pill market has become vital to our economy. [12]

As a society, we’re considering the construction of the Keystone XL pipeline — which scientists have warned could devastate the environment — because of its economic benefits, which amount to little more than 3,900 temporary jobs and windfall profits for the Koch Brothers. But no matter which side you’re on in the Keystone debate, the weight of the economic considerations should be an omen. If a few thousand jobs mean so much, then what would we do without the $65 billion sector of the economy sustained by sleeping pills and antidepressants? What would happen to economic growth if upwards of $50 billion of profitable investment couldn’t be sunk into perpetual war? Seen through this lens, long-term peace and a collective sense of meaning don’t seem to be compatible with ceaseless economic growth.

What is compatible with economic growth is that we try to have fun despite it all: as Alain Badiou puts it, “enjoyment has become a sinister obligation.” [13] Each day we are overloaded with messages such as Be Yourself!, Go for it!, and Relax! And the way that we respond to these directives is synchronized with the demands of the attention economy: “What was once consumerism has expanded to 24/7 activity of techniques of personalization ... Self-fashioning is the work we are all given, and we dutifully comply...” [14]

Be it via social media, internet pornography, customized movie-watching experiences, message boards, made-to-order shopping, or personalized dating sites, the more we find out about ourselves and the more we express ourselves online, the more we should reflect on those disturbing, hyper-personalized ads that instantly try to sell us stuff based on our recent browsing history.

These creepy ads point to some larger questions. The very selves that we express online — are they the manifestations of our deepest, most innate wishes? The way we define ourselves as punks, hipsters, divas, rebels, foodies, chill bros, libertarians, bohemians, etc. — are these identifications the free expressions of our intrinsic selves? Or are they pre-packaged identities, sets of clothes, so to speak, that are sold to us by the same corporations that prompt us with 10,000 selling messages per day, all urging us to define ourselves? [15]

****************

In his 2005 commencement address at Kenyon College, David Foster Wallace urges the audience to resist the feelings of impatience and narcissism that come so naturally in our consumer society. He says that the next time we are in the supermarket, we should try to

look differently at this fat, dead-eyed, over-made-up lady who just screamed at her kid in the checkout line …. Maybe she’s been up three straight nights holding the hand of a husband who is dying of bone cancer. Or maybe this very lady is the low-wage clerk at the motor vehicle department, who just yesterday helped your spouse resolve a horrific, infuriating, red-tape problem through some small act of bureaucratic kindness.

As nice as that sentiment is, our collective narcissism and impatience seem to be less a moral than a political-economic problem. [16] Sure, it’s true that after a long and tiring day each of us could individually attempt to redirect our selfish urges. But it’s also true that the average American spends nearly half of their waking life staring at glowing screens that communicate the same message: You should not have to wait when the urge hits to buy, communicate, browse, or watch. If we can go onto Amazon and buy anything we want with just one click, it seems unlikely that the next time we’re in line at the supermarket we’ll suddenly transform into a society of patient and contemplative individuals, regardless of the strength of our moral compulsions.

As long as we subscribe to an economic system that needs to grow, and marketers are deploying cutting-edge neurological techniques in order to ensure that growth, then, in response to Wallace’s question — “Aren’t there parts of ourselves that are just better left unfed?” — society will always be compelled to give the same answer: No.

Sources:

[1] Harvey, David. Seventeen Contradictions and the End of Capitalism. 2014. Page 226.
[2] ibid. Page 228.
[3] ibid. Pages 194-195.
[4] Crary, Jonathan. 24/7. 2014. Page 75.
[5] ibid. Pages 40 & 80.
[6] ibid. Page 87.
[7] ibid. Page 84.
[8] Crary, Jonathan. 24/7. 2014. 
[9] For more see: Berardi, Franco. Heroes. 2015.
[10] Robinson, William I. Global Capitalism and the Crisis of Humanity. 2014. Page 204.
[11] Crary, Jonathan. 24/7. 2014. Page 59.
[12] ibid. Pages 11 & 18.
[13] Badiou, Alain. Polemics. 2006. Page 104.
[14] Crary, Jonathan. 24/7. 2014. Page 72.
[15] For more see: http://www.ubishops.ca/baudrillardstudies/vol11_1/v11-1-moser.html
[16] It initially caused internet shockwaves when it was revealed that DFW voted for Ronald Reagan. However, a close reading of his later works, especially The Pale King, makes it easy to identify Wallace’s deep conservatism.


//Mark Jay is a co-founder of The Periphery.


 
