Generation Numb: How Losing My Phone Exposed Me to the Pain of My Peers

This terrible void, of which everyone stays pleasantly unobservant, is the unofficial sickness of Gen Z.

By Ben Scheer

Source: ScheerPost

I hadn’t planned to give up my smartphone. After all, I was starting my sophomore year of college, and I had not met a single adult who lives without one, let alone another 19-year-old. If you haven’t noticed, it is part of the culture. A few weeks into the semester, I left my phone (a Google Pixel 2, for all you phone nerds) in the backseat of an Uber on my way back from shopping off campus. That slip opened the door to considering a break from the phone and everything that comes with it.

I remember feeling frustrated at myself for being so absentminded, and I immediately rushed to track the phone down, anxiously calling the driver from a friend’s phone. While I listened to the open ringing on the line, I began to think about this one technology’s central role in my life.

That gateway to other worlds.

Over the next few days I continued to hope my phone would come back to me, and I thought more about its role in my relationships. I felt how much it hindered my communication with the people I love.

How it had required me to be always ready to pluck it from my pocket, pull my focus away and, as a result, never be truly present. I saw in that reflection the possibility for healthier relationships, less dependent on constant digital connection.

More free, less reliant.

Sure, there were a thousand things that would be made more challenging and tedious, but it would be worth it if I could be more present for my own life.

Days later, after I had given up hope of seeing it again, I used a friend’s phone to call my dad to tell him.

“I will get you another one, what kind do you want?” was the first thing he said.

“Actually I think I’m good,” I responded, to his surprise. “I can write you and my mom by email. If we want to talk, I can call you from my computer.”

Quickly, I started to feel the change. My mood and habits became clearer; I felt happier and more grounded, with less looming anxiety. I felt more alone and with myself, more conscious of my space and those in it.

Being alone led to more opportunities for reflection and boredom. I felt calmer, and my walking and pace of day actually slowed down. It is hard to describe how it changed the patterns in my brain but I could feel it readjusting, as it does if you spend a week in the woods or on a beach. One thing I felt was that my days were longer and more connected from one moment to the next.

I also became more aware of the moods and feelings of others around me and was suddenly terrified to see how dependent my peers were on all their screens, and, looking back just a bit, how consumed I, too, had been.

Something else bubbled up from my unconscious: My expectations of college differed from what I was seeing. Between talking to older relatives and seeing college life on TV and in the movies, I was expecting a . . . BIT MORE JOY! Young souls, free minds, positive energy, community. I was seeking some bright spark in the people around me, yet all too often what I saw was dull, void, distractible, unthinking, unfeeling people enveloped in the world of their screens instead of being in community and conversing with the people around them.

Yes, that is harsh. It is also what I see.

So much lost human potential.

I want something or someone to blame, but maybe what I should really be blaming is myself for expecting something else. The truth is, though, I feel right in blaming the phones.

It has been over a year since I gave up my smartphone, but I remember vividly when looking at my phone for hours a day seemed normal to me. I was content to stare at the screen for hours, doing the same thing: Watching mildly entertaining videos or mashing game buttons, swiping on Tinder, scrolling on IG, surfing memes and YouTube videos. In many ways, it’s a drug too good to give up, seemingly a harmless drug, but no. We have opened Pandora’s box of marvels and there is no way to close it now.

We are continually wanting what we do not yet have, an innate motivation amplified and rerouted by capitalism’s hyperdrive marketing engines. A good fix for this constant yearning, or any discomfort, is to drown it out with distractions. Feeling an uncontrollable emotion? Existential itch? Low-grade anxiety? No problem: Try scrolling on the social of your choice for a bit; trust me, it takes the edge off.

Trauma or poverty, pain or loneliness, all manner of worry, we have a cure! Or at least the emotional response can be staved off long enough for you to find the next thing to click.

This terrible void, of which everyone stays pleasantly unobservant, is a part of me as well, the unofficial sickness of my generation. (COVID-19 will be the official one, memorialized in our virtual yearbooks.) This hole, this pit, is created by not knowing one’s self, not trusting one’s self, or not allowing feelings. We are so distracted that the insecurities we act on every day remain a mystery. So distracted that you cannot see how fundamentally OK everything actually is, because you have music constantly in your ear and a fear that, in silence, your own thoughts would be too scary. The sickening thought that your true feelings for someone were no more than an act of comfort-seeking desperation and that “love” is a lifesaver from yourself.

I can’t blame anyone, though. The distractions surround us; they surround the people we are with. The collective unfeeling is so great, so vast, that it cracked me open, actually made me cry often once I allowed myself to experience it. I felt like I was losing my footing in the world in which I had grown up. Disconnected from my past and unsure of where we were going.

(Side note: If you haven’t yet watched The Social Dilemma, go do that, it’s worth it.)

Walking into a small college class to see no one looking at each other or chatting, the blank vacant stares. The soft glow. It felt … confusing? Why were so many interesting young people ignoring one another? What was a classroom like before these tools were available and allowed everywhere? Why were the screens so celebrated?

Then my confusion grew, and from it rage and pain. These emotions danced together, fueled me and crushed me. I saw, for the first time, the real power that the phones held. Yes, a power we had given them, but a power nonetheless. The power to keep people who were 500 miles apart tethered to one another, or keep people sitting right next to each other brutally apart.

Who can I blame? It is just the world I have been born into, and the technology- and profit-driven changes are coming faster every year. We live in the future with brain equivalents in our pockets and a web that contains all of our collective knowledge but reveals the worst of what we are and so little of the love we contain and of which we are capable.

Is the internet a good idea? Are phones good for humans? These are questions we have barely thought to ask, let alone answered, in the mainstream of our culture. Of course they are, what else could they be? Progress! Faster communication! More knowledge! More speed! More fun! We will all be more efficient and entertained, and from that we will live better lives.

NO! Stop, slow down. Please. These gadgets, these everywhere-anytime screens, are ruining the minds of the people who have been mesmerized since they were only little children.

To clarify: not ruining in the sense that people are made stupid (although it definitely does not help develop our attention spans), but rather ruining our emotional capacity and spiritual selves. And these limits fetter us when we are already so strained by the pain we are able to see through our little windows; we don’t have a moment to feel, to feel our own pain, or that of our friends.

Or, for that matter, the pain of the world, or the impact of our constant consumption, or where all this STUFF comes from and where it ends up, or the pain of our history and centuries of exploitation of other humans and nature from which all this wealth originally was derived.

Being human is objectively great. We can use language and words to describe how we feel. We can feel the sun on our face and take in the taste of a thousand foods, dance to every song. Feel great pain and great happiness. Yet, all the time I see this discomfort, unrest and fear in my most distracted friends. It is coming from the disconnection from what they are feeling, a disconnection from being.

This is the worst theft of modern-day, first-world life: the theft of being present.

The phones and distractions and speed create this disconnection from the simple fact that you are actually OK, better than OK — you are alive and can feel and that is a gift to cherish.

Worse than the days of sadness are the days of not feeling anything. Some people are depressed not because anything is wrong but because they are numb to their feelings, too distracted and avoidant to feel them; avoidant, and scared of what might happen if they let themselves feel.

Try this: Really study those you love, let their face become new to you again, and become fascinated by the way they move. Feel the wonder of being with another human.

After more than a year without a smartphone, I am more hopeful than I was in those first few months. Not to say that everything is all right, far from it. Here in the United States our technology has allowed for multiple realities to coexist alongside one another and I don’t see a clear path out of that problem. Distraction and narrow self-interest prevent us from seeing and grappling with the apocalyptic future climate change can bring in a much shorter time frame than even most globally aware people are willing to accept.

However, I do know that change is a constant and I am healed by looking past my own lifetime and by all the simple beauties of life and living. In the end, these technologies do NOT define us. We can work together to change how we coexist.

 


Postscript: For those interested in also letting go of their smartphones, I would recommend downgrading to a flip phone or something similarly basic. I now walk around the world with a thick, red brick through which I can communicate with my work and family in case of emergencies.

What happened to individual empowerment in the internet age?

By Kurt Cobb

Source: Resilience

Apple Computer’s 1984 Super Bowl commercial—one of the most iconic television commercials ever made—announced two things: the introduction of the Macintosh computer and that this computer could in some fashion allow each of us to escape a future of tyranny and social control prophesied in George Orwell’s dystopian novel 1984.

The computer age and the coming of the internet have certainly moved more power into the hands of the individual, giving him and her access to social and professional connections around the world, information on every conceivable topic, and awareness of events in real-time or near real-time across the globe. The possibilities of the combined computational power of the modern computer and the connectivity of those computers across the globe are still being explored and expanded every day.

So, how is individual empowerment faring? Not so well. It turns out that practically every device, piece of software and internet platform not only holds the promise of enhancing the individual’s power but also can be weaponized to undermine it.

We somehow forget that for every thing and every person we can look up on the internet, those things and people can look back. Naturally, we can try to protect ourselves with antivirus programs and firewalls. But as with any arms race, there is a never-ending back-and-forth struggle to create better tools and strategies for snooping on and disrupting computers and their networks and simultaneously to build defenses against the newest methods of attack and surveillance.

But I am less concerned with this battle than I am with the voluntary things we do that undermine all the individual empowerment that was supposed to come our way.

The single most important power humans have is their ability to pay attention. It’s our focus that allows us not only to do our daily tasks but also to perform progressively better at tasks we choose. Now the most important thing to know about our attention is that it is a limited resource. There are only so many hours in a day, only so many of those when we are not sleeping, and only so many of those when we can pay attention to something outside our basic needs of eating, getting a livelihood and staying safe.

I have noticed a distinct generational divide between those who have grown up with cellphones and computers and those who purchased their first cellphone and personal computer after age 30 when their daily habits and outlook were already well-cemented. Those who joined the computer, internet and cellphone age as adults tend to see these devices and networks as tools for accomplishing certain tasks they had previously accomplished some other way such as keeping a calendar, holding meetings and writing and sharing documents.

Those who grew up in the age of the computer, cellphone and internet view these technologies as portals to experience. The most important things that are happening in their lives, social, cultural, and economic, are happening online and via cellphone. Experiences mediated through electronic means have become primary and more important than direct immediate experience.

I am most struck by this when I walk the streets and see person after person listening to something coming from their cellphones as they walk, run or bicycle. People can listen to whatever they like as far as I’m concerned. But it occurs to me that they cannot simultaneously pay careful attention to the world right in front of them AND to whatever they are listening to.

It reminds me of the quote attributed to then California gubernatorial candidate Ronald Reagan about the cutting of the state’s redwood forest: “If you’ve seen one redwood, you’ve seen them all.” That’s not exactly what he said, but his opponent captured Reagan’s view all too well.

In any case, we now have a segment of the population which apparently believes that there is little to notice in any environment and for whom the physical world is just a concept and not at all the endlessly complex, differentiated, nuanced and ever-changing place that I experience.

Today, we have virtual reality to entertain us, complete with virtual reality goggles. Does it not occur to those putting on the goggles that they are limiting their reality rather than expanding it? That they are limiting it to what the creators of that particular virtual reality wish to convey? And, all this is undertaken when the reality that is right in front of them in their homes, workplaces, and outdoors has barely been explored or understood.

Every modern communications device, whether cellphone, computer or virtual reality machine, gives us a highly edited version of the world, one designed especially to meet the goals of those who created it. The primary goals are making money and controlling our behavior in order to get us to pay more attention (so our attention can be sold to advertisers) and/or to get us to make additional purchases.

The addictive quality of these technologies has been well-documented. But it is in the nature of the addict to believe that he or she is being nourished by the very things to which he or she is addicted. And, that is the most devilish trick of modern networks: the idea that our futures and very well-being will be enhanced—when, in fact, our autonomy is simply being dissipated as we give our attention to things which sap our own power and health and enslave us to marketing and programming executives who themselves are caught up in a system that does not value individual autonomy in the least.

There are, of course, myriad ways in which individual empowerment has backfired and put us into far more danger than ever before. The ability of a small group of hackers to tap into critical networks which service power-generating stations and water and sewer plants is a rising concern. Especially concerning are nuclear power plants.

Miniaturization technology is making it possible to put more and more destructive power in smaller and smaller packages. Combine that with the ready availability of drones and you get a lethal combination.

The rise of designer viruses, though not yet within reach of those lacking sophisticated laboratories, threatens an unstoppable epidemic.

Empowering the individual sounds great when you say it. But it helps to be specific about what kind of power you want the individual to have and how that power might be used in nefarious ways or simply dissipated by absorbing a person’s attention in ways that undermine that empowerment.

The centers of official power—economic and political elites, corporations, and the government security apparatus of police, intelligence agencies, militaries—are all petrified at the vast destructive power flowing to the hands of individuals and small groups. And, they are equally petrified at the individual economic, political and social empowerment available to the individual through the very technologies these elites have unleashed.

The first threat they use as an excuse for blanket surveillance, preventive detention and secret prisons. The second threat they hope will be dissipated by the myriad distractions that the commercial interests which now largely control the internet provide to the public.

Orwell knew: we willingly buy the screens that are used against us

By Henry Cowles

Source: Aeon

Sales of George Orwell’s dystopian novel 1984 (1949) have spiked twice recently, both times in response to political events. In early 2017, the idea of ‘alternative facts’ called to mind Winston Smith, the book’s protagonist and, as a clerk in the Ministry of Truth, a professional alternator of facts. And in 2013, the US National Security Agency whistleblower Edward Snowden compared widespread government surveillance explicitly to what Orwell had imagined: ‘The types of collection in the book – microphones and video cameras, TVs that watch us – are nothing compared to what we have available today.’

Snowden was right. Re-reading 1984 in 2018, one is struck by the ‘TVs that watch us’, which Orwell called telescreens. The telescreen is one of the first objects we encounter: ‘The instrument (the telescreen, it was called) could be dimmed, but there was no way of shutting it off completely.’ It is omnipresent, in every private room and public space, right up until the end of the book, when it is ‘still pouring forth its tale of prisoners and booty and slaughter’ even after Smith has resigned himself to its rule.

What’s most striking about the telescreen’s ubiquity is how right and how wrong Orwell was about our technological present. Screens are not just a part of life today: they are our lives. We interact digitally so often and in such depth that it’s hard for many of us to imagine (or remember) what life used to be like. And now, all that interaction is recorded. Snowden was not the first to point out how far smartphones and social media are from what Orwell imagined. He couldn’t have known how eager we’d be to shrink down our telescreens and carry them with us everywhere we go, or how readily we’d sign over the data we produce to companies that fuel our need to connect. We are at once surrounded by telescreens and so far past them that Orwell couldn’t have seen our world coming.

Or could he? Orwell gives us a couple of clues about where telescreens came from, clues that point toward a surprising origin for the totalitarian state that 1984 describes. Taking them seriously means looking toward the corporate world rather than to our current governments as the likely source of freedom’s demise. If Orwell was right, consumer choice – indeed, the ideology of choice itself – might be how the erosion of choice really starts.

The first clue comes in the form of a technological absence. For the first time, Winston finds himself in a room without a telescreen:

‘There’s no telescreen!’ he could not help murmuring.
‘Ah,’ said the old man, ‘I never had one of those things. Too expensive. And I never seemed to feel the need of it, somehow.’

Though we learn to take the old man’s statements with a grain of salt, it seems that – at some point, for some people – the owning of a telescreen was a matter of choice.

The second hint is dropped in a book within the book: a banned history of the rise of ‘the Party’ authored by one of its early architects who has since become ‘the Enemy of the People’. The book credits technology with the destruction of privacy, and here we catch a glimpse of the world in which we live: ‘With the development of television, and the technical advance which made it possible to receive and transmit simultaneously on the same instrument, private life came to an end.’

What does the murky history of the telescreen tell us about the way we live now? The hints about an old man’s reluctance and television’s power suggest that totalitarian overreach might not start at the top – at least, not in the sense we often imagine. Unfettered access to our inner lives begins as a choice, a decision to sign up for a product because we ‘feel the need of it’. When acting on our desires in the marketplace means signing over our data to corporate entities, the erosion of choice is revealed to be the consequence of choice – or at least, the consequence of celebrating choice.

Two historians have recently been pointing toward this conclusion – in quite different ways.

One, Sarah Igo at Vanderbilt University in Tennessee, has argued that Americans’ demands for privacy seem to have gone hand-in-hand with their decisions to sacrifice it over the course of the 20th century. Citizens simultaneously shielded and broadcast their private lives through surveys and social media, gradually coming to accept that modern life means contributing to – and reaping the rewards of – the data on which we all increasingly depend. Though some of these activities were ‘chosen’ more readily than others, Igo shows how choice itself came to seem beside the point when it came to personal data.

Meanwhile, the historian Sophia Rosenfeld at the University of Pennsylvania has argued that freedom itself was reduced to choice, specifically choice between a limited set of options, and that its reduction has marked a revolution in politics and thought. As options are winnowed to those we can find online – a winnowing conducted under the banner of ‘choice’ – we start to feel the consequences of this shift in our own lives.

One can easily imagine choosing to buy a telescreen – indeed, many of us already have. And one can also imagine needing one, or finding them so convenient that they feel compulsory. The big step is when convenience becomes compulsory: when we can’t file our taxes, complete the census or contest a claim without a telescreen.

As a wise man once put it: ‘Who said “the customer is always right?” The seller – never anyone but the seller.’ When companies stoke our impulse to connect and harvest the resulting data, we’re not surprised. When the same companies are treated as public utilities, working side-by-side with governments to connect us – that’s when we should be surprised, or at least wary. Until now, the choice to use Gmail or Facebook has felt like just that: a choice. But the point when choice becomes compulsion can be a hard one to spot.

When we need a credit card to buy a coffee or an app to file a complaint, we hardly notice. But when a smartphone is essential for migrant workers, or when filling out the census requires going online, we’ve turned a corner. With the US Census set to go online in 2020 and questions about how all that data will be collected, stored and analysed still up in the air, we might be closer to that corner than we thought.

 

This article was originally published at Aeon and has been republished under Creative Commons.

The Misguided ‘Vault 7’ Whodunit

By Jesselyn Radack

Source: Expose Facts

It is the leakiest of times in the Executive Branch. Last week, Wikileaks published a massive and, by all accounts, genuine trove of documents revealing that the CIA has been stockpiling, and lost control of, hacking tools it uses against targets. Particularly noteworthy were the revelations that the CIA developed a tool to hack Samsung TVs and turn them into recording devices, and that the CIA worked to infiltrate both Apple and Google smartphone operating systems since it could not break encryption. No one in government has challenged the authenticity of the documents disclosed.

We do not know the identity of the source or sources, nor can we be 100% certain of his or her motivations. Wikileaks writes that the source sent a statement that policy questions “urgently need to be debated in public, including whether the CIA’s hacking capabilities exceed its mandated powers and the problem of public oversight of the agency” and that the source “wishes to initiate a public debate about the security, creation, use, proliferation and democratic control of cyber-weapons.”

The FBI has already begun hunting down the source as part of a criminal leak investigation. Historically, the criminal justice system has been a particularly inept judge of who is a whistleblower. Moreover, it has allowed the use of the pernicious Espionage Act—an arcane law meant to go after spies—to go after whistleblowers who reveal information in the public interest. My client, former NSA senior official Thomas Drake, was prosecuted under the Espionage Act, only to later be widely recognized as a whistleblower. There is no public interest defense to Espionage Act charges, and courts have ruled that a whistleblower’s motive, however salutary, is irrelevant to determining guilt.

The Intelligence Community is an equally bad judge of who is a whistleblower, and has a vested interest in giving no positive reinforcement to those who air its dirty laundry. The Intelligence Community reflexively claims that anyone who makes public secret information is not a whistleblower. Former NSA and CIA Director General Michael V. Hayden speculated that the recent leaks are to be blamed on young millennials harboring some disrespect for the venerable intelligence agencies responsible for mass surveillance and torture. Not only is his speculation speculative, but it’s proven wrong by the fact that whistleblowers who go to the press span the generational spectrum from Pentagon Papers whistleblower Daniel Ellsberg to mid-career and senior level public servants like CIA torture whistleblower John Kiriakou and NSA whistleblower Thomas Drake to early-career millennials like Army whistleblower Chelsea Manning and NSA whistleblower Edward Snowden. The lawbreaker does not get to decide who is a whistleblower.

Not all leaks of information are whistleblowing, and the word “whistleblower” is a loaded term, so whether or not the Vault 7 source conceives of him or herself as a whistleblower is not a particularly pertinent inquiry. The label “whistleblower” does not convey some mythical power or goodness, or some “moral narcissism,” a term used to describe me when I blew the whistle. Rather, whether an action is whistleblowing depends on whether or not the information disclosed is in the public interest and reveals fraud, waste, abuse, illegality or dangers to public health and safety. Even if some of the information revealed does not qualify, it should be remembered that whistleblowers are often faulted with being over- or under-inclusive with their disclosures. Again, it is the quality of the information, not the quantity, nor the character of the source.

Already, the information in the Vault 7 documents has revealed that the Intelligence Community misled the American people. In the wake of Snowden’s revelations, the Intelligence Community committed to avoiding the stockpiling of technological vulnerabilities, publicly claiming that its bias was toward “disclosing them” so as to better protect everyone’s privacy. However, the Vault 7 documents reveal just the opposite: not only has the CIA been stockpiling exploits, it has been aggressively working to undermine our Internet security. Even assuming the CIA is using its hacking tools against the right targets, a pause-worthy presumption given the agency’s checkered history, the CIA has empowered the rest of the hacker world and foreign adversaries by hoarding vulnerabilities, and thereby undermined the privacy rights of all Americans and millions of innocent people around the world. Democracy depends on an informed citizenry, and journalistic sources—whether they call themselves whistleblowers or not—are a critical component when the government uses national security as justification to keep so much of its activities hidden from public view.

As we learn more about the Vault 7 source and the disclosures, our focus should be on the substance of the disclosures. Historically, the government’s reflexive instinct is to shoot the messenger, pathologize the whistleblower, and drill down on his or her motives, while the transparency community holds its breath that he or she will turn out to be pure as the driven snow. But that’s all deflection from plumbing the much more difficult questions, which are: Should the CIA be allowed to conduct these activities, and should it be doing so in secret without any public oversight?

These are questions we would not even be asking without the Vault 7 source.

Embedded beings: how we blended our minds with our devices


By Saskia K Nagel & Peter B Reiner

Source: Aeon

Like life itself, technologies evolve. So it is that the telephone became the smartphone, that near-at-hand portal to the information superhighway. We have held these powerful devices in the palms of our hands for the better part of a decade now, but there is a palpable sense that in recent years something has shifted, that our relationship with technology is becoming more intimate. Some people worry that one day soon we might physically attach computer chips to our minds, but we don’t actually need to plug ourselves in: proximity is a red herring. The real issue is the seamless way in which we are already hybridising our cognitive space with our devices. In ways both quotidian and profound, they are becoming extensions of our minds.

To get a sense of this, imagine being out with a group of friends when the subject of a movie comes up. One person wonders aloud who the director was. Unless everyone is a movie buff, guesses ensue. In no time at all, someone responds with: ‘I’ll just Google that.’ What is remarkable about this chain of events is just how unremarkable it has become. Our devices are so deeply enmeshed in our lives that we anticipate them being there at all times with access to the full range of the internet’s offerings.

This process of blending our minds with our devices has forced us to take stock of who we are and who we want to be. Consider the issue of autonomy, perhaps the most cherished of the rights we have inherited from the Enlightenment. The word means self-rule, and refers to our ability to make decisions for ourselves, by ourselves. It is a hard-earned form of personal freedom and, at least in Western societies over the past 300 years, the overall trajectory has been towards more power to the individual and less to institutions.

The first inkling that modern technology might threaten autonomy came in 1957 when an American marketing executive called James Vicary claimed to have increased sales of food and drinks at a movie theatre by flashing the subliminal messages ‘Drink Coca-Cola’ and ‘Hungry? Eat Popcorn’. The story turned out to be a hoax, but after attending a demonstration of sorts, The New Yorker reported that minds had been ‘softly broken and entered’. These days, we regularly hear news stories about neuromarketing, an insidious strategy by which marketers tap findings in neuropsychology to read our thoughts as they search for the ‘buy button’ in our brains. To date, none of these plots to manipulate us have been successful.

But the threat to autonomy remains. Persuasive technologies, designed to change people’s attitudes and behaviours, are being deployed in every corner of society. Their practitioners are not so much software engineers as they are social engineers. The most benign of these ‘nudge’ us in an attempt to improve decisions about health, wealth and wellbeing. In the world of online commerce, they strive to capture our attention, perhaps doing nothing more nefarious than getting us to linger on a webpage for a few extra moments in the hope that we might buy something. But it is hard not to be cynical when Facebook carries out an experiment on more than 680,000 of its loyal users in which it covertly manipulates their emotions. Or when the choices of undecided voters can be shifted by as much as 20 per cent just by altering the rankings of Google searches. There is, of course, nothing new about persuasion. But the ability to do so in covert fashion exists for one simple reason: we have handed the social engineers access to our minds.

Which leads us to the threat to privacy of thought. Together with his Boston law partner Samuel Warren, the future US Supreme Court Justice Louis Brandeis published the essay ‘The Right to Privacy’ (1890). They suggested that when law was still being developed as codified agreements among early societies, redress was given only for physical interference with life and property. Over time, society came to recognise the value of the inner life of individuals, and protection of physical property expanded to include the products of the mind – trademarks and copyright, for example. But the intrusive technology of the day – apparently, the first paparazzi had appeared on the scene, and there was widespread worry about photographs appearing in newspapers – raised new concerns.

Today’s worries are very similar, except that the photos might be snatched from the privacy of any one of your interconnected devices. Indeed, having institutions gain access to the information on our devices, whether flagrantly or surreptitiously, worries people: 93 per cent of adults say that being in control of who can get information about them is important. But in the post-Snowden era, discussions of privacy in the context of technology might be encompassing too broad a palette of potential violations – what we need is a more pointed conversation that distinguishes between everyday privacy and privacy of thought.

These issues matter, and not just because they represent ethical quandaries. Rather, they highlight the profound implications that conceiving of our minds as an amalgam between brain and device has for our image of ourselves as humans. Andy Clark, the philosopher who more than anyone has advanced the concept of the extended mind, argues that humans are natural-born cyborgs. If that is the case, if we commonly incorporate external tools into our daily routines of thinking and being, then we might have overemphasised the exceptionalism of the human brain for the concept of mind. Perhaps the new, technologically extended mind is not so much something to fear as something to notice.

The fruits of the Enlightenment allowed us to consider ourselves as rugged individuals, navigating the world by our wits alone. This persistent cultural meme has weakened, particularly over the past decade as research in social neuroscience has emphasised our essentially social selves. Our relationship to our devices provides a new wrinkle: we have entered what the US engineer and inventor Danny Hillis has termed ‘the Age of Entanglement’. We are now technologically embedded beings, surrounded and influenced by the tools of modernity, seemingly without pause.

In 2007, Steve Jobs introduced the world to the iPhone with the catchphrase ‘this changes everything’. What we didn’t know was that the everything was us.

Google’s lemmings: Pokémon go where Silicon Valley says


An analysis of Ingress and Pokémon Go reveals important truths about corporate control and the ability of our mobile phones to organize our desires.

By Alfie Brown

Source: ROAR Magazine

This article has a clickbaity title but a sobering and concerning point to make. In 2010, Google started up what is now a very important subsidiary, Niantic Inc. Google starts up a lot of companies each year and acquires a great many more, so there is nothing special in this. What is important is that whilst most of us see Google’s acquisition of every “start-up” and endless development of “subsidiary” companies with different names as simply an attempt to completely monopolize the market, the case of Niantic shows us that there is more to the extent of Google’s power.

Six years on from its inception with the launch of its biggest game yet, Pokémon Go, Niantic has hit the headlines and people are finally paying attention to the company, with some apparent leftists even claiming we ought to boycott Pokémon Go. In fact, Niantic have been working on mobile phone psychology and social organization for several years. An analysis of the company’s two big games, Ingress and Pokémon Go, shows us some important truths about the world we are living in, about corporate control and about the ability of our mobile phones to organize our desires.

Niantic developed their first major game, Ingress, in 2011. The game, one of the most important of recent years, is a key ideological tool for Google — one that, unlike Pokémon Go, is little publicized. Ingress has seven million or more players, and Ingress tattoos show the degree to which people define themselves by the application. Some players even describe Ingress as a “lifestyle” rather than a “game”. The reader can be forgiven for thinking: “I don’t play it, so why would this apply to me?” But the entertainment coming out of Google via Niantic is in line with Google’s wider project of regulating our movements and experiences of the physical world; unless you don’t use Google or any of its applications, many of which come built in to our phones and cannot be uninstalled, this applies to you.

Ingress reflects a trend of mobile phone application development (which includes Google Maps and Uber, among other well-known apps) designed to regulate and influence our experience of the city, turning the mobile phone into a new kind of unconscious: an ideological force driving our movements while we remain only semi-aware of what propels us and why we are propelled in the directions we are.

I first considered the importance of mobile phone games to be about a kind of “distraction” — an argument I made in my book and related article in The New Inquiry. Later, when playing Ingress for the first time, I realized there was a lot more to it than this. Ingress, rather than simply distracting us from the city around us, actually trains us to become Google’s perfect citizens. In Ingress, the player moves around the real environment capturing “portals” represented by landmarks, monuments and public art, as well as other less-famous features of the city. The player is required to be within physical range of the “portal” to capture it, so the game constantly tracks the player via GPS. Importantly, it not only monitors where we go, but directs us where it wants us to move.

As such it is very much the counterpart of Google Maps, which is also developing the ability not only to track our movements but to direct them. Of course, Google’s algorithms have long since dictated which restaurants we visit, which cafés we are aware of and which paths we take to get to these destinations. Now though, Google is developing new technology that actually predicts where you will want to go based on the time, your GPS location and your habitual history of movement stored in its infinitely powerful recording system. This, like Ingress, shows us a new pattern emerging in which the mobile phone dictates our paths around the city and encourages us, without realizing it, to develop habitual and repetitious patterns of movement. More importantly still, such applications anticipate our very desires, not so much giving us what we want as determining what we desire.

Here again, the connection with the concept of the unconscious is useful. While some have seen the unconscious as a morass of unregulated desires, followers of Freud and later of Lacanian psychoanalysis have been keen to show precisely how structured the unconscious is by outside forces. Our mobile phones pretend to be about fulfilling our every desire, giving us endless entertainment (games), easy transport (Uber), instant access to food and drink (OpenRice, JustEat) and even near-instantaneous sex and love (Tinder, Grindr). Yet, what is much scarier than the fact that you can get everything you want via your mobile phone is the possibility that what you want is itself set in motion by the phone.

Into precisely this atmosphere enters Pokémon Go, out just days ago and already the most significant mobile phone release of 2016. The game is, of course, made by none other than Niantic Labs. A series of hysterical events have already arisen from the ethical minefield that is Pokémon Go. In the case of Ingress, academic study has already been dedicated to the fact that the game has sent young children into unlit city parks at 3am. With Pokémon Go, Australian police have had to respond to a bunch of Pokémon trainers trying to get into a police station to capture the Pokémon within, and some people found a dead body instead of a Pokémon. It has already been suggested that Pokémon Go is eventually going to kill someone — and since that article was published someone has crashed into a police car and another person has been run over while hunting Pokémon. But, as with Ingress, it is not the occasional mad story to emerge that should concern us, but the psychological and technological effects of every user’s experience.

The premise of Pokémon Go is simply that you use your GPS to find Pokémon in the real environment and then your camera to make the Pokémon visible, so that the world is enriched by looking through the screen at what lies behind it, as in the image below:

[Image: a Pokémon Go screenshot, showing a Pokémon overlaid on the real environment as seen through the phone’s camera.]

The Pokémon itself is an incredible phenomenon deserving of a book-length study. Perhaps for now we can say that the Pokémon is the perfect example of what Jacques Lacan called the objet a, that perfectly cute, fetishised but elusive object of desire that would truly make us happy if only we could just get our hands on it. We never do, because there is always a newer, cuter and harder-to-capture version that we just have to catch!

Dystopian visions of what technology and videogames would lead to seem to have got something completely wrong. Depictions of the dystopian videogame future have always tended to imagine each individual isolated from the rest, sat quietly alone in a small room, hooked up to a computer through which their lives are exclusively lived. In other words, the importance of the physical environment recedes in favor of the imaginary electronic world. Contrary to these predictions, we now live in a dystopia where Google and its subsidiaries send us madly around the city almost non-stop, in directions of their choosing, in search of the objects of desire, whether that be a lover on Tinder, a bowl of authentic Japanese ramen or that elusive Clefairy or Pikachu.

In the 1990s parents could ask their children to “get outside more” to escape the videogame space, but now it is the games that make us charge around the city capturing portals and collecting Pokémon and going on dates. Putting aside the full access that Google gets to your accounts via Pokémon Go, this shows us something really dangerous. It points to the increasing reality that there really is no escape from Google — and that while we are doing what we think we want, believing that we are just using our phones to help us get it, in fact Google has an even greater power, a truly revolutionary one: the ability to create and organize desire itself.

It is this truly revolutionary power that is important when it comes to Pokémon Go and Ingress. To say that these games are revolutionary is not to say that they are doing any good, nor that they are “radical”, and certainly it is not to say that they are left-wing — on the contrary, the revolution in desire appears to be corporate, hegemonic and centralized. If the left is to have any hope, however, it must not resist Pokémon Go, as Jacobin have now famously suggested, but understand and perhaps even embrace the power of the mobile phone to re-organize desire and look for ways forward from here.

 

Alfie Bown is the author of Enjoying It: Candy Crush and Capitalism (Zero, 2015) and The PlayStation Dreamworld (Polity, forthcoming 2017). He is the co-editor of the Hong Kong Review of Books and writes on the politics of technology and videogames for many publications.

SMARTPHONES, SOCIAL MEDIA AND SLEEP: THE INVISIBLE DANGERS OF OUR 24/7 CULTURE


By Martijn Schirp

Source: High Existence

If there is one book to read about our addictions to work, phones, consumption, and the current state of capitalism, it’s 24/7: Late Capitalism and the Ends of Sleep by Jonathan Crary, a professor of Modern Art & Theory at Columbia University. Crary argues that sleep is a standing affront to capitalism, and while that seems grim, it highlights the very real dark sides of always having glowing LED screens clutched in our hands.

Technology has ushered us into a 24/7 state: we live in a world that never stops producing and is infinitely connected. We have digital worlds in our pockets, and we carry our phones and screens everywhere, feeding our dopamine addictions when we’re bored or lonely, cradling us before bed with endless scrolls of news and waking us up with notifications and emails.

The barrier between work and home life has disappeared, and most professionals are able to and choose to continue working all hours of the day in an increasingly competitive, winner-take-all environment.

Most of our time, then, is spent either working or consuming (the upside of working so much is money, which is then used to consume): food, drugs, shopping, films, YouTube videos, Instagram feeds, news articles, updates from friends — even socializing time has been reduced to a passive “Netflix & Chill”.

There are now very few significant interludes of human existence (with the colossal exception of sleep) that have not been penetrated and taken over as work time, consumption time, or marketing time.

The social-world and the work-world are both digitized, which makes it increasingly difficult to distinguish between the two, and beyond the pop-ups and video ads, individuals have become their own marketers. Building a “personal brand” as a living is not uncommon.

It is only recently that the elaboration, the modeling of one’s personal and social identity, has been reorganized to conform to the uninterrupted operation of markets, information networks, and other systems. A 24/7 environment has the semblance of a social world, but it is actually a non-social model of machinic performance and a suspension of living that does not disclose the human cost required to sustain its effectiveness.

The average North American adult “now sleeps approximately six and a half hours a night, an erosion from eight hours a generation ago, and down from ten hours in the early twentieth century,” and what suffers most from this lack of sleep is our innate ability to dream. Most people tend to forget or don’t even think about their dreams, much less their extraordinary ability to control them. What is frightening about this is the prevalent attitude of accepting the current state of reality as it is:

The idea of technological change as quasi-autonomous, driven by some process of autopoiesis or self-organization, allows many aspects of contemporary social reality to be accepted as necessary, unalterable circumstances, akin to facts of nature. In the false placement of today’s most visible products and devices within an explanatory lineage that includes the wheel, the pointed arch, moveable type, and so forth, there is a concealment of the most important techniques invented in the last 150 years: the various systems for the management and control of human beings.

What may be the most important fact to remember: Nothing must be as it is. Here are three ways to escape the never-ending 24/7 state:

Unplug Your Phone & Plug Into Your Imagination

Break your cell phone habit. The dopamine addiction is real. I keep my phone in a Faraday pouch, which blocks signals to my phone and keeps me to my rule of no cell phone or screen use one hour prior to sleeping and one hour after waking.

As “visual and auditory ‘content’ is most often ephemeral, interchangeable material that in addition to its commodity status, circulates to habituate and validate one’s immersion in the exigencies of twenty-first-century capitalism,” it is important to focus on the power of our own imagination. The hierarchical and algorithm-driven fields of social media and newsfeeds tend to serve us things we already know or like, and keep us wanting.

Instead, we can explore the limitless field of our imagination. Write down your dreams in the morning and use them as a vehicle for self-exploration, or venture into lucid dreaming to manifest your own desires or to explore creative pursuits. And yet for most of us, when walking, during our daily commute, even sitting on the toilet or in any moment where it’s just us and our thoughts, we turn to our cell phones for comfort, to fill the silence:

One of the forms of disempowerment within 24/7 environments is the incapacitation of daydream or of any mode of absent-minded introspection that would otherwise occur in intervals of slow or vacant time.

Even when socializing with friends, it’s a common habit to check our phones again and again. I’ve found that when one person does this, it enables others: if I see someone sitting across from me at a dinner checking their Instagram feed, I’ll feel less guilty about doing the same. It can stop with you — turn off your phone.

Reevaluate Your Drug Habits & Addictions

Beyond digital dopamine, are you addicted to caffeine, sugar, alcohol, Adderall, cocaine, Ambien, Lexapro, Vicodin, etc., etc.? We live in a self-selecting society, where some drugs are perfectly acceptable as long as they are prescribed by a doctor and other drugs are deemed dangerous. I used to babysit for an eight-year-old who was fed Ritalin daily for his ADHD and then, at night, had to take a tranquilizer to help him fall asleep. He was speedballing throughout his childhood, and I’ve met others who had the same experience, only to question the impact of these drugs on their personality and life-path.

There is a multiplication of the physical or psychological states for which new drugs are developed and then promoted as effective and obligatory treatments. As with digital devices and services, there is a fabrication of pseudo-necessities, or deficiencies for which new commodities are essential solutions… Over the last two decades, a growing range of emotional states have been increasingly pathologized in order to create vast new markets for previously unneeded products. The fluctuating textures of human affect and emotion that are only imprecisely suggested by the notions of shyness, anxiety, variable sexual desire, distraction, or sadness have been falsely converted into medical disorders to be targeted by hugely profitable drugs. Of the many links between the use of psychotropic drugs and communication devices, one is their parallel production of forms of social compliance.

Ritalin, Adderall (and cocaine) not only make their takers compliant but fuel them to tackle the 24/7 lifestyle, deadening empathy, increasing competitiveness and perhaps linking to “destructive delusions about performance and self-aggrandizement”.

While methamphetamines are regularly fed to children, psychedelic drugs tend to be demonized as extreme and dangerous. Yet, refreshingly, there are now organizations like the Multidisciplinary Association for Psychedelic Studies (MAPS), and other studies, looking into how psychedelics can not only treat addictions, anxiety, and disorders, but also expand consciousness and leave lasting personality changes for the better.

Find Your Passion & Connect With Real Life Communities

Crary argues that “whatever remaining pockets of everyday life are not directed toward quantitative or acquisitive ends, or cannot be adapted to telematic participation, tend to deteriorate in esteem and desirability.” Our tendency to tie our social worth to digital networks takes the saying “if a tree falls in a forest and nobody is around to hear it, does it make a sound?” and turns it into “if you do something fun and meaningful and don’t post it to social media, does it matter?”

But those meaningful moments in real life do matter, as does having a strong community to participate in. After all, addictions are mostly a result of isolation and bad environments.

As stated earlier: it is much easier to fall into the insidious trap of looking at your cell phone or constantly working if the person across from you does so first. Find your passion beyond the screen. Find your source of dopamine, what drives you, what engages you and makes you want to get up every day.

Finding a real community centered around a meaningful activity can help tremendously. For me, rock climbing is a meditative activity that requires focus and attention, and is anchored in a community of people who are invested in your success as much as they are in their own. The nature of the sport is so individual because each person is unique; climbing is a niche that carves out time for people to participate in life without any social rules and concepts of winning over another. Climbing outdoors is a way to be connected to nature and to just hang out with friends.

I just returned from a week in New York City, the city that never sleeps, the capital of the 24/7 world, and it took me two weeks just to find the time to sit down and write this. It is not easy to accept the bleak claims in Crary’s book, because it would mean admitting our own addictions and how we play into this non-stop state. It’s just as hard to look away from our screens, but you can. Tonight, don’t put your phone or laptop into “sleep mode” — turn them off, and pay attention to your own dreams.

Further Study:


24/7: Late Capitalism and the Ends of Sleep by Jonathan Crary

24/7: Late Capitalism and the Ends of Sleep explores some of the ruinous consequences of the expanding non-stop processes of twenty-first-century capitalism. The marketplace now operates through every hour of the clock, pushing us into constant activity and eroding forms of community and political expression, damaging the fabric of everyday life.