Saturday, February 17, 2018

Is Big Tech an Existential Threat?

(Image from Amazon)

Journalist Franklin Foer recently wrote a book that takes big tech companies to task.  He singles out Google, Facebook, and Amazon, calling them existential threats.  Hence, the title of his book: World Without Mind: The Existential Threat of Big Tech.

Why does Foer refer to Google, Facebook, and Amazon as existential threats?  He gives many reasons over 250-something pages, but here are three themes I thought I'd highlight from his book.

Power vs. Privacy

One reason is that these companies have amassed a lot of economic power over our data—too much power.  As a result, only a handful of corporations (Google, Facebook, and Amazon, in addition to Twitter, Apple, and Microsoft) control nearly all of our personal information.  Foer calls them “monopolies” (although technically, the correct term is “oligopoly”).

For our own protection, Foer recommends regulating big tech as trustees of our data, subject to the same privacy protection laws and standards that other media corporations must adhere to.

It's worth quoting Foer on this point:

It’s a basic, intuitive right, worthy of enshrinement: Citizens, not the corporations that stealthily track them, should own their own data.  The law should demand that these companies treat this data with the greatest care, because it doesn’t belong to them.  Possessing our data is a heavy responsibility that must come with ethical obligations.  The American government has a special category for corporations that profit from goods that they don’t truly own: We call them trustees.  This is how the government treats radio and television broadcasters.  These companies make money from their use of public airwaves, so the government requires that these broadcasters adhere to a raft of standards (201).

The poverty of attention

Another reason big tech is an existential threat is that social media and digital technologies have serious psychological consequences: too much time with them erodes our ability to sustain attention and focus.  Again, to quote Foer:

This abundance of free material, however, created a new form of scarcity—with so much to read, see, and hear, with the unending web of links, it became almost impossible to grab an audience’s attention.  David Foster Wallace called the condition Total Noise.  With it, our reading became peripatetic, less focused.  Back in the seventies, Herbert Simon, the Nobel-winning economist, took these inchoate sentiments and explained them rigorously: “What information consumes is rather obvious.  It consumes the attention of its recipients.  Hence a wealth of information creates a poverty of attention.”  The poverty of attention, the inability to hold a reader’s attention for sustained time, that’s the crucial concept (88).

The threat to civic dialogue

Just as attention and focus suffer, so too does our capacity to engage in civic dialogue.  That's because platforms like Facebook aren't designed to expand our minds.  Quite the opposite.  The algorithms are designed to seal us inside information bubbles.  To quote Foer once more,

Facebook’s algorithms supply us with the material that we like to read and will feel moved to share.  It’s not hard to see the intellectual and political perils of this impulse.  The algorithms unwittingly supply readers with texts and videos that merely confirm deeply felt beliefs and biases; the algorithms suppress contrary opinions that might agitate a user.  Liberals are deluged with liberal opinions; vegetarians are presented with endless vegetarian agitprop; the alt-right is fed alt-right garbage; and so on.  Facebook shields us from the sort of challenging disagreement—although not from the idiocy of trolls and the blather of comments sections—that might change our minds or help us to better understand the views of our fellow citizens.
In economics, the peril of the network is monopoly—where a competitive market comes under the sway of big corporations.  In culture, the peril of the network is conformism—where a competitive marketplace of ideas ceases to be so competitive, where the emphasis shifts to consensus (177-178).

An existential threat, really?

Is Foer right that big tech is an existential threat?

I admit, I'm sympathetic to Foer's point of view, even if his premise is a bit hyperbolic.  His points about privacy, mental health, and civic dialogue are entirely valid.  For those reasons, World Without Mind is definitely a book I'd recommend, especially for those who would like to challenge their assumptions about technological progress.


Franklin Foer, World Without Mind: The Existential Threat of Big Tech (New York: Penguin Press, 2017).

Sunday, January 28, 2018

And now for something completely different: thoughts on The Last Jedi

Remember how Monty Python's Flying Circus used to begin?  "And now for something completely different."

That line might as well have been the opener for the latest installment in the Star Wars franchise: The Last Jedi.  (Warning: spoilers ahead if you haven't seen the movie yet.)  From Luke Skywalker chucking his historic lightsaber off a cliff to Supreme Leader Snoke getting sliced in half in what seemed like a total fluke, the film was so shockingly different that, honestly, I wasn't sure what to think at first.

The Last Jedi split the Star Wars fan base, with some petitioning Disney to remake the film.  Eventually, like our space heroes, we have to choose sides, and I'm going to take the side of defending the movie.  To do so, I'd like to address two common criticisms of the film: that the subplot with Poe, Rose, and Finn was pointless, and that Luke was way out of character.  Let's start with Poe, Rose, and Finn.

(Image from IMDb)

(Image from IMDb)

Nothing, literally nothing, in their adventure went right.  So was their subplot pointless?  Yes.  But that was the point!  The old way of doing things is now obsolete.  This outdated MO includes triggering mutiny (Poe's insurrection against Resistance leaders), going rogue (Rose ditching her post to help Finn and Poe), and launching an all-out offensive with no real plan (Finn's failed kamikaze move).

As any team strategist knows, sometimes you need to play defense, and hold the offense for later.  Poe finally learns this lesson when he understands the evacuation plan his leaders were trying to execute, and Finn learns the same from Rose when she says, "We're going to win this war not by fighting what we hate, but by saving what we love."  With that lesson learned, let's talk about Luke's character.

(Image from IMDb)

Seemingly, Luke was out of character.  After all, he almost tried to kill his former student Kylo, and his verbal diatribe against the Jedi was surprising to say the least.  On closer look, however, I don't think he was out of character at all.  First off, Luke didn't try to murder Kylo.  Technically, he thought about it and almost acted on the impulse, because he was so shocked when he looked into Kylo's corrupted soul.  But ultimately, Luke decided against murder, even if it didn't look that way to Kylo.

Second, Luke was right about the Jedi.  They were overly rigid and emotionally repressive, which played a role in Anakin Skywalker's meltdown into Darth Vader.  Moreover, the Jedi understanding of the Force was narrow-minded at best.  Consider Luke's ongoing dialogue with Rey: first, where Rey gets the Force wrong; and then when she gets it right.

(Image from IMDb)

Luke: What do you know about the Force?
Rey: It's a power that Jedi have that lets them control people ... and make things float.
Luke: Impressive.  Every word in that sentence was wrong.

Luke: What do you see?
Rey: The island.  Life and decay, that feeds new life.  Warmth.  Cold.  Peace.  Violence.
Luke: And between it all?
Rey: Balance and energy.  A force.
Luke: And inside you?
Rey: Inside me, that same force.

Bingo!  The Force doesn't belong to an aristocracy like the Jedi or the First Order.  That's why Rey wasn't a child of royalty or Skywalker lineage.  It's why she didn't truly have any special connection with Kylo; that connection was just concocted by Snoke.  And it's why Snoke was no one special either, just some jackass in space who got what he deserved in the end.  Who does the Force belong to?  Nobody.  Which really means everybody.  Thus, the movie concludes with a young Force-sensitive slave.  You can't get more democratic than that!

But even if you don't agree with The Last Jedi's democratic spirit, can we at least agree that Porgs are adorable?

(Image from IMDb)

Friday, December 22, 2017

Dopamine-Driven Feedback Loops: Unintended Consequences of Social Media

I may be too optimistic right now, but 2017 might mark the year when people awoke to the unintended consequences of social media.

We saw the US government grill representatives from Facebook, Twitter, and Google about the propagation of fake news over the web.

Now we've seen former executives from Facebook come out to discuss the damage that their platforms have done to public discourse.

I wanted to highlight two interviews that caught my attention.  They show how new media have ecological consequences: when we add new technologies to the environment, they change the entire environment, often in unintended ways (see my prior posts on media ecology).

Sean Parker, one of the founders of Facebook (played by Justin Timberlake in the movie The Social Network), has pointed out that Facebook is designed like an addictive drug.  Every time you see more likes, comments, or shares online, you get a blast of dopamine (a chemical that signals immediate reward to your brain), which gets you coming back for more likes, comments, shares, etc.

The result is what Parker called a “social-validation feedback loop” that exploits “a vulnerability in human psychology.”

The thought process that went into building these applications, Facebook being the first of them ... was all about: 'How do we consume as much of your time and conscious attention as possible?’

And that means that we need to sort of give you a little dopamine hit every once in a while, because someone liked or commented on a photo or a post or whatever. And that's going to get you to contribute more content, and that's going to get you ... more likes and comments.
It's a social-validation feedback loop ... exactly the kind of thing that a hacker like myself would come up with, because you're exploiting a vulnerability in human psychology.
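
Parker is describing a reinforcement loop, and the mechanism is simple enough to sketch.  Here's a minimal simulation in Python; the function, the probabilities, and the step sizes are my own illustrative assumptions, not anything from Facebook's actual systems, but they capture the idea that intermittent rewards strengthen the checking habit over time.

```python
import random

# Toy model of a social-validation feedback loop (illustrative only).
# Each hour the user may check the feed; a check sometimes pays off with
# new likes or comments (an intermittent reward), and every reward nudges
# the habit strength upward, making the next check more likely.

def simulate_user(hours=500, reward_prob=0.3, seed=42):
    random.seed(seed)
    habit_strength = 0.10          # chance of opening the app in a given hour
    checks = rewards = 0
    for _ in range(hours):
        if random.random() < habit_strength:       # user opens the app
            checks += 1
            if random.random() < reward_prob:      # a "dopamine hit" arrives
                rewards += 1
                habit_strength = min(0.95, habit_strength + 0.02)   # reinforcement
            else:
                habit_strength = max(0.05, habit_strength - 0.005)  # mild extinction
    return checks, rewards, habit_strength

checks, rewards, habit = simulate_user()
print(f"checks: {checks}, rewards: {rewards}, final habit strength: {habit:.2f}")
```

Run it and the habit strength climbs steadily: the feed pays off just often enough to keep you coming back.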

Parker isn't alone in his assessment.  Chamath Palihapitiya, a former Facebook executive, has come out with a similar concern.  He goes so far as to express “tremendous guilt” about how “we have created tools that are ripping apart the social fabric of how society works.”

The short-term, dopamine-driven feedback loops that we have created are destroying how society works: no civil discourse, no cooperation, misinformation, mistruth.

Okay, let me admit that I’m still a user of Facebook, although lately I’ve considered deleting my account.  Here’s the question I have to ask myself: What value is this medium truly bringing to my life?  When I reflect on this question, here's the honest answer: not much.

Facebook does help me keep in touch with friends and organize groups and events with them (mainly, I use Facebook to organize get-togethers).  However, I’m not afraid to admit that I find most communication on social media to be superficial and phatic.

At this point, I agree with Parker’s and Palihapitiya’s temporary solution (which you can hear in their full interviews): use Facebook only minimally.  Very minimally.  Make sure the majority of your social interaction is in the real world, not the virtual one.

Saturday, November 11, 2017

Is Big Tech a Threat to Democracy?

This last month, we saw something unprecedented in the world of big tech.  Executives from Facebook, Twitter, and Google testified on Capitol Hill to address how the Kremlin used their platforms to spread propaganda and fake news over the Internet.

A lot of problems with disinformation on the web were discussed, and yet these problems all seem to be part of a larger one: the formation of digital tribes.  People are already prone to what psychologists call the confirmation bias (a tendency to seek out information that confirms what we already want to believe), and social media make this tendency much easier to indulge.  As The Economist recently wrote,

The algorithms that Facebook, YouTube and others use to maximize “engagement” ensure users are more likely to see information that they are liable to interact with.  This tends to lead them into clusters of like-minded people sharing like-minded things, and can turn moderate views into more extreme ones.
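
It's easy to see how ranking by engagement produces those clusters.  Here's a small sketch, purely my own illustration and not any platform's real ranking code: users and stories sit on a simple left/right spectrum, "predicted engagement" is just agreement between the two, and the feed shows whatever the user is most likely to interact with.

```python
import random
import statistics

# Illustrative sketch only: not any platform's actual algorithm.
# Users and stories sit on a -1..+1 political spectrum; "predicted
# engagement" is modeled as agreement between user and story.

def predicted_engagement(user_leaning, story_leaning):
    return 1.0 - abs(user_leaning - story_leaning) / 2.0   # higher when views match

random.seed(7)
user = 0.4                                                  # a mildly right-leaning user
stories = [random.uniform(-1, 1) for _ in range(500)]       # the day's available stories

chronological_feed = stories[:10]                           # an unranked feed
engagement_feed = sorted(stories,
                         key=lambda s: predicted_engagement(user, s),
                         reverse=True)[:10]                 # an engagement-ranked feed

print("spread of views, chronological feed: ",
      round(statistics.pstdev(chronological_feed), 2))
print("spread of views, engagement-ranked feed:",
      round(statistics.pstdev(engagement_feed), 2))
```

The spread of viewpoints in the ranked feed collapses to nearly zero: the user sees only stories that already match their leaning, which is the clustering effect described above.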

Since I’m quoting it, I’d encourage everyone to check out this recent issue of The Economist—it should be required reading for students.

(Image from The Economist)

I remember some of my grad school seminars (my M.S. is in Scientific & Technical Communication with a minor in cognitive science) where professors and students argued that social media would revolutionize democracy.  More people, they claimed, would be able to participate in civic dialogue using social media, which would help participatory democracy flourish.

Well, social media didn’t revolutionize democracy.  Instead, social media seem to be replacing democracy with echo chambers of ideological prattle susceptible to fake news.  This can be entertaining and amusing but not necessarily democratic or informative.  As Neil Postman, the founder of media ecology, warned in his famous book (pictured below), we're in danger of amusing ourselves to death.

(Image from Amazon)

But let’s be practical: big tech isn’t going anywhere, and social media are great for some things, such as organizing events and sharing entertainment (points Clay Shirky makes in his books).  When it comes to civics, however, the big tech companies are not enhancing democracy or dialogue.  Going forward, they may need to prepare themselves for regulation and oversight to mitigate the unintended consequences of social media.

For example, just as we regulate large banks as trustees of our money so that they don't endanger our economic institutions, we may want to regulate big tech companies as trustees of our data so that they don't endanger our democratic institutions.


“Social Media’s threat to democracy,” The Economist, November 4th-10th 2017, 19-22.

Sunday, October 22, 2017

Consuming a Balanced Media Diet: digital tribes, information bubbles, and the matrix

I've said it before, I'll say it again: digital technology has played a major role in the rise of "digital tribalism," which means groups and people use digital technologies to construct information bubbles.  The paradox of the Digital Age is that we're more interconnected than ever, and yet we use search engines, social media, and recommendation systems to insulate ourselves from ideas outside our information bubbles.

Don't like the facts?  Google your own 'facts' instead.

The best thing about the Internet is that you can find out anything about anything.  That's also the worst thing about the Internet.

For example, if you're curious about climate, evolution, or vaccines, you can easily find educational resources such as Scientific American.  But if you don't want to believe in climate science, evolution, or vaccinations, it's also easy to find sites that assure you that global warming is a conspiracy, evolution is a hoax, and vaccines are dangerous.  Anyone can Google whatever they want to believe nowadays, and somewhere in the digital universe, there will be a website propagating blatantly false information that misrepresents basic scientific, historical, and medical facts.

Don't like other viewpoints?  'Unfriend' them.

If you don't like what someone says on Facebook or Twitter, you can 'unfriend' or 'unfollow' that person.  More and more people did that to others during the last election, and, unfortunately, it's just reinforcing digital tribes of liberals and conservatives.  I say unfortunately because, in the end, people only see what they are ideologically predisposed to like in their feed, and few are challenged by different viewpoints.  As a result, fewer people can have conversations, constructively disagree, or defend their own points of view, because they don't want to be questioned.

Only want to see what you'd like to see?  Let online recommendations make your decisions.

It's not just search engines and social media that are isolating us into digital tribes.  Online shopping is doing that too.  Take the recommendation system on Amazon as an example.  The Economist ran an analysis that looked at how online recommendations from Amazon influence customers who buy political books.  The result was summed up in the following graphic, which shows how left-leaning shoppers tend to buy only left-leaning books, just as right-leaning shoppers tend to buy only right-leaning books.

Source: The Economist, September 30th, 2017

As you can see, very few readers end up even looking at other political viewpoints.  The online recommendations are not only making decisions for customers.  Customers are also letting online recommendations determine what they end up knowing.  As The Economist concludes,

Jeff Bezos, Amazon's founder, has bought the Washington Post, and urged upon it the motto: "Democracy Dies in Darkness".  But Amazon conquered the book market in part on the strength of its "recommendation engine".  That now contributes to the dark spots in Americans' knowledge of their political opposites.  Whether Amazon will, or even can, do anything to change that is yet to be seen.
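
The mechanism behind that graphic is simple enough to sketch.  Below is a stripped-down "customers who bought this also bought..." recommender of my own; the titles and purchase histories are invented for illustration (this is not Amazon's engine or its data), but because the two groups of readers rarely buy across the aisle, the co-purchase recommendations keep pointing each group back into its own cluster.

```python
from collections import Counter

# Toy co-purchase recommender (illustration only; not Amazon's engine).
# The purchase histories below are invented for the example.
purchase_histories = [
    ["What Happened", "The Audacity of Hope", "Hard Choices"],      # left-leaning readers
    ["What Happened", "Hard Choices"],
    ["The Audacity of Hope", "What Happened"],
    ["The Art of the Deal", "Understanding Trump", "The Big Lie"],  # right-leaning readers
    ["The Art of the Deal", "Understanding Trump"],
    ["The Big Lie", "The Art of the Deal"],
]

def also_bought(title, histories, top_n=3):
    """Recommend the books most often bought alongside the given title."""
    co_purchases = Counter()
    for basket in histories:
        if title in basket:
            co_purchases.update(book for book in basket if book != title)
    return [book for book, _ in co_purchases.most_common(top_n)]

print("Bought 'What Happened'?       Recommended:",
      also_bought("What Happened", purchase_histories))
print("Bought 'The Art of the Deal'? Recommended:",
      also_bought("The Art of the Deal", purchase_histories))
```

Neither reader is ever shown a title from the other cluster, which is exactly the pattern in the graphic above.
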
Trapped in the matrix ... of information bubbles and digital tribes

As we can see, search engines, social media, and online recommendations don't necessarily expose us to diverse ideas.  On the contrary, they may trap us inside our own information bubbles.  If you were wondering what The Matrix was an allegory for, now you know.

Or, to go from science fiction to science fact, here's why Google, Facebook, and Amazon don't necessarily open up our minds: digital technologies make us prone to what psychologists call the confirmation bias (finding info that confirms our beliefs).  No wonder they make us more tribal.

What to do now?  Try a balanced media diet.

So how do we overcome digital tribalism?  It's a question I ponder often these days, so please let me know if you have any suggestions.  One I have is what I like to call a balanced media diet.  The important thing to remember is that different types of media can affect our lives and minds differently (studying these differences is called media ecology), so moderation (as opposed to overdosing) is key.

Just as it's physically unhealthy to eat only one type of mineral, it's mentally unhealthy to consume only one type of media.  For instance, calcium is essential for a healthy body, but if all I consume is calcium, my body will be in serious trouble.  The same goes for media.  If all or most of my information and interaction comes from social media, my mind will lose its ability to focus and reflect, and my social skills will wane.  So maybe try this: for every minute you spend in front of a screen, spend the same amount of time reading a book and/or conversing with friends, family, or colleagues.  If you're lucky, they'll introduce you to new ideas, question your previously unchallenged beliefs, and pop your information bubble.


"Purple Blues," The Economist, September 30,2017, 75-76.

Wednesday, September 27, 2017

Some Thoughts about Digital Tribes and Book Clubs

Bret Stephens
(image from Wikimedia Commons)

Bret Stephens, a Pulitzer Prize-winning journalist, recently wrote an excellent piece about what he calls “The Dying Art of Disagreement.”  The gist of his message is that Americans cannot respectfully disagree with each other, especially when it comes to politics (emphasis on respectfully).

After the polarizing election of 2016, I’d encourage all to read Stephens' thoughtful commentary.  Here, I’d like to highlight one point implicit in his article that’s close to my heart (particularly in the context of media ecology, or the study of how technologies change the way we live and think).

A Technological Paradox

Stephens points out that political polarization isn't just geographic (rural versus urban) or personal (liberal versus conservative) but also “electronic and digital”:

Americans increasingly inhabit the filter bubbles of news and social media that correspond to their ideological affinities.  We no longer just have our own opinions.  We also have our separate “facts,” often the result of what different media outlets consider newsworthy.

The paradox of digital technologies—from cable TV to Twitter—is that even though we’re interconnected more than ever before, we’ve isolated ourselves inside bubbles of information.

If you want news that conforms to your political bias, simply turn on your preferred program (liberal, conservative, or reactionary) or just Google any website that confirms whatever you want to believe, regardless of the facts.  As a result, we’ve balkanized ourselves into digital tribes.

Digital Tribalism vs. Liberal Education

Digital tribalism contrasts sharply with open-minded analysis (e.g., what psychologists refer to as deep reading and listening) and sustained question-and-answer dialog (i.e., what the Greeks called dialectic).  One way to exercise those faculties is to unplug and immerse ourselves in books and conversation—or what's known as a great books curriculum.  To quote Stephens once more:

What was it that one learned through a great books curriculum? Certainly not “conservatism” in any contemporary American sense of the term. We were not taught to become American patriots, or religious pietists, or to worship what Rudyard Kipling called “the Gods of the Market Place.” We were not instructed in the evils of Marxism, or the glories of capitalism, or even the superiority of Western civilization.
As I think about it, I’m not sure we were taught anything at all. What we did was read books that raised serious questions about the human condition, and which invited us to attempt to ask serious questions of our own. Education, in this sense, wasn’t a “teaching” with any fixed lesson. It was an exercise in interrogation.
To listen and understand; to question and disagree; to treat no proposition as sacred and no objection as impious; to be willing to entertain unpopular ideas and cultivate the habits of an open mind — this is what I was encouraged to do by my teachers at the University of Chicago.
It’s what used to be called a liberal education.

Liberal education (“liberal” here just means in the spirit of liberty, nothing political) frees the mind to challenge itself and others, as opposed to isolating itself in an echo chamber of self-confirmation and vehement disagreement with others, which is what tends to happen with digital technologies, especially social media such as Facebook and Twitter.

Social media certainly have a place in today’s world, but in the realm of civic education, they’re clearly part of the problem, not the solution.

How to save civics

So how do we save civics?  I’ve no ultimate answer, but here’s an experiment I’ve been up to: organizing book clubs.

There’s something special about sitting with individuals of varying perspectives, reading other people’s viewpoints, and trying to understand and discuss areas of agreement or disagreement.  We don’t read, listen, and discuss to agree or disagree.  We do so to understand.  Then, when we do disagree, we understand why, which helps us do so respectfully or, when possible, synthesize or modify our own views.

Book club conversations, I believe, are a better model for civic engagement than posting on social media.  However, the two can be complementary.  For instance, I use Facebook to organize events for my book club, but once the book club begins, we unplug.  Social media, I’ve determined, is great for organizing events, while books and in-person conversations are better for civic dialog.

Wednesday, August 30, 2017

Multitasking is a liability, not a skill (a reminder we're not computers)

I recall a job interview in the not-too-distant past when the hiring manager asked me a peculiar question: "How are your multitasking skills?"

I was a bit taken aback.  As a former student who minored in cognitive science (and as someone who dabbles in yoga and meditation), I thought about going into a scientific spiel about the myth of "multitasking skills" and how multitasking is antithetical to mental focus.  Instead, I talked about concentration and how my ability to focus allowed me to complete tasks thoroughly, one at a time.

Well, I didn't get a call back from the hiring manager (which was fine with me), and lately I've heard similar anecdotes from friends and colleagues.  Why are hiring managers asking about "multitasking skills," as if such a skill set even existed?

What "multitasking" means

The concept of "multitasking" originated quite recently.  The very word, in fact, was invented in the 1960s by professionals in computing.  Originally, "multitasking" referred to what computers could do, which was run many programs all at once.  In this context, multitasking was akin to multiprogramming or multiprocessing.

It wasn't until more recently that the field of business management snagged the term and started using it to refer to human activity in the workplace.

And yet, human beings aren't computers.  Computing professionals were right to say that computers multitask.  Business managers were mistaken to think that people can multitask in the same way.

What "multitasking" doesn't mean

Multitasking is the opposite of concentration.  To concentrate is to focus on one thing (maybe two) at a time.  To multitask is to bounce your attention between multiple things.

The question, then, is this: how well can people multitask?  Thus far, research from psychology, cognitive science, and communication studies gives a unanimous answer: not very well; in fact, we're terrible at it!  What's worse, multitasking can be bad for our mental health.

It's a bit counterintuitive.  In theory, you'd think that juggling multiple tasks at once would save time and effort.  In practice, the opposite happens.

Multitasking impairs focus

This almost goes without saying.  Multitasking is the opposite of concentration, because the former spreads our attention thin over many things, while the latter focuses our attention deeply on one thing (also known as monotasking).

Multitasking leads to mistakes and weakens productivity

Multitasking causes people to be careless and make errors, and switching between tasks makes us complete them more slowly, a phenomenon psychologists refer to as switching costs.

Multitasking kills creativity

People who multitask have difficulty managing their emotions and experience impediments to creative thinking.  (Yogis and Buddhists, of course, have said this for centuries.)

To multitask or not to multitask?

Look, I'm not saying don't ever multitask.  Multitasking is okay for routine things.  For mindless or zoning-out activities, such as cleaning the floor or jogging, I may talk on the phone or listen to podcasts as well.

However, for mindful work that requires attention, care, or imagination, concentration is your ally, and multitasking your Achilles heel.  So don't multitask on the job, when driving in a car, during home improvement projects, or while trying to create art.

We're human beings, not computers.  Let's treat ourselves accordingly.  Unless you're a machine, multitasking is a liability, not a skill.

Postscript: By the way, two of the best mental exercises to train concentration are mindfulness meditation and reading books.  I'd recommend two accessible and fun reads in that light: 10% Happier, by Dan Harris, and Proust and the Squid, by Maryanne Wolf.