Friday, December 22, 2017

Dopamine-Driven Feedback Loops: Unintended Consequences of Social Media

I may be too optimistic, but 2017 could mark the year when people awoke to the unintended consequences of social media.

We saw the US government grill representatives from Facebook, Twitter, and Google about the propagation of fake news over the web.

Now we've seen former executives from Facebook come out to discuss the damage that their platforms have done to public discourse.

I wanted to highlight two interviews that caught my attention.  They show how new media have ecological consequences: when we add new technologies to the environment, they change the entire environment, often with unintended consequences (see my prior posts on media ecology).

Sean Parker, one of the founders of Facebook (played by Justin Timberlake in the movie The Social Network), has pointed out that Facebook is designed like an addictive drug.  Every time you see more likes, comments, or shares online, you get a blast of dopamine (a chemical that signals immediate reward to your brain), which keeps you coming back for more likes, comments, and shares.

The result is what Parker called a “social-validation feedback loop” that exploits “a vulnerability in human psychology.”

The thought process that went into building these applications, Facebook being the first of them ... was all about: 'How do we consume as much of your time and conscious attention as possible?’

And that means that we need to sort of give you a little dopamine hit every once in a while, because someone liked or commented on a photo or a post or whatever. And that's going to get you to contribute more content, and that's going to get you ... more likes and comments.
It's a social-validation feedback loop ... exactly the kind of thing that a hacker like myself would come up with, because you're exploiting a vulnerability in human psychology.

Parker isn't alone in his assessment.  Chamath Palihapitiya, a former Facebook executive, has come out with a similar concern.  He goes so far as to express “tremendous guilt” about how “we have created tools that are ripping apart the social fabric of how society works.”

The short-term, dopamine-driven feedback loops that we have created are destroying how society works: no civil discourse, no cooperation, misinformation, mistruth.

Okay, let me admit that I’m still a user of Facebook, although lately I’ve considered deleting my account.  Here’s the question I have to ask myself: What value are these media truly bringing to my life?  When I reflect on this question, here's the honest answer: not much.

Facebook does help me keep in touch with friends and organize groups and events with them (mainly, I use Facebook to organize get-togethers).  However, I’m not afraid to admit that I find most communication on social media to be superficial and phatic.

At this point, I agree with Parker's and Palihapitiya’s temporary solution (which you can hear in their full interviews): only use Facebook minimally.  Very minimally.  Make sure the majority of your social interaction is in the real world, not the virtual one.

Saturday, November 11, 2017

Is Big Tech a Threat to Democracy?

This last month, we saw something unprecedented in the world of big tech.  Executives from Facebook, Twitter, and Google testified on Capitol Hill to address how the Kremlin used their platforms to spread propaganda and fake news over the Internet.

A lot of problems were discussed about disinformation over the web, and yet these problems all seem to be part of a larger one: the formation of digital tribes.  People are already prone to what psychologists call the confirmation bias (a tendency to seek out information that confirms what we already want to believe), and social media make this tendency much easier to indulge.  As The Economist recently wrote,

The algorithms that Facebook, YouTube and others use to maximize “engagement” ensure users are more likely to see information that they are liable to interact with.  This tends to lead them into clusters of like-minded people sharing like-minded things, and can turn moderate views into more extreme ones.

Since I’m quoting it, I’d encourage everyone to check out this recent issue of The Economist—it should be required reading for students.

(Image from The Economist)

I remember, back in some of my grad school seminars (my M.S. is in Scientific & Technical Communication with a minor in cognitive science), professors and students arguing that social media would revolutionize democracy.  More people, they claimed, would be able to participate in civic dialogue using social media, which would help participatory democracy flourish.

Well, social media didn’t revolutionize democracy.  Instead, social media seem to be replacing democracy with echo chambers of ideological prattle susceptible to fake news.  This can be entertaining and amusing, but it's not necessarily democratic or informative.  As Neil Postman, the founder of media ecology, warned in his famous book (pictured below), we're in danger of amusing ourselves to death.

(Image from Amazon)

But let’s be practical: big tech isn’t going anywhere, and social media are great for some things, such as organizing events and sharing entertainment (points Clay Shirky makes in his books).  When it comes to civics, however, the big tech companies are not enhancing democracy or dialogue.  Going forward, they may need to prepare themselves for regulation and oversight to mitigate the unintended consequences of social media.

For example, just as we regulate large banks as trustees of our money so that they don't endanger our economic institutions, we may want to regulate big tech companies as trustees of our data so that they don't endanger our democratic institutions.


“Social Media’s threat to democracy,” The Economist, November 4th-10th 2017, 19-22.

Sunday, October 22, 2017

Consuming a Balanced Media Diet: digital tribes, information bubbles, and the matrix

I've said it before, and I'll say it again: digital technology has played a major role in the rise of "digital tribalism," in which groups and individuals use digital technologies to construct information bubbles.  The paradox of the Digital Age is that we're more interconnected than ever, and yet we use search engines, social media, and recommendation systems to insulate ourselves from ideas outside our information bubbles.

Don't like the facts?  Google your own 'facts' instead.

The best thing about the Internet is that you can find out anything about anything.  That's also the worst thing about the Internet.

For example, if you're curious about climate, evolution, or vaccines, you can easily find educational resources such as Scientific American.  But if you don't want to believe in climate science, evolution, or vaccinations, it's also easy to find sites that assure you that global warming is a conspiracy, evolution is a hoax, and vaccines are dangerous.  Anyone can Google whatever they want to believe nowadays, and somewhere in the digital universe there will be a website propagating blatantly false information that misrepresents basic scientific, historical, and medical facts.

Don't like other viewpoints?  'Unfriend' them.

If you don't like what someone says on Facebook or Twitter, you can 'unfriend' or 'unfollow' that person.  More and more people did that to others during the last election, and, unfortunately, it's just reinforcing digital tribes of liberals and conservatives.  I say unfortunately because, in the end, people only see what they are ideologically predisposed to like in their feed, and few are challenged by different viewpoints.  As a result, fewer people can have conversations, constructively disagree, or defend their own points of view, because they don't want to be questioned.

Only want to see what you'd like to see?  Let online recommendations make your decisions.

It's not just search engines and social media that are isolating us into digital tribes.  Online shopping is doing that too.  Take the recommendation system on Amazon as an example.  The Economist ran an analysis that looked at how online recommendations from Amazon influence customers who buy political books.  The result was summed up in the following graphic, which shows how left-leaning shoppers tend to buy only left-leaning books, just as right-leaning shoppers tend to buy only right-leaning books.

Source: The Economist, September 30th, 2017

As you can see, very few readers end up even looking at other political viewpoints.  The online recommendations are not only making decisions for customers.  Customers are also letting online recommendations determine what they end up knowing.  As the Economist concludes,
Jeff Bezos, Amazon's founder, has bought the Washington Post, and urged upon it the motto: "Democracy Dies in Darkness".  But Amazon conquered the book market in part on the strength of its "recommendation engine".  That now contributes to the dark spots in Americans' knowledge of their political opposites.  Whether Amazon will (or even can) do anything to change that is yet to be seen.

Trapped in the matrix ... of information bubbles and digital tribes

As we can see, search engines, social media, and online recommendations don't necessarily expose us to diverse ideas.  On the contrary, they may trap us inside our own information bubbles.  If you were wondering what The Matrix was an allegory for, now you know.

Or, to go from science fiction to science fact, here's why Google, Facebook, and Amazon don't necessarily open up our minds: digital technologies make us prone to what psychologists call the confirmation bias (seeking out information that confirms our beliefs).  No wonder they make us more tribal.

What to do now?  Try a balanced media diet.

So how do we overcome digital tribalism?  It's a question I ponder often these days, so please let me know if you have any suggestions.  One suggestion I have is what I like to call a balanced media diet.  The important thing to remember is that different types of media affect our lives and minds differently (studying these differences is called media ecology), so moderation (as opposed to overdosing) is key.

Just as it's physically unhealthy to eat only one type of mineral, it's mentally unhealthy to consume only one type of media.  For instance, calcium is essential for a healthy body, but if all I consume is calcium, my body will be in serious trouble.  The same goes for media.  If all or most of my information and interaction comes from social media, my mind will lose its ability to focus and reflect, and my social skills will wane.  So maybe try this: for every minute you spend in front of a screen, spend the same amount of time reading a book and/or conversing with friends, family, or colleagues.  If you're lucky, they'll introduce you to new ideas, question your previously unchallenged beliefs, and pop your information bubble.


"Purple Blues," The Economist, September 30, 2017, 75-76.

Wednesday, September 27, 2017

Some Thoughts about Digital Tribes and Book Clubs

Bret Stephens
(image from Wikimedia Commons)

Bret Stephens, a Pulitzer Prize-winning journalist, recently wrote an excellent piece about what he calls “The Dying Art of Disagreement.”  The gist of his message is that Americans can no longer respectfully disagree with each other, especially when it comes to politics (emphasis on respectfully).

After the polarizing election of 2016, I’d encourage all to read Stephens' thoughtful commentary.  Here, I’d like to highlight one point implicit in his article that’s close to my heart (particularly in the context of media ecology, or the study of how technologies change the way we live and think).

A Technological Paradox

Stephens points out that political polarization isn't just geographic (rural versus urban) or personal (liberal versus conservative) but also “electronic and digital”:

Americans increasingly inhabit the filter bubbles of news and social media that correspond to their ideological affinities.  We no longer just have our own opinions.  We also have our separate “facts,” often the result of what different media outlets consider newsworthy.
The paradox of digital technologies—from cable TV to Twitter—is that even though we’re interconnected more than ever before, we’ve isolated ourselves inside bubbles of information.

If you want news that conforms to your political bias, simply turn on your preferred program (liberal, conservative, or reactionary), or just Google any website that confirms whatever you want to believe, regardless of the facts.  As a result, we’ve balkanized ourselves into digital tribes.

Digital Tribalism vs. Liberal Education

Digital tribalism contrasts sharply with open-minded analysis (e.g., what psychologists refer to as deep reading and listening) and sustained question-and-answer dialogue (i.e., what the Greeks called dialectic).  One way to exercise those faculties is to unplug and immerse ourselves in books and conversation—or what's known as a great books curriculum.  To quote Stephens once more:
What was it that one learned through a great books curriculum? Certainly not “conservatism” in any contemporary American sense of the term. We were not taught to become American patriots, or religious pietists, or to worship what Rudyard Kipling called “the Gods of the Market Place.” We were not instructed in the evils of Marxism, or the glories of capitalism, or even the superiority of Western civilization.
As I think about it, I’m not sure we were taught anything at all. What we did was read books that raised serious questions about the human condition, and which invited us to attempt to ask serious questions of our own. Education, in this sense, wasn’t a “teaching” with any fixed lesson. It was an exercise in interrogation.
To listen and understand; to question and disagree; to treat no proposition as sacred and no objection as impious; to be willing to entertain unpopular ideas and cultivate the habits of an open mind — this is what I was encouraged to do by my teachers at the University of Chicago.
It’s what used to be called a liberal education.
Liberal education (“liberal” here just means in the spirit of liberty, nothing political) frees the mind to challenge itself and others, instead of isolating itself in an echo chamber of self-confirmation and vehement disagreement.  The latter is what tends to happen with digital technologies, especially social media such as Facebook and Twitter.

Social media certainly have a place in today’s world, but in the realm of civic education, they’re clearly part of the problem, not the solution.

How to save civics

So how do we save civics?  I’ve no ultimate answer, but here’s an experiment I’ve been trying: organizing book clubs.

There’s something special about sitting with individuals of varying perspectives, reading other people’s viewpoints, and trying to understand and discuss areas of agreement or disagreement.  We don’t read, listen, and discuss in order to agree or disagree.  We do so to understand.  Then, when we do disagree, we understand why, which helps us disagree respectfully or, when possible, synthesize or modify our own views.

Book club conversations, I believe, are a better model for civic engagement than posting on social media.  However, the two can be complementary.  For instance, I use Facebook to organize events for my book club, but once the book club begins, we unplug.  Social media, I’ve determined, are great for organizing events, while books and in-person conversations are better for civic dialogue.

Wednesday, August 30, 2017

Multitasking is a liability, not a skill (a reminder we're not computers)

I recall a job interview in the not-too-distant past when the hiring manager asked me a peculiar question: "How are your multitasking skills?"

I was a bit taken aback.  As someone who minored in cognitive science (and who dabbles in yoga and meditation), I thought about launching into a scientific spiel about the myth of "multitasking skills" and how multitasking is antithetical to mental focus.  Instead, I talked about concentration and how my ability to focus allowed me to complete tasks thoroughly, one at a time.

Well, I didn't get a call back from the hiring manager (which was fine with me), and lately I've heard similar anecdotes from friends and colleagues.  Why are hiring managers asking about "multitasking skills," as if such a skill set even existed?

What "multitasking" means

The concept of "multitasking" is surprisingly recent.  The word itself was coined in the 1960s by computing professionals.  Originally, "multitasking" referred to what computers could do: run many programs seemingly at once by switching rapidly among them.  In this context, multitasking was akin to multiprogramming or multiprocessing.

It wasn't until more recently that the field of business management snagged the term and started using it to refer to human activity in the workplace.

And yet, human beings aren't computers.  Computing professionals were right to say that computers multitask.  Business managers were mistaken to think that people multitask in a similar vein.

What "multitasking" doesn't mean

Multitasking is the opposite of concentration.  To concentrate is to focus on one thing (maybe two) at a time.  To multitask is to bounce your attention between multiple things.

The question, then, is this: how well can people multitask?  Thus far, research from psychology, cognitive science, and communication studies gives a unanimous answer: not very well.  In fact, we're terrible at it!  What's worse, multitasking can be bad for our mental health.

It's a bit counterintuitive.  In theory, you'd think that juggling multiple tasks at once would save time and effort.  In practice, the opposite happens.

Multitasking impairs focus

This almost goes without saying.  Multitasking is the opposite of concentration, because the former spreads our attention thin over many things, while the latter focuses our attention deeply on one thing (also known as monotasking).

Multitasking leads to mistakes and weakens productivity

Multitasking causes people to be careless and make errors, and switching between tasks makes us complete them more slowly, a phenomenon psychologists refer to as switching costs.

Multitasking kills creativity

People who multitask have difficulty managing their emotions and experience impediments to creative thinking.  (Yogis and Buddhists, of course, have said this for centuries.)

To multitask or not to multitask?

Look, I'm not saying don't ever multitask.  Multitasking is okay for routine things.  For mindless or zoning-out activities, such as cleaning the floor or jogging, I may talk on the phone or listen to podcasts as well.

However, for mindful work that requires attention, care, or imagination, concentration is your ally, and multitasking your Achilles' heel.  So don't multitask on the job, when driving a car, during home improvement projects, or while trying to create art.

We're human beings, not computers.  Let's treat ourselves accordingly.  Unless you're a machine, multitasking is a liability, not a skill.

Postscript: By the way, two of the best mental exercises to train concentration are mindfulness meditation and reading books.  I'd recommend two accessible and fun reads in that light: 10% Happier, by Dan Harris, and Proust and the Squid, by Maryanne Wolf.

Monday, July 31, 2017

Book Review: Everything That Remains

Recently I had a chance to watch a film called Minimalism: A Documentary About the Important Things.  The movie follows two friends, Joshua Fields Millburn and Ryan Nicodemus, as they travel around the U.S. to promote their memoir Everything That Remains (see trailer below).

Who are these dudes, and why should we care about their story?

Joshua Fields Millburn and Ryan Nicodemus
(Image from their site The Minimalists)

Joshua and Ryan are a couple of Midwestern guys who grew up poor, so they devoted their early adult years to climbing the Corporate Ladder, becoming well-paid executives, and buying expensive stuff (lots of stuff) to reward themselves for their onerous toil at the office.

Sometimes they spent so much money that, despite their large paychecks, they went into debt (thousands of dollars into debt).  But hey, as long as the money was rolling in, life appeared hunky dory on the surface, so they continued to live the American Dream with panache.

Tragic events, however, force us to reevaluate, which is what happened to Joshua.  In a single month, his mother died and his marriage fell apart.  He found himself alone, surrounded by mountains of material things that didn't make him any happier.  Realizing he needed to change his life, he stumbled across a movement known as minimalism.

So what is minimalism?

Minimalism doesn't mean getting rid of all your belongings and living like an ascetic.  Minimalism means asking a simple question about what we own and how we spend our time: does this truly bring value to my life?

Asking that question made Joshua realize a contradiction in the American way of life.  In the relentless pursuit of happiness, we've made ourselves unhappy, buying stuff we don't need with money we don't have by working long hours at jobs we don't necessarily enjoy.  As a result, we feel empty inside.  To fill the spiritual void, we buy more things, pile on extra debt, and work even longer hours to pay for that debt.  All the while, we lose track of our passions and relationships that make life worth living.  In the end, we don't really own our things.  Our things own us.

I'm reminded of what Henry David Thoreau says in Walden: "Men have become the tools of their tools."

Henry David Thoreau
(Image from Wikimedia Commons)


(Image from The Minimalists site)

To turn his life around, Joshua got rid of his excessive belongings, gradually paid off his debt, and quit his corporate job to become a writer.  His buddy Ryan soon noticed how much happier Joshua was, so he followed suit.  Their book, Everything That Remains, is a product of this minimalist adventure.  It's a beautifully written memoir, composed by Joshua, with quirky commentary added by Ryan.

Two quick thoughts I had about the book:

  1. Beware: it's not for everybody.  After all, it's a critique of the American Dream, at least as we conventionally understand it: work hard, climb the Corporate Ladder, and then buy your dream car, large house, and all the gadgets you want.  Joshua and Ryan present minimalism as a meaningful alternative to mindless consumerism.  Their goal is to live more deliberately, more consciously.
  2. Minimalism isn't some new fad.  It's about searching for what's important.  Similar messages come from Buddha, Classical philosophers like Seneca, and American writers such as Thoreau.  Owning less minimizes the distractions in our lives, giving us more time for love and friendship.  What's new is how Joshua and Ryan present this message in an updated, accessible way.

In a sense, the word 'minimalism' may be misleading.  Something indeed is being minimized (mindless materialism), and yet something else is being maximized (meaningful living).  So while minimalism minimizes material distractions, it maximizes existential meaning.  If that resonates with your own outlook on life, Everything That Remains is likely a book for you.

Sunday, June 25, 2017

The Value of Simplicity: Revisiting Thoreau's Walden

Every so often, I love to pick up books I haven't looked at in a while and lose myself in the words of a sagacious writer.  Lately, I've been revisiting the writings of Henry David Thoreau, whose books I never cease to recommend.

Most of us have heard of Henry David Thoreau, but we don't quite know what to think about him in the 21st-century Digital Age, which is dominated by complex technologies that invade nearly every aspect of our lives.   Smartphones inundate us with text messages, social media notifications, and calls throughout the day.  And with more consumers buying new devices such as commercial drones and fitness trackers, personal privacy may soon become a vestige of the past.  Were he alive today, Thoreau would have nothing to do with any of these.

(Image from Wikimedia Commons)

Here was a writer who professed simplicity almost two centuries ago (he lived from 1817 to 1862).  He was a contrarian for his time, a period of history known as the Industrial Age, which was distinguished by the rise of giant factories, growing urban areas, and powerful machines such as steam engines.  In fact, Thoreau went so far as to withdraw from city life and live by himself in the woods.  He brought with him only the bare necessities of life, and he wrote about this adventure in his book Walden.  Why the name "Walden"?  I'll let Thoreau explain.

My own copy of Walden

"Near the end of March, 1845, I borrowed an axe and went down to the woods by Walden Pond, nearest to where I intended to build my house, and began to cut down some tall, arrowy white pines, still in their youth, for timber" (37).

Why make such a drastic move?  Again, I'll let the man himself explain.

"I went into the woods because I wished to live deliberately, to front only the essential facts of life, and see if I could not learn what it had to teach, and not, when I came to die, discover that I had not lived.  I did not wish to live what was not life, living is so dear" (82).

Why on earth would Thoreau (or any of us, for that matter) not be living deliberate, authentic lives?  One reason may have something to do with the rise of complex technology.  We created modern machines to make our lives better, but they have done the opposite, according to Thoreau, who professes,

"men have become the tools of their tools" (34).

In other words, we are not the ones controlling technology.  Technology is controlling us.

Maybe that sounds hyperbolic, but let me try to update Thoreau's message.

By using modern technologies to make our social lives easier or more efficient, we often make our personal lives more complicated.  Every few minutes, our digital devices distract us with messages (at least half of Americans check their smartphones several times an hour or more, according to Gallup), and information overload from the Internet is confusing lots of people (many now have trouble distinguishing journalism from fake news).

When technology hits us with that much complexity all at once, it may be healthy to simplify our lives and curtail mental distractions.  I'll give a personal example.  One way I choose to simplify my life is not to pay for a data plan on my cellphone.  I only use it to make calls or send texts.  As a result, my phone distracts me very little, and I'm able to focus on the people in front of me.  Plus, the low phone bill is nice.

Okay, cutting the data plan from my phone isn't as drastic as Thoreau retreating to the woods, but it's my approach to a more deliberate life of experiencing the people and places around me without distractions.  Try this simple test next time you're out with a friend at a cafe, restaurant, bar, beach, park, or wherever: see if you can enjoy your friend's company without disrupting the conversation by checking your smartphone.  After all, would you rather squander your free time by compulsively checking your smartphone every other minute?  Or would you rather enjoy the moment with people you care about?  In short, does technology control you, or are you in control of your own life ("to live deliberately," as Thoreau said)?

Walden's message of simplicity and deliberate living isn't about condemning technology.  It is, however, a rejection of technological consumerism.  On a practical level, simplicity means minimizing material distractions so that we can maximize the enjoyment of life around us.

Walden Pond (Image from Wikimedia Commons)

References: Thoreau, Henry David. Walden.  New York: Doric Books, 1950.

Sunday, May 21, 2017

Retro Reviews: Midnight in Paris

I love movies.  They’re my second favorite art form, right after books.  Like older art forms (novels, theater, etc.), cinema is an artistic medium that conveys heroic tales and modern myths, especially in the fantasy, superhero, and science fiction genres.  There’s a common theme in these cinematic myths, whether we’re watching Harry Potter, The Avengers, or Star Wars: the relationship between human beings and their technologies, whether those technologies are antique tools (e.g., magic wands), advanced machines (e.g., robots), or super weapons (e.g., the Death Star).

I’ve written about some of my favorite films (see my list of greatest sci-fi flicks), and on occasion I review new films (including The Force Awakens and Rogue One).  However, movie genres outside of science fiction and fantasy do not always get the full attention they deserve, so I decided to write "retro reviews" of great films that say something important about the relationship between humanity and technology.  Today, let's pay tribute to Woody Allen's Midnight in Paris.

“The past is never dead.  It’s not even past.” – William Faulkner (quoted in Midnight in Paris)

(Image from IMDb)

This romantic comedy stars Owen Wilson as Gil Pender, an aspiring novelist with a nostalgic love for the art and technology of the 1920s.  Gil not only feels out of place in his 21st-century culture; he also feels out of time.  In today’s digital age of computers and cellphones, he longs for an analog age of records, clocks, jazz instruments, and (most tellingly) books.  His idols include Ernest Hemingway, F. Scott Fitzgerald, and Gertrude Stein.

Without spoiling the whole plot, here’s a basic synopsis.  While wandering the streets of Paris on vacation, Gil is magically transported to the 1920s when the clock strikes midnight each night.  Suddenly, he's riding around with Zelda and F. Scott Fitzgerald.  At first, he enjoys these midnight trips back in time, which let him escape his own culture’s apathy for aesthetics and satiate his own yearning for the forgotten history of the 1920s.

(Image from IMDb)

These midnight trips are a much-needed escape from his present situation, which is bogged down by a loveless relationship with his profligate fiancée, who only seems to care about spending money.

(Image from IMDb)

The spiritual void behind her consumerism is filled by her phony friends, including a pretentious pseudo-intellectual who loves to lecture everyone on topics he only pretends to understand.

(Image from IMDb)

To make matters worse, her parents are philistines with no respect for arts and literature, in stark contrast to Gil’s bibliophilic and aesthetic passions.

(Image from IMDb)

It's no wonder Gil longs to lose himself in a glorious past, where he can talk to modernist writers and dance to the swinging rhythms of jazz.

(Image from IMDb)

During his midnight adventures, however, Gil slowly realizes that his own romantic nostalgia for the past is no better than his modern culture’s apathy toward art and history.  One is just the opposite extreme of the other.

In the end, he learns an important lesson: there’s a middle ground between nostalgia and apathy.  The moral is almost Buddhist (or maybe Stoic in the Roman sense): Remember the past, admire and learn from history, but don’t cease to live in the present.

In short, the past versus the present isn't an either-or choice.  We should learn from history to live fulfilling lives here and now.  Remembering the past to inspire the present is, in fact, the life of the artist, symbolized by Gil strolling through the midnight rain (a symbol of new life) and enjoying conversation with an artistic lady (not his profligate fiancée, with whom he eventually breaks up).

(Image from IMDb)

Midnight in Paris is a beautiful, beautiful film, and it's much more than a romantic comedy.  It’s a healthy reminder that in our digital age of accelerated innovation, spendthrift consumerism, and forgetfulness of art and history, sometimes playing with antiquarian technologies and old media—from classic novels to jazz records—can help us make sense of the present.