The implications of an all-online entertainment future

Great post by Kevin Kelly on why the future of entertainment (and more!) will involve renting rather than owning, but having access to anything at any time.

This is key: “The chief holdup to full-scale conversion from ownership to omni-access is the issue of modification and control. In traditional property regimes only owners have the right to modify or control the use of the property. The right of modification is not transferred in rental, leasing, or licensing agreements.”

We have yet to deal with the legal (and cultural) ramifications of an entertainment world where everything is pure information rather than a physical object, and where you pay to access the information but not to own it. Those ramifications deserve an article or book of their own.

A reasonable defense of Family Circus

Anyone who (like me) has ever made fun of lame comic strips and the newspapers that run them should read this David Sullivan post about audiences’ capacity and desire for cultural change. It’s the most persuasive case I’ve read for why newspapers stick with what I would consider outdated comics, features, and language:

A columnist or feature can occasionally be hip; but a newspaper can’t be hip. It can’t be the counterculture. It is the culture. It has been part of how new ideas are absorbed into the mainstream. …

But it can be hard to find one’s place in the culture, which grows more complicated by the day; the Internet, with its social networking and postings and chat, provides a new counterculture, or multiple ones, ones that make the mainstream look even lamer than “The Family Circus” did to me in the 1970s. The argument about the future of news is partly about whether the mainstream ends with the baby boomers, like the parents left behind in “Childhood’s End” as the children join the ubermind.

The problem is that newspapers have tended to do a poor job of figuring out how to satisfy both the Family Circus audience and more modern ones. And the younger mainstream audience is still hipper and savvier than the Baby Boomer Family Circus crowd. But Sullivan’s point is well taken.

The Marty McFly Paradox

My friend Eric has written an awesome post about the massive holes in the Back to the Future trilogy’s time-travel logic. It’s too long to summarize, but here’s one choice riff:

But here’s the weird thing– when he returns to 1985, he goes back to the parking lot to find the scene from the beginning of the movie play out exactly as it did the first time (except of course that (MASSIVE SPOILER ALERT) the Doc is now wearing a bullet proof vest). Except the Marty who he watches go back in time had a COMPLETELY DIFFERENT SET OF LIFE EXPERIENCES. I’ll accept that to protect the space time continuum, Doc Brown made sure that he still became friends with Marty and he still sent him back to the past at the exact same moment as before (he had seen the video footage of same). But here’s what I’m wondering– and this would have been an interesting additional sequel. What exactly did the alternate, better-1985 version of Marty do when he went back in time?

While I haven’t read the probably numerous hardcore, follow-the-premise-to-its-logical-conclusion sci-fi stories, my sense is that a logically sound time-travel story can end in one of two ways: infinite recursion on the one hand, or the instant un-existing of the time-traveling character (and anything else introduced in that character’s original moment in time) on the other.

I think Back to the Future is playing with the infinite recursion scenario when Marty returns to 1985 and sees himself drive off into the past. That is, Marty isn’t watching “the alternate, better-1985 version of Marty” (or “Marty 2,” as Eric calls him). It’s meant to be Marty literally watching himself re-enact the movie we just saw. Of course this makes no sense: the McFly clan was transformed by Marty’s actions on his first trip back to 1955, so Marty should be changed too, transformed into Eric’s Marty 2. But here the movie breaks from its already strained logic in order to toss out the cool, mind-bending idea of a single Marty McFly doomed to infinitely re-enact his time-traveling life.

This bad logic shows up again in Part II, when

Marty goes back to the past from the future (in order to get the sports almanac back from Biff). And he sees the exact version of himself that I was just talking about and, as he sees, that Marty does the EXACT SAME THING HE DOES IN THE FIRST MOVIE!! That’s weird!

Eric’s right that this should be Marty 1 watching Marty 2, but the movie instead walks away from its premise; it’s simply Marty 1 again watching himself. Which, again, makes no sense.

Anyway, if the first two BTTF movies blew your mind back in the ’80s, go read Eric’s post and prepare to experience your own infinite mental recursion. (Like contemplating infinity, ever-expanding space, and death, thinking about Back to the Future for too long makes my head hurt.)

Journalism reality check II: The death and rebirth of criticism

Over at American Scene, Peter Suderman offers a good response to Patrick Goldstein’s LA Times lament about the loss of entertainment critics in print media. Suderman writes:

For the vast majority of people, a Friday night at the movies is just that — and nothing more. Most people really don’t care about and have no use for lengthy dissertations about the ways in which Steven Soderbergh borrows from Godard. They just want to know whether to see Ocean’s 12! Playing blame the audience doesn’t work for music studios trying to combat piracy, and it doesn’t work for cranky critics who remain convinced they deserve $2 a word for 1) their insights into obscure movies few people want to see or 2) their complaints about Big Dumb Movies that everyone’s going to see anyway.

I would add that a majority of criticism doesn’t even rise to this level of sophistication/pretension. When I led a session on criticism at the Poynter Institute’s High School Writers Workshop, I presented the difference between good and bad criticism as the difference between a term paper (an original thesis supported by examples from the text) and a book report (basic plot summary with maybe a cursory judgment). Many print reviews still tend toward the book report end of the criticism spectrum. (Plus more papers are experimenting with things like American Idol live-blogs and other “insta-criticism” that runs more toward summary/quick response but is totally appropriate for the subjects and form.)

Suderman makes an even more important point about the lack of perspective from those in the newspaper industry who mourn the loss of print critics. He writes:

Trenchant criticism hasn’t died; it’s just shifted venues. …

Meanwhile, I simply refuse to buy the argument that the loss of book pages and film-review jobs is a bad thing. Yes, it’s a bad thing for professional critics. Yes, it’s tougher for those lucky few thousand folks to make a living reading books and watching movies! On the other hand, the internet has actually created vastly more opportunity for aspiring critics to get their work read. The barriers to entry in top-end publications are still high, but those outlets are no longer the only options for critics on the make. So we’ll see fewer professional critics, sure, but we’ll also see far, far more criticism.

And yes, some of it will be bad. But on the whole, I’d guess that it will create a net gain in serious, thought-provoking criticism of just about every medium. Meanwhile, most of those truly elite outlets — the New Yorkers and the Washington Posts — are not going away.

Terrific points all. Jody Rosen is the best music critic in the country; he writes for Slate, not a newspaper. Newspapers that have a Jody Rosen should build an online brand and community around that critic and hope the critic doesn’t leave. If they don’t have a Jody Rosen, if their critics file one book-report review after another — and if newspapers increasingly need to think about what they can offer readers that no one else can — then they should treat every kind of critic as a luxury except for (maybe) local-music and (definitely) restaurant critics.

But there’s one crucial piece missing from Suderman’s analysis. Yes, there’s plenty of great criticism online. Yes, there’s going to be a net increase in great criticism thanks to that online crit-boom. But like so much of the online news-commentary-criticism boom, it is invisible to newspaper readers.

Suderman assumes that getting rid of critics won’t matter because newspaper readers will find the good stuff online. That would be true if everyone had an RSS feed and read Slate, Pitchfork, and House Next Door. Needless to say, not everyone does. And if they did, newspapers’ already-declining readership would erode even further.

So if newspapers do get rid of in-house critics, they need to simultaneously start giving readers some of the material Suderman talks about. That goes for more than just criticism. Newspapers can no longer treat the online universe as invisible. They have to find a way to bring that great content to their readers, both via the Web and in print.

Life is not, in fact, like a sitcom (or, What I learned from Carolyn Hax)

I’m a little late to this one, but I finally read “Marry Him!” — a buzz-fishing article in last month’s Atlantic that ostensibly makes the case for settling for a spouse instead of holding out for Mr. Right. Here’s the gist:

Of course, we’d be loath to admit it in this day and age, but ask any soul-baring 40-year-old single heterosexual woman what she most longs for in life, and she probably won’t tell you it’s a better career or a smaller waistline or a bigger apartment. Most likely, she’ll say that what she really wants is a husband (and, by extension, a child).

[snip]

My advice is this: Settle!

What stands out from the article isn’t the fact that author Lori Gottlieb herself hasn’t settled (she’s a 40-something who, along with a friend, decided to have a baby with donor sperm “in fits of self-empowerment” — surely the best reason to have a baby). Or her attempt at ironically defusing the shock and vitriol she just knew her taboo-busting article would provoke (“Oh, I know—I’m guessing there are single 30-year-old women reading this right now who will be writing letters to the editor to say that the women I know aren’t widely representative, that I’ve been co-opted by the cult of the feminist backlash, and basically, that I have no idea what I’m talking about.”) Or her repeated undermining of her case for settling.

No, the most notable aspect of the story is that Gottlieb is dispensing romantic advice even though she seems to be the kind of person who believes that life is like a romantic comedy. Or rather, that romantic comedies are true to life, and that adults should draw their lessons about life and love from TV and the movies.

SNL’s ‘Milkshake’ miss and the limits of viral video fads

Saturday Night Live’s first post-strike episode was surprisingly solid, thanks to Tina Fey and her love of slightly sexist humor and poop jokes. Only one sketch bombed (a TMI drunken wedding toast) and an otherwise brilliant Rock of Love parody was ruined by Amy Poehler’s annoying one-legged farter (topic for future consideration: why SNL still bothers to come up with “characters” and why SNL characters and catch phrases were ever big deals in the first place).

The most interesting sketch came near the end, when a scene opened on Bill Hader doing a spot-on Daniel Plainview impression inside what turned out to be an old-fashioned soda shop. Sure enough, it was an “I Drink Your Milkshake” sketch. And it got an interesting audience response — not crickets or forced laughter, but what seemed to me to be chuckles of sheer bafflement. Most of the audience simply didn’t know what was going on. (The biggest laugh line was Kenan Thompson joking that Hader would get a cold from his shake — hardly a reference to the original gag or the movie.) It was a great lesson in the limited reach of Internet fads and viral video.

The sketch is based on a scene from There Will Be Blood in which Daniel Day-Lewis’ crazed oilman shouts “I drink your milkshake!” I haven’t seen the movie yet, but I gather it’s roughly equivalent to Borat saying “I crush her” only more violent. Various geniuses made viral videos parodying the line, or mashing it up with the Kelis song “Milkshake,” or otherwise creating Internet hilarity. New York Magazine’s Vulture blog called it (only semi-sarcastically, as far as I can tell) “2008’s fastest-growing catchphrase” and provided a guide to its proper usage. Various non-NYC-insidery-blog media outlets picked up on what the cool kids were blogging about, and soon you had the Associated Press noting in its Oscar roundup:

Despite the art-house nature of “There Will Be Blood,” Day-Lewis’ performance has seeped its way into popular culture. A line he bellows during the film’s stunningly violent climax — “I drink your milkshake!” — has become a bit of a catch phrase.

Note the hedge “a bit.” Judging by the response to SNL’s milkshake sketch, the catch phrase hasn’t seeped very far beyond the in-the-know audience it came from. It says a lot that Saturday Night Live’s audience (not a hip bunch like the Daily Show crowd, but probably a good barometer of general pop-culture awareness) missed the joke.

The sketch is a good reminder of how even the Internet’s top pop culture blogs are still pretty self-contained and inter-referential and off the general population’s radar. The same thing happened last year when Best Week Ever discovered “Chocolate Rain.” They tried to turn their discovery into a pop culture phenomenon; viral vid parodies ensued; and “Chocolate Rain” singer Tay Zonday appeared on Jimmy Kimmel’s show — again, to the audience’s utter bafflement.

I Drink Your Milkshake and Chocolate Rain are both fascinating examples of pop culture’s real-time, Internet-era metamorphosis. Their narrow reach, and the hipster blogs’ attempts to recreate old-school fads like catch phrases and characters in viral video form, show that maybe things aren’t changing as quickly as we thought.