Showing posts with label popular culture.

Tuesday, March 15, 2011

Algorithmic Culture, Redux

Back in June I blogged about "Algorithmic Culture," or the sorting, classifying, and hierarchizing of people, places, objects, and ideas using computational processes. (Think Google search, Amazon's product recommendations, who gets featured in your Facebook news feed, etc.) Well, for the past several months I've been developing an essay on the theme, and it's finally done. I'll be debuting it at Vanderbilt University's "American Cultures in the Digital Age" conference on Friday, March 18th, which I'm keynoting along with Kelly Joyce (College of William & Mary), Cara Finnegan (University of Illinois), and Eszter Hargittai (Northwestern University). Needless to say, I'm thrilled to be joining such distinguished company at what promises to be, well, an event.


The piece I originally posted on algorithmic culture generated a surprising -- and exciting -- amount of response. In fact, nine months later it's still receiving pingbacks, most likely because it has found its way onto one or more college syllabi. Between that and how well the essay has come together, I'm seriously considering developing the material on algorithmic culture into my next book. Originally I'd planned on focusing on contemporary religious publishing after Late Age, but increasingly I feel as if that will have to wait.

Drop by the conference if you're in or around the Nashville area on Friday, March 18th. I'm kicking things off starting at 9:30 a.m. And for those of you who can't make it there, here's the title slide from the PowerPoint presentation, along with a little taste of the talk's conclusion:



This latter definition—culture as authoritative principle—is, I believe, the definition that’s chiefly operative in and around algorithmic culture. Today, however, it isn’t culture per se that is a “principle of authority” but increasingly the algorithms to which are delegated the task of driving out entropy, or in Matthew Arnold’s language, “anarchy.” You might even say that culture is fast becoming—in domains ranging from retail to rental, search to social networking, and well beyond—the positive remainder of specific information processing tasks, especially as they relate to the informatics of crowds. And in this sense algorithms have significantly taken on what, at least since Arnold, has been one of culture’s chief responsibilities, namely, the task of “reassembling the social,” as Bruno Latour puts it—here, though, by discovering statistical correlations that would appear to unite an otherwise disparate and dispersed crowd of people.

I expect to post a complete draft of the piece on "Algorithmic Culture" to my project site once I've tightened it up a bit. Hopefully it will generate even more comments, questions, and provocations than the blog post that inspired the work initially.

In the meantime, I'd welcome any feedback you may have about the short excerpt appearing above, or on the talk if you're going to be in Nashville this week.

Friday, November 19, 2010

"Harry Potter Grows Up": The Meaning Behind a Cliché

For those of you who aren't familiar with The Late Age of Print, the final chapter of the book focuses on the extraordinary literary sensation that is Harry Potter. So, needless to say, Harry Potter has been on my mind quite a bit lately, especially with today's release of the first installment of the film adaptation of Harry Potter and the Deathly Hallows.

I don't have much to say about the latest film, honestly, not having yet seen it -- although I intend to, as I've seen the previous six movies and have read/enjoyed all seven books. Instead, what I've been thinking about lately is the age of Harry Potter, or rather that of his fans.

I teach an undergraduate course at the 300 (junior) level called "The Cultures of Books and Reading"; during one week, we focus on the many-headed Harry Potter phenomenon. When I first launched the book class, back in 2006, I was excited to realize that my students were basically Harry's contemporaries. Those among them who were eleven years old -- Harry's age -- when the series launched in 1997 were twenty in 2006, which is the typical age of most college juniors.

But now it's four years later, and those twenty year-olds are turning twenty-four. Yes, that's right, twenty-four -- practically a quarter century. Graduate school age. Marrying age. Getting established in one's career age. Even baby-having age. I'm feeling old just writing about them! Indeed, it's not just that Harry Potter and the actors who portray him and his friends on screen have grown up. The whole fan culture surrounding Harry Potter has grown up, too, to the point where, as with Star Wars fans, we might even start thinking about a whole new generation of Potter enthusiasts.

This is what the release of the first installment of the film adaptation of Harry Potter and the Deathly Hallows really means. It marks the beginning of the end of the film adaptations, yet it also marks the beginning of the beginning of the next generation of Potter fandom. What role, if any, will the books, films, toys, games, candy, costumes, etc. play in their lives? And what new meanings will the Harry Potter franchise take on once the torch gets passed, or rather shared?

Monday, June 14, 2010

World Cup...Fever?

Most of my friends seem to have developed World Cup fever, including those who, up until now, haven't shown any particular interest in soccer/football. I suppose that's how you end up with one in every two people on the planet watching at least some portion of the tournament.

But for those of you who, like me, are suffering from the opposite condition -- World Cup hypothermia -- I'm happy to share this most excellent clip from The Simpsons. Enjoy.

Monday, February 15, 2010

Harry Potter and the Simulacrum

I've been meaning to blog about this for a couple of months now. An article of mine, which may be of interest to readers of my book, The Late Age of Print, was published in the October 2009 issue of the journal Critical Studies in Media Communication (CSMC). Here's the citation, abstract, and keywords:
Ted Striphas, "Harry Potter and the Simulacrum: Contested Copies in an Age of Intellectual Property," Critical Studies in Media Communication 26(4) (October 2009): 1-17.

This essay begins by investigating how and on what basis the boundary between originals and copies gets drawn within the framework of intellectual property law. It does so by exploring Harry Potter-related doubles that were featured in the 2000 trademark and copyright infringement case, Scholastic, Inc., J. K. Rowling, and Time Warner Entertainment Company, L.P. v. Nancy Stouffer. The paper then moves on to consider how, within the context of the case, the boundary line dividing “originals” from “copies” grows increasingly indeterminate, so much so that it becomes untenable to speak of either category at all. It thus investigates what happens when the figure of the simulacrum, which troubles bright-line distinctions between originals and copies, enters into the legal realm. Theoretically, the simulacrum would seem to pose a challenge to intellectual property law's jurisprudential foundations, given how it blurs what should count as an “original” or a “derivative” work. This paper shows that while this may be true in principle, powerful multimedia companies like Scholastic, Time Warner, and others can strategically deploy simulacra to shore up their intellectual property rights.

Keywords: Harry Potter; Intellectual Property; Copyright; Trademark; Simulacrum
There's a good deal of thematic overlap between the article and Chapter 5 of The Late Age of Print, which also focuses on Harry Potter and intellectual property rights. They differ, though, in that the journal essay is more theoretically focused than the book chapter; the latter, I suppose, is more historical and sociological.

The strange thing about "Harry Potter and the Simulacrum" is that even though it's quite theoretical, it's also quite -- I'm not sure what exactly -- playful? comical? whimsical? In any case, it's probably the most fun piece that I've ever written and published. I attribute that largely to the bizarre court case at the center of the essay, which I swear must have been plucked from the pages of a Lewis Carroll story.

In a perfect world I'd link to a PDF of the article, but the journal publisher, Taylor & Francis, prohibits it. In an almost perfect world I'd link you to a post-print (i.e., the final word processing version that I submitted to CSMC), but even that I'm contractually barred from doing for 18 months from the time of publication.

Taylor & Francis charges $30 for the essay on its website, which to my mind is just ridiculous. Heck, a yearly personal subscription to the journal costs $81! So, if you're university-affiliated and want to take a look at the piece, I'd encourage you to check with your own institution's library. If you're not, I'm allowed to share a limited number of offprints with colleagues, and you can email me for one.

To complicate matters even more, the printed version of "Harry Potter and the Simulacrum" has the wrong copyright declaration. I signed Taylor & Francis' double-secret "license to publish" form instead of the usual copyright transfer. Despite that, the piece still says © National Communication Association, which is the scholarly society under whose auspices CSMC is published. Sigh.

Suddenly this is starting to sound like a Lewis Carroll story....

Thursday, February 11, 2010

Where the Cylons will come from

I missed most of the SyFy (née Sci Fi) series Battlestar Galactica (2004-2009), though I managed to catch enough to know that I wanted to watch the new prequel, Caprica, from the beginning. I haven't been disappointed. With the pilot and two episodes now under my belt, it's safe to say that I'm hooked.

Caprica provides an origin story for the Cylons, a cyborg race created by humans that later attempts to annihilate its masters. That may sound pretty de rigueur as far as the sci-fi genre goes, but here's the twist: we learn that each Cylon's "being" -- his, her, or its unique identity or essence -- is actually the aggregation of a human individual's medical records, purchasing patterns, educational transcripts, voting records, electronic communications, and other personal information archived online. The Cylons are, in other words, the walking, talking, informational avatars of the human race.

It was with all that in mind that I happened upon the clip embedded below, which is from the February 2, 2010 episode of The Colbert Report. The title, "Cognoscor Ergo Sum," translates from the Latin as "I am known, therefore I am." How apt. In the segment Colbert spotlights Blippy.com, IJustMadeLove.com, and other websites that allow people to reveal and record the intimate details of their daily lives. Blippy lets you broadcast what you've just purchased using your credit card, and where. IJustMadeLove lets you shout from the electronic rooftops when, where, and how you've just done the nasty. (Yes, I wish I were making that one up.)

[Embedded video: "The Word - Cognoscor Ergo Sum," The Colbert Report, via colbertnation.com]


There's been all sorts of talk for years now about the vulnerability of information online, and it's no surprise given the proliferation of networked databases that identity theft has emerged as one of the foremost crimes of our time. What's even more striking to me, however, is how Caprica and the Colbert clip together seem to shift the meaning of -- and even up the ante on -- identity theft.

Now, don't get me wrong. I'm not suggesting that we humans are poised to give rise to a line of super-machines intent on wiping us out. What I am suggesting, though, is that we can only begin to imagine how and for what purpose the digital data trails that we leave behind today will be used in the future. I like to think about it this way: when I started college, how could I have anticipated a rash of photos and videos surfacing close to 20 years later on Facebook? Heck -- there was barely an internet back then, let alone affordable scanners or even the idea of social networking.

Leave it to popular culture, then, to register one of the critical questions of this new decade: how does a society plan for an information future that may well be unfathomable, technologically speaking?

Monday, February 08, 2010

Oprah has landed

It's always intriguing for me to see how life influences the direction of one's work. When I was growing up in the 1980s, 4:00 p.m. meant one thing: The Oprah Winfrey Show would be on the television set in my home. Sometimes my mother would take a break from cooking to watch the show in our TV room. If the meal was complicated, she'd just turn the TV up and listen from the kitchen. Either way, 4 pm meant that it was her time -- and consequently my time -- with Oprah.

Plus or minus two decades later I published an article on Oprah's Book Club in an academic journal called Critical Studies in Media Communication and, later, a chapter on the same subject in my book, The Late Age of Print: Everyday Book Culture from Consumerism to Control (Columbia University Press, 2009).

Because I've been ensconced in Oprah for so long, both personally and professionally, it's difficult for me to understand why people refuse to take her seriously. I suspect a lot of it has to do with offhanded impressions about The Oprah Winfrey Show, television talk shows in general, or indeed Oprah herself. Honestly, I don't have much tolerance for critics who disparage or dismiss the Oprah phenomenon without studying it intensively, in all of its complexity and over the long term. I don't embrace all-things-Oprah by any means, yet it seems pretty clear to me that she's transformed and even enriched U.S. culture in countless ways.

I'm excited, therefore, to see this week's edition of the media blog In Medias Res devoted to the theme of Oprah. Here's the lineup:
  • Monday: "Stories of O: Oprah's Culture Industries" by Kimberly Springer
  • Tuesday: "Too Big to Fail" by Janice Peck
  • Wednesday: "For the Sake of the Children" by John Howard
  • Thursday: "I've Been Rich and I've Been Poor: The Economics of Oprah" by Vanessa Jackson
  • Friday: "Oprah's Got Beef?: Alleged Matriarchies and Masculinist Rhymes" by Kimberly Springer
I'm looking forward to seeing how the series of posts unfolds. I find that academic authors tend to be extremely cynical towards Oprah, both the person and the broader phenomenon, and so I'm keeping my fingers crossed here. Hopefully the contributors will give such complex subject matter its due.

You can expect to see me leaving comments on IMR throughout the week, since, clearly, this is a topic that's been with me for a good long while. I'd encourage you to chime in, too. In the meantime, enjoy the Letterman-Oprah-Leno ad from last night's Super Bowl.

Friday, February 20, 2009

Countercultures

Over the last year or so I've been thinking a great deal about countercultures, or more specifically, the countercultural legacies of the 1960s. What first prompted me to do so was Fred Turner's outstanding book, From Counterculture to Cyberculture (University of Chicago Press, 2006), which I blogged about here back in January 2008.

Since then I've had the good fortune of reading a number of books, all of which explore the persistence of countercultural practices and sensibilities from the 1960s. These include: Preston Shires' Hippies of the Religious Right: From the Counterculture of Jerry Garcia to the Subculture of Jerry Falwell (Baylor U.P., 2007), a wonderful book that I just finished, about the meteoric rise of evangelical Christianity in the late-20th century and its roots in the 1960s counterculture; and Joseph Heath and Andrew Potter's Nation of Rebels: How Counterculture Became Consumer Culture (Collins Business, 2004), a provocative look into how an anti-establishment, "rebel" ethos has come to pervade what used to be called mass culture.

Most recently I took up Thomas Frank's The Conquest of Cool: Business Culture, Counterculture, and the Rise of Hip Consumerism (University of Chicago Press, 1997). I'd been putting it off for some time, mostly because I know Frank looks unfavorably on cultural studies (my primary intellectual identification). Rightly or not, he claims that cultural studies, in its concern for "resistant" readings and uses of mass cultural artifacts, misrecognizes the politics of culture. Since the late 1950s, Frank shows, advertisers have not only been touting their own anti-establishment sensibilities but also infusing them into their advertising campaigns. Advertising, he argues, is a principal--and unusually effective--site where the critique of mass culture has been waged. Of course, this critique exists not for the sake of tearing down "the system," as it were, but rather for encouraging ever more consumption by way of product and consumer differentiation.

Frank may caricature cultural studies, but the larger point he makes is a compelling one. The so-called "creative class" about whom Richard Florida has written so much in recent years has its origins in the late-1950s and early-1960s, when (in the case of Frank's book) upstart ad men and women lashed out against the stultifying organizational and scientific structures within which they worked.

But what's also intriguing to me is how it wasn't simply advertising per se that led the way. Indeed, there was something of a countercultural, "creative revolution" happening in any number of other industries at the same time. Last summer I blogged about Gerard Jones' history of the comic book industry, Men of Tomorrow. I didn't realize it then, but Jones tells a story similar to that of Thomas Frank. Before the 1960s or 70s, most comic book companies employed writers and artists whom they treated like hacks. A good deal of the material was formulaic and dictated from on high, and the "creatives" were meant merely to execute that vision. And though I'm less familiar with the music industry, I gather that there's a similar story to be told there as well. If Tom Hanks' silly little movie That Thing You Do! (1996) is any indication, record producers of the 1950s pretty much ran the show, subordinating talent to what they knew--or thought they knew--they could package and sell. Is it any surprise that, at the end of the film, the character Jimmy (Jonathan Schaech) breaks from Mr. White's (Tom Hanks) Playtone record label to pursue a successful solo career making serious rock 'n roll? He's the film's embodiment of the creative revolution that was about to happen in music.

I'm not sure where all this reading is going, honestly. Nevertheless, all of the books I've mentioned suggest that we now live, as it were, in the long shadow cast by the 1960s. That makes me wonder: what, if anything, will be the unique contribution of this moment in which we're now living? How does one create, let alone "rebel," when the dominant ethos is already "anti-establishment" and throw-out-the-rules "creative?"

Saturday, January 10, 2009

Lessig on Colbert



Perhaps the only thing more daunting than squaring off in front of the United States Supreme Court is having to go head-to-head with Stephen Colbert on his television talk show. Lawrence Lessig handles things beautifully in discussing his latest book, Remix: Making Art & Culture Thrive in the Hybrid Economy (Penguin, 2008). Bravo, Professor Lessig.

Be sure to check out Lessig's Blog for some creative remixes of the segment.

P.S. Happy 2009, y'all!

Tuesday, December 02, 2008

"...not a democracy"




There was a telling moment in last night's Inside the Actors Studio interview with Daniel Radcliffe, who plays Harry Potter in the film adaptation of the bestselling book series. About midway through the video sequence embedded above, host James Lipton asks Radcliffe how he felt about the various romantic pairings author J. K. Rowling had crafted for her characters. Lipton then admits that he once believed Harry and Hermione Granger would eventually end up together, whereupon the studio audience applauds. "Vox populi," Lipton observes.

Radcliffe's response? "The Harry Potter series is not a democracy." Truer words haven't been spoken.

Friday, August 29, 2008

Hari Puttar takes Bollywood by storm...maybe

From Monday's BBC Entertainment News:
Warner sues over Puttar movie
Warner Bros says it wants to protect intellectual property rights.

Harry Potter maker Warner Bros is suing an Indian film company over the title of upcoming film Hari Puttar - A Comedy Of Terrors, according to reports.

Warner Bros feels the name is too similar to that of its world famous young wizard, according to trade paper The Hollywood Reporter.
Thanks to Simon Frost at the University of Southern Denmark for passing the story on to me; you can read the complete version here. I'm in the midst of finishing up a project right now, but some commentary on the suit should follow soon, hopefully.

Wednesday, July 09, 2008

Men of Tomorrow

Wow.

It's rare that I read a book and feel compelled to reread it immediately. But that's what happened when I finished Gerard Jones' Men of Tomorrow: Geeks, Gangsters, and the Birth of the Comic Book (Basic Books, 2004). It offers a fascinating look into a nascent industry full of fast-talking hustlers, shrewd accountants, and nerdy young men all struggling to make their mark on U.S. culture in the 20th century.

Jones is an outstanding writer. I say this having read a fair amount of work by other comic book authors who've decided to switch genres, turning either to novels or to nonfiction. Usually the work isn't a disaster, but then again, neither is it all that memorable. It's a different story for Jones. He penned Batman, Spider-Man, and Superman early on in his writing career, where he developed a knack for exposition and an ear for engaging dialogue.

He uses both skills to his advantage in Men of Tomorrow. The book moves nimbly between large-scale social/cultural history and more intimate, narrative reconstructions of the lives of the early comic industry's key figures. What results is a precarious yet perfectly executed balancing act. Jones' account is rich with historical detail, yet he never loses the plot.

The book opens with an aged Jerry Siegel, co-creator (with Joe Shuster) of Superman, learning that a blockbuster movie featuring the Man of Steel would soon be making its way onto the silver screen. It was the mid-1970s. Siegel was working as a mail clerk in Southern California, barely making ends meet and seething inside about having signed away rights to the lucrative character decades before. Men of Tomorrow then takes a sharp turn back in time and space: to New York City's Lower East Side, circa the early 1900s, where we're introduced to the sons of Jewish immigrants who'd go on to become the authors, illustrators, editors, printers, and distributors of a peripheral print genre that would, with time, become a part of the American cultural mainstream. Eventually the book returns to Siegel's desperate, last-ditch effort to secure rights to Superman--a success, it turns out, owing to the rallying of fans and others to the cause.

Jones isn't only an outstanding writer; he's also a talented historian and analyst. He's read practically all of the secondary literature, scholarly and otherwise, on comic books. He interviewed most of the early industry's key players at one time or another, in addition to their family members. He meticulously reconstructs contested information and never tries to pass it off as anything but contested. Beyond these more insular, disciplinary concerns, his research displays a remarkable sensitivity to comics' critical reception by midcentury academics and politicians who, owing to experiences far removed from those in the comic book industry, fundamentally misunderstood the genre's psychosocial and cultural impact. Jones is a historian with a deft touch.

Men of Tomorrow ends with a provocative claim, namely, that U.S. culture today is significantly the product of geeks. And in this respect it shares something of a kinship with another book I admire: Fred Turner's From Counterculture to Cyberculture, which I've mentioned in passing on this blog. In their best moments, both texts capture something rare. They manage to put into words what Raymond Williams called a "structure of feeling"--what it felt like to live (for some, at least) in 20th century America.

This is the mark of history at its best. Excelsior!

Saturday, January 05, 2008

Should I join Facebook?

I'm undecided on the issue, personally, which is why I'm asking all of you to weigh in. On the one hand, it's enough for me simply to maintain this blog, let alone to contribute to Sivacracy (oh--and earn a living). On the other hand, a recent peek at a friend's Facebook page showed me that, well, essentially everyone I know in the universe belongs. No one's directly pressured me to join, yet I feel compelled to be a part of something so many people seem to be engaging in. (Yes, I succumb fairly easily to peer pressure.)

In other news, I've made a few minor changes to add further interactivity to D&R. Each post now contains a footer with email, Digg, and subscription links. I've also changed my site syndication, which is now handled through FeedBurner.

Happy 2008, everyone, and let me know what you think about Facebook.


P.S. For more on this thread, see my post from May 2008, "Why Did I Join Facebook?"

Wednesday, August 22, 2007

Reality TV: The new opinion poll

It's over. Summer break, that is. Orientation for new graduate students in my department here at Indiana University started today, which means fall semester has begun for all intents and purposes. Honestly, summer really ended for me about 10 days ago, when an avalanche of emails arrived last Monday morning about things that needed to happen NOW before the semester started. On top of that, my department moved buildings. More on that later.

The summer was a reasonably productive one, as I'm sure readers of D&R already know. When I wasn't writing, reading, prepping for fall classes, or traveling, I spent a good deal of time watching reality TV. It seems as though that's becoming an annual occurrence for me, as one of my posts from last summer attests and as my colleague, Jon Simons, reminded me today during one of our orientation sessions. This year I got sucked into two cooking competitions, Fox's Hell's Kitchen and Bravo's Top Chef, in addition to On the Lot (a competition to become a feature film director) and So You Think You Can Dance. (Yes...I watched So You Think You Can Dance. Snicker all you want.)

Most of these shows wrapped within the last week, and so with a little critical distance under my belt, I'm moved to reflect on their significance as a genre. I'm especially intrigued with shows like On the Lot and So You Think You Can Dance, both of which, like American Idol (Pop Idol for my readers from across the Pond), base their weekly contestant eliminations on audience call-ins, text messaging, and internet voting.

This is marketing research, and a clever form of it at that. It's so clever that rather than costing money, it actually generates income for show producers who subsequently sell the already-proven skills of the contest winner in the form of CDs, music downloads, movies--you name it. Think about it for a moment. Rather than someone from some random opinion-polling firm calling you up during dinner, bothering you with questions about whether you'd prefer to see this or that type of film, TV program, or performing artist, viewers contact these shows of their (our) own volition to provide essentially this type of information. We do it en masse. Now, this isn't perfect research, to be sure. People typically can vote as often as they'd like within an allotted period of time. But even so, what's essentially happening is that the unsexy drudge-work that used to be hidden away in mass culture's "back office" (i.e., opinion polling) now is emerging front-and-center as a key aspect of the entertainment value of these shows. And of course, it's never called "opinion polling" or "market research." In good "democratic" spirit, these shows always stress audience interactivity and empowerment. (I wish I had a dollar for every time I heard Ryan Seacrest proclaim, "America voted, and here are the results....")

All this is part of a larger set of trends. From bar codes becoming things that people other than cashiers now pay close attention to, to the widespread, public testing of "beta" versions of products and more, the boundaries between what used to be called "production" and "consumption" are increasingly fuzzy. And oftentimes, it seems, this fuzziness provides not only for a richer, more potentially informed and interactive relationship with TV programs and other cultural consumables; it also opens up weekly, hour-long opportunities to test-market products in front of millions of viewers.

Focus groups are just sooooo 20th century, aren't they?

Tuesday, July 17, 2007

Harry Potter...stolen!

I wasn't planning on writing for another week or so, but this one's too good to pass up. I just caught this article in The New York Times about the final installment of the Harry Potter book series, Harry Potter and the Deathly Hallows, having made its way onto the internet. Someone got their hands on a copy of the book sometime before this Saturday's highly-anticipated release, photographed a good chunk of the pages, and then posted them online. I've checked around and, sure enough, there they are--at least, that is, until Potter's publishers get their act together and the takedown notices start flying!

Now, to all you Potter fans out there, you can rest assured that I'm not going to spoil any of the secrets. I like the books myself and respect your love of the series too much to do that. And to those of you who are hoping I'll spill the beans, sorry. You'll have to go elsewhere for that. My point in writing is to comment a bit on the Harry Potter security phenomenon. I talk about this at length in my upcoming book, The Late Age of Print: Everyday Book Culture from Consumerism to Control, which includes a chapter called "Harry Potter and the Culture of the Copy." Here I'll make just a few offhanded observations.

First, I take this security meltdown, and those preceding the release of the previous two Potter installments, as an effect of what in The Late Age of Print I call "the mass production of scarcity." Think about it: 12 million copies of Deathly Hallows have been printed in the U.S. alone. By now they're in bookstores all over the country, doing absolutely nothing as they sit locked away in stock rooms...other than generating hype.

Since Harry Potter and the Goblet of Fire, the boy wizard's publishers have been enforcing what the book industry calls "global lay-down dates," which, the publishers say, ensure that the books' surprises remain sacrosanct. Clearly, they don't. Even so, global lay-down dates do perform a kind of magic: they make Harry Potter, a mass-produced commodity if there ever was one, disappear despite his sheer ubiquity. And as anyone who's taken Business 101 will tell you, scarcity tends to augment demand.

My second observation pertains to the fact that Harry Potter and the Deathly Hallows made its way online in the form of digital photographs rather than, say, scans. The folks over at PC World have noted that, in doing so, the culprit may well have inadvertently revealed her or his identity:
In an interesting development it appears that the person who took the pictures of the book left his camera meta info attached to the image files. This is significant because with the camera meta data you can extrapolate the serial number of the camera. And with that information and time authorities could track down who took the pictures.
Little did I--someone who studies digital culture--know that digital photos contain this kind of personal information. I suppose it's naive of me not to have realized this, since privacy is nothing if not compromised online. In the end, what a cautionary tale it will be if the pernicious Potter pilferer is apprehended because of the digital trace she or he has left behind.
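For the technically curious, here's a rough, hypothetical sketch of how easy it has become to peek at that kind of metadata yourself -- this isn't from the PC World piece, just an illustration in Python using the Pillow imaging library. The filename is made up, and fields like BodySerialNumber appear only if the camera actually recorded them:

    from PIL import Image
    from PIL.ExifTags import TAGS

    EXIF_IFD_POINTER = 0x8769  # standard tag pointing at the Exif sub-IFD

    def dump_exif(path):
        """Print whatever metadata tags an image file carries."""
        with Image.open(path) as img:
            exif = img.getexif()
            tags = dict(exif)
            # Camera-identifying fields (e.g., BodySerialNumber) live in the
            # Exif sub-IFD, when the camera bothers to record them at all.
            tags.update(exif.get_ifd(EXIF_IFD_POINTER))
            for tag_id, value in tags.items():
                print(f"{TAGS.get(tag_id, tag_id)}: {value}")

    dump_exif("photo.jpg")  # hypothetical filename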

And finally, despite most, if not all, of Harry Potter and the Deathly Hallows' secrets already having been revealed, I have nothing but confidence that all 12 million copies of the book will eventually sell--and then some.

Thursday, June 07, 2007

Second class music?

First off, apologies, apologies. I've been swamped with writing projects of late, and so the prospect of writing still more just seemed too out of reach. Now that I'm out from under the really heavy stuff (at least for the moment), I figured I should get back into the swing of things on D&R. Thanks as always for your patience, dear readers.

I'm likely to get some smirks for telling the world this, but I download music from Apple iTunes. I know they're not the friendliest of companies when it comes to music downloading, especially since they've long maintained Digital Rights Management (DRM) schemes that regulate what you can and cannot do with your paid-for music. I'm not a huge music downloader, though, and so I've never really bothered to look elsewhere, despite my professed uneasiness with DRM.

All that's just a lead-up to tell you that I receive regular emails from iTunes, telling me about new music releases and other pertinent news. The other day, this message arrived in my inbox:
Now you can download music and videos from EMI that are free of DRM rules and restrictions. With iTunes Plus, you can burn the music you download from iTunes to as many CDs as you need, transfer it to as many computers (Mac or PC) as you want, or sync it to as many devices as you like. And because it's encoded in 256 kbps AAC, your iTunes Plus music is virtually indistinguishable from the original recording. Hear it for yourself — you can preview all iTunes Plus songs before purchasing. iTunes Plus music is available now for many EMI artists, such as Paul McCartney, the Rolling Stones, Norah Jones, Coldplay, and many more. DRM-free EMI music videos are still $1.99 and music tracks are $1.29.
I'd been aware of Steve Jobs' mention a few months back of how he thought music should be stripped of its DRM. Needless to say, I was pleased to see some movement on the issue from Apple.

But then I started to think about it further. Regular, DRM-laden music downloads are 99 cents on iTunes. That means, if you want to be free of DRM, you have to pay 30 cents more per song. That's not a lot of money, admittedly, though if you're a real music aficionado, I suppose it could add up over time. Anyway, what bugs me is the principle; what's happening with schemes such as this is that Apple and other companies are creating (at least) a two-tier system of property owners. Those with more money can own their songs and videos more or less free-and-clear. Those unwilling to ante up the additional money, on the other hand, become indentured to iTunes and the record companies with respect to DRM-induced terms of use.
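Just to put a hypothetical number on "add up over time," here's the back-of-the-envelope arithmetic in a few lines of Python, assuming a made-up library of 1,000 tracks at the prices quoted above:

    standard_price = 0.99   # regular, DRM-laden iTunes track
    plus_price = 1.29       # DRM-free iTunes Plus track
    library_size = 1000     # hypothetical collection size

    premium = (plus_price - standard_price) * library_size
    print(f"DRM-free premium for {library_size} tracks: ${premium:,.2f}")
    # prints: DRM-free premium for 1000 tracks: $300.00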

Something strange is happening to property, in other words. We're slowly creating a system in which there are "haves" and "don't quite haves." I'm also troubled by the way in which these companies are beginning to leverage the mere prospect of DRM to extract more money from consumers.

I'm not altogether sure what my solution to the issue would be. I'd be inclined to say get rid of the DRM altogether, though I'm sure that wouldn't sit well with intellectual property producers and distributors. Then again, maybe that wouldn't be such a bad thing after all.

P.S. If you want a copy of the article to which I linked above, you can email me at: striphas@indiana.edu

Thursday, January 25, 2007

Just say no to The Matrix

I'm writing to declare a moratorium on scholarly books and essays on The Matrix.

Why? First, it seems as if every other journal and book catalog I receive these days contains some new screed on one or more installments of the film trilogy. After I pointed out this phenomenon, a friend of mine in rhetoric aptly commented, "It's as if The Matrix were becoming to the humanities what Abraham Lincoln's 'Gettysburg Address' has long been to studies of public address in the United States"--which is to say, groundbreaking at one time, but at this point, overdone. Indeed, the sheer volume of Matrix scholarship seems to be transforming the film into something of a trite object, so much so that the phrase, "the Matrix has you," is becoming our scholarly reality.

Beyond that, though, a good deal--though certainly not all--of this scholarship tends to be rather boring anyway. Part of this has to do with the fact that The Matrix wears much of its potential scholarly insight on its sleeve. "Oh my! Is that Baudrillard's Simulacra and Simulations? The film must be saying something about postmodernism!" "Is that Cornel West I see? There must be something philosophical going on here!" "Hmmm....how real is our so-called waking life? Maybe the films are about epistemology!" "Cause and effect, is it? Aha! Etiology at work!" "So I've already made all my choices in life, and now all that's left to do is to find their meaning. Perhaps the films are about ontology after all!" And so on. This isn't to say The Matrix trilogy isn't valuable for, say, teaching purposes, and this isn't to say that there aren't good questions to be asked of and through the films even today. But at this point, scholars interested in writing still another book, essay, or what have you on The Matrix would do well to proceed cautiously...very cautiously.

Lest you think I'm just a tired old crank, I will say that my favorite piece on The Matrix is Jennifer Daryl Slack's "Everyday Matrix," which is included in her edited collection, Animations [of Deleuze and Guattari]. It's a wonderful look at the mobilization of affect in, through, and beyond the first film, and in this respect it differs from many of the more textual "readings" or straightforward "philosophical" ruminations that tend to dominate the burgeoning field of Matrix scholarship.

And yes, indeed, it's fast becoming a field--or maybe even an industry. Heck--if you need a quick publication, something on The Matrix would be a safe bet.

Monday, November 06, 2006

Dee, me, & the PMRC

First of all, if you're living in the United States, vote tomorrow. That's what's really important.

Now on to matters at hand. I was watching one of those "totally 80s" countdown shows on VH1 the other day, when I heard the Twisted Sister anthem, "We're Not Gonna Take It," start blaring. It was such a blast from the past, especially seeing lead singer Dee Snider all decked out in the band's drag-show-gone-wrong regalia. I never was much of a Twisted Sister fan myself, though several of my friends had a penchant for drawing the band's "TS" logo all over their notebooks when we were in junior high. Even so, there's something so wonderfully anti-establishment about "We're Not Gonna Take It" that it always manages to put a smile on my face.

Or so I thought. The "We're Not Gonna Take It" clip also included a "where are they now?" segment, which focused mostly on the comings and goings of Dee Snider since the heyday of Twisted Sister. Evidently--and perhaps this is news only to me, since I live in Indiana--he's a staunch Republican who's campaigned for Arnold "the Governator" Schwarzenegger and other Republican candidates. I was shocked to hear this, not only because of the song's message (and here I'm reminded of the adage, "the politics of media texts aren't inscribed in media texts..."), but also because of Snider's resistance to the Parents Music Resource Center, or PMRC. For those of you who don't remember, the PMRC was founded in the mid-1980s by spouses of prominent US senators (then-Senator Al Gore's partner, Tipper, chief among them) who campaigned to censor "explicit" music. One of the more intriguing moments that I can recall from my adolescence is seeing images of Dee Snider emerging from the US Capitol after testifying on behalf of musicians opposed to the PMRC. Talk about dissonance.

I suppose it was naive of me to assume that Snider's resistance to media censorship would carry over into a more general, left-leaning politics. Beyond that, I'm also reminded of the fact that the PMRC was composed of both Republicans and Democrats, so I guess there should have been no reason for me to assume that Snider would have been a Democrat, anyway. I guess that all just goes to show how formal governmental politics and the politics of culture aren't always commensurable and how, conversely, they sometimes make strange bedfellows.

Wednesday, September 06, 2006

V for, "Does it really matter?"

Last weekend I rented V for Vendetta, the Natalie Portman/Hugo Weaving vehicle that's based on comic book impresario Alan Moore's graphic novel. For those of you who haven't seen the movie, it's set in the not-too-distant future and is about the people's struggle against a totalitarian state--Britain, to be exact. V, the main character, is a modern-day Guy Fawkes who inspires the oppressed masses to rise up and to confront the homophobia, religious intolerance, fear-mongering, and lack of civil liberties that have beset jolly-old England.

What's abundantly clear is that the film is a warning about the slippery slope countries like Britain and the United States find themselves on these days. The future Britain it portrays--where copies of the Koran are banned, sexual minorities must live underground, art is suspect, and eavesdropping on the populace is the order of the day--is, in some respects, embodied in our present, though perhaps not in quite those extremes.

You might say that the film offers a scathing critique of the current policies of the British and U.S. governments, especially many of the initiatives that have begun under the auspices of the "war on terror." My question is this: Does it really matter?

Perhaps I've been out of the loop, but I don't get the impression that V for Vendetta has sparked much of a serious public dialogue about democracy's slide toward totalitarianism in either country. Perhaps that's asking too much from one film. But for me it raises a larger question: to what extent are the media genuinely effective in producing concrete shifts in governmental policy? Another way of putting this would be to say: to what extent is cultural politics able to change formal governmental politics or policy anymore?

V, for me, is an intriguing test-case. To the extent that it hasn't seemed to produce much public outcry (or effective public outcry), my inclination would be to say that the power critics once attributed to cultural politics may be on the decline. Don't get me wrong. I still believe cultural politics matters. By the same token, a film like V suggests to me that cultural politics may not matter in the way that it once did.