She stole a piece of him. Now he wants it back.
That’s the tagline to what is now my least-favorite movie, based on one of my all-time favorite novels. When you understand that the woman who “wrote” and directed it thieved it literally scene-for-scene from a man’s novel, published ten years earlier, you’ll appreciate the irony. I wonder if she’ll direct the (theoretical) sequel, the one where the legit production company that owns the rights to the book sues her distributor and ensures that this hack scribe “never works in this town again.” No, I imagine they’ll have to get someone else to lens that one. Oh, and I’ve already got that sequel script on file with the copyright office, so don’t get any delusions of career resuscitation.
Most plagiarism treads a grey area, difficult to prove. Maybe the concept appears derivative, the characters are sketched in similar strokes, or a snatch of dialogue is reminiscent. The example above is an extreme one, with often-verbatim dialogue from the mouths of identically-adjectived characters who follow, in chronological order, the exact plotted paths of their literary twins on the page. Even an officially blessed adaptation would rarely shadow its source material so closely. Fortunately, the movie is terrible in every way, not even worth the 35mm stock it sullied, yet I’m conflicted about drawing even more undeserved attention to it. The larger issue is what makes this a worthwhile discussion.
We analyze media around here, per our namesake mandate. In this era of highly specialized TV networks, direct-to-DVD, web series, blogs, vlogs, and podcasts, an exponential amount of content saturates the ether. Most of it is nonfiction infotainment (like this site), and most of that, noise. Still, thanks to these outlets’ sheer existence and capacity, the so-called “demand” for fictional content is greater than ever, while its audience is more segmented than ever (one positive evolution). Unfortunately, creativity, talent, and professional execution can’t grow at the same rate as demand and production. It’s like having a restaurant’s line cook design its new menu. The ability to operate a microwave doesn’t legitimize your published cookbook. And just because you can doesn’t mean you should.
The new generation of media consumers is very tolerant of poor production values as a by-product of a greater choice of voices. They’ve grown up with YouTube and iPods and reality television dominating their diets, rendering the term “broadcast quality” meaningless. We now trade development for diversity, polish for portability. But pass or fail, truth or consequences, at least come up with your own material! If you’re going to offend my senses, it had better be original. I can sometimes overlook one side of that equation, but not both.
All artists borrow. Especially in their early efforts, influences are often worn on sleeves. It’s how we learn, and why so many student projects suck. They’re exercises, dumpster abortions at best, and not meant for public consumption (or DVD distribution). For years, every riff I wrote sounded like Van Halen, but they never left my bedroom. And while Edward’s ghost still pervades my phrasing to this day, I’m picking my own notes.
Look, the number of pleasing chord progressions is limited. Complementary colors will be used together more often than others. Time-tested techniques are mastered and passed on. We humans have a fairly small palette of expressible emotions, and only a handful of mythological themes drive nearly all fiction. We accept this, even if it’s largely unspoken. What we want are new interpretations. Change the point of view, put those twelve notes in a different order, select a new medium. Simple alterations at the conceptual level will lead to exponential originality as the silk of your ideas is spun and woven into a web of brainstorming that eventually becomes the finished work. That, or I’ll see you in court.
Let’s get some comment discussion going on this one. Release the hounds!