The Fog of Music Criticism

Rob Horning of PopMatters posted a really thoughtful response to a Peter Suderman post on the inherent positivity of music criticism (in response to a Joe Queenan article, aaaaa! blogosphere!). First, Suderman’s claims:

Scan the sidebar of Metacritic’s music page. Nearly all of the review averages are positive or very positive, and almost none of them are straightforward pans. In fact, right now I don’t see a single album with a review average that gets a score categorized “generally negative reviews.” Contrast this with the movies page, which contains more than a dozen films with low averages. Even the limited release indies — the “artsy” films — are often given low marks.

Is contemporary pop music really that much better than contemporary mainstream filmmaking? I think not. Instead, it’s just that the music reviewing culture has developed in such a way that most everything scores a “pretty good” or a “not bad.”

Horning responds:

Unlike films, many many records get released, and just noticing one and running a review of it already marks it as significant. The substance of the review itself is almost beside the point. Acknowledging its existence is already an admission that it’s “pretty good,” so it would be strange for the review to suggest otherwise.

It might amuse some readers to see well-established artists attacked, but who wants to read negative reviews of stuff they haven’t heard of? There’s no point, and the reviewer just comes across as mean. I certainly felt this way about myself when I was writing the negative reviews. It seemed dumb for me to be discouraging these performers, who had no chance of making it, really, no matter what I wrote about them. It’s no fun pissing on people’s dreams. In fact, it made more sense to try to champion all bands, so I could potentially claim some of the glory for helping one of them make it.

Exactly. In general, films are massive undertakings with insane budgets and scads of people involved. For example, I recently watched 2004’s Primer, an amazing lo-budge sci-fi film made for around $7,000. But it still required a set of actors, a crew, tons of gear, etc. I am currently producing an album using solely the computer with which I am writing this blog. For, um, $0. For every indie film produced there are hundreds of indie albums. Why review one just to crush it? It would be rather sadistic (although most reviewers do display a hint of sadism, IMO).

I have a point of contention with this though:

Readers often want hype, not evaluation, because it gives pop culture a sure-fire context, whereas a review that traces musical influences and parses lyrics only helps a select few readers. Besides, there are no established criteria for what’s good beyond popularity or fidelity to genre expectations. Maybe Suderman thinks it’s possible that music reviews could be objective evaluations of quality, as defined by some unimpeachable universal standards, but I don’t believe these exist for pop music (or for much of anything in culture—aesthetic criteria are political creations). The pop music people consume is typically a tribal thing or a means to participate in the zeitgeist, and it’s hard as a reviewer to shape the zeitgeist from the margins.

I think what plagues much music criticism, both from professional reviewers and in the minds of listeners, is the lack of objective criteria by which to judge a work of music. The apparent criteria have become almost purely social: work is judged by its supposed “honesty”, self-consciousness, unpretentiousness, and authenticity rather than by traditional musical merits. The question becomes: is this rapper/singer actually from Brooklyn/Manchester, or does he just say he is? It’s not about the musical product; it’s about the narrative. The highest-rated albums are often the albums that are the most fun to write about. Yeah, you might feel like a douche praising a young white-boy rapper from Hempstead whose daddy bought him a record contract, but if the product is good, suck it up.

The most obvious example of this need to sustain narrative is J Dilla’s album Donuts. The story goes that he made the album on his laptop in the hospital whilst terminally ill. Now, who wants to write a narrative about his sad, valiant efforts culminating in an album that’s, well, a piece of shit? More importantly, who wants to read that?

Now, not to get all conservatory-trained-musician on y’all, but why not focus on the product? Yes, a good back story can enhance the appreciation of music – Beethoven’s deafness, for example, or Brian Wilson’s mental illness – but to ignore technical criteria in music, even pop music, is asking to be led around in a fog of subjectivity and ambiguity.

Here are a few objective things pop reviewers should listen for:

1. Originality: Not for its own sake, of course, but the band/artist should sound like itself. If the R&B singer sounds like Stevie with a hip-hop beat, or the garage band sounds like The Velvet Underground with Auto-Tune, it is not original. Of course, originality for its own sake can be just as tedious, so keep an eye out for extra-musical distractions: costumes, romantic back-stories, different-colored eyes…

2. The Singer: Can the singer get the same effect live as on the album, without the album’s effects and Auto-Tune? This doesn’t mean the singer has to be classically trained, or even good. It just means he/she has to be a real performer. Also, as stated earlier, the voice should be original; if I hear one more Blink-182-influenced pop-punk singer, I think I’ll open my veins.

3. Production: I dig gritty production, but there’s a big difference between lo-fi and bad. I can also appreciate intensive production, but not when it becomes glossy. Also, knowing some basics about electronic music can help you sift through the hordes of house and trance tracks built around presets and simple filter tweaks. Learn your gear: it is far more important to know the basic varieties of effects, synths, and editing techniques than it is to have heard of every last indie band to come out of Ann Arbor in the late ’90s.

4. Musicianship: I can appreciate Teenage Jesus and the Jerks for what they are, but that doesn’t mean the bar should be set at their level of technical proficiency. If an artist or band is lacking in technical skill, they had damn well better make up for it tenfold with originality, creativity, and lyrics. Yes, Meg White is kind of a sucky drummer, but she also amazes me at the same time (how can she drag the and-of-3 the exact same way every bar??).

5. Lyrics: The rhyming of “fly”, “high”, and “sky” in sequence should be a federal crime. I don’t care if you’re being ironic. Lyrics that sound as if they were written on a pad of paper and then forced into some chord changes are nicht gut.

6. Composition: It doesn’t have to be symphonic – in fact, pop albums can easily sound overwrought when inundated with orchestral instruments and weighted with complex 8-minute tunes – but the reviewer should comment on the craftsmanship of the songs. Structure, pacing, arrangement, and the overall vibe should be taken into account, much more than the off-microphone lives of the musician(s).

7. Authorship: Did the band/artist write their own tunes, or were they written by a professional songwriter? I think it is utterly hypocritical for many reviewers to lavish importance on the extra-musical elements of an artist’s life and how they supposedly enhance the musical experience for the listener when the songs themselves were written by some old white dude living in Brentwood.

PopMatters is actually one of the music review sites I respect (and not just because they gave AWS a great review…). They do their research, and for the most part their reviews are pretty down-to-earth. Contrast them with Pitchfork, the leading bullshit-driven review site. Pitchfork has a cadre of fantastic writers. Really, I’m in awe of their skills. But a review should not be a place to display your writing skill; a music review should be an arena to display your musical knowledge and your talent for objective critique. If a reviewer’s ‘musical knowledge’ consists of the names of thousands of bands, the names of the thousands of members of those bands, and their respective histories, then their knowledge is – to borrow a word from Horning – political. It is not musical. The increased prevalence of non-musical experts in music critic positions has turned the music reviewer into an analyzer of the sociology of music rather than the art of music.


One response to “The Fog of Music Criticism”

  1. 23/7

    great post. have to agree with most of what you said, especially about how critics might try actually analyzing/criticizing the music instead of whining about off-stage personas and unrelated gossip garbage.
    check out my “reviewing the reviews” post where i “review” about 20 reviews of the same album.
    http://23-7.blogspot.com/2008/10/reviewing-reviews-cardinology-by-ryan.html

    and of course feel free to poke around the rest of the blog too. nice to see someone else mixing music and politics…. ok, gonna get back to sifting thru some more of your blog!
