Does your favorite band suck? Scientific proof


As others have pointed out, empirical proof of propositions posited about rock 'n' roll is difficult to come by. What r-squared would be sufficient to prove that Limp Bizkit sucks? How would we construct a multivariate regression to test the hypothesis that Led Zeppelin rules?

What is needed is a level playing field: a matrix that can fairly and accurately measure and compare the positive attributes and various shortcomings of rock bands. I propose that the following criteria should form the basis of such a matrix.

# of Great Albums
Quality and Frequency of Live shows
Degree of Innovation and Enduring Influence on Music

1. I propose that the number of great albums should be given a slightly higher weight than the other factors - 40% seems right. Thus, the amazing number of great albums by The Rolling Stones counts for more than the single moment of brilliance by, say, the Sex Pistols or The Stone Roses. Of course, we cannot judge the overall quality of a band only by its most brilliant moments. We can't let the Rolling Stones go unpunished for "Undercover" or "Dirty Work". So we must come up with a weighted great-albums score.
Thus:
One great album: 20 points
Two or three great albums: 30 points
Four or more: 40 points

We’ll adjust the “Great Albums” score in the following fashion:

# of truly awful albums released:
One: subtract 10%
Two or three: subtract 20%
Four or more: subtract 30%

2. It pains me to put a lower weight on the frequency and quality of live shows than a band’s recorded output. The live score should reward consistent brilliance, not rare, momentary greatness. Thus, for example, though it can be said of Led Zeppelin’s live shows that they “totally ruled”, that group’s infrequent touring denies them the full 30 points for this category.

Quality and frequency of live shows:
Poor and infrequent: 0 points
Infrequent or mediocre: 15 points
Frequent and great: 30 points

3. Degree of Innovation and enduring influence
The history of philosophy consists of a debate and conversation concerned not with determining truth, but with ascertaining what is important. Quite simply, some bands are important, and some are not. A friend, whose judgment I implicitly respect, recalls being marvelously entertained at a concert by Adam and the Ants. I leave it to you to debate the great innovation and lasting influence of Adam and the Ants. Furthermore, the "lasting influence" component of this area does not imply a "positive" influence. You might variously regret, or celebrate, the flood of self-indulgent expressionism unleashed by "Pet Sounds" and "Sgt. Pepper's", but the lasting (pernicious?) influence of these two discs is a robust fact. As far as innovation goes, I happen, for example, to enjoy The White Stripes, even though I am painfully aware that their catchy retread of Led Zeppelin's own electric blues retread scores a big fat zero on the innovation charts.

So,
Innovation and Enduring Influence
Negligible 0 points (e.g., Bow Wow Wow)

Slight 10 points (e.g., Squeeze, Supertramp)

Important 20 points (e.g., The Clash, Led Zeppelin)

Profound and Lasting 30 points (e.g., Bo Diddley, The Beach Boys, The Stooges, Jimi Hendrix)
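For the geeks in the audience, the whole matrix can be boiled down to a short script. This is just a sketch of the rubric above; the function names and category labels are my own hypothetical shorthand, nothing canonical:

```python
# Sketch of the band-rating matrix described above. Function and label
# names are hypothetical shorthand for this post, not anything official.

def album_score(great: int, awful: int) -> float:
    """Great-albums score (max 40), docked a percentage for truly awful albums."""
    if great >= 4:
        base = 40
    elif great >= 2:
        base = 30
    elif great == 1:
        base = 20
    else:
        base = 0
    # Awful-album penalty: 10% for one, 20% for two or three, 30% for four+.
    if awful >= 4:
        penalty = 0.30
    elif awful >= 2:
        penalty = 0.20
    elif awful == 1:
        penalty = 0.10
    else:
        penalty = 0.0
    return base * (1 - penalty)

# Live shows (max 30) and innovation/influence (max 30) lookup tables.
LIVE = {"poor and infrequent": 0,
        "infrequent or mediocre": 15,
        "frequent and great": 30}

INFLUENCE = {"negligible": 0,
             "slight": 10,
             "important": 20,
             "profound and lasting": 30}

def band_score(great: int, awful: int, live: str, influence: str) -> float:
    """Total score out of 100."""
    return album_score(great, awful) + LIVE[live] + INFLUENCE[influence]
```

Plugging in the two bands scored later in this post: `band_score(4, 4, "infrequent or mediocre", "important")` gives 28 + 15 + 20 = 63 for the Stones, and `band_score(4, 0, "frequent and great", "profound and lasting")` gives the Velvets their perfect 100.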

Let’s plug some bands into the matrix and see what happens:

The Rolling Stones
# of great albums 40 points! (I don’t expect any arguments here)
# of truly awful albums: four or more, so 40 - 30% = 28 (where do I start?)

Quality and frequency of live shows: 15 (My older brother saw the Stones in London in '65 and went to the Hyde Park free concert. He reports that though they were good shows, the Stones were always known for not being as good on stage as they were in the studio. I first saw the Stones in '78, and I've never seen them do much more than an over-hyped, overblown, cynical, ham-fisted nostalgia revue: a sort of Sha Na Na as staged by Andrew Lloyd Webber.)

Innovation and Enduring Influence: 20
Yes on the enduring influence; far less so on the innovation. Whatever your feelings about the British blues scene (John Mayall, Alexis Korner, et al.) may be, innovative it was not. The Stones' later psychedelic offerings have a whiff of "me too" forced desperation. I'll grant them their only innovation points for their amazing Mick Taylor-era recordings. Their influence, however, remains profound and widespread. "Exile on Main Street" begat "Physical Graffiti". The Stones themselves begat, among many others, the Flamin' Groovies, Guns N' Roses, and The Strokes.

28 + 15 + 20 = 63

overall score for The Rolling Stones = 63

Now, let's chart The Velvet Underground:

# of great albums: 40 (I don't think that "Loaded" counts as a great album, but the two ex post facto discs, "VU" and "Another View", do)

# of truly awful albums: 0

adjusted score for recorded output = 40

Quality and Frequency of Live Shows: 30 (of course, in this regard, I am relying on the testimony of others; but my expert witnesses tell me that VU shows never failed to shock and amaze)

Innovation and Enduring Influence: 30 (no comment necessary)

So there you have it: 40 + 30 + 30 = 100!
tweakgeek
Wow. Quantification of rock albums. What happens when I've just had a really lousy day at work and I just want to play The Jesus & Mary Chain's "Dirty Water"?

Anyway, it's a bit too early for me to follow your formulas. It does seem rather like the poetry teacher using graphs to evaluate poems in the movie "Dead Poets Society".
I love this. It really appeals to the geek in me, and maybe even makes us think about *why* we like our favorite music.

Of course, there's no shame in liking bands that don't score high. It just helps us realize that our favorite music may not be timeless!

By the same token, perhaps we can relieve our guilt at not liking one or another "great" band or artist. Once we realize that "great" includes "influence" and "importance," we can decide that we don't care about those things in our personal rankings, though they are indisputably important to a band's place in history.

OK, please realize this is somewhat tongue in cheek here ... I'm not sure I like the "awful albums" adjustment. Is it really fair to say that a band with 20 great albums and 4 awful ones is a worse band (on that measure) than one with 4 great albums and no awful ones? Maybe the "awful album" adjustment needs to be based on a percentage of total albums put out, or a percentage of the number of great albums. Or maybe we just need to accept awful albums as the price we pay for a long and interesting musical career!

- Eric