Pretty much as soon as I started Popular I decided to mark each song out of ten. I had a few reasons for this. I’d not really written reviews with marks attached before and thought it might enliven things a little. I had been reading EDGE magazine, which is terribly austere about its marking of computer games and has given 10/10 scores only four times in its ten-year history: that kind of stinginess was appealing. I also thought that, as the feature kept on, a very low or very high score might be seen as an ‘event’ and a reason for a few extra links: whatever its merits as criticism, Popular is also a brazen and so-far quite successful attempt to boost my site’s hits.

The feature is still in its infancy as of writing and no arguments have been sparked by any marks yet. But it’s not too soon to confess that the discipline of giving out marks seriously distorts the way I write about pop, maybe even the way I listen to it.

That’s not a surprise. I’ve always been against marking systems in magazines – they’re a crutch for the lazy reader and a slap in the face for the careful writer. Systems that dole out their maximums eight or nine times a month seem particularly stupid – the slightest twang of a battered acoustic seems to be enough for Uncut to lasso a constellation to Earth in gratitude, and their grades become a mockery. (Of course if I were editing a magazine and was actually expected to make money I’d be much friendlier to my readers and would naturally have marks coming out of my ears.)

A mark represents totality and finality. It is a quantified summary of the entire listening experience, and a verdict on it. But a review is those things, too, you might say. I would disagree – a good review can be open-ended, can be provisional, can reflect the writer’s mixed feelings and still be useful and entertaining. A mark on the end sabotages that, which is why reviews and their grades often seem dissonant.

Giving out marks for songs is a cheap way for a critic to pose as an authority. Some actually are authorities, of course – Robert Christgau has listened to so many records now that I wonder if his famous grades are a kind of memo-to-self that he’s ‘done’ this or that one. But in general critical authority rests on shaky ground. Listening to music is after all a very easy business indeed. For some particularly subtle or complex pieces a ‘how to listen’ guide might be useful – beyond that music criticism is simply a data filter. As such it works by a kind of good cop/bad cop principle: you have to gull the presumed reader into believing you know more than them, while still convincing them you’re close enough in taste and habit to be trusted.

Popular, though, is in these terms entirely useless, which is one reason I like doing it so much. Many of these songs aren’t available to buy easily, or even download readily. The ones that are, are often known by everybody anyway. Giving them a mark is absurd, and so adds to my fun. Even so the process of mark-giving is a nerve-racking one. I’ve tried very hard to avoid even thinking about the obvious question – which song is going to be first to get a ten – but something as simple as handing out a six rather than a seven always follows much beard-stroking.

As the marks get higher, things get harder, because I’m having to quantify pleasure. The simple joy of a nonsense song like ‘Doop’, for instance – I like it a lot, but does it really deserve as high a mark as a pop classic like ‘All Shook Up’? But then doesn’t even thinking about that betray my ‘beliefs’ about pop and the thrill of the moment? B-but maybe even having those beliefs is a betrayal of them? Or maybe thinking so much about pop music is turning me into an absolute headcase.

Methodological Notes

1. The ‘Doop’ question is hypothetical: if you’re going to submit to the ritual of a marking system you must be rigorous in how you do it, and my rule is no marking until a review is finished.

2. Marks in Popular are given for how much I like the song, not how important it is in ‘pop history’. Sometimes the knowledge of that importance makes me like the song more (or less), though.

3. What do the marks mean? Ooh, good question. Here’s a sort of guide:

0: Completely irredeemable. There can be almost no justification for making it: even if for charity, those involved should have donated their entire fortunes rather than see its release. In practical terms these records are very difficult to even finish listening to once.

1: Horrible. Hard to listen to and an immediate switch-off, but by stretching empathy to its limits you can imagine who would like it and why (i.e. it ‘succeeds on its own terms’, perhaps).

2-3: Bad in varying degrees. You’d change the radio station or skip it on a compilation, and unwanted prolonged exposure might push you towards actual hatred. Tracks getting 2 rather than 3 usually have some particularly awful quality – a singer, a lyric, a noise – that stands out through the general badness.

4: Poor. Or just boring. You probably wouldn’t mind hearing it a couple of times; mostly, though, you’d just endure it.

5: Average. A ‘typical’ record of its era, maybe. Pleasant for a listen or two, certainly nothing to switch off, but not anything you’d return to either.

6: Good. An enjoyable pop record: if it came on the radio you’d say you liked it. You probably wouldn’t want your own copy but you’d have no problem hearing it fairly regularly.

7: Very Good. Would have been a highlight of the charts at the time; you’d definitely want to hear it regularly, and you might want to own it. Has some stand-out quality that sets it above its contemporaries.

8: Excellent. A record you’d want to hear repeatedly and own for yourself.

9: Superb. A record you’d never tire of hearing. You’d certainly own it, in fact it would be one of your favourites.

10: Perfect. The sort of single that justifies the existence of pop music by itself. Impossible to imagine ever not enjoying it. Difficult to imagine anyone else not enjoying it.

4. Yes, I do have a huge spreadsheet charting all the marks I’ve given.