Monday, May 18, 2009

Toward Consensus? Impossible.

Let's try a thought experiment. Imagine you assembled a list of a city's best beers. Then you polled a bunch of people to find the consensus of which of these they would recommend. Here's the experiment part: how many of those beers would have high levels of agreement--say 75% or more?

I would have guessed you could get at least a couple beers in every style--essentially broad agreement on the "best beers." Well, Matt Wiater at Portlandbeer.org actually did this, and guess what: not much agreement. Of the top 15 beers, only two met my hypothetical standard. Most were recommended by only a bare majority of people. Mirror Pond, for example, surely one of the more famous, beloved, and best-selling beers in all of Greater Beervana, managed a recommendation from only 50% of the people.

So who were these half-wits? Bloggers, mainly (including me).

The lesson is clear to me: there is no "best" of anything. "Bests" are reserved for track meets, where you can actually measure performance. In beer, the master is the taster. What's best is what your tongue likes. I tend to think we can talk about some general standards of quality, but specific beers?--clearly this isn't so easy to figure out.

So the next time (and there will be a next time) we get in a spat about a specific beer, we should recall this lesson. Different strokes, folks. And ain't it nice we have so many breweries to serve these different tongues?

Go have a look at the recommendations. You'll probably be surprised.

5 comments:

  1. Hmmmm, a few pale ales on the list, but they don't dominate like I thought.

    One thing that bothered me a little is that several good beers and breweries are missing from the list, and interestingly there isn't a ton of diversity in styles.

  2. Well, a small sample size gives strange results. C-Note is a dandy beer, and I recommended it, but is it more recommendable than Fred?

    I think the "Top 3 at each brewery" list is potentially more useful to the intended audience.

  3. On the homebrewing side, I think this is why my beers score inconsistently from contest to contest. Yes, they are supposed to be judged to determine how well they fit the standard for the style, but I think a great deal of subjectivity on the part of the judges comes into play here too.

  4. @Senor brew:

    "but I think a great deal of subjectivity on the part of the judges comes into play here too."

    The judges' personal flavor preferences definitely come into play, especially in a style like IPA where you have so many different varieties of hops and methods of utilizing them to create a wide range of flavors and aromas. However, you can't overlook the strength of the competition as well.

    Having judged a few events, I know how difficult it can be when you've got a flight of 6-8 amazing beers and you can only send 2 forward. Sometimes in order to reach a consensus amongst the judges you have to look beyond style guidelines and factor in something a little less quantifiable, like drinkability.

    In other words, all these IPAs are great, but I could drink pint after pint of entry A while entry B is so resiny that I might only want one. Both are "to style", but even if both received the same score on paper I'd be more likely to push entry A into the next round.

    Subjectivity comes into play in any competition where taste is concerned. Chris makes a valid point about homebrewing comps, and I’ve heard conspiracy theories about professional-level awards for years (the scuttlebutt being that commercial examples are recognizable even in blind settings, i.e. “beer A is tasty, but I know beer B is Blind Pig, so I’ll vote for that one”).

    I’d parody Dr Wort on this subject but I’m getting exhausted thinking about how much I would have to type and how many times I’d have to hit the shift key.
