Our “Statistically Accurate” Survey Determines The Best And Worst Cheap Beers In New York

Hey you. Yeah, you with the whiskey and Coke. Put that down. We all know you can’t really afford that seven-dollar excuse for half a shot of Evan Williams. And with it being midterm season, all you really need is an ice-cold beverage that’s barely tolerable to your taste buds, and we at NYU Local are here to help with our own personal guide to cheap beers in New York.

We’d be lying if we said we were the first website ever to come up with the idea of ranking cheap beers. But unlike sites like BeerAdvocate, which spends its time comparing Bud Light on the same scale as the best Belgian ales in the world, or Deadspin, where a single author gives his or her own subjective review, our survey comes from the palates of fellow NYU students. You can rest assured that these taste buds’ opinions are the closest you’ll ever need to get to actually trying 10 terrible beers all in the same night.

Because that’s exactly what 10 of our finest writers, who will remain anonymous, did. These brave souls participated in a blind taste test of what we surveyed to be the most widely drunk cheap beers among the NYU student body. (And don’t worry, just because we write for a blog that some accuse of being populated with hipsters does not mean that we all favor Pabst Blue Ribbon.)

And when we say cheap, we mean it; even such fine ales as Yuengling and Foster’s were excluded, though many see them as go-to options. Also, this survey is dealing only with cheap beer; if you’re looking for the cheapest way to get drunk, let me direct your attention to this fine piece of statistical analysis at the aptly named website GetDrunkNotBroke.

But back to the matter at hand. Because we’re “journalists” and all, we decided to investigate and determine how much brand loyalty is contributing to poor taste choices at our fine university, especially when it comes to bottom-of-the-barrel brews. Should Pabst Blue Ribbon be more than just the choice of the hipsters? And is Natty Ice really the worst-tasting beverage on the planet? The results? Well, they were definitely unexpected in many ways.

Now let’s get the most important info out of the way.

Our Champion: Rolling Rock 

With each beer being judged on a scale of 1-10, Rolling Rock blew away the competition, with an average score of 5.5. The beer known for its iconic green can was originally brewed in Western Pennsylvania and is an alternative to PBR, the staple of many New York dive bars. For the most part, our taste testers’ reactions were positive, describing the Rock as “smooth” and not “leaving as much of a bad aftertaste as the others.” And at $8.99 for a 12-pack at Associated Market near Stuyvesant Town, this is the perfect example of a great New York deal, right up there with dollar pizza.

Runners-Up: Pabst Blue Ribbon, Miller Lite

The two runners-up, PBR and Miller, are often considered to be the best of the cheap beers, depending on whom you ask. They also happen to be the most widely available cheap beers in New York bars and bodegas. Described as smelling “like Lucy’s (the East Village dive bar)” and tasting “vegan,” PBR was to no one’s surprise the most recognized beer by our taste testers in this blind study, with 6 out of 10 writers accurately guessing the beer and at the same time doing nothing to dispel any stereotypes about NYU students. Miller Lite, on the other hand, was completely unidentifiable to our writers, with a perfect 0 out of 10 guessing that their beer was a Miller. However, the beer did score a 4.1 on average, good for third overall, and was noted by one surveyor for being “a little bland on the flavor, but the most refreshing beer I’ve had tonight.”

The Worst: Tecate

The biggest shocker of the test was the absolutely horrendous performance by crowd favorite Tecate, which many East Village bars such as the Library keep on ice daily. The Mexican brew finished in dead last, with an average score of 2.9. According to one taste-tester, “the Tecate smelled (and tasted) like sweat,” and another member of the experiment had to be calmed down after having his teenage dreams of Tecate being simply “good enough” shattered. The best reaction, however, came from our official statistician, who quipped, “[Tecate's] frothy head reminds one of rotten oranges, coriander, and the warm piss cup that you have to give your doctor.”

Now if you want to see the rest of our rankings and results just check out our handy charts below, along with the last but very important…

Methodology

Since we’re all about transparency when it comes to important matters such as blind beer tasting, below you will find a more detailed explanation of our methodology from our official statistician (who has an economics degree, so he must know what he’s talking about, right?):

I need to preface the results of this study with the following disclosure. This study has tons of holes in it, and results shouldn’t be taken too seriously. Only 10 people participated in the blind taste testing, which is far too few to draw any statistically meaningful conclusions. However, the results were very interesting and can be seen as a snapshot of what a larger study may find (NYU, we’re looking at you for a research grant).

The study consisted of two separate parts. The first asked tasters to rate each beer on a scale of 1 to 10 for taste. The second asked them to guess which beer they were drinking. The purpose of the first part of the study was to figure out which beer was the most preferred among the group. The second part of the study was meant to see if tasters could actually distinguish, with some reasonable statistical significance, which beer was which. 
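The two-part tally described above is simple enough to sketch in a few lines of Python. The individual ratings and guesses below are hypothetical, invented only to illustrate the bookkeeping (though the made-up scores are chosen to average out to the 5.5 and 2.9 reported for Rolling Rock and Tecate):

```python
# Part one: each taster rates each beer 1-10; we average per beer.
# Part two: each taster guesses the beer's identity; we count correct guesses.
# All individual numbers here are hypothetical illustrations.

ratings = {
    "Rolling Rock": [6, 5, 7, 4, 6, 5, 6, 5, 6, 5],  # averages to 5.5
    "Tecate":       [3, 2, 4, 3, 2, 3, 3, 3, 4, 2],  # averages to 2.9
}

# (guessed, actual) pairs from one hypothetical taster's answer sheet
guesses = {
    "cup_1": ("PBR", "PBR"),
    "cup_2": ("Miller Lite", "Tecate"),
}

for beer, scores in ratings.items():
    print(f"{beer}: average {sum(scores) / len(scores):.1f}")

correct = sum(g == a for g, a in guesses.values())
print(f"Correct guesses: {correct} of {len(guesses)}")
```

Averaging the ratings answers the preference question; comparing the guess count against pure chance answers the distinguishability question, which is where the z-score below comes in.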

[Charts: full rankings and results]

Delving into the statistics behind the taste test, there are two main things to look at, both involving z-scores, which, as those of you who have taken statistics know, measure how many standard deviations an observed result sits from what pure chance would produce. In this case there were eight beers that we tried, so the probability of someone randomly guessing a beer correctly without even tasting it was 1 out of 8, or 12.5 percent. To get our observed rate, we take the total number of beers guessed correctly by everyone, in this case 17, and divide it by the total number of beers tried, in this case 80, and come up with a 21.25 percent success rate; the z-score then tells us how far that sits above the 12.5 percent chance rate. The NYU Local crew actually did do better than if they randomly guessed which beer was which, meaning that the beers had distinctive tastes that the drinkers could identify (although one writer did accomplish the feat of not guessing a single beer correctly). However, the small sample size once again means these results are only a semi-accurate window into an obviously important issue.
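For the curious, the calculation can be reproduced from the numbers given above (17 correct out of 80, against a 1-in-8 chance rate). The article doesn’t say exactly which test its statistician ran, so this sketch assumes a standard one-proportion z-test under a binomial approximation:

```python
import math

# One-proportion z-test on the guessing results:
# 17 correct guesses out of 80 total, vs. a 1-in-8 (12.5%) chance rate.
n = 80        # total guesses (10 tasters x 8 beers)
correct = 17  # total correct guesses across all tasters
p0 = 1 / 8    # probability of a correct guess by pure chance

p_hat = correct / n                # observed success rate (21.25%)
se = math.sqrt(p0 * (1 - p0) / n)  # standard error under the chance hypothesis
z = (p_hat - p0) / se              # standard deviations above chance

print(f"observed rate: {p_hat:.4f}")
print(f"z-score: {z:.2f}")  # roughly 2.4
```

A z-score of roughly 2.4 would usually be considered evidence that the tasters were doing better than chance, with the usual caveat, as the statistician notes, that ten tasters is a very small sample.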

[Image via]

Charts courtesy of NYU Local



2 Comments

  • Ed Carroll
    March 8, 2014

    Best article I’ve ever read, so insightful, it would be a shame if NYU didn’t give these guys a DURF Grant to do more research.

  • Lee Ciocia
    March 10, 2014

    I think you mixed up z-score with p-value.
