I may not be back to making YouTube videos on FullCircle just yet, but I am already thinking about ways to come back better than ever. Aside from that new YouTube background, I am constantly collecting ideas to try once I get started again. One of them is standardizing the way I review products.
The following is a little something I came up with to add some consistency to my reviews, make it a bit easier for viewers to trust my opinion, and help them make better comparisons and buying decisions when looking at the products I review. Since headphone reviews are the niche at the moment on FullCircle, those will be the focus of what you’re about to read.
How do you test headphones?
Each pair of headphones that I review is used daily: on my commute to and from school, at home, and even while I’m asleep. This ensures they are thoroughly tested (not just for sound, but for build quality and comfort, to say the least) by the time I make my evaluations.
What is burn-in?
Many love to compare this term to breaking in shoes. Naturally, headphones are almost never the same out of the box as they are days, weeks, and months down the road. Like shoes, they need time to be just right. It isn’t until that maturity that the headphones can be reviewed.
Many people uphold different standards for burn-in. The way I break in headphones is, for the most part, consistent. Intuition definitely holds priority here - if I feel like a pair of headphones needs more time, I’ll give it that and note it in the review. Generally, however, headphones I review get between 50 and 100 hours of burn-in using lossless audio files (FLACs) of music from various genres. This way, I can hopefully get the best, most dynamic sound out of the product by the time I write up a review.
So now there’s a scoring system?
Let’s be frank: the pool of product reviewers out there is huge, and each one has his or her own way of evaluating products. While having many opinions is never a bad thing, when it comes down to informing consumers, it can be really easy to skew people’s view of a product if my opinions aren’t clear and concrete enough, or simply change over time. In my opinion, many reviewers (YouTubers in particular) carry this problem - and I admit I can be added to that group at times. So far, all of the products I have reviewed have been evaluated without a scoring system, which worked because, when possible, I used benchmarks to pinpoint which products are better or worse than others. However, as I start to review more products and have more reference points, it could become difficult for the viewer to concretely discern which product is better. So with that said, starting with headphone reviews, which are already somewhat established on the channel, every review will include a non-weighted scoring system. This will hopefully further solidify my stance on the products I review, which should make it a bit easier for the viewer to make a buying decision.
How Scoring Works
Based on the reviewing standards of tech blogs like The Verge and many others, the way I rate products is determined by an average of the scores the product receives in predetermined areas of evaluation. Specifically, for headphones, each evaluation criterion - style, build quality, comfort, and sound quality - will be ranked on a scale from 1 to 10 (one being the lowest, ten being the highest), and these scores will then be averaged to give an overall score for the product.
Of course, unlike a blogging site, my reviews are done solely on video. That said, a cue will appear on screen for each criterion I cover in a given review, along with its score. These will be integrated into my reviews from now on and will work seamlessly into the way I usually compose these videos. Again, the scores will be averaged at the end for an overall product score, which you can use to compare the product against previously reviewed products.
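For anyone curious what the math looks like, the non-weighted average above can be sketched in a few lines of Python. The criteria names match the ones listed in the review system; the example scores themselves are hypothetical, not taken from any real FullCircle review:

```python
# Sketch of the non-weighted scoring system: each criterion gets a score
# from 1 to 10, and the overall score is a plain (unweighted) average.

CRITERIA = ("style", "build quality", "comfort", "sound quality")

def overall_score(scores: dict) -> float:
    """Average the per-criterion scores with no weighting, rounded to one decimal."""
    if set(scores) != set(CRITERIA):
        raise ValueError(f"expected scores for exactly these criteria: {CRITERIA}")
    if not all(1 <= s <= 10 for s in scores.values()):
        raise ValueError("each score must be between 1 and 10")
    return round(sum(scores.values()) / len(scores), 1)

# Hypothetical example review:
example = {"style": 8, "build quality": 7, "comfort": 9, "sound quality": 8}
print(overall_score(example))  # 8.0
```

Because the average is unweighted, every criterion counts equally toward the overall score - sound quality doesn’t outweigh style, and vice versa.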
Lastly, for those who are wondering, here’s a simple translation of what the numbers on the FullCircle rating system say about a product (and yes, this is a total ripoff of The Verge… don’t hate me):
1 - Utter garbage.
2 - Slightly better than garbage, but still incredibly bad.
3 - Not a complete disaster, but not something we’d recommend.
4 - Mediocre, but likely has outstanding issues.
5 - Just okay.
6 - Good. There are issues, but also redeeming qualities.
7 - Very good. A solid product with some flaws.
8 - Excellent. A superb product with minor flaws.
9 - Nearly perfect.
10 - The best of the best. Perfect.
Okay. But seriously, when are you coming back?
Senior year is ending in less than twenty school days, and the college process, for me, is in the final stretch. Really, the only thing I’m waiting on is Apple’s 2012 MacBook Pro refresh, because my computers, at the moment, are too slow to accommodate the video style I want to pursue (trying to go hard on lower thirds, color correction/grading, etc. on last last (last? last last?) generation hardware and 2GB of RAM… not gonna happen, and I’m not going to half-ass these videos either). I just ask that you guys be a little bit more patient. Don’t worry, there isn’t a day I don’t think about making a video - I have two headphone review videos waiting to be edited!! It’s depressing to think about sometimes.
Shō ga nai. - “It can’t be helped” in Japanese
I’ll try to be back soon. See ya then!