Challenge Audio Magazines


This thread comes from comments here, other web threads, talk on the street, and print media regarding Stereophile's Audio/Video "Classification System" and, more generally, the wishy-washy, inconsistent nature of audio review magazines (electronic or print). I have been involved in high-end audio since the 1970s (when there were reportedly about 10 hi-end stores in the country, according to an interview with the founder of CAL and Theta) and have heard almost everything out there. I have driven hundreds of miles to hear specific pieces of equipment. I am passionate about this hobby, but I am also realistic. I know that much of the problem with high-end audio is the HYPE. Magazines not only create the hype; they are also victims of it. If deciding on the best piece of equipment is likened to a trip on the road, then hi-end magazines don't have a road map. They don't know where they are going, and they don't have a consistent measure of where they have been. They tend to create new roads and abandon them a few reviews later. That is just within an individual reviewer; the inconsistency gets multiplied by orders of magnitude between reviewers, who differ both in their approaches and in what they communicate to their readers. I suggest we challenge Stereophile and other "golden ear" pubs on several fronts:

1.) They are outrageously inconsistent in their methodologies of evaluation. Imagine telling your boss (or yourself, if you are the boss) that you just changed your mind about how you are going to do your work today, and will likely change again tomorrow and the next day. Follow up by stating that this will not be a problem come job-evaluation time, because using these flip-flop methods you will always be doing your job the best, since you define how it is done day to day. In effect, say "Come to me when you need to evaluate my job performance and I will decide the criteria." The reviewer's job is to REVIEW, not confuse. We are essentially their boss, but they can never be pinned down, for the reasons given above. Why the heck doesn't "60 Minutes" have a field day with this nonsense?

2.) Have reviewers post their hearing tests in a color graphic so we all can see what they can or cannot hear.

3.) Provide information on warranties with every review (how do you decide when to post this and other information?). Audio pubs should also post reader responses on component quality and customer service (this presumes they would ask readers to comment on the full component experience, very similar to what Consumer Reports asks annually about far less expensive things like microwave ovens). We need to help those companies that are not owned by multinational corporations (many of which own several high-end nameplates) and could not give a rat's a___ about your 10-year-old component. Let's showcase these poor-customer-service SOBs. If you have been around a while in this business you will recognize the profound change in support at many companies. For example, Audio Research offers only a 3-year warranty, but their customer service is better than any lifetime service plan around (they stock parts back to their first unit made in 1970; ask Sony about their policy). So while warranties are not the be-all and end-all of support, they are a starting point. Let's hold these companies accountable, and the aging golden ears at the pubs along with them.

4.) Why not use a color graphic to report subjective findings where they are necessary? Many things cannot be quantified yet, but let's use a consistent reporting schema that is quick and easy to grasp and that makes the writer responsible for bridging the gap between reviews.

5.) This may all seem like I hate Stereophile. I don't; I am a multi-year subscriber. It's just that things could be a whole lot better and more accountable. In the end, the Class A / Class B etc. nonsense is just jello artsy talk until you can make linkages between reviews that are consistent and easy to follow. Far too often a Class A recommendation in Stereophile is easier to trace back to the advertising budget of groups like Harman Kardon (oops, Madrigal) than to any thread of logic between reviews. More on this when I have time!
nanderson
I couldn't agree more. Perhaps I'm taking this a bit too far, but if these audio mags would publish some hard-drilling reviews, I truly believe you would see prices fall and quality go up.
Nanderson, congrats on this thread! I realised a long time ago that any decision involving your money and long-term satisfaction should ultimately be your own. For example: I am looking for a set of speakers, and for entertainment purposes I picked up What Hi-Fi, the British mag. Their worst-graded speakers and electronics were highly praised by Stereophile, and vice versa. The KEF RDM-2 has 5 stars in What Hi-Fi and is graded "C" in Stereophile. The Celestion A3 got three stars (mediocre) there, while Stereophile graded it next to the best, Class "B". The Linn AV 5140 was graded a mediocre Class "C" by Stereophile and got What Hi-Fi's highest 5 stars (not platinum), etc., etc. I don't even want to go further. That's what you get if you are looking for advice from those guys... forget about it! Read those magazines for entertainment ONLY!
Nanderson, nice thread; you've hit the nail right on the head. The magazines you refer to really have to put their acts in order, but why are you still a multi-year subscriber? Courage? Masochism? Many reviewers' pieces are a mass of twists, turns, and even backtracks over time for one major reason: the reviews are not always grounded in any solid scientific understanding of the engineering compromises and inevitable trade-offs involved in creating a hi-end product, so the reviewers often don't know how they got where they are in the first place. The job of a hi-end reviewer is really a tough one.
I’ve carefully read through the (so far quite few) posts which follow this ‘challenge’. There are a number of problems both with and without the traditionally ‘printed’ review magazines.

I believe that a reviewer should state the position he is in with respect to his system. For instance, if he is reviewing hi-end products (single components), he or she should state what the review system is and what the component being reviewed actually replaces in that system, thus allowing some semblance of a meaningful comparison, if only in the context of that system. If you care to look back, I stated in one issue of Listener (about 18 months ago) exactly what my reviewing systems are. They are still the same. This will hopefully provide something of a level playing field.

In addition (and it happens to be in keeping with Listener’s reviewing policy), I believe that reviewers should actually own the equipment they use to do their reviewing. This will at least bring to the forefront of their minds the realization that comparatively few people can afford to go out and buy a truly hi-end system and still have enough left over to add to their music collection. I think every reviewer should be made to declare his owned equipment, and also his loaned equipment.

It is also well known that some reviewers work within the industry. For some (like me, who deals with thermionic technical issues all day) the reviewing comes as a welcome break after a head full of theories and flickering scope displays. Others actually design for companies and then get paid for reviews as well. A disclosure of interests should be mandatory, so we can really see which side their bread is buttered on.

Moving on: some reviewers have very good technical knowledge but know stuff-all about music, let alone what live music sounds like. Where a reviewer has only one area of knowledge, that should be the only one they’re allowed to comment on. I remember one review very well in which the reviewer commented on the clarity of the oboe’s tone. It was a clarinet on the recording he used! If a reviewer has no technical, or no musical, knowledge, then they should not comment on that side. After all, do you know what it’s like piloting an Apollo spacecraft? For my part, I am a qualified electronics engineer, have an excellent music degree, and am a practising musician.

I also believe that reviewers should have integrity. They should say what they truly believe about a product (with some form of managed feedback from the manufacturer if the conclusions appear to be way off-beam), and they should not write simply to ensure a continuance of advertising revenue.

Despite all this philosophy, at least the printed reviews have behind them a nominal accreditation from the organisation putting out the publication. Mistakes do still occasionally happen, though. The Impulse Kora is a stunningly good ¼-wave speaker, but it was given to a reviewer who traditionally uses 100-watt amps in his system. The Kora is 92/93dB/W efficient, and under that sort of use the speakers just didn’t stand a chance. Partner them with lower-power amps (either solid state or tube), however, and the speakers will really sing. The bad review (of a good product) killed the speaker design stone dead. Those who bought them mostly still have them: very few of the 1000+ pairs sold ever reach the second-hand columns, and if you find a pair they still cost $1200, what they were new, 5 or 6 years ago.
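To put a rough number on that mismatch (a back-of-envelope illustration of my own, not anything from the review in question): a speaker's approximate maximum SPL at 1 metre is its sensitivity plus 10 x log10 of the amplifier power in watts, so:

    SPL  ~  sensitivity + 10 * log10(power in watts)
    at 1 W:    92 + 10*log10(1)   = 92 dB
    at 100 W:  92 + 10*log10(100) = 92 + 20 = 112 dB

In other words, a 92/93dB/W design is already loud on its first watt and well past comfortable listening levels long before 100 watts, which is presumably why the Kora "didn't stand a chance" under a reviewer habitually running big amplifiers.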
So the mags have to be sure that every product is given a good and fair audition with sensible partnering equipment. After all, you wouldn’t test-drive a Trans-Am with a caravan hitched up behind, would you?

One of the dangers with the net forums, however, is that these areas are even more subjective. If you read printed reviews for long enough you should be able to get inside the head of the author and work out where he/she is coming from, but sometimes results are contradictory, and most reviewers seem to have ‘this week’s sliced bread’ all the time, so where’s the consistency? The net is even more open to highly subjective comments, and also to ‘creative marketing’. Ever seen a glowing tribute to such-and-such a pre-amp, only to find the correspondent selling it in the ads section? It happens.

And lastly, if you sit and think about it, no worthwhile manufacturer is going to actively sell something which doesn’t stack up to its claims. It’s one thing to miss your own benchmark (which rarely happens), but as audio is so subjective, other things come into play: imaging, bass slam, pace, rhythm and timing (whatever they are), soundstage and so on. Since no one can measure those, they’re a mite difficult to quantify in a meaningful way, but power output specs and the like are measurable. Quite often (like the first Hitachi solid-state power amp) a product measures well but doesn’t sound good. You can’t condemn a product on its tech spec, only on audition. Things have to be kept sensible here.

There’s no easy answer. As a starting point I believe (and practise) the following:
1. reviewers should have unimpeachable honesty and integrity in everything they write
2. they should own their basic system(s), which should be noted and used as a continuing reference (the only exception being a ‘whole system’ review)
3. reviewers should disclose any industry associations/affiliations and benefits (long-term loans, design contracts etc.)
4. there should be scope for ‘bad’ reviews, with a right of reply
5. reviewers should stick to what they know
6. all equipment reviews should partner the review product with appropriate ancillaries
7. the reviewer should not benefit (or lose out) as a result of the conclusions of the review

If you want to contact me, feel free: [email protected]