8 October 2006
In the 1960s and 1970s, audio manufacturers played games with specifications, because they perceived that what hi-fi buyers of the time wanted was Really Good Numbers. Eventually the FTC stuck its beak into the proceedings and decreed a standard for power output: that "280-watt" amplifier would become "42 watts RMS per channel, all channels driven, 20-20,000 Hz, ± 2 dB, 0.5% THD, 8 ohms." As with other Federally-approved numbers (cf. "EPA city mileage"), this tells you some things and doesn't tell you others. This particular amp sits in my living room. If I fed it nothing but sine waves, I'd presumably get exactly the numbers the Feds ordered. Music, however, isn't continuous tones: it's peaks and valleys. And for very brief peaks, the box might actually deliver more than 42 watts: as much as 70, in fact. Given that this is a four-channel amplifier, you can multiply 70 x 4 and suddenly there's that "280" rating. But that rating, too, conceals a lot: mostly, that the difference between 70 watts and 42 watts is only about 2.2 dB. And none of those numbers will tell you what you really want to know, which is "How does it sound?"
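For the skeptical, the wattage arithmetic above is easy to check. Here's a quick sketch in Python (the function name is mine); note that the ratio 70/42 is about 1.67, which works out to roughly 2.2 decibels:

```python
import math

def power_ratio_db(p1_watts, p2_watts):
    """Difference between two power levels in decibels: 10 * log10(p1/p2)."""
    return 10 * math.log10(p1_watts / p2_watts)

# Four channels at a 70-watt momentary peak yields the "280-watt" claim:
peak_total = 70 * 4
print(peak_total)  # 280

# The peak-vs-continuous gap, in dB, is smaller than the raw watts suggest:
print(round(power_ratio_db(70, 42), 2))  # 2.22
```

The point the numbers make: a 67% jump in watts buys barely 2 dB, which is close to the smallest loudness change most listeners can reliably notice.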
Back then, there were two markets for sound equipment: hi-fi and lo-fi. Today there are three: Real Crap, Average Crap, and Hideously Expensive But Good. A catalog from a dealer catering to the last of these arrived this past week, and its cover photo tells the story: a rack of gear that cost as much as my house, off to the side a tube-powered amplifier, and seated off to the right, a fashion model, presumably expensively dressed, her expression suitably dreamy. I'd hazard a guess that guys who blow $100k on audio gear might not date a