Loudspeaker Myths: Separating the Scientific Facts from Science Fiction

Hellcommute

Enthusiast
Good article and vid. Poor Z100s.

I used to be pretty skeptical about break-in...

FWIW, I have heard an in-store A/B comparison with bookshelves where break-in was distinguishable between speakers. The FR didn't change, but the tweeters softened up a bit and became smoother. Not as edgy sounding. These speakers were the Leema Xero, using Vifa 1" fabric tweeters. You need specific material like violin or high-pitched synth tones to hear it clearly, though.

Exact same phenomenon with Mordaunt Short Mezzo tweeters. Left them playing the radio for 3 days in the basement HT. Easily distinguishable, again, in the tweeter range. Only heard them for minutes per day, just to make sure all was well before work and before bed. By day 3 there were no more changes.

I use these two samples for testing: the first 30 seconds or so of "Man in the Mirror" and "She's Out of My Life" from MJ's HIStory disc 1. I encourage anyone to try it. :) I use the synth chimes in MITM like a turkey thermometer. "Yep, sounds done." Ha! :D
 
gene

Audioholics Master Chief
Administrator
Good article and vid. Poor Z100s.

I used to be pretty skeptical about break-in...

FWIW, I have heard an in-store A/B comparison with bookshelves where break-in was distinguishable between speakers. The FR didn't change, but the tweeters softened up a bit and became smoother. Not as edgy sounding. These speakers were the Leema Xero, using Vifa 1" fabric tweeters. You need specific material like violin or high-pitched synth tones to hear it clearly, though.

Exact same phenomenon with Mordaunt Short Mezzo tweeters. Left them playing the radio for 3 days in the basement HT. Easily distinguishable, again, in the tweeter range. Only heard them for minutes per day, just to make sure all was well before work and before bed. By day 3 there were no more changes.

I use these two samples for testing: the first 30 seconds or so of "Man in the Mirror" and "She's Out of My Life" from MJ's HIStory disc 1. I encourage anyone to try it. :) I use the synth chimes in MITM like a turkey thermometer. "Yep, sounds done." Ha! :D
Thanks for that input. Would be cool to test this out and pull some measurements. Always good to keep an open mind and try to correlate science with reality.
 
Dennis Murphy

Audioholic General
Good article and vid. Poor Z100s.

I used to be pretty skeptical about break-in...

FWIW, I have heard an in-store A/B comparison with bookshelves where break-in was distinguishable between speakers. The FR didn't change, but the tweeters softened up a bit and became smoother. Not as edgy sounding. These speakers were the Leema Xero, using Vifa 1" fabric tweeters. You need specific material like violin or high-pitched synth tones to hear it clearly, though.

Exact same phenomenon with Mordaunt Short Mezzo tweeters. Left them playing the radio for 3 days in the basement HT. Easily distinguishable, again, in the tweeter range. Only heard them for minutes per day, just to make sure all was well before work and before bed. By day 3 there were no more changes.

I use these two samples for testing: the first 30 seconds or so of "Man in the Mirror" and "She's Out of My Life" from MJ's HIStory disc 1. I encourage anyone to try it. :) I use the synth chimes in MITM like a turkey thermometer. "Yep, sounds done." Ha! :D

Never say never, but I'm not buying off on an audible break-in period for tweeters. There's a dedicated thread on this, where I bought a pair of Cambridge Aero-2 speakers, played one for 50 hours, and left the other unit mint. This is a speaker where many, many people swore that the tweeter softened and opened up dramatically after extended use. After the 50 hours of break-in was completed, I invited people to compare that speaker with the one that was still fresh out of the box. And I measured the unit with extended play before and after. The measurements showed absolutely no change, and I understand that you're not claiming there would be a change.

The listening panel included a professional recording engineer who has recorded some of the major domestic symphony orchestras and regularly provides recordings of live concerts to the local public broadcasting station. All of the listeners except the recording engineer could hear no differences in the two speakers after repeated blind comparisons. The engineer was able to pick out the broken-in speaker, but only in the midbass region of the woofer. He could hear no differences in the tweeter region.

I really think tweeter break-in is another audio urban legend, but I don't expect to convince you. I just want to make clear that not all of us nutty speaker-holics believe in tweeter break-in.
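For anyone who wants to try the measurement half of this themselves, here is a minimal sketch (not Dennis's actual workflow) for comparing before/after frequency-response sweeps of the same speaker. It assumes each sweep has been exported as a two-column CSV (frequency in Hz, level in dB) with ascending frequencies; the file names and band limits are placeholders.

```python
import numpy as np

def load_fr(path):
    # Assumed export format: two comma-separated columns, freq_hz and level_db,
    # sorted by ascending frequency (required by np.interp below).
    data = np.loadtxt(path, delimiter=",")
    return data[:, 0], data[:, 1]

freq_a, db_a = load_fr("tweeter_fresh.csv")   # out-of-the-box measurement
freq_b, db_b = load_fr("tweeter_50h.csv")     # after ~50 hours of play

# Interpolate the second sweep onto the first sweep's frequency grid so the
# two traces can be subtracted point by point.
db_b_on_a = np.interp(freq_a, freq_b, db_b)
delta = db_b_on_a - db_a

# Restrict the comparison to the tweeter's passband, e.g. 2 kHz to 20 kHz.
band = (freq_a >= 2000) & (freq_a <= 20000)
print(f"max |change| 2-20 kHz: {np.max(np.abs(delta[band])):.2f} dB")
print(f"mean change  2-20 kHz: {np.mean(delta[band]):+.2f} dB")
```

If the before/after deltas stay within the repeatability of the rig itself (small shifts in mic position alone can move high-frequency readings by a dB or more), there's no measured change to speak of.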
 
zieglj01

Audioholic Spartan
Never say never, but I'm not buying off on an audible break-in period for tweeters. There's a dedicated thread on this, where I bought a pair of Cambridge Aero-2 speakers, played one for 50 hours, and left the other unit mint. This is a speaker where many, many people swore that the tweeter softened and opened up dramatically after extended use. After the 50 hours of break-in was completed, I invited people to compare that speaker with the one that was still fresh out of the box.
Post #197/198 (plots in 198)
cambridge aero 2 bookshelf speaker - Page 7
 
Rich Davis

Enthusiast
Here's my take on the subject.

What is meant when someone says "broken in" with regard to a speaker? First off, I won't take anyone else's blind test as evidence of what I hear. I wasn't in the room when the blind test was conducted, so I can't say anything about what someone else hears or doesn't hear. As far as measurements are concerned, I would simply say that unless you have measured EVERY single speaker on the market over a period of several hundred hours and retested that speaker in an anechoic chamber, it's difficult for me to swallow that NO speakers require any break-in period.

I don't like generalizing based on a couple of measurements and blind tests. Here are some of MY reasons why.

1. When it comes to measurements, what test equipment and conditions were used? Some testers simply don't have precise enough equipment or a proper anechoic chamber to conduct proper tests, and the speakers may not have been run long enough to get through any "break-in" period. Nor has every single speaker gone through the exact same testing procedure and conditions, so no generalization can be drawn.

2. With regard to blind tests, there are a LOT of reasons why someone might or might not hear something in a blind test. So unless I've taken part in that blind test, know the listening conditions were proper, and was given long enough to really listen for differences between a fresh speaker and a "broken-in" speaker of the exact same model, I just wouldn't generalize.

What I do know from talking to engineers and reading various articles about electronic equipment is that capacitors GENERALLY have a break-in period, and since various brands of capacitors can be used in a crossover, it sounds plausible to me that the crossover could be what is REALLY being broken in, and possibly not the actual driver. Obviously there are a LOT of different brands, models, and designs of speakers on the market, so what works for one might not work for another.

It would be nice to have a SPEAKER manufacturer discuss why THEY believe speakers need to be broken in, with test measurements that back up their reasoning for THEIR products.

As far as I'm concerned, it would be nice to have the manufacturers that claim their products have a break-in period discuss this and show test results that back up their claims. That would settle the dispute, at least for a specific brand and model that does require it.

Unfortunately, MOST of us do not have a soundproofed, properly treated room in which we can actually hear subtle differences in speakers, or components for that matter. I think a critical part of listening to audio is a soundproofed and properly treated room with proper speaker placement and listening position. It's rare to find someone who has this at home, so I discount other people's listening results because of this.

All I can say is talk to the manufacturer of the product you are buying and ask them, and if it makes you feel more comfortable to allow a product several hundred hours of break-in, that's fine by me. Whether it's true or not won't be known unless it's properly tested, and I don't know of anyone who has conducted tests that prove or disprove either side. I would like to see an in-depth test where the manufacturers can comment on the results and share their own test data for comparison.

The other aspect is how much different the speaker actually IS after a couple of hundred hours, or however long one thinks break-in takes. I know that some cheap drivers actually start sounding worse over long periods of time as the surround starts to break down. Remember that spongy foam surround companies used to use? That stuff sucked. But most companies don't use that anymore; they use a rubber surround that's much more durable. Also, some companies may do better testing before the product goes out to the customer, which may minimize any need for break-in. If I were a high-end speaker manufacturer, I would perform lots of tests to verify whether the product sounds different after several hundred hours of use, to figure out whether it needs to be broken in or whether components need to be changed to have a consistently performing product.

Bottom line, I don't like it when someone makes generalizations based on a VERY limited amount of testing. There are simply too many products on the market, and no one has tested everything under every conceivable testing procedure.

I've read comments on specific brands/models of speakers from people who claim they heard a noticeable change in the sound over time, and quite frankly I can't argue with them either, because I know it's possible.
 
Rich Davis

Enthusiast
For listening tests, I think Harman's listening test facility is probably the best room for conducting ABX tests on speakers, since it's soundproofed, is supposed to be a well treated and designed room, and they can swap speakers behind a black curtain so the product can't be seen. They apparently have a listener training program that is supposed to sharpen the listener. Supposed "critical listeners" have gone into that training and done just as badly as an average person off the street; people have to go through the training course to actually know what to listen for. So it would be nice to have these listening tests done by people who have completed the Harman training. It would be interesting to have trained listeners in a good testing facility; maybe they would get a different result in an ABX test.

What I find ridiculous is that these two think there isn't any difference in cables. Sorry, I've heard VERY noticeable differences in some cables, so I don't listen to anyone who says there is no difference, because they haven't tested every cable in every scenario.

Plus, there are some companies, MIT Cables, that have proven that there is a different response curve that IS measurable. Whether someone can HEAR those differences is another story; some might and some might not, so that's something you have to sort out on your own. Unfortunately, it's an expensive and lengthy process to figure out what will sound better and what won't.

I know that cable blind tests can't use any additional switch boxes/cables to A/B the cables, as that may minimize any such differences.

When it comes to speakers, if you hear a difference, then you hear a difference, but try not to go into the situation THINKING that there will or will not be one.

I think this discussion is kind of silly.
 
Rich Davis

Enthusiast
Here's a test that, as far as I know, hasn't been done, but should be before anyone comments on whether a specific brand/model of speaker requires a break-in period or not. Some might, some might not, but this is probably the ultimate methodology for proving whether they do or don't.

Take 4 pairs of identical speakers from a company.

Give one pair 300 hours of 24/7 break-in using music at a volume between 80 dB and 95 dB to prevent any damage to the drivers. Give a second pair 200 hours, the third pair 100 hours, and leave one pair without any break-in.

Then take these 4 pairs of speakers to Harman's listening room, have a group of fully trained listeners run ABX tests, and look at their subjective results.
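To make the ABX leg concrete, here is a minimal sketch of how such a session could be scored, assuming SciPy is available: with n trials and k correct identifications, how unlikely is that score under pure guessing? The trial counts below are made up for illustration, not results from any test in this thread.

```python
from scipy.stats import binomtest

n_trials = 16     # hypothetical number of ABX trials per listener per pair
n_correct = 12    # hypothetical number of correct identifications

# Under the null hypothesis the listener is guessing (p = 0.5 per trial).
result = binomtest(n_correct, n_trials, p=0.5, alternative="greater")
print(f"{n_correct}/{n_trials} correct, one-sided p = {result.pvalue:.3f}")

# A common (but arbitrary) criterion: p < 0.05 is taken as evidence the
# listener heard a real difference rather than guessed. 12/16 gives p of about 0.038.
```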

Then take those same 4 pairs of speakers to the best testing laboratory, with the most precise measurement equipment and an anechoic chamber, to see if there are any measurable differences. That should be the best scenario for a measurement test. Maybe the measurements should be done first, but don't give the listening group any of the measurement results ahead of time.

If the results of these two methods are consistent with one another, I might be inclined to view that as a good methodology for those SPECIFIC brands/models of speakers.

What do you think of this methodology? I think the methodology has to be the most legitimate form of testing possible, whether it's subjective or objective. Not everyone has the best equipment, environment, and trained listeners, and that's where faults in a test show through.

If Audioholics or anyone else wants to write an article like this, I would like to see some legitimate testing done; otherwise there are too many potential faults in the testing that will affect the results.
 
Hellcommute

Enthusiast
Totally cool, Dennis. I know who you are and respect your opinion very much. :)

Mordaunt Short is being liquidated everywhere now. Their Mezzo 2s are an excellent bookshelf speaker. I own them and experienced the subtle change in the tweeter. If anyone would like to test this theory out, they would be a great candidate. They run about $500 online right now.

It's cool that the room scenario is playing into some posts. Some rooms are better than others, for sure. The differences we heard were not frequency-response altering. The speakers were simply smoother and no longer grating in the high treble region. I was only exposed for minutes a day, so it's not an acclimation thing.

I should note that the "process" may have gone quicker had I run them at higher levels. I never ran them in at much louder than speaking levels. Too annoying for the rest of the house. :p
 
Hellcommute

Enthusiast
I did this two at a time for a 7-channel system with 2 pairs of Mezzo 6s and 1 pair of Mezzo 2s. Same result in all 3 rounds. Took over a week! Ha! :p

MJ's MITM was used in all 3 rounds, before and after. It's easy to hear the harshness with it or the other track I mentioned.
 
jinjuku

Moderator
Plus, there are some companies, MIT Cables, that have proven that there is a different response curve that IS measurable.
Calling something with a box in the middle stuffed with passive electronics a cable is a bit disingenuous. I've always maintained that someone can make something sound different and then market that difference.
 
jinjuku

Moderator
Agreed. Though blind testing is likely good enough.

I'm still waiting for the car industry to suggest you must do blind driving tests to truly determine which car the consumer prefers without the bias of brand or aesthetics
I think it's reasonable to say that one of the legs of a car purchase is aesthetics. With WAF being what it is, it's going to play into it for speakers too. I think if a serious buyer can relegate speaker aesthetics to the far back burner, they should, as form follows function when it comes to speakers.

I am absolutely digging the series you and Hugo are doing. Can't wait for the wireless mic system and some studio lighting.
 
GO-NAD!

Audioholic Spartan
When I've listened to speakers of similar quality and price, side-by-side, and can't detect any glaring faults (minimal/no sibilance, no bloated bass, etc.) I ask myself "OK, they sound different - but which is better?" I have a hard time making that call when differences are subtle. Perhaps I find it difficult because I'm trying to control my own bias, but all it does is sow seeds of doubt in my mind and I question my own judgement. I'm not a "trained" listener - which could be the reason - and my window of tolerance might be wider than somebody with more extensive listening experience. I've read guys' comparisons of speakers, here on this forum, commenting on subtle differences that they hear, but one is definitely preferable to the other. I've listened to the same speakers and thought, "Wow, you heard that? I could easily live with either of those speakers". That's why I find sighted testing questionable - if I have difficulty picking the better speaker when I can see them, how do I trust somebody else to do it?

That's why I'm a fan of the properly conducted DBT. Gene, I fully understand why you aren't partial to the DBT and you've provided quite valid reasons. However, I can't push aside my nagging doubts that bias can be discounted from a sighted test. Yes, a proper DBT is very difficult to conduct - hence, the reason they are often improperly conducted, throwing them into question. What would you say if the industry came up with a methodology for independent DBT'ing that passed muster, scientifically speaking and became the industry standard? Any other method would be considered invalid. Would you support such an initiative? I realize that could be like asking for the moon, because such testing would be quite expensive to conduct and some manufacturers might be very embarrassed by the results - which would prompt them to cast doubt on the methodology, bringing us back to square one.:rolleyes:

Sure, if we compare Bose satellites with B&W 801's, the differences wouldn't be at all difficult to detect and we could dispense with a DBT, assured that we can pick out the better speaker. :D But, when we are comparing similar quality speakers and differences are much more subtle, bias can creep in, notwithstanding your assurance that you can fairly compare speakers in a sighted test.


Hmmm, maybe if speakers are sufficiently similar that DBT is suggested, perhaps they're similar enough that we can just dispense with the question of which is better and just pick the cheapest. :D
 
3db

Audioholic Slumlord
Hmmm, maybe if speakers are sufficiently similar that DBT is suggested, perhaps they're similar enough that we can just dispense with the question of which is better and just pick the cheapest. :D
That makes sense to me.
 
gene

Audioholics Master Chief
Administrator
When I've listened to speakers of similar quality and price, side-by-side, and can't detect any glaring faults (minimal/no sibilance, no bloated bass, etc.) I ask myself "OK, they sound different - but which is better?" I have a hard time making that call when differences are subtle. Perhaps I find it difficult because I'm trying to control my own bias, but all it does is sow seeds of doubt in my mind and I question my own judgement. I'm not a "trained" listener - which could be the reason - and my window of tolerance might be wider than somebody with more extensive listening experience. I've read guys' comparisons of speakers, here on this forum, commenting on subtle differences that they hear, but one is definitely preferable to the other. I've listened to the same speakers and thought, "Wow, you heard that? I could easily live with either of those speakers". That's why I find sighted testing questionable - if I have difficulty picking the better speaker when I can see them, how do I trust somebody else to do it?

That's why I'm a fan of the properly conducted DBT. Gene, I fully understand why you aren't partial to the DBT and you've provided quite valid reasons. However, I can't push aside my nagging doubts that bias can be discounted from a sighted test. Yes, a proper DBT is very difficult to conduct - hence, the reason they are often improperly conducted, throwing them into question. What would you say if the industry came up with a methodology for independent DBT'ing that passed muster, scientifically speaking and became the industry standard? Any other method would be considered invalid. Would you support such an initiative? I realize that could be like asking for the moon, because such testing would be quite expensive to conduct and some manufacturers might be very embarrassed by the results - which would prompt them to cast doubt on the methodology, bringing us back to square one.:rolleyes:

Sure, if we compare Bose satellites with B&W 801's, the differences wouldn't be at all difficult to detect and we could dispense with a DBT, assured that we can pick out the better speaker. :D But, when we are comparing similar quality speakers and differences are much more subtle, bias can creep in, notwithstanding your assurance that you can fairly compare speakers in a sighted test.



Hmmm, maybe if speakers are sufficiently similar that DBT is suggested, perhaps they're similar enough that we can just dispense with the question of which is better and just pick the cheapest. :D
Let me make this clear: I am not anti-DBT or anti-blind testing, as I stated in my article and video. However, just because a test is done blind doesn't mean it is bias-free, as I've discussed. It's a pet peeve of mine when I see people abusing the scientific process to push their own agenda.

I am a fan of doing:

  • controlled blind tests
  • controlled sighted tests
  • individual testing of each product over an extended period (several listening sessions spread out over the course of days)
  • comparing the results of all three

The testing needs to be done on an individual basis because people influence each other during a test, and we want the best seated position possible each time too. This is a big undertaking with no payoff for the manufacturers or the publication conducting the test. Most manufacturers don't like shootouts, especially the ones that cherish the DBT, which is ironic if you ask me. They really want to be the ones running the tests, not a 3rd party.

I may start experimenting with a single-speaker mono test like Harman does. It's very easy to line up 2-3 speakers at a time, level-match them, and just listen. It's a bit robotic, but according to Toole's research, excellent results emerge. Hugo and I are planning to do a video on this in the near future. It should prove to be a fun exercise.
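On the level-matching step, here is a minimal sketch of the arithmetic, assuming each speaker's SPL has been measured at the listening seat with the same source material and playback level; the speaker names and SPL figures are made up for illustration.

```python
# Measured SPL (dB) at the listening seat for the same source level.
measured_spl = {
    "Speaker A": 86.4,
    "Speaker B": 84.9,
    "Speaker C": 85.7,
}

# Trim every channel down to the quietest speaker so loudness doesn't bias
# the comparison (louder usually gets judged as "better").
target = min(measured_spl.values())
for name, spl in measured_spl.items():
    trim_db = target - spl   # negative value = attenuate this channel
    print(f"{name}: trim {trim_db:+.1f} dB")
```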
 
GO-NAD!

Audioholic Spartan
Let me make this clear: I am not anti-DBT or anti-blind testing, as I stated in my article and video. However, just because a test is done blind doesn't mean it is bias-free, as I've discussed. It's a pet peeve of mine when I see people abusing the scientific process to push their own agenda.
Sorry, I didn't mean to imply that you don't like DBT. Poor choice of words on my part. Would it be fair to say that you have reservations about the way they tend to be conducted at present? That's what I meant by "often improperly conducted".
 
gene

Audioholics Master Chief
Administrator
Sorry, I didn't mean to imply that you don't like DBT. Poor choice of words on my part. Would it be fair to say that you have reservations about the way they tend to be conducted at present? That's what I meant by "often improperly conducted".
Yes, most definitely, as stated in the article. I have a problem with manufacturers making claims of product superiority from their own "DBT" testing, especially when they don't reveal the products under test and their sample of products is too small to support such sweeping generalizations. Most companies have small advertising budgets and EVEN smaller research budgets, so they can't go out and buy every major product on the market. At best, they get a handful of direct competitors for their own internal shootouts.
 
exlabdriver

Guest
GO-NAD:

Finally, someone who had the same experience as I did when auditioning numerous speakers a couple of years ago. In the end, I came to the same pragmatic, common-sense conclusion about the auditioning experience as you did. Basically, within a similar price point I found them all to be 'similarly good' - a now often detested phrase coined way back when by the guru Dr. Floyd Toole and crew. All the products I heard were good in my subjective opinion; they sounded somewhat different, but I could have lived with any of them.

Kudos to you for sharing your views. It kinda validates my experience as well...

TAM
 
Chu Gai

Audioholic Samurai
When you say "at the same price point", are you talking about MSRP or what?
 
exlabdriver

Guest
I'm talking about a range of prices from about $1000 to $1500 per pair, as shown on the price tag hanging from the speaker in the stores. MSRP, I guess, unless they were on sale for a few bucks off...

TAM
 
