It was a blind test: no one knew which speakers were behind the screen (black grille cloth) at any time during the tests. The participants were given about 20 minutes per round to listen. Craig used a remote to operate the switcher, waiting about 30 seconds to a minute between switches. There were three rows of seating, and listeners rotated at intervals, volleyball style, which gave each listener every possible listening position in the room. If one speaker sounded better in one position, it sounded worse in another. The Sierra was a favorite, falling only to the Swans in this instance. Craig stated before and after testing that he paired the test speakers by closeness of sensitivity, to avoid having to apply excessive attenuation with the switcher. I think he took all conditions into account given the information and equipment available. All Craig did was host and set up the event; the scoring was done by the listeners themselves. Scoring speakers is much more difficult than I had imagined. It is very subjective.
Let's give Craig a little credit here, and if we have doubts, try doing this on your own. It's not easy.
Ed
* Sierra-2EXs w/ V2 crossover upgrade
* (2) Rythmik F12s
* Parasound Halo P6
* Audio by Van Alstine DVA-M225 Monoblock Amps
* MiniDSP 2x4HD for sub calibration
* World's Best Cables Canare 4S11 speaker cables