
1 - 20 of 38 Posts

Banned · 4,838 Posts · Discussion Starter #1
Home Theater Shack Fall 2015 Digital-to-Analog Converter DAC Evaluation Event Reporting and Discussion Thread


:fireworks2:
:fireworks1:​


The Event

The Home Theater Shack DAC Evaluation Event is under way. We will be evaluating a number of DACs ranging in price from $29 to $3,000. The evaluations will include sighted and blind ABX testing.

Coming to you from southern Alabama, the Home Theater Shack Evaluation Team has assembled at Sonnie Parker's Cedar Creek Cinema for the 2015 DAC Evaluation Event. We have DACs, we have speakers, we have tunes, we have great eats, and it is sure to be a very fine time for all!


Event Sponsors

Our sincere thanks go out to the kind, generous contributors without whom this event would not be possible. Please visit their websites and consider sending some business their way.

Madisound

Oppo Digital

AudioEngine

Vapor Audio

Linkwitz Labs

miniDSP

The Details

The Event
  • Date: Thursday evening, November 12th through Saturday evening, November 14th.
  • Place: Cedar Creek Cinema, Luverne, Alabama.
  • Evaluation Panel: Leonard Caillouet (lcaillo), Louie Stevens (Lumen), Sonnie Parker (Sonnie), Wayne Myers (AudiocRaver).

The DACs
  • PS Audio PWD Perfect Wave DAC ($3,000)
  • Oppo HA-1 Headphone DAC / Amp ($1,200)
  • Headroom Desktop Headphone DAC / Amp ($800)
  • AudioEngine D1 24-bit DAC ($179)
  • Fiio E7 DAC / Amp ($79)
  • Fiio D3 DAC / Amp ($29)
Other key equipment used for the event:
  • Denon X5200 AVR
  • Parasound HALO A31
  • Van Alstine ABX Switch Box, recently updated version (February 2015)
  • miniDSP 4x10 HD
  • OPPO BDP-105
The Speakers
  • Oppo PM-1 Headphones - planar magnetic, super detailed and revealing
  • Vapor Audio Arcus - 2-way, time-aligned design with horn-loaded Beyma AMT tweeter
  • Linkwitz Labs LXmini - 2-way, time-aligned design, set up with optional subwoofer support
  • MartinLogan EM-ESL - 2-way hybrid electrostatic design
Our Hosts

As mentioned, our deepest appreciation goes to Sonnie, Angie, and Gracie Parker, our hosts, for welcoming us into their home.
 

Banned · 4,838 Posts · Discussion Starter #2
DAC Evaluation Results


Comparing DACs

First, we admit up front that we were not exactly comparing DACs, per se. Whenever we bring this up, someone points out that the comparison covers much more than just DAC Number One vs. DAC Number Two: each unit also contains additional amp stages, sometimes a headphone amp, and other control circuitry. The sum total of that circuitry is what we call a “DAC” here, a complete unit that can serve as the DAC in a two-channel system. In most cases the unit was a DAC/headphone amp. That is what we mean by DAC: a system DAC.


Day One: Sighted Pairings

For the final evaluation, we settled on one pair of DACs that had distinctive and audible differences. This was determined on Day One through sighted tests of a number of DAC pairings with open discussion about what we were hearing.


PS Audio vs. Fiio D3: It was embarrassingly difficult to tell these two DACs apart. There were differences, but they were subtle and could not be identified consistently by any panel member. The PS Audio had trouble locking onto the TOSLINK source, so it was only used briefly in this one sighted pairing.

Oppo HA-1 vs. Fiio D3: This pairing was also very difficult to tell apart consistently, although differences could be heard.

Oppo HA-1 vs. Fiio E7: With this pairing it was fairly easy to hear a difference, and all panel members felt they could identify these DACs consistently in an A-B test.

Oppo HA-1 vs. Audioengine D1: This pairing was, for me, very difficult to identify. I had compared them extensively in my own preparatory work, and when Dennis Young (Tesseract) came to my laboratory later to add his data to that from our weekend in Alabama, he also worked briefly with this pairing. His evaluation was brief, as it came at the end of other listening tests and fatigue was setting in. Dennis also had a difficult time telling the two units apart. My own assessment is that the D1 and HA-1 were virtually indistinguishable.

Oppo HA-1 vs. Headroom Desktop Headphone DAC/Amp: This pairing we did not have time to work with in Alabama. I did my own A-B comparison back home and was able to tell them apart consistently, using the final A-B test method employed during our weekend.


How does one listen? And what does one listen for?

These turned out to be very important questions. For our High-End Amplifier Evaluations, we used a style of ABX testing in which the test subject, having no idea which amps were even being compared, first listened to X (one of the two, randomly chosen). After a gap of 20 seconds or so for some randomization switching, the subject would take the remote, toggle back and forth between A and B, and try to determine which of them was the X amp heard first. That 20-second gap was too much for my own auditory memory. It does not sound that hard, but when you break down the mental processes at work, it gets more than a little complex. Deciding which of the two matched the original X unit was just too much for me to sort through reliably, although others did much better.
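For illustration, the amp-event protocol described above can be sketched in a few lines of Python. This is not the event's actual software, and the trial counts are hypothetical; it just shows how each trial is scored and the 50% chance baseline a pure guesser converges to:

```python
import random

def abx_trial(can_discriminate: bool) -> bool:
    """One ABX trial: X is randomly assigned to A or B, and the listener
    must say which one matches X. A listener who truly hears the difference
    always answers correctly; one who cannot is effectively guessing."""
    x = random.choice(["A", "B"])
    answer = x if can_discriminate else random.choice(["A", "B"])
    return answer == x

def hit_rate(trials: int, can_discriminate: bool) -> float:
    """Fraction of correct identifications over many trials."""
    return sum(abx_trial(can_discriminate) for _ in range(trials)) / trials

random.seed(0)
print(round(hit_rate(10_000, False), 2))  # guessing hovers near 0.50
print(hit_rate(10_000, True))             # perfect discrimination scores 1.0
```

The point of the chance baseline is that a listener's score only means something once it pulls clearly away from 0.50 over enough trials.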

For our DAC evaluation weekend, we had agreed that we would use a simpler approach, but it was not until we were in the middle of blind evaluation that we fully understood what that meant. With a test pairing chosen for the blind A-B test on Day Two, we went about A-B testing to see how well we could identify one or the other.

Leonard went first. I scrambled which DAC was A and which was B, then handed Leonard the A-B comparison remote, so all he could do was toggle back and forth until he determined which was which, stop, and identify the DAC he was listening to as his favored DAC. But it turns out we were not quite in sync about exactly what we were doing. Leonard would start his test track, toggle back and forth, and finally stop and say, “That's the good one.” I would tell him which DAC he had stopped on, and he entered that into his data set. Then we would do it again. Leonard ran a number of those tests using a number of tracks, and then it was Louie’s turn. I did the same with him for a few minutes, and at some point both Sonnie and I realized that we had a communication problem. The testing stopped for about 10 minutes for a rather enthusiastic “discussion” about what we were doing. Sonnie and I had thought they were trying to identify one or the other of the two DACs, but that was not what they were doing at all.

They had identified a listening quality for a given track, and would pick their favorite DAC for that listening quality, with the goal of picking the same one over and over again. Which DAC was chosen was only an afterthought. As long as they picked the same one every time, for that listening quality on that track, they were being successful. I had missed that altogether. At first I thought we would have to completely start over with our testing, but at the end of our discussion we realized that both Leonard and Louie had used the same method for their evaluation and that Leonard’s data reflected it properly, and we were able to continue our testing.

Both Leonard and Louie had been doing quite well using the method they were following. When my turn came, I felt my approach would be a little different: I thought I would be able to identify directly which DAC was which, and I started my testing that way. My results were not good at all.

Trying to properly identify a DAC directly, I was doing a little worse than 50% correctly, so I decided to try the other method, the one that Leonard and Louie had used, and at that point my results really turned around. The method ended up being much simpler mentally.

With a track and listening quality selected, it was much simpler to switch back and forth and decide which one I liked better for that listening quality. For instance, the vocals on a Mindy Smith track were recorded very clean, and I would listen for the cleaner-sounding of the two DACs on that track. Choosing the one I liked because it sounded cleaner, I was able to select the same DAC numerous times in a row and had a high success rate, as Leonard’s data will show.

There was still one little wrinkle for me. Sometimes I would decide after I had selected the favored DAC to do a few more toggles to be sure. That almost always messed me up, and I would almost always end up with the wrong answer that way. I did much better with going with my first selection every time.

Using several different tracks and several different listening criteria, I completed my evaluation as Leonard and Louie had. Sonnie, our host, was not convinced he could hear any difference consistently and passed on this part of the testing.

When Dennis came to my house to add his own evaluation data, we used the same method for his work. His results tracked extremely well with that of the rest of us for that pair of DACs that was used for the body of our blind testing.

I have to say that I was quite surprised to find that my own original listening and evaluation method was so flawed, and delighted to find that a very simple approach was highly accurate. I have found ways to use that same method in my own private testing and evaluation work numerous times since then, always with great success.

I understand that this is not ABX testing as it is commonly defined. Our method was not double-blind, and the listeners knew exactly which DACs were involved throughout the blind testing. There are a lot of ways to set up such tests or evaluations; some are extremely difficult from the listener’s perspective and some are easier on the listener. We purposely chose one that was fairly easy on the listener while still giving data that could lead to statistically significant and meaningful results.
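On "statistically significant": with the pick-the-same-one method, each pass is effectively a binary trial, so a run of consistent picks can be checked against a coin-flip null with a one-sided binomial test. Here is a minimal Python sketch; the 9-of-10 count is a hypothetical example, not our actual data (Leonard has that):

```python
from math import comb

def binomial_p_value(successes: int, trials: int, p: float = 0.5) -> float:
    """One-sided p-value: probability of at least `successes` hits in
    `trials` independent picks if the listener were guessing (p = 0.5)."""
    return sum(comb(trials, k) * p**k * (1 - p)**(trials - k)
               for k in range(successes, trials + 1))

# Hypothetical run: picking the same DAC 9 times out of 10.
print(round(binomial_p_value(9, 10), 4))  # 0.0107 -> unlikely to be chance
```

A result like this is what "statistically significant" means in practice: roughly a 1% chance that pure guessing would do as well, which is why consistency across repeated picks matters more than any single pick.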


Day Two: Evaluation Results

Leonard will add the data here.
 

Banned · 4,838 Posts · Discussion Starter #4
Notes On TOSLINK and Fiber Optics in Audio

The first issue to resolve in planning for this event was how to run the same signal to two DACs at the same time so their outputs could easily be A-B compared. The easiest setup was S/PDIF serial digital, either TOSLINK or coax, along with a splitter so the signal could feed two DACs at once. I decided on TOSLINK because it was supported by the DACs we had in mind and would not create any new ground loops or grounding issues when in use.

A little research online will pull up many comments, reviews, and articles stating that TOSLINK does not sound as good as other digital connections. That sounded to me like the kind of claim that is based on hype, faulty listening tests, or poorly executed reviews, so I decided to find out for myself. I have been using TOSLINK successfully for some time in my own system, and have had excellent results with it.

Following is a block diagram of the test configuration we would be using.

[Block diagram: HTS DAC Evaluation Configuration.jpg]

The TOSLINK cables in use in my system are inexpensive generic brands. To complete my preparatory TOSLINK evaluation, I ordered four each of two different inexpensive brands, plus a more expensive audio-grade cable, plus an even more expensive glass fiber (most are plastic), plus a super-long fiber which I knew was out of spec length-wise for TOSLINK. I already had an inexpensive TOSLINK splitter on hand.

With fibers and splitter in hand, I conducted a number of A-B tests, using the Oppo HA-1 headphone DAC/amp and PM-1 planar magnetic headphones as a reference listening setup. Together they are so clean that distortion from anywhere else in a system will be revealed. The HA-1 has TOSLINK, coax, and USB digital inputs. I ran many tests with and without the splitter, with different types of TOSLINK fiber, with coax, and with USB. I came to this conclusion:

For the particular fibers, splitter (including with/without), and USB interconnects tested, there was no audible difference. I am not claiming that such differences never exist, simply that they did not exist with any of the equipment that I made use of.

During preparatory tests, at the event, and since then, the only indication of a problem I have run into using TOSLINK was when a fiber was not plugged all the way in properly and the signal was almost buried in static. It either worked or it did not work, and the result when it worked was always good enough that there was no audible difference.

As a result of this testing, I have become highly suspicious of claims that TOSLINK “sounds worse” than other interconnect types or that different fibers have distinctly different sounds. If a different fiber truly has a different sound result in a system, I suggest that something about that fiber is on the edge of not working at all. There are undoubtedly exceptions to this rule, but I have yet to run across one myself.

Here are the fiber types that I made use of in coming to this conclusion:

TOSLINK Cables:
AudioQuest Forest Full Size TOSLINK Digital Optical Audio Cable (Amazon) 0.75 m (x1)
GearIT TOSLINK Digital Optical Audio Cable by PC Micro Store (Amazon) 50 ft (x1)
KabelDirekt Pro Series TOSLINK Digital Optical Audio Cable (Amazon) 6 ft (x4)
Premium TOSLINK Digital Optical Audio Cable by Cables-Direct-Online (Amazon) 6 ft (x4)
Toslink Glass TOSLINK Digital Optical Audio Cable by Unique Products Online (Amazon) 6 ft (x1)

TOSLINK Splitter:
 

Registered · 2,261 Posts
The DAC shootout got underway after a few not-so-unexpected equipment and wiring snags. It turns out the most expensive of the bunch, the PS Audio, was the most finicky: it had trouble keeping a lock on the optical input signal. A different cable helped at first, but things got worse to the point where we couldn't get it to play optical data at all. Unofficial, preliminary (sighted A-B) results had it neck-and-neck with the Fiio D3, so the loss was not a great one, IMO. That alone helped shatter my long-held belief that more money could get you much better digital. I'll still gladly go out on a limb and say the differences favored the far more expensive unit, but those differences were slight (mostly in soundstage depth and hall ambience retrieval). It's up to the listener to decide how important the sonic and financial differences are.

I still can't help but feel that the PS Audio would have fared better with proper double-cryo'd TOSLINK, though. :whistling: :devil:
 

Premium Member · 2,539 Posts
I am looking forward to the results write-ups.
Since an early report of differences heard in sighted tests has been made, it will be interesting to see whether that holds up in blind listening.
 

Registered · 2,261 Posts
Hint: It took us a while to learn what to listen for. Each panelist used their own songs. Had I not stumbled across a particular passage in one of mine, it would have been a toss-up for me. But once I did find *it*, I picked out that difference over and over. Leonard was able to do the same, but we'll have to let him weigh in with his thoughts (and the data, of course). Wayne? He was also able to ID the same DACs with his own material after we explained how we were doing it (another hint: we picked the two with the biggest sighted differences for blind testing). Sonnie? He kept us on our toes!


 

Banned · 4,838 Posts · Discussion Starter #14
I hope Leonard doesn't get mad at me for spilling the beans and answering early. Yes, there were differences noted during sighted testing, and there were statistically significant results in blind testing. That is all I will say for now.

What was fascinating to me was the question of how to listen. That made all the difference in the world, for me anyway.

There will also be some interesting discussion about our specific ABX test method.
 

Registered · 1,784 Posts
Yes, yes: how to listen. This is part of the education of listening seriously for differences or similarities in audio equipment. Like Lumen, I use very specific tracks, often with only a few players, with which I am very familiar. Doing this lets me more easily zoom in on differences, good or bad, in the recordings that point to the piece of equipment being examined. Way to go, gents. I'm looking forward to your findings. :clap:
 

Premium Member · 2,539 Posts
I want to know what to listen for as much as the DAC test results.

So spill the beans on the secret sauce and edgeamacate me.
 

Registered · 2,261 Posts
The following isn't so much a lesson in how to listen, as it is a guide of what to listen for:

We each used our own passages of songs to listen for traits such as soundstage width, ambience, decay into the noise floor, etc. Speaking for myself, not all songs revealed the same trait. IOW, when I thought I heard a soundstage difference between DACs on one song, I listened for it on another song, but it wasn't always there. At first that seemed confusing, because I questioned whether or not I'd even heard it; so I had to go back to the first song to confirm. And sure enough, it was there. I guess that's to be expected given the nature of SQ in recordings. It's also why we use different songs to try and detect different sound qualities. What surprised me was I stumbled across a passage - one that I don't normally use for comparisons - that enabled me to consistently pick out two of the DAC's in blind testing (Leonard will be posting results). There was an objection, so we recalibrated the ABX box for equal levels. I was still able to differentiate after that. The "tell-tale" for me was the snare at the beginning of a Jennie Devoe song (sorry I don't have it with me, but like I said, I'm not that familiar with it so I forgot the title). I'll post all my demo songs soon! :bigsmile:

EDIT: Jennie DeVoe - Does She Walk on Water - Barefoot to Babylon
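The recalibration for equal levels mentioned above is standard practice before blind comparisons: if one chain plays even slightly louder, listeners tend to prefer it for reasons unrelated to quality. Here is a minimal Python sketch of how a level offset between two chains could be checked from captured samples; the signals and the 5% gain error are hypothetical, not event measurements:

```python
import math

def rms(samples):
    """Root-mean-square level of a block of samples."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def gain_offset_db(a, b):
    """Level difference between two chains in dB. Blind comparisons are
    typically matched to within a small fraction of a dB."""
    return 20 * math.log10(rms(a) / rms(b))

# Hypothetical capture: chain B plays the same 440 Hz tone 5% hotter.
a = [math.sin(2 * math.pi * 440 * n / 48000) for n in range(48000)]
b = [1.05 * s for s in a]
print(round(gain_offset_db(b, a), 2))  # about 0.42 dB
```

A mismatch of a few tenths of a dB is audible as "better" rather than "louder," which is exactly the objection the recalibration addressed.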
 

Plain ole user · 11,121 Posts
I hope Leonard doesn't get mad at me for spilling the beans and answering early. Yes, there were differences noted during sighted testing, and there were statistically significant results in blind testing. That is all I will say for now.

What was fascinating to me was the question of how to listen. That made all the difference in the world, for me anyway.

There will also be some interesting discussion about our specific ABX test method.
They are your beans as much as mine. We each need to share our takeaways. How we present the data to best and most objectively reflect what we found will be what I do. But our experience and what we each learned about ourselves and the equipment is far more interesting IMNSHO.

What I will say at this point is that this might have been one of the most illuminating of all of the Sonnie Sessions. And one where we had the strongest debates about process and what questions we were trying to answer.
 

Banned · 4,838 Posts · Discussion Starter #20
Sorry for the delay in posting more details. Needed a recovery day after two days of driving back to Nebraska, and was so busy while still at Sonnie's that I could never get to it.

As a side note, the power steering went out on our rented Ford Explorer about two hours from home Monday night. It was such a nice vehicle up until that point, even driving in heavy rain (as long as I could SEE), but without power steering and traction control it became an absolute BEAST to drive. Luckily it was late enough that traffic was pretty dead through Omaha and Lincoln, and I was able to get it to my doorstep without incident and get it unloaded. Whew!!!!! The Budget Car Rental folks sent a tow truck for it the next morning. They were very nice to deal with at both ends of the rental, and knocked a day off my rental fee and waived the final gas refill for my trouble. I give them a blatant plug here out of appreciation for their excellent customer service!

Will get into posting reviews and results shortly! Lots to talk about!
 