Getting tricky


Yesterday we were in the middle of designing our Magic DAC. We added a sexy little circuit that removed all traces of jitter right at the converter's input. Life should have been, well, magical. But to our astonishment we could still hear differences between jittered sources. Life's full of surprises.

What we hadn't planned on was the effect the power supply had on the converter. What was happening was obvious once you understood it (isn't everything?). Every data transition swings between fully off and fully on. When an individual data bit is off, the power supply is at rest. When that bit transitions from off to on, the power supply in the DAC is delivering current. It's also telegraphing the change from off to on throughout the entire DAC. No matter what we do to reduce the power supply's effect on these transitions, we're still going to "hear" them to some degree. And that degree is going to be noticeable.

But what would happen if those on/off transitions occurred perfectly in sync with our clock? In other words, what would happen if the data coming into our DAC wasn't jittered? The answer is simple: the ghost image of the power supply that is telegraphed throughout our DAC is in lock step with the data, so we have achieved magic. Heck, all we need to do is figure out how to get perfect data out of anything we plug into our Magic DAC. Oops. We forgot. We don't control the sources. Our potential customers are going to plug in all sorts of nasty, jittered sources. Damn their oily hides!

But wait! What if it were possible to simply ignore those nasty transitions? What if we don't pay any attention to when the data moves from off to on? What if we had a lookout scout sitting on a lawn chair watching for changes, and when he sees one, he picks up the phone and calls our DAC: "I see a change; when the timing is right, go ahead and count a 1." Then our scout goes back and has another sip of lemonade.
If that were possible, then we'd get the correct data presented at exactly the correct time, and there would be no jitter on any input signal. Why? Because we're going to stop relying on those unreliable transitions to tell us when the data changes. Yup. Great plan, eh?
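For the curious, the scout-on-a-lawn-chair idea is essentially re-clocking: instead of trusting the jittered timing of the incoming edges, we only look at the data level on our own clean clock ticks. Here's a minimal sketch of that idea in code; the function name and the example timings are purely illustrative, not anything from an actual DAC design.

```python
def reclock(jittered_edges, clean_ticks):
    """Latch the incoming bit value at each clean local clock tick.

    jittered_edges: list of (time, bit) pairs, sorted by time - the
                    moments the source data changed level, with jitter
                    on the times.
    clean_ticks:    times of our local, low-jitter clock edges.

    Returns the bit seen at each clean tick. The output now changes
    only on our own clock, so jitter on the source no longer matters.
    """
    out = []
    level = 0
    i = 0
    for t in clean_ticks:
        # Advance past any input transitions that happened before this tick.
        while i < len(jittered_edges) and jittered_edges[i][0] <= t:
            level = jittered_edges[i][1]
            i += 1
        out.append(level)  # "when the timing is right, count a 1"
    return out

# Jittered source: edges intended for t = 1.0, 2.0, 3.0 arrive early/late.
edges = [(1.03, 1), (1.97, 0), (3.05, 1)]
ticks = [0.5, 1.5, 2.5, 3.5]   # our clean local clock
print(reclock(edges, ticks))   # [0, 1, 0, 1] - clean, on-time data
```

This only works, of course, if the jitter is small compared to the clock period, so each clean tick still falls between the intended transitions rather than on top of one.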
Paul McGowan

Founder & CEO
