Biting bits

July 26, 2019
 by Paul McGowan

Yesterday we talked about the difficulty of understanding how digital audio bits could be of higher or lower quality. Would we hear differences in sound quality between digital bits streamed over the internet vs. those found on our local hard drives or CD players?

The simple answer is no. Given identical source files, bits are bits when transferred between data sources and their end clients. There are no clocks associated with streamed, stored, or transferred data. So, for example, one cannot accurately suggest that the "timing is off" on CD or hard drive stored data, since the data itself is unrelated to any timing device. By the time that data gets delivered to our DAC, the server or CD player has added a clock to the bits. That is a horse of a different color. If the timing of streamed bits is off, it's the server or CD player we can point a finger at.
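Anyone who wants to verify the "bits are bits" claim for themselves can compare checksums: if a streamed download and a local copy produce the same digest, they are bit-identical no matter how they were delivered. A minimal Python sketch (the filenames and data here are placeholders, not real audio files):

```python
# Verifying "bits are bits": if two copies of a file have the same SHA-256
# digest, they are bit-identical, regardless of how they were delivered.
import hashlib
import os
import tempfile

def file_digest(path, chunk_size=65536):
    """Return the SHA-256 hex digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Simulate a "streamed" and a "local" copy containing the same bytes.
data = bytes(range(256)) * 1000  # stand-in for audio sample data
tmp = tempfile.mkdtemp()
streamed = os.path.join(tmp, "streamed_copy.bin")
local = os.path.join(tmp, "local_copy.bin")
for path in (streamed, local):
    with open(path, "wb") as f:
        f.write(data)

assert file_digest(streamed) == file_digest(local)  # identical bits
```

Matching digests settle the data question; any audible difference then has to come from somewhere other than the bits themselves.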

The quality of the switch handling our network data is meaningless. It either faithfully passes the bits or it does not.

So, why do people hear differences with the various switch types and connecting cables managing our internet traffic? My guess is the causes lie elsewhere than data corruption. Fact is, we know the data is uncorrupted, so the finger-pointing probably needs to change direction: shielding, power supply noise entering the DAC, ground noise or contamination.

When investigating a commonly held belief it's always beneficial to assume the many observations are correct. That attitude leads one to quickly dismiss the obvious and dig deep for underlying possibilities.


27 comments on “Biting bits”

  1. I'd suggest expensive Ethernet cables should make no difference, provided they follow correct earthing and aren't built so poorly they cause dropped packets. Different switches can make a difference, though, based not on the "quality" of their switching but on different quality PSUs, grounding schemes, etc.

    Oh, and hello, my first post, having spent the last few weeks signed off work and reading the entirety of "Paul's Posts" from the last 6 years over those few weeks off 🙂

    Made a good "catch-up" on where HiFi and audio have got to whilst I've been distracted by a career in IT the last 20 odd years, and a great introduction to the ethos of PS Audio, so many thanks!

  2. Just like to reinforce Paul's description by re-stating the issue in a slightly different way. Firstly, Paul's correct - bits are bits - they're either correctly stored or not, so I think there is no chance that the means of getting those bits into some type of 'buffer' (CD, hard disk or whatever) is going to affect the ultimate SQ of what you hear.

    As Paul points out, all the problems are with how those bits are 'rendered' into the analog world. The digital manifestation of your source material 'must' be kept isolated from the analog rendering in order to achieve optimal SQ. The digital world is electronically noisy, dirty and distorted, so you must keep it as separate from the pure analog world as best you can. The nexus - the connection between the two worlds - is your DAC. This is where the most care should be taken.

    My choice of isolation is via PSA's bridge (in the DirectStream DAC) as it delays the addition of the all-important clocking information until the last possible moment before converting to an analog signal and thence onto the pre-amp/amp and ultimately to the speakers.

  3. A near-lifetime of working with digital transmissions has left me firmly in the 'bits are bits' camp - providing they remain as bits. In consequence I too believe that the quality variations seen in various means of transporting the bits are probably due to electrical noise carried by, or introduced into, galvanic connections (wires). I would like to see the transport of data to the DAC taking place over a good optical connection (not Toslink) to eliminate the noise transfer.

    1. Heartily agree. Wouldn't it be wonderful if someone could come up with a light-based transmission standard that could accommodate 'any' bit depth or bit rate! I mean, you'd think it would be possible. Some smart people developed the DSD over PCM (DoP) standard. Why not a 'really good' optical standard?

      1. Indeed, as others have pointed out - glass fibre kit capable of 1 Gb/s is cheap as chips now, and whilst I use TOSLink it leaves much to be desired from a mechanical point of view, and hence probably jitter-filled also...

      2. There are a number of solutions already in the market. For my PS Audio DSD I use an SFP port on my switch with a fiber optic cable that runs to a Tripp Lite media converter that takes the optical signal and turns it back into an electrical signal that goes over a 12" ethernet cable to the PS Audio bridge. The electrical-to-light-back-to-electrical conversion basically eliminates almost any chance of electrical noise getting introduced to the cable prior to hitting the DAC. If you don't have a switch with an SFP port, you can use two Tripp Lite media converters back to back. Companies like Gigafoil sell an all-in-one device that does the same thing. And Sonore has recently released an optical version of their network streamer.

      3. Maybe 10 years ago a friend loaned me just such a digital cable and it was indeed superb, though more than I could afford at the time. Sorry, but I can't remember the brand, nor do I know if they are still in existence. Perhaps someone else here can help us out?

  4. It’s all about da noise! Airborne, ground, and probably other noise. After reading Ted’s adventures improving the DS by reducing noise, I started looking at sources of noise in my system. The difference has been dramatic. I had three devices connected to a switch (which I kept directly behind the system rack). The demise of PS Audio’s power management eliminated one connection; I switched another device to WiFi, which eliminated the second; and the third connection I filtered. With only one internet connection, I was able to eliminate the switch near the system, and now I just run a longer cable from a switch in another room.

    These changes, in addition to brick-wall digital/analog power isolation, removed the high frequency edge that seemed to linger, even as I improved individual components in my system.

    1. Spot on. In just about every application other than high-end audio streaming, most of what an ethernet transmission system, and the component parts thereof, are ever judged upon is data fidelity. All it has to do is deliver the bits. On the other hand, standards-setting organizations also have a lot to say about secondary effects such as EM radiation which may not affect the data fidelity one iota but can cause unacceptable side-effects to other equipment in the vicinity, such as radios. So we can accept that such things exist.

      Weirdo stuff like the audibility of different ethernet cables used in various places in the system infrastructure is in many ways an extension of the same kind of issue, but at a level that matters not a jot to anybody except a few very keen individuals. Practically speaking, if an ethernet cable (or something of that ilk) can have a sonic impact on the audio system it contributes to, it can (IMHO) only be through some sort of RF interference vector. Our problem is that as yet we have neither the knowledge nor the tools to be able to clarify exactly what that mechanism might be, and what specific audible effects we would expect it to have. Absent those things, there is always going to be controversy surrounding the issue, with the accompanying mixture of genuine ad-hoc solutions and bogus snake-oil.

      1. "Our problem is that as yet we have neither the knowledge nor the tools to be able to clarify exactly what that mechanism might be, and what specific audible effects we would expect it to have."

        I can assure you, as a career RF engineer, that we have more than enough knowledge and the tools to measure this. It's just that the technology is beyond the reach of audiophile-sized consumer audio companies. OTOH, the big players like Samsung, LG, and Sony do run these tests on their consumer products. They still build to a price and may not address a problem that is beyond the product's price point. But these issues are measurable and addressable.

        1. Either you misunderstood me, or you know a heck of a lot more about this than I do (or both)!

          Yes, technology exists to measure RF emissions from ethernet cables, or levels of RF noise superposed upon the data signal in systems employing those cables. My point is rather that we don't know what values those measurements need to deliver in order for the cable to have no sonic impact in a highly revealing audio system. And a significant aspect of that would be knowing what specific audible property is caused by which specific measured cable property.

          1. Well, actually, we do have a rather simple way to measure that. We measure the analog output of the DAC just like any other analog audio gear. If the signal is clean down to the noise floor of the power amp, which is rather low, then whatever noise the digital path is contributing is inaudible. That measurement also extends to jitter. Jitter causes unnatural harmonics. Those too are easily measured within the recovered analog signal.

            The truth is that in a typical residential setting, the amount of EMI/RFI noise on digital audio cables is minuscule. It's simply not a problem. But some hardcore audiophiles will not accept that and spend thousands to try and mitigate a problem that simply does not exist.
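The measurement described in that exchange can be sketched numerically: sample a pure tone with a small sinusoidal timing error, and the jitter shows up as sidebands around the tone in the spectrum of the recovered signal, which is exactly why it is measurable at the DAC's analog output. A hypothetical Python/NumPy illustration (all numbers are made up for the demonstration):

```python
# Sketch of measuring jitter in a recovered signal: sample a 1 kHz tone with
# a small sinusoidal timing error and inspect the spectrum. The jitter
# appears as sidebands offset from the tone by the jitter frequency.
import numpy as np

fs = 48_000          # sample rate, Hz
n = 4_800            # samples (0.1 s, so bin spacing is 10 Hz)
f_tone = 1_000.0     # test tone, Hz (falls exactly in bin 100)
f_jit = 100.0        # jitter modulation frequency, Hz
a_jit = 1e-6         # peak timing error, seconds (deliberately large)

t = np.arange(n) / fs
t_jittered = t + a_jit * np.sin(2 * np.pi * f_jit * t)  # imperfect clock
x = np.sin(2 * np.pi * f_tone * t_jittered)             # "recovered" signal

spectrum = np.abs(np.fft.rfft(x)) / n
carrier = spectrum[100]    # bin of the 1 kHz tone
sideband = spectrum[110]   # bin of the 1.1 kHz jitter sideband
print(f"sideband is {20 * np.log10(sideband / carrier):.1f} dB below the tone")
```

With these (exaggerated) numbers the sideband sits roughly 50 dB below the tone; a real system's jitter is far smaller, but the same spectral signature is what the measurement looks for.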

  5. What I do know is:
    1) Digital cables like USB cables sound different
    2) It doesn’t make sense that they do

    Our regular listening group has blind tested different USB cables, and they indeed sound different. Much the same as any other cable, the differences vary from little to very noticeable. In contrast, I found no noticeable difference in different lightning cables between my MacBook Pro and external hard drive. Is that because my computer reads music first into core memory, then plays it from there, making timing less an issue? Who knows.

    The thing about digital is that moving bit-perfect data is a solved problem. When I move a Word document from one location to another, random bits don’t get dropped. Similarly, when I download music files from an online vendor, I have supreme confidence that a comparison of those files (unless something is broken) is bit accurate. When my computer is passing music data to my DAC, I’m also supremely confident that it is bit perfect. Similarly, it should not make a difference if the USB cable is 20 meters long or 1/2 meter. The data either gets there or it doesn’t.

    It makes hearing differences intellectually disturbing.

    1. Respectfully, I'd like to quibble with you on both points.

      1) It's good that your group tried a blind test of USB cables. This is a fun hobby, and doing those comparisons with friends can be both amusing and instructive. My audiophile group does them too. However, I've never found them particularly meaningful. If people are listening together and are allowed to talk -- or even look at each other -- then the results are pretty worthless. As has been shown over and over in psychological experiments, it is shockingly easy for the group to be influenced by individual members (even subconsciously and nonverbally).

      2) USB 2.0 standards limit the cable to 5 meters. I believe longer USB cables have to be actively powered to prevent a substantial drop in power along the way. Longer cables will pick up noise, and some generic cables are not well shielded. With a poorly designed DAC -- and there are some made by highly-regarded audiophile brands -- I can imagine the quality of a USB cable making a sonic difference. (At least the idea is not intellectually disturbing.)

      I use a .5 meter USB cable that is known to test well (oscilloscope, not subjective). It was about $20, which is probably more than I needed to spend, but it provides complete peace of mind, allowing me to worry about other things. 😉

  6. I thought today’s post was going to be one causing me to think “whattttt??” but it does not. Bits-are-bits, I can agree with that, and absolutely then agree that it’s the noise of the systems (computer, switches, etc.) and the way the bits are rendered (computing power of your device, headroom, etc.) that makes the difference. It will be interesting now to see the rest of the day’s posts as others weigh in.

    1. Generally agree - eliminate surplus components, eliminate airborne noise, eliminate mains noise, use optical cable and let the DAC/streamer do the reclocking and the fun is nothing needs to be esoteric or expensive.

  7. As I said yesterday, modern telecom transmission and receiving are performed using packet switching technology. The data that is transmitted is sent from the source to the ISP, which sends it to the telecom company. There the data is broken down into packets in time by the telecom company's switches of the types I described yesterday. Each packet has a "header" containing the source and destination data and the "payload," which is the actual data being transmitted for the end user's application. The packets can each travel different routes. The networks must be synchronized in time and the packets must be reassembled in the correct sequence at the destination switch. Between the telco company and the ISP the signals are multiplexed, which also requires stable time-based clocks.
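The header/payload/reassembly structure described above can be sketched in a few lines of Python. This is a toy model, not any real protocol; the field names are illustrative only:

```python
# Toy model of packet switching: each packet carries a header (here just a
# sequence number) plus a payload; packets may arrive out of order over
# different routes and are reassembled by sequence at the destination.
import random
from dataclasses import dataclass

@dataclass
class Packet:
    seq: int        # header: position in the original stream
    payload: bytes  # the application data being carried

message = b"the bits arrive intact once reassembled in order"
packets = [Packet(i // 8, message[i:i + 8]) for i in range(0, len(message), 8)]

random.shuffle(packets)  # simulate packets taking different routes

# The destination sorts by sequence number to restore the original stream.
reassembled = b"".join(p.payload for p in sorted(packets, key=lambda p: p.seq))
assert reassembled == message
```

However the packets are routed, the reassembled stream is bit-identical to what was sent; the ordering lives in the headers, not in any analog property of the transport.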

    https://www.computerworld.com/article/2593382/networking-packet-switched-vs-circuit-switched-networks.html

    https://en.wikipedia.org/wiki/Packet_switching

    https://www.meinbergglobal.com/english/info/time-synchronization-telecom-networks.htm

    Whatever happened to "digital jitter," the bugaboo of audiophiles? Reclocking and waveform reshaping eliminate it.
    I'm not aware of how internet transmission checks and corrects for errors, but it must, because these transmissions are invariably received apparently error-free when functioning properly. RBCD uses belt and suspenders times ten to find and correct errors as an inherent part of the system.

    Since the messages have the source and destination data in the header and the payload contains the URL addresses of the sender and receiver, anyone tapping into the ISP network can determine who said what to whom. The US government monitors every email, text message, and telephone call in the United States and, at the request of some foreign governments, theirs too. It seems like an impossible task, but the US government has managed to do it with its vast resources.

    To protect the privacy of top secret US government messages, the United States Navy invented blockchain technology. This makes it impossible for anyone, including the US government, to track the source and destination URL addresses. Years later blockchain was released to the general public for their use. It is a critical element in cryptocurrencies like Bitcoin. It is used by whistleblowing informants to communicate with the press or authorities anonymously. It's also used for all kinds of criminal activity. Yet it is considered a net benefit and is even endorsed by Edward Snowden. To access much of it you have to download the Tor (Onion Router) browser. This gives you access to "the dark web." If you do, you can be 100% guaranteed that the US government will know what you are transmitting and receiving even if they don't initially know who you are. They'll entrap you if you are a criminal buying contraband such as drugs, arms, or child pornography, or engaged in people trafficking. Frizzledrip was posted on the dark web. If you know how to find it, you can see it. It is illegal to download it but apparently not illegal to watch it. Psychologists warn people not to watch it because it is so horrifying it will affect you very badly, possibly for life. Crime-hardened NYC cops cried when they saw it on Anthony Weiner's laptop.

    Another recent workaround for identity tracking is the VPN, or virtual private network. This is how foreign companies and some individuals were able to access the internet freely in China, getting around the Great Firewall, the government's internet censorship there. China has outlawed VPNs and is systematically shutting them down. This is one of a number of factors driving foreign companies to move out of China. Google is being investigated by Congress and the federal government for assisting China with censorship technology and AI while refusing to assist the US government with AI. Perhaps its execs will go to prison for treason. In fact, all of Silicon Valley is under increasing government scrutiny. The government wants to know how they got so big and so rich so fast. They may have to be broken up simply because they are much too powerful and are controlling the internet.

    The systems are becoming so complex that it is predicted that, in the not too distant future, the entire system will be run by AI. Humans will become increasingly obsolete. People are starting to think about how society will function when there is no more work left for humans to do.

    1. When there is no more work left for humans to do, we will all be actors in our own reality shows, destined to entertain the AI. In short, we will become the pets of our machines. But I trust there will be glitches and that some of us will remain rogue or feral.

  8. Hi Paul - great post. I’ve been in this hobby longer than you and probably longer than most, if not all, replying. I’ve seen quite a bit of audiophile BS over the years, but it pales in comparison to the computer audiophile BS. The only thing that makes a REAL difference is from the network bridge forward, not from the bridge back.

    Russ

  9. Though I believe bits are bits, nice cables are not only aesthetically pleasing; well-constructed connections are also nice as far as reliability over a lifetime. Cheap cables will do the job, but for how long?

  10. I must not understand today's post correctly, since I've found noticeable differences in sound characteristics (and quality) simply by changing a SATA cable for the music drive in my desktop computer, or changing the USB cable - or sticking in a LAN Rover or Regen - between my computer and DirectStream DAC. Or in the choice of USB cables between an SSD and Oppo in my living room system. Some reputable developers (and others) claim that there are sound differences between HDDs and SSDs, although that seems to be more related to isolation or perhaps other factors.

    1. highstream I think a number of us would not argue your point, but may say that the differences you hear may be due to better construction of the connectors, or filtering (such as with the LAN Rover or Regen), or cable shielding in general that might block or reduce noise or interference. These are secondary to what I thought was the primary point of the post: that having to do with the bits at each end of a cable being the same. I personally think these secondary issues of noise and filtering for me have been the most helpful in my sound quality improvements.

      1. But then it seems that no one really knows, do they, any more than anyone knows with other types of cable interconnections. At this point, it seems to be logical speculation: people think one variable is controlled with computers, storage drives, streaming - bits. I'm not an engineer or technician and don't pretend any special insight; I'm just reflecting back the arguments and logic I hear.

  11. Ethernet as a transport is connected only to the ASIC. Either the packet's CRC matches at the endpoint or it doesn't and the packet has to be retransmitted, which adds latency; if the buffer is too small, that can cause issues with playback. As long as the medium used, whether wired or Wi-Fi, doesn't force retransmission and is of sufficient speed to support the data sent (generally a 100Mb connection is adequate, and the most popular of all), then noise isolation depends on the quality of the playback device's design. Remember, cabled Ethernet has two pairs on which no data is used, and they can act as an antenna if the cable isn't terminated properly. The point of all this: use a decent switch with good quality cable and good terminations, or use Wi-Fi. Wi-Fi brings its own headroom issues, especially in today's oversaturated 2.4GHz range, with most routers using bad-neighbor technology with overlapping channel bonding, which interferes with other routers in close proximity and only adds to poor reception/packet transmission issues. If you feel your Wi-Fi performance is suffering, use a device with software that lets you see the signal strength of all the routers in close proximity to help you diagnose your router settings, or better yet, use wired connections.
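The CRC check described in that comment can be shown in miniature. Ethernet's frame check sequence happens to use the same CRC-32 polynomial as Python's `zlib` module, so a rough sketch (the frame bytes are made up):

```python
# CRC in miniature: compute a CRC-32 over a frame's bytes, then show that
# flipping a single bit is detected at the receiver. Ethernet's frame check
# sequence uses the same CRC-32 polynomial as zlib.crc32.
import zlib

frame = bytes.fromhex("0001020304050607") * 16  # stand-in for a frame payload
fcs = zlib.crc32(frame)                         # frame check sequence

# Receiver recomputes the CRC; a match means the frame is accepted.
assert zlib.crc32(frame) == fcs

# Corrupt one bit in transit: the CRC no longer matches, so the endpoint
# discards the frame and the data is retransmitted.
corrupted = bytearray(frame)
corrupted[10] ^= 0x01
assert zlib.crc32(bytes(corrupted)) != fcs
```

This is why a damaged frame never reaches the application as subtly wrong audio data: it either passes the CRC intact or is dropped and sent again.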
