HDMI Cable: An Overview

Ever since HDMI-capable devices started to come onto the market a few years ago, there have been a lot of questions--and a lot of misconceptions--about HDMI and HDMI cables. A "FAQ" on this subject, as we've found in trying to assemble one, would be so long that the better way, we think, to provide some answers is to simply address the major question groups: What is HDMI, anyway? What's in an HDMI cable? Why might I, or might I not, want to use an HDMI cable as opposed to, say, component video? What makes one HDMI cable better than another, and when does it really matter?

So, What is HDMI, Anyway?

HDMI stands for "High-Definition Multimedia Interface." The HDMI standard was developed by a consortium of consumer electronics manufacturers and content providers, to address what, from the content-provider industry's standpoint, was a serious problem: existing analog video formats such as component video are not easily copy-protected. HDMI, being digital, provides a perfect platform for the implementation of a copy-protection scheme (HDCP, or "High-bandwidth Digital Content Protection") which enables the content providers to limit the consumer's access to, and ability to copy, video content.

HDMI is a horrid format; it was badly thought out and badly designed, and its design failures are so apparent, and could have been addressed and resolved with so little fuss, that why they weren't is really anyone's guess. The key, though, has to be that the standard was not intended to provide a benefit to the consumer, but to content providers such as movie studios. It would have been in the consumer's best interests to develop a standard that was robust and reliable over distance, that could be switched, amplified, and distributed economically, and that connected securely to devices; but the consumer's interests were, sadly, not really a priority for the developers of the HDMI standard.

What's in an HDMI Cable?

The HDMI format is essentially a digital version of RGB analog video; the principal signal in an HDMI cable is carried on four shielded twisted pairs (yes, just like a CAT5 cable, but with shielding added), which carry a clock signal plus red, green and blue signals (though in some cases the red/green/blue division does not correspond neatly on a one-color-one-pair basis). Sync pulses, which tell the display where a line or frame ends or begins, are carried on the "blue" line. In some cases, rather than RGB video, HDMI carries Y/Pb/Pr "color-difference" video, which represents the same information as RGB, just conveyed differently. In addition to the data pairs, seven miscellaneous additional conductors carry some signaling and incidental functions (and, in some possible future applications, an Ethernet channel).
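For those who like to see it laid out, here is a rough sketch in Python of how the conductors in a standard (Type A) HDMI cable break down; the grouping of the four shielded pairs follows the description above, and the list of the seven miscellaneous lines reflects the usual Type A pin assignments.

```python
# Rough breakdown of the conductors in a standard (Type A) HDMI cable.
# The four shielded pairs carry the TMDS data and clock described above;
# the seven remaining single-ended lines handle signaling and housekeeping.

hdmi_conductors = {
    "shielded_pairs": [
        "TMDS data 2 (+/- plus shield)",   # nominally the 'red' channel
        "TMDS data 1 (+/- plus shield)",   # nominally the 'green' channel
        "TMDS data 0 (+/- plus shield)",   # nominally the 'blue' channel, with sync
        "TMDS clock  (+/- plus shield)",
    ],
    "miscellaneous": [
        "CEC (consumer electronics control)",
        "utility / HEAC (reserved; Ethernet and audio return in later versions)",
        "DDC clock (SCL)",
        "DDC data (SDA)",
        "DDC/CEC ground",
        "+5 V power",
        "hot plug detect",
    ],
}

for group, lines in hdmi_conductors.items():
    print(f"{group}: {len(lines)} conductors")
```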

Why Might I Want to Use HDMI?

HDMI cable is often the handiest way to connect two devices; at the moment, that's really the best reason to use it. However, in the future, it may become necessary to use HDMI connections with certain devices, or certain recorded media, in order to get full HD content. Beyond that, there aren't a lot of compelling reasons to use HDMI as your connection method. Most of the arguments we hear are based upon common misconceptions about the benefits of HDMI, and one really needs to get past those to understand just what the real reasons to use--or not to use--HDMI are.

HDMI Myths and Misconceptions:

1. "Only HDMI carries High-Definition Signals." Wrong, wrong, wrong. Analog component video and RGB both support high-definition resolutions, and what's more, they're more robust and dependable over distance. There likely will be cases, in the future, where high-definition signals are available from certain source recordings only through the HDMI port, and only downconverted standard-definition video will be available on analog outputs. However, as of this writing, none of the recordings available on high-definition disc formats have the "flag" set to limit HD output to HDMI. Some "upconverting" DVD players will output their upconverted signals only on HDMI, but the value of DVD-player upconversion ranges from dubious to clearly negative as in most cases it only adds an additional rescaling step to the signal chain.

2. "HDMI provides a pure uncompressed HD signal." This is one of those statements which is true if taken in a wholly irrelevant sense, and untrue if taken in its only meaningful sense. Unless you work in a video production facility, chances are that you've never seen uncompressed HD video. That's a shame, because it's gorgeous; side-by-side comparison with, say, an ATSC broadcast signal or a Blu-Ray signal can be a rude awakening, and just serves to highlight how heavily-compressed and artifact-laced all of the HD video sources we view are. No broadcast, and no recording medium, on the consumer market provides uncompressed HD video, and none are likely to do so in the near term.

So what is meant by the assertion that the HDMI signal is uncompressed? What this too-often-repeated statement actually means is that the signal is not further compressed when it is translated from its source format to HDMI. But the same is true of all source-to-display baseband video formats; component video and RGB are not compressed after the signal is decoded from a DVD or a broadcast signal. The assertion that HDMI is "uncompressed HD video" means only, then, that HDMI is no worse in this respect than any competing video format.
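To put some rough numbers on just how much compression is involved, here's a bit of back-of-the-envelope Python; the frame format and the ATSC and Blu-ray rates used are typical ballpark assumptions rather than exact figures for any particular source or disc.

```python
# Back-of-the-envelope comparison of uncompressed HD video with typical
# consumer delivery bit rates (all of the figures here are approximate).

width, height = 1920, 1080      # 1080p frame size
fps = 60                        # frames per second
bits_per_pixel = 24             # 8 bits each for three color channels

uncompressed_bps = width * height * fps * bits_per_pixel
print(f"Uncompressed 1080p/60: {uncompressed_bps / 1e9:.2f} Gb/s")

# Rough, typical delivery rates (assumptions, not exact specs):
atsc_broadcast_bps = 19.4e6     # ATSC multiplex ceiling
bluray_video_bps = 30e6         # a fairly generous Blu-ray video rate

for name, rate in [("ATSC broadcast", atsc_broadcast_bps),
                   ("Blu-ray video", bluray_video_bps)]:
    print(f"{name}: ~{rate / 1e6:.0f} Mb/s "
          f"(roughly {uncompressed_bps / rate:.0f}:1 compression)")
```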

3. "When I use a digital source, I get a pure digital-to-digital signal chain using HDMI." This, again, is true in an essentially meaningless sense, and untrue in the sense in which most people actually understand it. The assumption behind the statement is that the signal flows, unaltered and without degradation, from a digital source to a digital display without ever being converted, and that by eliminating these conversions--specifically, digital-to-analog conversions--one gets a better picture. But the HDMI signal is not the same as the signal recorded on a DVD, or sent in an ATSC or QAM transmission; all of those are compressed formats which encode video in an entirely different way from HDMI. Accordingly, to get from the one to the other requires decoding and conversion. In every case, the signal is decoded and rendered as a video stream. If the original signal is in one resolution, and the output format is in another, the image will be rescaled; if the original signal is recorded in one colorspace and the output format is in another, it'll be converted. There is nothing inherently perfect or error-free about digital-to-digital, as opposed to digital-to-analog, scaling and conversions, and some things -- scaling, in particular -- are often more easily done well in the analog domain than in the digital domain.

So, yes, a DVD player putting out an upscaled HD resolution through an HDMI cable into a plasma display is an "all-digital" signal chain--but it's an all-digital chain in which the colorspace is being converted, the original signal is being decoded and converted to another format, and the image is being rescaled not once, but twice along the way. Is doing this digitally superior to doing it in a chain that involves analog conversions? In any particular case it may be, or it may not be, but there's no reason in principle to think that it necessarily will.
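As one concrete illustration of what "the colorspace is being converted" means in practice, here is a minimal Python sketch of the standard BT.709 YCbCr-to-RGB conversion that HD material typically undergoes on its way to an RGB display; the matrix coefficients are the standard BT.709 values, but the function and the sample pixel are just illustrative scaffolding.

```python
import numpy as np

# One example of the kind of conversion that happens inside an "all-digital"
# chain: YCbCr (BT.709, as used for HD sources) converted to RGB for display.
# The coefficients below are the standard BT.709 values; everything else here
# is illustrative scaffolding.

def ycbcr709_to_rgb(y, cb, cr):
    """Convert normalized Y (0..1) and Cb/Cr (-0.5..0.5) to RGB (0..1)."""
    r = y + 1.5748 * cr
    g = y - 0.1873 * cb - 0.4681 * cr
    b = y + 1.8556 * cb
    return np.clip([r, g, b], 0.0, 1.0)

# A saturated red pixel expressed in YCbCr, converted back to RGB:
print(ycbcr709_to_rgb(0.2126, -0.1146, 0.5))   # -> roughly [1, 0, 0]
```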

4. "Because HDMI is a digital signal, it doesn't degrade when run over a long distance like an analog signal does, because it's just ones and zeros." Yikes! Not true at all. To explore this issue calls for a bit more detailed discussion.

First, it's true that if a digital video signal stays intact from one point to another, there's no degradation of the image. The digital signal itself can degrade, in purely electrical terms, quite a bit over a distance run, but if at the end of that run the bitstream can be fully and correctly reconstituted, it doesn't matter what degradation the signal suffered--once that information is reconstituted at the receiving end, it's as good as new.

That's a big "if," however. Ideally speaking, digital signals start out as something close to a "square wave," whose instantaneous transitions from one voltage to another signal the beginnings and ends of bits. (In practice, such transitions aren't strictly possible, and trying to achieve them can generate harmful noise; consequently, high-order harmonics are usually filtered out, which results in the wave starting out squarish but not quite square.) A square wave, unfortunately, is impossible to convey down any transmission line because it has infinite bandwidth; to convey it accurately, a cable would have to convey all frequencies, out to infinity, all at the same level of loss ("attenuation"). What happens, therefore, in any run of cable is that a digital signal starts out looking relatively nice and somewhat square, and comes out the other end both weaker and rounded-off. The transitions that mark the edges of bits get smoothed and leveled to the point that, far from that ideal square wave, they look like relatively gentle slopes. Portions of the signal lost to impedance mismatch bounce around in the cable and mix with these rounded-off slopes, introducing an unpredictable and irregular component to the signal; crosstalk from the other pairs in the HDMI bundle also contributes uneven and essentially random noise. As a result, what arrives at your display doesn't look very much like what was sent.
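To see how bandwidth limiting rounds off those bit edges, here's a small Python sketch that builds a "square wave" out of a limited number of harmonics--a crude stand-in for a cable that attenuates high frequencies. The numbers are arbitrary; this illustrates the principle rather than modeling any real HDMI link.

```python
import numpy as np

# Build an approximate square wave from only its first few odd harmonics.
# Fewer harmonics = less bandwidth = slower, more rounded bit edges.

t = np.linspace(0, 1, 2000)     # one period of the waveform
f0 = 1.0                        # fundamental frequency, arbitrary units

def bandlimited_square(t, n_harmonics):
    """Sum the odd harmonics of an ideal square wave up to n_harmonics."""
    wave = np.zeros_like(t)
    for k in range(1, n_harmonics + 1, 2):   # odd harmonics only
        wave += (4 / (np.pi * k)) * np.sin(2 * np.pi * k * f0 * t)
    return wave

sharp = bandlimited_square(t, 101)    # many harmonics: crisp transitions
rounded = bandlimited_square(t, 5)    # few harmonics: gentle, rounded slopes

# Compare how long the rising edge takes to reach 90% of full swing:
for name, w in [("wide band", sharp), ("narrow band", rounded)]:
    t90 = t[(w > 0.9).argmax()]
    print(f"{name:12s} edge reaches 90% at t = {t90:.4f} periods")
```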

Now, as we've said, up to a point, this degradation along the HDMI cable won't matter; the bitstream gets accurately reconstituted, and the picture on your display is as good as the HDMI signal can make it. But when it starts to fail, it starts to fail conspicuously and dramatically. The first sign of an HDMI signal failure is digital dropouts--these are colloquially referred to as "sparklies"--where a pixel or two can't be read. When these "sparklies" are seen, total failure is not far away; if the cable were made ten feet longer, there's a chance that so little information would get through that there would be no picture on the display at all.

The shame is that, with HDMI, this is prone to happen at rather short lengths. When DVI was first introduced (same basic encoding scheme, same cable structure, but a different connector from HDMI), it was hard to find cables that were reliable in lengths over 15 feet. It didn't help that these multipin cables aren't economical to manufacture in the US and so were produced exclusively in China; Chinese cable manufacturers are very good at keeping costs down, but not the best at keeping tolerances tight. Today, a good HDMI cable can be relatively reliable up to about 50 feet, but because different devices tolerate signal degradation differently, it's impossible to say categorically that a 50-foot cable will work; it's only possible to say that it will work with most devices.

Why is that? Well, it all has to do with bad design. The designers of the HDMI standard didn't really think much about the cable problem, and it shows. This topic is fairly complex in itself, so we've split it out into a separate article: What's the Matter with HDMI Cable?

Analog video signals, contrary to what seems to be the conventional wisdom in home theater circles, are extremely robust over distance. We have run component video for hundreds of feet without observable degradation; the bandwidth of precision video coax, rather than being horribly overtaxed like that of an HDMI cable, is greatly in excess of what's needed to convey any HD signal. It is true that an analog signal degrades progressively with length; but that degradation, in practice, is so slight and slow that it rarely gives rise to any observable image quality loss in home theater applications.

5. "An HDMI connection is always superior to an analog component video connection." Not so, for the reasons we've addressed above. Further, we've noticed that it's not at all uncommon for the HDMI input to a display to be calibrated very differently from the analog inputs. One plasma display we set up looked very bad when fed an HDMI signal--scenes became washed-out and slightly greenish, and the black level was set all wrong so that high-contrast scenes really had no black to them at all, just a sort of muddy-gray color. After some display tweaking, we were able to rehabilitate the HDMI input so that it looked as good as the component video input--but depending on what calibrations are available to you, how your display's been set up, and ultimately perhaps upon some subjective aesthetic considerations, it's not necessarily always going to be possible to get your best picture out of an HDMI input. Whether it looks better, or worse, than the component video input in any particular case will depend on the source, the display, the calibration of the source, the calibration of the display, and, ultimately -- since these matters can be somewhat subjective -- your judgment.

One note: HDMI will almost always look better than an s-video or composite (not component!) video input. S-video and composite video are both limited to 480i resolution, and do not render color as well as a three-color format like component video or HDMI.

What Makes one HDMI Cable Better than Another, and Does it Matter?

HDMI cable quality is a bit complicated, and unfortunately, it's hard to judge from a spec sheet, especially because very few manufacturers provide any useful product specs. There are a few things to bear in mind.

At present, to our knowledge, all of our competitors' HDMI cables are built in China (for more detail on this point, see this article). The Blue Jeans Cable Belden-based HDMI cables are the only HDMI cables which are manufactured, in principal part, in the USA (for a variety of practical and economic reasons, we have been unable to do the cable termination in-house and so rely on Chinese vendors for connectorization). We are often told that some brand or other of HDMI cable is manufactured in the US, and in every case, we've found that not to be so; rather, what often happens is that while the cable is sourced from China, the marketing materials obscure the fact. Don't let the fact that an HDMI cable bears a U.S. brand name lead you to believe that that HDMI cable contains American products, American labor or American know-how; none of them, other than ours, do. And China may be an easy place to get a good price, but it is not a good place to get a leading-edge technological product; for top-quality data cables (and HDMI is a data cable), the US is still the place to go.

The Chinese source problem makes it very hard to get a spec sheet, and very hard to know what that spec sheet means, when dealing with an HDMI cable. Most vendors of HDMI cable in the US don't know what attributes would make a good HDMI cable, and since they don't participate in the manufacture beyond specifying jacket printing and the shape of the molded connector, they don't really have much reason to find out. The result is that most citations to product spec that one finds in connection with the sale of HDMI cable are references to the product's wire gage. Wire gage is somewhat meaningful, but judging HDMI cable quality by comparing wire gage is like judging automobile quality by comparing engine block length--a very, very inexact way of looking at the problem.

The primary work of an HDMI cable is done by the four shielded twisted pairs which carry the color, sync, and clock signals. The designers of the HDMI standard made an error of judgment in running these signals balanced, in twisted pairs, rather than unbalanced, in coaxes; attenuation (the tendency of the signal to get weaker with distance) is much greater, and impedance and timing are harder to control, in twisted pairs than in coax. Control of the cable impedance is critical to keeping the rounding of the bit edges under control; the more the impedance wanders off of spec, the more the signal will round, and the closer the cable comes to failure. Where a coaxial cable's impedance can be controlled within two percent of spec, it's a challenge to keep a twisted pair any tighter than about 15% plus or minus.

The HDMI signal will fail if attenuation is too high, or if the bit transitions become excessively rounded so that the receiving unit can't reconstitute them accurately. There's no really reliable benchmark for just how much attenuation is acceptable, or how round the shoulders can be, before the "sparklies" will start. (Yes, there are specs for these things in the official HDMI spec document, but real-world devices vary so much that meeting the spec is no guarantee of success, while failing it is no guarantee of actual failure.) But while wire gage has something to do with the former, it's really the latter that's important; and wire gage has nothing to do, at least directly, with impedance control.

Transmission line impedance, in any cable, is dependent on the cable's materials and physical dimensions. For purposes of an HDMI cable, these are:
1. the shape and size of the paired wires;
2. the thickness, and dielectric properties, of the insulation on the paired wires;
3. the dimensions of the shield over the pair.
These seem, in principle, like simple things to control--that is, until one spends a bit of time in a wire and cable factory and finds out just how many little problems there are.

Wire is never perfect; its dimensions and shape vary from point to point, and small dimensional variations can make for significant impedance changes. Wire can suffer from periodicity (in fact, strictly speaking, it not only can, but always, at some level, does) because (for example) it's been drawn over a wheel that was microscopically out-of-round, and that periodicity will cause the wire to resonate at particular wavelengths, which can really wreak havoc. The plastic dielectric has to be consistently extruded to the correct diameter (and thousandths of an inch matter here!); if it's foamed, it needs to have highly consistent bubble size so that one side of the dielectric isn't airier than another, or one foot airier than the next. The two wires in the pair need not to wander in relation to one another; as they "open up" or are pressed tightly together because of tensioning on the wire-twisting machine (or tension applied to the cable by other handling, or by shield application, or...), or because the finished cable is being flexed, the impedance changes. The shield is a factor in the impedance as well, because both signal wires have capacitance to the shield, and if the foil is wrapped more tightly in one place and more loosely in another, that, too, will cause impedance to vary.

(And these are just a few of the obvious problems; manufacturing processes involve other problems that nobody not involved in manufacturing would ever think of. For example, the lube that's used to assist in wire drawing needs to be washed off the wire before dielectric is extruded over it; what if the side from which a jet of cleaner is fired at the wire gets cleaner than the opposite side, and the dielectric winds up conforming differently to one side of the cable than the other? What about the other thousand things you and I, not working in a wire factory, have never even begun to think about?)

As a result, although every manufacturer's HDMI cable is built to meet a nominal 100 ohm characteristic impedance, every foot of every cable is different from every other. The best one can do is to hold impedance within a range, centered on 100 ohms; the official HDMI spec calls for 100 ohms plus or minus 15%, which for a coax would be horribly sloppy. The tighter that tolerance can be kept, the better the performance will be.
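To get a feel for how touchy these dimensions are, here's a bit of Python arithmetic using the textbook parallel-wire impedance approximation; it ignores the shield and the twist, and the dimensions and dielectric constant are assumed round numbers, so it shows the trend rather than modeling a real shielded HDMI pair.

```python
import numpy as np

# Approximate differential impedance of a parallel wire pair:
#   Z0 = (120 / sqrt(er)) * arccosh(s / d)
# where s is center-to-center spacing and d is conductor diameter.
# This neglects the shield and the twist, so treat it as a trend only.

def pair_impedance(spacing_mm, diameter_mm, er):
    """Differential impedance of a parallel wire pair, in ohms (approximate)."""
    return (120.0 / np.sqrt(er)) * np.arccosh(spacing_mm / diameter_mm)

er = 1.9     # assumed effective dielectric constant
d = 0.50     # assumed conductor diameter, mm
s = 1.00     # assumed center-to-center spacing, mm

z_nominal = pair_impedance(s, d, er)
z_squeezed = pair_impedance(s * 0.95, d, er)   # pair pressed 5% closer together

print(f"nominal:  {z_nominal:6.1f} ohms")
print(f"squeezed: {z_squeezed:6.1f} ohms "
      f"({100 * (z_squeezed - z_nominal) / z_nominal:+.1f}%)")
```

Even in this simplified picture, squeezing the pair a few percent closer together moves the impedance by several percent--and a real cable is subject to dozens of such variations at once.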

Worse still, impedance is not a one-dimensional characteristic. HDMI cable operates over an enormous frequency bandwidth, and impedance in a twisted pair is frequency-dependent (in a coax it is, too, but far, far less so). A twisted pair's impedance will rise relative to frequency; how much it will do so, and how evenly and regularly, will depend upon subtle physical characteristics. So, strictly speaking, no cable can actually be within tolerance for impedance over the whole operating range of the cable; it can only be within tolerance by the method the spec designates for measurement.

Impedance control is important for another reason: timing. As impedance varies, so will the time it takes a signal to travel down the cable. Electricity travels at nearly the speed of light; how close to the speed of light it travels depends on the dielectric, and is referred to as the "velocity of propagation." The objective, in putting together the four pairs in an HDMI cable, is to have them be identical; but in actual practice, each pair in a four-pair set will have its own delay. If the delay of one pair is sufficiently greater than the delay of another pair, the receiving device will not know which "red" pixel belongs to which "blue" and "green" pixel, or if the clock circuit is off, it may be impossible to time any of the color signals reliably. Since this delay depends on the consistency and dimensions of the dielectric, and the consistency and dimensions of the dielectric are important factors in impedance, the same requirement for consistent impedance applies here; if impedance is too inconsistent, timing will be too inconsistent, and the whole system will fail.
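A little arithmetic shows why this matters. In the sketch below, the cable length and velocity-of-propagation figures are assumptions chosen for illustration; the 148.5 MHz clock, times ten bits per clock, is the usual per-pair bit rate for 1080p/60.

```python
# Rough arithmetic on inter-pair skew. The cable length and velocity figures
# are assumptions for illustration; 148.5 MHz x 10 bits per clock is the
# customary per-pair rate for 1080p/60.

c = 299_792_458                   # speed of light, m/s
length_m = 10.0                   # assumed cable length

vp_fast, vp_slow = 0.78, 0.76     # assumed velocities of propagation (fractions of c)

delay_fast = length_m / (vp_fast * c)
delay_slow = length_m / (vp_slow * c)
skew = delay_slow - delay_fast

bit_rate = 148.5e6 * 10           # bits per second on each TMDS pair
bit_period = 1 / bit_rate

print(f"skew between pairs: {skew * 1e12:.0f} ps")
print(f"bit period:         {bit_period * 1e12:.0f} ps")
print(f"skew = {skew / bit_period:.1f} bit periods")
```

With those assumed numbers, a small difference in velocity between two pairs adds up, over ten meters, to a skew of more than one full bit period--exactly the situation in which the receiver can no longer line the channels up.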

One way of looking at cable performance is to chart the attenuation for a given length of cable against frequency. For any cable, attenuation (measured in dB) will increase with frequency; this attenuation comes from a few factors. Loss to resistance goes up with frequency, because higher-frequency signals are able to use less and less of the cross-section of the wire (this is known as "skin effect") and so have less copper to travel through. Losses to reactance--capacitance and inductance--also increase with frequency.

Then, what we call "return loss" adds the most irregular, and difficult-to-control, component to the loss. "Return loss" is the loss to impedance mismatch, and is so called because it represents the portion of the signal which, upon encountering a change in the impedance of the circuit (this may be a change in impedance along the cable, a change of impedance on entering or leaving a connector or a circuit board trace, or a different impedance than expected at the load end of the circuit), reflects back along the cable towards the source rather than being delivered to the load. While basic resistive and reactive losses are pretty reliable and have a definite relationship to frequency, return loss can be quite irregular. A graph of return loss against frequency, rather than showing a nice, consistent curve, is characterized by sharp, spiky lines.

Why is this? Well, return loss has to do, more than anything else, with those manufacturing tolerances and their impact upon impedance. Every wire, at some level, has some periodicity, and so resonates somewhat at some unintended frequency. Every dielectric extruder fails, at some level, to extrude the dielectric consistently; every spooler that winds wire or dielectric-covered wire, every wire twister, every unreeler that handles that wire as it goes back into another stage of processing, every foil-wrap and drain-wire machine, every planetary cabler (which bundles and twists the pairs together with one another), every jacket handler and extruder--all of these machines, in all of these processes, apply microscopic irregularities to the cable which show up as return loss. Return loss can't be eliminated, at least not in a real-world cable; but it can be, within limits, made as small, and as consistent across a range of frequencies, as possible.
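For those who want the formula, return loss at a single mismatch is calculated from the reflection coefficient, as in the short Python sketch below; the impedance figures chosen are just examples sitting within the plus-or-minus 15% tolerance mentioned earlier.

```python
import math

# Return loss at an impedance discontinuity: the reflection coefficient is
# (Z - Z0) / (Z + Z0), and return loss is -20 * log10 of its magnitude.
# Higher dB means less of the signal is reflected.

def return_loss_db(z, z0=100.0):
    gamma = abs((z - z0) / (z + z0))    # fraction of the voltage reflected
    return -20 * math.log10(gamma)

for z in (95.0, 85.0, 115.0):
    print(f"{z:5.1f} ohm section in a 100 ohm system: "
          f"return loss {return_loss_db(z):5.1f} dB")
```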

Generally speaking, devices handle very linear or predictable losses very well. If one knows that one part of a signal will come in a thousand times weaker than another part, it's easy to "EQ" the incoming signal to boost the weak part to match the level of the strong part. But return loss can't be EQ'd out because it's too uneven and unpredictable.

Return loss, not resistance, is the critical consideration in determining the quality of an HDMI cable; if one were comparing cables with similar resistance, capacitance, and inductance values against one another, and consulting a chart of attenuation relative to frequency, what one would generally see would be that cables with superior return loss characteristics would show a flatter attenuation curve than the others. This is very important in HDMI because the required bandwidth for an HDMI signal is enormous, and the higher the frequency, the harder it is to control return loss.

Generally, in looking at HDMI cable products currently available on the market, we've found that these issues get overlooked. Instead of trying to control impedance well, which will result in flattening the curve on the attenuation chart, manufacturers generally try to control resistance. Why? Well, resistance is a lot easier to control. Bigger wire (smaller AWG number) has less resistance, and choice of materials can play a role, too (silver-plated copper is lower in resistance than bare copper, and bare copper is lower in resistance than tin-plated copper, for example). But as the frequency demands placed on the cable increase, bigger wire doesn't really help all that much (and, for a whole slew of reasons having to do with manufacturing process control, it can actually hurt), because it's not the total loss that's limiting performance; it's the non-linear component of the loss that's the real problem. With return loss specs not generally available for Chinese-sourced cable, one often can't get a good idea exactly what basis there is for comparison between two HDMI cables.

So, how does one compare? We provide a spec sheet for our HDMI cable, but you will find, if you go looking for similar specs on competing products, that spec sheets are few and far between. Another basis for comparison is to get your vendor to send you a copy of his compliance testing certificate, showing the length and testing "category" of his HDMI cable approval--but, for various reasons, many vendors cannot or will not supply this information. Sometimes, all one can do is try a cable on a given set of devices, and see if it works. That may not be the most satisfying answer, but it is often the only answer there is.
