Ever since the HDMI 2.0 spec was launched, we've had plenty of questions about cabling for HDMI 2.0. If you want to cut to the chase, there's a quick and simple answer which disposes of most of these questions: any cable which is compliant under previous versions of the spec is compliant under HDMI 2.0. In practice as well as in theory, however, support for the new resolutions that produce higher bitrates will probably require a compliant "high-speed" (a.k.a. "Category 2") cable.
There is, therefore, for most users, no reason to rush out and buy new cables to handle HDMI 2.0. It's quite possible, for reasons we'll get to, that you'll find you need to do so, but it's by no means a foregone conclusion.
As many readers will already know, under the HDMI 1.3 spec, two "Categories" of HDMI cable were recognized. Category 1, or "Standard," HDMI cable is tested and certified to handle the lower bitrates associated with conventional 720p or 1080i HD video and the like. Category 2, or "High Speed," HDMI cable is tested all the way out to the highest bitrate allowed under the 1.3 and 1.4 specs: 3.4 Gbps (Gigabits per second) on each data pair, a figure also commonly quoted as the sum of the three pairs' bitrates, 10.2 Gbps.
3.4 Gbps is a crazy amount of data to shove down a twisted pair, especially given that HDMI uses a simple binary signaling scheme with only two values on the wire (that is, literally 'ones and zeros'). Other standards which carry high-speed data over twisted pairs, such as 10GBaseT, do it through a mix of techniques, including multilevel encoding, to cut down on the frequency bandwidth required to carry the data. But that's not what was done in HDMI 2.0. The problem facing the authors of the new spec, with demand building for such things as 4K video, was how to get still more data down that already-overburdened pipe -- 6 Gbps per pair, for a total throughput of 18 Gbps.
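Where do these bitrate figures come from? HDMI's TMDS signaling sends 10 bits down each of the three data pairs on every tick of the pixel clock (each 8-bit value is expanded to 10 bits on the wire), so the per-pair bitrate is simply the pixel clock times ten. A quick sketch of the arithmetic, using the standard pixel-clock figures for some common formats:

```python
# TMDS bitrate arithmetic for HDMI: each pixel clock tick carries
# 10 bits per data pair (8b/10b-style TMDS encoding), over 3 pairs.
BITS_PER_CLOCK = 10
DATA_PAIRS = 3

def per_pair_gbps(pixel_clock_mhz):
    """Bitrate on each TMDS data pair, in Gbps."""
    return pixel_clock_mhz * BITS_PER_CLOCK / 1000.0

def total_gbps(pixel_clock_mhz):
    """Sum of the three pairs' bitrates -- the headline number."""
    return per_pair_gbps(pixel_clock_mhz) * DATA_PAIRS

# Standard pixel clocks (MHz), 8-bit color:
print(per_pair_gbps(74.25))   # 720p60 / 1080i60 -> 0.7425 Gbps/pair (Category 1 territory)
print(per_pair_gbps(297.0))   # 4K at 30 Hz      -> 2.97 Gbps/pair (within Category 2's 3.4)
print(per_pair_gbps(594.0))   # 4K at 60 Hz      -> 5.94 Gbps/pair (needs HDMI 2.0's 6 Gbps)
print(total_gbps(594.0))      # -> 17.82 Gbps, just under the 18 Gbps ceiling
```

The last two lines show why HDMI 2.0 needed the jump to 6 Gbps per pair: 4K at 60 Hz lands at 5.94 Gbps per pair, well past what Category 2 testing covers.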
How to do that? Well, there were a variety of options. Multilevel encoding might have helped. There is a dual-link HDMI connector, "Type B," which has never been implemented but which would have doubled the number of video-carrying data pairs from three to six; it might have been deployed. Some form of compression might have been used. But the spec authors decided to stick with the already troublesome standard single-link HDMI cable and with simple binary encoding of an uncompressed pixel stream, and to increase the allowed bitrate to 6 Gbps per pair without any alteration in cable design.
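To see why multilevel encoding was an attractive option the spec authors passed over: a signal with 2^n levels carries n bits in every symbol, so a four-level scheme (PAM-4, used by some high-speed Ethernet standards, but not by HDMI) would move the same bitrate at half the symbol rate, easing the frequency-bandwidth burden on the cable. A minimal illustration:

```python
import math

def symbol_rate_gbaud(bitrate_gbps, levels):
    """Symbol rate needed to carry a given bitrate: each symbol
    carries log2(levels) bits."""
    bits_per_symbol = math.log2(levels)
    return bitrate_gbps / bits_per_symbol

# HDMI's plain binary signaling (2 levels) at 6 Gbps per pair:
print(symbol_rate_gbaud(6.0, 2))   # -> 6.0 Gbaud
# A hypothetical 4-level scheme carrying the same 6 Gbps:
print(symbol_rate_gbaud(6.0, 4))   # -> 3.0 Gbaud, half the rate on the wire
```

HDMI 2.0 took the first row: same two-level signaling, pushed to a higher symbol rate, over the same cable.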
What one would naturally expect, given that decision, was a new "Category 3" cable certification for cables handling bitrates above the 3.4 Gbps of HDMI 1.3 and HDMI 1.4. Both the Category 1 and Category 2 tests involve feeding a worst-case output signal at the high end of the bitrate range through a cable, observing how it comes out, and comparing the result against a minimum standard. A "Category 3" test, by this model, would have involved generating a 6.0 Gbps stream, feeding it through the cable, probably subjecting it to some new equalization function, and then evaluating the output. But the spec authors had a different notion of how to proceed.
The HDMI 2.0 spec contains a description of a mathematical model of signal degradation called the "Worst Cable Emulator." The idea of the Worst Cable Emulator is that if a cable has been shown to pass Category 2 testing at 3.4 Gbps, one can make some assumptions about how it will perform at higher bitrates -- so the Worst Cable Emulator is essentially a formula which characterizes what the worst possible Category-2-passing cable would do at bitrates up to 6.0 Gbps. And then, in a sort of reversal of affairs, instead of requiring the cable to meet a spec determined by the quality of the input and output circuits of the connected devices, the 2.0 spec requires sources and "sinks" (receiving devices) to be able to handle whatever signal degradation the Worst Cable Emulator deals out.
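The actual Worst Cable Emulator formula lives in the HDMI 2.0 spec itself, but the flavor of the idea can be sketched. Copper cable insertion loss is commonly modeled as a skin-effect term growing with the square root of frequency plus a dielectric term growing linearly with it; given such a model, a cable's measured behavior at Category 2's top rate can be projected out to HDMI 2.0's. The coefficients below are invented for illustration and are not from the spec:

```python
import math

# Hypothetical worst-case cable loss model -- illustrative only.
# Loss = skin-effect term (~ sqrt(f)) + dielectric term (~ f).
A_SKIN = 8.0        # dB per sqrt(GHz); assumed, not from the spec
A_DIELECTRIC = 1.0  # dB per GHz; assumed, not from the spec

def loss_db(freq_ghz):
    """Modeled insertion loss of the worst Category-2-passing cable."""
    return A_SKIN * math.sqrt(freq_ghz) + A_DIELECTRIC * freq_ghz

# The highest frequency content of a binary stream sits at half the bitrate:
f_cat2  = 3.4 / 2   # 1.7 GHz -- the top of Category 2 testing
f_hdmi2 = 6.0 / 2   # 3.0 GHz -- the top of HDMI 2.0

print(loss_db(f_cat2))    # loss the Category 2 test actually measured
print(loss_db(f_hdmi2))   # loss the emulator assumes at the higher rate
```

The second number is the one that matters under HDMI 2.0: rather than re-testing the cable at 3.0 GHz, the spec obliges sources and sinks to tolerate that projected worst-case degradation.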
In this way, the designers of the spec avoided needing to create a third tier of mandatory HDMI cable compliance: any genuinely certified Category 2 "High Speed" HDMI cable should carry ANY bandwidth called for by any HDMI 2.0 compliant device. This is a bit of a cheat -- one cannot really extrapolate the high-frequency performance of a cable from its lower-frequency performance -- and the cheat did not go unpunished. After discovering that some valid "high speed" cables (and, of course, the many counterfeit "high speed" cables out there in the market) will NOT handle 18.0 Gbps, HDMI Licensing created the optional "Premium HDMI Cable" certification.
The Premium HDMI Cable specification requires testing the cable, under specific conditions, up to 18.0 Gbps, ensuring performance that will guarantee function, when hooked to spec-compliant devices, all the way out to that maximum bitrate. While any "high speed" cable is supposed to handle any HDMI 2.0 signal, only a Premium HDMI Cable has actually been tested and certified to that rate, and the labeling requirements for Premium HDMI Cable include a counterfeit-proof label which assures the purchaser that the cable really does bear the certification as claimed. More details on what the Premium HDMI Cable program is about, and what it means, can be seen in our article: Premium HDMI Cables.
Do you actually need a Premium HDMI Cable? Well, HDMI is, as it has always been, a signal type where it's pretty easy to tell when things are going wrong. If you see "sparkles" where bits are dropping out, whole-line dropouts, a jumping or flashing picture, or no picture at all, you've got some sort of problem, and it's very possible that an improvement in HDMI cable quality will resolve it. If you see none of those problems when running the highest-bitrate signals you need to run through your system, you're good to go. If you do see them, or if you simply want a cable that's actually been tested to the very limits of what HDMI can do, a certified Premium HDMI Cable may be the ticket.