Paralleled HDMI Cable Connection Improves Video Quality


I obtained two female HDMI Y-splitters and used them to connect my Oppo BDP-83SE Blu-ray player to my UHD/4K monitor using two inexpensive 6’ lengths of 4K HDMI cable from CZoom.

The resulting Blu-ray video presentation showed improved resolution and enhanced color contrast with this configuration. The change in video quality was like going from 2K to something noticeably higher than 2K (2K+). I don’t know whether audio quality improved, as I haven’t connected my monitor to external speakers or a home audio/video system. I’ll see whether the paralleled HDMI cable connection improves video quality from a UHD/4K player with a UHD/4K Blu-ray disc.

This connection is analogous to the Schroeder Method of IC placement, as discussed below:
https://www.dagogo.com/audio-blast-schroeder-method-interconnect-placement/

https://forum.audiogon.com/discussions/doug-schroeder-method-double-ic

That it worked at all in the HDMI context was surprising, and I’m using very inexpensive parts for this evaluation: the CZoom 4K HDMI cables cost ~$15 per cable on Amazon, and the HDMI Y-splitters ran ~$8 per splitter on eBay. One wonders whether greater improvements could be wrought with better-quality components or an integrated dual-HDMI assembly.

I would be interested in learning from others who try this technique. 

celander
I use a $50 Monster Cable HDMI between my TV and my Freebox Delta Devialet. The sound and the video are better with this cable.
No. 
 
It is literally impossible for the colors in the video feed to look different.
Funny, but we have had the same result when we inserted a version of our cables into a video feed chain.

And this could well be related to the significant increase in bandwidth that our cables, or a doubled or paralleled wire assembly, bring to the table (relative to cables using bog-standard solid wire and/or standard assemblies).

Visual resolution in video systems is defined as the smallest detail that can be seen. This detail is related directly to the bandwidth of the signal: The more bandwidth in the signal, the more potential visual resolution. The converse is also true: The more the signal is band-limited, the less detail information will be visible. This article addresses this relationship and provides a straightforward way to determine the bandwidth required to achieve the desired visual resolution.
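
To put rough numbers on that relationship for a digital link like HDMI: an uncompressed format’s required link rate scales with pixel count, refresh rate, and bit depth. A minimal sketch in Python, assuming RGB/4:4:4 video, a ~10% blanking allowance, and TMDS’s 10-bits-per-byte encoding overhead (the blanking figure is a rough placeholder; real timings vary):

```python
# Rough sketch: how video resolution translates into required link bandwidth
# on an uncompressed digital link. Overheads are illustrative assumptions.

def required_link_rate_gbps(h_active, v_active, refresh_hz,
                            bits_per_component=8,
                            blanking_overhead=1.10,     # assumed ~10% blanking
                            encoding_overhead=10 / 8):  # TMDS sends 10 bits per byte
    """Estimate the serial link rate needed to carry a video format."""
    pixels_per_second = h_active * v_active * refresh_hz * blanking_overhead
    bits_per_pixel = 3 * bits_per_component  # three components at RGB / 4:4:4
    return pixels_per_second * bits_per_pixel * encoding_overhead / 1e9

# More resolution -> more bandwidth, roughly linear in pixel count:
print(f"1080p60: {required_link_rate_gbps(1920, 1080, 60):.1f} Gbps")
print(f"4K60:    {required_link_rate_gbps(3840, 2160, 60):.1f} Gbps")
```

With the actual CTA-861 4K60 timings (594 MHz pixel clock x 30 TMDS bits), the figure works out to about 17.8 Gbps, which is where the familiar 18Gbps number mentioned later in this thread comes from.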

Settling time on receiver chips can affect how stably frames are reconstructed, to some degree. It is similar to the jitter problem in digital audio.

The eye can apparently see this, as video perception is built up out of a very complex scenario, processed by a very complex device called the human eye.

Eyes, like ears and brains, come as an individual-differences package. One person can see, hear, and ruminate differently than another. IQ, earQ, and eyeQ? Yes, a known set of parameters, with differences in all. If one person can see it and another can’t, but that latter one can move numbers around on paper, well, that’s not a rebuttal to an astute observation. It’s denial via the tool at hand. Dogmatism piled up and rolled forward.

There are reams of articles about how HDMI improvements of any kind can’t possibly be real... but not one whit of it goes after anything other than a paper-based numbers analysis of engineered hardware... and bits being bits.

None of that... takes into account the other 50-60-80% of the overarching complexity of the scenario. You know, all the real-world stuff.

It’s more like an angrily and confusingly asked question than an answer to anything. (Just hit it with the hammer you’ve got.)

The monkey, prior to the development of the tool? He’s still in there, at the root of it all, in humans. Still angrily banging the coconut against the rock. The development and application of intelligence is core to the act of getting past that.


Maybe it’s only improving video resolution. Perhaps the mind tends to focus on the color contrast being presented once the overall video signal resolution is improved. 
But how do you know they’re in the correct direction? Answer at 5.
I don’t. So the impact could be much greater.

I searched the Forum threads for recommendations about HDMI cables. Landed on this one:

https://forum.audiogon.com/discussions/one-of-the-best-hdmi-cable?page=2

Many posts by my old buddy, Geoff, who recommended AQ Carbon HDMI cables. Directional and (more) expensive. Might be my next test.
Hint: the human eye is very temporally sensitive. (A reiteration of the opening bit in my prior post.)

I was initially confused after I installed the splitter configuration and turned on the DVD player: it seemed to pause for some time before presenting video images. I am not sure whether it was user/operator error or something with the newly inserted cables. Everything resolved in short order, however.

@taras22: makes sense that bandwidth would be in play in both instances.
@teo_audio: by "receiver chips" are you referring to monitor hardware or biological (eye/brain) hardware?
No, bandwidth doesn’t work that way with HDMI. If your cable doesn’t have enough bandwidth, it simply drops frames or just won’t display anything at all.
  
As of now, 18Gbps is what you want for 4K + HDR. While not required, cables that are Premium Certified have been tested to truly deliver 18Gbps at their rated length. For long HDMI runs (>25ft), you want active cables, or do it over Ethernet/HDBaseT.
 
The only color differences come with resolution or HDR. And no, a cable can’t output a lower resolution than what it’s sent (the source/TV may detect that something isn’t supported, though).
Meant to say: bandwidth plays no role in increasing pixel clarity in HDMI, meaning it’s not going to apply compression the way a low-bandwidth video stream does.
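
For what it’s worth, that all-or-nothing behavior is easy to illustrate. A minimal sketch using commonly quoted nominal TMDS link rates; the link_result helper is hypothetical, for illustration, not a certification test:

```python
# Illustrative only: a digital HDMI link either carries a format or it
# doesn't -- there is no gradual, analog-style softening of the picture.

# Nominal TMDS link rates for some common formats (Gbps, approximate):
FORMAT_RATE_GBPS = {
    "1080p60 8-bit": 4.46,   # 148.5 MHz pixel clock x 30 bits
    "4K30 8-bit":    8.91,
    "4K60 8-bit":    17.82,  # 594 MHz pixel clock x 30 bits
}

def link_result(cable_rating_gbps, fmt):
    """Hypothetical helper: the outcome is pass/fail, never 'softer' video."""
    needed = FORMAT_RATE_GBPS[fmt]
    if cable_rating_gbps >= needed:
        return f"{fmt}: OK (needs {needed} Gbps, cable carries {cable_rating_gbps})"
    return f"{fmt}: dropped frames or no picture (needs {needed} Gbps)"

# An 18 Gbps Premium Certified cable vs. an older 10.2 Gbps cable:
for rating in (18.0, 10.2):
    print(f"Cable rated {rating} Gbps:")
    for fmt in FORMAT_RATE_GBPS:
        print("  " + link_result(rating, fmt))
```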