DVI cables?


goldeneye

New Member
Hey all,

I've heard about using a DVI cable with my flat-screen display. Can anyone tell me about using this cable and whether it's worth getting one?

Many thanks;)
 
My fiancée recently purchased a 1900x1440 widescreen LCD display. It only shipped with a VGA cable, so she used that for a few weeks. She got a DVI cable free from work, so we swapped them out, and both of us noticed the difference: there was a general overall improvement in display quality. With Windows ClearType on and the DVI cable, the fonts look significantly clearer, the overall contrast is better, and there's slightly less blurring and fewer conversion artifacts in various video modes.

Just don't be conned into buying $150 AV cables. Find the cheapest DVI cable you can get your hands on; you should be able to find one for $15. The local Circuit City sells an HDMI AV cable for high-def TVs that's only 6 feet long for over $150 (I kid you not). Anyone who tries to convince you that gold contacts or oxygen-free copper wiring is going to improve your image quality on a digital link cable is selling snake oil.

After all, how much sense does it make to take a native digital signal from a computer, turn it into an analog one to send it to the monitor, which then turns it back into a digital one? The DVI cable skips the entire D-to-A-to-D chain and deals with the digital image the whole way through; that can only improve image quality.
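
Just to illustrate the point, here's a rough Python toy model of that extra round trip. The noise level is something I've made up purely for illustration (it's not a measurement of any real cable or monitor), so only the general idea matters, not the exact numbers:

```python
import random

# 8-bit pixel code -> 0.7 V full-scale analog level (roughly what a VGA DAC does),
# add a little cable/termination noise, then let the monitor's ADC re-quantize
# the level back to an 8-bit code.
def vga_round_trip(pixel, noise_lsb=0.7):            # noise_lsb is an assumed figure
    volts = pixel / 255 * 0.7
    volts += random.gauss(0, noise_lsb * 0.7 / 255)  # made-up analog noise
    return min(255, max(0, round(volts / 0.7 * 255)))

random.seed(0)
pixel = 200
errors = sum(vga_round_trip(pixel) != pixel for _ in range(10_000))
print(f"{errors} of 10000 samples came back as a different code")
```

A digital link never does that D-to-A-to-D step, so the exact same 8-bit code arrives at the panel every time.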

I do inspection for a living, so I notice things visually that others might not, but on my brother-in-law's computer, even when it's running at its native resolution and has been properly synced, I can still see pixel crawl because he's using an analog link. I noticed the same thing on my fiancée's monitor, but to a significantly lesser degree because the monitor was of higher quality.
 
Sceadwian said:
After all, how much sense does it make to take a native digital signal from a computer, turn it into an analog one to send it to the monitor, which then turns it back into a digital one? The DVI cable skips the entire D-to-A-to-D chain and deals with the digital image the whole way through; that can only improve image quality.

Assuming it's a digital connection? DVI can be either! As with anything, try different leads, but don't expect spectacular improvements!
 
DVI does have pins for handling analog signals, but LCD monitors that take both analog and digital inputs use separate connectors, so if he has both a DVI plug and an analog VGA plug on the monitor, you can guarantee that the DVI plug is going to be digital. And no, it won't be dramatic, but I think my fiancée said it best when she called it 'subtly dramatic'. The biggest things I noticed were a rock-solid, flicker-free display and an increase in the contrast ratio. The picture just seems more 'there' when it's using the digital cable, probably because there are no sync issues matching the monitor's native refresh rate to the VGA refresh rate. As Nigel says, though, "your mileage may vary".
 
HDTV has been available in the UK since last May, and the Sky HD boxes provide both HDMI and Component sockets that output HD.

There's been a LOT of discussion about which is best to use (the boxes come ONLY with an HDMI cable). We've run two identical Sony Bravias, side by side, off the same HD box (one via HDMI, one via Component). Side by side you can't tell which is which; there's no visible difference to anyone we've asked.

This obviously depends on the quality of the box providing the source and on the quality of the sets. Some people find HDMI better, some prefer Component, probably depending on how well their set processes each signal.

In theory HDMI should be better, but the difference is usually slight to none :D
 
I'd bet you'd notice it if you increased the cable length; digital would be best for long runs, though then again, few people have their boxes very far from their set. For TV I can understand people not being able to see the difference between Component and HDMI, but computer displays are a whole different ball of wax. Put a static PC screen image up on both sets and see if you can tell then.
 
If anyone reading this thread is thinking "darn, I should upgrade to DVI", check the surplus sites for a cable. Normal computer stores charge $30-$50 or more for a DVI cable, when you can typically get one surplus for under $10.
 
Just to throw this out there: I've got two 17" LCD monitors that each have both DVI and VGA input connectors, and there is an improvement when using DVI vs. VGA; the picture is a bit clearer. It's not a night-and-day difference, and if I looked at a single monitor I don't think I could easily tell which input it was using, but when I have my two monitors side by side (one on DVI, one on VGA) I can see a difference, albeit a slight one.

Sure, if you're just using a DVI cable to pass the analog VGA signals, as Nigel seems to be talking about, then you're not going to see any real improvement...

But if your monitor has a separate digital connector, or you know it actually uses the digital signal rather than the analog one over the DVI cable, you might find it worthwhile, especially if you can get a DVI cable for a reasonable price online. For instance, Newegg has one for $12.99 plus $5 shipping.
 
Isn't a DVI cable just like an RS-232 cable, except that the signal being sent is digital rather than analog? My new LCD monitor has an HDCP plug, and I only have an adapter that converts DVI to HDCP, nothing that will actually carry the extra 4 pins on my video card through to the monitor!
 
goldeneye said:
Thanks for all the info. I have seen a 5 m DVI cable for £23.95 at Lindy (https://www.lindy.com/uk), which is probably about $50. I think this is a good price? :)

In the US, Newegg has a 15' (about 5 m) cable for US$16 (plus another $5 for shipping). Personally, I'd only be happy with something like that if it were in the $30 range.

For cables of that length, you probably want to make sure you can return it, though. A coworker purchased a 15-20' DVI cable a couple of years back, and it absolutely refused to work with any of the LCDs at the time. I suspect it was meant to be used with some low-res plasmas or something.

BTW, the case where DVI is absolutely required is when doing lots of text work. Otherwise those single-pixel transitions end up smearing over two pixels, which looks pretty horrible. The ugliest artifact was having vertical lines of doubled pixels running through a block of text.
 
We're going to buy a new TV, and it'll be a standard CRT type. We don't watch any DVDs and we won't pay for TV, so HD is (for us) a waste of money. CRTs give better quality than thin screens, and they're also cheaper. A top-of-the-range TV would be a waste of money for us.
 
Hero999 said:
We're going to buy a new TV, and it'll be a standard CRT type. We don't watch any DVDs and we won't pay for TV, so HD is (for us) a waste of money. CRTs give better quality than thin screens, and they're also cheaper. A top-of-the-range TV would be a waste of money for us.

The only problem is that you can't buy any decent-quality CRT sets now; all the decent manufacturers have stopped making them. All that's left is mostly cheap Turkish junk!
 
dknguyen, this is a DVI plug
**broken link removed**
And it's definitely not RS-232, or anything even remotely related to it.
It uses something called transition-minimized differential signalling (TMDS), which works with signal swings of only a couple of hundred millivolts and a data encoding that reduces the number of transitions to keep the bandwidth as low as possible, and, being differential, it rejects common-mode noise, similar to Ethernet. Full-sized DVI connectors, as pictured in the link, actually support two separate links simultaneously, though I've never seen anything that uses both of them.
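
If anyone's curious what "reduces the number of transitions" actually means, here's a rough Python sketch of the transition-minimizing stage of the TMDS 8b/10b encoding. It follows the published scheme as I understand it, but it's only the first stage; the real encoder has a second DC-balancing stage that picks the tenth bit, which I've left out:

```python
def tmds_minimize_transitions(byte):
    """First TMDS stage: re-code 8 bits into 9 bits with fewer 0<->1 transitions."""
    d = [(byte >> i) & 1 for i in range(8)]          # d[0] is the LSB
    q = [d[0]]
    use_xnor = sum(d) > 4 or (sum(d) == 4 and d[0] == 0)
    for i in range(1, 8):
        bit = q[i - 1] ^ d[i]
        q.append(1 - bit if use_xnor else bit)       # XNOR or XOR chaining
    q.append(0 if use_xnor else 1)                   # bit 8 tells the decoder which was used
    return q

def transitions(bits):
    return sum(a != b for a, b in zip(bits, bits[1:]))

raw = 0b10101010                                     # worst case: 7 transitions
print("raw transitions:    ", transitions([(raw >> i) & 1 for i in range(8)]))
print("encoded transitions:", transitions(tmds_minimize_transitions(raw)))
```

So a worst-case byte like 10101010 (7 transitions) comes out as a 9-bit word with only 4.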
 
My monitor called that the HDCP plug. The "DVI" monitors I've seen use the RS-232-type plug. It was weird. The "DVI"-compatible monitors had these RS-232/VGA/RGB-type plugs, so I just got the monitor with the plug you showed. It was called HDCP.
 
That "Rs-232" plug you're refering to is a standard VGA connector, it has nothing to do with DVI at all and is purely analog. You're getting your terminology and plugs mixed up.

HDCP is a content-protected version of DVI which uses the same plug but adds a copy-protection protocol to prevent copying of the output signal. I consider HDCP an 'abomination' because people get it confused with standard DVI, and the protection it adds to standard DVI has already been cracked, making it completely useless. You won't even hear about it in a few years unless they come up with a completely new version of it.
 
Hero999 said:
We're going to buy a new TV, and it'll be a standard CRT type. We don't watch any DVDs and we won't pay for TV, so HD is (for us) a waste of money. CRTs give better quality than thin screens, and they're also cheaper. A top-of-the-range TV would be a waste of money for us.

We watch mostly analog TV off the cable, and also some digital TV through the set-top box. When shopping for a new TV, we went ahead and got a big one with new technology. I was prepared for disappointment when we fired it up, because I had read about and seen examples of very poor off-air analog performance in new digital sets. But to my big surprise, our new TV shows analog cable signals much better than our old Sony 32-inch CRT set: clarity, colour depth, focus, accuracy, sound, pretty much everything is improved, except maybe brightness, which is slightly less.

The new set is a Samsung DLP unit, 56 inches, and we watch it from about 9 feet away. It is terrific with analog signals. It appears to be doubling the lines automatically, which fills in very nicely and smooths out the picture. Also, the DLP doesn't have any screen-door effect (you can't discern individual pixels), and I really like that.

The only annoying thing is that we watch everything with black bars on the sides because the images are not 16:9, which wastes some of the screen for sure. And to make things more annoying, more and more TV stations intentionally letterbox their shows even further, knowing that many viewers use "zoom" on their new sets to fill the screen, so they put black bars on top and bottom too! In those cases, our picture is the equivalent of about a 46-inch conventional screen!
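
For what it's worth, the "about a 46 inch" figure is about right for a 4:3 picture filling the full height of a 56-inch 16:9 screen. A quick sanity check in Python (just the pillarbox geometry, nothing specific to the set itself):

```python
from math import hypot

# Diagonal of a 4:3 picture pillarboxed on a 16:9 screen of a given diagonal.
def equivalent_4x3_diagonal(diag_16x9):
    height = diag_16x9 * 9 / hypot(16, 9)   # the 4:3 image uses the full screen height
    return hypot(height * 4 / 3, height)    # ...and is 4/3 of that height wide

print(round(equivalent_4x3_diagonal(56), 1))   # -> 45.8, i.e. about 46 inches
```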
 