DVI-I / DVI-D Problem (uber newb)
My first post here and I hope it's appropriate to these forums. If not please just let me know. But this is where I've been coming for all my information so far and I'm sure it's an easy question.
I've just been lucky enough to get a Samsung 245B monitor which has separate DVI-D and old-fashioned analog (VGA) inputs. My Ubuntu box has a Radeon HD 2600 card which has DVI-I outputs. So far I've only been able to get the monitor working using an analog converter. Obviously I want to get the DVI going. Is this possible? Any help would be really appreciated. I've not found any answers through searching yet - possibly because it's too obvious.
DVI-I (Integrated analog/digital) carries both the DVI-D (Digital) and the analog signals, so there should be no problem hooking the DVI-I output of the HD 2600 to the DVI-D input of the monitor. I don't remember if the connectors are the same physical size, but I think they are and that any DVI cable should work.
If it doesn't work then we start looking at details (and probably logs).
Thank you very much. This is what I understood from my reading, but I'd come across a DVI FAQ that said DVI-I is only sometimes compatible with DVI-D, because a DVI-I port doesn't always carry the digital signal, only the analog. I don't know if that's right, though it doesn't sound it to me. But if it is, I have no idea how to tell whether my graphics card is one of those cards or not.
Originally Posted by bridgman
I used a DVI-D single link cable initially, and then tried a DVI-I single link cable. Neither worked as the monitor just kept cycling through its "Analog | Digital" icons, though it did detect when the cable was unplugged.
If there's anything I can check - logs, etc. - just let me know and I'll post the results.
If you are using the FGLRX driver, try this configuration in Device section (xorg.conf):
Option "EnableMonitor" "tmds1,tv"
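For anyone unsure where that line goes, here's a minimal sketch of the relevant Device section of xorg.conf. The Identifier is a placeholder; only the Option line comes from the suggestion above:

```
Section "Device"
    Identifier  "ATI Radeon HD 2600"    # placeholder name - use whatever yours is called
    Driver      "fglrx"
    # Tell fglrx to light up the first DVI (TMDS) output plus TV-out
    Option      "EnableMonitor" "tmds1,tv"
EndSection
```

Restart X after editing for the change to take effect.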
Originally Posted by h4rm0ny
Thanks. I am using the fglrx driver and I've now done that. It hasn't resolved the issue, I'm afraid.
Originally Posted by epinter
HOWEVER - I realised that I made a mistake in my last post. Both of the cables I've tried have been DVI-D; it's just that one is dual link and the other single link. I was under the impression that this would work, but maybe I'm wrong? Do I have to go and buy a DVI-I cable?
I've now tried the monitor with a different computer and graphics card (same two cables, however). Again, I could only get the monitor to display using the analog input. This is starting to look like my monitor is broken. That's really going to suck as I (a) need it at the moment and (b) returning it to Amazon and going through the delivery system again is going to be a nightmare.
If anyone has any other suggestions as to what I can try, I'd really appreciate it.
Thanks for responses,
I guess I would make sure I knew the cables were OK before going any further. Find someone who is already running a panel over DVI digital and swap in your cables just to be sure.
It's always the cable. Always.
Ah, but it wasn't, it wasn't... </suspense>
Originally Posted by rbmorse
First off, thank you very much for all replies - it is now working. I've learnt more than I ever cared to about DVI these past couple of days.
Because this very nearly resulted in me sending back a very large and heavy monitor, and because the solution was not (to me) obvious, I'll post what I did to help anyone else Googling around the Web in the same situation. My posts here and at Ubuntuforums are the top results for searching "dvi-i dvi-d problem" so it doesn't look like there's much easy information around.
I have not got the digital output working from the BIOS / boot screen; it only comes alive after the O/S (Ubuntu Gutsy in this case) has loaded. Here's what I did:

1. I ran things as normal over analog whilst having the DVI-D cable plugged into the second screen output on my graphics card. Or maybe... goes to check, yes, it's actually plugged into display output 1 - I had them the wrong way round (though this was not the problem). So the monitor had both of its inputs connected: VGA and DVI-D.
2. I started the ATI Catalyst Control Centre from the menu and entered the Display Manager section. There I selected the relevant display (the one listed as "Digital Monitor") and enabled it.
3. That let me switch the monitor to its digital input. (I had previously set the monitor to let me choose inputs manually rather than automatically - this is under the monitor's OSD menus.) At this point I got a hideous mess of scan lines, but it was at least something: video corruption had never looked so good.
4. I then set the display type to Clone, so that whatever is on one screen appears on the other.
5. After rebooting (during which digital input was still unavailable until the system had fully started), I manually switched the monitor to digital and got a great picture.

For example, the graduated grey blocks on either side of this forum had a sort of wave distortion over analog. On the digital input, it's all very steady.
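For the command-line inclined, the same "enable the second head and clone it" dance can often be done without the Catalyst GUI. These are hedged sketches, not verified on this exact setup: with fglrx the tool is aticonfig (the monitor names crt1/tmds1 are assumptions - check what the query reports), while the open radeon driver speaks RandR:

```
# fglrx: ask the driver what it calls your outputs, then enable both
aticonfig --query-monitor
sudo aticonfig --enable-monitor=crt1,tmds1   # names are assumptions; take them from the query

# open radeon driver: list outputs, then clone the DVI head onto the VGA image
xrandr --query
xrandr --output DVI-0 --auto --same-as VGA-0  # output names vary; take them from --query
```

aticonfig writes its changes into xorg.conf, so they should survive a restart of X.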
Now this is not the final stage: I don't get a digital signal during boot, and my computer thinks it has two displays. But I'm a lot, lot happier than I was, and now that I've established the hardware is actually all working, I'm okay to fiddle with the details at my own pace.
I'm just going to post a link which I found helpful and which I hope will be of use to anyone else stuck in this situation:
I should also say that I did enable the option epinter suggested, in case it's relevant. I think it isn't in this case, but it will probably save me more panics later if I connect a TV as well.
A couple of questions...
1. When you boot using the analog connection, does the monitor indicate what frequency and resolution you are running at? Some do, some don't...
2. Can you describe the connectors on the card and how your analog adapter fits into the picture? Do you have two DVI connectors plus a 7-pin TV, or something else?
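On the software side (as opposed to the monitor's own OSD), X itself will report the mode it thinks it is driving. Assuming you're at a terminal in the running session and xrandr is installed:

```
# Lists each output's connection state; the current mode is marked with '*'
xrandr --query
```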