libertyspike138

No DVI out on nVidia GTX 770 help!


I have been running High Sierra on an Intel DH67CL board with a GTX 770 card for some time now, and it has always displayed out of the DVI port where I need it to. I recently upgraded my motherboard to a Gigabyte GA-B150M-DS3H and put the GTX 770 back in it, so the only thing that has changed is that I used one of your Vanilla EFI folders to be compatible with the new board. Everything boots up fine, but as soon as I get to the GUI my screen goes black and it only displays out of the HDMI port, which will not work with my KVM. How can I switch it to use DVI instead?
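Before touching BIOS settings, it can help to confirm what macOS itself thinks is attached once the GUI is up (for example over the working HDMI screen, or via SSH). The sketch below is just a generic diagnostic, not something from this thread; it only shells out to system_profiler and filters the display-related lines, so comparing its output with DVI vs. HDMI plugged in shows whether the DVI monitor is detected at all or merely not lit.

```python
import subprocess

# Generic diagnostic sketch: dump what macOS reports as connected to the GPU.
# system_profiler SPDisplaysDataType lists the graphics card and every display
# it detects; we filter the lines most relevant to this issue.

def show_displays() -> None:
    result = subprocess.run(
        ["system_profiler", "SPDisplaysDataType"],
        capture_output=True, text=True, check=True,
    )
    for line in result.stdout.splitlines():
        if any(key in line for key in ("Displays", "Resolution", "Display Type", "Online", "Mirror")):
            print(line.rstrip())

if __name__ == "__main__":
    show_displays()
```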


MaLd0n (Administrators)

Check if you have a CSM option in your BIOS/UEFI setup.


9 hours ago, MaLd0n said:

Check if you have a CSM option in your BIOS/UEFI setup.

My BIOS has an OS selection consisting of "Other OS, Win8/10, Win8/10 WHQL." When one of the Win options is selected, I get the option to enable or disable CSM, along with a couple of other settings such as "Storage control option ROM" and "Other PCI devices." These can each be set to "Do not launch," "Legacy," or "UEFI." I'm not sure what they should be set to, but I believe I have tried all available options.


My GPU has four ports on it: DVI-I, DVI-D, HDMI, and DisplayPort. The DVI-I port is the one I have always used, so that it works with my VGA KVM via an adapter on the back of the card. Even though I never had issues with it before I swapped motherboards, with the new board it is this port that always goes black as soon as the login screen comes up. I decided to test the other ports by running cables to my TV as a mirror. With HDMI on my monitor and a DVI-D to HDMI cable to the TV, the TV said the mode was unsupported until I changed the resolution, and then it worked, but both displays were identified as my monitor. A DisplayPort to HDMI cable to the TV also worked after changing the resolution, but again both displays were identified as my monitor. I then tried running a DVI-I to VGA cable to the TV, but it reported no signal and no mirror options came up.

I also tried swapping monitors, thinking that maybe it wasn't displaying because the resolution was defaulting too high; my monitor runs at 1920x1200, which my TV apparently doesn't support. So I plugged in a smaller 1080p monitor, but I still get no signal on that port, and I noticed my original monitor is still listed. In Hackintool it says it is injecting the EDID for that original monitor, but my only other options are to inject the EDID of various Apple displays. I extracted two separate EDID files in hex for the smaller monitor: one connected directly to it, and a second one that gets generated when running through the KVM.
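As a side note for anyone comparing dumps like these: here is a minimal sketch (not from the original posts) for decoding a few well-known fields from an EDID hex dump. The file name edid.hex is just a placeholder for whichever dump you extracted; byte 20, bit 7 of the base block is the flag that distinguishes a digital input from an analog one, which is the difference being discussed later in this thread.

```python
import sys

# Minimal EDID sanity check: decode a few well-known fields from a hex dump.
# Hypothetical usage: python3 edid_check.py edid.hex
# where edid.hex contains the raw hex string extracted from Hackintool/ioreg.

def decode_edid(hex_str: str) -> None:
    data = bytes.fromhex("".join(hex_str.split()))
    if len(data) < 128 or data[0:8] != bytes.fromhex("00ffffffffffff00"):
        print("Not a valid EDID base block")
        return
    # Bytes 8-9: manufacturer ID, three 5-bit letters packed big-endian.
    mfg = int.from_bytes(data[8:10], "big")
    letters = "".join(chr(((mfg >> shift) & 0x1F) + ord("A") - 1) for shift in (10, 5, 0))
    # Bytes 10-11: product code, little-endian.
    product = int.from_bytes(data[10:12], "little")
    # Byte 20, bit 7: 1 = digital input, 0 = analog input.
    digital = bool(data[20] & 0x80)
    print(f"Manufacturer: {letters}  Product code: {product:#06x}")
    print(f"Input type:   {'digital' if digital else 'analog'}")

if __name__ == "__main__":
    with open(sys.argv[1]) as f:
        decode_edid(f.read())
```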

Then again, I feel I'm going down the wrong rabbit hole by switching monitors, as I believe something is simply disabling this DVI-I port. To confirm that, I connected it directly to the TV with a DVI-I to VGA cable, bypassing the monitor. My BIOS and the first Clover screen display fine; as soon as I select macOS the TV says the mode is unsupported for a while, and then, once I can tell the GUI has booted, the TV reports no signal.


Original monitor hooked back up. Instead of using the DVI-I to VGA adapter, I took a DVI-D to VGA cable off another machine and everything works fine. So I either have to buy another DVI-D to VGA cable for this machine or, preferably, figure out why the DVI-I port is being disabled.


Apparently nobody visits many forums anymore, or nobody across multiple forums understands why it disables DVI-I, so I think the easier yet more expensive option is to just use the DVI-D to VGA cable. But then I ran into another issue when booting into the latest Ubuntu: connected over DVI-D it would not properly display 1920x1080 like it does over DVI-I. I reconnected the smaller monitor and it now seems to be working fine on both macOS and Ubuntu.
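For the Ubuntu side, a common workaround (not something confirmed in this thread) is to force the mode with xrandr. The sketch below assumes an X11 session with xrandr installed; output names like "DVI-D-0" vary by driver, so it detects the connected DVI output at runtime instead of hard-coding one, and assumes the 1920x1080 mode actually exists for that output.

```python
import re
import subprocess

# Sketch: force 1920x1080 on the first connected DVI output reported by xrandr.
# Assumes an X session; the DVI output name depends on the driver, so detect it.

def force_dvi_mode(mode: str = "1920x1080") -> None:
    query = subprocess.run(["xrandr", "-q"], capture_output=True, text=True, check=True)
    match = re.search(r"^(DVI\S*) connected", query.stdout, re.MULTILINE)
    if not match:
        print("No connected DVI output found")
        return
    output = match.group(1)
    subprocess.run(["xrandr", "--output", output, "--mode", mode], check=True)
    print(f"Set {output} to {mode}")

if __name__ == "__main__":
    force_dvi_mode()
```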


So the issue finally appears to be resolved, and I wanted to leave an update in case anybody else hits this with a GTX 770 on Skylake. As I stated previously, using the DVI-D port was not working because my screen kept going black, so I wanted to use the DVI-I port that I had used when the GTX 770 was in an Ivy Bridge / Sandy Bridge board, but it just wouldn't work in macOS with the new board even though it worked fine in Windows and Linux. I tried the same DVI-I to VGA adapter and cable I have always used, and I tried other adapters and DVI-I cables, but it would not work. What I eventually ended up trying was the more expensive DVI-D to VGA cable, but plugged into the DVI-I port, and it works. A DVI-D plug fits into a DVI-I port because it is missing some pins, but not vice versa. I assumed that since it is a DVI-I port I should use a DVI-I cable, but apparently not with a more recent board / EFI. This leads me to believe that with the newer EFI or board changes the card is not sending the analog signal on those four extra pins, and that analog signal is what the monitors and TVs I tried were looking for when using a DVI-I cable, hence the "no signal" message. It is also important to note that this method would not work on my original monitor for some reason, but it works on a more recent LED monitor. While I'm here, I will also mention that I run a mirror to the TV, and I have to use a DisplayPort to HDMI cable for that, because if I use the HDMI port my BIOS and login screens treat it as the primary display instead of the DVI output going to the monitor.

