I wonder why everyone holds AMD's DisplayPort 2.1 in such high regard. It's barely more bandwidth than HDMI 2.1, since it's not the full UHBR20 at 80 Gbps.
It's UHBR13.5 at 54 Gbps on RDNA 3 vs 48 Gbps HDMI 2.1 on Ada GPUs.
Actually RDNA 3 dGPUs do have the full 80 Gbps of bandwidth, but it's artificially limited to 54 Gbps on consumer GPUs.
48 vs 54 is roughly 12.5% more bandwidth. In the PC hardware world, a 12% gap isn't usually considered "very close."
That 12.5% of extra bandwidth allows this 8K monitor to be easily run at 240Hz with 10-bit color using DSC.
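If you want to sanity-check those numbers, here's a rough back-of-the-envelope sketch in Python. The blanking overhead, channel-coding efficiencies, and DSC targets below are my own approximations, not official spec figures:

```python
# Rough bandwidth check for 7680x2160 @ 240Hz, 10-bit RGB (30 bpp).
# All overhead/efficiency figures are approximations, not spec-exact values.

H, V, HZ, BPP = 7680, 2160, 240, 30
BLANKING = 1.08  # assume ~8% extra for reduced blanking timings

uncompressed = H * V * HZ * BPP * BLANKING / 1e9  # Gbps on the wire, pre-DSC

# Approximate usable payload after channel coding:
links = {
    "DP 2.1 UHBR13.5 (54 Gbps, 128b/132b)": 54 * 128 / 132,  # ~52.4 Gbps
    "HDMI 2.1 FRL (48 Gbps, 16b/18b)": 48 * 16 / 18,         # ~42.7 Gbps
}

print(f"Uncompressed: ~{uncompressed:.0f} Gbps (doesn't fit any current link)")
for dsc_bpp in (12, 8):  # common DSC compressed-bitrate targets
    rate = uncompressed * dsc_bpp / BPP
    print(f"DSC @ {dsc_bpp} bpp: ~{rate:.1f} Gbps")
    for name, payload in links.items():
        verdict = "fits" if rate <= payload else "does not fit"
        print(f"  {name}: {verdict}")
```

Under these assumptions the DP 2.1 link has headroom for lighter 12 bpp DSC, while HDMI 2.1 needs the more aggressive 8 bpp target to fit, which is presumably how the panel's HDMI 2.1 claim is meant to work.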
That monitor theoretically supports its full resolution and refresh rate over HDMI 2.1 too, so the extra bandwidth isn't the difference maker.
Right now no cards seem to work with it at 240Hz over HDMI, but it's listed as supported on the monitor's end.
Because HDMI is a pain in the ass, and everybody using a PC expects to use DisplayPort.
Not to mention that HDMI is a licensed standard that is not fully implemented under Linux, while DP is a license-free standard.
The only time we use HDMI is to connect a PC to a projector or a surround system, because the standard there is HDMI.
And unless I'm missing something, the ONLY way to get TrueHD 7.1 + ATMOS is via HDMI. The old optical standard (S/PDIF over TOSLINK) can't do that.
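For a rough sense of the gap, here's a quick sketch comparing approximate peak codec bitrates against what optical can realistically carry. The figures are ballpark peak rates from memory, not spec quotes:

```python
# Why lossless HD audio needs HDMI: optical S/PDIF tops out around the
# bitrate of a legacy DTS bitstream, far below TrueHD.
# All figures are approximate peak rates in Mbps, not exact spec values.
SPDIF_CEILING = 1.5  # rough ceiling for compressed bitstreams over TOSLINK

codecs = {
    "Dolby Digital 5.1": 0.64,
    "DTS 5.1": 1.5,
    "Dolby TrueHD 7.1 (+Atmos)": 18.0,
    "DTS-HD Master Audio 7.1": 24.5,
}

for name, mbps in codecs.items():
    verdict = "fits over optical" if mbps <= SPDIF_CEILING else "needs HDMI"
    print(f"{name}: ~{mbps} Mbps -> {verdict}")
```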
Yeah, I think you're right, but all the standards and audio/video formats get very complicated very fast for me. With MPC-HC and madVR I can play anything no problem, so that's OK. But if you actually want a surround receiver, and for the audio to work as intended, you need to use HDMI as far as I understand it.
Which monitor are you running DP on right now?
Most of them? What kind of question is this?
I have 5 PCs and 8 monitors in my home and they are all using DP except one that’s really old and uses DVI. DP is the standard connector for PC monitors.
What do you think most people globally use, DP or HDMI?
On PC monitors? DP.
Everything in my home uses HDMI.
I also don't use monitors. LG CX OLED as my main display, and sometimes I hook a PC up to the regular LED bedroom TV.
No interest in using DisplayPort; HDMI is fine for the vast majority of use cases. Not sure why we need competing connectors. Frankly, not sure why we don't all just switch to USB-C long term.
There's just no reason not to standardize on USB-C for every display.
Unless you've ascended to PC-powered couch gaming. Don't knock it until you try it!
PC
flagship OLED
comfy couch
ATMOS surround system
Takes PC gaming to a whole other level. HDMI is nice for eARC (PC direct to display, audio feeds back to the AVR/amp via HDMI). It’s a clean, awesome setup. Also lets you run TrueHD 7.1 ATMOS for all those Linux ISOs :)
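If you go that route, it can be handy to check whether a file actually carries a TrueHD track before telling the player to bitstream it. A minimal sketch, assuming ffprobe (part of FFmpeg) is on your PATH; `audio_codecs` is just a hypothetical helper name:

```python
# Hypothetical helper: list the audio codecs in a file so you know whether
# there's a TrueHD track worth bitstreaming to the AVR.
# Assumes ffprobe (part of FFmpeg) is installed and on PATH.
import json
import subprocess
import sys

def audio_codecs(path: str) -> list[str]:
    out = subprocess.run(
        ["ffprobe", "-v", "quiet", "-print_format", "json",
         "-show_streams", "-select_streams", "a", path],
        capture_output=True, text=True, check=True,
    ).stdout
    return [s.get("codec_name", "?") for s in json.loads(out)["streams"]]

if __name__ == "__main__":
    print(audio_codecs(sys.argv[1]))  # e.g. ['truehd', 'ac3']
```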
I'm using it to connect my PC to my OLED TV and AV receiver. Way better than a monitor to me.
They don’t care about that detail
There was some kind of recent issue with a new fatty monitor requiring 2.1, right? I remember people rubbing it in Nvidia's face over it. This article just reads funny, like dunking on Nvidia for not putting money into something that had literally 0 tech available for it in the foreseeable future and then laughing at them for adding it when there's finally tech to use it.
Samsung Neo G9 (G95NC) 57". It’s a dual 4K width (7680x2160) at 240Hz, but existing Nvidia cards can only drive it at up to 120Hz. Radeon 7000 series can do the full 240Hz at native resolution.
https://www.displayninja.com/samsung-s57cg95-review/
The display supports its full resolution and refresh rate over HDMI 2.1. It's just not working right now at full rate over HDMI with either AMD or Nvidia graphics cards, for some reason.
The table included in this article is misleading, as they’ve cropped off the original Korean text which states that it’s currently not working at 240Hz over HDMI 2.1 with AMD either but that they contacted AMD who said it would work with a future driver update. The only reason that NVIDIA is listed at 120Hz in that table is because Quasarzone didn’t get a reply from NVIDIA in time for publication.
It’s unclear why no cards can do 240Hz over HDMI with that monitor when it’s in spec.
The display only has one high refresh HDMI port (out of three), but that input is limited to 120Hz. It’s stamped on the shell and listed as such in the manual, so it would appear to be on Samsung.
https://imgur.com/a/xx1PWyp
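If you want to see exactly what a display advertises on a given input, you can dump and decode its EDID. A Linux-only sketch, assuming the `edid-decode` utility is installed (on Windows you'd use an EDID viewer tool instead):

```python
# Linux-only sketch: dump the EDID of every connected display and decode it,
# to see which modes/refresh rates each input actually advertises.
# Assumes the `edid-decode` utility is installed.
import glob
import subprocess

for path in glob.glob("/sys/class/drm/card*-*/edid"):
    with open(path, "rb") as f:
        edid = f.read()
    if not edid:
        continue  # connector exists but nothing is plugged in
    print(f"=== {path} ===")
    # edid-decode reads raw EDID bytes from stdin and prints supported timings
    subprocess.run(["edid-decode"], input=edid, check=False)
```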
240Hz on this display requires the use of DisplayPort, which is what makes it a perfect (well, imperfect) example.
That’s not how this works. GPUs have an effective lifespan of 5-10 years, and hardware vendors look at what’s out there to set roadmaps and releases. Someone has to go first and make it available, and Nvidia skipping 2.1 on 4xxx is only slowing down next-gen display releases.
Probably because people expected a $1000+ GPU to have one year's worth of incredibly foreseeable future-proofing built into it.
That would be silly of Nvidia, giving you things like VRAM and connections to plug in your monitor. How else will they sell you an expensive card every 2 years?
Does no GPU maker other than Nvidia make 8GB GPUs?
Does anyone offer more than 24GB for gamers?
RX 7600 8GB, Arc A750 8GB
?
I rest my case