Mission accomplished. It’s radiant m8
Weird to see how he tests with standard RAM on the 7800X3D but expensive DDR5-7600 RAM for the Intel CPU. You’d think he would compare under similar conditions and not give one of them an unfair advantage.
Why does anyone buy a processor with overclocking capabilities and then test it at default settings? The difference at higher resolutions is so minimal that even a slight overclock will equalize or change the result. Additionally, nobody tests these processors in VR games, where the processor is often the bottleneck for the GPU. Another thing: the games most players actually play are almost always designed for a low number of cores, usually 1-2, with the rest sitting at low utilization. Core speed still matters much more than core count. The ideal setup would be a large cache, fewer cores, and a high frequency, at least 5.5 GHz, but unfortunately that isn't possible.
14700k all day
“The worse one all day.”
Interesting choice if gaming is your concern.
I mean if you don’t care about your electricity bill sure? Lol
The 5800x3d is better than the 14700k for gaming too
I love how min maxing everything is getting out of control.
Oh, another generic benchmark video without power consumption and temps. 🥱
Pretty sure the 7800 is always consistently at 56w
Why even make, post or repost a video like this? It’s the same story that’s already been told by more reputable content creators:
The 7800x3d wins 90+% of the gaming benchmarks, while often using less than half the wattage.
The Intel parts do much better in core heavy productivity work (no shit).
Nothing to see here, keep it moving.
HWUB posts content like this because the performance of parts varies over time due to scheduler optimisation, driver updates, and new game releases that become more popular than older games.
Just relying on release-day reviews may not allow consumers to make the most up-to-date decisions when they spend hundreds or thousands of hard-earned dollars.
Why are you here?
HWUB? You mean Hardware Unboxed? (Often referred to as HUB… Hardware is one word genius)
Really? Driver optimizations, BIOS revisions, and new games and game updates can change, improve, and/or affect performance? Wow, who knew?!
However, none of that, or the ‘point’ you were trying to make, is relevant to this specific video/review/benchmark, as ‘14th Gen’ Intel is barely a month old.
Your comment was as useless as the aforementioned video.
Just like to point out that Unboxed is also one word lol
I’m pretty sure no one would notice a 15% fps difference, not to mention 3 or 5%, in that particular situation. For me, AMD performance is not the game changer (pun intended). The power efficiency is.
I got a 13900k for $425 so I’m back on Intel, but if AMD could offer their 7800x3d at sub $300 I definitely wouldn’t have sprung for that deal. Buying under half the multithreaded performance for a mere $60 in savings didn’t really appeal to me (especially since I run an Unraid server and put parts from my gaming desktop in there once I upgrade).
It seems that if AMD is marginally better at gaming on average while drawing less power and generating less heat, the Intel chip is only sensible for those who also do work that’s heavy on multi-core performance.
AMD this generation is the clear winner on the gaming side which is great because it is something we have all needed in the industry.
But you cannot build a competitive system, in terms of overall specs and feature offerings, for a decent price compared to what Intel offers. For people looking to continue with all-AMD builds and keep the same productivity and gaming performance, this 7000 series generation just isn’t for us. It is a tad frustrating.
Ugh, I’ve had a love-hate relationship with my X3D chip: constant USB disconnects, RMA’d both the CPU and the mobo, and it didn’t change. Just like with GPUs (and I don’t have a particular favourite brand, as I’ve had both AMD CPUs and GPUs in the past ~2 years), going AMD is always a risk of annoying, often unfixable issues that may or may not happen. Nowadays when you buy a new AMD GPU, it’s 50/50 whether you’ll hit the idle power draw bug depending on your monitor setup, and 50/50 whether you’ll hit paste pump-out/hotspot issues and be forced to RMA/repaste a brand-new card. Mental.
Dude, don’t believe the hype. Intel knows it can’t compete with it, and that’s why they have E-cores; disable them and you see the truth. You have enough ST and MT performance with AMD; if you need more, move up the stack.
7800x3d is
4% faster at 1080p
3% faster at 1440p and 4k
If this is based on average fps… stop it. No one should care; NO reviews should be using average fps anymore. We’ve had the tools for frame times and 0.1% and 1% lows, which should be the ONLY metrics used in games. I couldn’t care less that Intel or AMD could hit 1000 fps for split seconds, inflating average frame rates for no damn good reason.
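The commenter's point can be sketched in a few lines: compute average fps and a "1% low" from per-frame times and watch a handful of fast frames inflate the average while leaving the lows untouched. All the numbers below are invented, and the 1% low uses one common definition (mean of the slowest 1% of frames); benchmark tools may define it slightly differently.

```python
def percentile_low_fps(frame_times_ms, pct):
    """fps over the slowest pct% of frames (pct=1 -> '1% low')."""
    worst = sorted(frame_times_ms, reverse=True)  # slowest frames first
    n = max(1, int(len(frame_times_ms) * pct / 100))
    avg_worst_ms = sum(worst[:n]) / n
    return 1000.0 / avg_worst_ms

def average_fps(frame_times_ms):
    """Overall average fps: frames rendered divided by total time."""
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

# 1000 steady 10 ms frames (100 fps), then the same run plus
# fifty 1 ms frames: brief "1000 fps" bursts, exactly the kind
# of split-second spike the comment complains about.
steady = [10.0] * 1000
spiky = steady + [1.0] * 50

print(round(average_fps(steady)))           # 100
print(round(average_fps(spiky)))            # 104: inflated by the bursts
print(round(percentile_low_fps(spiky, 1)))  # 100: 1% low is unaffected
```

The bursts push the average up even though the perceived smoothness is identical, which is why frame-time-based lows are the more honest metric here.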
As things stand right now, Intel pretty much has unlimited cash and deep ties with OEMs. If they actually spend it on R&D and try their best, it wouldn’t be long until the tables are turned again. The 14900KS with DDR5-9000 beats the 7800X3D in every task according to Chinese leaks, btw.
The 7800X3D’s 1080p result for Starfield is worse than its 1440p result for some reason. Same with the 1% lows in some other games.
Is this with the newest patch that came out yesterday? That improved things a lot.
That’s (Starfield) probably just run to run variance. The difference is so small it might as well be margin of error.
It’s mostly because of the aged Bethesda engine (it’s called Creation, or something). At that time, games preferred high single-core clock speeds, something out of reach for AMD even now.
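The margin-of-error point above can be sketched: rerun the benchmark a few times and compare the gap between configurations to the spread across repeated runs. The fps numbers below are hypothetical, purely to illustrate the check.

```python
# If the gap between two configs is within a couple of standard
# deviations of the run-to-run spread, it's plausibly just noise.
from statistics import mean, stdev

runs_1080p = [118.2, 121.5, 119.8]  # hypothetical Starfield avg fps, 3 runs
runs_1440p = [120.9, 122.4, 121.1]

gap = mean(runs_1440p) - mean(runs_1080p)
noise = max(stdev(runs_1080p), stdev(runs_1440p))

print(f"gap={gap:.1f} fps, run spread={noise:.1f} fps")
if gap < 2 * noise:
    print("within run-to-run variance; don't read into it")
```

With only a handful of runs this is a rough heuristic, not a proper significance test, but it is enough to flag a sub-2% "anomaly" as likely noise.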