• ConsistencyWelder@alien.topB

    Weird to see how he tests with standard RAM on the 7800X3D but expensive DDR5-7600 RAM for the Intel CPU. You’d think he would compare under similar conditions and not give one of them an unfair advantage.

  • Delicious-Anything92@alien.topB

    Why does anyone buy a processor with overclocking capabilities and then conduct tests, leaving it at default settings? The difference in higher resolutions is so minimal that even a slight overclock will equalize or alter the result. Additionally, nobody tests these processors in VR games where the processor is often a bottleneck for the GPU. Another thing, the games most players engage in are almost always designed for a low number of cores, usually 1-2, with the rest having low utilization. Core speed is still much more critical than their quantity. The ideal setup would be to have a large cache, fewer cores, and a high frequency, at least 5.5 GHz, but unfortunately, it’s not possible.

  • TipT0pMag00@alien.topB

    Why even make, post or repost a video like this? It’s the same story that’s already been told by more reputable content creators:

    The 7800x3d wins 90+% of the gaming benchmarks, while often using less than half the wattage.

    The Intel parts do much better in core heavy productivity work (no shit).

    Nothing to see here, keep it moving.

    • Trewarin@alien.topB

      HWUB posts content like this because the performance of parts varies over time due to scheduler optimisation, driver updates, and new game releases that become more popular than older titles.

      Just relying on release-day reviews may not allow consumers to make the most up-to-date decisions when they spend hundreds or thousands of hard-earned dollars.

      Why are you here?

      • TipT0pMag00@alien.topB

        HWUB? You mean Hardware Unboxed? (Often referred to as HUB… Hardware is one word genius)

        Really, driver optimizations, BIOS revisions, new games & game updates can improve and/or affect performance? Wow, who knew?!

        However, none of that, nor the ‘point’ you were trying to make, is relevant to this specific video/review/benchmark, as ‘14th Gen’ Intel is barely a month old.

        Your comment was as useless as the aforementioned video.

        • hunter54711@alien.topB

          HWUB? You mean Hardware Unboxed? (Often referred to as HUB… Hardware is one word genius)

          Just like to point out that Unboxed is also one word lol

  • SeriousSkeletor@alien.topB

    I’m pretty sure no one would notice a 15% FPS difference, let alone 3 or 5%, in that particular situation. For me, AMD’s performance is not the game changer (pun intended). The power efficiency is.

  • alman12345@alien.topB

    I got a 13900K for $425 so I’m back on Intel, but if AMD could offer their 7800X3D at sub-$300 I definitely wouldn’t have sprung for that deal. Buying under half the multithreaded performance for a mere $60 in savings didn’t really appeal to me (especially since I run an Unraid server and put parts from my gaming desktop in there once I upgrade).

  • libertysailor@alien.topB

    It seems that if AMD is marginally better at gaming on average while drawing less power and generating less heat, the Intel chip is only sensible for those who also do work that’s heavy on core performance.

  • SimplyNot0@alien.topB

    AMD this generation is the clear winner on the gaming side which is great because it is something we have all needed in the industry.

    But you cannot build a system that is competitive on overall specs and features for a decent price compared to what Intel can offer. For people who are looking to continue with all-AMD builds and keep the same productivity and gaming performance, this 7000 series generation just isn’t for us. It is a tad frustrating.

    • Keldonv7@alien.topB

      AMD this generation is the clear winner on the gaming side which is great because it is something we have all needed in the industry.

      Ugh, I had a love-hate relationship with my X3D chip: constant USB disconnects, and RMA’ing both the CPU and the motherboard didn’t change anything. Just like with GPUs (and I don’t have a particular favourite brand, as I’ve had both an AMD CPU and GPU in the past ~2 years), going AMD is always a risk of annoying, often unfixable issues that may or may not happen. Nowadays when you buy a new AMD GPU, it’s 50/50 whether you’ll hit the idle power draw bug depending on your monitor setup, and 50/50 whether you’ll get paste pump-out/hotspot issues and be forced to RMA/repaste a brand new card. Mental.

    • Pristine_Pianist@alien.topB

      Dude, don’t believe the hype. Intel knows it can’t compete, and that’s why they have E-cores; disable them and you see the truth. You have enough ST and MT performance with AMD, and if you need more, move up the stack.

    • DHJudas@alien.topB

      If this is based on average FPS… stop it… no one should care. NO REVIEWS should be using average FPS anymore; we’ve had the tools for frame times and 0.1% and 1% lows for years, and those should be the ONLY metrics used in games. I couldn’t care less that Intel or AMD could hit 1000 FPS for split seconds, inflating the average frame rate for no damn good reason…
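
      For what it’s worth, here is a rough Python sketch of why that happens. The frame times are made up and reviewers differ on the exact 1%/0.1% low methodology, so treat it as an illustration only:

      ```python
      # Illustrative only: made-up frame times, one common "x% low" methodology.
      def fps_metrics(frame_times_ms):
          """Average FPS and 1% / 0.1% low FPS from a list of frame times (ms)."""
          avg_fps = len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)

          worst_first = sorted(frame_times_ms, reverse=True)

          def low(pct):
              n = max(1, int(len(worst_first) * pct))
              return 1000.0 / (sum(worst_first[:n]) / n)  # FPS implied by the worst pct of frames

          return avg_fps, low(0.01), low(0.001)

      # 900 smooth ~7 ms frames, 90 split-second 1 ms spikes (~1000 FPS), 10 hitches at 50 ms.
      frames = [7.0] * 900 + [1.0] * 90 + [50.0] * 10
      avg, low1, low01 = fps_metrics(frames)
      print(f"avg: {avg:.0f} FPS, 1% low: {low1:.0f} FPS, 0.1% low: {low01:.0f} FPS")
      # The 1 ms spikes inflate the average; the lows still sit at ~20 FPS.
      ```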

      • CreepyBuck18909@alien.topB

        As things stand right now, Intel pretty much has unlimited cash and deep ties with OEMs. If they actually spend it on R&D and try their best, it wouldn’t be long until the tables are turned again. A 14900KS with DDR5-9000 beats the 7800X3D in every task, according to Chinese leaks, btw.

  • PantZerman85@alien.topB

    The 7800X3D’s 1080p result for Starfield is worse than its 1440p result for some reason. Same with the 1% lows in some other games.

    • EmilMR@alien.topB

      Is this with the newest patch that came out yesterday? That improved things a lot.

    • ArseBurner@alien.topB

      That’s (Starfield) probably just run-to-run variance. The difference is so small it might as well be margin of error.
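
      A quick sketch of what “margin of error” means here, with invented numbers (not from the video), assuming the usual practice of averaging several runs:

      ```python
      # Illustrative only: invented per-run results, not data from the video.
      from statistics import mean, stdev

      runs = {
          "1080p": [108.4, 111.2, 109.0, 112.6, 110.1],  # avg FPS per run
          "1440p": [110.3, 109.8, 111.5, 110.9, 109.2],
      }

      for label, results in runs.items():
          print(f"{label}: {mean(results):.1f} FPS ± {stdev(results):.1f}")

      # If the gap between the means is smaller than the run-to-run spread,
      # a "worse" 1080p number can simply be noise rather than a real regression.
      ```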

    • Admirable-Echidna-37@alien.topB

      It’s mostly because of Bethesda’s aged engine (the Creation Engine, I believe). At the time, games preferred high single-core clock speeds, something out of reach for AMD even now.