I’ve been thinking about when the RDNA4 cards will come out.

As MLID mentioned, RDNA 4 should come out around Q3 2024. I don’t think there will be RDNA 3.5 or RDNA 3 refresh cards next year.

I think RDNA4 will be very similar to RDNA3, apart from small architectural improvements and an updated ray-tracing core.

There were rumors in 2022 that AMD had issues with TSMC’s 3nm node and would be using 4nm instead. Current rumors don’t say which node RDNA4 will use, but seeing that TSMC’s N3E 3nm node is only just entering production and other customers will be using it, it makes sense that AMD would have to use the 4nm node in 2024. That could let AMD release RDNA 4 before Nvidia launches its new series. So I’m thinking RDNA4 could come out at the end of May and be available in June 2024.

Further, I was looking at how big the RDNA 4 flagship chip will be in mm² and what its performance could be. Take N31, which is built on the 5nm and 6nm nodes with a combined size of 530mm². The best RDNA4 chip would then be around 370-450mm² with 90-96 CUs like the RX 7900 series, but with a 256-bit bus, faster memory (since it will use GDDR7), and a rated TDP below 280W. I came to this conclusion because the 4nm TSMC node is only a very small improvement in transistor density: just 6% for N4 (N4X, or Nvidia’s custom 4N, might be a bit more).

Looking at the 4nm node and doing the math, it’s no wonder AMD can’t produce a high-end GPU next year: by my math, a GPU 20-30% more performant than an RX 7900 XTX would have to be bigger than 680mm² and have a TDP of 410W. That’s what the 4nm node gets you.
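The claim above can be sanity-checked with a quick back-of-envelope calculation. This is only a sketch under the post’s own assumptions (raster performance scaling roughly linearly with logic area, and N4 offering ~6% better density than N5); the constants are the figures from this post, not official numbers.

```python
# Back-of-envelope check of the die-size claim, using the assumptions
# stated in the post: performance scales ~linearly with logic area, and
# N4 is only ~6% denser than N5.
N31_AREA_MM2 = 530      # RX 7900 XTX: GCD + six MCDs combined
N4_DENSITY_GAIN = 1.06  # assumed N4-over-N5 density improvement
PERF_TARGET = 1.25      # midpoint of the 20-30% uplift over the 7900 XTX

naive_area = N31_AREA_MM2 * PERF_TARGET / N4_DENSITY_GAIN
print(f"naive die size: ~{naive_area:.0f} mm^2")  # ~625 mm^2
```

Linear scaling alone already lands around 625mm²; memory interfaces and cache that barely shrink, plus power headroom, push the realistic figure higher, toward the 680mm² above.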

But here are all the good things: the GPU, let’s say it’s called the RX 8800 XT, comes out in the middle of next year with 16GB of VRAM for $600, identical raster performance to the RX 7900 XTX, and somewhat better ray-tracing performance.

There are two AMD patents on ray tracing that I read a few months back. The first, published a year before the first RDNA 2 GPU, describes a ray-tracing core. The second was published in June this year and describes a GPU whose ray-tracing core adds a dedicated hardware traversal engine and a specific BVH cache. Without going into details, from what I understand the first patent describes the ray-tracing core in RDNA 2 and 3, while the second describes a similar but much improved way of doing ray tracing. I’m hopeful we’ll see this in RDNA4 (the patent arrived this June, and next June we’d have the RDNA 4 card, so it matches the schedule prior to RDNA 2): https://www.freepatentsonline.com/20230206543.pdf

  • 3d54vj@alien.topB
    1 year ago

    They already stated there won’t be a flagship this time around. It’s all about the mid-range now.

  • Nameless408@alien.topB
    1 year ago

    There’s no way we’d be getting a June release of RDNA 4 without some sort of announcement or more serious leaks. People be rumoring the RTX 5000 cards as a 2025 release, and yet we hear nothing about AMD and their next release is only 7 months out?

    Your analysis about performance / size is sensible, but there’s absolutely no way we would see a release so soon.

    • SoTOP@alien.topB
      1 year ago

      There is a pretty big difference in the timeline of credible leaks for Nvidia versus AMD over the past few major releases - AMD leaks start significantly closer to the launch date. So the fact that there are no RDNA4 leaks at this time doesn’t mean a summer release is too soon.

      Obviously, for the same reason, the release date is pure speculation at this point.

    • Puzzled_Cartoonist_3@alien.topOPB
      1 year ago

      That’s fair to say; I’m making a prediction here and it’s pure speculation.
      But the RX 5700 XT released July 7, 2019, the RX 6800 XT November 18, 2020, and the RX 7900 XTX December 13, 2022. So it would be about a year and a half from the previous series.

      • Nameless408@alien.topB
        1 year ago

        It was 1.5 years from the 5000 to the 6000 series, then 2 years from 6000 to 7000. Two years from the 7000 series would be December 2024 or later. Announcements also come a few months before; if we were getting a card in June 2024, we’d be hearing about it by now, at least rumors if not official announcements.

        An early release would be smart, especially if they don’t plan to compete at the high end in the next series of cards; get the folks who’d want to upgrade now, before Nvidia releases their cards in Q1 2025.

        • Puzzled_Cartoonist_3@alien.topOPB
          1 year ago

          Sure, more often than not it was closer to 2 years between series releases.
          I agree with you that an early release would be smart if they’re not competing in the high end with RDNA4.

      • SolidQ1@alien.topB
        1 year ago

        GFX12 was added to the Linux drivers this week. For comparison:

        For RDNA3, GFX11 was added on April 29, 2022 and the cards launched December 13, 2022. RDNA2 (GFX1030) was added on June 16, 2020 and released November 18, 2020.

        • FormalIllustrator5@alien.topB
          1 year ago

          Not even close - a December 2024 “paper” launch is wishful thinking, even…

          Some “Super” cards or a “refresh” would be possible - but a 5% chance at best…

    • bubblesort33@alien.topB
      1 year ago

      RDNA3 entries in Linux were made 6-8 months before the release of the 7900 series. The RDNA4 entries were made about a week ago. July would be 7 months from now.

      That being said, when the RDNA3 entries were made, the 7800 XT and 7600 XT entries were made too, and those didn’t launch until something like 12-18 months after the first entry. But I think 12 months from now at the latest makes sense: another November release date, like RDNA2.

    • CrzyJek@alien.topB
      1 year ago

      Yeah, I dunno where OP got June next year. He said around Q3 next year… which is July-September… meaning it could come as late as the end of September.

  • ET3D@alien.topB
    1 year ago

    Upping the CUs from 96 to 128 (and the ROPs similarly) would increase the GCD size from ~305 to ~372 mm², based on the die image (and leaving some blank space at the side), and the total to 596 mm². Whether performance would increase enough depends on the RAM bottleneck.

    It’s also worth noting that RDNA 3 apparently didn’t reach its expected clocks; if AMD managed to solve that problem, it would be possible to get extra performance without much (or any) extra die space.

    In general, if you’re just aiming to reduce chip size by removing two MCDs, that’s not much of a cost saving. It won’t make the chip the mid-range chip that’s rumoured.
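    The estimate above can be reconstructed with simple area bookkeeping: only the shader-array portion of the GCD scales with CU count, while the front end and PHYs stay fixed. The shader-array figure below is my own assumption, reverse-engineered to match the ~372 mm² result; it is not a measured number.

    ```python
    # Rough reconstruction of the GCD scaling estimate. SHADER_AREA_MM2 is
    # an assumed figure (the CU/ROP portion of the N31 GCD), chosen so the
    # numbers line up with the comment; it is not a die-shot measurement.
    GCD_AREA_MM2 = 305      # N31 graphics die on N5
    SHADER_AREA_MM2 = 201   # assumed area that scales with CU count
    MCD_AREA_MM2 = 37.5     # one memory-cache die on N6 (N31 uses six)
    CU_SCALE = 128 / 96     # proposed CU increase

    fixed = GCD_AREA_MM2 - SHADER_AREA_MM2       # front end, PHYs, etc.
    new_gcd = fixed + SHADER_AREA_MM2 * CU_SCALE
    total = new_gcd + 6 * MCD_AREA_MM2
    print(f"GCD: ~{new_gcd:.0f} mm^2, total: ~{total:.0f} mm^2")
    ```

    As a sanity check, 305 + 6 × 37.5 gives back the 530mm² N31 total, and the scaled GCD comes out at ~372mm² with a ~597mm² total, within a mm² of the 596 figure above.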

    • Puzzled_Cartoonist_3@alien.topOPB
      1 year ago

      In my opinion, RDNA3 not reaching its expected core clocks is mostly down to the poor performance of the TSMC 5nm node relative to what AMD wanted to achieve. Their target might have been RTX 4090-level flagship performance, but on 5nm plus 6nm with a 355W TDP that meant high power consumption, and to keep power down, core clocks had to suffer.

      The RX 7900 XTX Taichi model has roughly 250MHz more boost clock and roughly 50W higher power consumption than the reference AMD model (https://www.techpowerup.com/review/asrock-radeon-rx-7900-xtx-taichi/38.html).

      • FormalIllustrator5@alien.topB
        1 year ago

        I have a Taichi on water, 3.1GHz clock - what more could you need? ~425W consumption, 2.95GHz stable game clock…

        All that is before I update the BIOS to the Aqua monster… 550W+ without OC…

      • EnderOfGender@alien.topB
        1 year ago

        The thing is, people can’t seem to get much of a clock boost that actually translates into performance out of the 7900 series. 3200MHz overclocking tests on AIB cards don’t do much more than 10-15%, and consume 500W+.

        If you look at Time Spy results, going from a 2800-3000MHz game clock to a power-unlimited 3500MHz (which consumes over 500W), you gain a few thousand points.

        Comparing a random “stockish” Time Spy result with the 7900 XTX overclock from there, most of the gains came from the 6.0GHz overclock on the CPU, with the graphics score only about 12% faster with the GPU overclock.

        Unless I’ve missed something, it seems like the 7900 series is limited by more than power.

        • Puzzled_Cartoonist_3@alien.topOPB
          1 year ago

          I don’t see GPUs having standard clocks above 3GHz anytime soon. High core frequencies give a small chip much better performance, so it’s cost-efficient for AMD to build them that way, but high core clocks increase power consumption. AMD is trading chip size against frequency to get the performance.

      • ET3D@alien.topB
        1 year ago

        They could, but N5/N4 is significantly more costly than N7/N6 (was estimated to be 1.7x more costly, IIRC). That’s why AMD is trying to keep the size under control by using chiplets.

  • Zealousideal-Gift-70@alien.topB
    1 year ago

    AMD memes a lot but I don’t think they’d just pull their pants down and shit right in everyone’s faces who bought a 7900xtx that soon after.

    • Put_It_All_On_Blck@alien.topB
      1 year ago

      AMD is known for screwing launch buyers by dropping prices hard when the competition doesn’t allow them to keep their high prices. Look how hard and fast Zen 4 pricing fell after 13th gen launched. I know someone will think “it’s good they reacted to competition”, and that’s true, but AMD knew a $300 6-core non-X3D wouldn’t be competitive; they didn’t care, and let early adopters overpay.

      But I don’t think AMD is going to start pricing at $600; that’s too low a “starting” price for their “flagship”. I could see $750-$800, but definitely not as low as $600.

      • Zealousideal-Gift-70@alien.topB
        1 year ago

        I’m honestly pretty sad there won’t be an 8900 XTX or whatever. I’m not excited to see Nvidia just go ape shit with their prices.

  • xChrisMas@alien.topB
    1 year ago

    I don’t really care about a high end model being released. But I would appreciate a small generational gain with a large price cut.

    Wishful thinking but we need to get those prices in check

  • Friendly-Advantage79@alien.topB
    1 year ago

    The dust has yet to settle around RDNA3, so what’s the rush with RDNA4? Also, I just bought an RDNA2 card and I’m having tons of fun with it.

      • Keldonv7@alien.topB
        1 year ago

        Totally off-topic, but maybe RDNA3 will actually get the features it was supposed to get… like Anti-Lag+, or FSR3, which has been printed on boxes for months but only exists in two glorified tech demos right now.

  • Duke_Vladdy@alien.topB
    1 year ago

    I love threads like these where people talk about hardware specifics. Idk wtf these magic words are but if they help my frames I’m all for it

  • Nunkuruji@alien.topB
    1 year ago

    The one thing I’m banking on is vendors will prioritize expensive high-margin AI chips over consumer GPU chips in terms of their fab allocation. Expect delays or only high-end/high-margin releases in the first wave.

  • G-WAPO@alien.topB
    1 year ago

    I see no one coming to the conclusion that there are only leaks about N44/N48 because AMD doesn’t need a larger die for a multi-GCD (GPU chiplet, MCM-approach) SKU.

    Put two N44s together and you’ve got something like 80% of the equivalent of 2x 7900 XTX… quite capable of competing with a 5090, in theory.

    • boomstickah@alien.topB
      1 year ago

      yeah I think that’s clearly going to be what they do whenever they decide to do chiplets again

    • Geddagod@alien.topB
      1 year ago

      I see no one coming to the conclusion that there are only leaks about N44/N48 because AMD doesn’t need a larger die for a multi-GCD (GPU chiplet, MCM-approach) SKU.

      Bcuz it sounds like copium lol

      Put two N44s together and you’ve got something like 80% of the equivalent of 2x 7900 XTX… quite capable of competing with a 5090, in theory.

      Except scaling never works like that, especially not once you add in the problems of chiplets.

      • G-WAPO@alien.topB
        1 year ago

        It was a spitball % guess… chiplets for CPUs scale excellently (up to about 64 cores; it drops off steeply after that)… whether GPUs scale like that remains to be seen (doubtful)… but if you think AMD has gone to all this trouble to break away from monolithic designs with MCDs and GCDs only to not iterate with a multi-GCD design… then I dunno what to tell you bro…🤷

        • Geddagod@alien.topB
          1 year ago

          …but if you think AMD has gone to all this trouble to break away from monolithic designs with MCDs and GCDs only to not iterate with a multi-GCD design… then I dunno what to tell you bro…🤷

          It’s not as impressive as you make it out to be. Splitting the MCDs and GCDs is certainly pretty nice, but both Intel and AMD have shown better and more advanced packaging capabilities in their datacenter GPUs (MI300 and PVC); the only reason those haven’t come to consumers yet is chiefly cost and complexity.

          However, if AMD was using something MI300-esque with RDNA 4… and it failed, then yes, it stands to reason that only the monolithic SKUs would remain.

          Alternatively, the base RDNA 4 arch could just be so cooked that they thought it wasn’t worth the effort of developing the more expensive and complicated chiplet SKUs.

          Or who knows, maybe it’s a combination of the two, or something else.

          Also, the idea that AMD has N44/N48 and can just glue the two together to act as their flagship is wrong. There has to be additional interconnect logic, among other things, added to the two dies. If the chiplet dies are canned, then they would have to do expensive and time-consuming respins of their existing planned RDNA 4 dies (N44/48) before those could be used in chiplet designs.

          • G-WAPO@alien.topB
            1 year ago

            There’s nothing to say that isn’t the case (with what you suggested in the last part of your message)… who knows? All these leakers throw so much shit at the wall, and tiny bits of it stick; then they point those out, jump up and down, and say “I told you so!”, even though there’s been a tonne of false leaks and misdirection that everyone has had to sift through along the way.

            Take, for instance, the recent leak of Linux driver updates mentioning gfx1200: it could be a larger “N41” die, or it could be a nothing-burger.

  • jengaFier@alien.topB
    1 year ago

    No, we won’t see an AMD flagship, sadly.

    As you already know, AMD won’t compete with Nvidia’s high-end 5000-series Blackwell. Sure, AMD could still release a high-end flagship GPU, like a 6950 XT or 7900 XTX, but I have high doubts tbh; I just don’t see it based on the information we already have laid out.

    If anything, they would likely try to go for 3nm to incorporate AI chips like Nvidia, perhaps. But if they had the means, they wouldn’t avoid competing with Nvidia’s high-end 5000 series. More realistically we’ll see a 4-5nm architecture, or at the very least a more efficient 5-6nm on an already solid 530mm².

    Safe to assume, though, that we’ll get new GPUs by Q3 or Q4 2024 regardless. The reason is that GDDR7 will be released in mid-2024, as Micron officially announced. Some GPUs have also seen over 2 years of use since their release, so timing-wise it’s ideal to start a new GPU generation. I’m pretty sure they won’t bother releasing anything before GDDR7 arrives; consumers will hold out until then.

    • Defeqel@alien.topB
      1 year ago

      AMD won’t compete with Nvidia’s high-end 5000 Blackwell series

      Not with RDNA4, but that doesn’t mean they won’t compete with the 5000 series at all; the RDNA5 high end is apparently coming out in 2025.

  • LongFluffyDragon@alien.topB
    1 year ago

    MLID is a bullshit mill that spits out a bunch of self-contradictory guesses based on absolutely no evidence, in the hopes one of them will be close to correct.

  • FireSilicon@alien.topB
    1 year ago

    Please don’t watch MLID; all his videos turn out to be inaccurate, just pure copium before the actual product launches.

    • floofandmemes@alien.topB
      1 year ago

      Not really. He can be wrong, of course; take everything with a grain of salt, since most of what he has are targets rather than actual testing data.

      Now, RedGamingTech on the other hand… I swear he just makes shit up all the time to get onto trending; everything I’ve seen him say has in fact been outright wrong.