• ExTrafficGuy@alien.topB

    It’s like the late aughts, when board partners were strapping 1GB of VRAM onto really low end GPUs, then marketing them as “1337 HaXorz Edition”, or some dumb fluff like that.

  • Terrykickass@alien.topB

    You’re better off buying an RX 6600 or above. I bought my RX 6600 ASRock Challenger D for £210 in July; it runs every game in my library with a Ryzen 5 5600X CPU and 32 GB of memory.

  • Jism_nl@alien.topB

    The RX580 “16GB” is a modified card flashed with the Pro BIOS and fitted with 2GB memory chips in place of the usual 1GB ones, making it a 16GB model. However, clock for clock there was barely any difference when you put the 4GB model against the 8GB one, so I’m sure the 16GB model would not show any benefit at this point.

    The RX6700XT is about 2.5x as fast, carries 12GB of memory, and is considerably more efficient than the Polaris generation. If you’d have to pay a steep price for the RX580 16GB, it’s best to leave it alone, or buy it purely as a collector’s item.

    The extra 8GB is only beneficial in workloads that can actually use it; beyond that it would make no difference in games compared to an 8GB model. I’d rather see them fit faster GDDR memory than more of it.

    Polaris has always been bandwidth starved. You can tell because performance stops scaling with GPU clock beyond roughly 1000~1100MHz; going up to 1366MHz, which was the default clock, doesn’t really yield anything extra. The real gains come from tuning and tweaking the memory (tighter timings, higher clocks). I’d say an RX580 with 2500MHz GDDR5 would actually compete with not just a 1070 but perhaps even a 1080.
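
    For anyone curious, here’s a very rough back-of-the-envelope sketch of that bandwidth comparison in Python. The 256-bit bus and the 1070/1080 figures are the public specs; the 2500MHz point is just the hypothetical tune above, and raw peak bandwidth obviously isn’t the same thing as game performance:

    ```python
    # GDDR5 is quad-pumped: effective data rate per pin = memory clock * 4.
    BUS_WIDTH_BITS = 256  # RX 580 memory bus

    def gddr5_bandwidth_gbs(mem_clock_mhz, bus_bits=BUS_WIDTH_BITS):
        """Peak memory bandwidth in GB/s at a given GDDR5 memory clock."""
        gbps_per_pin = mem_clock_mhz * 4 / 1000   # Gbit/s per pin
        return gbps_per_pin * bus_bits / 8        # bits -> bytes

    print(f"RX 580 stock (2000MHz): {gddr5_bandwidth_gbs(2000):.0f} GB/s")  # ~256 GB/s
    print(f"RX 580 tuned (2500MHz): {gddr5_bandwidth_gbs(2500):.0f} GB/s")  # ~320 GB/s
    # Reference points: GTX 1070 ~256 GB/s, GTX 1080 (GDDR5X) ~320 GB/s
    ```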

  • Xtraordinaire@alien.topB

    A 3080 20GB seems like a good way to future-proof a bit more… But the 580 is just too old to be relevant.

  • penguished@alien.topB

    That’s pretty sick. Can the 580 actually do anything with most AI tools, though? The ones I see all use CUDA.

  • Hombremaniac@alien.topB

    I believe Nvidia’s 3000 series would have been stellar if the 3060 Ti and 3070/Ti had at least 12GB, and the 3080/Ti 16GB, of VRAM. Owners of those cards would have had almost zero incentive to upgrade, which is why Ngreedia gimped these GPUs on VRAM. A sad and horrible decision.

    It would be nice to see benchmarks of those 3080s with 20GB of VRAM.

  • Mightylink@alien.topB

    Do they even come with a warranty? lol. I’d imagine these have been beaten to hell already and will probably fail in less than a year.

  • shalol@alien.topB

    Glad to see whoever had these chips rusting in a flooded warehouse gave up on the crypto-mining unicorn and figured out how to put them to a real-world purpose.

    • floeddyflo@alien.topB

      They are moving the RX 580 onto a separate, maintenance-only driver branch. That lets AMD spend more time fixing driver issues and adding new features for modern cards, instead of having to make sure every change is compatible with the 400/500 series and then troubleshoot it there too, and vice versa (400/500-series bugs and features no longer meddle with the newer cards’ drivers). The old cards aren’t getting new features anymore, but they are still getting modern game support and security fixes. AMD isn’t dropping support; the card just won’t be getting new features.

  • pullupsNpushups@alien.topB

    That 16GB 580 sounds cool for memory-constrained scenarios, but it’d obviously make more sense to just get a more powerful card.

    • xrailgun@alien.topB

      It would’ve been a good budget option for hobby AI if it weren’t roughly 200x slower than an Intel A770 for SD/LLMs.

      • pullupsNpushups@alien.topB

        I agree. I did deep fakes years ago with my 580, and having 16GB of VRAM would’ve been great for dealing with larger models and whatnot, but it’s past its prime at this point. Most budget cards today would be faster than it in compute, so that would be the reasonable option compared to the novelty of a 16GB 580.

        Also, is the A770 really 200x faster in SD/LLMs or is that an exaggeration? I’m asking out of curiosity. I did some SD with DirectML and it was certainly slow, around 2 minutes for a set of images.

          • pullupsNpushups@alien.topB

            That sounds about right. Nvidia and Intel seem to have better compute API support than AMD, so it doesn’t surprise me that the A770 performs much better with OpenVINO compared to the 6700. Alchemist is generally strong in compute anyway, so that helps too.

    • Julia8000@alien.topB

      Yeah, you really need at least 3070-level performance to properly use more than 8GB of VRAM without crippling performance, and that’s already more than 100% faster than the 580 in most modern games. So while the 580 was a very decent card for its time, it has obviously aged and is now only low end, with horrible efficiency compared to anything modern.

    • jmas081391@alien.topB

      Yep, Dawid already made a video comparing an original RX 580 8GB against the circumcised RX 580 with 16GB. Out of all the games he tested, only one actually used that 16GB of memory, and it ran below 10fps! lmao.

      • pullupsNpushups@alien.topB

        Heh, caught me off guard with that description.

        That video sounds about right. Where more VRAM really helps is compute: machine learning, video and photo editing, CAD, etc. I was doing image upscaling on the frames of a video a while ago, and it’s easy to fill up all 8GB of VRAM by upscaling enough images at once.
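
        Just to put rough numbers on that (a minimal sketch; the 40x working-set multiplier is a made-up assumption, not a measurement, and real usage depends entirely on the model and framework):

        ```python
        # Hypothetical VRAM estimate for upscaling a batch of 1080p frames at once.
        def est_batch_vram_gb(width, height, batch, bytes_per_px=2, overhead=40):
            """Input tensor size times an assumed per-image working-set multiplier."""
            per_image = width * height * 3 * bytes_per_px * overhead  # bytes
            return per_image * batch / 1024**3

        print(f"batch of  4: ~{est_batch_vram_gb(1920, 1080, 4):.1f} GB")   # ~1.9 GB
        print(f"batch of 16: ~{est_batch_vram_gb(1920, 1080, 16):.1f} GB")  # ~7.4 GB
        ```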

        But the 580 is old by now, so it wouldn’t make sense to get a 16GB one for that either.

        • cinaak@alien.topB

          It wouldn’t make much sense at all, but I’d like to have one. I really wouldn’t mind seeing a Vega VII modded like this with more RAM, or even a Vega 56 or 64. Those can be set up to sip power and remain very capable for compute.

          I’m actually loving what the crypto crash has been doing. So many cool options for a decent price, at least where I live.

          • AndyPufuletz123@alien.topB

            Vega uses HBM memory that is on the package, unlike GDDR which just sits on the PCB. That mod would be much more complicated if not outright impossible.

            • cinaak@alien.topB

              Ah, I didn’t think of that. I was just thinking about how the Frontier Edition is the exact same thing, just with more RAM and modified firmware, and how even larger HBM2 stacks were available, so the possibility of a Vega 64 with even more RAM seemed exciting. Properly tuned with amdmemorytweak, or even just a good undervolt, they can be very efficient.

              Even the 5500 XT 8GB could provide some decent compute power with a bit of tuning, or by going in with MorePowerTool and really unlocking the memory.

              It would be awesome to have total control over your cards like that: say, I want this GPU but with this much RAM, at this speed, using only this much power, and to easily make that a reality.

              Idk, I’m really into this stuff. I have several of the Chinese motherboards with old chipsets and modded BIOSes on them too; it’s kinda cool having a cheap unlocked Xeon system that is, for the most part, totally usable. Years before that I was modding ASUS-made HP motherboards to overclock Xeons, and running mobile AMD chips in dual-CPU systems to OC.

              This could keep a lot of stuff out of the dumps and in use, which I think is a good thing. Hopefully they figure this out on better cards; the 580 is pretty weak.

              • Entr0py64@alien.topB

                All Vega cards support HBCC. You can just turn HBCC on and tell games you have more VRAM, which does provide a performance boost in games that actually need it. If you don’t need it, it does nothing.

                • cinaak@alien.topB

                  I remember that made some benchmarks really happy. I haven’t tried it in years; it seemed pretty hit or miss. I wonder if there’s any reason to try it out again. It seemed to benefit the lows more than anything.

                  Idk, I’ll probably keep using these cards for quite a while. The VII is a beast for compute, and the 64 and 56 are both good too; I can still game on them at more than acceptable frame rates and resolutions. I guess what I consider acceptable might not be the norm, though. I was watching videos about various handhelds for my son, who wants to buy one, and the people in them were calling 52 fps unplayable.

          • pullupsNpushups@alien.topB

            That’s fair. I’d love to have one for a collection as well. Crypto has indeed churned out some cool mods of pre-existing cards. The machine learning boom might also incentivize more mods of this nature, since that benefits from VRAM just as crypto did.

        • algaefied_creek@alien.topB

          I mean, for a local LLaMA or Stable Diffusion setup, the 580 could be nice.

          But you’d have to use OpenVINO with an OpenCL backend instead of ROCm, as I think these cards have been dropped from ROCm.

          … unless there is a good Vulkan backend these days.
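
          If anyone wants to try the OpenCL route, a minimal sketch for confirming the 580 is even visible to an OpenCL-based backend (assumes the pyopencl package and some working OpenCL runtime for the card, e.g. Mesa’s, are installed):

          ```python
          # List OpenCL platforms/devices so you can see whether the RX 580 shows up
          # before pointing an OpenCL backend at it.
          import pyopencl as cl

          for platform in cl.get_platforms():
              print(f"Platform: {platform.name}")
              for dev in platform.get_devices():
                  print(f"  {dev.name}  ({dev.global_mem_size / 1024**3:.1f} GB)")
          ```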

      • 118shadow118@alien.topB

        My previous GPU was a 4GB RX 580 and it was definitely VRAM-limited. 8GB is probably the sweet spot for that card (I might’ve kept mine longer if it had 8GB), but, at least for gaming, I don’t really see the point in having 16GB on a card with that level of performance.

      • Current_Finding_4066@alien.topB

        Maybe it makes some sense in the AI arena, but I tend to agree that going with a better GPU is the way to go.

        As for using an RTX 3080… for AI, I think the Chinese have issues with sanctions.

    • Frubanoid@alien.topB

      The power draw was already a lot with a 3080 12GB. I traded it in for a 4070 Ti (a higher-end AIB model, on sale for $800 at the time) and cut my overall PC power consumption by 30-50%! Since I pay the electric bill, this feels good, and I’m getting more frames. The 12GB has been more than adequate so far and handles 4K well enough in every application I’ve tried. ☺️
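
      Back-of-the-envelope on what a lower draw like that is worth over a year (all three inputs below are placeholder assumptions, not the actual numbers from this system):

      ```python
      # Hypothetical yearly savings from a system that draws ~150 W less under load.
      watts_saved   = 150    # assumed delta between the two systems while gaming
      hours_per_day = 3      # assumed heavy-load hours per day
      rate_per_kwh  = 0.15   # assumed electricity price in USD/kWh

      kwh_per_year = watts_saved / 1000 * hours_per_day * 365
      print(f"~{kwh_per_year:.0f} kWh/year, roughly ${kwh_per_year * rate_per_kwh:.0f}/year")
      ```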

      • pullupsNpushups@alien.topB

        Makes sense. The energy savings probably wouldn’t be hugely noticeable over the course of a year of normal usage, but it’s still a decent trade: a more efficient card that’s also faster. Pretty good.