And still high idle power usage (can't get mine under 15 W). My NVIDIA/AMD cards idle around 4-5 W even with a screen attached… (I'm using an A380 in a server for video encoding/decoding.)
When Arc was first announced I was pretty excited to get Intel's super-low-power iGPU and their GPU-splitting tech (GVT-g) in a discrete card. Then they canceled all future GVT-g development and the cards crapped the bed on idle power consumption.
You can get it to 1 W with a 12th-gen+ CPU that has an iGPU:
https://www.reddit.com/r/IntelArc/comments/161w1z0/managed_to_bring_idle_power_draw_down_to_1w_on/
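For what it's worth, results like the one in that thread generally depend on PCIe ASPM being enabled end to end: "Native ASPM" / L1 substates in the BIOS plus the OS link-state policy set to power saving. On Linux the policy lives behind a standard kernel sysfs knob; a rough sketch of checking and changing it (the PCI address shown is a placeholder — substitute your own card's):

```shell
# Show the current PCIe ASPM policy; the active one is in brackets,
# e.g. "default [performance] powersave powersupersave"
cat /sys/module/pcie_aspm/parameters/policy

# Switch to the power-saving policy (needs root). This is a runtime
# change only, and it assumes ASPM/L1 substates are already enabled
# in the BIOS - without that, the card won't reach its low idle state.
echo powersave | sudo tee /sys/module/pcie_aspm/parameters/policy

# Verify what link states the card's PCIe link actually negotiated.
# "03:00.0" is a placeholder address - find yours with `lspci | grep -i vga`.
sudo lspci -vv -s 03:00.0 | grep -i aspm
```

On Windows the equivalent is setting "PCI Express → Link State Power Management" to "Maximum power savings" in the active power plan.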
iGPU just bypasses the dGPU. That shouldn’t be necessary.
You can also do this with Nvidia and AMD GPUs — that's kinda the whole way laptops conserve battery: they just shut off the dGPU when it's not in use.
It's cool and all, but changing the whole platform just to help the GPU's idle power consumption sounds like a terrible solution.
Especially when you can just buy an AMD or Nvidia GPU instead and get almost all the power savings that way.
This sub is just a site for this long-haired potato guy?
Where is Intel's future honestly without GPUs? Second-rate CPUs? Manufacturing silicon for everyone else? I just don't see how Intel continues as an industry leader without expanding into other segments like GPUs. They have sold off a bunch of other businesses in the last few years. Maybe I just don't understand their business enough.
I never thought Intel would get rid of Arc dGPUs, but now the rise of AI could be a big, solid reason for shareholders to keep Arc going anyway.
I don’t see AMD beating Intel in AI
Do we have any clue when to expect Battlemage?
CES is the tentative announcement prediction, with wider availability in Q2.
Praying for a 3 way fight in the mid range.
Praying for a 2 way fight at the top range.
They need to beat 4070 at the beginning of the cycle. Or at least 4070 Ti level mid-cycle. And that’s just old x070 non-Ti level performance to begin with.
Arc is realistically a bigger threat to AMD than it is to Nvidia. The second half of the 2020’s will be AMD and Intel competing over second place for desktop dGPUs.
For mobile, Arc iGPUs, while obviously not matching dedicated GPUs, can realistically offer good-enough performance for people who want to do light gaming; stepping up to a low-end dGPU just to make sure Minecraft, Fortnite, etc. can at least run may not be worth the extra cost.
Either way, I think Intel's heavy focus on putting Arc in all of their Core Ultra CPUs and heavily investing in iGPUs could be a bigger disruptor than their desktop dGPUs, at least in the near term.
Arc is no threat whatsoever to Nvidia, not unless Intel manages to scale the architecture up to enterprise-grade levels and develop something akin to the CUDA API.
Intel’s competitor to CUDA is oneAPI and SYCL. Intel poses no threat to Nvidia GPUs in datacenter in the near term, but that doesn’t mean Intel won’t still secure contracts.
Intel's biggest threat to Nvidia is in Nvidia's laptop dGPU volume segment. Arc offers synergies with Intel CPUs — a single vendor for both CPU and GPU for OEMs, and likely bundled discounts as well. A renewed focus on improving iGPUs also threatens some of Nvidia's low-end dGPUs in laptops: customers no longer have to choose between a very poor performance iGPU and stepping up to a dGPU, and iGPUs will start to become good enough that some customers will simply opt not to buy a low-end mobile dGPU in the coming years.
Not to mention that Intel could have consumer AI tech in nearly -every- laptop sold in 5 years with just an Intel iGPU. Not to mention mini-PCs etc., especially if LNL pans out well. That's a scale of deployability that Nvidia simply cannot compete with.
A renewed focus on improving iGPUs also threatens some of Nvidia’s low end dGPUs in laptops - customers don’t have to choose between very poor performance iGPU or stepping up to a dGPU
AMD has had iGPUs in laptops for a long time now, and the better CPUs for more than a couple of the past few years, yet laptops are still sold with Nvidia dGPUs even when they have decent AMD iGPUs.
It might kill the lowest of the lowest end of laptop dGPUs, but I think Nvidia’s pricing is doing that faster than Intel’s success with Arc.
The issue with AMD laptops is availability and the mixing of generations under similar SKU numbers. There’s only a handful of Zen4 laptops in the wild, and they’re mixed in with Zen2 and Zen3 parts, leading to a confusing experience for the average buyer. So, people will either go for an Intel laptop, or find an Nvidia dGPU laptop for the ‘upgrade’.
They have a better chance of doing it than AMD do.
Not to mention the inclusion of XMX cores in Arrow Lake, and presumably beyond, could provide XeSS upscaling similar to what DLSS is doing, all without a dGPU.
Let's hope Intel continues. Nvidia seems to be concentrating on AI, which isn't good for gamers, and leaving a sole manufacturer (AMD) with a monopoly wouldn't be good either.
Why isn't it good for gamers?
"Gamers" as a whole, just in case of misunderstanding.
Nvidia are earning a ton of money on AI hardware; they would be fools not to shift their manufacturing capacity toward AI and away from consumer graphics cards. This will make graphics cards scarcer and more expensive. Just look at what the crypto boom did and still does — graphics cards cost an arm and a leg, and it will get much, much worse with the AI boom.
People fighting in the comments about what Nvidia/AMD/Intel should or could do.
Summed up, it's: Intel can't be Nvidia. AMD is screwed.
So Smooth Sync is just not working in most games? What if you tried it in something totally unexpected, like The Witcher 1 or 2? Something old, or something brand new? Is it a whitelist, where they select which games to enable it for? Or a blacklist, where they disable it for certain games exhibiting problems?