this post was submitted on 03 Nov 2024
272 points (98.9% liked)

Panther Lake and Nova Lake laptops will return to traditional RAM sticks

[–] 1rre@discuss.tchncs.de 34 points 2 days ago (1 children)

And here I was thinking Arc and storage were the only semi-competitive wings of Intel... They just needed a couple of years for adoption to increase.

[–] Buffalox@lemmy.world 17 points 2 days ago* (last edited 2 days ago) (3 children)

I've commented many times that Arc isn't competitive, at least not yet.
Although they were decent performers, they used twice the die size for similar performance compared to Nvidia and AMD, so Intel has probably sold them at very little profit.
Still, I expected them to try harder this time, because the technologies needed to develop a good GPU are strategically important in other areas too.
But maybe that's the reason Intel recently admitted they couldn't compete with Nvidia on high-end AI?

[–] InverseParallax@lemmy.world 11 points 2 days ago (1 children)

Arcs are OK, and the competition is good. Their video encode performance is absolutely out of this world though, just incredible.

Mostly, they help bring the iGPU graphics stack and performance up to par, and keep games targeting them well. They're needed for that alone, if nothing else.

[–] Buffalox@lemmy.world 6 points 2 days ago (1 children)

They were competitive for customers, but only because Intel sold them at no profit.

[–] InverseParallax@lemmy.world 4 points 2 days ago* (last edited 2 days ago) (1 children)

I mean, fine, but it's a first gen; they can fix the features and yields over time.

First gen chips are rarely blockbusters; my first gen chips were happy just to make it through bringup and customer eval.

What's more, because software is so much of their stack, they had huge headroom to grow.

[–] Buffalox@lemmy.world 1 points 2 days ago* (last edited 2 days ago) (1 children)

First gen chips are rarely blockbusters

True, yet Nvidia was a nobody that arrived out of nowhere with the Riva graphics cards and beat everybody else thoroughly: ATi, S3, 3Dfx, Matrox, etc.

But you are right, these things usually take time; Microsoft, for instance, was prepared to spend 10 years without making money on the Xbox because they saw its potential in the long run.

I'm surprised Intel consider themselves so hard pressed that they're already thinking of giving up.

[–] InverseParallax@lemmy.world 10 points 2 days ago (1 children)

True, yet Nvidia was a nobody that arrived out of nowhere with the Riva graphics cards and beat everybody else thoroughly: ATi, S3, 3Dfx, Matrox, etc.

Actually, they didn't.

This was their first: https://en.wikipedia.org/wiki/NV1

A complete failure: overpriced, undercapable, one of the worst cards on the market at the time, and used quadratics instead of triangles.

NV2 was supposed to power the Dreamcast, and kept the quads, but was cancelled.

But the third one stayed up! https://youtu.be/w82CqjaDKmA?t=23

[–] Buffalox@lemmy.world 4 points 2 days ago* (last edited 2 days ago) (1 children)

You are right.

and used quadratics instead of triangles.

Now that you mention it, I remember reading about that, but I'd completely forgotten.
I remembered it as the Riva coming out of nowhere. As the saying goes, first impressions last. And I only learned about NV1 much later.

But the third one stayed up!

👍 😋

But Intel also made the i815 GPU, so Arc isn't really the first.

[–] InverseParallax@lemmy.world 4 points 2 days ago (1 children)

Oof, yeah, they actually had another they didn't release, based off Pentium cores with AVX-512, basically Knights Landing with software support for graphics.

They were canceling projects like it was going out of style, which is sad; that would have been amazing for AI.

[–] Buffalox@lemmy.world 3 points 2 days ago (1 children)

Yes, there was the Xeon Phi, Knights Landing, with up to 72 cores and 4 threads per core!
Knights Landing did go into production, but it was more a compute unit than a GPU.

I'm not aware they tried to sell it as a GPU too? Although, if I recall correctly, they made some real-time ray tracing demos.

[–] InverseParallax@lemmy.world 7 points 2 days ago* (last edited 2 days ago)

So, trying not to dox myself, I worked with the architect twice.

Knights Ferry was derived directly from Larrabee (the GPU): P54Cs with a pre-AVX-512 vector ISA.

KNC was a die shrink with more cores. Both of these were PCIe accelerators only.

KNL had full Airmont Atom cores with SMT4, basically meaningful cores with proper AVX-512. You could also boot them with Linux, or use them as a PCIe accelerator.

KNM added ML instructions, basically 8/16-bit float and faster SIMD.

They cancelled KNH.

I interviewed some of the actual Larrabee guys; they were wild. There was a lot of talk about dynamic translation; they were trying to do really complex things, but when people talk like that it makes me think they were floundering on the software and just looking for tech-magic solutions to fundamental problems.

Intel always dies when the software gets more complex than really simple drivers; it's their Achilles' heel.

KNL also had the whole MCDRAM on package for basically HBM bandwidth, but that didn't actually work very well in practice, again due to software issues (you have to pick where you allocate, and using it as an L4 cache was not always effective).
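
For context on "you have to pick where you allocate": in KNL's flat memory mode, bandwidth-critical buffers had to be placed in MCDRAM explicitly, usually through the memkind/hbwmalloc API. A minimal sketch of what that looked like, assuming libmemkind is available; the buffer and sizes here are illustrative:

```cpp
// Minimal sketch: explicitly placing a bandwidth-critical buffer in KNL's
// MCDRAM (flat mode) via memkind's hbwmalloc API. Build with -lmemkind.
#include <hbwmalloc.h>
#include <cstdio>
#include <cstdlib>
#include <cstring>

int main() {
    const size_t n = 1u << 20;  // illustrative buffer size

    // hbw_check_available() returns 0 when high-bandwidth memory is usable.
    const bool have_hbm = (hbw_check_available() == 0);

    // Place the hot buffer in MCDRAM if we can, otherwise fall back to DDR.
    double *buf = have_hbm
        ? static_cast<double *>(hbw_malloc(n * sizeof(double)))
        : static_cast<double *>(std::malloc(n * sizeof(double)));
    if (!buf) return 1;

    std::memset(buf, 0, n * sizeof(double));
    std::printf("hot buffer lives in %s\n", have_hbm ? "MCDRAM" : "DDR");

    // Free with the allocator that was actually used.
    if (have_hbm) hbw_free(buf); else std::free(buf);
    return 0;
}
```

Getting that placement right, allocation by allocation, is exactly the kind of software burden described above; cache mode avoided the code changes but, as noted, using MCDRAM as a transparent last-level cache didn't always pay off.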

[–] hamsterkill@lemmy.sdf.org 3 points 2 days ago

Still, I expected them to try harder this time, because the technologies needed to develop a good GPU are strategically important in other areas too

I think I read somewhere that they're having problems getting AIB partners for Battlemage. That would be a significant impediment for continuing in the consumer desktop market unless Battlemage can perform better (business-wise) than Alchemist.

They'll probably continue investing in GPUs even if they give up on Arc; it might just be for the specialized stuff.

[–] 1rre@discuss.tchncs.de 2 points 2 days ago (2 children)

Yeah, true. Plus I bought my A770 at pretty much half price during the whole driver-issues period, and so eventually got a 3070-performing card for like $250, which is an insane deal for me, but there's no way Intel made anything on it after all the R&D and production costs.

The main reason Intel can't compete is the fact CUDA is both proprietary and the industry standard; if you want to use a library, you have to translate it yourself, which is kind of inconvenient, and no datacentre is going to go for that.

[–] Trainguyrom@reddthat.com 3 points 2 days ago

The main reason Intel can’t compete is the fact CUDA is both proprietary and the industry standard

Funnily enough, this is actually changing because of the AI boom. Would-be buyers can't get Nvidia AI cards, so they're buying AMD and Intel and reworking their stacks as needed. It helps that there are also translation layers available now which translate CUDA and other otherwise vendor-specific stuff to the open protocols supported by Intel and AMD.
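
To make "reworking their stacks" concrete: AMD's HIP mirrors the CUDA programming model closely enough that tools like hipify can do much of a port mechanically, which is the shape of the work (or the work a translation layer hides) being described here. A rough sketch of a ported SAXPY kernel, assuming a ROCm/hipcc toolchain; the kernel and sizes are illustrative:

```cpp
// Rough sketch: a CUDA-style SAXPY ported to AMD's HIP. The API maps almost
// one-to-one (cudaMalloc -> hipMalloc, cudaMemcpy -> hipMemcpy, ...), which is
// what hipify automates. Build with hipcc.
#include <hip/hip_runtime.h>
#include <vector>
#include <cstdio>

// Kernels are written exactly as they would be in CUDA.
__global__ void saxpy(int n, float a, const float *x, float *y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) y[i] = a * x[i] + y[i];
}

int main() {
    const int n = 1 << 20;                       // illustrative problem size
    std::vector<float> hx(n, 1.0f), hy(n, 2.0f);

    float *dx = nullptr, *dy = nullptr;
    hipMalloc(&dx, n * sizeof(float));           // was cudaMalloc
    hipMalloc(&dy, n * sizeof(float));
    hipMemcpy(dx, hx.data(), n * sizeof(float), hipMemcpyHostToDevice);
    hipMemcpy(dy, hy.data(), n * sizeof(float), hipMemcpyHostToDevice);

    // Same triple-chevron launch syntax as CUDA under hipcc.
    saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, dx, dy);

    hipMemcpy(hy.data(), dy, n * sizeof(float), hipMemcpyDeviceToHost);
    std::printf("y[0] = %f (expect 4.0)\n", hy[0]);

    hipFree(dx);
    hipFree(dy);
    return 0;
}
```

The Intel route typically goes through SYCL/oneAPI instead, which usually means a bit more restructuring of kernels than a mechanical rename, hence the appeal of translation layers.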

[–] Buffalox@lemmy.world 4 points 2 days ago (1 children)

The main reason Intel can’t compete is the fact CUDA is both proprietary and the industry standard

AFAIK the AMD stack is open source; I'd hoped they'd collaborate on that.

[–] 1rre@discuss.tchncs.de 3 points 2 days ago* (last edited 2 days ago)

I think Intel supports it (or at least a translation layer), but there's no motivation for Nvidia to standardise on something open source, as the status quo works pretty well.