
tl;dr: he says "x86 took over the server market" because it was the same architecture developers at companies had on their own machines, which made it very easy to develop applications locally and then ship them to the servers.

Now this, along with the other points he made, is a very good explanation of how and why it is hard for ARM to go mainstream in the datacenter. However, I also feel like he kind of lost touch with reality on this one...

He's comparing two very different situations, or rather, two very different eras. Developers aren't tied to the underlying hardware the way they used to be. The software development market has evolved from C to very high-level languages such as JavaScript/TypeScript, and the majority of software is, or will be, written in those languages, so the CPU architecture becomes largely irrelevant.
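As a minimal sketch of that architecture independence (my own hypothetical example, not something from the post): the exact same script runs unchanged on x86_64 and arm64, and can at most report which one it happens to be running on.

```typescript
// Pure JS/TS is architecture-independent; the runtime handles the ISA.
console.log(`Node ${process.version} on ${process.arch}`); // "x64" or "arm64"

const total = [1, 2, 3].reduce((acc, n) => acc + n, 0);
console.log(`total = ${total}`); // identical output on either architecture

// The abstraction mainly leaks at native addons (.node binaries) and
// container base images, which do have to be built per architecture.
```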

Obviously, very big companies such as Google, Microsoft, and Amazon are more than happy to pay the little "tax" of making sure JavaScript runs fine on ARM rather than keep paying the big bucks x86 commands...

What are your thoughts?

[–] TCB13@lemmy.world 2 points 1 year ago (1 children)

Things that look like slam dunks in theory never quite are in practice. Weird bugs pop up from time to time; and believe me, they will!

It might be rare; you may only see it once or twice in a project. But when it happens, you're gonna want to be ready, or people will question your ability to do your job.

Yes, however price is more important than all of that. If your management knows it can save 20% on its cloud spending by running ARM, they'll run ARM and have you deal with those rare bugs.

[–] mea_rah@lemmy.world 1 points 1 year ago (1 children)

"If" being the key word here. There are nuances to be considered. One DB might run really well on arm, the other not so much.

I'm saying this as a huge fan of ARM servers. They are amazing and often save a lot of money essentially for free (in practice, only a few characters change in Terraform). In AWS, with the hosted services (OpenSearch and such), there's usually no good reason to pay extra for x86 hardware, especially since most of the intricacies are handled by AWS.
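To give a sense of how small that change is, here's a hypothetical sketch of the same idea in AWS CDK (TypeScript); the commenter was talking about Terraform, where the equivalent diff is roughly swapping `instance_type = "m6i.large"` for `"m6g.large"` plus an arm64 AMI:

```typescript
import * as cdk from 'aws-cdk-lib';
import * as ec2 from 'aws-cdk-lib/aws-ec2';

const app = new cdk.App();
const stack = new cdk.Stack(app, 'ArmDemoStack');

// A minimal VPC just so the example is self-contained.
const vpc = new ec2.Vpc(stack, 'Vpc', { maxAzs: 1 });

new ec2.Instance(stack, 'AppServer', {
  vpc,
  // The x86-to-Graviton switch is essentially this one token (M6I -> M6G)...
  instanceType: ec2.InstanceType.of(ec2.InstanceClass.M6G, ec2.InstanceSize.LARGE),
  // ...plus requesting an arm64 image instead of x86_64.
  machineImage: ec2.MachineImage.latestAmazonLinux2023({
    cpuType: ec2.AmazonLinuxCpuType.ARM_64,
  }),
});

app.synth();
```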

But there are workloads that just do not run all that well on ARM, and you would end up paying more for the hardware to get to the performance levels you had with x86.

And that's on top of all those little pain points mentioned above that you're "left to deal with", which aren't cheap either (but they don't show up on the AWS bill, so management is happy to report cost savings).

[–] TCB13@lemmy.world 1 points 1 year ago (1 children)

there’s usually no good reason to pay extra for x86 hardware especially since most of the intricacies are handled by AWS. (...) all those little pain points mentioned above that you’re “left to deal with” which isn’t cheap either. (but that doesn’t show up on the AWS bill, so management is happy to report cost savings)

Exactly my point above when people start shouting about upgradability, compatibility, and whatnot.

[–] mea_rah@lemmy.world 2 points 1 year ago

Yeah, I was saying "no reason" in the context of SaaS. Once the management falls on the end user, it's a different beast altogether.

I think we're trying to say the same in a different way actually. 😅