this post was submitted on 26 Jun 2023

Technology

37719 readers
114 users here now

A nice place to discuss rumors, happenings, innovations, and challenges in the technology sphere. We also welcome discussions on the intersections of technology and society. If it’s technological news or discussion of technology, it probably belongs here.

Remember the overriding ethos on Beehaw: Be(e) Nice. Each user you encounter here is a person, and should be treated with kindness (even if they’re wrong, or use a Linux distro you don’t like). Personal attacks will not be tolerated.

Subcommunities on Beehaw:


This community's icon was made by Aaron Schneider, under the CC-BY-NC-SA 4.0 license.

founded 2 years ago
MODERATORS
 

Tech CEOs want us to believe that generative AI will benefit humanity. They are kidding themselves

[–] ABoxOfNeurons@lemmy.one 8 points 1 year ago (3 children)

I don't know exactly where to start here, because anyone who claims to know the shape of the next decade is kidding themself.

Broadly:

AI will democratize creation. If the technology keeps advancing at the pace it has for the last few years, we will soon start to see movies and TV with Hollywood-style production values made by individual people and small teams. The same will go for video games. It's certainly disruptive, but I seriously doubt we will want to go back once it happens. To use the article's examples, most people prefer a world with Street View and Uber to one without them.

The same goes for engineering.

[–] exohuman@kbin.social 4 points 1 year ago (3 children)

That’s putting millions of people out of a job with no real replacement. The ones who aren’t unemployed will be commanding significantly smaller salaries.

[–] PenguinTD@lemmy.ca 4 points 1 year ago (1 children)

It's actually not as easy as you think; it "looks" easy because all you've seen is the result of survivorship bias. Like Instagram users: they don't post their failed shots. Seriously, go download a Stable Diffusion model, type in your prompts, and see how well you can direct the AI toward what you actually want. It's real work, and I'd bet a good photographer with a good model can do whatever you ask, and faster, with a director (even with a green screen, etc.).

I've dabbled with Stable Diffusion a bit to see what it's like. On my machine (16 GB VRAM), a 30-image batch yields maybe 2~3 results that are "okay," and those still need further Photoshop work. And we're talking about resolution so low that most games can't even use it as a texture (slightly bigger than 512x512, so usually mip 3 for a modern game engine). And I was already using the most popular photoreal model people have mixed together (now consider how much time people spent training that model to get it to that point).
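For anyone curious what that batch workflow looks like in practice, here is a minimal sketch using the `diffusers` library. The model id is illustrative (any photoreal checkpoint or merge would do), and it assumes a CUDA GPU with enough VRAM; imports sit inside the function since nothing runs without the hardware.

```python
# Minimal sketch of the batch-generate-then-cull workflow described above.
# Assumes the `diffusers` and `torch` libraries and a CUDA GPU; the model id
# is illustrative, not a recommendation.
def generate_batch(prompt: str, n: int = 30, outdir: str = "out") -> None:
    import torch
    from diffusers import StableDiffusionPipeline
    from pathlib import Path

    pipe = StableDiffusionPipeline.from_pretrained(
        "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
    ).to("cuda")

    Path(outdir).mkdir(exist_ok=True)
    for i in range(n):
        # Each call is one roll of the dice; most results get thrown away.
        image = pipe(prompt, height=512, width=512).images[0]
        image.save(f"{outdir}/img_{i:02d}.png")
    # From here it's manual work: cull to the 2~3 usable shots, then retouch.
```

The culling step is the part that doesn't show up in anyone's Instagram feed.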

That's just graphic art/photo generative AI: it looks dangerous, but it's NOT there yet, very far from it. Okay, so what about auto-generated code from LLMs? Well, it's similar: the AI doesn't know about the mistakes it makes, especially with specific domain knowledge. If we had an AI trained on domain-specific journals and papers that also actually understood how the math works, it would be a nice tool, because as with all generative AI, you have to check the results and fix them.
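That "check the results" step can be as simple as refusing to trust generated code until it passes checks you wrote yourself. A toy sketch (the body of `generated_gcd` stands in for whatever the LLM returned; it is not real model output):

```python
# Toy example of reviewing LLM output: the function body below stands in
# for code that came back from a model, and the asserts are human-written
# checks derived from domain knowledge (here, known gcd facts).
def generated_gcd(a: int, b: int) -> int:
    while b:
        a, b = b, a % b
    return a

# Checks the human supplies before trusting the generated code:
assert generated_gcd(12, 18) == 6
assert generated_gcd(7, 13) == 1
assert generated_gcd(0, 5) == 5
```

If the checks fail, you're back to fixing the output by hand, which is exactly the point: the review work doesn't go away.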

The transition won't be as drastic as you think; it's more or less like other manufacturing: when an industry chases lower labour costs, local people find alternatives. Look at how the creative/tech industry tried outsourcing to lower-cost countries: it's really inefficient and sometimes costs more, with slower turnaround times. Now, if you post a job asking an artist to "photoshop AI results to production quality," let's see how that goes. I'd bet 5 bucks the company gets blacklisted by artists, and ends up with the really desperate or low-skilled ones who give you subpar results.

[–] ABoxOfNeurons@lemmy.one 3 points 1 year ago

Somehow the same artist:

[–] ABoxOfNeurons@lemmy.one 3 points 1 year ago (1 children)

I seriously doubt this technology will pass by without a complete collapse of the labor market. What happens after is pretty much a complete unknown.

[–] hglman@lemmy.ml 3 points 1 year ago

I think it's fair to assert that society will shift dramatically, though the climate will have as much to do with that as AI.

[–] FaceDeer@kbin.social 1 points 1 year ago

Yup. We should start preparing ideas for how we're going to deal with that.

One thing we can't do is stop it, though. Legislation prohibiting AI is only going to slow the transition down a bit while companies move themselves to other jurisdictions that aren't so restrictive.

[–] hglman@lemmy.ml 2 points 1 year ago (2 children)

It will shift a lot of human effort from generation to review. For example, the core role of an engineer in many ways already is validation of a plan; that will become nearly the only role.

[–] rustyspoon@beehaw.org 1 points 1 year ago (1 children)

the core role of an engineer in many ways already is validation of a plan.

I disagree; this implies that AIs are doing a lot more than they actually are. Before you design the physical layout of a thing, you have to identify a problem, then identify guidelines and empirical metrics against which you can compare your design to determine efficacy. That's half the job for engineers.

There's one step of the design process that I see current AI completing autonomously (implementation), and I view it as nontrivial to get the technology working higher up on the "V".

[–] hglman@lemmy.ml 1 points 1 year ago

Agreed. It's more impactful on software than physical engineering (until robots can build more arbitrary objects), but that's my point: implementation is only a small part of the job.

[–] ABoxOfNeurons@lemmy.one 0 points 1 year ago (1 children)

That assumes the classes of problems that AIs can solve remain stagnant. I don't think that's a good assumption, especially given that GPT-4 can already self-review and refine its output.
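A self-review loop like that can be sketched generically. `ask_model` here is an injected stand-in for a real LLM API call (e.g. a chat-completion request); nothing below is GPT-4's actual interface.

```python
# Hedged sketch of a generate-critique-revise loop, the pattern behind
# "self-review and refine". `ask_model` is any callable mapping a prompt
# string to a response string (a stand-in for a real LLM API client).
def refine(ask_model, task: str, rounds: int = 2) -> str:
    """Draft an answer, then critique and revise it `rounds` times."""
    answer = ask_model(f"Task: {task}\nAnswer:")
    for _ in range(rounds):
        critique = ask_model(f"Critique this answer to '{task}':\n{answer}")
        answer = ask_model(
            f"Task: {task}\n"
            f"Previous answer:\n{answer}\n"
            f"Critique:\n{critique}\n"
            "Revised answer:"
        )
    return answer
```

Whether the extra rounds actually converge on better answers is an empirical question, but the loop itself is trivial to wire up.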

[–] hglman@lemmy.ml 1 points 1 year ago (1 children)

It will take a very long time for people to believe and trust AI; that's just the nature of trust. It may well surpass humans in all ways soon, but trust will take much more time. What would be required for an AI-designed bridge to be accepted without review by a human engineer?

[–] ABoxOfNeurons@lemmy.one 1 points 1 year ago

We'll probably see sooner or later.

[–] HeartyBeast@kbin.social 2 points 1 year ago (2 children)

The same goes for engineering.

I can't wait to drive over a bridge where the construction parameters and load limits were creatively autocompleted by a generative AI.

[–] rustyspoon@beehaw.org 3 points 1 year ago

There's a guy at the maker-space I work out of who's been using ChatGPT to do engineering work for him. There was some issue with residue being left on the pavement in the parking lot, and he came forward saying it had to do with "ChatGPT giving him a bad math number," whatever the hell that means. This also isn't the first time he's said something like this, and it's always hilarious.

[–] ABoxOfNeurons@lemmy.one 2 points 1 year ago (1 children)

Generative design is already a mature technology; NASA already uses it for spacecraft parts. It'll probably be used for bridges once large-format 3D printers can manage the complexity it introduces.
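As a toy illustration of the idea (real generative-design tools use topology optimization, not random search, and every number below is made up): propose many candidate designs, reject any that violate the constraints, and keep the best survivor.

```python
import random

# Toy "generative design" for a tension member: propose random cross-sections,
# discard any that fail the load constraint, keep the lightest survivor.
# All figures are illustrative, not real engineering values.
LOAD_N = 50_000    # applied load, newtons
YIELD_PA = 250e6   # yield strength, roughly mild steel
SAFETY = 2.0       # safety factor

def is_feasible(area_m2: float) -> bool:
    """Feasible if stress stays under yield strength / safety factor."""
    return LOAD_N / area_m2 <= YIELD_PA / SAFETY

def generate_design(rng: random.Random, candidates: int = 1000) -> float:
    best = None
    for _ in range(candidates):
        area = rng.uniform(1e-5, 1e-2)  # propose a random cross-section
        if is_feasible(area) and (best is None or area < best):
            best = area  # lighter design that still carries the load
    return best

best = generate_design(random.Random(0))
```

The interesting part is exactly what the thread is arguing about: a human still chose the load, the material, and the safety factor; the generator only searched within them.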

[–] rustyspoon@beehaw.org 1 points 1 year ago

It's still just a tool for engineers, though. Half of the job is determining what the design requirements are; another quarter is figuring out what general scheme (e.g., water vs. air cooling) best meets those requirements. Things like this are great, but all they really do is connect point A to point B, freeing up man-hours for more high-level work.