this post was submitted on 27 Nov 2024
1117 points (96.2% liked)

memes

[–] PeriodicallyPedantic@lemmy.ca 124 points 4 weeks ago (4 children)

It's the mining of diamonds that kills all the children. After the diamond is mined, I can use it with almost no child deaths. Diamonds are fine.

[–] ayyy@sh.itjust.works 26 points 4 weeks ago (1 children)

No, using an already-trained model doesn’t “use up” the model in exactly the same way that pirating a movie doesn’t steal anything from Hollywood.

[–] PeriodicallyPedantic@lemmy.ca -2 points 4 weeks ago (3 children)

Using a diamond doesn't "use up" that diamond.

And yet, it's still unethical to buy already mined blood diamonds from people who continue to mine more blood diamonds. Funny thing about that, huh

[–] bob_lemon@feddit.org 15 points 4 weeks ago (1 children)

In this analogy, using the diamond does use it up, in the sense that no one else can use that diamond concurrently. If someone else wants a diamond, more children must die.

This is different from the trained AI model, which can concurrently be used by everyone at the same time, at very little extra cost.

[–] PeriodicallyPedantic@lemmy.ca 2 points 3 weeks ago (1 children)

Even if the diamond mine owners stop mining, it's unethical to buy their stockpile of blood diamonds.

Also, there is a cost besides electricity: the theft of artists' work is inherent to the use of the model, not just to the training. The artist is not compensated whenever an AI generates art in their style, and they may in fact lose their job or see their compensation reduced due to the artificial supply.

Finally, this is an analogy; it's not perfect. Picking apart incidental parts of the analogy doesn't prove anything. Use an analogy to explain a problem, but don't pick apart the analogy as though you're picking apart the problem.

[–] KillingTimeItself@lemmy.dbzer0.com 1 points 3 weeks ago (1 children)

and they may in fact lose their job or have their compensation reduced due to artificial supply.

highly doubt. Any artists that do lose their job are probably mostly ok with it anyway, since it's most likely going to be graphical drivel anyway. In fields like media there's a different argument to be made, but even then it's iffy sometimes. Also i don't think this would be considered artificial supply; it would be artificially induced demand instead, no? Or perhaps an inelastic demand-side expectation.

Although, it would be nice to have some actual concrete data on artists and job prospects in relation to AI. Unfortunately it's probably too early to tell right now, since we're just out of the Luddite reactionary phase, who knows.

[–] PeriodicallyPedantic@lemmy.ca 1 points 3 weeks ago (1 children)

Any artists that do lose their job are probably mostly ok with it anyway, since it's most likely going to be graphical drivel anyway.

Replace "artist" and "graphical" with any other profession and medium, and you've just described most jobs. I don't think most people are ok with losing their jobs even if those jobs aren't especially rewarding in themselves; they're getting paid for their area of training. They're not just gonna be able to find a new job, because in this hypothetical the demand for it (a living human doing the work) is gone.

I consider this an increase in supply because it's an increase in the potential supply. Productivity increases (which is what this is) mean you can make more, which drives down the price, which means that artists get paid less (or just get replaced).

Remember: if you 10x the productivity of an employee, that typically doesn't mean you produce 10x the product, it typically means you need 1/10th the employees. That payroll saving goes right into the pockets of execs.
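The arithmetic behind that claim can be sketched in a few lines; a toy model with made-up numbers, not data from anywhere:

```python
import math

def headcount_needed(current_headcount: int, productivity_multiplier: float,
                     output_growth: float = 1.0) -> int:
    """Workers needed to produce (output_growth x) today's output
    after each worker becomes (productivity_multiplier x) as productive."""
    return math.ceil(current_headcount * output_growth / productivity_multiplier)

# With flat output, a 10x productivity gain leaves 1/10th of the jobs:
print(headcount_needed(100, 10))        # 10
# Output would have to grow 10x just to keep everyone employed:
print(headcount_needed(100, 10, 10.0))  # 100
```

The point of the sketch: unless demand for the output grows as fast as productivity does, the headcount term shrinks, and the difference shows up as payroll savings.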

Also wrt luddites, they weren't wrong. It did absolutely demolish their industry and devastate the workers. It's just that the textile industry was only a small part of the economy, and there were other industries that could absorb the displaced workers after they got retrained.
LLMs threaten almost every industry, so there is a greater worker impact and fewer places for displaced workers to go. Also, workers are now responsible for bearing the costs of their own retraining, unlike back in the day of the luddites.

[–] KillingTimeItself@lemmy.dbzer0.com 2 points 3 weeks ago (1 children)

Replace “artist” and “graphical”, and you just described most jobs.

yeah, it's a generalized statement, so that makes sense. It'd be weird if my statement only applied to the artist economy, not like, the rest of the economy.

I don’t think most people are ok losing their jobs even if those jobs aren’t especially inherently rewarding; they’re getting paid for their area of training.

i think this is what most people would say, and i would generally agree. However, there is always going to be some level of job-market upset, which is a good thing for society and for people, even if they don't like it. It's a liquidity thing at the end of the day: if you have no liquidity, doing anything other than what you first started doing is going to be really hard; if you have more liquidity, it becomes easier. Although there is a point of diminishing returns where it turns into a revolving door of short-term labor.

They’re not just gonna be able to find a new job because in this hypothetical, the demand for it (a living human doing the work) is gone.

yeah but that's the thing, i'm not convinced that the entire field is just gone. Maybe a small portion of it, like 10-20%, is less active overall right now. Some art communities are pretty insulated from the broader market as a whole, the furry community being one of them. Most realistically, i think AI-generated art is going to be used in places where you aren't actively removing art from the art pool, but adding more art to an already busy art pool.

I mean in Hollywood for example, AI is most commonly used to do what, remove mustaches and shit? Replace existing CGI that would need to be done on top of faces? Particularly the really time-intensive and tedious parts that aren't cost-effective to approach and manage. Otherwise you would just pay a normal artist to do it, because they're going to be extremely competitive. It's not like you can delete an entire production team, smash a movie script through an AI (a script that was also written by AI), and get a full movie out of it.

I think a lot of it is a senseless overreaction, i'm sort of sympathetic to it, but at the end of the day, there hasn't been an artist famine to my knowledge, so it seems like things are going fine.

I consider this an increase in supply because it’s an increase in the potential supply.

Counterpoint: it's not actually an increase in supply potential alone, it's both supply and demand. You can't just look at this as if it's exclusively increasing productivity (although it is in some part; AI simply can't do certain things that humans can). And you can't act like an increase in productivity won't drive an increase in demand either. Because it will. That's the entire operating principle of the global economy: a steady increase in productivity leads to a steady increase in supply, which leads to a steady increase in demand, which leads to overall growth.

You might be paid less as an artist, but unless there's a significant downturn that we would've already heard about, things like covid are going to affect you more heavily.

if you 10x the productivity of an employee, that typically doesn’t mean you produce 10x the product, it typically means you need 1/10th the employees. That payroll saving goes right into the pockets of execs.

but you also need to consider that in the global market, you aren't just supplying some magical constant of demand; if you can increase demand, you can increase supply, and lower the price of the product as well, making it more competitive, more accessible, and more productive. Now you have an incentive to increase production of that product to meet demand.

Of course if people don't buy things, it doesn't matter anymore, but i think both of us here can agree that people don't really seem content to stop spending money on things any time soon.

Also wrt luddites, they weren’t wrong. It did absolutely demolish their industry and devastate the workers.

isn't this primarily because they never industrialized/mechanized and then ended up being outcompeted in the market by people who did? Even if this is a concern, you can always shift focus and move from making generalized textiles to more complex, difficult-to-make, and more expensive textiles. Humans always have an advantage over machines and robots: we're smarter. A lot smarter. You can look at other industries like watchmaking, for example; expert watch craftsmen could have lost their jobs to mechanized industrial production, but they never did. Handmade watches are still a big deal.

LLMs threaten almost every industry

i'm not sure how much they threaten every industry collectively, though i'm sure they pose some level of obstacle. At the end of the day i've only ever seen AI being used for janitorial tasks in companies, like boiler-plating reports and categorizing documentation. And also customer support, which is not particularly something i'm a fan of, but there is definitely utility in it. Almost never fully replacing entire positions; that would be too costly.

[–] PeriodicallyPedantic@lemmy.ca 1 points 3 weeks ago (1 children)

I largely agree with you, and I definitely appreciate that you're being very civil in our discussion.

There are a few points I'd like to clarify and maybe counter:

I think you're putting too much burden of responsibility on the workers for decisions that the employer makes. To just say it's a liquidity problem is ignoring how many people have a liquidity problem and the sources of that problem, and the responsibility that employers should have to their community. I agree that some degree of turnover is ok, but I don't think that's what we're talking about.

I agree that the entire field won't just vanish, but I believe that the increase in productivity means they'll need way fewer workers.
And this isn't just affecting fine arts and support:
It's also affecting things like technical writers, marketing, copywriting, programming, paralegals, even diagnostic medicine. Pretty much any office job is in the line of fire.
And when the spread is that wide, even 10-15% of the workforce is devastating to industries and even the economy as a whole.
And the spread is only gonna get wider as they introduce "agents" that are capable of making "decisions" autonomously, so you don't need a human to tell the AI what to do and then do something with the output.
Yes, the Luddites never mechanized; that's the thing they were fighting against. They couldn't all move to complex textiles because the market wasn't there for it; if they lowered their prices enough to generate the demand, they couldn't recoup their time and material costs.

Wrt supply/demand, an increase in supply drives an increase in demand through a lowering of prices. This is the foundation of microeconomics. It doesn't really translate to the messiness of IRL, but it's still close enough that it shows that bad things will happen.

In the end it comes down to what the LLM producers are promising. They're promising to be able to do all this. Idk if they can actually fulfill their promises, but I think it's crazy to wait and see before moving to prevent it. They're saying they're gonna do it; let's make them not.

[–] KillingTimeItself@lemmy.dbzer0.com 1 points 3 weeks ago (1 children)

I largely agree with you, and I definitely appreciate that you’re being very civil in our discussion.

that's the primary reason i'm here at all, i enjoy discussions about things like this, it's interesting, and sometimes you even learn things.

I think you’re putting too much burden of responsibility on the workers for decisions that the employer makes.

to be clear, i wasn't defining this as a worker issue, or an employer issue, i was stating it as a fundamental limitation of our economic model. It's sort of a fundamental limitation of how making money works at the moment.

I agree that some degree of turnover is ok, but I don’t think that’s what we’re talking about.

that probably isn't, but a significant problem i find with people is that they often don't provide enough specificity or detail around their statements or claims, to the point where it's either irrelevant or simply too broad. Broad enough in some cases that you could write a PhD on it, and then work for 20 years in that field, before answering the question.

People will hand-wave away the entirety of capitalism in favor of something like socialism, which has no practical implementation as far as i'm aware, outside of the few tries that haven't quite worked out optimally so far. I just can't justify using logic in that way; i try to at least lock down what i'm talking about to a point where it's broadly understandable. Which is challenging, but that's partly why i'm here lol.

but I believe that the increase in productivity means that they’ll need way fewer workers.

i think this is probably true, but given the accuracy and competence of most existing AI, i highly expect this to be mostly restricted to "additional" productivity; it's essentially creating a new market segment where one didn't previously exist. An AI alone can't exactly replace a human. It can replace certain aspects and parts of a human's work, but never a full human. So it's really hard to say how badly it will hit the industries in question.

this isn’t just affecting fine arts and support:

yeah for sure, i'm just not sure how much of this is going to be A: significant, or B: impactful.

And the spread is only gonna get wider as they introduce “agents” who are capable of making “decisions” autonomously, so you don’t need a human to tell the AI what to do, and then do something with the output.

It's also worth noting that this is a significantly more risky move to make, especially if you put it in charge of handling anything other than doing "menial organization" work. For example, money. I highly doubt you would find anybody willing to let an AI buy things for them.

A lot of this labor is already automated through things like scripting and strict data entry. This is probably only going to make it less strict in that sense.

Yes the Luddites never mechanized, that’s the thing they were fighting against. They couldn’t all move to complex textiles, because the market wasn’t there for it, if they lowered their prices enough to generate the demand then they couldn’t recoup their time and material costs.

To be clear, this is kind of the example of bringing a knife to a gun fight; it's your fault if you lose at that point. And while it's definitely true that it cost the market jobs, the increase in productivity was probably more significant than the loss from textiles. Not to mention the decrease in product prices, raising everybody's standard of living.

You could theoretically never mechanize, but you're fighting a losing battle by never innovating. Just look at intel: they got blown out of the water by AMD after sitting on technology for a decade, and they're losing market share now. They had a huge stock crash over their recent CPU lineup being overcooked and burning itself out. They're not having a particularly great time right now, but that's just what happens.

And as a market, we're all doing better now. The hardware capability of CPUs has improved MASSIVELY since the start of ryzen, and laptops have seen such a significant boost in productivity that apple had to move to their own silicon to keep any sort of lead on the competition. Really good CPUs are a lot cheaper now, and you can use ECC memory with most ryzen chips, while you have to pay intel for that privilege. The single-thread speed of CPUs has increased significantly as well, making basically every task that much faster, and the power efficiency of chips has massively improved too.

Generally, in a market like ours, losing existing jobs, and increasing productivity is going to be a beneficial tradeoff, as it opens more space for other types of productivity down the road. It's sort of the endless optimization of a specific item, but the global economy.

In the end it comes down to what the LLM producers are promising. They’re promising to be able to do all this.

and so far, they've lied. Google cheated on the gemini presentation. Grok can't even produce real facts. ChatGPT has progressively worsened since launch due to bad data. Image and video generation have improved, but we're at a point where they can't improve much more than they already have, at least not that significantly, so we're quickly approaching a wall. Unless we pivot, and they will, but it will have to be marketable at the end of the day, and that's the hard part.

I think it's also worth noting that we should, in some capacity, prepare for the inevitable; never be comfortable, always be ready. You can't lie down at the sight of a sword and not expect to be killed anyway. Fighting against it might work, but that's not historically supported in any significant capacity to my knowledge. You can do nothing, which is even worse. Or you can do your best to prepare as well as you can. There is always something you can offer over other people. Especially over AI.

[–] PeriodicallyPedantic@lemmy.ca 1 points 3 weeks ago (1 children)

It's worth mentioning two things here.

I'm not inherently against LLMs and AI.
I'm against putting the power of LLMs in the hands of the employers instead of the workers. People self-hosting their own free LLMs to make their job and home life easier? I'm all about that, and I can even forgive the theft and energy usage to an extent.

And also that I'm a developer in this space. I don't train models or sell them directly, but I make products that use LLMs to increase productivity. I know I'm part of the problem, but I was transferred onto the project and my job is simply too good to quit over it, so I'm a hypocrite to some extent. What people in this space are trying to do is absolutely replace workers so that businesses can save on payroll and increase margins. They don't say it, but it's telling how they dance around the topic.

[–] KillingTimeItself@lemmy.dbzer0.com
good news for you: there is a lot of open-source AI shit out there that you can start fucking with today, or tomorrow even. The technicality is that doing anything particularly exciting requires about a billion dollars in hardware to actually train models and create usable data sets lol.

What people in this space are trying to do is absolutely replace workers so that businesses can save on payroll and increase margins. They don’t say it, but it’s telling how they dance around the topic.

i'm sure they are; business owners would have 0 employees if they could, but i'm just not convinced that AI is at a point now, or will ever be at a point, where it can do that effectively. Maybe in an amazon warehouse, which is probably for the better anyway.

[–] bluewing@lemm.ee 3 points 4 weeks ago (1 children)

A very large amount of those dug up diamonds end up as "industrial diamonds." Because they are far from gemstone quality. And they definitely get used up. I have used up my share of them as cutting tools when I was a toolmaker.

[–] PeriodicallyPedantic@lemmy.ca 1 points 3 weeks ago (1 children)

Ok cool, but this is an analogy. Why are you defending the use of AI by megacorps by objecting, on a technicality, to irrelevant parts of an analogy?

[–] bluewing@lemm.ee 3 points 3 weeks ago (1 children)

It's a bad analogy built on plain wrong facts. Do better.

[–] PeriodicallyPedantic@lemmy.ca 1 points 3 weeks ago* (last edited 3 weeks ago)

You're insufferable.

I know the analogy isn't a perfect fit for LLMs in general. Analogies never come close to describing the entire thing they're analogs of, they don't need to.

It doesn't matter because this is a suitable analogy for the argument. This is how analogies work.

The argument is that because the harm has already been done, it's fine to use LLMs.
That same argument can be made for blood diamonds, and it's untrue for exactly the same reason:
Because buying the use of LLMs (which is mostly how they're used, even if you pay in data instead of money) is funding the continued harmful actions of the producer.

I can't believe I have to explain how analogies are used to a grown ass adult.

[–] KillingTimeItself@lemmy.dbzer0.com 1 points 3 weeks ago* (last edited 3 weeks ago) (1 children)

this is actually a really debatable argument. If you're buying it first-hand, from somebody trying to make money, yes, it could arguably be unethical. But if you're buying it second-hand, i.e. from someone who just doesn't want it anymore, you could make the argument that it's an ethically net-positive transaction: they no longer want the diamond and wish to have money instead, and you wish to have the diamond and less money. Everybody wins in second-hand deals, weirdly enough.

[–] PeriodicallyPedantic@lemmy.ca 1 points 3 weeks ago (1 children)

Setting aside "there is no ethical consumption under capitalism", which is a debate for another time:

I don't totally agree with your assessment of second-hand sales: it's not ethically positive, at best it's ethically neutral, because demand trickles up the market. I could go into this more, but ultimately it's irrelevant:

The 2nd hand LLM market doesn't work like that because LLMs are sold as a service. The LLM producers take a cut from all LLM resellers.

You could make a case that self-hosting a free open-source LLM via something like Ollama is ok, but that's not how most LLMs are distributed.

[–] KillingTimeItself@lemmy.dbzer0.com 1 points 3 weeks ago (1 children)

Setting aside "there is no ethical consumption under capitalism", which is a debate for another time:

the simple answer is yes and no, there is both ethical, and unethical consumption under capitalism. As there is in any society throughout human history.

I don't totally agree with your assessment of second-hand sales: it's not ethically positive, at best it's ethically neutral, because demand trickles up the market. I could go into this more, but ultimately it's irrelevant:

if we start from a basis of ethical neutrality, ignoring the parties, as a baseline: i included the parties because i would consider two parties who wish to do business successfully doing so to be an ethically positive thing, as both people have had their needs/wants more closely met by the deal, therefore making it ethically positive.

If you're starting from an ethically negative point, you need some sort of inherent negative value to exist within that relationship. Perhaps the guy selling was hitler, for example; that would make it an ethically negative scenario. Or maybe after he sold it, he would've donated the money or given it to someone who could do something more useful with it, making it even more ethically positive.

there's an argument to be made that something with perceived value and static supply has an upwards trajectory in the market going forward. For something like blood diamonds this is unlikely, but assuming it did happen, this should actually be an ethically positive thing, assuming for example that the victims' families ended up getting their hands on the diamonds. If they weren't given back to the families, then it doesn't really matter, since you're back to the baseline anyway. Prospective investments aren't a real tangible asset, so we can't foresee that.

although to be clear, i wasn't talking about LLMs; this is a much harder argument to make with LLMs. Although the argument here is that the training has already been done, the power has already been consumed, and you can't unconsume that power, so whatever potential future consumption happens is built on the existing consumption already. Unless of course you did more training in the future, but that's a different story. Just boycott AI or something at that point lol.

[–] PeriodicallyPedantic@lemmy.ca 1 points 3 weeks ago (1 children)

In both your examples, you seem to assume that the harm is already done and that there is no continued harm.

But in both cases the harm isn't finished; the blood diamond mine owners use the continued sale of blood diamonds to fund their continued mining operations, and LLM providers use the sale of LLM use to fund the continued training of new LLM models.

Regardless of whether you think that buying second-hand blood diamonds increases overall demand in the market (which blood diamond sellers benefit from), it is clearly the case that selling (and reselling) LLM services benefits the LLM providers, and we can trivially see that they're training new models and not making amends.

[–] KillingTimeItself@lemmy.dbzer0.com
But in both cases the harm isn’t finished;

says who? In my example i assumed that the blood-diamond mines had already ceased production, because obviously if they haven't, there's no point in talking about market forces at all; the more pressing concern would be the literal blood diamonds. I was talking about the second-hand market for what were previously blood diamonds, and technically still are, just without the active cost associated.

And again, what funds? There are no funds; this is a purely second-hand sale. The seller is not giving a percentage back to the diamond-mining company, which no longer exists here.

and LLM providers use the sale of LLM use to fund the continued training of new LLM models.

i would agree with this, but it seems like we've hit a new technical limitation in the last few years. The pace has drastically slowed; the technical nature of the AIs has improved less, while the broad suitability has improved more. And it's also worth noting that this is an itemized cost, not a whole static cost. Just saying "but but, ai consumes lots of energy" is meaningless unless you can demonstrate that it's significant and actually matters. I think there is definitely an argument to be made here, but unfortunately i have yet to see anyone actually argue it.

and we can trivially see that they’re training new models and not making amends.

what do you mean when you say amends? Carbon capture? Paying off artists so they can "steal" their jobs? This is meaningless to me without an actual practical example.

[–] huginn@feddit.it 23 points 4 weeks ago (2 children)

I mean, yeah: if we went and killed every person who benefits from conflict diamonds and closed all blood diamond mines, why wouldn't you be cool with using the resources? Their evil origin has little to do with their practical utility, and if the original sin is expiated, there's no reason not to.

Like yeah, conflict diamonds have basically no purpose because we can make diamonds cheaper and better in labs, but in a situation where there are more practical uses (cobalt, LLMs), once we cleanse the land of the sinners, why wouldn't we use their ill-gotten gains for good?

[–] PeriodicallyPedantic@lemmy.ca 4 points 4 weeks ago* (last edited 4 weeks ago) (1 children)

But we're not "killing" every person who benefits, literally or figuratively. We're continuing to buy their diamonds (pay them in money and data) while they continue to mine (train new models, use copyrighted material).

[–] huginn@feddit.it 1 points 4 weeks ago (2 children)

Agreed.

But the entire point I'm making is there's nothing wrong with the diamonds, the problem is with the method and the people profiting from it.

You were saying the diamonds were not fine by dint of origin. I'm saying let's right the wrong and then use the diamonds.

[–] PeriodicallyPedantic@lemmy.ca 3 points 3 weeks ago

It's not a perfect analogy; models ape the work of artists and take their jobs. It's like if the diamond was bloody, and as long as it existed, the miner's family not only didn't get compensated for the loss but were also prevented from getting jobs themselves.

We're not righting the wrong, we're making the wrongs even worse. At some point you have to just burn the whole thing down.

[–] Jtotheb@lemmy.world 2 points 4 weeks ago* (last edited 4 weeks ago)

Okay, so again, no new machine learning ever, unless you can prove it’s done without environmental impact or affecting peoples’ right to a dignified existence. That’s the wrong righted. That’s what you’re advocating. Am I misunderstanding?

[–] Deestan@lemmy.world 8 points 4 weeks ago

I see your point and agree with it, but I believe you read more defense in my comment than I tried to put in.

I'm not in favor of this waste.

[–] KillingTimeItself@lemmy.dbzer0.com 1 points 3 weeks ago (1 children)

i mean yeah, statistically, an already-mined diamond is a child already dead. You would just stop new diamond mining, or move away from child-consuming diamond mining; you aren't going to completely demolish every child-mined diamond in existence, though. There's no point; the harm's already done. Might as well leave them in the market.

[–] PeriodicallyPedantic@lemmy.ca 1 points 3 weeks ago (1 children)

But buying them funds the diamond miners to mine more. That's the point.

By participating in the market, you're perpetuating blood diamonds.

[–] KillingTimeItself@lemmy.dbzer0.com 1 points 3 weeks ago (1 children)

only if they still actively existed. And when it comes to the second-hand market, someone who no longer wants a product selling it to someone who does want it effectively means that two people get the same value from one product, essentially creating value from thin air.

The point i'm making is that as long as you have diamond mining that requires killing people to produce diamonds, those diamonds are an ethically negative commodity. However, if you are no longer killing people, those diamonds are already out there; those people have already been killed, and there is nothing you can do to undead those people. So you might as well leave them in market circulation at that point.

You could argue that it would incentivize more illegal mining, but i would argue that bad regulations and enforcement incentivize illegal mining instead.

[–] PeriodicallyPedantic@lemmy.ca 1 points 3 weeks ago (1 children)

I think that the market (profit) incentivizes the mining, and that lax regulations simply fail to curb that incentive. We can see from the war on drugs that simply regulating or criminalizing something won't stop it if there is a market; it just drives up prices.

But in any case, this aspect is a lot less ambiguous for LLMs because selling the use of an LLM is legal, and the sellers use the money to train new LLMs.

So let's create regulations around LLMs and enforce those regulations, before it starts to really affect the job market and environment.

[–] KillingTimeItself@lemmy.dbzer0.com 1 points 3 weeks ago (1 children)

i definitely agree that there needs to be some rewriting of law, specifically copyright, which will inevitably be relevant to AI. But i'm still not convinced it's a massive market black hole or anything.

Unless you want to be cucked like farmers in the US, who take in massive subsidies and cry and moan every time something moderately inconveniences them.

[–] PeriodicallyPedantic@lemmy.ca 1 points 3 weeks ago (1 children)

What I want is regulation that protects workers more than employers

[–] KillingTimeItself@lemmy.dbzer0.com 1 points 3 weeks ago (1 children)

shouldn't this be the default position of the economy though? Especially in a democratic society, where voters are actually educated.

The workforce is your primary labor pool, so assuming enough governmentally enforced labor protections, and significant drawbacks to further exploitation, it will inevitably trend toward less of it.

[–] PeriodicallyPedantic@lemmy.ca 1 points 3 weeks ago (1 children)

It should be, yet it isn't.
There was a brief window in North America where that was the case (or at least where progress was being made), but that was decades ago.

[–] KillingTimeItself@lemmy.dbzer0.com 1 points 1 week ago (1 children)

any data on this or anything? Or are we just going to say things.

[–] PeriodicallyPedantic@lemmy.ca 1 points 1 week ago* (last edited 1 week ago)

To be clear, when I said "it isn't" I meant it's not the default.

I don't really care enough to go dig up hard numbers.

But the time I'm talking about is after WW2, when the New Deal was still going strong and the top marginal tax rates for people and companies inhibited accumulation of wealth, incentivizing reinvesting profits into the company and workers instead of into exec bonuses and stock buybacks.

The era had lots of other issues, notably bigotry and weak consumer protections, but worker protections were moving in the right direction.

Health and safety have kept improving, but employment security has gotten worse because the incentives we had for companies to treat employees well are gone, putting employees at a disadvantage when bargaining with their employer.