That seems like a Q3 2026 issue; let's put the conversation off till then.
/s
Q3 2026 will come around and the AI will report that revenues are down. The CEO will respond the only way they know, by ordering that costs be cut by laying off employees. The AI will report there is no one left to lay off but the CEO.
Fade to black and credits roll.
Capitalism is all about short-term profit. These sorts of long-term questions and concerns are not things shareholders and investors think or care about.
Further proof of this: Climate change.
Funny thing is that capitalism accidentally solves global warming the same way it created it - turns out renewables are cheaper than fossil fuels, and the greed machine ensures the transition to more cost-efficient energy sources.
The problem is that the previous accumulation of capital has centralized a lot of power in actors who have a financial incentive to stop renewables. If we could hit a big reset on everything then yes, I think renewables would win, but we're dealing with a lot of very rich, very powerful people who really want us to keep being dependent on them.
Pathogens don't really think about what will happen after the body they're abusing dies.
Back in the 1980's they told me it'd trickle down.
...eventually.
They were actually talking about how they were pissing on the living room floor while we were in the basement.
Don't think of people having money as an on-off switch. It's a gradual shift, and it already started before AI was a thing. AI is just another tool to increase the wealth gap, like inflation, poor education, the erosion of human rights, etc.
Capitalism doesn't look that far ahead.
I agree it's going to be a problem. It already happened when we exported manufacturing jobs to China. Most of what was left was retail, which didn't pay as much, but we struggled along (in part because of cheap products from China). I think that's why trinkets are cheap but the core of living (housing and now food) is relatively more expensive. So older people see all the trinkets (things that used to be expensive but are now cheap) and don't understand how life is more expensive.
Ever heard of the everlasting sustainable war? https://ghostintheshell.fandom.com/wiki/Sustainable_War
If robots generate all productivity and human labor is no longer needed, the economy would not be able to sustain itself. Instead, in trying to cope with the unneeded human labor and to ensure continued productivity, the only area where productivity would be ensured is war using human resources - destroying things in order for them to be rebuilt, thus generating a self-sustaining feedback loop. The rich will get richer and everyone else will only be employed as soldiers in a continuing war economy.
Even though this is a sci-fi concept, I believe it's not a stretch to say we are headed in this direction.
This is a common question in economics.
It's called technological unemployment, and it's a type of structural unemployment.
Economists generally believe that this is temporary. Workers will take new jobs that are now available or learn new skills to do so.
An example is how most of the population were farmers before the agricultural and industrial revolutions. Efficiency improvements to agriculture happened, and now only about 1% of the population works in agriculture. Yet most people are not unemployed.
There was also a time in England when a large part of the population were coal miners. Same story.
Each economic and technological improvement expands the economy, which creates new jobs.
There's been an argument by some, Ray Kurzweil if I remember correctly, but others as well, that we will eventually reach a point where humans are obsolete. There was a time when we used horses as the main mode of land transportation. Now, this is very marginal, and we use horses for a few other things, but really there's not that much use for them. Not as much as before. The same might happen to humans. Machines might become better than humans, for everything.
Another problem that might be happening is that the rate of technological change might be too fast for society to adapt, leaving us with an ever larger structural unemployment.
One of the solutions that has been suggested is providing a basic income to everyone, so that losing your job isn't as big a problem, and would leave you time to find another job or learn a new skill to do so.
A major problem is all the money from these increases in efficiency go to a handful of people, who then hoard it. A market economy cannot work with hoarding, the money needs to circulate.
The rich. Companies will stop targeting products at wide swathes of people, just like nobody caters to the homeless now.
This doesn't sound sustainable at all. A billionaire only needs so much gasoline, food, medicine, TVs...
Collapse of entire industries will happen way before we even get a chance to see industries reinvent themselves to cater to billionaires. Don't believe me? Just look at what happened to the economy during the pandemic.
In theory, UBI.
In practice, it will likely lead to periodic job market crashes due to overapplying to the remaining jobs, and possibly even revolts.
That's only if AI is really as good as its evangelists claim and the technology ceiling rises enough. IMHO, even LLM technologies are getting exhausted, and it's not just a problem of training data (which these AI evangelists have littered the internet with), so they will have a very hard time going forward.
They won't, they'll simply die and the market will slowly adjust to those with capital.
This is all happening because we shot a gorilla in 2016 btw.
Corporations, especially publicly traded ones, can't think past their quarterly reports. The ones that are private are competing with the public ones and think following trends by companies that are "too big to fail" will work out for them.
I'm an optimist, so I believe one day we'll have a utopian society like in Star Trek. I politely ask that you don't criticize me too harshly.
Hey, that's a reasonable thing to hope. The flip side, of course, is that I'm hoping I don't have to live through Star Trek's idea of how the 21st century goes. They definitely got all of the details wrong, but I'm afraid the vibes are matching a little too well.
In a better world machines would do the work and humans just would share the wealth and live life in peace.
AI owners will.
And if you then go around wondering "oh, but not every AI builds something those few people want", "that's way too few people to fill a market", or "and what about all the rest?"... Maybe you should read Keynes, because that would not be the first time this kind of buying-power change has happened, and yes, it always sucks a lot for everybody (even for the rich people).
As stated, the companies that push AI aren't concerned with the long-term consequences. But if you want to know how the individuals who run those companies personally feel, do a search for billionaire doomsday preppers.
TL;DR: They've got a vision for the future. We're not in it.
The vanishingly small amount of people that will be unfathomably rich in a privatized post-scarcity economy will give us just enough in UBI to make sure we can buy our Mountain Dew verification cans. And without the ability to withhold our labor as a class, we'll have no peaceful avenue to improve our conditions.
Why would we need anyone to buy things? Remember that money is an abstraction for resources. If you can do everything with AI, then you already have all the resources you need. Whether or not someone else needs what you produce is irrelevant when you already have access to everything you could want.
I see three possibilities if AI is able to eliminate a significant portion of jobs:
I'm guessing reality will have some combination of each of those.
If ONLY some smart fella had pushed a theory about collective ownership of the means of production or something
That, my friend, is the problem for whichever schmuck is in charge after me, a C Suite executive. By then I will be long gone on my private island, having pulled the rip cord on my golden parachute.
Everyone will be working multiple shitty service jobs that robots are not cost effective to automate. Our miserable wages will be just sufficient to keep the wheels on the cart from falling off.
Look at empires of the past.
Things were so bad in Dickens' London that living in sewers to live off whatever scraps you could find was an actual occupation.
Wealth creates its own reality.
That's the neat part. No one.
If the rich can hire a handful of the middle class to build and maintain their robots, then they can just cut the poor and working poor out of the economy entirely, and they will be willing to accept any conditions for food and shelter.
We can arrange the economy any way we choose. Taking all of the decision-making for themselves is part of the plan.
You're describing the end goal of monopoly.
Look up crisis theory: the rate of profit tends to fall in capitalist systems. Because each company is driven by competitive self-interest, it is incapable of acting for the good of the whole. You simply cannot devote resources to anything but trying to out-compete your rivals, and in doing so the rate of profit for everyone trends lower and lower until you have a crisis.
You're implying AI has the intelligence to remotely achieve this. It doesn't. It is all venture capitalist porn for overglorified keyword copy-paste. That's it.
That's the cool part, you won't. If everything crucial is automated, people can drive things forward for passion rather than for money. Of course, this would effectively collapse capitalism, which won't happen painlessly.
This question was asked, and the answer was "Kill the poor, make line go up."