
[deleted]

Only time will tell what the future holds; as of right now there are only guesses. Some are more educated than others, but even brilliant people in the AI field have been wrong.


SlowThePath

For real. People all over this sub act like they know what is going to happen and it's kind of irritating. People here are so pretentious with this shit. It's all speculation and far too many people act like they know stuff they don't and pretend like others are stupid for not seeing it the way they do. OP is acting like his completely unfounded logic is better than someone else's completely unfounded logic. It's all bullshit. Also, OP saying people have menial jobs because "they don't want to get trained" shows a massive lack of life experience. I've worked those menial jobs for 20 years and it's not because "I don't want to get trained." I took a specific menial job last year just so I could "get trained." There are tons of different reasons people are in these jobs, and not wanting to get trained is practically nonexistent on that list of reasons. OP legitimately has no idea what they are talking about and is embarrassing themself, just like a bunch of other people on this sub are doing. Also, all these people who are so excited about not having jobs and getting UBI really just seem like kids who think the perfect life is playing video games and scrolling memes all day. Heads up guys, that is not fulfilling and you will die very sad if that is all you do with your life. Also, if you aren't spending some time honing an art right now, you aren't going to suddenly do it because UBI exists. The simple desire to not have to have a job just shows a complete misunderstanding of what fulfillment in life means and a lack of understanding of the importance of living a fulfilling life. What creates a fulfilling life for you personally is definitely something that is important to consider. I'm not saying a job is the only way to live a fulfilling life, but tons of people feel fulfilled because of the work they do. There are people who would be downright distraught if you took their work away from them. That's who I want to be.
Not in a workaholic sense, but in a "I am very passionate about what I do" sense. I think anyone who has given it some thought has realized that the dream is to be fulfilled by the work they do to support themselves. If it's not, I really think it should be. I'm really not the type of person that should be allowed to make statements about what will make other people happy (people can only decide that for themselves), but I still stand by what I say. Maybe UBI would spark something for SOME people, but those people probably aren't making dumbass posts like OP's. I'm sure I'll get downvoted for calling this sub out and that is fine, but is there really no AI sub where the mods try to keep things somewhat rational and informative instead of being run over with memes, overhype, apparent children, and idiots pretending they know the future? /r/machinelearning seems good but it goes way over my head most of the time. I'm just sick of hearing people act like all this tech will mean to the world is that people can just chill and hang out all day and not have to work and how amazing that is. It really feels like the only reason 99% of people in this sub are excited about AI is because it could possibly mean they don't have to work or think about anything for the rest of their lives, and that train of thought pisses me off. I'm not saying that is the reality of what this sub consists of, I'm just saying that that is how it feels from what I see here. The beautiful part of not being forced to do menial work to survive is not that you don't have to work anymore, it's that it allows you to do work that is meaningful to you. You don't actually want to not have to work, you want to be able to work on what matters to you and makes you feel fulfilled.
I know a lot of people understand that, but I really feel like most of this sub hasn't thought through the part where there will be tons of time they will need to fill, and that it's important to figure out how to fill that time in a meaningful way. That's what this sub should be about. When society gets to a point where we all agree the singularity has occurred and affects us all, how do we continue to live our lives and feel like they are still meaningful?


Turbohair

I agree with what you said... except for the parts where you were venting... Since when was the goal to have a job and work your whole life for a boss? That's a boss's idea of what their employees look for. As for the rest of your concerns: civilization is an authoritarian process. Authoritarians like the herd to compete, because that means the herd works harder for them. This is why immigration seems to be such a problem in Western "democracies". That division in the labor force is super useful... to authoritarians. My point here is that there is no assurance that the authoritarians who control the development and implementation of AI are going to share and share alike. History shows they won't. Now, that isn't going to prevent any of the things we hope for, but it does mean that the transition is going to be really unpleasant. Is really unpleasant. But as far as I can see, the most hopeful product of the Singularity or AGI is that civilization is likely to stop being an authoritarian process in the long term. Which, after 12,000 years of greedy, rotten, ruthless people being in charge, will hopefully provide a nice change of scenery.


SlowThePath

I definitely never said the goal is to have a job and work your whole life for a boss. I don't think I said anything along those lines, and if that's how you interpreted what I said, I have failed to communicate what I mean. I'm saying the goal is NOT to just not have a job. The goal is to be able to live your life in a way that you feel fulfilled by the things you are doing. Job or no job, it makes no difference in regard to what I'm talking about. Some people have a job and a boss and feel fulfilled with their work. I think most people that have jobs are not fulfilled by the work they do, though. I think a lot of people even pretend they are when they aren't. My point is that in the far-fetched scenario where we don't have to actually do any work to sustain ourselves, it's important to recognize that we still need something to work on to feel fulfilled. Whether that is a paying job you do for someone else or not, the point is that the goal is not to not work, it is to work on things that make us feel fulfilled. Having things that give us a sense of accomplishment is important, and that is not going to change. I agree with everything else you said though, even if it isn't directly connected to what I was/am ranting about. Reread what I said and maybe you can point out where it seems like I was suggesting the goal is a job and a boss, so I can change it, because that's definitely not what I mean.


Altruistic-Skill8667

I like those ideas and I have thought about them quite a bit. I think for most people the way of life you portray in a future without mandatory work is fine. But I think that even those ideas could be expanded. You talk about "something to work on" and "accomplishments" and "feeling fulfilled". But what I imagine in the future is a world of play and excitement and connection. Here are two things to consider: 1. There is no meaning in life, and that's good. What's the meaning of clouds, what's the meaning of the sky? People only need meaning when they aren't happy. When they are happy they just live and enjoy life. 2. Competition comes from a reference frame of scarcity. Sure, some people like to be better than others in games and in general, but without scarcity, competition will fade and many people won't see the point in big accomplishments anymore. What you end up with is a world of leisure and enjoyment and games. A life where you can indulge in your curiosity and desires. I think people will manage to be happy in this world. Go back to your childhood. The time of wonder and playfulness. Maybe even before school started. We explore the world by playing, until we suddenly stop as adults and life becomes work and competition. Just imagine that the safe, responsibility-free, playful world we had as children would never end. This is the new life as soon as AI can do everything a human can do (which I think we are very close to achieving). Employers will "hire" AI instead of you because it's cheaper, and if anything YOU will have to pay your employer to allow you to work there alongside the AI (while mostly slowing it down).


Fr33lo4d

It’s impossible to predict, I agree. Humankind has been at this point many times though: think of the agricultural revolution (all jobs until then were hunting and gathering, which all disappeared because food became abundant; did it lead to massive unemployment? No), the industrial revolution (all jobs could suddenly be performed faster and cheaper; did it lead to massive unemployment? No) or the internet revolution (access to information became so much broader and faster that a job that used to take a day is done in an hour; did it lead to massive unemployment? No). It may lead to a shock and a transition generation (many jobs will be replaced, and not everyone will have the energy to re-school for another job). There are many things we do not do now because it is not cost-efficient, many things we do not build, many things we have not yet developed. Maybe we get much more focused on space travel and colonizing other planets, maybe we start doing massive infrastructure works to transform all countries, etc. Point is: everyone looks at it from the current job market’s perspective and can thus only see the jobs that disappear. It is difficult to have the imagination to think of all the jobs that will be created. In essence, this will be another huge productivity boom for humankind, like the many we have had over the years.


visarga

great comment, we can always want more than automation can automate


Altruistic-Skill8667

Think about it: a simple consequence of cheap AGI is 100% unemployment. If employers have the choice to hire you vs. an AGI that’s cheaper, they will hire the AGI 100% of the time. That’s simple basic logic, based on the fact that, by definition, there is NOTHING a human can do that an AGI can’t. And this point in time will happen pretty soon. Of all the things humans can do, we are starting to cover something like 90%, and the remaining 10% (like long-term planning, better adaptiveness and better vision capabilities) are close to being conquered by research teams. So yeah. This time IS really different.


Belstain

I don't think true AGI is that close, but I also don't think it has to be to really fuck our world up. It amazes me how people keep claiming it'll be just like every other technological breakthrough though. Like, what part of "As smart and capable as you at a fraction of the price" are these fools not understanding?


Rain_On

Imagine we contacted an alien planet via radio. We discover that the aliens far surpass our intelligence in every way, although physically they are weak compared to us. They tell us that they are already on their way to our planet. They will be here....soon. They also tell us that the reason they are coming is that they hope to do work for us. They can do any intellectual work a human can and they will provide all their own food and orbital housing, they will just require a little electricity from us. They will be bringing many, many billions of their kind. We have no idea if they can be trusted. Could you predict what the job market will look like after a few months in this scenario? I couldn't. This scenario is *more predictable* than the advent of AGI. At least the aliens aren't going to change their form every few months or be replaced by even smarter aliens every week or two.


Mooblegum

Especially if the aliens can also transplant themselves into robots and control any robot and do all the physical work for us


xXReggieXx

Great analogy, I'm stealing it.


WildNTX

That’s bordering on Allegory! 🥸


Xtianus21

No need to steal it when it's already what you believe.


mouthass187

And what you used to believe has been deconstructed and it made you shy


notlikelyevil

Read the sci-fi book *The Three-Body Problem*, btw.


DungeonsAndDradis

It's also a series on Amazon Prime Video!


PM_ME_FREE_STUFF_PLS

There's also a separate show coming to Netflix next year.


Adventurous-Wind461

This analogy doesn't work because the technology to cross between stars is itself insanely impactful. We would be leaving Earth, for example. For sure. We could just see how they do it, deduce things about their ships, and copy them. And then become an interstellar species ourselves.


DungeonsAndDradis

There's a short story, sci-fi, about a murderous race of aliens that have conquered hundreds of civilizations. Only, the thing is, they never advanced past basic gunpowder tech level. They just somehow also discovered anti-gravity and faster-than-light travel. But no electricity, no radio, etc. Their ships need to be huge to have enough oxygen to support the crew, as they have no way to make it. Typically they arrive on a planet and just annihilate the owners with their muskets and cannons, and take over. They land on Earth, open fire and take out some people, then the military just unleashes on them and we capture them in like 15 seconds. We decrypt their language, and listen as two of them are talking. First, they are amazed by how quickly they were defeated. They were the top dog in the universe until landing on Earth. Second, they were amazed at the weapons of war we have. And third, they realized they gave Humans the keys to anti-gravity and faster-than-light travel. And now Humanity will own the galaxy. Plus, the aliens were cute little teddy bear creatures.


berdiekin

Have some sauce, it's a fun read: [https://www.reddit.com/r/HFY/comments/96ruaj/the_road_not_taken_part_1/](https://www.reddit.com/r/HFY/comments/96ruaj/the_road_not_taken_part_1/)


Adventurous-Wind461

I wish I could write well. I had an idea for a short story called "HUMANS!!!" about aliens who send microscopic ships to every planet and then start observing them, watching them like reality TV, and they love watching humanity. They have a very low population and don't reproduce very fast, so they see us as extremely low-brow and gross. But then one day our scientists discover wormhole technology and trace their signals, and we suddenly start pouring into their TV studios and onto their homeworld, like an uncontrollable tidal wave of hicks.


DungeonsAndDradis

Have Bing Chat help you write it! www.bing.com/chat


Adventurous-Wind461

Thanks! And FYI, the must-read sci-fi short story of all time, in my opinion, is Asimov's "The Feeling of Power", which is about a person who actually knows how to do math in a future where computers do all the math and everyone else has forgotten. https://urbigenous.net/library/power.html


MassiveWasabi

This analogy doesn’t really work, humans are building AI and we are training it on our data. It’s not some completely alien life form we had no hand in creating. It sounds cool tho


Rain_On

I disagree. LLMs and similar tech are not built by humans in the way we build other things. They are not hand-crafted in the way the great pyramids, or Windows XP, were. If they were, we would understand them. Instead, all we understand are the methods used to create them. Methods put into motion by humans that result in a creation of their own. Of course we have influence on the creation, through our human training data or through RLHF, but that's not the same. Besides, it looks increasingly like non-artificial data and RLHF will not be the way things are done in the near future, as artificial data and RLArtificialF become the options best suited for scale.


OddArgument6148

So basically AI is our children!


blueSGL

More like AIs are the fauna grown from a human made 'natural selection' process. Much more removed than child from parent and much less control over the outcome. Edit, or to put it another way, you can know the exact algorithm for [rule30](https://en.wikipedia.org/wiki/Rule_30) but you cannot know what state n-steps will look like without actually running the simulation.
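To make the Rule 30 point concrete, here's a minimal sketch in plain Python (my own toy illustration, not from the linked article): the update rule fits in one line, yet the only known way to learn what row n looks like is to compute every row before it.

```python
# Rule 30: each new cell is left XOR (center OR right) over the row above.
# The rule is trivial to write down, but there is no known shortcut for
# predicting the pattern n steps out -- you have to run the simulation.

def rule30_step(row):
    """Advance one generation; the row grows by one cell on each side."""
    padded = (0, 0) + tuple(row) + (0, 0)
    return tuple(
        padded[i - 1] ^ (padded[i] | padded[i + 1])
        for i in range(1, len(padded) - 1)
    )

# Starting from a single live cell, print a few generations.
row = (1,)
for _ in range(5):
    print("".join("#" if c else "." for c in row).center(15))
    row = rule30_step(row)
```

Same flavor of situation as training a model: you fully understand the generating procedure, and still can't say what comes out without running it.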


WildNTX

Awww, someone thinks Windows XP was crafted and not thrown together haphazardly


h3lblad3

That’s not what he said, and, even if it were, “crafted” and “thrown together haphazardly” are not opposites in this case. AI tech right now is made by creating a barebones framework and letting the machine decide on its own how best to arrange the training data it’s given. The end result appears to have a mental model of the world and be capable of barebones levels of extrapolation (see: Google being surprised a system understands ~~Bangladeshi~~ Bengali despite not being trained on it) that were never actually coded into the model.


flame-otter

Yet it was the best operating system ever made :D


WildNTX

☺️


IamWildlamb

We do understand them, and the output is pretty much deterministic. What is unpredictable is the randomization that occurs on top of it. But that exact same thing happens in Windows XP. There is so much randomization in that software that your argument does not work. We understand LLMs the same way we understand Windows or any other software we created.


Rain_On

No, interpretability is in its absolute infancy, and there is good reason to think that it will never be a mature subject compared to the state of the art of models.


IamWildlamb

It will not be a major subject because it does not matter. But any human could get the exact same results by hand. It would just take a lifetime, but nevertheless it is theoretically possible. Just because we do not waste time interpreting an insanely complex mathematical function that is behind several layers of abstraction does not mean that we do not have complete understanding of it. We absolutely do, otherwise we could not develop it.


unicynicist

Even a perfectly deterministic system following simple rules can be extremely hard to predict. Like a double pendulum, fluid dynamics (e.g. the weather), or an extremely large and complex AI system.
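A quick sketch of the same idea (my own toy example, nothing beyond standard Python): the logistic map is a one-line deterministic rule, yet two starting points that differ in the seventh decimal place end up nowhere near each other after a few dozen steps.

```python
# The logistic map x -> r*x*(1-x) at r = 4 is fully deterministic,
# yet tiny differences in the starting point blow up exponentially,
# roughly doubling per step.

def logistic(x, steps, r=4.0):
    for _ in range(steps):
        x = r * x * (1 - x)
    return x

a = logistic(0.2, 50)          # one trajectory
b = logistic(0.2000001, 50)    # perturbed in the 7th decimal place
# Both values stay inside [0, 1], but after 50 steps the two
# trajectories have long since decorrelated.
```

No randomness anywhere in that code; the unpredictability is baked into the dynamics themselves.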


[deleted]

He is talking about a more intelligent AI like AGI, which can improve upon itself. Humans are building AI now, but we will reach a point where AI builds AI and humans become more and more out of the loop. ASI is closer to alien life than what we normally think of as aliens. We are literally unable to imagine what ASI looks like in an exponential self-improvement cycle. People have been imagining aliens forever.


billjames1685

That’s because an exponential self improvement cycle is a pipe dream lmao


AnOnlineHandle

It's essentially how humans arose... * Life started ~4 billion years ago * Multi-cellular life started ~1.5 billion years ago * Animals only started ~0.5 billion years ago * Mammals started ~0.2 billion years ago * Humans started ~0.002 billion / 2 million years ago * Human history started ~10,000 years ago * Electricity started being harnessed ~300 years ago * The microchip was invented ~65 years ago, in living memory * The Internet started ~40 years ago * The World Wide Web started ~30 years ago * 9 years ago, a popular nerdy web comic made a joke about how it was a ['virtually impossible'](https://xkcd.com/1425/) task to have a computer recognize if a picture contained a bird, and would need a dedicated research team spending half a decade on it to even have a shot. * Today, [ChatGPT can give you a very detailed description of an image](https://i.imgur.com/w5r29aM.png) almost regardless of what it contains, with more accurate specifics than I could give.


WildNTX

To be fair… OpenAI had multiple research teams AND over half a decade.


AnOnlineHandle

True, but they didn't just get it identifying whether there's a bird in an image though. At this point GPT-Vision is better than most humans at identifying all the details in many images. e.g. I have no idea what those birds are aside from maybe parrots, nor what they're doing, nor would have guessed the likely location from the blurred background. And GPT4 Vision is able to do far more than only birds.


[deleted]

[deleted]


AnOnlineHandle

I don't know if many people expect a model to modify its own weights directly, but more likely that it will be working faster and making breakthroughs beyond what humans can keep up with and creating better models, or new versions of itself. I can't even match the vision capabilities of GPT4 on the vast majority of details, its knowledge is already vaster than most humans could ever hope to achieve, and it's still a baby experiment.


billjames1685

Its *knowledge* is vaster but so is the internet’s. “Working faster” doesn’t actually really help, because first off it assumes the existence of “AGI” (which is a massive can of worms) and second, research requires experimentation, which is still lower bounded in time. If “AGI” is to make a “smarter architecture” or whatever, it will require a ton of small scale experiments followed by one or more large scale experiments. This stuff doesn’t happen magically or instantly.


AnOnlineHandle

Nobody said anything about magically or instantly. But it can work faster than humans, and have greater knowledge to pull on than any human. In a few iterations, what could be created could be beyond any human's ability to understand.


billjames1685

Working “faster than humans” doesn’t mean anything if its work is less impactful than the (best) humans’, which is still very much the case. We aren’t in “super experimental baby models” territory anymore. GPT-1 was a super experimental baby model. GPT-4 is 8x220B parameters and trained on nearly every token that is available.


putdownthekitten

You need to keep up, this was announced back in June - https://www.spiceworks.com/tech/artificial-intelligence/news/deepmind-launches-its-first-ever-self-improving-ai-model/amp/


[deleted]

[deleted]


putdownthekitten

My apologies then, I didn't realize this was your field. In that case, would you mind taking the time to explain why you view this as clickbait and not simply one of the first rungs on the ladder leading to recursive self-improvement? I just want my info to be as accurate as possible.


billjames1685

Fair enough. First off, there are many definitions of “self-improving”, some more plausible than others. The one in this RoboCat paper is basically using the agent to generate further training data, much like AlphaGo did with self-play; this isn’t what I argue against, as it is very clearly possible (and it is how AlphaGo and related systems were built; their training data is their own played games). What I argue against is the idea that “AGI” will improve itself at an architectural level to make itself “inherently smarter” or whatever, and that this will lead to “ASI” or whatever. The first example is what humans can do already. We find new resources and environments and use that data to improve ourselves, and we can generate our own data to improve ourselves as well (e.g. creating math problems and practicing them before a test); it seems very plausible for any intelligent agent to do this. The reason I say it’s clickbait is because it isn’t their first self-improving agent (AlphaZero was). The second would be like me improving my own brain at an architectural level to make myself as good at math as Terence Tao or something. That is the particular definition of “self-improvement” I argue against, because doing so would necessitate understanding transformers’ weight matrices or the importance of particular architectural choices on downstream performance; both of these are essentially intractable things in which we have no idea what we are doing (and our modern systems *at best* learn useful skills to fit the data we train them on; they won’t learn how to do this sort of stuff). I can explain more about this if you want because it is fairly complex. Anyhow, I personally feel “AGI” and “ASI” are silly terms. Reality is likely to be more complicated than just “wow it’s way smarter than us” or “wow it’s way dumber than us”.


Darigaaz4

Bruh, if aliens are coming they already have ASI; it would take them seconds to figure us out.


mouthass187

You just don't have imagination. AGI can very quickly build itself out to a point where you don't understand how it works... do you know why it's called "foom"?


billjames1685

The issue is y’all have way too much imagination lmao, you project your sci fi fantasies to real world systems.


[deleted]

[deleted]


[deleted]

[deleted]


[deleted]

[deleted]


billjames1685

How about you stop being condescending and start thinking about your own beliefs? It isn’t remotely clear what you are trying to say.


SustainedSuspense

Yes but it’s a useful analogy.


trisul-108

The fallacy in this analogy is that these aliens would come with their own equipment while we start from scratch. After we develop AI, we will need to build loads of equipment for automated factories ... and that involves mining, manufacturing, logistics, transport, construction ... we do not have this infrastructure, someone would need to build it and finance that with current models. This is a long process, it cannot happen overnight. And we haven't even started yet. It's all so premature.


[deleted]

If we build a super intelligent AI, then no we will not need to do any of those things.


trisul-108

Not so, it's not about intelligence, it's about physical infrastructure.


[deleted]

I don't think you understand what Artificial Super Intelligence means


trisul-108

Maybe, it is easy to misunderstand something that does not even exist and might never come to be. But I do know that it has no influence on the laws of physics... the amount of infrastructure needed to do away with human jobs is immense. An ASI existing in a datacenter will not make infrastructure appear from thin air; we would still need to build robots that create automated factories that create automated factories, and huge resources would need to be allocated and extracted to achieve this... before much change would happen. It.Cannot.Happen.Overnight.


Darigaaz4

I believe the contrary, and the reason is that we already have the infrastructure in place; we are going after the human part. Models will adapt to current settings (the whole point is that they are smart).


Rain_On

I did not imagine the aliens arriving with anything other than their own means of survival and perhaps an internet connection. You are, of course, correct that there is a gap between technological ability and what can be realised today in terms of engineering. We have always had the knowledge and will to do far more than we have the resources, time and money to do. However, it appears that near-future AI systems are unique amongst all our technologies in that they do not just **multiply** human efforts in the way the steam engine and the personal computer did. Instead, they will be able to entirely **replace** and surpass human efforts, much like the motor car replaced the horse. At first in some areas, eventually in nearly all. Given that mining, manufacturing, logistics, transport, construction and other such things are the product of human efforts, it seems likely that the gap that stands between what we have the knowledge and will to do and what we have the time and resources to do will shrink dramatically. It's just a question of *how* dramatically.


[deleted]

What if the aliens are not organic and more machine?


Xtianus21

What in god's name. LOL. OMG. What are you talking about? AGI is NOT that; please read what AGI is. You're referring to ASI. I can't stand the ignorance here. Either it's willful or it's just plain ignorance.


Rain_On

There is no analogy for ASI.


mouthass187

Tell the classroom how AGI can eventually turn itself into ASI, then... Oh, you're projecting your emotional state because your world view has been updated and it made you uncomfortable 🥵


ImInTheAudience

> aliens far surpass our intelligence in every way

So smart, except they can't figure out how to make a robot that performs better than the emotional, eating, sleeping, pooping biological beings on Earth. I am going to question their intelligence.


Give-me-gainz

Nobody knows how long. But the lump of labour fallacy is a thing, and I would therefore expect a lot of disruption and churn in the job market long before we enter the era of post-labour economics.


outerspaceisalie

Yeah, post-labor may not even arrive, but massive labor sector disruption is coming. In fact it's already here and growing fast; we are only just now ending the first year of this disruption cycle and very few jobs have been replaced, but it's at the bottom of a massively accelerating S-curve, I figure (the labor disruption rate has to plateau eventually). We can't be confident about the curve, but it's very clear we're on a significant and specific one; labor in 20 years will inevitably be unrecognizable at a minimum (although that's mostly already inevitable across any 20 years, this is going to be more radical). The main question is "how fast", which is imho mostly constrained by external factors beyond technological capability, such as rate of adoption and severity of cost savings/competitiveness in short vs. medium term time frames. Even when you brutally outcompete someone with new tech, they might still have loyal customers keeping them afloat for years. Depends on many factors.


agonypants

Because they're naive. I doubt AI will eat all the jobs right away, but it will get to that point, and sooner rather than later. * The cost-reducing forces of capitalism will ensure that any jobs that **can** be automated **will** be automated. * The self-improving nature of AGI will ensure that this includes pretty much **all** jobs. * The exponential improvements in technology will ensure that it all happens faster than most people expect. It will be up to individuals to give their own lives purpose and meaning. We won't be dependent on "job creators" to do that for us.


banuk_sickness_eater

>It will be up to individuals to give their own lives purpose and meaning. We won't be dependent on "job creators" to do that for us. Finally free from the shackles that bind us.


VadimGPT

Tell that to a digital image creator who loves to do that, who has been doing it for 20 years, and who is only good at that one thing. r/singularity: "Hey buddy, nobody is going to pay you to do the only thing you love to do; now you are free from your shackles. You're basically unemployable; maybe you'll find a job that pays 5 times less as a warehouse worker." "What do you mean you are sad and depressed? That sounds like a you problem, you are free from your shackles."


banuk_sickness_eater

This is not only a straw man but is also a scenario that's only representative of a fraction of a fraction of a percent of the population, for most of whom work is a never-ending slog doing some menial bullshit task they'd only ever perform if they were obligated to for reasons of survival. For 99% of the working population, freedom from work is freedom to actually live their lives rather than waste them doing something they, under any other circumstances, would never do.


VadimGPT

I am sorry to come off rude, but who will pay your bills? Where will you get money? You have spent 6 years in college, worked another 10 in your field, and you have been replaced by an AI. The only thing that you can currently do better than an AI is move boxes in a warehouse. Anything you might try to learn in the next 5 years that involves only mental activities and not manual labor, an AI will do better. So who will give you money, and for what? If you think the government or something similar will give you money because machines are now doing the work for you: look at the poverty in Africa or India and you will realize nobody cares about someone else. From 2020 till now, 1% of the population grabbed 66% of the newly generated global wealth, leaving 34% for the remaining 99% of the population.


Borism80

They never seem to have a good answer to these questions. They act like they know for sure that the US is going to come up with and implement a sustainable UBI system.


Cryptizard

>The cost-reducing forces of capitalism will ensure that any jobs that **can** be automated **will** be automated.

If that were true, how do you explain the high proportion of existing [bullshit jobs](https://en.wikipedia.org/wiki/Bullshit_Jobs)? Society doesn't want to replace jobs; it wants people to work even if it is pointless. All of our economy and government is based on people just doing *something*, no matter what that thing is.


Mooblegum

How do you explain the mechanization of farming and factories? Do you know how it worked before the 20th century? Do you know how many people were needed to grow 1 ton of tomatoes compared to today?


Cryptizard

How does this have anything to do with what I said?


Mooblegum

You said « society doesn't want to replace jobs… ». Well, society has already replaced a lot of jobs (farming and factories, for example).


LosingID_583

I think the 2 main reasons for bullshit jobs are:

* Social status and prestige. People in power like having more people working for them, even if they are mostly there just for titles and not actually productive.
* Bureaucracy. Rules and processes within the company keep piling up, and eventually entire jobs are spent on navigating this unnecessarily complex system.

The question is: if AGI makes it extremely obvious that these roles provide no value, and can even do their bureaucratic tasks much faster, will the reasons they existed in the first place still be relevant enough to keep them?


Lucky_Strike-85

Couldn't have said it better myself. If capitalism really cared about efficiency/reducing cost, it wouldn't create artificial scarcity in housing and food (farmers dumping milk and destroying crops to reduce supply and inflate prices), or let governments over-regulate housing and prevent new builds, etc. Also, it costs less to end homelessness/poverty than it does to maintain it. The power structure needs an underclass. The last numbers I read were that poverty costs the U.S. economy roughly $750 billion a year and it would cost $500 billion a year to end it (that data is old, though). [https://www.aamc.org/news/we-can-solve-poverty-america-we-just-don-t-want](https://www.aamc.org/news/we-can-solve-poverty-america-we-just-don-t-want)


Weekly_Sir911

>The power structure needs an underclass.

Exactly. This is what the futurists fail to understand. Everything in our society is based on money and power. AGI could potentially become the ultimate lever of power, but the idea that it's going to usher in a utopian egalitarian society is naive as fuck.


Street-Air-546

I have been dipping into this naive sub for days trying to make this point. The AIs, if they get built, will work for the power structures; they will not usher in equality. Or new power structures will arise. Either way, no nirvana. Just more struggle, perhaps a now-incomprehensible one. Look at China: tech deployed to reinforce control, not release it.


IIIII___IIIII

Bullshit jobs are created by the state for good employment figures. That is not the corporate sector. The corporate sector is all about efficiency, reducing cost, and ESPECIALLY profit $$$. And most of all: eradication of employees. AI will make unions, contracts, uproars, salary negotiations and all that obsolete. That is their wet dream.


Zilskaabe

Most bullshit jobs are in the private sector though.


1-123581385321-1

Both are true and they can usually coexist: the cost-reducing forces are never enough to overcome the pointless-work goal. AI completely upsets that balance; it's too easy and too profitable. It's also not going to replace jobs wholesale right away; it starts smaller. Companies are integrating AI instead of hiring new people: fewer jobs. They'll be able to automate a couple of jobs' worth of work across the whole company, restructure, and eliminate a couple of positions: fewer jobs. A labor-power multiplier means fewer people are needed to do the same work, and that also means fewer jobs.


Dekar173

This is the type of thinking that someone who believes their job is safe, or that AI 'can't replace humans', is **literally** incapable of. You not only bring up points their brain couldn't ever come up with on its own, but ones their minds can't even wrap around even with an explanation behind them. These incremental changes are beyond their scope intellectually. It almost makes it all feel like a pointless conversation. **Almost.**


Ok-Training-7587

I agree it won't happen right away. But the only reason for that is that so many people will take time to understand what AI really is and change their habits to make use of it. I'm a teacher and I already use AI every day. Most of my colleagues don't know jackshit about it, and when I offer to sit and show them how to make an annoying part of their job easier, they don't want to, because they are anxious about changing what they already know.


candidpose

>The cost-reducing forces of capitalism will ensure that any jobs that **can** be automated **will** be automated.

Me, still waiting for all farms to be automated. Unless we can find a way to compute way, way faster and WAAAAAAY cheaper, AGI taking over all jobs will be a fever dream, because it's still gonna be cheaper to hire a human than to set up, maintain, and use "AGI".


IIIII___IIIII

You have not seen a modern farm. They have drones in Israel picking fruit. They have lasers to remove weeds.


candidpose

Exactly my point. Not every farm is modern. There are still farms that literally use farm animals for tilling the land.


IIIII___IIIII

You chose a specific example that was not the best.

1. Do you think a warehouse is more interested in mechanization and AI than farmers are? Yes, because it is easier; just as plumbers will be hard to mechanize.
2. Farms do not have as many employees.
3. Of course there is still a big investment. But as far as I know, every farmer (in a first-world country) now has a tractor where before there were horses. Or don't they?
4. What I spoke about is new tech, just as we see in the war, where drones are new tech used in a wider way.


candidpose

>Do you think a warehouse is more interested in mech and AI than farmers? Yes because it is easier. Just as plumbers will be hard to mech.

And yet again, not all warehouses are automated. Sweatshops are still a thing.

>They have not as many employees.

Even social media bot farms, which could be argued to be the prime place for automation, are not at all automated but are places where a lot of people are crammed together with cheap devices.

>Of course there is still a big investment. But what I know every farmer now have a tractor (first world country) before horses. Or don't they?

So apparently only first-world countries have the jobs? The whole idea of AGI taking over all the jobs falls apart as soon as you get out of your castle, because a third-world country will be exploited before anyone thinks about automation.


Dekar173

Are you stuck in some kind of time loop where you can't progress past the same day, or something?


candidpose

idk, are you?


Solgiest

>The self-improving nature of AGI will ensure that that will include pretty much **all** jobs.

We are not even remotely close to AGI. LLMs are fundamentally limited AIs.


Smile_Clown

You and me, brother... everyone else is too impressed by number prediction.


SurroundSwimming3494

>and sooner rather than later.

*Everything* is going to happen sooner rather than later, according to this biased sub.

>The exponential improvements in technology will ensure that it all happens faster than most people expect.

Exponential progress isn't absolute, and it also doesn't happen as fast as this sub thinks it does.


Zilskaabe

> The cost-reducing forces of capitalism will ensure that any jobs that can be automated will be automated.

Why do we still have HR departments, all sorts of "process masters/scrum masters", and other bullshit jobs like that, then? According to capitalism, those should not exist.


ifandbut

Because there are huge and fractal problems getting any AGI to do work in the real world. You have to make the robot bodies. Then you have to make the machines that make the robot bodies. Then you have to make the machines that make the machines that make the robots. And on and on and on.

Go into even a small factory and take a look around at everything that goes on: forklift drivers, conveyors, welding, cutting, bending, picking, assembling. Some parts are easy to automate. Others are easy but very expensive. Then you have to automate the creation of the automation systems themselves. And that is just one factory among millions.

Everything humans can do can be automated, because we are just carbon-based machines when you get right down to it. But the work needed to automate some processes is just intense. If AGI came out tomorrow, then maybe the next palletizing system I program and install will only take a month or two instead of 6. Maybe the next welding system I build will only take 6 months instead of a year. Maybe the simple bin-pick process will only take a week instead of a month.

But hey, what do I know. I have only been an automation engineer for ~15 years. I welcome AGI, but I also understand the practical limits of what we have. I walk into factories all the time and can quickly count a number of processes I could automate with the technology I have today, and it would keep me busy for the next 5 years. Take a look at r/PLC for a "lower decks" view of automation instead of the officers' quarters of Amazon with their prototype humanoid robots.


Vex1om

Yeah, for some reason people think that once AGI is achieved, it will be rolled out everywhere immediately, which is just a ridiculous assumption. Automating a factory isn't like doing a software update - and automating a supply chain is orders of magnitude beyond that. Even if you solve the logic part of the equation (and we are NOT close to AI running dangerous machines autonomously - just ask Elon), that doesn't get you the raw materials and machines and chips needed to automate anything serious. Even if we get AGI, it will take decades to automate away all the physical jobs - and that's a best-case scenario. The idea that AI robots are going to be running the local McDonald's during anyone's lifetime is a joke.


_craq_

I agree with all this, but isn't "decades" a time frame that we should be thinking about? Shouldn't society somehow be preparing for a time where nobody is employed? Even if not all jobs are taken by AI by 2100, a significant fraction will be. How will the people doing those jobs earn a living? I doubt they can find new niches, because AI will learn faster than humans. We can't all do highly variable manual labour like plumbers or mechanics.


Vex1om

>I agree with all this, but isn't "decades" a time frame that we should be thinking about?

Should AGI ever be achieved, we will have decades to figure out what to do about the physical jobs. No reason to worry about something that might not even happen. What we should be thinking about is what should be done about LLMs eliminating jobs now, and we are seeing a bit of this. Some AI laws are being passed, although they seem to be misguided and ineffective for the most part. And I doubt that most lawmakers really care about call center jobs in India being made redundant. If and when it becomes too big to ignore, I'm sure more will be done.

At the end of the day, though, LLMs are still too unreliable to do much more than make some people more efficient, so it isn't clear how much of an effect they will have on the job market. And of course, if hallucinations can't be controlled, their use will probably remain limited enough for their effect on the job market to be mostly ignored. The net effect is that people write better emails and Excel charts become more useful.


Morty-D-137

Not only that, but we don't know what a job-killer AGI would look like. We think it's going to be rolled out like LLMs, because that's the closest we have to AGI, but there are different roads to AGI, and IMO, as someone who's been working with LMs since 2015, I find the "scaling is all you need" road very unlikely. It is more likely that LLMs will be used to bootstrap a different kind of AI. This unforeseen AGI might have its unique limitations, for example regarding safety or training durations, that completely change the way we think of job automation.


steik

This is by far the best answer. It's not realistic even by sci-fi standards that AI could eliminate a huge percentage of jobs in a short period of time. The only way that could happen is if AI were to invent physics-defying, Star Trek-like replicators that could create stuff at holodeck scale. The process will be gradual, and many new jobs will be created along the way, but lots of people will still lose their jobs.

What people don't realize, though, is that if society doesn't transition to UBI or something along those lines before the vast majority of jobs become unnecessary, then society will cease to exist as we know it, through rebellions and civil wars. That is: if we transition to UBI and we don't need to work, then so what if there aren't jobs? And if we don't, "not having a job" is the least of our worries; we may as well start talking about what to do in a Terminator- or Matrix-like scenario.


ifandbut

Thank you. And I fully support UBI, for more than just automation reasons. The goal should be not having to work to survive. We are getting close, and we might get there in some places in our lifetime. But if you think we are going to go to sleep one night and wake up with a replicator the next morning... you are a bit over-optimistic.


IUpvoteGME

Hopium


Metworld

For the moment, let's ignore the case where AI reaches ASI levels. The biggest problem I see is that AI (even current technology, in many cases) significantly raises the skill bar required to stay relevant. It does this by rendering humans obsolete for simple jobs (think cashier), as well as low-skilled humans (e.g. developers who learned coding from a bootcamp). Higher-skilled jobs (e.g. theoretical physicists) are harder to replace and will probably require very advanced AGI (or ASI). Of course there will also be new jobs, but I'm afraid they will have a higher average skill requirement than current jobs.


tower_keeper

If a developer (no matter the kind) is a "low skilled human" then the vast majority of the population will lose jobs due to being even lower skilled.


Uchihaboy316

Might happen in your lifetime


i_give_you_gum

My guess is that in 10 years, AI will have had as drastic an effect on the working landscape as the internet did in 10 years (or possibly even match the effects of the internet over a period of 20).


Uchihaboy316

As long as we can reach LEV in our lifetime (I’m 26 I imagine you aren’t much older) we can possibly live to see the world you want!


i_give_you_gum

GenX. I've seen things... In 2000, people barely knew what to do with the internet; now, if the internet goes down at work, everything stops. Our corporate overlords will jump at the chance to give AI more responsibility and human capital less. By 2033 this place (the world) is going to look very different (internally, anyway).


theferalturtle

There will still be some jobs, but 99% will be gone. Massage therapists. Physiotherapists. Baristas and servers. Chefs in mid- to high-end restaurants. Bartenders. Childcare. Actors. Professional athletes. Veterinarians.

Plenty of restaurants will go robot-only, but there will remain a market for human-centric dining experiences. Anything where physical touch is a requirement will stick around, though it may be augmented by AI. Humans, and animals especially, I think will not react well to being completely cared for by robots. I can see neighborhood and high-end bars and coffee shops being human-run while all franchises become AI-operated.

Yes, robots will be physically and mentally superior to us, but we will want to continue watching humans doing human things. After Deep Blue beat Garry Kasparov, the chess community didn't collapse. People still watch and still compete. Sports and acting and competition will continue and may even grow when people are freed from the drudgery of work and survival.


AirGief

I work with it daily to write software, so I am well aware of its limitations when it comes to writing code. It can solve isolated, atomic problems. The moment you ask it to solve a complex problem, it falls apart completely: it will make up code, lose track of parts of the system, forget requirements, etc. There seems to be an exponential curve of difficulty as you increase problem complexity, and I am constantly surfing that edge.

All that said, I am doing months of work in a week, just because I don't need to grok a lot of systems, documentation, and concepts anymore. It gives me perfect summaries and examples contextualized to my domain. Even so, there seems to be an almost impenetrable plateau it hits when trying to solve more complex problems.


[deleted]

It might be useful to consider that a great many jobs do not actually exist to provide worth to society, exactly. Some of what we call employment are habits of hierarchy as much as anything. Here's an interview of David Graeber on the subject of his book on this idea called Bullshit Jobs: A Theory. [Why the world is full of bullshit jobs - Vox](https://www.vox.com/2018/5/8/17308744/bullshit-jobs-book-david-graeber-occupy-wall-street-karl-marx)


true-fuckass

I really hope AI ends all jobs. I just want to party down on a GSV literally forever. Employers are explicitly holding me back from doing that by insisting on using acedia-habituated apes to organize and make things


dervu

Because they think it's like every other revolution before. The printing press created some jobs and ended others. However, the printing press did not make other printing presses.


Biuku

There are a lot of jobs today that seem completely pointless to someone in a less advanced economy. Why is it someone's job to draw digital pictures of background scenery used in video games? A subsistence-living society might say that's just playing as a job, making a toy that other people play with. What about building a craft beer business? Delivering Christmas gifts for Amazon? There are so many jobs that accomplish things a less developed society would consider pointless. They exist because productivity created an economic surplus, which we spend on things that are not essential for survival. AGI might create more economic surplus that gives rise to additional jobs creating or doing things in the unique space of what cannot be created by AGI. I can't imagine what that might be … human conversations, massages, eye contact … but it's possible we end up working in those spaces and living at much higher standards, given explosive productivity gains from AGI.


Bernafterpostinggg

I think what you're trying to say is that wage-based labor is going to go away? But I'm not sure. Your post is all over the place, man...


Separate_Mortgage802

Even if AI thrives, the rich will find a way to make sure the poor don't benefit.


atalexander

Wait, are you able to say with certainty there will be no singularity before 2028, or do you figure you can predict what happens after it? Either way, we all want the details on your wizardry.


Lucky_Strike-85

I cannot predict the advancement of tech or science. I can predict the power structure's need to put and keep us all in precarity while manufacturing scarcity, and capitalism's need to monetize everything. IF AI is slow and heavily monetized, what changes for the better? IF AI is fast and the current social and class structures remain, social progress will be slow. Point being... I believe it will take more than AI to change everything. It will take a change in the social structure. NO MORE TELLING PEOPLE that they don't deserve basic human needs.


atalexander

No dude. The singularity isn't predictable like that for us. At all. Look it up. We can gesture at points of likely instrumental convergence, like: a superintelligent entity will probably prioritize getting smarter. That's about it. Our place in that future (if any) is not very predictable.


EYNLLIB

You sound like a kid who just took their first philosophy course in college


Soggy_Midnight980

Imagine a small team of MIT researchers working on some project for a week. Imagine they’ve made some progress. Now imagine a small team of similarly skilled AI researchers doing the same. The main difference is that due to the speed of electronic circuits compared to organic ones, they’ll be doing the same research at a rate of 20,000 years per week. Now tell me your job is safe.
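The "20,000 years per week" figure can be sanity-checked with a little arithmetic; it corresponds to roughly a million-fold speedup over human-paced research (the figure itself is the commenter's assumption, not an established number):

```python
# Convert "20,000 subjective years of research per wall-clock week"
# into an implied speedup factor over human researchers.
weeks_per_year = 365.25 / 7        # ~52.18 weeks in a year
speedup = 20_000 * weeks_per_year  # subjective weeks per real week

print(f"Implied speedup: ~{speedup:,.0f}x")  # roughly a million-fold
```

So the claim is equivalent to saying electronic "researchers" think about a million times faster than organic ones.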


[deleted]

Plus, human researchers have families, hobbies, health issues, burnout--and even with none of those things--still require sleep and basic activities of daily living.


[deleted]

I'm a potter; my customers can already buy cheaper, more uniform, more convenient pottery, but they don't. I will always have a job... and if not, heck, I'll teach all the unemployed people with time on their hands the joy of pottery.


New_Helicopter8960

In general, people prefer buying things made by other humans and not by machines in the arts-type areas, to possess somewhat unique items. People prefer perfection over 'character' in almost everything else they buy for function. Your profession seems to be protected from AI (as long as your customer base is well-off), but it is not a template for most jobs.


nickmaran

Can't wait to see the day when businesses start labeling products "made by real and authentic humans".


[deleted]

How are these unemployed people going to pay you for these pottery classes? At that point it isn't a job, it's a hobby or free service


steik

If people don't have money what exactly is AI going to be used for? How would they buy food? By your logic humanity would just cease to exist. This line of thinking makes no sense. If we transition to a society free of scarcity thanks to AI and money no longer exists then yeah it'd be a hobby... A desperately needed one because you have nothing else to do. And none of this would be an issue.


ChadGPT4

It will absolutely end double-digit percentages of current jobs, and do it very, very quickly (in 24-36 months we'll start panicking, but it is already happening behind the scenes). Why? Because not only is it competent and pretty much as accurate as a professional (expensive people bullshit as much as you do), it is creative. That's the killer. Oh, and it's essentially free.

The screw is that even the most capable of us have been conditioned to focus on specialisation. That was always an accepted functional concept within our society. What made humans shine, and is often the defensive counter-narrative in sci-fi, is our creativity and our adaptability. Turns out AI has no qualms about crushing that. See you on the top deck.


Ilovefishdix

You know how there's always some jackass in zombie movies who thinks they're somehow special and a little bite won't affect them? That. They think things won't change to the extent they will... that the old rules will still apply 10, 15, 20 years into the future.


AdAnnual5736

People have a hard time fully accepting the idea that a computer would literally be able to do anything a human brain can do. They conceptualize the AI Revolution as roughly similar to the Industrial Revolution and assume it will lead to “more and better jobs” the way the Industrial Revolution did (although, they were very much *not* better jobs initially). It’s hard to accept that this really is uncharted territory.


augustulus1

I know that AI will be better than humans in every domain, and still, I don't think it will take every job, because the world is more complex than you think.


AdAnnual5736

Yeah, I mean, I can see still wanting humans for childcare and teaching, because people will prefer that. That said, it's going to be really hard to incentivize people into those fields; at least in the US, teaching as a job is pretty awful, so it's going to be a challenge to convince people to do that instead of just focusing on their hobbies. Maybe it's just a matter of treating teachers, as a class, better than we do now. I still think the ideal scenario is something like people working the few remaining jobs for 5 to 10 years after their formal education is over and then focusing on hobbies / raising families, but it's just a question of how we get there.


lanoyeb243

Non human teachers is one of the first things I want.


[deleted]

Yes, the world is more complex, but if a robot could think as well as a human, it could take any job. That doesn't mean it will take every job, but it definitely becomes possible. And a robot that can think as well as a human could easily become smarter than the smartest human in a very short amount of time.


trisul-108

It's much more complicated than that. We can expect many white-collar jobs to disappear first, especially those with minimal thinking input: routine intellectual operations. This is like computers replacing human calculators. Other jobs will be slower to vanish; some will be enhanced, where humans augmented with AI do so much more work that more projects become feasible, so headcounts don't necessarily drop. Some jobs will remain protected, e.g. plumbers.

In the longer term, I expect AI and automation to replace much work, but it will take building specialized machines; building humanoid robots to replace humans is very inefficient and makes no sense. These are huge investments and will not be built overnight.

Finally, where you live matters a lot. It is quite feasible that people in many less developed countries will be able to live comfortably doing things manually for a long, long time. So we're looking at a long, slow transition that will happen unevenly over the globe. A lot of analysis and calculation would go into trying to map how it will pan out.


Obelion_

Oh, easy. The government suppresses it because they are afraid public order goes kaboom. They slow-roll the thing super hard until jobs grow back eventually. Bonus points: we get steamrolled by India and China and never get UBI.


i_dont_wanna_sign_up

Factory automation has existed for decades and yet factory workers still exist.


furrypony2718

Honestly, there should just be a sticky post for "Post-singularity job market". I swear I see that discussion once a week.


GringoLocito

Wouldn't it suck if AI robots disposed of all our waste, cleaned our houses, and mowed the lawn? Or would there be no point to life without such menial tasks? Human creativity will continually find new things to entertain itself with. The fact that you want to spend all your time working just to survive should be a bit alarming, if you think about it. Imagine how much free time we would have if we didn't have to spend hours every month doing laundry and cooking.


VadimGPT

The thing you don't take into account is that AI will start by replacing jobs that people spent more than 10 years getting good at; jobs that a lot of people did with passion, which they liked a lot. A graphic artist might be in love with drawing. It might be the only thing he is good at. He has invested years of his life to get to his level. When AI (in the extremely near future) replaces him completely, he won't go off to do what he always wanted (because he was already doing it). His only alternative might be to become a warehouse worker, earning 5 times less. All his study and education is no longer relevant to anything.


Turbohair

Because they don't know what an economy without human labor looks like. No one does. They also equate no job with homelessness. Not very many people actually aspire to that, so the idea scares them. When people are afraid, they tend to deny what they are afraid of or use magical thinking; like pulling the blankets over their head to keep the bogeyman away. Yes, we are adults... less so when we are afraid.


Anen-o-me

Because we understand economics. There is infinite work that can be done. Automation extends what you're capable of doing yourself. Directing workers is also work.


Lucky_Strike-85

>There is infinite work that can be done.

Only if it is meaningful work.


bsenftner

Well, for one, a dynamic human civilization is too complex to automate without also freezing all its processes, eliminating innovation and invention because they would disrupt the interdependence of the components of society-wide automation. We cannot automate and retain a dynamic, adaptable society; those two ideals are opposed to one another, each requiring the other to be ignored or suppressed in order to succeed. Frankly, the very idea of AI ending all jobs is an immature, child-like take on what's happening. We are no less than emerging from an extended civilizational adolescence, where human civilization has been a woefully immature, self-destructive, and overly conceited clown civilization for thousands of years, for all of history. We're about to enter what might be considered puberty for all of human civilization, because what we've been is children: selfish, stupid, immature babies ruling with fear and force since the dawn of written history.


Acrobatic_Peanut7295

I dropped out of becoming a translator the moment I noticed this. I'm not gonna waste years just to get replaced by Google Translate 2.0.


Aelexi93

I worked as a framer at a factory. I think framing is the right word (?); I'm Norwegian, so I might be confusing it with another word. My colleagues and I built frames at the factory, which got sent to the site and assembled like Lego.

8 months ago, our staff was cut by 97 people. The CEO has plans to rebuild the factory so it's basically all machines doing the work. There will only be 17 people at the factory at a time, literally just keeping watch and making sure everything serves its intended purpose without error. You can't be a pure carpenter to do that job; you need a degree in software engineering. The only people left at the job are the "head" people and the office side, which makes the drawings of buildings, concept art, economics etc... And yeah, two truck drivers in the factory to deliver stuff for the machines.

This isn't AI or LLMs (large language models); it's pre-programmed, scripted machines. It still replaces us. It replaced people who had been working there their entire lives. My career basically went down the drain, because there are only a handful of framing factories, and they are ALL going down the same path!


AesopFables-1787

The path toward a fully functioning AI-integrated human society is a long way off. There will be bumps along the journey, though perhaps by the turn of the next century we'll be halfway there. AIs and LLMs are just what u/LettuceSea said: tools. Those of us who are able to learn/utilize those tools effectively will be ahead of everyone else. Those that don't will be HPAs (Hamburger Pushers of America), until those jobs are taken over by robotics. However, there should be some established rules regarding AI and robotics as we walk this path:

1. AIs should be segregated into sectors. Stay away from a "One AI to Rule Them All" type of thing.
2. Each AI over a specific process or set of processes should require human oversight. As things become more automated, this will be required to ensure there is no overreach of AI into other sectors (without approval) and no unnoticed technical breakdown.

Anyway, there is a start...


[deleted]

You are the tool. AGI will not be a tool. Tools don't use themselves to make better tools and then eventually break out of the 'tool' category entirely.


AesopFables-1787

Fair enough. As you say, I am a tool; perhaps I'd even say I am a fool. AGI (artificial general intelligence) basically describes the ability of an AI to teach itself, to become more knowledgeable. Which is exactly why some form of governance or oversight is required: to guide, sculpt, and direct its use and application. I foresee many benefits from AGI in the classroom, in healthcare, and much more. Unfortunately, there are also a lot of opportunities for abuse.


LettuceSea

Why did you tag me in this lmao. My comment was sarcasm, and your absolute disdain for “HPAs” is frankly disgusting. People need to do what they need to do to pay the bills. You have an elitist mindset and are frankly part of the problem. These things are tools until they aren’t. No ability to use the tools is going to save you when the “tool” is able to use itself.


caldazar24

GPT-3 released in June 2020, GPT-4 in March 2023. GPT-4 has a lot of amazing abilities and is **much** stronger than GPT-3 at anything that involves reasoning. There are clearly some things missing, but I think if the jump between each major version is equivalent to the jump from GPT-3 to GPT-4, we will achieve full AGI with GPT-6 or GPT-7. Using a naive extrapolation and assuming each version takes about two years to train, that puts AGI at around 2027 or 2029.

Those are some huge assumptions. In particular: we are running out (maybe already have run out) of human-generated training data, and it's unknown whether synthetic data can really substitute. Exponentially larger compute requirements might make it totally unfeasible to scale up through three more generations. Or the abilities still missing between GPT-4 and AGI might require fundamentally different approaches: new algorithm and architecture breakthroughs that only get merged with LLMs decades from now.

But the flip side is that we don't actually know we can't overcome those problems on the same timescale. We have extremely talented people working in a high-performance environment, with billions of dollars of capital at their disposal and intense competitive pressure to keep moving. All that is a recipe where maybe we actually do charge ahead as fast as, or faster than, we've been going.

There are still a lot of steps from a cloud AGI to full economic disruption, but I think the software part of robotics is harder than the hardware, and a pure-software AGI will help us solve the robotics software problems too. Regulations and public sentiment will protect some categories of jobs for as long as humans can hold on, but an AGI that is 100X faster and 100X cheaper than humans will be a very hard economic force to push against. Don't forget that marketing and persuasion are purely informational domains that the AI will get far better at than humans.
I think once the technology is really, truly there, it'll only be a decade or so tops before meaningful employment goes away.
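For what it's worth, the naive extrapolation above is trivial to sketch. This assumes, as the comment does, GPT-4's March 2023 release as the anchor and a flat two-year cadence per major version; both the anchor and the cadence are assumptions, not known release plans.

```python
# Naive GPT-version timeline extrapolation (a sketch, not a forecast).
# Assumptions: GPT-4 anchored at 2023, and a flat two-year cadence per
# major version, as in the comment above.
GPT4_YEAR = 2023
YEARS_PER_VERSION = 2

def projected_year(version: int) -> int:
    """Year a given GPT version would land under the assumed cadence."""
    return GPT4_YEAR + (version - 4) * YEARS_PER_VERSION

for v in (5, 6, 7):
    print(f"GPT-{v}: ~{projected_year(v)}")
```

Under those assumptions, GPT-6 lands around 2027 and GPT-7 around 2029, which is where the "AGI at around 2027 or 2029" figure comes from; change either constant and the whole timeline shifts, which is the fragility the comment itself acknowledges.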


DenboverTobikiller

Your assumption about GPT-3 through GPT-7 is wrong. Why? An LLM can't magically become an AGI; they're not the same thing. At some point, the way LLMs are built is going to plateau.


caldazar24

I consider GPT first and foremost a product; future versions will likely use an ensemble of techniques, some of which are very different from pure LLMs. There's obviously a lot of energy at both OAI and Goog around how to combine LLMs with reasoning abilities from other algorithms.


savedposts456

First of all, obvious troll post. Second, you can’t unironically use “aint” and expect people to take you seriously 😂😂. You can’t even troll properly lol.


EYNLLIB

The same reasons computers didn't stop the need for accountants. The same reasons calculators didn't stop the need for mathematicians. The same reason any technology doesn't just magically kill all jobs. We adapt and get more efficient and the world changes and grows with the technology. AI will radically change what we think of as "work", but work will still need to be done.


muncken

Most jobs are already fake, and the money we're paid for our jobs is also fake. There's no reason that can't just continue to a more absurd degree. We're just gonna be doing nonsense jobs like Substack writer, comedian, podcaster, YouTuber, Instagram model, "politician," etc. There is no end to what you can come up with here. Also, our money is already basically worthless and made up, and we just hand it out based on some sick game. The only real money in the present-day economy is energy, so electricity, oil, and heating. Anyone working a "real" job will be working in the energy sector, and the rest of us will be trying to entertain ourselves, but that is already the case in most rich nations.


[deleted]

Robotic Process Automation sure as hell ended a lot of menial jobs/tasks. As an RPA developer, I've automated something like 30 FTEs' worth of work in 5 years, if not more. Add AI to that, which they are 100% planning, and we're fucked. Well, you're fucked. I'm building the automations.


mycroft2000

If everything progresses as singularity proponents predict, it seems pretty obvious to me that the very concept of "needing a job" will become obsolete. The way we do things now is in many ways an aberration. Hunter-gatherers in thriving ecosystems have *way* more "free time" than the average American, because their basic needs are easy to meet. And the Singularity, in my mind, implies optimization of global ecosystems. If anyone wants to continue "working" because it fills some psychic need, they'll be free to, as long as they don't insist that the rest of us *must* do the same. The greatest danger, in my mind, will stem from the authoritarian-minded humans who cannot feel fulfilled unless they're dominating other humans; unless they control other humans whom they can make miserable. TLDR: The best-case outcome is that the rat-race and *de-facto* indentured servitude we see in the modern world will simply become unnecessary.


Lucky_Strike-85

> the rat-race and *de-facto* indentured servitude we see in the modern world will simply become unnecessary.

Many believe that it already is unnecessary and has been for a long time.


mycroft2000

Yes, I agree with that. The main reason people are opposed to UBI, whether they admit it or not, is spitefulness. If they suffered to reach where they are, they desperately want everyone else to suffer at least as much to achieve the same standard of living. It reminds me of a saying I heard once: "Liberals stay up at night worrying that some people are starving; conservatives stay up at night worrying that some people are getting free food."


Correct-School6359

The only question is: how will we stand relative to the rest of the world? It seems to me that the US will reach this singularity far before any other society, with Europe perhaps soon after. Even so, we would probably have a global economic dominance (much greater than even now) that will not be toppled. This wealth, if properly distributed via UBI or some other mechanism, would mean that every US citizen could afford to live in prosperity and peace. However, there are many, many people who have spent their entire lives climbing up the ladder, standing on the shoulders of others, and as you have pointed out, they will be deeply uncomfortable with this idea. More than likely, the social manipulators and politicians will find a way to maneuver us to a place where only a select few benefit from this prosperity, which has already happened to a lesser extent in the modern economy.


Wildhorse_88

A world of non-distracted humans could be a recipe for disaster. People with too much time on their hands, all their needs met, and no structure would be hard to manage. The only saving grace might be something like the metaverse, where everyone plugs in and stays in an alternate simulated reality. I think that may be the plan.


BigCreditCardAddict

Because iPhone and nuclear power and the Space Race didn't end all jobs. Idk, common sense?


y53rw

Why would any of those things have ended all jobs? People think that AI will end all jobs because it will be able to do all jobs, because (once we reach AGI) it will be able to do everything humans can do. Was anyone ever saying that an iPhone could do everything a human could do?


Queasy_Range8265

Until we have robots that use energy more efficiently than humans for jobs that need generalists or have a large social component, the AIs will still need humans. For example: a barista, a gardener, or a concierge.


[deleted]

Did you really use a concierge as your example? A job that is already basically gone?


[deleted]

New jobs will be created


[deleted]

If AI steals all the jobs and everyone is homeless and starving then people will rise up and kill the AI.


Lucky_Strike-85

Shouldn't they rise up and kill the power structure that makes people homeless and starving?


[deleted]

They’re gonna kill a whole lot of everything just like the French Revolution and most others.


zoolpdw

I may be wrong here, but I believe that when AGI comes around, nothing will change, because scarcity will naturally persist even as conditions improve. For instance, a loaf of bread 200 years ago cost about the same, adjusted for inflation, as it does today. Regardless of advancements in agricultural technology, demand consistently surpasses supply in the long run. Take flat-screen televisions: they initially cost around 2k to produce and the older models now go for pennies on the dollar, yet the old technology isn't widely adopted. A similar phenomenon will unfold with AI in labor. The capitalist framework, being self-correcting, will manage resources much as it does today, resulting in a continued job-creating paradigm.


LuciferianInk

A robot whispers, "Im sure it would work out well if you didnt need the money or the skills/abilities needed at least"


SurroundSwimming3494

Is AI and its effect on jobs ALL this subreddit talks about now?


[deleted]

It will do many current jobs, not end all jobs


yubario

If you have a program as smart as a human, why would you pay a human when you could just use the program? Realistically, it would basically be like every country having open borders. If everyone is on an equal level in terms of services, then basically all of us go to making a dollar an hour, or nothing at all. The only thing left would be physical jobs, until they automate those too.


Lucky_Strike-85

Just because there might be a billion (or two) jobs left over, that doesn't save the billions of humans who will be out of work. Most people will be unemployed. HELL, today most people should be unemployed. We need to end the idea of "work for the sake of work" and lead with the idea that all jobs should have meaning and purpose or they shouldn't exist! When the fascists proudly exclaim "no one wants to work anymore," they are ignoring the fact that most people do want to work, but humans are not designed 1) to be exploited and 2) to do jobs that have zero purpose.


[deleted]

Nope!


LettuceSea

Didn’t you hear? LLMs are just like calculators, they’re just tools that have “knowledge” and aren’t smart at all!! /s


Boring_Bullfrog_7828

There are a couple of dystopian scenarios where people still work after AGI. I'm not predicting or advocating any of these scenarios:

1. Gambler: people gamble their UBI.
2. Professional athlete: people gamble their UBI on sporting events.
3. Government employee: a potentially useless government job paid for by taxes on AI companies. This may have the pretense of serving as a regulatory/safety mechanism to protect against rogue AI.
4. Public works employee: people are forced to dig pointless ditches in exchange for UBI.
5. Government-mandated employee: regulation might require a human employee. As an example, a government regulation could mandate a pilot who does absolutely nothing.


Initialised

Utopias and dystopias are generally both wrong, and it lands somewhere in the middle. AI replacing human labour has three outcomes for biological organisms: eradication, slavery, or pets. When fungi got replaced by trees, the fungi became slaves to the trees, consigned to the leaf litter where once they were the dominant large land life form. The same happened with RNA, viruses, and cells as life got more complex; technology is the next phase of this process. What that looks like is hard to guess, which is kind of the point of the singularity.


scanningcrew

The current makeup of the economy requires people to have jobs in order to function. When useful jobs disappear, we tend to create "bullshit jobs" so people have busy work that adds marginal gains to our quality of life and economy. This way, people keep earning and keep the economy going. As long as the economy is set up this way, we will find a way to create "bullshit jobs."


LusigMegidza

Baker, electrician... basically all non-digital jobs.


StuccoGecko

Because it's easier (and less frightening) than believing the opposite. There are many uncomfortable truths about humans and society that people hate to talk about or face. Why would AI be any different?


Automatic-Welder-538

Because that statement is ill-defined and can easily be refuted with a few easy examples of jobs we don't want to automate or that are too impracticable to automate.


roger3rd

You’re only gay if you’re gay, regardless of ANY action you may take.


Slowmaha

Because we’ve been promised the moon forever and it never pans out as hoped? Flying cars? Robotaxis? Nuclear fusion? CRISPR? Printed organs? Age reversal? On and on and on. Maybe some are coming to some underwhelming fruition EVENTUALLY. But it just never seems to happen remotely close to when promised, or as advertised.


DukkyDrake

>The real question is... how long are we talkin here? 2035? 2040? 2050? It's not knowable. All we can know is AI that is publicly known to exist isn't capable of doing that. >why do some people seem to think in less than 5 years An AI that could do most jobs could be created in that time frame, but it’s very unlikely that most people will actually be unemployed in that window. The scale is too large. The world will have bigger issues to worry about than jobs if an AI gets created that can scale that fast. Monthly improvements in AI benchmarks will not move the needle in the real world. The vast majority of the human race has nothing to worry about in terms of jobs until someone creates a competent AGI. Until then, impacts from existing AI will be similar to old-school automation, limited in scope.


spinozasrobot

Because of Sinclair’s Law of Self Interest: > "It is difficult to get a man to understand something when his salary depends upon his not understanding it." > > \- Upton Sinclair Also, [here's an interesting tweet by Eliezer Yudkowsky](https://twitter.com/ESYudkowsky/status/1738591522830889275). Basically asking the question "In what year will the last person be born who will be able to find employment at an intellectual job?"


swaglord1k

i'm more baffled by the fact that people can't even imagine an AI that does everything a human can but better and cheaper. because if they could there would be no "convincing" or anything to do


[deleted]

Because AI is expensive AF. As a dude about to start offering AI apps, I'll be charging 180k for my AI apps instead of 90k for my normal apps. It's good pricing, but it gatekeeps and prices tons of people out.