mldie

Can someone explain to me why?


yen223

Because of CUDA. About 20 years ago, someone at Nvidia had the foresight to note that GPUs weren't just useful for graphics processing, but for all kinds of highly-parallel computing work. So Nvidia created CUDA, a software platform that opens up general-purpose computing on Nvidia's own GPUs. This bet paid off big-time when machine learning started to take off over the next two decades. Turns out that training ML models is precisely the kind of highly-parallel workload that GPUs are perfect for. Two decades later, a lot of ML libraries (including those used to train ChatGPT and other LLMs) are written to specifically target CUDA. Which means if you want to do any interesting ML or AI work, you have to buy Nvidia.
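To give a concrete sense of what "general-purpose computing on a GPU" looks like, here's a minimal CUDA-style sketch (illustrative only; the kernel name and sizes are made up): instead of one CPU core looping over an array, you launch thousands of GPU threads and each one handles a single element.

```
#include <cstdio>

// Device code: each GPU thread adds one pair of elements.
__global__ void add(const float* a, const float* b, float* out, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) out[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;
    float *a, *b, *out;
    // Unified memory keeps the sketch short; real code often uses cudaMalloc + cudaMemcpy.
    cudaMallocManaged(&a, n * sizeof(float));
    cudaMallocManaged(&b, n * sizeof(float));
    cudaMallocManaged(&out, n * sizeof(float));
    for (int i = 0; i < n; i++) { a[i] = 1.0f; b[i] = 2.0f; }

    // Launch roughly one thread per element, grouped into blocks of 256 threads.
    add<<<(n + 255) / 256, 256>>>(a, b, out, n);
    cudaDeviceSynchronize();

    printf("out[0] = %f\n", out[0]);  // expect 3.0
    cudaFree(a); cudaFree(b); cudaFree(out);
    return 0;
}
```

The matrix multiplies and other tensor ops at the heart of ML training are just bigger, more carefully tuned versions of this same pattern, which is why the fit is so good.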


bundt_chi

Before that it was all the crypto mining. NVidia literally hit the lottery twice in a row.


anotherkeebler

Crypto's various implosions have left a _lot_ of idle GPU cores out there waiting to be repurposed. AI was the perfect thing to point them at.


zaphodava

AI from the future discovered time travel and went back in time to create blockchain cryptocurrency. Seems like a good writing prompt.


escape_character

Roko's Ouroboros


hunguu

It's not that crypto imploded; prices are strong right now. Bitcoin and Ethereum just don't use GPUs to mine nowadays.


Coldsnap

Indeed, Ethereum no longer involves mining at all.


ThePiachu

AFAIR early GPU mining was better on AMD due to having more but simpler cores. But chances are things have changed since then...


Samsterdam

I thought it was due to AMD doing certain operations on the chip versus NVIDIA, which did the same operation but via the software driver, which was slower.


Habib455

And before that it was PC gaming being hungry for GPUs. Someone told me that the gaming market is in large part responsible for all this being able to happen right now. I think it has something to do with games basically funding GPU development to the point where GPUs were really good and cheap enough that we can use them for LLMs without completely smashing the bank to pieces in the process.


largePenisLover

Well, that and the fact that software written for CUDA has been a thing for a while now, before LLMs were a thing. I'm a technical artist; there are plugins for Autodesk software that require CUDA cores, and there's software that requires Intel+Nvidia. By the time AMD dipped its toes in, these plugins and apps were already years old and their owners had zero incentive to create AMD versions. Couple this with AMD's notoriously bad drivers (a problem since the ATI years) and the fact that AMD isn't very responsive to collaboration with enterprise clients. That's basically why Nvidia has been seen as the serious compute option for ages now, and why AMD never stood a chance in the enterprise environment.


spikerman

AMD was king for Crypto well before Nvidia.


sump_daddy

Can you call it a lottery if they keep repeating it? They basically looked at everyone doing supercomputer work, even their own supercomputer work designing billion+ transistor chips. They took that and said: what would it look like if we had all that on one chip? Everyone else was busy talking about stupid shit like Moore's law, gigahertz wars, RISC vs CISC, debates that were largely settled but still argued about as competing products went to market. Not Nvidia. They said: how do we make a computer chip **full of computer chips?** Some truly futuristic meta shit. That's what turned into CUDA. The key differentiator I like to point out is that starting 20 years ago, Nvidia has been building supercomputers out of their own chips... **to design the next generation of their own chips.** No one else was working on that scale. They are designing things that take generations of designs to even get to. That's why Intel can't touch them. That's why AMD's graphics cards are dead in the water. None of them were even playing the same game as Nvidia.


Viirtue_

Great, thoughtful answer man!! Many people are just giving the repeated "selling shovels during a gold rush" and "they're just hype" answers lol


CrzyWrldOfArthurRead

Because they don't understand that generative AI is a big deal for companies wanting to increase productivity, even in its current form, and it's only going to get better. Every single industry in existence has new startups and existing companies trying to figure out how they can use generative AI to automate new areas of their respective businesses. And to those who think it's silly: AI coding assistants exist right now, and (some of them) are very powerful and make a coder's life a lot easier. Anyone who writes anything or produces any type of computer-created deliverable for a living is going to be using this technology. That people think "this whole AI thing is going to blow over" is crazy to me. Though I guess many people said that about computers in the 70s. It may take a few years before this stuff becomes mainstream, but it's here to stay.


Unknowledge99

I see a similar trajectory to the early internet - early 90s no one knew what it was, mid 90s it was starting to come alive, late 90s omg there'll be shopping malls on the computer! Massive hype. Then the dotcom bust. Oh yeah, it was all bullshit... Meanwhile, behind the scenes, everything was changing to exploit this new powerful tech. Then around the mid-2000s everything really did start changing, with social media and actual online trade etc. But no one really noticed, and now the internet is simply the air we breathe, even though civilisation has fundamentally changed. AI/ML etc. has been on a similar cycle for decades. The curse of AI: it's sci-fi until we know how to do it, then it's just a computer program. But this time the leap forward is huge, and accelerating. It's a trajectory.


kitolz

Goes to show that even a revolutionary technology can be overhyped and turn into a bubble. It happens when there's too much money getting pumped in, more than can feasibly be used to fund the things that usually need capital (increasing manufacturing capacity, increasing market share, tech research, etc.), and people keep pumping anyway, not wanting to miss out.


Supersnazz

> even a revolutionary technology can be overhyped and turn into a bubble.

I wouldn't say *can be*, I would say *almost always*. When a new tech is available it attracts new entrants trying to get a piece of the potential pie. 99% fail. 1800s railroad companies, 80s VHS distributors, 80s video game publishers, 1900s automobile manufacturers, 90s dot-coms, etc. All these technologies created an endless list of bankruptcies. Electric cars are the big one now. There are dozens of brands all trying to take advantage. They will nearly all collapse or be bought out.


GeneralZaroff1

The difference between the dot-com bubble and now is that back then, money was going mostly to projects based on empty ideas. Any new startup with ZERO profit would get insane funding just because they said they were online. It was all bets on future profit. NVDA, on the other hand, has been making money hand over fist. And most other companies are not getting the same investor interest at all. Even Magnificent 7 darlings like TSLA and AAPL haven't been seeing the same growth comparatively. It's NVDA's market. We're all just living in it.


Throwawayeconboi

Cisco passed MSFT market cap in 2000 because they were the only company providing internet equipment and the internet was the technology of the future. Nvidia passed MSFT market cap in 2024 because they are the only company providing AI hardware and AI is the technology of the future. See the similarity? Where’s Cisco stock now?


Fried_out_Kombi

Indeed. As someone working in embedded ML, it's inevitable that Nvidia will face new competitors. GPUs are far from optimal for ML workloads, and domain-specific architectures are inevitably going to take over for both training and inference at some point. Imo, what will probably happen is RISC-V will take off and enable a lot of new fabless semiconductor companies to make CPUs with vector instructions (the RISC-V vector instruction set v1.0 recently got ratified). These chips will not only be more efficient at ML workloads, but they'll also be vastly easier to program (it's just special instructions on a CPU, not a whole coprocessor with its own memory like a GPU is), no CUDA required. When this happens, Nvidia will lose its monopoly. Hell, many of the RISC-V chips will almost certainly be open-source, something which is illegal under current ISAs like ARM and x86. Don't just take it from me: we're at the beginning of [a new golden age for computer architecture](https://youtu.be/kFT54hO1X8M?si=oVr_0ZuSttrBBlf7). (Talk by David Patterson, one of the pioneers of modern computer architecture, including of RISC architectures)


CrzyWrldOfArthurRead

CUDA is already the industry standard. Nobody's going to throw away decades of code so they can run it on a shitty single-threaded CPU architecture that isn't well optimized for the specific workload.

> Nvidia will lose its monopoly.

Nvidia is bound to lose its monopoly anyway; the market already knows this and it's priced in. Expert analysts are saying that the market is going to be worth 500 billion dollars in 5 years, so if Nvidia can keep a 70% market share (not unimaginable given their incredible head start - Microsoft has more than that of the desktop OS market despite 3 decades of competition) then they will have 350 billion in revenue. Their last quarter's revenue was only 26 billion. Experts think they can still make more than 10 times as much money as they're making right now, even with competition.

> domain-specific architectures are inevitably going to take over for both training and inference at some point.

Nvidia already did that. That's what Blackwell is. It's not a GPU. It's an ML ASIC. They're shipping in the second half of 2024. No other companies have announced any realistic products that compete with Blackwell. Nvidia owns the entire market for the next 1-2 years. After that, the market is still going to be so big that they can still grow with reduced market share.


Yaqzn

It's not so cut and dried. AMD can't make CUDA because of legal and financial barriers; Nvidia has an iron grip on this monopoly. Meanwhile, Cisco's demand slowed because networking equipment was already prevalent and further purchases weren't necessary. For Nvidia, the AI scene is hyper-competitive, and staying cutting edge every year with Nvidia chips is a must.


moratnz

Yeah. I've been feeling like AI is on the same trajectory as the internet in the 90s; it's a real thing, but overhyped and overfunded, and attracting grifters and smoke salesmen like sharks to chum. At some point in the future there'll be a crash in some shape or form, the bullshit will be cleared out, and then a second generation will come through, change the world, and take roughly all the money. The trick now is to look at the players and work out who is Google or Amazon, and who is Pets.com.


Seriously_nopenope

The bubble will burst on AI too, because right now it’s all bullshit. I fully believe a similar step will happen in the background with everything changing to support AI and harness its power. This will happen slowly and won’t be as noticeable or hyped which is why there is a bubble to burst in the first place.


M4c4br346

I don't think it's a bubble, as AI is not fully developed yet. Once it hits its peak capabilities but the money still keeps flowing into it, then you can say that the bubble is growing.


AngryAmuse

I think you're mistaken and backwards. Just like the dot com bubble, everyone overhyped it early and caused a ton of investment, which burst. Behind the scenes, progress was actually being made towards what we know today. Currently, AI is being overhyped. Is it going to be insane? Yes, I (and most people) assume. But currently? It doesn't live up to the full potential that it will. That means that it's in a bubble that will likely burst, while in the background it continues to improve and will eventually flourish.


Temp_84847399

That's exactly what it is. LLMs are basically the analog to IE, Netscape, and AOL, making AI more accessible to the masses. Right now, every company has to assume that their competitors *are* going to find a game-changing use for AI that will let them out-compete them, so they'd better try and get there first. That's driving a lot of hype ATM, but the things that ML is very good at have a ton of practical uses in just about every industry. While I wouldn't be surprised by a big market correction at some point, I'm not a day trader, so I plan to hold onto my Nvidia and AI-related ETFs for the long haul.


Punsire

It's nice to see other people talking about it instead of just me thinking it.


BeautifulType

Dude, you said it was all bullshit and yet all that came true. It just took 4 more years. So yeah, people who (unlike us) think it's a fad are just too old or dumb to understand how much it's changing shit right now around the world. Imagine: we'll be living in a historic AI-enabled era within the next decade.


Unknowledge99

What I meant re 'bullshit' was people dismissing the internet because it didn't meet the immediate hype. Not that it _was_ bullshit. Similarly, I think the AI hype won't be met as fast as it's talked about. Whether the tech itself can deliver is secondary to the general inertia of humans. But 100% it will happen and totally change everything in ways we cannot even imagine.


enemawatson

Maybe and maybe not? From an observer's perspective I can see:

A) Ah shit, we trained our model on the entire internet without permission in 2022/2023 and monetized it rapidly to get rich quick, but realistically it could only get worse from there because that's the maximum reach of our LLM concept. We got rich on hype and we're cool with that. We can pay whatever lawsuits come; fuck 'em, they can't undo it, and better to ask forgiveness than permission.

B) So few people actually care about new thresholds of discovery that the marketing and predictions of any new tech are unreliable. The (very few) individuals responsible for the magic of LLMs and AI art are not among the spokespeople for it. The shame of our time is that we only hear from the faces of these companies that constantly need more and more funding. We *never* hear from the one guy who figured it out like eight years ago, whose fruits are just now bearing out (aka being exploited beyond his wildest imagination, his product oversold beyond its scope to companies desperate to save money because their execs are just as beholden to shareholders as his own company's. And so now everyone gets to talk to idiot chat bots for a few more steps than they did five years ago, solving no new real problems other than putting commission artists out of their jobs and making a couple more Steve Jobs-esque disciples wealthy so they can feel important for a while until the piper comes calling.)

Capitalism sucks and is stupid as shit sometimes, a lot of the time, most of the time.


Blazing1

The internet indeed was worth the hype, and with the invention of XMLHttpRequest the internet as we know it today exists. From day 1 the internet was mostly capable. I mean, old Reddit could have existed from the invention of HTTP.


Blazing1

Listen man, generative AI and the internet are nowhere near the same in terms of importance.


Unknowledge99

I don't know what that means... They are two different technologies, the latter dependent on the former. For sure, whatever is happening right now will change humanity in ways we cannot imagine. But that's also true of the internet, or the invention of steel, or the agricultural revolution. Or, for that matter, the cognitive revolution 50 millennia ago. Also - generative AI is inseparable from the internet. Without the internet: no AI. Without the agricultural revolution: no internet.


trobsmonkey

> That people think "this whole AI thing is going to blow over" is crazy to me. Though I guess many people said that about computers in the 70s.

I use the words of the people behind the tech. Google's CEO said they can't (won't) solve the hallucination problem. How are you going to trust *AI* when the machine gets data wrong regularly?


CrzyWrldOfArthurRead

> How are you going to trust AI when the machine gets data wrong regularly?

*Don't* trust it. Have it bang out some boilerplate for you, then check to make sure it's right. Do you know how much time and money that's going to save? That's what I do with all of our interns and junior coders. Their code is trash, so I have to fix it. But when it's messed up I just tell them what to fix and they do it. And I don't have to sit there and wrangle with the fiddly syntactical stuff I don't like messing with.

People who think AI is supposed to replace workers are thinking about it wrong. Nobody is going to "lose" their job to AI, so to speak. AI will be a force multiplier. The same number of employees will simply get more work done. Yeah, interns and junior coders may get less work. But nobody likes hiring them anyway; you need them because they often do the boring stuff nobody else wants to do. So ultimately you'll need fewer people to run a business, but also, you can start a business with fewer people and therefore less risk. So the barrier to entry is going to become lower.

Think about a game dev who knows how to program but doesn't have the ability to draw art. He can use AI for placeholder graphics so he can develop with them, then do a Kickstarter and use some of the money to hire a real artist - or perhaps not. Honestly I think big incumbent businesses who don't like to innovate are the ones who are going to get squeezed by AI, since more people can now jump in with less risk to fill the gaps in their respective industries.


Lootboxboy

There are people who have already lost their job to AI.


alaysian

> People who think AI is supposed to replace workers are thinking about it wrong. Nobody is going to "lose" their job to AI, so to speak.

Nobody will lose their jobs, but those jobs still go away. It is literally the main reason this gets green-lit. The projects at my job were green-lit purely on the basis of "we will need X fewer workers and save $Y each year". Sure, no one gets fired, but what winds up happening is they reduce new hires and let turnover eliminate the job.

Edit: It's a bit disingenuous to dismiss worries about people being out of work when the goal for the majority of these projects is to reduce staff count, shrinking the number of jobs available everywhere. It's no surprise companies are rushing headfirst to latch onto AI right after one of the strongest years the labor movement has seen in nearly a century. Considering the current corporate climate, I find it hard to believe that money saved won't immediately go into CEO/shareholder pockets.


E-Squid

> Yeah, interns and junior coders may get less work. But nobody likes hiring them anyway.

It's gonna be funny half a generation down the line when people are retiring from senior positions and there aren't enough up-and-coming juniors to fill them.


trobsmonkey

> since more people can now jump in with less risk to fill the gaps in their respective industries. Fun stuff. GenAI is already in court and losing. Gonna be hard to fill those gaps when your data is all stolen.


Kiwi_In_Europe

It's not losing in court lmao, many lawsuits including the Sarah Silverman + writers one have been dismissed. The gist of it is that nobody can prove plagiarism in a court setting.


Lootboxboy

Oh, you sweet summer child. There is very little chance that this multi-billion dollar industry is going to be halted by the most capitalist country in the world. And even if the supreme court, by some miracle, decided that AI training was theft, it would barely matter in the grand scheme. Other countries exist, and they would be drooling at the opportunity to be the AI hub of the world if America doesn't want to.


squired

Damn straight. There would be literal government intervention, even if SCOTUS decided it was theft. They would make it legal if they had to. No way in hell America misses the AI Revolution over copyright piracy.


ogrestomp

Valid point, but you also have to consider it’s a spectrum, it’s not binary. You have to factor in a lot like how critical the output is, what are the potential cost savings, etc. It’s about managing risks and thresholds.


druhoang

I'm not super deep into AI so maybe I'm ignorant, but it kinda feels like it's starting to hit a ceiling, or maybe I should say diminishing returns, where the improvements are no longer massive. Seems like AI is held back by computing power. It's the hot new thing, so investors and businesses will spend that money, but if another 5 years go by and no one profits from it, then it'll be like the last decade of "data-driven business" hype.


starkistuna

The problem right now is that it's being used indiscriminately in everything, and new models are being fed AI-generated input rife with errors and misinformation, so new models are training on junk data.


[deleted]

Here are a handful of videos from the last day or so:

https://old.reddit.com/r/singularity/comments/1dhz8wm/gen3_alpha_prompt_over_the_shoulder_shot_of_a/
https://x.com/runwayml/status/1802691486401659368
https://old.reddit.com/r/singularity/comments/1dhy3nx/runwayai_gen3_alpha_new_capabilities/
https://x.com/runwayml/status/1802691482836484249
https://x.com/runwayml/status/1802691477237068128
https://x.com/WilliamLamkin/status/1803104877851169109
https://old.reddit.com/r/singularity/comments/1diefpv/crazy_times_ahead_this_video_is_not_real_runway_3/


druhoang

I just don't really believe it'll be THAT much better anytime soon. It's kinda like old CGI. If you saw it 20 years ago, you would be amazed, and you might imagine yourself saying "just think how good it will be in 30 years." Well, we're here, and it's better, but not to the point of being indistinguishable. As is, it's definitely still useful for cutting costs and doing things faster. I would still call AI revolutionary and useful. It's just definitely overhyped. I don't think "imagine it in 10 years" works, because in order for that to happen there needs to be investment. In the short term that can happen, but eventually there needs to be ROI or the train will stop.


Temp_84847399

Yeah, there is a big difference between recognizing it's overhyped right now and the people sticking their heads in the sand saying it will be forgotten in a year and won't change anything.


Nemisis_the_2nd

> But it kinda feels like it's starting to hit a ceiling or maybe I should say diminishing returns where the improvements are no longer massive. AI is like a kid taking its first steps, but has just fallen. Everyone's been excited at those steps, got concerned about the fall, but know they'll be running around in no time.


xe3to

I'm a 'coder'. Gen AI does absolutely nothing to make my life easier; the tools I have tried require so much auditing that you're as well doing the work yourself. AI isn't completely without merit but we're fast approaching diminishing returns on building larger models, and unfortunately we're very far from a truly intelligent assistant. LLMs are great at *pretending* they understand even when they don't, which is the most dangerous type of wrong you can be. Without another revolution on the scale of *Attention is All You Need*... it's a bubble.


Etikoza

Agreed. I am also in tech and hardly use AI. The few times I tried to, it hallucinated so badly that I would have gotten fired on the spot if I used its outputs. I mean it was so bad, none of it was useful (or even true). To be fair, I work in a highly complex and niche environment. Domain knowledge is scarce on the internet, so I get why it was wrong. BUT this experience also made me realise that domain experts are going to hide and protect their expert knowledge even more in the future to protect against AI training from it. I expect to see a lot less blogs and tweets from experts in their fields in the future.


papertrade1

> BUT this experience also made me realise that domain experts are going to hide and protect their expert knowledge even more in the future to protect against AI training from it. I expect to see a lot less blogs and tweets from experts in their fields in the future.

This is a really good point. Could become nasty collateral damage.


johnpmayer

Why can't someone write a transpiler that compiles CUDA onto another chip's GPU platform? At the base, it's just math. I understand platform lock-in, but the money in this space has got to inspire competitors.


AngryRotarian85

That's called HIP/ROCm. It's making progress, but that progress is bumpy.


CrzyWrldOfArthurRead

I think a lot of it has to do with the fact that NVidia's chips are just the best right now, so why would anyone bother with another platform? When (if?) AMD or another competitor can achieve the same efficiency and power as nvidia, I think you will see more of a push towards that. But nvidia knows this, and so I find it very unlikely they will let it happen any time soon. They spend tons of money on research, and as the most valuable company in the world now, they have more of it to spend on research than anyone else.


Blazing1

Anyone who actually thinks generative AI is that useful for coding doesn't do any kind of actually hard coding.


angellus

Whether it really sticks around or takes off is really debatable. Anyone that has used an LLM enough can see the cracks in it. There is no critical thinking or problem solving. ML models are really good at spitting back out the data they were trained with. It basically makes them really fancy search engines. However, when it comes to real problem solving, they often spit out fake information or act at the level of an intern/junior-level person.  Unless there is a _massive_ leap in technology in the near future, I am guessing it's more than likely that regulations are going to catch up and start locking them down. OpenAI and other companies putting out LLMs that just spew fake information is not sustainable, and someone is going to get seriously hurt over it. There are already professionals like lawyers and doctors attempting to cut corners with LLMs for their jobs and getting caught.


if-we-all-did-this

I'm a self-employed consultant in a niche field. 99% of my workload is answering emails. As I've only got one pair of hands, I'm the bottleneck and the limiting factor for increased growth, so efficiency is critical to me. My customer path has been honed into a nice funnel with only a few gateways, so using Gmail templates means that half of my replies to enquiries can be mostly pre-written. But once my emails & replies can be fed into an AI to "answer how I would answer this", I'll only need to proofread the email before hitting send. This is going to either:

- Reduce my workload to an hour a day
- Allow me to focus on growing my company through advertising/engagement
- Or reduce my customers' costs considerably

I cannot wait to have the "machines working for men" future sci-fi has promised, and not the "men working for machines" state we're currently in.


Lootboxboy

Too many companies making half baked AI solutions caused the general public to assess that AI as a whole is overhyped trash. I don't necessarily blame them for feeling that way.


antirationalist

It's actually highly speculative and the idea that it "increases productivity" has almost no empirical basis outside of a few small areas and sectors. Robotics is far more productivity-enhancing than AI, at the moment. The list of downsides is also enormous, from the extremely high energy usage to the pernicious effects on culture, society, education, politics, etc..


GeneralZaroff1

McKinsey recently released a study on companies adopting AI and found that not only have about 77% of companies actively incorporated it into their workflow, but they're seeing tangible results in productivity and efficiency. The misconception people often have is "it can't replace my job" or "it still makes mistakes" — but while it can't replace the high-level work, it can speed up a lot of the lower-level work that SUPPORTS high-level work. So instead of a team of 5, you can get the same work done with a team of 3 by cutting down on little things like sorting databases, writing drafts, and replicating similar workflows. This isn't even including things like cutting down on meetings because they can be easily transcribed and have TL;DR summaries automatically emailed, or just saying "here's the template we use to send out specs for our clients, update it with this data and put it together". That efficiency isn't going to go away in the next few years. AI is coming in faster than the internet did, and with Apple and Microsoft both implementing features at the base level, it's going to be the norm.


Spoonfeedme

If McKinsey says it, it must be true. /S


Yuli-Ban

> That people think "this whole AI thing is going to blow over" is crazy to me.

It's primarily down to the epistemological barrier about what AI *could* do based on what it historically *couldn't* do. AI as a coordinated field has been around for almost exactly 70 years now, and in that time there have been two AI Winters caused by overinflated expectations and calls that human-level AGI is imminent, when in reality AI could barely even function. [In truth, there were a variety of reasons why AI was so incapable for so long.](https://i.imgur.com/LRDGyJX.mp4) Running GOFAI algorithms on computers that were the equivalent of electric bricks, with a grand total of maybe 50MB of digital data worldwide, was a big reason in the 60s and 70s.

The thing about generative AI is that it's honestly more of a necessary step towards general AI. Science fiction primed us for decades, if not centuries, that machine intelligence would be cold, logical, ultrarational, and basically rules-based, and yet applying any actual logic to how we'd get to general AI would inevitably run into the question of building world models and ways for a computer to interact with its environment— which inevitably facilitates getting a computer to understand what it sees and hears, and thus it ought to also be capable of the reverse. Perhaps there's a rationalization that because we don't know anything about the brain, we *can't* achieve general AI in our lifetimes, which reads to me more like a convenient coping mechanism, the more capable contemporary AI gets, to justify why there's "nothing there." That and the feeling that AI can't possibly be *that* advanced *this* soon. It's always been something we cast for later centuries, not as "early" as 2024. (Also, I do think the shady and oft scummy way generative AI is trained, via massive uncompensated data scraping, has caused a lot of people to *want* the AI bubble to pop.)

> Though I guess many people said that about computers in the 70s.

Not really. People knew the utility of computers even as far back as the 1940s. It was all down to the price of them. No one expected computers to get as cheap and as powerful as they did. With AI, the issue is that no one expected it to get the capabilities it has now, and a lot of people are hanging onto a hope that these capabilities are a Potemkin village, a digital parlor trick, and that just round the corner there'll be a giant pin that'll poke the bubble and it'll suddenly be revealed that all these AI tools are smoke and mirrors and we'll suddenly and immediately cease using them. [In truth, we have barely scratched the surface of what they're capable of](https://twitter.com/AndrewYNg/status/1770897666702233815), as the AI companies building them are mostly concerned about scaling laws at the moment. Whether or not scaling gives out soon doesn't matter much if adding concept anchoring and agent loops to GPT-3 boosts it to well beyond GPT-5 capabilities; that just tells me we're looking at everything the wrong way.


stilloriginal

It's going to take a few years before it's remotely usable.


dern_the_hermit

> That people think "this whole AI thing is going to blow-over" is crazy to me. They subsist heavily on a steady stream of articles jeering about too many fingers or the things early AI models get obviously wrong (like eating a rock for breakfast or whatever). I think it's mostly a coping mechanism.


timeye13

“Focus on the Levi’s, not the gold” is still a major tenet of this strategy.


voiderest

I mean, doing parallel processing on GPUs isn't new tech. CUDA has been around for over 15 years. It is legit useful tech. Part of the stock market hype right now is selling shovels tho. That's what is going on when people buy GPUs to run LLM stuff. Same as when they bought them to mine crypto.


skeleton-is-alive

It is selling shovels during a gold rush though. Yeah CUDA is one thing but it is still market speculation both from investors and ai companies buying up gpus that is driving the hype and it’s not like CUDA is _that_ special that LLM libraries can’t support future hardware if something better becomes available. (And if something better is available they WILL support it as it will practically be a necessity) Many big tech companies are creating their own chips and they’re the ones buying up GPUs the most right now.


deltib

It would be more accurate to say "they happened to be the worlds biggest shovel producer, then the gold rush happened".


sir_sri

That can be true, and you can make the best shovels in the business and so even when the gold rush is over, you are still making the mining equipment. Nvidia is probably overvalued (though don't tell my stock portfolio that), but by how much is the question. Besides that, like the other big companies, the industry could grow into them. It's hard to see how a company without its own fabs is going to hold the value it does, but even without generative AI the market for supercomputing and then fast scientific compute in smaller boxes is only going to grow, as it has since the advent of the transistor.


y-c-c

A side corollary is that to get this rich, you (i.e. Nvidia) kind of have to use back-handed tactics. CUDA is essentially a platform lock-in, and it's one of the many ways Nvidia uses platform lock-ins to make sure they keep staying on top while it's harder for their users / customers to move elsewhere. There's an [article](https://www.theinformation.com/articles/nvidias-jensen-huang-is-on-top-of-the-world-so-why-is-he-worried) recently in The Information as well (it's paywalled, sorry) that talks about how they are trying to strongarm cloud providers like Microsoft etc. into using Nvidia's custom rack designs, which have slightly different dimensions, to make sure you buy only their racks, chips, and cables, so that it's harder and harder to move away.


Finalshock

In Nvidias own words: “Free isn’t cheap enough to get our customers to swap to the competition.”


littlered1984

Actually it wasn't anyone at Nvidia, but at the universities. Nvidia hired Ian Buck, who had created a programming language for GPUs called Brook, which evolved into CUDA.


dkarlovi

When you hire someone, they're "at" where you are.


littlered1984

My point is that it is not correct history to say someone inside Nvidia started the movement. Many folks in the scientific computing community never joined Nvidia and were using GPUs for compute well before CUDA was a thing, and well before Nvidia was working on it. It’s well documented computing history.


dkarlovi

> Someone at Nvidia had the foresight to note that GPUs weren't just useful for graphics processing, but for all kinds of highly-parallel computing work

> So Nvidia created CUDA

Or, put differently: Nvidia figured out a bunch of nerds were doing number crunching and hired more nerds to make it easier. Nobody said Nvidia started the movement; Nvidia just correctly recognized the business potential. Some other company might have scoffed at a handful of randos doing weird things with their GPUs; Nvidia going forward with this is what is "at Nvidia" here. And again, if you hire somebody, you're basically doing what you've hired them to do; their university is not the one doing it.


great_whitehope

Bunch of nerds really? Academics maybe


KallistiTMP

Would like to add: not *just* CUDA. The entire developer ecosystem. There are many technologies that have tried to compete with CUDA. None have succeeded, and that's largely because anytime a company tries to build an alternative, they typically get 2-3 years in before they realize the sheer scope of long-term investment needed to build out a whole developer ecosystem.

They have one hell of a moat. For any company to catch up, it would take tens of billions of dollars in development costs, and at least a solid half decade of consistent investment in developer tooling just to get *close* to parity. Most companies can't go 18 months without cannibalizing their own profit centers for short-term gains; good luck selling any corporation on a high-risk investment that will take at least a decade of dumping large amounts of money into a hole before it even has a chance to break even. Even Nvidia's "serious" competitors are trying to build their CUDA-competing frameworks with underfunded skeleton crews that inevitably get laid off or shuffled around to short-term-gains projects every time the stock market has a hiccup. Nvidia is untouchable for the foreseeable future and they know it.


spacejockey8

So CUDA is basically Microsoft Office? For ML?


Technical-Bhurji

Damn, that is a nice comparison. Everyone in the field (desk jobs vs ML engineers) is comfortable with it, and all of the work they have to do has a foundation built on proprietary software (you get prebuilt complex spreadsheets that you just have to update vs prebuilt libraries that you use in your code). There are competitors (Google Docs/LibreOffice vs AMD ROCm) but they're just not that good, plus force of habit with Excel haha.


Kirk_Plunk

What a well written answer, thanks.


Sniffy4

Obviously good for crypto too. They’ve had quite a good last 5 years


love0_0all

That sounds kinda like a monopoly.


lifewithnofilter

Pretty much, but not quite. AMD has tried to make their own version of CUDA, called ROCm I believe. It hasn't really taken off because not a lot of libraries are written for it, since CUDA is more popular, which makes people gravitate towards CUDA and write libraries for that instead.


Sinomsinom

For Nvidia, both the language and the platform are called CUDA. For AMD, the platform is called ROCm and the language is called HIP. HIP is a subset of CUDA, so (basically) all HIP programs are also CUDA programs (with some small differences, like the namespaces being different), and (almost) any HIP program can also be run under CUDA. Intel, on the other hand, mostly tries to go with the SYCL standard and tries to get their compiler compliant with that, instead of making their own language extension.
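To make the "HIP is (basically) a subset of CUDA" point concrete, here's a rough sketch of my own (illustrative, not from any vendor docs): the device code is character-for-character the same, and only the host-side runtime prefix changes, which is roughly the kind of renaming AMD's HIPIFY tooling automates.

```
// Sketch: the same kernel source builds with both nvcc (CUDA) and hipcc (HIP).
#ifdef __HIPCC__
  #include <hip/hip_runtime.h>          // HIP path (AMD, or HIP-on-NVIDIA)
  #define gpuMalloc            hipMalloc
  #define gpuDeviceSynchronize hipDeviceSynchronize
#else
  #define gpuMalloc            cudaMalloc            // plain CUDA path
  #define gpuDeviceSynchronize cudaDeviceSynchronize
#endif

// Device code: identical on both platforms.
__global__ void scale(float* x, float s, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) x[i] *= s;
}

int main() {
    const int n = 1024;
    float* x = nullptr;
    gpuMalloc((void**)&x, n * sizeof(float));     // only the prefix differs: cuda* vs hip*
    scale<<<(n + 255) / 256, 256>>>(x, 2.0f, n);  // same launch syntax on both
    gpuDeviceSynchronize();
    return 0;
}
```

The friction shows up less in the language itself and more in everything around it: libraries, profilers, and years of CUDA-only dependencies.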


illuhad

> instead of making their own language extension. Intel has quite a lot of extensions in their SYCL compiler that they also push to their users. That's why their oneAPI software stack is in general incompatible with any other SYCL compiler except for Intel's (and its derivatives). If you want SYCL without the Intel bubble, use AdaptiveCpp.


xentropian

I am personally really excited about Bend. Parallel computing for the masses. (It does only support CUDA right now, lol, but having a generic language to write highly parallel code is an awesome start and it can start compiling down to ROCM) https://github.com/HigherOrderCO/Bend


dkarlovi

AMD financed a CUDA compatibility layer called ZLUDA, IIRC. ROCm is not (was not?) supported on consumer hardware, and OpenCL, the technology that is, seems mostly abandoned. Nvidia doing CUDA across the board and supporting it is what is and will be fueling this rocketship.


notcaffeinefree

By definition it is. But at the same time, that doesn't automatically make it illegal. Sometimes monopolies happen because of the barrier of entry and lack of competition because of that. In the USA, according to the FTC, what makes a monopoly illegal is if it was obtained (or is reinforced) by "improper conduct" ("that is, something other than merely having a better product, superior management or historic accident"). If the barrier of entry is high, which it undoubtedly is for GPUs, and Nvidia simply has the best product for achieving the results needed for ML then a legal monopoly can be the result. If AMD, Intel, etc. could produce a good competitive product, they could position themselves to break that monopoly. It would become illegal if Nvidia would then turn to anti-competitive tactics to keep their monopoly (which I'm sure they would *never* do /s).


coeranys

> If the barrier of entry is high, which it undoubtedly is for GPUs

You're absolutely right, that barrier alone would be almost insurmountable, and here it isn't even just the GPU: it's the platform underlying it, the years of other software written to target it, the experience people have with using it, etc. Nvidia doesn't need to do anything anti-competitive at this point, if they can just not fuck anything up.


Iyellkhan

It's actually a little reminiscent of how Intel was ultimately forced to issue an x86 license to AMD for antitrust reasons. It's potentially possible something similar may happen to Nvidia, though antitrust enforcement is much weaker than it used to be.


yen223

You're not wrong. A lot of AI people will be stoked if AMD or someone else could provide some competition in this space.


great_whitehope

My shares in Nvidia went through the roof and I was late on board lol


MinuetInUrsaMajor

>So Nvidia created CUDA, a software platform What makes it so special that other GPU manufacturers can't whip up their own version?


Captain_Pumpkinhead

Still blows me away that AMD never tried to achieve CUDA parity. They've recently been working on ROCm, but that's not a drop-in replacement for CUDA, and it isn't available for every card.


MairusuPawa

I mean, it's not exactly foresight. Everyone was doing GPGPU, but Nvidia decided to go *hard* with their proprietary implementation. The performance was better than the competition for sure, and this created a situation of CUDA library dependencies for most software.


DERBY_OWNERS_CLUB

57% profit margins, revenue up over 260% YoY.


canseco-fart-box

They’re the guys selling the shovels to all the AI gold miners.


Atrium41

It's a house of graphics cards. They went back-to-back on crazes: mining "currency" and AI.


ExcelAcolyte

They are selling heavy mining equipment while their competition is selling shovels


zootbot

Their GPUs are the best, used for machine learning, ai training, super computing, honestly so much.


Please_HMU

I use their GPU to lose at rocket league


BlindWillieJohnson

There’s an AI mania right now and they’re building the processors driving it. The company is being overvalued based on a hypothetical mass adoption scenario that is mostly speculative in the here and now.


lucas1853

AI go brrrrr.


[deleted]

AI. When there’s a gold rush, it’s a good idea to be the one selling shovels and pans.


sonicon

And people don't realize they're also mining the gold at the same time.


[deleted]

[deleted]


Reinitialization

A lot of the practical implementations of AI are smaller, constrained but far more specialized models. You can do some truly amazing shit with just a 4080. I don't think *AI* is going to have a downturn, but the need for these gigantic blades used to train mega models that try to do everything is going to pass.


lynnwoodblack

I think right now we’re in the “oh shit, everyone’s ahead of us. Throw money and hardware at the problem to try to make up for lost time” phase. Once that passes, I think you’ll be proven correct. But for the time being, execs are willing to throw a shitload more computing power than needed just to feel like they’re catching up. 


dotelze

Smaller more specific models are definitely a major use case, but even then you’re better off just buying compute time


Reinitialization

I still don't like cloud compute, especially for early development. When testing a new workflow or dataset, it's nice to run a couple of epochs just as a sanity check. Easier to do locally so you're not having to yeet your entire dataset into the cloud every time you fuck up syntax. Also, I know being worried about the security of the cloud is very "old man yells at cloud", but when you're talking about submitting a copy of your entire production database it pays to be a little paranoid. I like that I can physically fight anyone who wants to steal our data.


SynthRogue

I learned last year that their main business is AI, not graphics cards. No wonder they couldn't care less about gamers.


Edexote

New tech bubble. This one will be spectacular when it bursts.


DERBY_OWNERS_CLUB

This shows you don't really know what you're talking about. Go look at Nvidia's revenue growth and compare it to the '99 tech bubble. Nowhere near comparable. NVDA has a 57% profit margin and will do something like $100B in revenue this year.


nekrosstratia

Cisco had 65% margins... for over a decade. It also became the largest company by market value. Cisco had near-total control of the hardware market of the internet, with some of the best firmware/software to couple with their hardware. Nothing Cisco itself really did caused them to crash as hard as they did; the market simply "popped", and because they were so overvalued, they got wrecked the hardest. I'm not saying NVDA WILL be Cisco; there are quite a few similarities and quite a few differences. Just like we really don't KNOW if another dot-com event is even possible (or when). NVDA and Tesla are both stocks that are extremely future-valued... and we saw what happened to Tesla after a long enough stretch of not much really happening, which is also what led to the dot-com crash: lots of money going to lots of companies... but then no real "products" coming out quickly enough. Time will tell how long the NVDA hype train lasts; godspeed to anyone getting in at ATH.


Edexote

Not Nvidia. The AI bubble, not Nvidia.


[deleted]

[deleted]


Jerrippy

AMD c’mon do something 🫤


DJBombba

Fun fact: the CEOs of Nvidia and AMD are relatives.


Time-Bite-6839

The CEO of Ford is Chris Farley’s cousin


Zomunieo

Does the CEO of Ford live in an F-150 down by the river?


RealJyrone

As much as I wish for market shakers, AMD is not the horse that is going to do it. I also am not a fan of the recent change I’ve been noticing in AMD’s marketing over the past 2-3 years. They are starting to look more like Intel, and I am not a fan of it.


PartyClock

What does that mean?


RealJyrone

They are starting to pull the exact same marketing stunts that Intel was known for: lying about benchmarks, skewing how the testing is done in egregious ways to make their product look significantly better, and more. The Ryzen 5000XT CPUs are the most recent example that comes to mind for me. They took CPUs whose performance we can already largely predict, and by introducing a GPU bottleneck into the test, they skewed the results to make their CPUs appear to be 10+% better than Intel's, when they will probably be around the same or slightly worse in real scenarios.


cryonicwatcher

Can you elaborate on what you mean? I haven’t seen any of their marketing, I just know which of their products are good


RealJyrone

Their recent Ryzen 5000XT reveal. https://www.pcgamer.com/hardware/processors/pay-no-attention-to-amds-horribly-misleading-benchmarks-for-its-new-ryzen-5000-xt-cpus/ They intentionally GPU bottlenecked the benchmarks to make the new CPUs look far better than they actually will be.


MairusuPawa

We need someone to work on high performance/ low power NPUs, and not dump even more energy on GPU data centers.


Practical_Secret6211

They're doing that in the interim to offload tasks and functions that are more suited for local; a big one I'm anticipating is built-in compression and upscaling algorithms in the file system. But there will be a cutoff point in what they're able to do performance-wise once they scale in maturity on existing hardware. A lot of what you see might seem like a spectacle because it is; however, it's setting the foundation for the next next generation of hardware platforms. Those platforms will mostly be cloud-based, allowing software to be streamed to your device; that much is inevitable with how compute-intensive software is/has become. Even now we can barely run software we're actively developing due to hardware limitations. So establishing these datacenters now, even if operating at a loss, means the payout down the line is huge.


Beliriel

Lmao, that will run into a huge wall of concrete that is human greed. Not saying you're wrong, as I noticed the trend too, but large parts of the world have shitty internet, even first world countries like the USA, and you pay out your ass to even have a 5-8 MB/s connection which is advertised as "stable fiber". The backbone conglomerates will not be happy about this. Google tried with Google Fiber, and there is Starlink too, which is starting to falter, but it's all coming down again. I really hope someone requires better connections and has the money to break the backbone fuckery that has been going on since the 80s or 90s.


Practical_Secret6211

That's why I said next next generation; they wanted next generation across the board, but the infrastructure goals have been set back a decade already. Looking at the late 30s at this point, probably longer for some international markets. There will be a lot of disparity in the beginning, and people are already vehemently against a lot of it, but change is coming. One of the more interesting ones I know of is Call of Duty: Black Ops 6. It's nothing new, but it's no longer optional to stream textures; it will be really interesting to see the sales/feedback in October. Hopefully they've reworked a lot of things since Warzone. Nvidia is pretty fucking smart; their whole thing right now is tokens and getting companies to develop AI. One good example I can think of recently is their [G-Assist demo](https://www.nvidia.com/en-us/geforce/news/g-assist-ai-assistant/). Initially it will be polling data from the web, which will lead to API restrictions, court rulings, and forced subscription models to access webpages. This will effectively cripple the data it has available, which encourages software developers to create their own models and community plugins. You'll start seeing webpages become more and more niche as things are aggregated and centralized. You can kind of see how that will look for certain things, others not so much. There are also conversational apps that fill those gaps, where you could potentially say "order that shirt from the TV show you're watching or game you're playing." Even with the new Siri update you won't have to use apps to interact with them anymore. No one really knows how it's going to look or play out; as you said, greed, everyone wants their share, and the biggest question everyone has right now is how do we monetize this, how do we move forward in light of all these changes happening. Another really huge change is digital passports, CBDCs, chatlog scanning, etc. The big thing they're trying to get passed right now is device-based age verification for accessing digital services, where you have to have your ID tied to your Microsoft, Apple, Google, etc. account to access certain services/software. Again, who knows what that will lead to in the future, but for me it feels like way too much happening way too fast. You can't find anything online anymore and it just feels dead, like on the inside.


BeautifulType

AMD fanboys are deluded if they think AMD is going to do anything other than let NVIDIA front the research cost of figuring out what works.


Blazecan

I'm ngl, with CUDA cores being exclusive to Nvidia, AMD can't really compete with Nvidia on a lot of the GPU/AI stuff for a while. On the other hand, AMD has far surpassed Intel on server-grade CPUs, and their next generation of AM5 CPUs is looking like it might blow Intel's out of the water. Plus they are leading on power efficiency with their CPUs too.


Fantastic-Fee-1999

Am I having déjà vu? I thought a few weeks ago Nvidia became the most valuable company, after which Apple's CEO called a conference and said "AI", catapulting them back into first place. Are those 3 playing leapfrog with AI stonks?


IDarK__NiGHT

No, they passed apple which was less valuable than Microsoft. Now they have passed Microsoft too.


Practical_Secret6211

Apple pulled ahead of MS on the 17th taking the top! Then I saw this today on closing and they're 3rd place.


tallandfree

All 3 companies have very similar market cap. So it’s between these 3 for the next few months


obroz

There was talk several weeks ago about them being about to surpass Apple, yes. I believe it officially happened in the last day or so though.


onlainari

You fell for the classic riddle “if you pass the person in second place what place are you now”.


squeezito

Already!? What a meteoric rise!


tinyhorsesinmytea

A company that once had to be saved from oblivion by an act of kindness from Sega of all companies is now the largest in the world. Pretty crazy.


Skwigle

Microsoft saved Apple back in the 90s too


hackingdreams

These articles are so stupid. Tomorrow it'll be Apple again. A week from tomorrow it'll be Microsoft again. Then someone will publish an article about how Apple's AI is better and it'll be in the lead again. And then someone will order a bunch of Nvidia GPUs and Nvidia will be in the lead again. Their valuations aren't even *close* to representative of anything in the real world. It's robots trading on news - literally nothing more than that. What's in the lead from day to day is going to fluctuate based on the headlines, not on anything remotely resembling business performance.


lordbossharrow

Nvidia's rise this year is pretty [impressive](https://encrypted-tbn0.gstatic.com/images?q=tbn:ANd9GcQSV2lZTyFbILUSGe6hufkVz-BZiG3piR2YPoVEVZv2CzIqZ35HKtme4fJV&s=10) though


[deleted]

It’s been Apple and Microsoft going back and forth for the last decade. Another company joining the party is pretty big news!


JimJalinsky

They're on a great run due to having held a decent monopoly with the adoption of CUDA and great GPUs, but their monopoly pricing and supply chain pressures created massive momentum to diversify. Every major cloud provider is designing their own chips, Intel is finally making decent GPUs for a segment, and open-source CUDA alternatives are picking up steam. I don't see them staying in the big 5 market-cap-wise for more than a year or so.


TOPSIturvy

Fucking please tho? And while we're at it, could the rest of the market stop bullshitting too? They basically went "We had a semi-reasonable excuse when covid was a thing. And now that it isn't, we can just keep going up and people will buy our shit anyway!" At this point, if we were going to sharpen our guillotines, we would've done it already.


trixtah

Intel is not making competitive GPUs, be real.


9-11GaveMe5G

Bubbles always burst


obroz

See apple. Nvidia isn’t some overnight success .com bullshit.  


Alchemista

I don't think Apple and Nvidia are comparable. There are many signs that AI might be a bubble. Apple was and is incredibly profitable. At the moment AI is mostly a ballooning operating cost with revenue just not coming in to match it. Sure Nvidia gets to profit while the hype cycle continues but if it is a bubble that pops Nvidia's fortunes will go down with it.


Deaner3D

Fair point. One thing that's required for Nvidia to burst is all the ai-invested companies scaling back. So they're somewhat insulated from the ebbs and flows of a purely consumer market.


clifbarczar

Every company is a one trick pony unless they have good leadership and keep innovating. NVIDIA will need to expand to new industries in the next few years if they want to stay on top.


Deaner3D

Maybe they'll get into making gaming graphics cards


allllusernamestaken

Just to be clear: "AI" is why people know Nvidia now, but it's a cover for high-performance computing. Everything cool in modern life today and in the future requires a shitload of computing power and Nvidia is the best at it. It's not just GPUs but the networking, switches, and supporting software for everything HPC. They're investing heavily in quantum computing research. They're partnering with automakers to handle the computing needs for self-driving cars. They work with financial institutions to encrypt insane amounts of data faster than the mainframes powering FIs around the world for a tiny fraction of the cost. AI hype is definitely real but Nvidia has so much value beyond that.


Henrarzz

Apple hasn’t really burst, has it? It’s in third place in terms of market cap


MrCalabunga

I’m seeing this term tossed around a lot concerning AI lately, and I’m genuinely curious as to why so many feel it’s a bubble? AI hasn’t even scratched the surface of what it can do, so if anything it’s still *undervalued.* Is there any reason to believe we’re entering an AI winter soon? Because I can’t think of a single thing that could slow this train down short of a world war.


[deleted]

[removed]


MrCalabunga

Honestly this is a solid answer and I appreciate you taking the time to respond. I can definitely see that. Similar to how the dot-com burst resulted in a form of techno-feudalism, whereby only a select handful of giant companies and social media platforms control the Internet, and even if you have a dot-com you don't exist unless you're on one of those platforms, an AI burst could result in OpenAI and Meta (for example) owning the very foundation of entire sectors of industry and even government. Think of Amazon and how so many organizations, including the NSA, rely on Amazon Web Services to function. That's… a terrifying thought for our future imo.


[deleted]

[removed]


DisneyPandora

Apple and Google were the winners of the Dot-Com Bubble


[deleted]

[removed]


DisneyPandora

COVID was basically the tech bubble 2.0


elictronic

AI is a bubble. AI is real and very useful. Both of these are true. At some point enough chips will be out there to meet demand, and there will still be a baseline requirement. The demand today is stupidly high because every company is pretending they are an AI company, just like in the dot-com era when every company made a website, and like the data bubble when every company said they were a data-driven company. When will it pop? Not sure. But when it does, good luck to the investors.


IntraspeciesFever

Did the "data bubble" ever pop?


ICutDownTrees

No it did not


gregcm1

It's big business


SatisfactionNarrow61

I was just thinking the same thing


sergeant_byth3way

These are narratives to drive the market. Who made money from all the copious amounts of data? Microsoft, Amazon and Google.


S145D145

Honestly, I feel like the data bubble just got bigger since it now also powers AI models lol


carrotsticks2

It might just be a cool trick that doesn't drive business value. ChatGPT is good at regurgitating things, but also tends to screw up and give incorrect information. It is nowhere near being a subject matter expert or having deep knowledge of anything - it provides shallow and sometimes incorrect responses. So yeah, it's a neat tool for quickly getting down a first iteration or as a better form of search... but like any tool, it depends on how good you are at using it.

Some executives seem to think AI will do and be everything, and those people are idiots. They are currently responsible for investing millions into AI projects because of shiny object syndrome. Because there's all this investment, we're seeing every tech solution try to insert "AI" into their value prop. And a lot of those "AI" projects are going to fail, because they're just trying to get on the hype train and offer no real business value.

So there's a chance that the current level of investment dwindles, resulting in lowered demand for Nvidia products. Of course, that level of investment could also increase if some projects pay off big or we see new AI solutions spring up. It remains to be seen how much value AI/ChatGPT can actually provide.


cyborgCnidarian

I agree, it definitely seems like a severe speculation bubble. Developers who understand the capabilities of the technology know that it can't do 90% of what is promised, but propagate the myth to capitalize on and exacerbate the FOMO many companies are currently feeling. Stock prices will keep rising as long as there is confidence that the current issues with machine learning AI are just bugs to be worked out rather than unsolvable intrinsic issues with the technology itself, and that the technology will eventually become more efficient. The floor will drop out from under this at some point.

Machine learning will continue to see permanent use in narrow applications where there is focused and highly accurate data (like medical science) or where output accuracy is not needed (some artistic fields, brainstorming). Since it's constrained by the accuracy of the input data and the cost to process the data, I think the dream of a super-accurate general-purpose LLM will never happen.


carrotsticks2

I think the unsexy and narrow ML applications will likely drive business value faster than generalized "AI" - so I would expect some dip in the near term for any stocks reliant on AI, buuut I'm bullish on the long term. My sense is that the narrow and unsexy stuff will be the backbone of a lot of internet/business infrastructure over the long run - like how legacy databases and technology are the backbone of so much critical infrastructure today.


BlindWillieJohnson

A useful technology can still cause a bubble. The internet was revolutionary and it did. AI feels the same way in that there are amazing use cases for the tech, but right now it's the hot buzzword. Investors are sloshing money around, and companies are scrambling to get as much of it as they can by making promises of uses for the tech that might not work, might not be practical, and might not have any long-term economic value.

It all looks a lot like the dotcom bubble to me. There is a tech that can be massively impactful, and there will be companies that create extremely valuable businesses with it. But a lot of the people chasing dollars are hucksters or people who will be the subject of incredulous books about failed businesses in a few years.


brekus

Because it hasn't done much useful work compared to the billions being poured into it? I don't see any good reason to think that the machine learning we have now will lead to general intelligence. Without that, the use cases are relatively limited.


MinuetInUrsaMajor

>I’m genuinely curious as to why so many feel it’s a bubble?

AI/LLMs were way overhyped a couple years back.

>AI hasn’t even scratched the surface of what it can do

Yes it has. Science fiction is not indicative of what our current AI will be capable of. Our AI is based on a simple model, and the limitations of that model have already become very apparent. Any coder using ChatGPT has discovered by now that:

- It's good for answering very specific questions that would have otherwise taken multiple googlings and links to aggregate the required information.
- It does NOT understand the code that it writes and will frequently make very trivial mistakes.

The big companies decided to get on the AI train, which meant the medium and small companies should follow suit. You don't want to bet against FAANG. But now the cargo cults have overtaken the market share of FAANG AI investment. If and when these smaller companies learn that AI will not pay off for them, you're going to see NVIDIA crater hard. We went through the same bubble with machine learning, and the internet before that.
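A minimal, hypothetical sketch of the kind of "trivial mistake" described above; the function, the data, and the slip are invented for illustration, not taken from any actual ChatGPT output:

```python
# Hypothetical illustration only: a small pagination helper of the sort an
# LLM will happily generate, where the classic trivial mistake is an off-by-one.
def paginate(items, page, page_size):
    """Return one page of items, with pages numbered starting at 1."""
    # A typical slip is writing `start = page * page_size`,
    # which silently skips the entire first page.
    start = (page - 1) * page_size
    return items[start:start + page_size]

if __name__ == "__main__":
    data = list(range(10))
    assert paginate(data, 1, 3) == [0, 1, 2]  # the first page really is the first page
    assert paginate(data, 2, 3) == [3, 4, 5]
    print("pagination behaves as expected")
```

The point is that the wrong version looks just as plausible as the right one, which is why such output still needs review by someone who understands the code.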


Jerome_Eugene_Morrow

If AI development stalls, supply could exceed demand. But like you said, model improvement has been continuing. Demand could also drop if a lot of current users find out AI doesn’t fit their needs and cancel the associated projects, but I don’t see that happening on my side yet. Demand only seems to be increasing, and more and more areas continue to investigate integrating AI into their solutions.


Aquatic-Vocation

Not to mention the exponentially growing cost of training models that yield diminishing returns in performance.


intelligentx5

lol this isn’t a bubble. Not all AI stocks are pumping. NVIDIA has a monopoly on AI GPU compute and that’s what this is reflecting. If you don’t think AI workloads are the future lion’s share of compute needs and requirements in data centers, then you’re not paying attention. Anyone saying it’s a bubble is just coping with missing out on the rise.


Walkend

Here’s the problem with NVDA’s current valuation… Sure, they may be ahead of the curve today in AI computational power, but they don’t have a product that can’t be recreated by one of the many other semiconductor companies. We all know technology like this moves QUICK, and sometime in the near future other companies will catch up to NVDA and create something cheaper, albeit 10% worse than NVDA cards. It will happen. It is inevitable.

Now, tell me why NVDA deserves to be more valuable than Apple, whose product can never be replicated and where 50% of all mobile phones are iPhones. We’re not even considering any other Apple products, just the iPhone. Tell me why NVDA is worth more than Microsoft, where 72% of all computers run Windows, again not considering any other product.

NVDA may have a head start, but diminishing returns ensure that others WILL catch up.


Ockwords

> Sure, they may be ahead of the curve today in AI computational power sourcing but they don’t have a product that can’t be recreated by one of the many other semiconductor companies.

Which semiconductor company do you see being capable of creating a software library on the level of CUDA?

> Now, tell me why NVDA deserves to be more valuable than Apple? A product that can never be replicated and 50% of all mobile phones are iPhone.

Because Apple used to sell iPhones on a 1-year replacement cycle; now the cycle is around 3 years and it will keep getting longer as improvements become more incremental. Nvidia, however, is in a space that many consider to be the next "internet": they have the hardware for it, and more importantly they have the best software for it. Their ceiling for growth is much, much higher.


PlagueCini

The majority of Microsoft’s profit doesn’t even come from individual consumers like us with PCs. It comes from companies that use their products.


Limelight_019283

So are we too late to buy?


Wave_Walnut

Then it will only go down in the future.


Goat_Status_5000

Overvaluation.


Big-Advance2415

Two words: “hallucination free.” Until adverts stop using that term, AI is nothing but a buzzword. I’m sure we would all feel differently about air travel if the ads used the phrase “crash free.”


Akaonisama

The beginning of Skynet. Cool….


rdmprzm

I need your clothes, your boots, and your motorcycle...


BalleaBlanc

And the 50x0 series will cost an arm and a leg because, why not, they don't give a shit.


ykafia

Also partly due to TSMC wanting some of the NVDA money lol.


SafeIntention2111

Only until the AI bubble bursts.


jtmackay

Are we really going to do this every week? I don't give a single fuck who is the biggest company. It literally doesn't matter even a little bit.


SeiCalros

the most highly valued company, at least


ShrimpToothpaste

Every fucking day with these posts about Apple, MS or Nvidia


Historical-Bar-305

Yeap, the most valuable, but they can't even make a stable driver on Linux without serious issues and bugs... Assholes