Nvidia's New Technologies Are Astounding

21 Mar 2024

Jensen Huang, Nvidia's founder, looks more like a rock star than the CEO of a technology company. Around 11,000 people came to the Nvidia developer conference the other day, and the keynote was broadcast live, so I'm sure millions more watched online. And tech nerds weren't the only ones in the audience. At its core this is a GPU and CPU company, so you would expect mostly technology geeks to show up, but entrepreneurs, movie stars, and journalists were all there. That's because Nvidia announced its new toys, and these toys look set to change the future of artificial intelligence, and therefore the world. His explanations about the robot world impressed me in particular. At the same time, there were some things at the conference that should worry Tesla. I have compiled it all for you in an article that anyone who owns Nvidia shares, is thinking about buying them, or is even slightly curious about where Nvidia is headed should definitely read.
Nvidia hadn't held a developer conference with a live audience since 2019. But as you know, Nvidia shares have been doing great this year, and Jensen Huang is in a good mood, so this time he presented his new toys live in a big arena, and they really are some great toys. The most important announcement is of course the change in GPUs: they are moving from the current Hopper architecture to the new Blackwell architecture. The Hopper generation's chips were the H100 GPUs; B200 GPUs are replacing them. H100s sell for between $25,000 and $40,000. B200 prices were not announced yesterday, but it looks like they will go up to around $50,000. Of course, nobody buys just the GPU; an entire system is purchased along with it. Once you add the switches that tie it into the company's own systems, the racks the boards sit in, the cooling, and the software that makes it all run, a unit ends up costing around $200,000.
The new GPU is named after David Harold Blackwell, a mathematician who was one of the people who laid the groundwork for artificial intelligence. So what does the new chip actually bring? It carries 208 billion transistors. In fact, it is not one die but two, fused together in a single package, with such a fast interconnect between them that Jensen says the system never notices there are two chips as data flows across it. While the AI processing power of the current H100 is 4 petaflops, the new chip jumps to 20 petaflops, roughly a fivefold increase. The architecture GPT currently runs on is said to use about 1.7 trillion parameters; Nvidia says the new system will support models with up to 27 trillion parameters. More parameters basically means artificial intelligence that is smarter, faster, and able to handle more detailed work.
If you ask what all this complex technology means in practice: the new chips deliver roughly 30 times faster inference, that is, the processing of requests, and they do it with lower energy use and cost than the existing chips, with savings estimated at roughly 25 times compared to the H100s. This will greatly simplify the development of new AI models. For example, an AI model like GPT, which runs on about 1.8 trillion parameters, could previously only be trained with 25,000 of Nvidia's older A100 chips over three to five months. With the current H100s, the same job takes about 8,000 GPUs and three months. With the new Blackwell B200s, Jensen said, it will require only 2,000 chips over the same three months. That means far more will get done in artificial intelligence.
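To put those figures in perspective, here is a rough back-of-envelope comparison in Python. It uses only the numbers quoted above, and the four-month value is my own midpoint assumption for the "three to five months" range, so treat it as an illustration rather than an official benchmark.

```python
# Rough back-of-envelope comparison using only the figures quoted above.
# The A100 training time is taken as 4 months (midpoint of "3 to 5 months").

def gpu_months(chips, months):
    """A crude 'effort' measure: number of GPUs multiplied by months of training."""
    return chips * months

a100 = gpu_months(25_000, 4)   # older A100 generation
h100 = gpu_months(8_000, 3)    # current Hopper generation
b200 = gpu_months(2_000, 3)    # new Blackwell generation

print(f"Per-chip AI compute jump: {20 / 4:.0f}x (4 -> 20 petaflops)")
print(f"GPU-months, A100 vs B200: {a100 / b200:.1f}x less effort")
print(f"GPU-months, H100 vs B200: {h100 / b200:.1f}x less effort")
```

By this crude measure, the same training job shrinks from 100,000 GPU-months on A100s to 6,000 on Blackwell, which is the whole point Jensen was making.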
He gave many more technical explanations about Hopper and the new Blackwell architecture, but I won't go through them all here. In short, we have an architecture that is much faster, uses energy more efficiently, and gets more done with fewer chips. The stock market was more or less expecting this, so there was nothing very surprising here, which is why the reaction in the stock after yesterday's close was not very positive. But I think there is a very important point the market is missing: there were hints at the conference that Nvidia is transforming from a pure GPU manufacturer into a software and services company.
In this context, they announced a new service called NIM, Nvidia Inference Microservices. The pitch goes like this: even if you bought Nvidia chips in the past and are not buying the latest technology, those older chips can still be put to work, especially on the inference side of AI, the part where a trained model responds to requests. Nvidia offers new software for this: if you are running any Nvidia chip from the A100 onward, you can easily repurpose it as an inference chip and keep the new B200s you buy for training. NIM is the software service that ties all of this together, and the price will be $4,500 per GPU per year. Roughly, this means they are no longer satisfied with just selling GPUs and CPUs. They also offer a service package so that a customer can live comfortably with its GPUs for their whole lifetime, not feel bad about each new GPU generation, and move older GPUs into new areas of use.
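To make the idea of an "inference microservice" more concrete, here is a minimal sketch of what using one looks like: NIM containers expose an OpenAI-compatible HTTP endpoint, so a standard client library can talk to them. The local URL, the placeholder API key, and the model name below are my own illustrative assumptions, not details from the keynote.

```python
# Minimal sketch: querying a NIM container that is already running locally
# and serving an OpenAI-compatible endpoint. The URL, key, and model name
# are illustrative assumptions for this example.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",   # assumed local NIM endpoint
    api_key="not-needed-locally",          # placeholder; a local container may ignore it
)

response = client.chat.completions.create(
    model="meta/llama3-8b-instruct",       # example of a model packaged as a NIM
    messages=[{"role": "user", "content": "In one sentence, what does an inference chip do?"}],
    max_tokens=64,
)
print(response.choices[0].message.content)
```

The point of the sketch is the deployment model: a GPU you already own serves the model behind a standard API, while the newest chips stay dedicated to training.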
This could lead to Nvidia being valued like a software company rather than a hardware company. The effects will be limited at first, because these services are only just starting. But as the number of Nvidia chips in the field grows, as early buyers start asking where they should actually use them and need more integration, and as large companies like Google and Amazon bring Nvidia chips to smaller companies that need software services around them, I think this $4,500 package will grow a lot. It moves Nvidia toward a software-as-a-service (SaaS) model, which is the kind of thing that pushes price-to-earnings ratios upward. We won't see the effects in the very short term, but we definitely will see them. That's why NIM was one of the things that impressed me the most.
The topic that impressed me the most, though, is robots, but I'll come to that last, because Nvidia brings a completely different perspective to the robot world. Omniverse is another project Nvidia has been working on for a long time. Omniverse is essentially a virtual world: you build your factory in it, for example, and simulate it. You design the factory in the most efficient, safest, and most worker-friendly way, run the simulation, and collect the data, and if the data points the way you want, you build the real factory accordingly. The same goes for car design, or even microchip design, because TSMC, the largest chip manufacturer in the world and Nvidia's main manufacturing partner, is also thinking of redesigning its chip production processes with this software. They put a lot of emphasis on Omniverse, it is clearly evolving fast, and we learned that Apple's Vision Pro will work with it too.
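For a sense of what "designing your factory in a virtual world" looks like at the data level, here is a tiny sketch in OpenUSD, the scene-description format Omniverse is built around (it needs the usd-core package). The prim names and the cube standing in for an assembly line are purely illustrative.

```python
# Tiny sketch of a "digital twin" scene in OpenUSD, the format Omniverse builds on.
# Install with: pip install usd-core. All names here are illustrative.
from pxr import Usd, UsdGeom, Gf

stage = Usd.Stage.CreateNew("factory_twin.usda")
UsdGeom.Xform.Define(stage, "/Factory")

# A placeholder assembly line, represented as a simple cube for now.
line = UsdGeom.Cube.Define(stage, "/Factory/AssemblyLine")
line.GetSizeAttr().Set(2.0)
line.AddTranslateOp().Set(Gf.Vec3d(0.0, 0.0, 5.0))

stage.GetRootLayer().Save()
print(stage.GetRootLayer().ExportToString())  # human-readable .usda description
```

A real Omniverse workflow layers materials, physics, and simulation on top of a scene like this, but the underlying idea is the same: the factory exists first as data you can iterate on.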
So the "Pro" in Apple's Vision Pro may be no coincidence: it points to professionals. As far as I understand, Nvidia's Omniverse will combine with Apple's Vision Pro, and professionals will be able to draw on Omniverse's almost unlimited capabilities when designing new things. This was another exciting part of the conference, and we saw many companies partnering with Nvidia on it. Another important development is that Nvidia is expanding and deepening its cooperation with companies like Google, Amazon, and Microsoft. Amazon, for example, will reportedly buy 20,000 of these new systems and offer the chips as a service to its cloud customers through Amazon Web Services; Microsoft and Google have similar projects. This shows that microchips are turning into something you rent, like a service, and I suspect it will benefit Amazon's stock as well.
But the development that excites me the most is that they are offering new hardware, software, and an operating system for robots, humanoid robots. The new system is called Groot, or more precisely GR00T, and it now joins that lineup. Groot is essentially a package offered to every company that wants to develop humanoid robots. Jensen appeared on stage with about eight robots, among them Figure 01, which I focused on in my previous article, and he said that thanks to this new system, humanoid robots' skills, such as learning from humans, learning by watching, learning by listening, and learning from large language models, will improve enormously. I don't think it would be wrong to declare him the master of the robots. The most exciting and entertaining moments on stage came from this part, including tiny robots Nvidia developed itself that learn by watching people.
Nvidia takes this work very seriously, and here too it is creating new income streams for itself. This is what I admire about Nvidia: it doesn't act like just a chip manufacturer; it also supports every structure in which those chips will be used. For example, they are developing new projects with Johnson & Johnson on making better use of artificial intelligence in surgery. They have a deep collaboration with BYD, which might be a bit worrying for Tesla: BYD will use Nvidia Omniverse to design its factories and also uses Nvidia chips as the brains of its vehicles. In this context, the Mercedes GLR also showed up in the videos they presented. It is a collaboration between Nvidia and Mercedes, and as far as I understand, Nvidia will showcase its artificial intelligence, in other words autonomous driving, work on that car.
The GLR is expected to hit the market in late 2024 or early 2025, and I'm looking forward to seeing how its capabilities compare with Tesla's autonomous driving. In this context, I think Tesla's biggest rival is not Mercedes but Nvidia, because Nvidia can work with dozens of automakers, and new autonomous-driving projects integrated with them may follow. That is of course worrying for Tesla. Nvidia is an important competitor, and yet, as you know, Tesla also uses Nvidia's chips in its own systems, so their positions are intertwined in interesting ways. I don't think Nvidia wants to become a full-blown rival to Tesla. Tesla, for its part, has a big advantage with its own model: everything at Tesla is deeply integrated. You can compare it to Apple in this respect: Apple designs its own chips for its own phones and develops its own software, so everything is integrated.
Nvidia, by contrast, cooperates with Mercedes, so I would compare it to the Android-Samsung relationship. Nvidia will probably become the Android of this business; I'm sure it will be an important player, and many automakers will use Nvidia's artificial intelligence and autonomous-driving solutions. On the other hand, Tesla has a huge advantage of its own: it currently has over five million vehicles on the road and collects data every day from each vehicle's eight cameras. Nvidia doesn't have that kind of infrastructure, but it can build these things quickly too. Either way, I think the autonomous-driving race is about to accelerate.
I hope Elon Musk watched the conference too, because he wasn't there in person this time, even though he did attend the last in-person conference in 2019. This time other collaborations got more of the spotlight. It was mentioned that representatives of xAI, the AI company from Musk's X ecosystem, were there and that Tesla would be purchasing Nvidia's chips, but there was no mention of any deeper cooperation. So those are my takeaways from the conference. What I see is this: in the line of great American entrepreneurs, there was Steve Jobs, then Elon Musk, and now Jensen Huang is coming, and what Jensen presented was actually more exciting than what Tesla has been putting out.
On the other hand, the stock closed slightly lower, which seems fairly normal to me: they did not announce the price of the new GPUs, the market was waiting for it, and, as you know, you buy the expectation and sell the news. But I would definitely not short Nvidia for the long run. I have a position in Nvidia myself, though I have now trimmed it a little, because I think the room it has to run, at least in the short and medium term, is somewhat limited; it might go up to around $1,000. Of course, I might get burned and it might blow right past that. But over the long run, I still think Nvidia is a great investment.
I still don't find the stock too expensive, and I'm sure some by-the-book economists who shorted Nvidia didn't watch yesterday's conference. Nvidia is not a stock to short. Whether it goes down or not is a separate question, but if you have followed what I've explained here about artificial intelligence, autonomous driving, and robots, I don't think you should ever give up on Nvidia, which sits at the core of all of it. This is not investment advice, of course; I'm talking about the company here, not the stock price. And don't forget that even Tesla depends on Nvidia.
Apple depends on Nvidia, Microsoft depends on Nvidia, Salesforce depends on Nvidia, Amazon depends on Nvidia. Nvidia sits at the core of many businesses right now, and it doesn't look like they will slow down on innovation. I think Nvidia will continue to be a great company. I've tried to summarize the situation briefly; I hope it was useful.
The information, comments, and recommendations contained here do not fall within the scope of investment consultancy. Investment consultancy services are provided under an investment consultancy agreement signed between brokerage firms, portfolio management companies, non-deposit banks, and their customers. The comments in this article are my personal views only and may not be suitable for your financial situation or risk-return profile. For this reason, investment decisions should not be made based solely on the information and comments in my articles.
