Chip designer NVIDIA Corporation will manufacture its products at Intel Corporation's Arizona chip fabrication facilities, according to the firm's chief executive officer, Mr. Jen-hsun Huang. The executive made the comments in an interview with CNBC as his firm rides a wave of enthusiasm following the massive success of ChatGPT with the general public and businesses alike. NVIDIA's products sit at the heart of the artificial intelligence powered chatbot, which uses existing data sets to generate lifelike responses to queries.
NVIDIA's Server Boards For ChatGPT Cost An Estimated $200,000 Each
Mr. Huang's interview with CNBC covered his firm's early bet on artificial intelligence and deep learning, which drew skepticism from all quarters, including Wall Street. Since then, NVIDIA has come a long way, and its products are now used in applications ranging from ChatGPT to healthcare and even supercomputers.
The growth in popularity of ChatGPT has boosted sentiment around NVIDIA just as the firm struggles financially amid weak macroeconomic conditions affecting the personal computing market. For NVIDIA, the troubles are twofold: it has to contend with reduced demand for its gaming graphics processing units from both gamers and cryptocurrency miners in the wake of last year's market crash.
NVIDIA's latest earnings report, for the fourth quarter of its fiscal year 2023, saw the firm post $1.8 billion in gaming revenue for the quarter, a steep 46% annual drop. In a small silver lining, gaming revenue grew 16% quarter over quarter, the second-strongest sequential growth among NVIDIA's five business units.

The growth in Silicon Valley's interest in artificial intelligence is creating huge demand for NVIDIA's products. Estimates suggest that the beta version of ChatGPT used a whopping 10,000 NVIDIA GPUs to train the model. This is still the tip of the iceberg, since experts have also estimated that if Google Search hypothetically deployed ChatGPT, an unbelievable 4.1 million NVIDIA GPUs would be required to meet the computing requirements.
Mr. Huang himself has been nothing but full of praise for ChatGPT, calling it in February the greatest thing ever done in computing history. It also stands to be an essential development for his company, with estimates suggesting that the latest ChatGPT model will require 30,000 NVIDIA GPUs. Mr. Huang explained to CNBC that his firm ships boards with eight GPUs apiece, and one board is estimated to carry a price tag of $200,000. This suggests the new ChatGPT model could drive roughly $750 million in GPU sales for NVIDIA.
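The $750 million figure follows directly from the numbers quoted in the article; a quick back-of-the-envelope check:

```python
# Sanity check of the article's GPU revenue estimate.
# Inputs quoted in the article: ~30,000 GPUs for the new ChatGPT model,
# 8 GPUs per server board, and an estimated $200,000 per board.
gpus_needed = 30_000
gpus_per_board = 8
price_per_board = 200_000  # USD, estimated

boards_needed = gpus_needed // gpus_per_board  # 3,750 boards
revenue = boards_needed * price_per_board      # $750,000,000

print(f"Boards needed: {boards_needed:,}")
print(f"Estimated GPU sales: ${revenue:,}")
```

All inputs are estimates cited in the article, so the output is an approximation rather than a confirmed figure.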
However, geopolitical tensions in Asia, coupled with the fact that most of NVIDIA's GPU supply comes from the Taiwan Semiconductor Manufacturing Company (TSMC), place the firm's supply chain at risk of disruption. To mitigate this, Mr. Huang plans to take advantage of Intel's multibillion-dollar initiative to open its chipmaking plants to other chip designers.
When asked by CNBC's Katie Tarasov about the importance of TSMC in the global chip supply chain, the NVIDIA executive replied:
The fact of the matter is TSMC is a really important company. And the world doesn't have more than one of them. It is imperative upon ourselves and them, for them to also invest in diversity and redundancy.
Mr. Huang added, "oh absolutely, we'll use Arizona," in response to Tarasov's question about his firm's plans to use Intel's Arizona facilities for its chip manufacturing. TSMC is also building a chip facility in Arizona as part of its efforts to diversify its supply chain, potentially aiming to take advantage of subsidies offered by the U.S. government; those subsidies, if approved, would bar the firm from committing to advanced chipmaking plants in China.
The post Nvidia Will Use Intel’s Arizona Fabs To Make GPUs Says CEO by Ramish Zafar appeared first on Wccftech.