
Google Cloud Announces Integration of Nvidia’s Next-Gen Blackwell Platform in 2025


Apr, 09, 2024
3 min read
by CryptoPolitan

Google has announced that it will bring Nvidia’s next-generation Blackwell platform to Google Cloud in early 2025. The platform will arrive in two configurations: the Nvidia HGX B200, built for AI and high-performance computing (HPC) workloads, and the GB200 NVL72, designed for training large language models (LLMs). Notably, the GB200 NVL72 will be liquid-cooled, the first time liquid cooling will be used in Google’s cloud infrastructure, to improve performance and efficiency.

Expansion of AI accelerators and computing services

Google introduced new instance types and accelerators to cover a wide range of customer needs. Alongside its custom Arm-based Axion CPU and its own in-house AI accelerators, the company is broadening its lineup of Nvidia-powered offerings. Chief among the new announcements is the A3 Mega instance, developed jointly with Nvidia.

A3 Mega combines industry-standard H100 GPUs with a new networking system that doubles the bandwidth available to each GPU. For developers, this provides the throughput needed to train large LLMs efficiently.
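To see why per-GPU network bandwidth matters for LLM training, consider the gradient synchronization step in data-parallel training. The sketch below is a back-of-envelope model with assumed, illustrative numbers (model size, GPU count, link speeds), not published A3 Mega specifications:

```python
# Back-of-envelope: how doubling per-GPU network bandwidth shortens the
# all-reduce step in data-parallel training. All numbers are illustrative
# assumptions, not official A3 Mega specs.

def ring_allreduce_seconds(grad_bytes: float, num_gpus: int, gbps_per_gpu: float) -> float:
    """Ideal ring all-reduce time: each GPU sends/receives
    2*(N-1)/N of the gradient payload over its network link."""
    bytes_per_sec = gbps_per_gpu * 1e9 / 8   # Gbit/s -> bytes/s
    traffic = 2 * (num_gpus - 1) / num_gpus * grad_bytes
    return traffic / bytes_per_sec

grads = 7e9 * 2  # assumed 7B-parameter model, fp16 gradients (bytes)
baseline = ring_allreduce_seconds(grads, num_gpus=8, gbps_per_gpu=200)
doubled = ring_allreduce_seconds(grads, num_gpus=8, gbps_per_gpu=400)
print(f"{baseline:.2f}s -> {doubled:.2f}s per sync")  # prints "0.98s -> 0.49s per sync"
```

Under this idealized model, doubling the link speed simply halves the time spent synchronizing gradients each step, which is why bandwidth upgrades translate directly into training efficiency for communication-bound jobs.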

Google also launched A3 confidential computing instances, designed to protect the confidentiality and integrity of sensitive data and AI workloads against breaches. Data is encrypted end to end as it moves between the Intel CPU and the Nvidia H100 GPU, with no code changes required.

Advancements in Google’s own AI accelerators

Google continues to push technical boundaries with its own AI accelerators as well: its Cloud TPU v5p processors are now generally available. Google calls them its most advanced AI accelerators yet, delivering 2x the FLOPS and 3x the memory bandwidth of the previous generation.
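Which of those two improvements matters more depends on the workload. A simple roofline model makes the point: memory-bound kernels benefit from the 3x bandwidth, compute-bound kernels from the 2x FLOPS. The baseline figures below are illustrative assumptions, not official TPU specifications:

```python
# Roofline-style sketch: whether 2x FLOPS or 3x memory bandwidth helps more
# depends on a kernel's arithmetic intensity (FLOPs per byte moved).
# Baseline numbers are assumed for illustration, not official TPU specs.

def attainable_tflops(peak_tflops: float, bw_tbps: float, intensity: float) -> float:
    """Roofline model: performance is capped by the lower of the
    compute peak and bandwidth * arithmetic intensity."""
    return min(peak_tflops, bw_tbps * intensity)

base_peak, base_bw = 275.0, 1.2          # assumed previous-gen TFLOPS, TB/s
new_peak, new_bw = 2 * base_peak, 3 * base_bw

for intensity in (50, 500):              # memory-bound vs compute-bound kernel
    speedup = (attainable_tflops(new_peak, new_bw, intensity)
               / attainable_tflops(base_peak, base_bw, intensity))
    print(f"intensity {intensity} FLOPs/byte: {speedup:.1f}x speedup")
```

Under these assumptions, a memory-bound kernel (50 FLOPs/byte) sees the full 3x from bandwidth, while a compute-bound one (500 FLOPs/byte) sees the 2x from FLOPS, which is why vendors quote both numbers.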

Google also unveiled AI-optimized storage, including Hyperdisk ML (in preview), a next-generation block storage service that Google says can cut model loading times by up to 3.7x. And while AI and machine learning remain the core focus, Google Cloud is also expanding its general-purpose infrastructure.

The vendor also rolled out new instances built on Intel’s fourth- and fifth-generation Xeon CPUs. The new general-purpose C4 and N4 instances use fifth-generation Emerald Rapids Xeons: C4 is performance-oriented, while N4 targets cost efficiency. C4 instances are currently in private preview, while N4 is generally available today.

Expanding Google Cloud services

Google also demoed new C3 bare-metal instances, running on fourth-generation Intel Xeon processors, and X4 memory-optimized bare-metal instances, broadening its infrastructure options. Rounding out the lineup are the new storage-optimized Z3 virtual machines, which Google claims deliver the highest IOPS of any major cloud provider’s storage-optimized offering.

The announcements Google Cloud made at its Las Vegas event consolidate the company’s position in the industry. With the Blackwell platform arriving in 2025, and new processors, storage options, and AI accelerators already live, Google Cloud is staking a claim at the forefront of cloud computing.

These updates not only expand Google’s cloud computing portfolio but also give developers and businesses more capacity for AI/ML and general-purpose workloads alike. As Google Cloud competes in a fast-moving ecosystem, its strategy and product line signal a clear commitment to a more efficient, powerful, and diverse computing platform.

Original story from TechCrunch

Read the article at CryptoPolitan
