Hewlett Packard Enterprise and Nvidia teamed up to launch a set of private cloud offerings and integrations designed for generative AI workloads. Nvidia AI Computing by HPE will be available in the fall.

With the move, announced at HPE Discover 2024 in Las Vegas, HPE brings a broad portfolio to the AI computing race. Enterprises are building out on-premises infrastructure and buying AI-optimized servers in addition to using cloud computing.

HPE's partnership comes a few weeks after Dell Technologies launched a broad AI factory partnership with Nvidia. HPE plans to leverage its channel, its integration points with HPE GreenLake, its high-performance computing portfolio and its cooling expertise to woo enterprises.

The main attraction at HPE Discover 2024 is HPE Private Cloud AI, which deeply integrates Nvidia's accelerators, computing, networking and software with HPE AI storage, servers and GreenLake. HPE Private Cloud AI will also include an OpsRamp AI copilot that will help manage workloads and improve efficiency.

According to HPE, HPE Private Cloud AI will include a self-service cloud experience and four configurations to support a range of workloads and use cases. HPE said that Nvidia AI Computing by HPE offerings and services will also be offered by Deloitte, HCL Tech, Infosys, TCS and Wipro.

Antonio Neri, CEO of HPE, said during his keynote that enterprises need more turnkey options for AI workloads. He was joined by Nvidia CEO Jensen Huang. At Computex, Nvidia said it will move to an annual release cycle for GPUs and accelerators along with other AI-optimized hardware. Neri said HPE has been at the leading edge of innovation and supercomputing and will bring that expertise to AI. "Our innovation will lead to new breakthroughs in edge to cloud," said Neri. "Now it leads to AI and will catapult the enterprise of today and tomorrow."

"AI is hard and it is complicated. It is tempting to rush into AI, but innovation at any cost is dangerous," said Neri, who added that HPE's architecture will be more secure, feature guardrails and offer turnkey solutions. "We are proud of our supercomputing leadership. It's what positions us to lead in the generative AI future."

Constellation Research's take

Constellation Research analyst Holger Mueller said:

"HPE is working hard fighting for market share for on-premises AI computing. It's all about AI and in 2024 and that means partnering with Nvidia. Co-developing HPE Private Cloud AI as a turnkey and full stack is an attractive offering for CXOs, as it takes the integration burden off of their teams and lets them focus on what matters most for their enterprise, which is building AI powered next-gen apps."

Constellation Research analyst Andy Thurai said HPE can gain traction in generative AI systems because of its integrated approach.

Thurai said:

"What HPE offers is an equivalent of 'AI in a box.' It will offer the combination of hardware, software, network, storage, GPUs and anything else to run efficient AI solutions. For enterprises, it's efficient to already know the solutions, price points and optimization points. Today, most enterprises that I know are in an AI experimentation mode. Traction may not be that great initially."

HPE bets on go-to-market, simplicity, liquid cooling expertise

HPE's latest financial results topped estimates, and Neri said enterprises are buying AI systems. HPE plans to differentiate with liquid cooling, one of three ways to cool these systems. HPE also has traction with enterprise accounts and has seen AI system revenue surge accordingly. Neri said the company's cooling systems will be a differentiator as Nvidia Blackwell systems gain traction.

Nvidia's Huang agreed on the liquid cooling point. "Nobody has plumbed more liquid than Antonio," quipped Huang. 

Here's what HPE Private Cloud AI includes:

  • Support for inference, fine-tuning and retrieval-augmented generation (RAG) workloads using proprietary data.
  • Controls for data privacy, security and governance.
  • A cloud experience with ITOps and AIOps tools powered by GreenLake and OpsRamp, which provides observability for the stack, including Nvidia InfiniBand and Spectrum Ethernet switches.
  • OpsRamp integration with CrowdStrike APIs.
  • Flexible consumption models.
  • Nvidia AI Enterprise software including Nvidia NIM microservices (see the sketch after this list).
  • HPE AI Essentials software including foundation models and a variety of services for data and model compliance.
  • Integration that includes Nvidia Spectrum-X Ethernet networking, HPE GreenLake for File Storage, and HPE ProLiant servers with support for Nvidia L40S, H100 NVL Tensor Core GPUs and the Nvidia GH200 NVL2 platform.
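
Nvidia NIM microservices expose an OpenAI-compatible REST API, which is what lets applications running on a private stack call locally hosted models much as they would a public endpoint. The sketch below is purely illustrative: the endpoint URL and model name are placeholders, not details HPE or Nvidia announced, and an actual HPE Private Cloud AI deployment may front these services differently.

```python
# Minimal sketch of querying an Nvidia NIM inference microservice through its
# OpenAI-compatible API. Endpoint and model name are hypothetical placeholders.
from openai import OpenAI

client = OpenAI(
    base_url="http://nim.example.internal:8000/v1",  # hypothetical in-cluster NIM endpoint
    api_key="not-needed",  # a local deployment may not require a real key
)

response = client.chat.completions.create(
    model="meta/llama3-8b-instruct",  # example model identifier; varies by deployment
    messages=[{"role": "user", "content": "Summarize last quarter's support tickets."}],
    max_tokens=256,
)

print(response.choices[0].message.content)
```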

The tie-up between Nvidia and HPE went beyond the private cloud effort. HPE said it will support Nvidia's latest GPUs, CPUs and Superchips across its Cray high-performance computing portfolio as well as its ProLiant servers. The support covers current Nvidia GPUs as well as the roadmap going forward, including the Blackwell, Rubin and Vera architectures.

HPE also said GreenLake for File Storage now has Nvidia DGX BasePOD certification and Nvidia OVX storage validation.

Other news at HPE Discover 2024:

  • HPE is adding HPE Virtualization tools throughout its private cloud offerings. HPE Virtualization combines the open-source Kernel-based Virtual Machine (KVM) hypervisor with HPE's cluster orchestration software. HPE Virtualization is in preview, with a release planned for the second half of the year.
  • HPE Private Cloud will have native integration with HPE Alletra Storage MP for software-defined storage as well as OpsRamp and Zerto for cyber resiliency.
  • HPE and Danfoss said they will collaborate on modular data center designs that deploy heat capture systems for external reuse. Hewlett Packard Labs will also run a series of demos on AI sustainability.