
HPE Looks to Gain From Generative AI Interest With Cloud Service

2023-06-21 02:21

Hewlett Packard Enterprise Co., looking to take advantage of the demand for generative artificial intelligence tools, introduced a cloud service to handle the large computing power needed to support the new technology.

The product, called HPE GreenLake for Large Language Models, follows earlier AI cloud offerings by the major public cloud providers, including Amazon.com Inc., Microsoft Corp. and Alphabet Inc.’s Google. HPE’s service works with German AI startup Aleph Alpha’s large language models that analyze and create text from users’ prompts.

“We have reached a generational market shift in AI that will be as transformational as the web, mobile and cloud,” HPE Chief Executive Officer Antonio Neri said Tuesday in a statement announcing the new product.

HPE has been moving away from its reliance on selling traditional hardware such as computer servers and toward expanding its information technology services. Last month, HPE said sales in its High-Performance Computing and AI unit increased 18% to $840 million in the quarter ended April 30.

Unlike general public cloud services, HPE’s product is “uniquely designed to run a large-scale AI training and simulation workload, and at full computing capacity,” the Spring, Texas-based company said. Aleph Alpha’s software, Luminous, is a pretrained commercial large language model offered in multiple languages.

“Now organizations can embrace AI to drive innovation, disrupt markets and achieve breakthroughs with an on-demand cloud service that trains, tunes, and deploys models, at scale and responsibly,” Neri said.