HPE GreenLake AI/ML How-To
HPE GreenLake for Large Language Models is an on-demand, multi-tenant cloud service that enables enterprises to privately train, tune, and deploy large-scale AI models using sustainable supercomputing infrastructure powered by nearly 100% renewable energy.
How to Use HPE GreenLake AI/ML
Sign up for an HPE GreenLake account: Create an HPE MyAccount at auth.hpe.com/hpe/cf/registration to access the HPE GreenLake platform
Access HPE GreenLake platform: Log into common.cloud.hpe.com to access the unified control plane for all HPE GreenLake services
Select AI/ML services: Navigate to the AI/ML section and choose HPE GreenLake for Large Language Models service
Configure compute resources: Select the required supercomputing capacity powered by HPE Cray XD supercomputers with NVIDIA GPUs for your AI workloads
Set up ML environment: Use HPE Machine Learning Development Environment to set up your training environment and configure data pipelines
Prepare training data: Use HPE Machine Learning Data Management Software to integrate, track, and manage your training datasets (see the data-versioning sketch after this list)
Train and tune models: Leverage the supercomputing platform to privately train and fine-tune your large language models (see the experiment-submission sketch after this list)
Deploy models: Deploy your trained models for inference using the platform's deployment capabilities
Monitor and manage: Use the unified control plane to monitor model performance, manage resources and optimize costs
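
HPE Machine Learning Data Management Software is based on the open-source Pachyderm project, so the data-preparation step can be pictured with a minimal sketch using the python_pachyderm client. This is a hedged illustration only: the endpoint, repository name, and file paths below are assumptions, not values documented for the GreenLake service.

    # Hedged sketch: version a training dataset, assuming the HPE Machine Learning
    # Data Management deployment exposes the open-source python_pachyderm client
    # it is based on. Host, repo, and file names are illustrative placeholders.
    import python_pachyderm

    # Connect to the pachd endpoint provided by your environment (assumed values).
    client = python_pachyderm.Client(host="pachd.example.internal", port=30650)

    # Create a versioned repository for the training corpus.
    client.create_repo("llm-training-data")

    # Commit a local shard into the repository; each commit is tracked, so the
    # exact dataset behind a training run stays reproducible.
    with client.commit("llm-training-data", "master") as commit:
        with open("corpus/shard-000.jsonl", "rb") as f:
            client.put_file_bytes(commit, "/shard-000.jsonl", f.read())

    print("Committed dataset version:", commit.id)

Referencing data by repository and commit rather than by ad hoc file copies is what keeps training runs reproducible and auditable.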
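
HPE Machine Learning Development Environment is built on the open-source Determined training platform, so submitting a fine-tuning run can be sketched with the Determined Python SDK. Again, this is a hedged example: the master URL, credentials, entrypoint script, and hyperparameter values are assumptions for illustration.

    # Hedged sketch: submit a fine-tuning experiment, assuming the HPE Machine
    # Learning Development Environment exposes the Determined Python SDK it is
    # based on. Master URL, credentials, and config values are assumptions.
    from determined.experimental import client

    # Authenticate against the MLDE master for your GreenLake environment.
    client.login(master="https://mlde.example.internal:8080",
                 user="ml-engineer", password="changeme")

    # Experiment configuration: what to run, how many GPU slots per trial, and
    # which validation metric the searcher tracks.
    config = {
        "name": "llm-finetune-sketch",
        "entrypoint": "python3 train.py",      # your training script
        "resources": {"slots_per_trial": 8},   # GPUs allocated to the trial
        "searcher": {
            "name": "single",
            "metric": "validation_loss",
            "max_length": {"batches": 10000},
            "smaller_is_better": True,
        },
        "hyperparameters": {"learning_rate": 1e-5, "global_batch_size": 64},
    }

    # Submit the experiment; model_dir is the directory containing train.py.
    experiment = client.create_experiment(config=config, model_dir="./finetune")
    print("Submitted experiment", experiment.id)

    # Optionally block until the run completes before moving on to deployment.
    experiment.wait()

From here, the resulting checkpoints can be promoted through the platform's deployment capabilities, as described in the deploy step above.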
HPE GreenLake AI/ML FAQs
What is HPE GreenLake for Large Language Models?
HPE GreenLake for Large Language Models is an on-demand, multi-tenant cloud service that allows enterprises to privately train, tune, and deploy large-scale AI models using HPE's supercomputing platform and AI software.
HPE GreenLake AI/ML Monthly Traffic Trends
HPE GreenLake AI/ML saw a 13.6% increase to 5.9M visits in July. This growth can be attributed to the introduction of ProLiant Gen12 Servers, which offer advanced security, performance, and AI-driven automation, enhancing the product's appeal to AI-driven enterprises.