Gcore has introduced AI Cloud Stack, a private AI cloud software platform designed to help organizations create hyperscaler-level infrastructure powered by NVIDIA GPU acceleration. The launch signals a broader shift in how enterprises and service providers approach AI deployment, balancing speed, scalability, and sovereignty as the demand for compute power accelerates worldwide.
The Luxembourg-based company says AI Cloud Stack allows users to convert NVIDIA GPU clusters into full-featured, multi-tenant private AI clouds within a short time. The platform targets cloud service providers, telecom operators, and large organizations looking to run and monetize AI workloads efficiently. With demand for AI compute growing rapidly beyond the major public hyperscalers, Gcore aims to turn raw GPU capacity into more accessible cloud services while keeping customers in control of their own infrastructure.
Gcore created AI Cloud Stack jointly with VAST Data and Nokia, combining the three companies' technologies into a single solution intended to accelerate AI deployments. By integrating hardware and software from all three partners, the alliance aims to shorten deployment timelines and simplify the build-out of large-scale AI systems, two of the most common hurdles organizations face when standing up AI infrastructure.
Seva Vayner, Gcore’s Product Director for Edge Cloud and AI, explained that the platform reflects the hybrid future of AI infrastructure. He said many organizations face strict operational and regulatory requirements that make on-premises AI necessary but difficult to scale. According to him, Gcore’s stack simplifies that process by offering a complete operational layer that enables rapid deployment, spanning Infrastructure-as-a-Service through Model-as-a-Service.
Already running across thousands of NVIDIA Hopper GPUs in Europe, AI Cloud Stack combines a cloudification layer, AI suite, orchestration tools, and automation capabilities, creating a foundation for commercial-grade private AI clouds.
The company says this initiative represents more than just another infrastructure launch—it shows how enterprises can now own and operate AI workloads independently, without trading off performance or compliance. As the global AI landscape evolves, Gcore’s move points toward a future where organizations can innovate with both autonomy and scale.
