GIGABYTE Technology is using CES 2026 to position itself as a full-stack AI infrastructure provider, unveiling what it calls an end-to-end computing ecosystem under the theme “AI Forward.” The company’s showcase spans modular AI data centers, liquid-cooled server racks, edge systems for physical AI, and AI-ready desktops and laptops designed to run models locally.
The message is clear: as AI workloads scale across cloud, edge, and on-device environments, GIGABYTE wants to be the hardware backbone powering all of it.
Building enterprise-scale “AI factories”
At the center of GIGABYTE’s CES announcement is GIGAPOD, a modular, building-block AI data center platform designed to accelerate the deployment of what the company calls enterprise AI factories. GIGAPOD integrates high-performance servers, high-speed networking, and GIGABYTE POD Manager (GPM) software to simplify the design, deployment, and validation of AI infrastructure.

The platform supports direct-liquid-cooling (DLC) servers built on Intel Xeon 6 and AMD EPYC 9004/9005 processors, as well as NVIDIA HGX B300 systems and AMD Instinct MI355X accelerators. GIGABYTE also introduced an in-house Rack Management Switch that can manage up to eight liquid-cooled racks from a single 1U unit, with multi-vendor coolant distribution unit (CDU) support and leak detection aimed at improving reliability and operational efficiency.
Grace Blackwell and next-gen AI supercomputing
For hyperscale and high-performance deployments, GIGABYTE is showcasing the NVIDIA Grace Blackwell Ultra NVL72, a rack-scale system that pairs 72 Blackwell Ultra GPUs with 36 NVIDIA Grace CPUs, backed by Quantum-X800 InfiniBand and Spectrum-X Ethernet networking. The company says the platform can deliver up to 50× the inference performance of NVIDIA’s previous Hopper generation.
Also on display are new AI supercomputers, including the G894-SD3-AAX7 powered by NVIDIA HGX B300 and the XL44-SX2-AAS1 featuring RTX PRO 6000 Blackwell Server Edition GPUs. Both systems pair dual Intel Xeon 6 processors with DDR5 memory, high-speed networking, and NVIDIA BlueField-3 DPUs for enhanced security and data handling.
Bringing AI to the edge with “Physical AI”
Beyond data centers, GIGABYTE is pushing AI closer to where data is generated. Its CES booth includes a smart warehouse demonstration built around compact edge computers, embedded systems, and industrial PCs designed for AGVs, AMRs, robotic arms, and machine vision.
These systems are optimized for low-latency, 24/7 inference, supporting what GIGABYTE describes as Physical AI — AI that can perceive, decide, and act in real-world environments rather than just in the cloud.
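To give a concrete sense of the kind of workload these edge boxes target, the sketch below shows a continuous machine-vision inference loop. It is purely illustrative: the ONNX model file, camera index, and input size are placeholders, and it uses generic open-source tooling (OpenCV and ONNX Runtime) rather than any specific GIGABYTE software.

```python
# Minimal sketch of a 24/7 edge machine-vision inference loop.
# "model.onnx" and camera device 0 are placeholders, not tied to any
# specific GIGABYTE product or shipped software.
import cv2
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession("model.onnx", providers=["CPUExecutionProvider"])
input_name = session.get_inputs()[0].name

cap = cv2.VideoCapture(0)  # camera feeding the vision pipeline
try:
    while True:
        ok, frame = cap.read()
        if not ok:
            continue  # keep the loop alive on transient capture errors
        # Preprocess: resize to the model's expected input and normalize.
        img = cv2.resize(frame, (224, 224)).astype(np.float32) / 255.0
        img = np.transpose(img, (2, 0, 1))[np.newaxis, ...]  # NCHW, batch of 1
        scores = session.run(None, {input_name: img})[0]
        label = int(np.argmax(scores))
        # Downstream, this prediction would drive an AGV, robotic arm,
        # or quality-control decision with low end-to-end latency.
        print(f"predicted class: {label}")
finally:
    cap.release()
```

The point of the loop structure is the "24/7" requirement: capture errors are tolerated rather than fatal, and every frame goes through the same bounded preprocess-infer-act path so latency stays predictable.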
Agentic AI moves to the desktop
On the client side, GIGABYTE is targeting the rise of Agentic AI with its AI TOP product line, including AI TOP ATOM, AI TOP 100 Z890, and AI TOP 500 TRX50 desktops. These systems are designed to run local LLMs and multimodal models, enabling inference, fine-tuning, and retrieval-augmented generation (RAG) without relying on cloud infrastructure — a pitch centered on privacy, security, and predictable costs.
The company is also introducing AI TOP Utility software to simplify local AI workflows, alongside laptops with the GiMATE AI companion and the AORUS RTX 5090 AI BOX, which uses Thunderbolt 5 to bring near-desktop AI performance to notebooks.
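For readers unfamiliar with what local inference looks like in practice, the sketch below runs a cloud-free chat completion with a retrieved document stuffed into the prompt, a crude version of the RAG pattern mentioned above. It uses the open-source llama-cpp-python bindings with a placeholder GGUF model purely for illustration; it is not GIGABYTE's AI TOP Utility or any workflow the company ships.

```python
# Minimal sketch of local, cloud-free LLM inference with a crude
# retrieval-augmented prompt. The model path and the "retrieved" snippet
# are placeholders, not part of GIGABYTE's AI TOP stack.
from llama_cpp import Llama

llm = Llama(
    model_path="./models/local-model.gguf",  # any quantized GGUF model
    n_ctx=4096,        # context window
    n_gpu_layers=-1,   # offload all layers to a local GPU if available
)

# In a real RAG pipeline this snippet would come from a vector-store lookup;
# here it is hard-coded to keep the example self-contained.
retrieved = "Internal policy: all inference must run on-premises."

response = llm.create_chat_completion(
    messages=[
        {"role": "system", "content": f"Answer using this context:\n{retrieved}"},
        {"role": "user", "content": "Where must our models run, and why?"},
    ],
    max_tokens=128,
)
print(response["choices"][0]["message"]["content"])
```

Because the model weights, the retrieved documents, and the generated answer never leave the machine, this is the privacy and predictable-cost argument GIGABYTE is making for on-device agentic AI.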
Why it matters
GIGABYTE’s CES 2026 strategy reflects a broader industry shift: AI is no longer confined to massive cloud data centers. It’s spreading across factories, warehouses, offices, and personal devices — and each layer requires different hardware trade-offs.
By covering AI factories, physical AI at the edge, and agentic AI on-device, GIGABYTE is betting that infrastructure vendors who can span the entire stack will be better positioned as enterprises move from AI experimentation to full-scale deployment.
GIGABYTE is exhibiting at LVCC North Hall, Booth #8519, throughout CES 2026.























