KR9298-E3 – AI Server with 8x AMD Instinct™ MI350X GPUs
The KR9298-E3 AI Server is a general-purpose heterogeneous computing platform designed for large language model (LLM) inference, AI cloud services, and scalable high-performance computing (HPC) workloads.
Equipped with dual-socket SP5 AMD EPYC™ 9005 Series processors (up to 500W each, air-cooled) and the latest AMD Instinct™ MI350X 8-GPU platform, the KR9298-E3 delivers exceptional compute density, ultra-high GPU memory (2.3TB HBM3E across eight GPUs), and industry-leading aggregate memory bandwidth of 64 TB/s. This enables fast training, inference, and high-throughput data processing for the most demanding AI workloads.
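As a rough way to confirm the aggregate GPU memory on a delivered system, the sketch below enumerates the visible GPUs and sums the memory they report. It assumes a ROCm build of PyTorch is installed on the node; the expected count of eight devices is an assumption about this configuration, and the exact figures reported depend on the driver and firmware.

```python
# Hedged sanity-check sketch: list the GPUs PyTorch (ROCm/HIP build) can see
# and total their reported memory. On an 8x MI350X node this is expected to
# show eight devices; actual names and sizes depend on the installed stack.
import torch

def report_gpu_memory() -> None:
    if not torch.cuda.is_available():  # the torch.cuda API also covers ROCm/HIP devices
        print("No ROCm/CUDA devices visible")
        return
    total_bytes = 0
    for i in range(torch.cuda.device_count()):
        props = torch.cuda.get_device_properties(i)
        total_bytes += props.total_memory
        print(f"GPU {i}: {props.name}, {props.total_memory / 1e9:.0f} GB")
    print(f"Aggregate GPU memory: {total_bytes / 1e12:.2f} TB")

if __name__ == "__main__":
    report_gpu_memory()
```

On a fully populated 8-GPU node the aggregate figure should land in the low-terabyte range; per-device values may differ slightly from the marketing capacity because a small amount of memory is reserved by the driver.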
AI & LLM Inference: Optimized for GPT-class models, generative AI, and multimodal AI
Cloud Computing Services: Scalable multi-tenant AI and HPC clusters
High-Performance Computing (HPC): Scientific research, simulations, and engineering workloads
Data-Intensive Workloads: Real-time analytics and GPU-accelerated big data processing
Price on request; prices are updated daily. Please check availability before ordering.