Understanding AI
Artificial intelligence (AI) refers to computer science techniques and statistical algorithms that simulate and augment human intelligence.
Featured
Red Hat Launches Red Hat AI Enterprise to Deliver a Unified AI Platform that Spans from Metal to Agents
Foundations of AI
What is machine learning?
What is deep learning?
What are foundation models for AI?
What are large language models?
SLMs vs LLMs: What are small language models?
What is AI inference?
AI infrastructure explained
What is an AI platform?
Types of AI
What is generative AI?
Predictive AI vs. generative AI
What is agentic AI?
Agentic AI vs. generative AI
Model enhancements
What is Mixture of Experts (MoE)?
What is retrieval-augmented generation?
RAG vs. fine-tuning
What is parameter-efficient fine-tuning (PEFT)?
LoRA vs. QLoRA
What is InstructLab?
What is vLLM?
What is Model Context Protocol (MCP)?
What is Model-as-a-Service?
What are Granite models?
AI at scale
What is AgentOps?
What is sovereign AI?
What is llm-d?
What is distributed inference?
What is enterprise AI?
What is edge AI?
What is MLOps?
What is LLMOps?
AIOps explained
What is AI security?
Understanding AI/ML use cases
What is AI in healthcare?
AI in banking
Understanding AI in telecommunications
Why choose Red Hat AI?
Build on a trusted foundation that supports any model and any agent on any hardware accelerator across the hybrid cloud. Red Hat AI gives organizations the freedom to deploy wherever their data, compliance, and cost requirements demand.
Inference
Manage model complexity with fast, efficient inference powered by vLLM and the control to run any model on any accelerator across the hybrid cloud.
Data
Customize domain-specific agentic AI use cases with models connected to your organization’s own private data.
Agents
Simplify and accelerate your journey to successful agentic AI adoption with governance and control.
Platform
Deploy resilient, trustworthy AI solutions on a foundation of open source transparency and hybrid cloud scalability.
Red Hat AI portfolio
Scale your AI foundation
- Customize models with control.
- Optimize resource allocation.
Optimize model performance
- Fast inference at scale.
- Powered by vLLM.
Build and deploy AI applications
- Manage the full AI lifecycle.
- Implement AI guardrails.
Run LLMs on an individual server
- Develop, test, and run gen AI.
- Fast, flexible inference.
AI customer stories from Red Hat Summit and AnsibleFest 2025
Turkish Airlines doubled its deployment speed with organization-wide data access.
JCCM improved the region's environmental impact assessment (EIA) processes using AI.
Denizbank cut time to market from days to minutes.
Hitachi operationalized AI across its entire business with Red Hat AI.