Build and deploy high-performance GPU applications with MAX
MAX provides powerful libraries and tools to develop, optimize, and deploy AI applications fast.
Why developers use MAX
Incredible Performance
MAX was built from the ground up to deliver out-of-the-box performance for AI workloads. See how we measure performance.
Hardware Portability
MAX provides portability across CPU and GPU generations and delivers high hardware utilization, driving real compute cost savings.
Complete Control
Optimize your model's performance, write custom ops, or build your own model. MAX gives you full control over every layer of the stack.

Deploy Gen AI in seconds with MAX
Develop custom GPU research with MAX
MAX for Research
Advanced tools and libraries give model, kernel, and hardware developers even more precise control. Capabilities include:
Build custom graphs
Control single- to multi-GPU scaling
Program heterogeneous compute
Write custom GPU code
Low-level host and device control

FREE for everyone
Paid support for scaled enterprise deployments
MAX Self Managed
Free forever: MAX is available free for everyone to self-manage
Incredible performance for LLMs, PyTorch, and ONNX models
Deploy MAX yourself on-prem or on any cloud provider
Community support through Discord and GitHub
MAX Enterprise
Pay as you go: support for the largest deployments your enterprise needs
SLA support with guaranteed response time.
Dedicated Slack channel and account manager.
Access to the world’s best AI engineering team.
What developers are saying about MAX
“I'm excited, you're excited, everyone is excited to see what's new in Mojo and MAX and the amazing achievements of the team at Modular.”
“Max installation on Mac M2 and running llama3 in (q6_k and q4_k) was a breeze! Thank you Modular team!”
“The Community is incredible and so supportive. It’s awesome to be part of.”
“I am focusing my time to help advance @Modular. I may be starting from scratch but I feel it’s what I need to do to contribute to #AI for the next generation.”
“What @modular is doing with Mojo and the MaxPlatform is a completely different ballgame.”
“Mojo and the MAX Graph API are the surest bet for longterm multi-arch future-substrate NN compilation”
“I'm very excited to see this coming together and what it represents, not just for MAX, but my hope for what it could also mean for the broader ecosystem that mojo could interact with.”
“I tried MAX builds last night, impressive indeed. I couldn't believe what I was seeing... performance is insane.”
“The more I benchmark, the more impressed I am with the MAX Engine.”
Start building with MAX
Easy ways to get started
Get started guide
With just a few commands, you can install MAX as a conda package and deploy a GenAI model on a local endpoint.
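The flow above can be sketched in a few commands. This is an illustrative sketch only: the conda channel, the `max serve` flags, the model name, and the endpoint port are assumptions based on typical usage, so check the official install guide for the current names.

```shell
# Illustrative sketch -- channel, flags, model name, and port are assumptions;
# consult Modular's install docs for the current commands.

# Install MAX as a conda package (channel URL assumed):
conda install -c conda-forge -c https://conda.modular.com/max max

# Serve a GenAI model on a local endpoint (model path assumed):
max serve --model-path modularai/Llama-3.1-8B-Instruct-GGUF

# Query the local OpenAI-compatible endpoint (port assumed):
curl http://localhost:8000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "modularai/Llama-3.1-8B-Instruct-GGUF",
       "messages": [{"role": "user", "content": "Hello!"}]}'
```

Because the endpoint is OpenAI-compatible, existing client libraries and tooling built against that API can typically point at the local server without code changes.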
400+ open source models
Browse Examples
Follow step-by-step recipes to build agents, chatbots, and more with MAX.