The 24.4 release introduced GGUF support and quantisation for models such as Llama3, along with many new features for Mojo. With that release behind us, Modular is now working hard towards the next one. To try out the new features, click the Nightly tab on the MAX installation guide. There's an active community in the #nightly Discord channel discussing upcoming features, and Modular staff are there to answer questions and respond to your feedback!
The latest MAX nightly release now includes an interactive GUI chat interface for Llama3; the commands to beta test it before the official release are here.
Blogs, Tutorials, and Videos
- Read about What's New in MAX 24.4.
- Chris Lattner talked about the broader vision for Mojo and MAX at the AI Engineer World’s Fair 2024.
- Billy and Walter talked about Mojo debugging: extending MLIR and LLDB.
- Weiwei dived deep into Efficient Data-Flow Analysis on Region-Based Control Flow in MLIR.
- Mehdi and Jeff explored how MLIR can be slow when generating LLVM IR, and how Mojo avoids the footguns.
- Santiago released an introduction to Mojo (for Python developers).
- Mike released a video: Mojo First Impression.
Awesome Mojo
You can discuss these projects with their creators in the #community channel on Discord.
- Frank built libjpeg-mojo: bindings to libjpeg from Mojo.
- Samay reinvigorated and updated arrow.mojo for the latest Mojo release.
- Kevin released mojonet: a PyTorch neural network wrapper to help transition existing projects.
Open Source Contributions
Check out all the open source contributions here. Any significant changes that made it into the changelog are here. Make sure to DM Jack Clayton on Discord if you've had your first PR merged, to claim some epic Mojo swag!
Here are the merged contributions this week from our valued community members: