The Modular platform (formerly known as MAX) is a comprehensive suite of AI libraries, tools, and technologies designed to unify fragmented AI deployment workflows. By providing developers with a single, powerful toolchain, it enables full programmability, exceptional performance, and seamless hardware portability, significantly speeding up the time to market for AI innovations.
Key Features
- End-to-End Pipelines: Ready-to-run pipelines for common AI workloads, such as self-hosting the Llama 3.1 text-generation model, with customizable and extensible code (see the request sketch after this list).
- Code Examples & Notebooks: The repository contains numerous examples and Jupyter notebooks demonstrating how to use the Modular platform for a range of AI tasks.
- Stable & Nightly Builds: Available in both stable and nightly releases, with clear instructions for installation and project setup using Magic (a setup sketch follows the Getting Started steps).
- MAX Engine & Containers: Includes the MAX Engine for high-performance model inference and the MAX container for convenient Docker-based deployment, with GPU support and Kubernetes integration (a deployment sketch also follows the Getting Started steps).
- Community & Support: Active Discord community and forums for discussions, bug reports, and feature requests.
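For the Llama 3.1 pipeline mentioned above, a minimal request sketch is shown below. It assumes the pipeline is already running locally behind MAX's OpenAI-compatible serving endpoint; the URL and model id are placeholders to adjust for your own deployment.

```bash
# Minimal sketch: query a self-hosted Llama 3.1 pipeline through an
# OpenAI-compatible chat completions route. The localhost URL and the
# model id are placeholders -- match them to your running server.
curl http://localhost:8000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "modularai/Llama-3.1-8B-Instruct-GGUF",
        "messages": [{"role": "user", "content": "Summarize MAX in one sentence."}],
        "max_tokens": 128
      }'
```

Because the endpoint follows the OpenAI API shape, existing OpenAI client libraries can typically be pointed at it by changing only the base URL.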
Getting Started
- Clone the Repository:

  ```bash
  git clone https://github.com/modular/modular.git
  ```

- Switch to the Stable Branch (if using the stable release):

  ```bash
  git checkout stable
  ```
- Explore Examples: Check out the `examples` and `tutorials` directories for practical guides and finished code.
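Beyond exploring the cloned repository, the project setup with Magic mentioned under Key Features might look roughly like the sketch below. The project name is arbitrary, and the exact commands and flags should be checked against the current Magic documentation.

```bash
# Rough sketch of creating a new project with Magic and adding MAX.
# Command names and flags follow the Magic docs at the time of writing;
# verify them against the current documentation before relying on them.
magic init my-max-project --format mojoproject   # create a new Mojo/MAX project
cd my-max-project
magic add max                                    # add the MAX package as a dependency
magic run mojo --version                         # confirm the toolchain is available
```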
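For the Docker-based deployment path, a GPU-backed run of the MAX container could look like the following. The image name, tag, and model argument are placeholders, so consult the MAX container documentation for the current values.

```bash
# Hypothetical invocation of the MAX container with GPU access, exposed
# on port 8000. Image name, tag, and the model argument are placeholders.
docker run --rm --gpus all -p 8000:8000 \
  modular/max-openai-api:latest \
  --model-path modularai/Llama-3.1-8B-Instruct-GGUF
```

The same image is the usual starting point for the Kubernetes integration mentioned under Key Features.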
Contribution & Licensing
- Contributions to the Mojo standard library are welcome (see Contribution Guide).
- Licensed under the Apache License v2.0 with LLVM Exceptions, with additional terms under the Modular, MAX & Mojo Community License.
For more details, visit the Modular documentation or join the community on Discord.