🚀 Our MPT-7B family of open-source models is trending on the Hugging Face Hub! Take a look at our blog post to learn more. 🚀

✨ We’ve just launched our Inference service. Learn more in our blog post

Making ML Training Efficient

Improve the efficiency of neural network training with algorithmic methods that deliver speed, boost quality, and reduce cost.

Rooted in Rigorous ML Research

The MosaicML Composer is an open-source deep learning library purpose-built to make it easy to add algorithmic methods and compose them into novel recipes that speed up model training and improve model quality. The library includes 20 methods for computer vision and natural language processing, in addition to standard models, datasets, and benchmarks.

See how
from composer import Trainer
from composer.algorithms import ChannelsLast, MixUp
from composer.models import ComposerResNet

trainer = Trainer(
  # Standard ResNet-50 with an ImageNet-sized classification head
  model=ComposerResNet("resnet50", num_classes=1000),
  train_dataloader=your_train_dataloader,
  eval_dataloader=your_eval_dataloader,
  max_duration='90ep',  # train for 90 epochs
  # Compose speedup and quality methods by listing them here
  algorithms=[
    ChannelsLast(),
    MixUp(alpha=0.1),
  ])

trainer.fit()
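To give a sense of what one of these methods does under the hood: MixUp regularizes training by blending pairs of examples and their labels with a weight drawn from a Beta(alpha, alpha) distribution. The sketch below is an illustrative, dependency-free version of that idea, not Composer's actual implementation; the `mixup` helper and its example inputs are hypothetical.

```python
import random

def mixup(example_a, example_b, alpha=0.1):
    """Blend two (features, one-hot label) pairs with a Beta(alpha, alpha) weight.

    Illustrative sketch of the MixUp idea; real implementations operate on
    batched tensors inside the training loop.
    """
    lam = random.betavariate(alpha, alpha)  # mixing coefficient in [0, 1]
    xa, ya = example_a
    xb, yb = example_b
    mixed_x = [lam * a + (1 - lam) * b for a, b in zip(xa, xb)]
    mixed_y = [lam * a + (1 - lam) * b for a, b in zip(ya, yb)]
    return mixed_x, mixed_y
```

With a small alpha such as 0.1, the Beta distribution concentrates near 0 and 1, so most mixed examples stay close to one of the two originals.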
Get started
$ pip install mosaicml

Better Algorithms

We modify the training algorithm to improve training speed and model quality.

Improved Quality

Improve accuracy, perplexity, and the other metrics that matter to you.

Lower Cost

Find the most cost-effective way to run your training.

Explore Tradeoffs

Identify the most cost-effective ways to run training workloads across clouds and hardware backends, for a variety of models and datasets.

As a Community

Let's develop the best solutions to the most challenging problems in ML today.

We want our community to be a safe and inclusive space for all current and future ML practitioners. Learn more in our Community Guidelines and Code of Conduct.