MosaicML Releases Open-Source MPT-30B LLMs, Trained on H100s to Power Generative AI Applications

MosaicML announced the availability of MPT-30B Base, Instruct, and Chat, the most advanced models in their MPT (MosaicML Pretrained Transformer) series of open-source large language models. These state-of-the-art models were trained with an 8k token context window, surpass the quality of the original GPT-3, and can be used directly for inference or as starting points for building proprietary models.
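For readers who want to try the models directly, a minimal sketch of loading one of the MPT-30B checkpoints for inference with the Hugging Face `transformers` library is shown below; the model IDs follow MosaicML's public Hub releases, but the exact keyword arguments (such as dtype and device placement) should be checked against the model card for your hardware.

```python
# Minimal sketch: running MPT-30B inference via Hugging Face transformers.
# Assumes the public Hub repos "mosaicml/mpt-30b", "mosaicml/mpt-30b-instruct",
# and "mosaicml/mpt-30b-chat"; check the model card for recommended settings.
import torch
import transformers

name = "mosaicml/mpt-30b"  # or "mosaicml/mpt-30b-instruct" / "mosaicml/mpt-30b-chat"

model = transformers.AutoModelForCausalLM.from_pretrained(
    name,
    torch_dtype=torch.bfloat16,  # 30B parameters; bf16 roughly halves memory vs fp32
    trust_remote_code=True,      # MPT ships custom modeling code on the Hub
)
tokenizer = transformers.AutoTokenizer.from_pretrained(name)

inputs = tokenizer("MosaicML is", return_tensors="pt")
with torch.no_grad():
    outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```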
