Grok-1

Open source release of xAI's LLM

2024-03-18

These are the base model weights and network architecture of Grok-1, xAI's large language model. Grok-1 is a 314-billion-parameter Mixture-of-Experts model trained from scratch by xAI.
Released openly, Grok-1 gives developers a large-scale foundation for natural language processing, and its open availability supports collaboration and customization in AI research and development.
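In a Mixture-of-Experts model like Grok-1, a learned router sends each token through only a small subset of the expert sub-networks, so the full 314B parameters are never all active at once. The following NumPy sketch illustrates the general top-k routing idea; the layer sizes and function names are toy assumptions for illustration, not xAI's actual implementation.

```python
import numpy as np

def moe_layer(x, experts_w, gate_w, top_k=2):
    """Illustrative Mixture-of-Experts forward pass for one token.

    x:         (d_model,) token embedding
    experts_w: (n_experts, d_model, d_model) per-expert weight matrices
    gate_w:    (d_model, n_experts) router weights
    """
    logits = x @ gate_w                  # router score for each expert
    top = np.argsort(logits)[-top_k:]    # indices of the top-k experts
    weights = np.exp(logits[top])
    weights /= weights.sum()             # softmax over the selected experts
    # Only the chosen experts run; their outputs are mixed by router weight.
    return sum(w * (x @ experts_w[i]) for w, i in zip(weights, top))

rng = np.random.default_rng(0)
d_model, n_experts = 16, 8               # toy sizes, not Grok-1's dimensions
x = rng.normal(size=d_model)
out = moe_layer(x,
                rng.normal(size=(n_experts, d_model, d_model)) * 0.1,
                rng.normal(size=(d_model, n_experts)))
print(out.shape)
```

The key property shown here is sparsity: with `top_k=2`, only 2 of the 8 expert matrices are multiplied per token, which is what makes very large total parameter counts tractable at inference time.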