- Baichuan Intelligence, started by Sogou founder Wang Xiaochuan, has unveiled its large language model, Baichuan-13B.
- The 13-billion-parameter open-source model is trained on 1.4 trillion tokens of Chinese and English data, exceeding the 1 trillion tokens Meta used to train its comparable LLaMA model.
- As China readies strict AI regulations, Baichuan-13B offers a free model, available for commercial use upon approval, that can run on a range of consumer-grade hardware despite U.S. AI chip sanctions (see the sketch below).
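
For context on the consumer-hardware point, here is a minimal sketch of how a 13B-class checkpoint might be loaded with 8-bit quantization via Hugging Face transformers so it fits on a single consumer GPU. The `baichuan-inc/Baichuan-13B-Chat` repo ID, the bitsandbytes-based settings, and the prompt are assumptions for illustration, not details drawn from the article.

```python
# Sketch: load a 13B-class model in 8-bit to reduce GPU memory needs.
# Assumes transformers, accelerate, and bitsandbytes are installed.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "baichuan-inc/Baichuan-13B-Chat"  # assumed Hugging Face repo name

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    load_in_8bit=True,    # 8-bit weights roughly halve memory vs. fp16
    device_map="auto",    # spread layers across available GPU/CPU memory
    trust_remote_code=True,
)

# Simple generation check once the model is loaded.
inputs = tokenizer("Write a short poem about the sea.", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```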