About

Introduction

This project is inspired by LMSYS.

About LLMSYS

The development of large-scale pre-trained models is booming, with new large models emerging every two to three days.

The main objectives of this project are:

  • Summarize and address the engineering challenges and training techniques encountered when training large-scale pre-trained models.

  • Open-source and maintain a set of high-quality, reusable software tools related to LLM training and inference.

  • Share the state-of-the-art open-source LLMs.

About the Maintainer

The current maintainer is Fang Taosong, an incoming computer science master’s student at the University of Chinese Academy of Sciences. His research focuses on LLM training, inference acceleration, and…

Contact email: fangtaosong2022@iscas.ac.cn