DeepSeekMoE: Towards Ultimate Expert Specialization in Mixture-of-Experts Language Models

DeepSeek-MoE

9. License

This code repository is licensed under the MIT License. The use of DeepSeek-MoE models is subject to the Model License. DeepSeek-MoE supports commercial use.

See the LICENSE-CODE and LICENSE-MODEL for more details.