DeepSeekMoE: Towards Ultimate Expert Specialization in Mixture-of-Experts Language Models

DeepSeek-MoE

9. License

This code repository is licensed under the MIT License. The use of DeepSeek models is subject to the Model License. DeepSeek supports commercial use.

See the LICENSE-CODE and LICENSE-MODEL files for more details.