diff --git a/README.md b/README.md
index 8979d31..59b93c5 100644
--- a/README.md
+++ b/README.md
@@ -144,6 +144,48 @@ inputs = tokenizer.encode(input_text, return_tensors="pt").to(device)
 outputs = model.generate(**inputs, max_length=128)
 print(tokenizer.decode(outputs[0]))
 ```
+
+---
+In the following scenario, the DeepSeek Coder model completes a function for model training and evaluation by calling the **IrisClassifier** class and its member function defined in `model.py`, along with helper functions from `utils.py`.
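+
+A minimal sketch of that setup is shown below. The file contents, the member-function name `train_model`, and the helper `accuracy` are illustrative assumptions about the demo project, not the actual files it ships with; the sketch only shows the kind of cross-file context the model completes against.
+
+```python
+import torch
+import torch.nn as nn
+
+# --- model.py: the class the completion draws on (hypothetical contents) ---
+class IrisClassifier(nn.Module):
+    def __init__(self, input_dim=4, num_classes=3):
+        super().__init__()
+        self.net = nn.Sequential(
+            nn.Linear(input_dim, 16), nn.ReLU(), nn.Linear(16, num_classes)
+        )
+
+    def forward(self, x):
+        return self.net(x)
+
+    def train_model(self, X, y, epochs=50, lr=1e-2):
+        # hypothetical member function that the completed code is expected to call
+        optimizer = torch.optim.Adam(self.parameters(), lr=lr)
+        loss_fn = nn.CrossEntropyLoss()
+        for _ in range(epochs):
+            optimizer.zero_grad()
+            loss_fn(self(X), y).backward()
+            optimizer.step()
+
+# --- utils.py: a helper the completion is expected to reuse (hypothetical) ---
+def accuracy(logits, y):
+    return (logits.argmax(dim=1) == y).float().mean().item()
+
+# --- main.py: the function the model is asked to complete ---
+def train_and_evaluate(X_train, y_train, X_test, y_test):
+    clf = IrisClassifier()
+    clf.train_model(X_train, y_train)
+    with torch.no_grad():
+        return accuracy(clf(X_test), y_test)
+```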
+
 ![Completion GIF](pictures/completion_demo.gif)
 
 #### Chat Model Inference