resolving conflicts

Isaac Dyor 2025-01-22 17:29:35 -08:00
commit ec8daf9e30
12 changed files with 329 additions and 3 deletions

README.md

@@ -38,6 +38,11 @@ English/[简体中文](https://github.com/deepseek-ai/awesome-deepseek-integrati
<td> <a href="https://github.com/deepseek-ai/awesome-deepseek-integration/blob/main/docs/pal/README.md"> Pal - AI Chat Client<br/>(iOS, ipadOS) </a> </td>
<td> Pal is a customized chat playground on iOS </td>
</tr>
<tr>
<td> <img src="https://www.librechat.ai/librechat.svg" alt="LibreChat" width="64" height="auto" /> </td>
<td> <a href="https://www.librechat.ai/docs/configuration/librechat_yaml/ai_endpoints/deepseek">LibreChat</a> </td>
<td> LibreChat is a customizable open-source app that seamlessly integrates DeepSeek for enhanced AI interactions </td>
</tr>
<tr>
<td> <img src="https://raw.githubusercontent.com/rss-translator/RSS-Translator/main/core/static/favicon.ico" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://github.com/deepseek-ai/awesome-deepseek-integration/blob/main/docs/rss_translator/README.md"> RSS Translator </a> </td>
@@ -88,6 +93,16 @@ English/[简体中文](https://github.com/deepseek-ai/awesome-deepseek-integrati
<td> <a href="https://github.com/deepseek-php/deepseek-laravel/blob/master/README.md">Laravel Integration</a> </td>
<td> A Laravel wrapper for the DeepSeek PHP client, providing seamless DeepSeek API integration with Laravel applications.</td>
</tr>
<tr>
<td> <img src="./docs/zotero/assets/zotero-icon.png" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="docs/zotero/README_cn.md">Zotero</a></td>
<td> <a href="https://www.zotero.org">Zotero</a> is a free, easy-to-use tool to help you collect, organize, annotate, cite, and share research.</td>
</tr>
<tr>
<td> <img src="./docs/Siyuan/assets/image-20250122162731-7wkftbw.png" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="docs/Siyuan/README.md">SiYuan</a> </td>
<td> SiYuan is a privacy-first personal knowledge management system that supports complete offline usage, as well as end-to-end encrypted data sync.</td>
</tr>
<tr>
<td> <img src="https://avatars.githubusercontent.com/u/102771702?s=200&v=4" alt="Wordware" width="64" height="auto" /> </td>
<td> <a href="docs/wordware/README.md">Wordware Integration</a> </td>
@@ -174,6 +189,11 @@ English/[简体中文](https://github.com/deepseek-ai/awesome-deepseek-integrati
<td> <a href="docs/llm.nvim/README.md"> llm.nvim </a> </td>
<td> A free large language model (LLM) plugin that lets you interact with LLMs in Neovim. Supports any LLM, such as DeepSeek, GPT, GLM, Kimi, or local LLMs (such as ollama). </td>
</tr>
<tr>
<td> <img src="https://github.com/user-attachments/assets/d66dfc62-8e69-4b00-8549-d0158e48e2e0" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="docs/codecompanion.nvim/README.md"> codecompanion.nvim </a> </td>
<td> AI-powered coding, seamlessly in Neovim. </td>
</tr>
</table>
### JetBrains Extensions
@@ -229,4 +249,9 @@ English/[简体中文](https://github.com/deepseek-ai/awesome-deepseek-integrati
<td> <a href="https://github.com/rubickecho/n8n-deepseek"> n8n-nodes-deepseek </a> </td>
<td> An n8n community node that integrates the DeepSeek API directly into workflows. </td>
</tr>
<tr>
<td> <img src="https://framerusercontent.com/images/8rF2JOaZ8l9AvM4H6ezliw44aI.png" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="https://github.com/BerriAI/litellm"> LiteLLM </a> </td>
<td> A Python SDK and proxy server (LLM Gateway) for calling 100+ LLM APIs in the OpenAI format, including DeepSeek, with cost tracking. </td>
</tr>
</table>

README_cn.md

@@ -38,6 +38,11 @@
<td> <a href="https://github.com/deepseek-ai/awesome-deepseek-integration/blob/main/docs/pal/README_cn.md"> Pal - AI Chat Client<br/>(iOS, ipadOS) </a> </td>
<td> 一款可以在 iPhone 或 iPad 上使用的 AI 助手 </td>
</tr>
<tr>
<td> <img src="https://www.librechat.ai/librechat.svg" alt="LibreChat" width="64" height="auto" /> </td>
<td> <a href="https://www.librechat.ai/docs/configuration/librechat_yaml/ai_endpoints/deepseek">LibreChat</a> </td>
<td> LibreChat 是一个可定制的开源应用程序,无缝集成了 DeepSeek,以增强人工智能交互体验 </td>
</tr>
<tr>
<td> <img src="https://raw.githubusercontent.com/rss-translator/RSS-Translator/main/core/static/favicon.ico" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="hhttps://github.com/deepseek-ai/awesome-deepseek-integration/blob/main/docs/rss_translator/README_cn.md"> RSS翻译器 </a> </td>
@@ -72,6 +77,16 @@
<td> <a href="docs/raycast/README_cn.md">Raycast</a></td>
<td> <a href="https://raycast.com/?via=ViGeng">Raycast</a> 是一款 macOS 生产力工具,它允许你用几个按键来控制你的工具。它支持各种扩展,包括 DeepSeek AI。</td>
</tr>
<tr>
<td> <img src="./docs/zotero/assets/zotero-icon.png" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="docs/zotero/README_cn.md">Zotero</a></td>
<td> <a href="https://www.zotero.org">Zotero</a> 是一款免费且易于使用的文献管理工具,旨在帮助您收集、整理、注释、引用和分享研究成果。</td>
</tr>
<tr>
<td> <img src="./docs/Siyuan/assets/image-20250122162731-7wkftbw.png" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="docs/Siyuan/README_cn.md">思源笔记</a> </td>
<td> 思源笔记是一款隐私优先的个人知识管理系统,支持完全离线使用,并提供端到端加密的数据同步功能。</td>
</tr>
<tr>
<td> <img src="https://avatars.githubusercontent.com/u/102771702?s=200&v=4" alt="Wordware" width="64" height="auto" /> </td>
<td> <a href="https://github.com/deepseek-php/deepseek-wordware/blob/master/README.md">Wordware </a> </td>
@@ -157,6 +172,11 @@
<td> <a href="docs/llm.nvim/README.md"> llm.nvim </a> </td>
<td> 免费的大语言模型插件,让你在 Neovim 中与大模型交互。支持任意一款大模型,比如 Deepseek、GPT、GLM、Kimi,或者本地运行的大模型(比如 ollama)。 </td>
</tr>
<tr>
<td> <img src="https://github.com/user-attachments/assets/d66dfc62-8e69-4b00-8549-d0158e48e2e0" alt="Icon" width="64" height="auto" /> </td>
<td> <a href="docs/codecompanion.nvim/README.md"> codecompanion.nvim </a> </td>
<td> AI 驱动的编码,在 Neovim 中无缝集成。 </td>
</tr>
</table>
### JetBrains 插件

docs/Siyuan/README.md Normal file

@@ -0,0 +1,50 @@
# README
![image](assets/image-20250122162731-7wkftbw.png)
---
SiYuan is a privacy-first personal knowledge management system that supports complete offline usage, as well as end-to-end encrypted data sync.
Fuse blocks, outlines, and bidirectional links to refactor your thinking.
## STEP 1
Apply for an API key on the [DeepSeek Open Platform](https://platform.deepseek.com/).
## STEP 2
Open the Settings interface
![image](assets/image-20250122163007-hkuruoe.png)
## STEP 3
Navigate to the "AI" tab in Settings and configure the following:
* Enter your DeepSeek API key in the text field
* Configure the endpoint URL and model details:
  * URI: `https://api.deepseek.com/v1/`
  * Model: `deepseek-chat`
  * Temperature: 1.3
![image](assets/image-20250122162241-32a4oma.png)
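If you want to confirm that the key, URI, and model from STEP 3 respond correctly before relying on them inside SiYuan, the request below exercises the same endpoint. This is a minimal sketch, not part of SiYuan: it assumes the `luasocket`/`luasec` and `lua-cjson` rocks are installed and that `DEEPSEEK_API_KEY` is exported in your shell.
```lua
-- Sanity-check the DeepSeek endpoint configured in STEP 3.
-- Assumptions: luasocket/luasec and lua-cjson are installed,
-- and DEEPSEEK_API_KEY is exported in the environment.
local https = require("ssl.https")
local ltn12 = require("ltn12")
local cjson = require("cjson")

local api_key = os.getenv("DEEPSEEK_API_KEY") or "YOUR_API_KEY"
local payload = cjson.encode({
  model = "deepseek-chat",
  temperature = 1.3,
  messages = { { role = "user", content = "Reply with the single word: pong" } },
})

local chunks = {}
local _, code = https.request({
  url = "https://api.deepseek.com/v1/chat/completions",
  method = "POST",
  headers = {
    ["Content-Type"] = "application/json",
    ["Content-Length"] = tostring(#payload),
    ["Authorization"] = "Bearer " .. api_key,
  },
  source = ltn12.source.string(payload),
  sink = ltn12.sink.table(chunks),
})

-- An HTTP 200 and a JSON completion mean the key, URI and model name are valid.
print(code, table.concat(chunks))
```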
## How to use
![image](assets/image-20250122162425-wlsgw0u.png)

docs/Siyuan/README_cn.md Normal file

@@ -0,0 +1,36 @@
# README_cn
![image](assets/image-20250122162731-7wkftbw.png)
---
思源笔记是一款注重隐私的个人知识管理系统,支持完全离线使用,并提供端到端加密的数据同步功能。通过融合块、大纲和双向链接,重构你的思维方式。
## 第一步
从[深度求索开放平台](https://platform.deepseek.com/)申请一个令牌。
## 第二步
进入设置界面
![image](assets/image-20250122163007-hkuruoe.png)
## 第三步
在设置中导航到“AI”选项卡,并进行以下配置:
* 在文本框中输入你的 DeepSeek API 密钥
* 配置端点 URL 和模型详细信息:
  * URI: `https://api.deepseek.com/v1/`
  * 模型: `deepseek-chat`
  * 温度: 1.3
![image](assets/image-20250122162241-32a4oma.png)
## 如何使用
![image](assets/image-20250122162425-wlsgw0u.png)

4 binary image files added (contents not shown); sizes: 75 KiB, 105 KiB, 1.6 KiB, 101 KiB.

docs/codecompanion.nvim/README.md Normal file

@@ -0,0 +1,97 @@
# [codecompanion.nvim](https://github.com/olimorris/codecompanion.nvim)
> AI-powered coding, seamlessly in Neovim
**codecompanion.nvim** is a productivity tool that streamlines how you develop with LLMs in Neovim.
## Features
- :speech_balloon: [Copilot Chat](https://github.com/features/copilot) meets [Zed AI](https://zed.dev/blog/zed-ai), in Neovim
- :electric_plug: Support for Anthropic, Copilot, Gemini, Ollama, OpenAI, Azure OpenAI, HuggingFace and xAI LLMs (or bring your own!)
- :rocket: Inline transformations, code creation and refactoring
- :robot: Variables, Slash Commands, Agents/Tools and Workflows to improve LLM output
- :sparkles: Built-in prompt library for common tasks like advice on LSP errors and code explanations
- :building_construction: Create your own custom prompts, Variables and Slash Commands
- :books: Have multiple chats open at the same time
- :muscle: Async execution for fast performance
## Installation
First, navigate to your Neovim configuration folder (the default on Linux is `~/.config/nvim`).
### Install via `lazy.nvim`
Then go to the `lua/plugins` folder, create a file named `init.lua`, and add the following content:
```lua
return {
  "olimorris/codecompanion.nvim",
  dependencies = {
    "nvim-lua/plenary.nvim",
    "nvim-treesitter/nvim-treesitter",
  },
  config = function()
    require("codecompanion").setup({
      adapters = {
        -- Register a "deepseek" adapter built on the OpenAI-compatible adapter
        deepseek = function()
          return require("codecompanion.adapters").extend("openai_compatible", {
            env = {
              url = "https://api.deepseek.com",
              api_key = "YOUR_API_KEY",
            },
          })
        end,
      },
      -- Route chat, inline edits and agent workflows through the adapter above
      strategies = {
        chat = { adapter = "deepseek" },
        inline = { adapter = "deepseek" },
        agent = { adapter = "deepseek" },
      },
    })
  end
}
```
Restart nvim, and `lazy.nvim` should automatically download and install the codecompanion.nvim plugin and its dependencies based on the above file.
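The spec above hardcodes the API key. If you would rather keep the key out of your dotfiles, plain Lua can read it from the environment; the snippet below is a drop-in replacement for the `deepseek = function() ... end` entry in the `adapters` table above. This is a sketch that assumes the adapter accepts a literal key string exactly as in the original example, and that `DEEPSEEK_API_KEY` is exported in the shell that launches Neovim.
```lua
-- Drop-in replacement for the `deepseek` entry in the adapters table above.
-- Assumes DEEPSEEK_API_KEY is exported before Neovim starts.
deepseek = function()
  return require("codecompanion.adapters").extend("openai_compatible", {
    env = {
      url = "https://api.deepseek.com",
      api_key = os.getenv("DEEPSEEK_API_KEY"),
    },
  })
end,
```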
### Install via `mini.deps`
Add the following content to your `init.lua`:
```lua
local add, later = MiniDeps.add, MiniDeps.later

later(function()
  add({
    source = "olimorris/codecompanion.nvim",
    depends = {
      "nvim-lua/plenary.nvim",
      "nvim-treesitter/nvim-treesitter",
    },
  })
  require("codecompanion").setup({
    adapters = {
      -- Register a "deepseek" adapter built on the OpenAI-compatible adapter
      deepseek = function()
        return require("codecompanion.adapters").extend("openai_compatible", {
          env = {
            url = "https://api.deepseek.com",
            api_key = "YOUR_API_KEY",
          },
        })
      end,
    },
    -- Route chat, inline edits and agent workflows through the adapter above
    strategies = {
      chat = { adapter = "deepseek" },
      inline = { adapter = "deepseek" },
      agent = { adapter = "deepseek" },
    },
  })
end)
```
Restart nvim, and `mini.deps` should also automatically download and install the codecompanion.nvim plugin.
### Other Installation Methods
https://codecompanion.olimorris.dev/installation.html
## Usage
https://codecompanion.olimorris.dev/usage/introduction.html
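As a quick start after installation, you may want mappings for the chat buffer and the action palette. The sketch below is optional; the command names are taken from the codecompanion documentation linked above and may differ between plugin versions.
```lua
-- Optional keymaps for codecompanion.nvim (command names assumed from its docs).
vim.keymap.set({ "n", "v" }, "<leader>cc", "<cmd>CodeCompanionChat Toggle<cr>", { desc = "Toggle CodeCompanion chat" })
vim.keymap.set({ "n", "v" }, "<leader>ca", "<cmd>CodeCompanionActions<cr>", { desc = "CodeCompanion action palette" })
```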

docs/codecompanion.nvim/README_cn.md Normal file

@@ -0,0 +1,98 @@
# [codecompanion.nvim](https://github.com/olimorris/codecompanion.nvim)
> AI 驱动的编码,在 Neovim 中无缝集成
**codecompanion.nvim** 是一个提高生产力的工具,它简化了在 Neovim 中使用大型语言模型(LLM)进行开发的方式。
## 特性
- :speech_balloon: 在 Neovim 中,[Copilot Chat](https://github.com/features/copilot) 与 [Zed AI](https://zed.dev/blog/zed-ai) 的结合
- :electric_plug: 支持 Anthropic、Copilot、Gemini、Ollama、OpenAI、Azure OpenAI、HuggingFace 和 xAI LLM(或自定义 LLM)
- :rocket: 内联变换、代码生成与重构
- :robot: 通过变量、斜杠命令、代理/工具和工作流改善 LLM 输出
- :sparkles: 内置常用任务的提示词,例如 LSP 错误的建议和代码解释
- :building_construction: 创建自定义提示、变量和斜杠命令
- :books: 同时打开多个对话
- :muscle: 异步执行,提供快速的性能
## 安装
首先,导航到 Neovim 配置文件夹(默认情况下,Linux 上的路径是 `~/.config/nvim`)。
### 通过 `lazy.nvim` 安装
然后进入 `lua/plugins` 文件夹。创建一个名为 `init.lua` 的文件,并添加以下内容:
```lua
return {
  "olimorris/codecompanion.nvim",
  dependencies = {
    "nvim-lua/plenary.nvim",
    "nvim-treesitter/nvim-treesitter",
  },
  config = function()
    require("codecompanion").setup({
      adapters = {
        deepseek = function()
          return require("codecompanion.adapters").extend("openai_compatible", {
            env = {
              url = "https://api.deepseek.com",
              api_key = "YOUR_API_KEY",
            },
          })
        end,
      },
      strategies = {
        chat = { adapter = "deepseek" },
        inline = { adapter = "deepseek" },
        agent = { adapter = "deepseek" },
      },
    })
  end
}
```
重新启动 Neovim,`lazy.nvim` 应该会自动下载并安装 `codecompanion.nvim` 插件及其依赖项。
### 通过 `mini.deps` 安装
将以下内容添加到你的 `init.lua` 中:
```lua
local add, later = MiniDeps.add, MiniDeps.later

later(function()
  add({
    source = "olimorris/codecompanion.nvim",
    depends = {
      "nvim-lua/plenary.nvim",
      "nvim-treesitter/nvim-treesitter",
    },
  })
  require("codecompanion").setup({
    adapters = {
      deepseek = function()
        return require("codecompanion.adapters").extend("openai_compatible", {
          env = {
            url = "https://api.deepseek.com",
            api_key = "YOUR_API_KEY",
          },
        })
      end,
    },
    strategies = {
      chat = { adapter = "deepseek" },
      inline = { adapter = "deepseek" },
      agent = { adapter = "deepseek" },
    },
  })
end)
```
重新启动 Neovim,`mini.deps` 应该会自动下载并安装 `codecompanion.nvim` 插件。
### 其他安装方法
https://codecompanion.olimorris.dev/installation.html
## 使用
https://codecompanion.olimorris.dev/usage/introduction.html

docs/llm.nvim/README.md

@@ -23,7 +23,7 @@ return {
{
"Kurama622/llm.nvim",
dependencies = { "nvim-lua/plenary.nvim", "MunifTanjim/nui.nvim" },
cmd = { "LLMSesionToggle", "LLMSelectedTextHandler", "LLMAppHandler" },
cmd = { "LLMSessionToggle", "LLMSelectedTextHandler", "LLMAppHandler" },
config = function()
require("llm").setup({
url = "https://api.deepseek.com/chat/completions",

docs/llm.nvim/README_cn.md

@@ -23,7 +23,7 @@ return {
{
"Kurama622/llm.nvim",
dependencies = { "nvim-lua/plenary.nvim", "MunifTanjim/nui.nvim" },
cmd = { "LLMSesionToggle", "LLMSelectedTextHandler", "LLMAppHandler" },
cmd = { "LLMSessionToggle", "LLMSelectedTextHandler", "LLMAppHandler" },
config = function()
require("llm").setup({
url = "https://api.deepseek.com/chat/completions",