CodeWave Documentation API Collection
  • Third-party APIs
  • Platform APIs
    • Get documentation path mapping in the IDE (external SaaS)
      POST
    • Get documentation path mapping in the IDE (dogfood)
      POST
    • Log in to the platform with account and password
      POST
    • Get H5 component list
      GET
    • Get PC component list
      GET
  • DocBot
    • Triggered after a git push
      POST
    • Custom documentation bot
      POST
  • For documentation testing
    • Call a third-party API in logic
      GET
    • Test a platform application's API
      GET
    • Python API: return raw JSON
      POST
    • Low-code application API
      POST
    • Python API: return raw number
      POST
    • Python API: return raw list
      POST
    • Python API: return raw body
      POST
    • Python API: streaming
      POST
    • Python API: self-signed certificate
      POST
    • Custom: weather query
      POST
  • Documentation Center APIs
    • Query a document by ID
  • Live-link service
    • Live-link redirect
  • Company APIs
    • gitlab
      • Get the file list of a folder in a project
  • GitHub APIs
    • Get user info
    • Get token
  • DingTalk APIs
    • Get a user's contact-directory profile
    • Get user token
    • Get department list
    • Get the accessToken of an internal enterprise app
    • Get department details
  • Feishu
    • Get the authorization code for authorized login
    • Get tenant_access_token for a custom app
    • Get user_access_token
    • Get logged-in user info
    • Get app_access_token for a custom app
  • AI APIs
    • Self-hosted Kimi
    • Kimi official API
    • Aggregated AI (cheap)
    • Alibaba Cloud Bailian
    • Other relays
    • iFlytek Spark
    • 01.AI (Lingyi Wanwu)
    • Tongyi Qianwen (Qwen)
    • ByteDance: Doubao
    • ByteDance: Coze
    • Zhipu Qingyan
    • DeepSeek
    • Coze speech synthesis
    • LM Studio
  • Alibaba Cloud APIs
    • ID card verification
  • Baidu Cloud APIs
    • Get Access_token
    • Bank card recognition
  • Baichuan AI
  • Key-free APIs
    • IP address lookup
    • Mobile number location lookup
    • NetEase Cloud Music random hot-comment API
  • Documentation scripts
    • Test Tencent Cloud function
    • Test calling local port forwarding
  • Cloud functions
    • Tencent Cloud Functions
      • Test dynamic path
  • Documentation bridge project
    • Get uploaded documentation images
  • IP lookup
    • aiqimao.com
    • tianapi.com
  • Local scripts
    • Online test
  • Paopao Docs
    • Get document content by document ID
  • Coze
    • Upload file
  • Paopao bot
    • Authentication
      • Get access token (accessToken)
    • Server-side API
      • Messages
        • Send user message
  • Coze APIs
    • Start a conversation
  • dify APIs
    • Plugins
    • Conversation
      POST
    • CoreAgent assets
      POST
    • dify assets
      POST
    • CA: get plugin tag list
      POST
    • Parse plugin file info
      POST
    • CA: upload plugin
      POST
    • CA: install plugin
      POST
  1. dify APIs

dify assets

In development
POST
https://marketplace.dify.ai/api/v1/plugins/search/advanced

Request Parameters

Header parameters
content-type    string    optional    Example: application/json
pragma          string    optional    Example: no-cache
priority        string    optional    Example: u=1, i
Body parameters (application/json)
page          integer    required
page_size     integer    required
query         string     required
sort_by       string     required
sort_order    string     required
category      string     required
tags          array      required
type          string     required
Example
{
    "page": 1,
    "page_size": 999,
    "query": "",
    "sort_by": "install_count",
    "sort_order": "DESC",
    "tags": [],
    "type": "plugin"
}
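
For orientation, the request body above can be modeled as a typed structure. The sketch below is illustrative only: the field names and types come from the parameter list and example above, while the name PluginSearchRequest and the Python typing are our own additions, not part of the dify marketplace API.

from typing import List, TypedDict

class PluginSearchRequest(TypedDict):
    # Hypothetical model of the advanced-search request body documented above.
    page: int          # 1-based page index
    page_size: int     # number of results per page
    query: str         # free-text search; "" matches everything in the example
    sort_by: str       # e.g. "install_count", as in the example
    sort_order: str    # "DESC" in the example; "ASC" is assumed to be accepted
    category: str      # e.g. "model"; listed as required, though the example omits it
    tags: List[str]    # tag filter; entry type assumed to be string
    type: str          # e.g. "plugin"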

Sample Code

Request example (Shell)
curl --location --request POST 'https://marketplace.dify.ai/api/v1/plugins/search/advanced' \
--header 'Content-Type: application/json' \
--data-raw '{
    "page": 1,
    "page_size": 999,
    "query": "",
    "sort_by": "install_count",
    "sort_order": "DESC",
    "tags": [],
    "type": "plugin"
}'
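
The same request can be made from Python. This is a minimal sketch using the requests library, mirroring the Shell example above; it assumes the endpoint accepts unauthenticated POSTs with a JSON body, and the response fields it reads (code, msg, data.total) are taken from the schema documented below.

import requests

URL = "https://marketplace.dify.ai/api/v1/plugins/search/advanced"

payload = {
    "page": 1,
    "page_size": 999,
    "query": "",
    "sort_by": "install_count",
    "sort_order": "DESC",
    "tags": [],
    "type": "plugin",
}

# Send the search request as a JSON body, matching the curl example above.
resp = requests.post(URL, json=payload, timeout=30)
resp.raise_for_status()

result = resp.json()
# Top-level fields per the response schema below: code, data, msg.
print(result["code"], result["msg"], result["data"]["total"])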

Response

🟢 200 Success
application/json
Body
code            integer                 required
data            object                  required
data.plugins    array [object {28}]     required
data.total      integer                 required
msg             string                  required
Example
{"code":0,"data":{"plugins":[{"agent_strategy":{},"badges":[],"brief":{"en_US":"Ollama"},"category":"model","created_at":"2024-12-04T09:46:32Z","endpoint":{},"icon":"langgenius/packages/ollama/_assets/icon_s_en.svg","index_id":"langgenius___ollama","install_count":129728,"introduction":"## Overview\n\nOllama is a cross-platform inference framework client (MacOS, Windows, Linux) designed for seamless deployment of large language models (LLMs) such as Llama 2, Mistral, Llava, and more. With its one-click setup, Ollama enables local execution of LLMs, providing enhanced data privacy and security by keeping your data on your own machine.\n\nDify supports integrating LLM and Text Embedding capabilities of large language models deployed with Ollama.\n\n## Configure\n\n#### 1. Download Ollama\nVisit [Ollama download page](https://ollama.com/download) to download the Ollama client for your system.\n\n#### 2. Run Ollama and Chat with Llava\n\n````\nollama run llama3.2\n````\n\nAfter successful launch, Ollama starts an API service on local port 11434, which can be accessed at `http://localhost:11434`.\n\nFor other models, visit [Ollama Models](https://ollama.com/library) for more details.\n\n#### 3. Install Ollama Plugin\nGo to the Dify marketplace and search the Ollama to download it.\n\n![](./_assets/ollama-01.png)\n\n#### 4. Integrate Ollama in Dify\n\nIn `Settings \u003e Model Providers \u003e Ollama`, fill in:\n\n![](./_assets/ollama-02.png)\n\n- Model Name:`llama3.2`\n- Base URL: `http://\u003cyour-ollama-endpoint-domain\u003e:11434`\n- Enter the base URL where the Ollama service is accessible.\n- If Dify is deployed using Docker, consider using the local network IP address, e.g., `http://192.168.1.100:11434` or `http://host.docker.internal:11434` to access the service.\n- For local source code deployment, use `http://localhost:11434`.\n- Model Type: `Chat`\n- Model Context Length: `4096`\n- The maximum context length of the model. If unsure, use the default value of 4096.\n- Maximum Token Limit: `4096`\n- The maximum number of tokens returned by the model. If there are no specific requirements for the model, this can be consistent with the model context length.\n- Support for Vision: `Yes`\n- Check this option if the model supports image understanding (multimodal), like `llava`.\n\nClick \"Save\" to use the model in the application after verifying that there are no errors.\n\nThe integration method for Embedding models is similar to LLM, just change the model type to Text Embedding.\n\nFor more detail, please check [Dify's official document](https://docs.dify.ai/development/models-integration/ollama).\n","label":{"en_US":"Ollama"},"latest_package_identifier":"langgenius/ollama:0.0.5@cc38c90a58d4b4e43c9a821d352829b2c2a8d6d742de9fec9e61e6b34865b496","latest_version":"0.0.5","model":{"background":"#F9FAFB","configurate_methods":["customizable-model"],"description":{"en_US":"Ollama"},"help":{"title":{"en_US":"How to integrate with Ollama","zh_Hans":"如何集成 Ollama"},"url":{"en_US":"https://docs.dify.ai/tutorials/model-configuration/ollama"}},"icon_large":{"en_US":"icon_l_en.svg"},"icon_small":{"en_US":"icon_s_en.svg"},"label":{"en_US":"Ollama"},"model_credential_schema":{"credential_form_schemas":[{"default":null,"label":{"en_US":"Base URL","zh_Hans":"基础 URL"},"max_length":0,"options":[],"placeholder":{"en_US":"Base url of Ollama server, e.g. 
http://192.168.1.100:11434","zh_Hans":"Ollama server 的基础 URL,例如 http://192.168.1.100:11434"},"required":true,"show_on":[],"type":"text-input","variable":"base_url"},{"default":"chat","label":{"en_US":"Completion mode","zh_Hans":"模型类型"},"max_length":0,"options":[{"label":{"en_US":"Completion","zh_Hans":"补全"},"show_on":[],"value":"completion"},{"label":{"en_US":"Chat","zh_Hans":"对话"},"show_on":[],"value":"chat"}],"placeholder":{"en_US":"Select completion mode","zh_Hans":"选择对话类型"},"required":true,"show_on":[{"value":"llm","variable":"__model_type"}],"type":"select","variable":"mode"},{"default":"4096","label":{"en_US":"Model context size","zh_Hans":"模型上下文长度"},"max_length":0,"options":[],"placeholder":{"en_US":"Enter your Model context size","zh_Hans":"在此输入您的模型上下文长度"},"required":true,"show_on":[],"type":"text-input","variable":"context_size"},{"default":"4096","label":{"en_US":"Upper bound for max tokens","zh_Hans":"最大 token 上限"},"max_length":0,"options":[],"placeholder":null,"required":true,"show_on":[{"value":"llm","variable":"__model_type"}],"type":"text-input","variable":"max_tokens"},{"default":"false","label":{"en_US":"Vision support","zh_Hans":"是否支持 Vision"},"max_length":0,"options":[{"label":{"en_US":"Yes","zh_Hans":"是"},"show_on":[],"value":"true"},{"label":{"en_US":"No","zh_Hans":"否"},"show_on":[],"value":"false"}],"placeholder":null,"required":false,"show_on":[{"value":"llm","variable":"__model_type"}],"type":"radio","variable":"vision_support"},{"default":"false","label":{"en_US":"Function call support","zh_Hans":"是否支持函数调用"},"max_length":0,"options":[{"label":{"en_US":"Y
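
Building on the schema above, the sketch below walks through the result set page by page and prints basic plugin metadata. The field names (data.plugins, data.total, label.en_US, install_count) come from the documented schema and example response; the stopping rule page * page_size >= total is an assumption about how the endpoint paginates, not something this document confirms.

import requests

URL = "https://marketplace.dify.ai/api/v1/plugins/search/advanced"

def search_plugins(page: int, page_size: int = 100) -> dict:
    # Fetch one page of marketplace plugins sorted by install count.
    body = {
        "page": page,
        "page_size": page_size,
        "query": "",
        "sort_by": "install_count",
        "sort_order": "DESC",
        "tags": [],
        "type": "plugin",
    }
    resp = requests.post(URL, json=body, timeout=30)
    resp.raise_for_status()
    return resp.json()["data"]

page, page_size = 1, 100
plugins = []
while True:
    data = search_plugins(page, page_size)
    plugins.extend(data["plugins"])
    if page * page_size >= data["total"]:   # assumed pagination rule
        break
    page += 1

# Each plugin object carries a localized label and an install count.
for plugin in plugins[:5]:
    print(plugin["label"]["en_US"], plugin["install_count"])
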
Last modified: 2025-05-20 06:04:35