# AgentCoord: Visually Exploring Coordination Strategy for LLM-based Multi-Agent Collaboration
## System Usage

### Installation

#### Install with Docker (Recommended)
If you have Docker and Docker Compose installed on your machine, we recommend running AgentCoord in Docker:
Step 1: Clone the project:

```bash
git clone https://github.com/AgentCoord/AgentCoord.git
cd AgentCoord
```
Step 2: Configure the LLM (see LLM configuration (use docker)).
Step 3: Start the servers:

```bash
docker-compose up
```
Step 4: Open http://localhost:8080/ to use the system.
#### Install on your machine
If you want to install and run AgentCoord on your machine without Docker:
Step 1: Clone the project:

```bash
git clone https://github.com/AgentCoord/AgentCoord.git
cd AgentCoord
```
Step 2: Configure the LLM (see LLM configuration (install on your machine)).
Step 3: Install the required packages, then run the backend and frontend servers separately (see the readme.md files for the frontend and backend).
Step 4: Open http://localhost:8080/ to use the system.
### Configuration

#### LLM configuration (use docker)
You can set the configuration (i.e., API base, API key, model name) for the default LLM in ./docker-compose.yml. Currently, we only support OpenAI's LLMs as the default model. We recommend gpt-4-turbo-preview (WARNING: executing multiple agents may consume a significant number of tokens). To balance response quality and efficiency, you can switch to a fast mode that uses the Mixtral 8x7B model (served with hardware acceleration by Groq) for the initial round of strategy generation. To enable it, set the FAST_DESIGN_MODE field in the YAML file to True and fill the GROQ_API_KEY field with your Groq API key.
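As a rough sketch, the LLM-related settings in ./docker-compose.yml might look like the following. Only FAST_DESIGN_MODE and GROQ_API_KEY are named in this README; the service name and the remaining keys are illustrative placeholders, so check them against the shipped file:

```yaml
# Hypothetical sketch of the LLM settings in ./docker-compose.yml.
# Field names other than FAST_DESIGN_MODE and GROQ_API_KEY are assumptions.
services:
  backend:
    environment:
      OPENAI_API_BASE: "https://api.openai.com/v1"  # API base (assumed key name)
      OPENAI_API_KEY: "<your-openai-key>"           # API key (assumed key name)
      MODEL_NAME: "gpt-4-turbo-preview"             # recommended default model
      FAST_DESIGN_MODE: "True"                      # enable the Groq-accelerated fast mode
      GROQ_API_KEY: "<your-groq-key>"               # required when FAST_DESIGN_MODE is True
```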
#### LLM configuration (install on your machine)
You can set the configuration in ./backend/config/config.yaml. See LLM configuration (use docker) for configuration explanations.
#### Agent configuration
Currently, we support configuring agents via role prompting. You can customize your agents by editing the role prompts in AgentRepo\agentBoard_v1.json. We plan to support more customization methods (e.g., RAG, or a unified wrapper for customized agents) in the future.
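For illustration only, a role-prompt entry in AgentRepo\agentBoard_v1.json might look like the sketch below; the actual field names and structure of the shipped file may differ:

```json
{
  "agents": [
    {
      "name": "DataAnalyst",
      "rolePrompt": "You are a meticulous data analyst. Break each task into verifiable steps and report findings with supporting evidence."
    }
  ]
}
```

Editing the prompt text changes the agent's behavior without touching any code.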
## More Papers & Projects for LLM-based Multi-Agent Collaboration
If you're interested in LLM-based multi-agent collaboration and want more papers and projects for reference, you may check out the corpus we collected. Contributions to the corpus are also welcome.
