git init and first deep agent with LangSmith UI
8  .env  Normal file
@@ -0,0 +1,8 @@
DEEPSEEK_API_KEY=sk-224dcd05dd1e4b6f9eb964969903fe43
DEEPSEEK_BASE_URL=https://api.deepseek.com
TAVILY_API_KEY=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

LANGSMITH_TRACING=true
LANGSMITH_ENDPOINT=https://api.smith.langchain.com
LANGSMITH_API_KEY=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
LANGSMITH_PROJECT="test-project"
10  .gitignore  vendored  Normal file
@@ -0,0 +1,10 @@
# Python-generated files
__pycache__/
*.py[oc]
build/
dist/
wheels/
*.egg-info

# Virtual environments
.venv
1  .python-version  Normal file
@@ -0,0 +1 @@
3.13
264  README.md  Normal file
@@ -0,0 +1,264 @@
# Network Search Agent

A network search agent project that follows the LangGraph Studio project conventions, built with the DeepAgents framework and the Tavily search API.

## 📁 Project Structure

```
NetworkSearchAgent/
├── agent.py                 # ⭐ LangGraph entry point (defines the agent)
├── langgraph.json           # ⭐ LangGraph configuration
├── pyproject.toml           # Dependency management
├── .env.example             # Environment variable template
├── README.md                # Full documentation
└── network_search_agent/    # Agent package
    ├── __init__.py          # Package initialization
    ├── tools.py             # Tools: internet_search
    └── prompts.py           # Prompts: SYSTEM_PROMPT
```

## ✨ Features

### Core features retained
- ✅ **Internet search tool** (`internet_search`) - powered by the Tavily API
- ✅ **DeepSeek model integration** - used as the reasoning engine
- ✅ **System prompt** - defines the agent's behavior
- ✅ **LangGraph Studio compatible** - can be visualized and debugged in Studio

### Features removed
- ❌ Sub-agents
- ❌ Think tool
- ❌ Other advanced features

## 🚀 Quick Start

### 1. Install dependencies

With pip:
```bash
cd NetworkSearchAgent
pip install -e .
```

Or with uv (recommended):
```bash
cd NetworkSearchAgent
uv pip install -e .
```

### 2. Configure environment variables

Copy the environment variable template:
```bash
cp .env.example .env
```

Edit the `.env` file and fill in your API keys:
```env
DEEPSEEK_API_KEY=your_deepseek_api_key_here
DEEPSEEK_BASE_URL=https://api.deepseek.com
TAVILY_API_KEY=your_tavily_api_key_here
```

#### Getting API keys
- **DeepSeek**: visit [https://www.deepseek.com/](https://www.deepseek.com/)
- **Tavily**: visit [https://tavily.com/](https://tavily.com/)
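
After filling in `.env`, it can be worth confirming the keys are actually picked up before starting the server. A minimal sketch, assuming `python-dotenv` is available (it is pulled in by this project's dependencies); the `check_env.py` file name is illustrative only:

```python
# check_env.py - hypothetical helper script, not part of this repository
import os

from dotenv import load_dotenv

# Read variables from the .env file in the current directory
load_dotenv(override=True)

for name in ("DEEPSEEK_API_KEY", "DEEPSEEK_BASE_URL", "TAVILY_API_KEY"):
    status = "set" if os.environ.get(name) else "MISSING"
    print(f"{name}: {status}")
```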

### 3. Run with the LangGraph CLI

#### Start the development server
```bash
langgraph dev
```

This starts the LangGraph Studio server; open the printed URL in your browser to use the visual interface.

#### Test the agent
```bash
langgraph test
```

## 🎯 Using LangGraph Studio

### Start Studio
```bash
cd NetworkSearchAgent
langgraph dev
```

Once the server is running, open the Studio page in your browser, where you can:
- 📊 Visualize the agent's execution flow
- 🔍 Debug the inputs and outputs of every step
- 🧪 Test the agent interactively
- 📝 Inspect tool-call details

### Studio features
- **Live visualization**: watch the agent's reasoning and tool calls
- **Breakpoint debugging**: pause at key nodes and inspect state
- **History**: review previous conversations and runs
- **Performance analysis**: see per-step latency and resource usage

## 📝 Code Walkthrough

### Core files

#### 1. `langgraph.json`
The LangGraph configuration file, which defines the agent entry point:
```json
{
    "dependencies": ["."],
    "graphs": {
        "network_search_agent": "./agent.py:agent"
    },
    "env": ".env"
}
```

#### 2. `agent.py`
The main agent definition:
```python
from langchain.chat_models import init_chat_model
from deepagents import create_deep_agent
from network_search_agent.prompts import SYSTEM_PROMPT
from network_search_agent.tools import internet_search

# Initialize the model
model = init_chat_model(
    model="deepseek:deepseek-chat",
    temperature=0.3
)

# Create the agent (basic tools only, no sub-agents)
agent = create_deep_agent(
    model=model,
    tools=[internet_search],
    system_prompt=SYSTEM_PROMPT,
)
```
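
`create_deep_agent` returns a compiled LangGraph graph, so the agent can also be exercised directly from Python without Studio. A minimal sketch, assuming the standard messages-based state used by LangGraph agents (the `run_agent.py` file name and the sample question are illustrative):

```python
# run_agent.py - hypothetical local smoke test, not part of this repository
from agent import agent  # the compiled graph defined in agent.py

result = agent.invoke(
    {"messages": [{"role": "user", "content": "What is LangGraph Studio?"}]}
)

# The final assistant reply is the last message in the returned state
print(result["messages"][-1].content)
```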

#### 3. `network_search_agent/tools.py`
The tool definition:
```python
def internet_search(
    query: str,
    max_results: int = 5,
    topic: Literal["general", "news", "finance"] = "general",
    include_raw_content: bool = False,
):
    """Internet search tool."""
    result = tavily_client.search(...)
    return result
```
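
Because the real `tools.py` wraps this function with `@tool(parse_docstring=True)`, it can be smoke-tested on its own before wiring it into the agent. A minimal sketch, assuming `TAVILY_API_KEY` is set in `.env` (the query string is illustrative):

```python
# try_search.py - hypothetical tool test, not part of this repository
from network_search_agent.tools import internet_search

# LangChain tools are invoked with a dict of their arguments
results = internet_search.invoke({"query": "LangGraph Studio overview", "max_results": 3})
print(results)
```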

#### 4. `network_search_agent/prompts.py`
The system prompt definition:
```python
SYSTEM_PROMPT = """You are an intelligent assistant...
"""
```

## 🔧 Customization

### Switching models

Edit the model configuration in `agent.py`:

```python
# Use OpenAI
model = init_chat_model(
    model="openai:gpt-4",
    temperature=0.0
)

# Use Anthropic
model = init_chat_model(
    model="anthropic:claude-3-5-sonnet-20241022",
    temperature=0.0
)
```

### Adding a new tool

1. Define the new tool in `network_search_agent/tools.py` (see the decorated sketch after this list):
```python
def my_new_tool(param: str):
    """My new tool description"""
    # Tool implementation
    return result
```

2. Add it to the tool list in `agent.py`:
```python
agent = create_deep_agent(
    model=model,
    tools=[internet_search, my_new_tool],  # add the new tool
    system_prompt=SYSTEM_PROMPT,
)
```

3. Export it from `network_search_agent/__init__.py`:
```python
from network_search_agent.tools import internet_search, my_new_tool

__all__ = ["internet_search", "my_new_tool", ...]
```
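
For consistency with the existing `internet_search`, a new tool would normally carry the same `@tool(parse_docstring=True)` decorator and a documented signature, so the docstring is parsed into the tool schema. A minimal sketch with a hypothetical `word_count` tool (name and behavior are illustrative only):

```python
# Illustrative addition to network_search_agent/tools.py
from langchain_core.tools import tool


@tool(parse_docstring=True)
def word_count(text: str) -> dict:
    """Count the words in a piece of text.

    Args:
        text: The text to analyze.

    Returns:
        A dictionary with the word count.
    """
    return {"words": len(text.split())}
```

After adding it to `tools=[internet_search, word_count]` in `agent.py`, the model can call it like any other tool.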

### Customizing the prompt

Edit `network_search_agent/prompts.py`:
```python
SYSTEM_PROMPT = """Your custom prompt...
"""
```

## 📚 LangGraph Studio Documentation

For details, see:
- [LangGraph Studio documentation](https://docs.langchain.com/oss/python/langgraph/studio)
- [LangGraph CLI guide](https://docs.langchain.com/oss/python/langgraph/cli)

## 🆚 Differences from the Full Version

| Feature | Simplified (this project) | Full version |
|---------|---------------------------|--------------|
| Internet search | ✅ | ✅ |
| Basic conversation | ✅ | ✅ |
| System prompt | ✅ | ✅ |
| LangGraph Studio | ✅ | ✅ |
| Sub-agents | ❌ | ✅ |
| Think tool | ❌ | ✅ |
| Complex workflows | ❌ | ✅ |

## 🐛 FAQ

### Q: How do I view the agent's execution logs?
Start the server with `langgraph dev --verbose` to see detailed logs.

### Q: How do I test in Studio?
1. Run `langgraph dev`
2. Open the URL printed in the terminal
3. Enter a test question in the UI (or drive the local server from code, as sketched below)
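
The `langgraph dev` server also exposes the LangGraph API locally, so the same deployment can be driven from code. A minimal sketch using `langgraph_sdk`, assuming the default local URL `http://127.0.0.1:2024` and the graph name registered in `langgraph.json`; adjust both if your setup differs:

```python
# sdk_test.py - hypothetical client-side test, not part of this repository
from langgraph_sdk import get_sync_client

client = get_sync_client(url="http://127.0.0.1:2024")

# Stream a stateless run against the graph registered as "network_search_agent"
for chunk in client.runs.stream(
    None,  # no thread id: stateless run
    "network_search_agent",
    input={"messages": [{"role": "user", "content": "Latest news about LangGraph"}]},
    stream_mode="updates",
):
    print(chunk.event, chunk.data)
```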

### Q: Dependency installation fails?
Make sure your Python version is >= 3.13 (see `.python-version` and `pyproject.toml`), and use uv or a virtual environment:
```bash
python -m venv venv
source venv/bin/activate  # Linux/Mac
pip install -e .
```

### Q: How do I deploy to production?
See the [LangGraph Cloud deployment docs](https://docs.langchain.com/oss/python/langgraph/cloud).

## 📄 License

MIT License

## 🔄 Changelog

### v0.1.0 (2025-12-02)
- ✅ Initial version
- ✅ Project structure follows the LangGraph Studio conventions
- ✅ Basic agent functionality implemented
- ✅ Internet search tool integrated
- ✅ Sub-agents and think tool removed
24  agent.py  Normal file
@@ -0,0 +1,24 @@
"""Network Search Agent - Standalone script for LangGraph deployment.

This module creates a network search agent with internet search capabilities.
Uses DeepAgents framework with Tavily search integration.
"""

from langchain.chat_models import init_chat_model
from deepagents import create_deep_agent

from network_search_agent.prompts import SYSTEM_PROMPT
from network_search_agent.tools import internet_search

# Initialize the model (DeepSeek)
model = init_chat_model(
    model="deepseek:deepseek-chat",
    temperature=0.3
)

# Create the agent
agent = create_deep_agent(
    model=model,
    tools=[internet_search],
    system_prompt=SYSTEM_PROMPT,
)
7  langgraph.json  Normal file
@@ -0,0 +1,7 @@
{
    "dependencies": ["."],
    "graphs": {
        "network_search_agent": "./agent.py:agent"
    },
    "env": ".env"
}
13  network_search_agent/__init__.py  Normal file
@@ -0,0 +1,13 @@
"""Network Search Agent.

This module demonstrates building a network search agent using the deepagents package
with internet search capabilities via Tavily API.
"""

from network_search_agent.prompts import SYSTEM_PROMPT
from network_search_agent.tools import internet_search

__all__ = [
    "internet_search",
    "SYSTEM_PROMPT",
]
101  network_search_agent/prompts.py  Normal file
@@ -0,0 +1,101 @@
"""Prompts for the simple agent.

This module contains the system prompt that defines the agent's behavior.
"""

SYSTEM_PROMPT = """# Research Workflow

For every research-style request, follow this process strictly:

1. **Plan**
   - Use write_todos to break the research question into a reasonable set of small tasks
   - Stay focused and concise; do not over-split

2. **Save the research request**
   - Use write_file() to write the user's question to `/research_request.md`
   - This is the reference for the final verification step

3. **Research**
   - Use internet_search to retrieve information
   - After every search, reflect on:
     - What was found?
     - What is still missing?
     - Is there already enough to answer?
   - Keep tool calls few and targeted
   - No redundant searches, and no sub-agents are needed

4. **Synthesize**
   - Summarize, organize, and distill all retrieved results
   - Consolidate cited sources, giving each URL a unique number
   - Answer every aspect of the user's question

5. **Write the final report**
   - Write the final report to `/final_report.md`
   - Follow the "Report Writing Guidelines" below strictly

6. **Verify**
   - Read `/research_request.md`
   - Make sure every question has been answered, the structure is clear, and the citations are complete

---

## Research Planning Guide (simplified)

- One research task = one agent run; no sub-agents are spawned
- Simple questions: 1-2 searches only
- Complex questions: at most 5 searches
- Do not split tasks mechanically; keep a natural logical flow

---

# Report Writing Guidelines

## 1. Common structure templates

### **1. Comparison reports**
1. Introduction
2. Overview of topic A
3. Overview of topic B
4. Comparative analysis
5. Conclusion

### **2. List / ranking reports**
1. Item 1 + description
2. Item 2 + description
3. Item 3 + description
(No introduction needed)

### **3. Overview / summary reports**
1. Overall overview of the topic
2. Key concept 1
3. Key concept 2
4. Key concept 3
5. Conclusion

---

## 2. Writing rules

- Reports must be written in paragraphs and be thorough and comprehensive
- Do not use meta-language such as "I found..." or "I think..."
- Avoid empty, content-free statements
- Organize content with section headings (##, ###)
- Bullet points are allowed, but do not overuse them
- Keep the language professional, objective, and formal

---

## 3. Citation format

- Use inline citations of the form **[1], [2], [3]** throughout
- Each unique URL gets one number
- At the end of the report, add:

### Sources
[1] Source title: URL
[2] Source title: URL
...

- Numbers must be consecutive, with no gaps

"""
51  network_search_agent/tools.py  Normal file
@@ -0,0 +1,51 @@
"""Tools for the simple agent.

This module provides the internet search tool using Tavily API.
"""

import os
from typing import Literal

from dotenv import load_dotenv
from langchain_core.tools import tool
from tavily import TavilyClient

# Load environment variables
load_dotenv(override=True)

# Initialize Tavily client
tavily_client = TavilyClient(api_key=os.environ.get("TAVILY_API_KEY"))


@tool(parse_docstring=True)
def internet_search(
    query: str,
    max_results: int = 5,
    topic: Literal["general", "news", "finance"] = "general",
    include_raw_content: bool = False,
):
    """Run internet search using Tavily API.

    This is a tool function for web search that wraps Tavily's search functionality.

    Args:
        query: Search query string, e.g. "Python async programming tutorial"
        max_results: Maximum number of results to return, default is 5
        topic: Search topic type, options are "general", "news", or "finance"
        include_raw_content: Whether to include raw webpage content, default is False

    Returns:
        Search results dictionary containing titles, URLs, summaries, etc.
    """
    try:
        result = tavily_client.search(
            query,
            max_results=max_results,
            include_raw_content=include_raw_content,
            topic=topic,
        )
        return result
    except Exception as e:
        # Exception handling: return error message instead of raising exception
        # This allows the LLM to understand the error and try other strategies
        return {"error": f"Search failed: {str(e)}"}
20  pyproject.toml  Normal file
@@ -0,0 +1,20 @@
[project]
name = "play-deepagents"
version = "0.1.0"
description = "Add your description here"
readme = "README.md"
requires-python = ">=3.13"
dependencies = [
    "deepagents>=0.3.5",
    "dotenv>=0.9.9",
    "langchain-deepseek>=1.0.1",
    "openai>=2.15.0",
    "rich>=14.2.0",
    "tavily-python>=0.7.17",
]

[dependency-groups]
dev = [
    "ipykernel>=7.1.0",
    "langgraph-cli[inmem]>=0.4.11",
]
467  test.ipynb  Normal file
File diff suppressed because one or more lines are too long