AutoGen-GetStart

AutoGen is a multi-agent framework focused on having multiple agents cooperate to solve tasks. The latest version of AutoGen is still under development, and its code and documentation are relatively messy.

The example below involves two concepts: Agent and Team. An Agent is meant to take action when it receives a message, so tools or functions can be bound to it; a Team defines how Agents interact with one another, and RoundRobinGroupChat in particular calls the Agents in a fixed round-robin order. A minimal two-agent sketch of this round-robin pattern follows; the actual example then binds a weather-query tool to a single agent.
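The sketch below is not part of the original walkthrough; it only illustrates how two Agents take turns inside a RoundRobinGroupChat. The agent names, system messages, model, and API key are placeholders, and the exact AssistantAgent parameters may differ between the 0.4 dev releases.

from autogen_agentchat.agents import AssistantAgent
from autogen_agentchat.conditions import MaxMessageTermination
from autogen_agentchat.teams import RoundRobinGroupChat
from autogen_agentchat.ui import Console
from autogen_ext.models.openai import OpenAIChatCompletionClient

async def two_agent_demo() -> None:
    # Hypothetical client; any OpenAI-compatible endpoint works here
    model_client = OpenAIChatCompletionClient(model="gpt-4o", api_key="YOUR_API_KEY")
    # Two agents that the round-robin team will call alternately
    writer = AssistantAgent(
        name='writer',
        model_client=model_client,
        system_message='Answer the task in one short paragraph.',
    )
    reviewer = AssistantAgent(
        name='reviewer',
        model_client=model_client,
        system_message='Point out one concrete improvement to the previous answer.',
    )
    # writer -> reviewer -> writer -> ... until 4 messages have been produced
    team = RoundRobinGroupChat(
        [writer, reviewer],
        termination_condition=MaxMessageTermination(4),
    )
    stream = team.run_stream(task="Introduce AutoGen in one sentence.")
    await Console(stream=stream)

# await two_agent_demo()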

import nest_asyncio
nest_asyncio.apply()
import requests
from api_key import GAODE_WEATHER_API_KEY

async def get_weather(city_name: str) -> str:
    '''Query the weather of the given city through the AMap (Gaode) weather API.'''
    # Build the request URL
    url = f'https://restapi.amap.com/v3/weather/weatherInfo?key={GAODE_WEATHER_API_KEY}&city={city_name}&extensions=all'
    try:
        # Send the GET request
        response = requests.get(url)
        # Check whether the HTTP request succeeded
        if response.status_code == 200:
            # Parse the response body
            data = response.json()
            # Check the API response status
            if data.get('status') == '1' and data.get('infocode') == '10000':
                today_weather = data['forecasts'][0]['casts'][0]
                return f"今天:{today_weather['date']}\n{city_name}天气:{today_weather['dayweather']} 白天气温{today_weather['daytemp']}摄氏度 晚上气温{today_weather['nighttemp']}摄氏度"
            else:
                return ''  # Empty result: the API returned an error code
        else:
            return ''  # Empty result: HTTP status code is not 200
    except Exception as e:
        print(f"An error occurred: {e}")
        return ''  # Empty result: an exception occurred while sending the request
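Before binding the tool to an agent, it is worth calling it once directly to confirm that the AMap key and the response parsing work. A quick check, assuming a valid GAODE_WEATHER_API_KEY:

import asyncio

# Direct call to the async tool function; nest_asyncio lets asyncio.run work inside a notebook
print(asyncio.run(get_weather('广州')))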

Since AutoGen does not officially support calling ollama models yet, ollama is currently bridged to AutoGen through litellm: with the model already pulled into ollama, run litellm --model ollama/qwen2.5:latest, and the ollama model then becomes available through the OpenAI-compatible endpoint at http://localhost:4000.
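As a quick sanity check that is not part of the original walkthrough, the proxy can first be queried with the plain openai client; the host, port, and model name below are assumptions matching the litellm command above:

from openai import OpenAI

# Point the standard OpenAI client at the local litellm proxy
client = OpenAI(base_url="http://localhost:4000", api_key="NotRequiredSinceWeAreLocal")
reply = client.chat.completions.create(
    model="ollama/qwen2.5:latest",  # model name as registered with the litellm proxy
    messages=[{"role": "user", "content": "Hello"}],
)
print(reply.choices[0].message.content)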

from autogen_agentchat.agents import AssistantAgent
from autogen_agentchat.conditions import MaxMessageTermination
from autogen_agentchat.teams import RoundRobinGroupChat
from autogen_agentchat.ui import Console
from autogen_ext.models.openai import OpenAIChatCompletionClient

async def main() -> None:
    # Agent that answers weather questions by calling the get_weather tool
    weather_agent = AssistantAgent(
        name='weather_agent',
        model_client=OpenAIChatCompletionClient(
            model="gpt-4o",                        # model name declared to the client
            api_key="NotRequiredSinceWeAreLocal",  # the local litellm proxy ignores the key
            base_url="http://192.168.3.155:4000",  # litellm proxy in front of ollama
        ),
        tools=[get_weather]
    )
    # Stop the conversation once 3 messages have been produced
    termination = MaxMessageTermination(3)
    # A team with a single agent, called in round-robin order
    agent_team = RoundRobinGroupChat(
        [weather_agent],
        termination_condition=termination
    )
    # Run the task and stream the conversation to the console
    stream = agent_team.run_stream(task="广州天气怎么样?")
    await Console(stream=stream)

await main()
---------- user ----------
广州天气怎么样?
C:\Users\wushaogui\miniconda3\lib\site-packages\autogen_agentchat\agents\_assistant_agent.py:307: UserWarning: Resolved model mismatch: gpt-4o-2024-08-06 != ollama/qwen2.5:latest. Model mapping may be incorrect.
result = await self._model_client.create(
---------- weather_agent ----------
[FunctionCall(id='call_121204b2-2f4d-4453-a896-5527b8673774', arguments='{"city_name": "\\u5e7f\\u5dde"}', name='get_weather')]
[Prompt tokens: 177, Completion tokens: 18]
---------- weather_agent ----------
[FunctionExecutionResult(content='今天:2024-12-12\n广州天气:多云 白天气温20摄氏度 晚上气温12摄氏度', call_id='call_121204b2-2f4d-4453-a896-5527b8673774')]
---------- weather_agent ----------
今天:2024-12-12
广州天气:多云 白天气温20摄氏度 晚上气温12摄氏度
---------- Summary ----------
Number of messages: 4
Finish reason: Maximum number of messages 3 reached, current message count: 4
Total prompt tokens: 177
Total completion tokens: 18
Duration: 0.87 seconds
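The UserWarning in the output ("Resolved model mismatch: gpt-4o-2024-08-06 != ollama/qwen2.5:latest") is raised because the client is declared as gpt-4o while the litellm proxy actually routes the request to the local qwen2.5 model; for this demo it can be ignored, since the tool call and the final answer still come back correctly, as shown above.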