LLM
golang/eino: Eino framework basics, getting started with Message and ChatModel
An introductory tutorial for ByteDance's Eino, a Go framework for building AI applications, covering the Message structure, RoleType role types, the ChatModel interface, and the Generate/Stream invocation methods.
Eino
Message
- The basic unit of a conversation
- Definition

```go
type Message struct {
	Role RoleType `json:"role"`
	// Content is the actual text produced by the user or the model.
	Content string `json:"content"`
	// MultiContent carries multimodal content from the user or the model.
	// If MultiContent is not empty, it is used instead of Content.
	// Deprecated: Use UserInputMultiContent for user multimodal inputs and
	// AssistantGenMultiContent for model multimodal outputs.
	MultiContent []ChatMessagePart `json:"multi_content,omitempty"`
	// UserInputMultiContent passes multimodal content provided by the user to the model.
	UserInputMultiContent []MessageInputPart `json:"user_input_multi_content,omitempty"`
	// AssistantGenMultiContent is for receiving multimodal output from the model.
	AssistantGenMultiContent []MessageOutputPart `json:"assistant_output_multi_content,omitempty"`
	Name string `json:"name,omitempty"`
	// ToolCalls is the list of tool calls; only set on model replies.
	ToolCalls []ToolCall `json:"tool_calls,omitempty"`
	// ToolCallID is the tool call ID; only set on tool messages.
	ToolCallID string `json:"tool_call_id,omitempty"`
	// ToolName is the tool name; only set on tool messages.
	ToolName string `json:"tool_name,omitempty"`
	ResponseMeta *ResponseMeta `json:"response_meta,omitempty"`
	// ReasoningContent is the thinking process of the model, which will be
	// included when the model returns reasoning content.
	ReasoningContent string `json:"reasoning_content,omitempty"`
	// Extra holds customized information for model implementations.
	Extra map[string]any `json:"extra,omitempty"`
}
```

- RoleType
- Assistant = "assistant": model replies
- User = "user": user input
- System = "system": system prompt
- Tool = "tool": tool call results
ChatModel
- Chat model
- Definition

```go
type BaseChatModel interface {
	Generate(ctx context.Context, input []*schema.Message, opts ...Option) (*schema.Message, error)
	Stream(ctx context.Context, input []*schema.Message, opts ...Option) (
		*schema.StreamReader[*schema.Message], error)
}
```

BaseChatModel defines the basic interface of a chat model. Generate produces the complete reply in a single response; Stream produces the reply as a stream, i.e. the model returns content incrementally (typically token by token) instead of waiting until generation finishes.
- Eino provides implementations for multiple chat models, such as OpenAI and Claude; import and instantiate the corresponding implementation when using one.
```go
import (
	"github.com/cloudwego/eino-ext/components/model/openai"
)

chatModel, err := openai.NewChatModel(ctx, &openai.ChatModelConfig{
	Model:   "qwen/qwen3.6-plus:free",
	BaseURL: "https://openrouter.ai/api/v1",
	APIKey:  "",
})
```
As shown above, chatModel is an instance of the OpenAI chat model; BaseURL points it at any OpenAI-compatible API endpoint, such as OpenRouter here. Generate usage example:

```go
ret, err := chatModel.Generate(ctx, messages)
if err != nil {
	_, _ = fmt.Fprintln(os.Stderr, err)
	os.Exit(1)
}
fmt.Println(ret.String())
```
Output example:

```
assistant: Hello! How can I assist you today?
reasoning content: "Here's a thinking process:\n\n1. **Analyze User Input:** The user said \"hello\". This is a simple greeting.\n2. **Identify Intent:** The user is initiating a conversation.\n3. **Determine Response Strategy:** \n - Acknowledge the greeting warmly.\n - Offer assistance.\n - Keep it concise and friendly.\n4. **Draft Response:** \"Hello! How can I assist you today?\" or \"Hi there! What can I help you with?\"\n5. **Refine Response:** Both are good. I'll go with a friendly, open-ended response that invites the user to ask a question or state their need.\n6. **Final Output Generation:** \"Hello! How can I assist you today?\" (matches the refined response)✅"
finish_reason: stop
usage: &{22 {0} 184 206 {170}}
```

Stream usage example:

```go
stream, err := cm.Stream(ctx, messages)
if err != nil {
	_, _ = fmt.Fprintln(os.Stderr, err)
	os.Exit(1)
}
defer stream.Close()

for {
	frame, err := stream.Recv()
	if errors.Is(err, io.EOF) {
		break
	}
	if err != nil {
		_, _ = fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	if frame != nil {
		_, _ = fmt.Fprint(os.Stdout, frame.Content)
	}
}
_, _ = fmt.Fprintln(os.Stdout)
```
Output example:

```
[assistant] Hello! How can I assist you today?
```
Complete interactive example
```go
package main

import (
	"bufio"
	"context"
	"errors"
	"flag"
	"fmt"
	"io"
	"os"
	"strings"

	"github.com/cloudwego/eino-ext/components/model/openai"
	"github.com/cloudwego/eino/components/model"
	"github.com/cloudwego/eino/schema"
)

func main() {
	var instruction string
	flag.StringVar(&instruction, "instruction", "You are a helpful assistant.", "")
	flag.Parse()

	ctx := context.Background()
	cm, err := newChatModel(ctx)
	if err != nil {
		_, _ = fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}

	fmt.Println("Chat session started. Type 'exit' or 'quit' to end.")
	scanner := bufio.NewScanner(os.Stdin)

	// The full conversation history is kept in messages and resent every turn,
	// which is how the model "remembers" earlier exchanges.
	messages := []*schema.Message{
		schema.SystemMessage(instruction),
	}

	for {
		fmt.Print("\n[User] ")
		if !scanner.Scan() {
			break
		}
		query := strings.TrimSpace(scanner.Text())
		if query == "" {
			continue
		}
		if query == "exit" || query == "quit" {
			break
		}

		messages = append(messages, schema.UserMessage(query))

		fmt.Print("[Assistant] ")
		stream, err := cm.Stream(ctx, messages)
		if err != nil {
			fmt.Fprintf(os.Stderr, "\nError: %v\n", err)
			continue
		}

		// Print each frame as it arrives and accumulate the full reply.
		var response string
		for {
			frame, err := stream.Recv()
			if errors.Is(err, io.EOF) {
				break
			}
			if err != nil {
				fmt.Fprintf(os.Stderr, "\nStream error: %v\n", err)
				break
			}
			if frame != nil {
				fmt.Print(frame.Content)
				response += frame.Content
			}
		}
		stream.Close()

		// Append the assistant reply so the model sees it on the next turn.
		messages = append(messages, schema.AssistantMessage(response, nil))
		fmt.Println()
	}
}

func newChatModel(ctx context.Context) (model.ToolCallingChatModel, error) {
	return openai.NewChatModel(ctx, &openai.ChatModelConfig{
		APIKey:  "",
		Model:   "qwen/qwen3.6-plus:free",
		BaseURL: "https://openrouter.ai/api/v1",
	})
}
```
Output example:

```
Chat session started. Type 'exit' or 'quit' to end.

[User] hello
[Assistant] Hello! How can I help you today?

[User] i am spider man, remember this!
[Assistant] Got it, Spider-Man! 🕷️ I'll keep that in mind for the rest of our conversation. Just a quick heads-up: I don't have persistent memory between chats, so if you start a new session later, I might need a quick reminder. But for now, your web-slinger status is officially noted!
What's on your mind today? Stopping villains, working through some Parker-level science homework, or just swinging by to chat?

[User] who am i
[Assistant] You're Spider-Man! 🕷️ You told me just a moment ago, and I've got it noted for this chat.
What's up, web-head? Need help tracking down a villain, troubleshooting some web-fluid chemistry, or just working through the daily superhero life balance? 😄

[User]
```