Want to build a chatbot that can handle multiple users and remember conversations, all without breaking the bank? In this tutorial, we'll build a multi-user chatbot using LangChain Go, a Go framework for building applications powered by large language models (LLMs), with SQLite for persistent memory and Groq's free API for LLM responses.
Why Groq?
Groq (https://groq.com/) offers a generous free tier for its OpenAI-compatible API, which means you can experiment with and build LLM-powered applications without incurring any costs. Because the API is a drop-in replacement for OpenAI's, it is easy to integrate with LangChain Go.
Setting the Stage:
Before we jump into the code, ensure you have the following prerequisites:
- Go installed: If not, download and install it from golang.org.
- Groq API Key: Obtain an API key from Groq.
- Basic understanding of Go programming: Familiarity with Go syntax and concepts is essential.
Project Setup:
- Create a new Go module:
mkdir chatbot
cd chatbot
go mod init chatbot
- Install the necessary dependencies:
go get github.com/tmc/langchaingo
go get github.com/tmc/langchaingo/llms/openai
go get github.com/tmc/langchaingo/memory
go get github.com/tmc/langchaingo/memory/sqlite3
go get github.com/mattn/go-sqlite3
This sets up your Go project with the required LangChain and SQLite packages.
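Note that github.com/mattn/go-sqlite3 is a cgo package, so you will also need a C compiler (such as gcc) available for it to build. The chatbot reads its Groq API key from the GROQ_API_KEY environment variable, so export it in your shell before running the program. On Linux or macOS that might look like this (the value below is just a placeholder):
export GROQ_API_KEY="your-groq-api-key"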
Code Breakdown:
Now, let’s break down the code into logical segments and explain each part in detail.
1. Package and Imports:
package main

import (
    "bufio"
    "context"
    "database/sql"
    "fmt"
    "os"
    "strings"

    "github.com/tmc/langchaingo/chains"
    "github.com/tmc/langchaingo/llms/openai"
    "github.com/tmc/langchaingo/memory"
    "github.com/tmc/langchaingo/memory/sqlite3"

    _ "github.com/mattn/go-sqlite3"
)
- package main: Declares this file as the entry point of our Go application.
- import: Brings in the necessary packages:
- bufio: For buffered I/O, allowing us to read user input from the console.
- context: For managing context across function calls.
- database/sql: For interacting with SQL databases.
- fmt: For formatted I/O.
- os: For interacting with the operating system (e.g., environment variables).
- strings: For string manipulation.
- github.com/tmc/langchaingo/chains: For creating chains of LLM operations.
- github.com/tmc/langchaingo/llms/openai: For using Groq's OpenAI-compatible API.
- github.com/tmc/langchaingo/memory: For managing chatbot memory.
- github.com/tmc/langchaingo/memory/sqlite3: For using SQLite as persistent memory.
- _ "github.com/mattn/go-sqlite3": Imports the SQLite driver.
2. The main Function:
func main() {
    if err := runChatbot(); err != nil {
        fmt.Fprintln(os.Stderr, err)
        os.Exit(1)
    }
}
- The main function is the entry point of our application.
- It calls the runChatbot function and, if an error is returned, prints it to stderr and exits with a non-zero status.
3. The runChatbot Function:
func runChatbot() error {
    apiKey := os.Getenv("GROQ_API_KEY")

    llm, err := openai.New(
        openai.WithModel("llama3-8b-8192"),
        openai.WithBaseURL("https://api.groq.com/openai/v1"),
        openai.WithToken(apiKey),
    )
    if err != nil {
        return err
    }

    db, err := sql.Open("sqlite3", "chatbot_history.db")
    if err != nil {
        return err
    }
    defer db.Close()

    reader := bufio.NewReader(os.Stdin)
    ctx := context.Background()

    fmt.Println("Chatbot started. Type 'exit' to quit.")

    // ... (rest of the runChatbot function)
}
- This function encapsulates the core logic of our chatbot.
- It retrieves the Groq API key from the GROQ_API_KEY environment variable (an optional validation step is sketched after this list).
- It creates a new LangChain LLM client configured to use Groq's API and the llama3-8b-8192 model.
- It opens an SQLite database connection to chatbot_history.db.
- defer db.Close(): Ensures the database connection is closed when the function exits.
- It creates a bufio.Reader to read input from the console.
- It creates a background context.
- It prints a welcome message.
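The code assumes GROQ_API_KEY is set. As a small optional addition (my own sketch, not part of the original listing), you could validate the key up front and fail with a clear error instead of a confusing API error later:
apiKey := os.Getenv("GROQ_API_KEY")
if apiKey == "" {
    // Fail early with a descriptive message rather than letting the API call fail later.
    return fmt.Errorf("GROQ_API_KEY environment variable is not set")
}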
4. User Input and Session Management:
for {
    fmt.Print("Enter User ID (or 'exit'): ")
    userID, _ := reader.ReadString('\n')
    userID = strings.TrimSpace(userID)
    if strings.ToLower(userID) == "exit" {
        break
    }

    sessionID := generateSessionID(userID)

    // ... (rest of the loop)
}
- The outer for loop handles multiple users.
- It prompts the user to enter a User ID.
- It reads the User ID from the console and removes any leading or trailing whitespace (an optional empty-ID check is sketched after this list).
- If the user enters "exit", the loop breaks.
- sessionID := generateSessionID(userID): Generates a session ID based on the User ID.
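One small optional refinement (my own addition, not in the original code): skip blank User IDs so an accidental empty line does not create a session keyed on an empty string. This would go right after the TrimSpace call:
if userID == "" {
    // Ignore empty input and prompt for the User ID again.
    fmt.Println("User ID cannot be empty, please try again.")
    continue
}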
5. Chat History and LLM Chain:
chatHistory := sqlite3.NewSqliteChatMessageHistory(
    sqlite3.WithSession(sessionID),
    sqlite3.WithDB(db),
)
conversationBuffer := memory.NewConversationBuffer(memory.WithChatHistory(chatHistory))
llmChain := chains.NewConversation(llm, conversationBuffer)

fmt.Printf("Chatting with user %s. Type 'exit' to change user.\n", userID)

for {
    fmt.Print("> ")
    input, _ := reader.ReadString('\n')
    input = strings.TrimSpace(input)
    if strings.ToLower(input) == "exit" {
        break
    }

    out, err := chains.Run(ctx, llmChain, input)
    if err != nil {
        fmt.Println("Error:", err)
        continue
    }
    fmt.Println(out)
}
- It creates a SqliteChatMessageHistory object, using the generated sessionID to keep each user's conversation separate.
- It creates a ConversationBuffer memory object to store the chat history.
- It creates a Conversation chain, combining the LLM and memory.
- It prints a message indicating which user is being chatted with.
- The inner for loop handles the conversation with the current user.
- It reads user input, runs the LLM chain, and prints the response (a per-request timeout sketch follows this list).
- If the user enters "exit", the inner loop breaks so a new User ID can be selected.
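Because chains.Run takes a context, one optional hardening step (my own addition, not in the original code) is to give each request its own timeout so a slow or unreachable API cannot hang the inner loop indefinitely. This version also requires adding "time" to the imports:
// Give each LLM call up to 30 seconds before cancelling it.
reqCtx, cancel := context.WithTimeout(ctx, 30*time.Second)
out, err := chains.Run(reqCtx, llmChain, input)
cancel()
if err != nil {
    fmt.Println("Error:", err)
    continue
}
fmt.Println(out)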
6. The generateSessionID Function:
func generateSessionID(userID string) string {
    return userID
}
- This function generates a session ID based on the User ID. In this simple implementation, it just returns the User ID unchanged; a slightly more elaborate variant is sketched below.
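If you would prefer each user to start a fresh conversation every day while still persisting history, one possible variant (a sketch of my own, not from the original) folds the current date into the session ID. It needs the "time" package:
// generateSessionID scopes each user's history to the current day,
// e.g. "alice-2024-05-30", so conversations roll over daily.
func generateSessionID(userID string) string {
    return fmt.Sprintf("%s-%s", userID, time.Now().Format("2006-01-02"))
}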
Complete Code:
package main

import (
    "bufio"
    "context"
    "database/sql"
    "fmt"
    "os"
    "strings"

    "github.com/tmc/langchaingo/chains"
    "github.com/tmc/langchaingo/llms/openai"
    "github.com/tmc/langchaingo/memory"
    "github.com/tmc/langchaingo/memory/sqlite3"

    _ "github.com/mattn/go-sqlite3"
)

func main() {
    if err := runChatbot(); err != nil {
        fmt.Fprintln(os.Stderr, err)
        os.Exit(1)
    }
}

func runChatbot() error {
    apiKey := os.Getenv("GROQ_API_KEY")

    // Create an LLM client pointed at Groq's OpenAI-compatible endpoint.
    llm, err := openai.New(
        openai.WithModel("llama3-8b-8192"),
        openai.WithBaseURL("https://api.groq.com/openai/v1"),
        openai.WithToken(apiKey),
    )
    if err != nil {
        return err
    }

    // Open (or create) the SQLite database that stores chat history.
    db, err := sql.Open("sqlite3", "chatbot_history.db")
    if err != nil {
        return err
    }
    defer db.Close()

    reader := bufio.NewReader(os.Stdin)
    ctx := context.Background()

    fmt.Println("Chatbot started. Type 'exit' to quit.")

    // Outer loop: one iteration per user session.
    for {
        fmt.Print("Enter User ID (or 'exit'): ")
        userID, _ := reader.ReadString('\n')
        userID = strings.TrimSpace(userID)
        if strings.ToLower(userID) == "exit" {
            break
        }

        sessionID := generateSessionID(userID)

        // Persist this user's messages under their own session ID.
        chatHistory := sqlite3.NewSqliteChatMessageHistory(
            sqlite3.WithSession(sessionID),
            sqlite3.WithDB(db),
        )
        conversationBuffer := memory.NewConversationBuffer(memory.WithChatHistory(chatHistory))
        llmChain := chains.NewConversation(llm, conversationBuffer)

        fmt.Printf("Chatting with user %s. Type 'exit' to change user.\n", userID)

        // Inner loop: the conversation with the current user.
        for {
            fmt.Print("> ")
            input, _ := reader.ReadString('\n')
            input = strings.TrimSpace(input)
            if strings.ToLower(input) == "exit" {
                break
            }

            out, err := chains.Run(ctx, llmChain, input)
            if err != nil {
                fmt.Println("Error:", err)
                continue
            }
            fmt.Println(out)
        }
    }

    fmt.Println("Chatbot stopped.")
    return nil
}

// generateSessionID returns the session key used to partition chat history.
func generateSessionID(userID string) string {
    return userID
}
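To try it out, export GROQ_API_KEY and run the program from the project directory. Based on the prompts the code prints, a session looks roughly like the sketch below; the user inputs are illustrative and the model's reply (shown as a placeholder) will vary:
go run .
Chatbot started. Type 'exit' to quit.
Enter User ID (or 'exit'): alice
Chatting with user alice. Type 'exit' to change user.
> Hi, my name is Alice.
...model response...
> exit
Enter User ID (or 'exit'): exit
Chatbot stopped.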
That’s it! You’ve now built a multi-user chatbot using LangChain Go, Groq, and SQLite. This is just the beginning; there are many ways to expand and enhance this chatbot. Feel free to experiment with different prompts, memory management techniques, and Groq’s API capabilities.
If you have any questions or want to discuss further, connect with me on LinkedIn or my blog. Happy coding!