
Building smart chatbots from scratch: Rasa and ChatGPT API practical tutorial


Introduction: the opportunities of the AI dialogue system era

In the wave of digital transformation, chatbots have become the key link between users and services. Whether it is 7×24 instant response in customer service systems or voice interaction in smart homes, chatbots are reshaping human-computer interaction. This article walks you step by step through building an interactive web chatbot with the Rasa framework and the ChatGPT API, covering the entire process from environment setup and model training to API calls, deployment, and operations.

1. Technology selection: The core advantages of Rasa and ChatGPT

Rasa framework: the Swiss Army knife of open-source dialogue systems

  • Modular architecture: NLU (natural language understanding) + Core (dialogue management) + X (visualization tooling);
  • Data controllability: supports fully local training, so sensitive data never needs to be uploaded to the cloud;
  • Customization flexibility: define conversation flows in YAML files and implement business logic in Python code;
  • Typical scenarios: enterprise applications that need complex multi-turn dialogue and domain knowledge base integration.

ChatGPT API: The ultimate weapon for generative AI

  • Large-model capability: built on the gpt-3.5-turbo engine, strong at open-ended dialogue and creative generation;
  • Rapid iteration: the latest model capabilities are available through API calls, with no local training required;
  • Cost-effective: pay-as-you-go pricing ($0.002 per 1,000 tokens) suits workloads with large traffic fluctuations; see the quick estimate after this list;
  • Typical scenarios: general-purpose dialogue such as customer-service Q&A, content creation, and educational tutoring.
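
For a quick sense of what pay-as-you-go means in practice, the short sketch below turns the $0.002 per 1,000 tokens figure quoted above into a monthly estimate; the traffic numbers are made-up placeholders, not measurements from the article.

# Back-of-the-envelope cost estimate for the ChatGPT API.
# The price is the $0.002 per 1,000 tokens quoted above; the traffic
# numbers are hypothetical placeholders -- replace them with your own.
PRICE_PER_1K_TOKENS = 0.002  # USD, gpt-3.5-turbo

def monthly_cost(requests_per_day: int, avg_tokens_per_request: int, days: int = 30) -> float:
    """Estimate the monthly API cost in USD."""
    total_tokens = requests_per_day * avg_tokens_per_request * days
    return total_tokens / 1000 * PRICE_PER_1K_TOKENS

if __name__ == "__main__":
    # e.g. 5,000 requests a day at roughly 600 tokens per round trip
    print(f"Estimated monthly cost: ${monthly_cost(5000, 600):.2f}")  # about $180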

2. Rasa-based chatbot development in practice

2.1 Environment setup: bootstrapping the Python ecosystem

# Create a virtual environment (Python 3.8+ recommended)
python -m venv rasa_env
source rasa_env/bin/activate   # Linux/Mac
rasa_env\Scripts\activate      # Windows

# Install the Rasa core library
pip install rasa

# Initialize the project (automatically generates sample files)
rasa init --no-prompt

2.2 Domain modeling: the DNA of a dialogue system

Example (domain.yml):

version: "3.0"
 intents:
   - greet
   - ask_weather
   - goodbye
 
 entities:
   - city
 
 Responses:
   utter_greet:
     - text: "Hello! I am a weather query robot, please enter the city name to check the weather"
   utter_weather:
     - text: "🌦️ {city} Today's weather: sunny, temperature 25℃"
 
 actions:
   - action_fetch_weather
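
The domain above declares a custom action, action_fetch_weather, whose logic lives in Python. Below is a minimal actions.py sketch with a hard-coded placeholder reply instead of a real weather API call; section 4 shows a variant that delegates the lookup to ChatGPT, and section 4.2 shows how the action server is registered in endpoints.yml.

# actions.py -- minimal sketch of the custom action declared in the domain.
# The weather data here is a hard-coded placeholder; a real implementation
# would call a weather API.
from typing import Any, Dict, List, Text

from rasa_sdk import Action, Tracker
from rasa_sdk.executor import CollectingDispatcher


class ActionFetchWeather(Action):
    def name(self) -> Text:
        return "action_fetch_weather"

    def run(
        self,
        dispatcher: CollectingDispatcher,
        tracker: Tracker,
        domain: Dict[Text, Any],
    ) -> List[Dict[Text, Any]]:
        city = tracker.get_slot("city")
        if not city:
            dispatcher.utter_message(text="Which city would you like to check?")
            return []
        # Placeholder reply; swap in a real weather lookup here.
        dispatcher.utter_message(text=f"🌦️ {city} today: sunny, 25℃")
        return []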

2.3 Training data preparation: fuel for the NLU model

Example (data/nlu.yml):

version: "3.0"
 nlu:
   - intent: greet
     examples: |
       - Hello
       - Good morning
       - Are you there
 
   - intent: ask_weather
     examples: |
       - [Beijing](city) What is the weather like
       - Check the weather forecast for [Shanghai](city)

Example (data/stories.yml):

version: "3.0"
 stories:
   - story: Simple query
     Steps:
       - intent: greet
       - action: utter_greet
       - intent: ask_weather
       - action: action_fetch_weather
       - intent: goodbye
       - action: utter_goodbye

2.4 Model training and optimization

# Train the NLU model
rasa train nlu

# Train the dialogue (Core) model
rasa train core

# Evaluate the trained models (use `rasa test nlu --cross-validation` for NLU cross-validation)
rasa test

2.5 Deployment and Web Integration

Back end (Flask-SocketIO for real-time communication; the Flask app relays each user message to a locally running Rasa server over its REST channel):

# app.py -- Flask-SocketIO front end that relays messages to a Rasa server.
# Start the Rasa server first with:  rasa run   (serves the REST channel on port 5005)
import requests
from flask import Flask, render_template
from flask_socketio import SocketIO, send

app = Flask(__name__)
app.config['SECRET_KEY'] = 'your_secret_key'
socketio = SocketIO(app)

# REST channel endpoint of the running Rasa server
RASA_URL = "http://localhost:5005/webhooks/rest/webhook"


@app.route('/')
def index():
    # The front-end page shown below, saved as templates/index.html
    return render_template('index.html')


@socketio.on('message')
def handle_message(msg):
    # Get user input
    user_input = msg['message']

    # Forward it to the Rasa server and collect its replies
    rasa_responses = requests.post(
        RASA_URL, json={"sender": "web_user", "message": user_input}
    ).json()
    response = " ".join(r.get("text", "") for r in rasa_responses) or "Sorry, I didn't catch that."

    # Return the response to the browser
    send({'message': response}, broadcast=True)


if __name__ == '__main__':
    socketio.run(app, debug=True)

Front-end page (save as templates/index.html so Flask's render_template can find it):

<!DOCTYPE html>
<html>
<head>
    <title>Rasa Chatbot</title>
    <style>
        .chat-container { height: 400px; overflow-y: auto; border: 1px solid #ccc; }
        .message { padding: 8px; margin: 5px; border-radius: 4px; }
        .user { background-color: #e3f2fd; text-align: right; }
        .bot { background-color: #f0f4c3; text-align: left; }
    </style>
</head>
<body>
    <div class="chat-container" id="chatbox"></div>
    <input type="text" id="userInput" placeholder="Input message...">
    <button onclick="sendMessage()">Send</button>

    <script src="https://cdnjs.cloudflare.com/ajax/libs/socket.io/4.0.1/socket.io.js"></script>
    <script>
        const socket = io();

        function sendMessage() {
            const message = document.getElementById('userInput').value;
            const chatbox = document.getElementById('chatbox');

            // Add the user message to the chat window
            chatbox.innerHTML += `<div class="message user">${message}</div>`;

            // Send the message to the backend
            socket.emit('message', { message });

            // Clear the input box
            document.getElementById('userInput').value = '';

            // Automatically scroll to the bottom
            chatbox.scrollTop = chatbox.scrollHeight;
        }

        // Receive the bot's response
        socket.on('message', (data) => {
            const chatbox = document.getElementById('chatbox');
            chatbox.innerHTML += `<div class="message bot">${data.message}</div>`;
            chatbox.scrollTop = chatbox.scrollHeight;
        });
    </script>
</body>
</html>

3. ChatGPT API rapid integration solution

3.1 API key acquisition

  1. Visit the OpenAI platform (platform.openai.com) and register an account;
  2. Generate an API key in the console and keep it safe; one way to load it without hard-coding it is sketched below.
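
One convenient way to keep the key out of source code (an optional extra, not something the OpenAI library requires) is to put it in a local .env file and load it with the python-dotenv package:

# Optional: load OPENAI_API_KEY from a local .env file.
# Requires:  pip install python-dotenv
# .env contains a single line such as:  OPENAI_API_KEY=sk-...
import os

from dotenv import load_dotenv

load_dotenv()  # reads .env from the current directory into the environment
api_key = os.getenv("OPENAI_API_KEY")
if not api_key:
    raise RuntimeError("OPENAI_API_KEY is not set")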

3.2 Python call example

import openai
import os

# Load the API key from an environment variable
openai.api_key = os.getenv("OPENAI_API_KEY")


def chat_with_gpt(prompt, max_tokens=50, temperature=0.7):
    # Uses the legacy openai<1.0 interface (openai.ChatCompletion)
    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": prompt}],
        max_tokens=max_tokens,
        temperature=temperature,
    )
    return response["choices"][0]["message"]["content"].strip()


# Test conversation
user_input = "Write a poem about autumn"
bot_response = chat_with_gpt(user_input)
print(f"User: {user_input}\nBot: {bot_response}")

3.3 Web-side integration (Flask example)

from flask import Flask, request, jsonify

app = Flask(__name__)


@app.route('/chat', methods=['POST'])
def chat():
    user_message = request.json['message']
    bot_response = chat_with_gpt(user_message)  # helper defined in section 3.2
    return jsonify({'response': bot_response})


if __name__ == '__main__':
    app.run(port=5000)
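
To sanity-check the endpoint, a request can be sent from any HTTP client; the short sketch below uses the requests library and assumes the Flask app above is running locally on port 5000.

# Quick manual test of the /chat endpoint (Flask app running on localhost:5000).
import requests

resp = requests.post(
    "http://localhost:5000/chat",
    json={"message": "Write a poem about autumn"},
    timeout=30,
)
print(resp.json()["response"])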

4. Hybrid architecture: the co-evolution of Rasa + ChatGPT

4.1 Architectural Design

In the hybrid architecture, Rasa stays at the front as the dialogue manager: it recognizes intents, fills slots, and handles structured, domain-specific requests, while open-ended or generative requests are delegated to the ChatGPT API from within Rasa custom actions.
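
One common way to wire the two together, shown here as an illustrative sketch rather than the article's prescribed setup, is to override Rasa's default fallback action so that any message Rasa cannot handle confidently is forwarded to the ChatGPT API. This assumes a fallback rule or policy is configured to trigger action_default_fallback.

# actions.py (sketch) -- forward unrecognized user messages to ChatGPT.
import os
from typing import Any, Dict, List, Text

import openai
from rasa_sdk import Action, Tracker
from rasa_sdk.events import UserUtteranceReverted
from rasa_sdk.executor import CollectingDispatcher

openai.api_key = os.getenv("OPENAI_API_KEY")


class ActionGPTFallback(Action):
    def name(self) -> Text:
        # Overrides Rasa's built-in default fallback action.
        return "action_default_fallback"

    def run(
        self,
        dispatcher: CollectingDispatcher,
        tracker: Tracker,
        domain: Dict[Text, Any],
    ) -> List[Dict[Text, Any]]:
        user_text = tracker.latest_message.get("text", "")
        response = openai.ChatCompletion.create(
            model="gpt-3.5-turbo",
            messages=[{"role": "user", "content": user_text}],
        )
        dispatcher.utter_message(text=response["choices"][0]["message"]["content"])
        # Forget the unhandled turn so it does not pollute the dialogue history.
        return [UserUtteranceReverted()]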

4.2 Implementation steps

  1. Rasa handles structured requests
# actions.py
import os

import openai
from rasa_sdk import Action

openai.api_key = os.getenv("OPENAI_API_KEY")


class ActionQueryWeather(Action):
    def name(self):
        return "action_query_weather"

    def run(self, dispatcher, tracker, domain):
        city = tracker.get_slot("city")
        prompt = f"Query the real-time weather of {city}"
        response = openai.ChatCompletion.create(
            model="gpt-3.5-turbo",
            messages=[{"role": "user", "content": prompt}],
        )
        dispatcher.utter_message(text=response["choices"][0]["message"]["content"])
        return []

  2. Configure Rasa to call the custom action server

# endpoints.yml
# Points Rasa at the custom action server; start it separately with: rasa run actions
action_endpoint:
  url: "http://localhost:5055/webhook"

5. Best practices for deployment and operations

5.1 Deployment Plan Selection

Plan                           | Applicable scenarios                | Cost   | Flexibility
Local server                   | Small projects / test environments  | Low    | Medium
Cloud functions (AWS Lambda)   | Highly fluctuating traffic          | Medium | High
Containers (Docker + K8s)      | Enterprise production environments  | Higher | Very high

5.2 Performance optimization tips

  1. Request batching: merge multiple user requests to reduce the number of API calls;
  2. Caching: set a cache expiry time for high-frequency questions (such as weather queries); a minimal sketch follows this list;
  3. Load balancing: use Nginx to distribute requests across multiple Rasa instances.
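
As a concrete illustration of the caching tip, here is a minimal in-memory TTL cache wrapped around the chat_with_gpt helper from section 3.2; the one-hour expiry is an arbitrary example value, and a production setup would more likely use a shared store such as Redis.

# Minimal in-memory TTL cache for high-frequency questions (tip 2).
import time

CACHE_TTL_SECONDS = 3600          # example expiry: one hour
_cache = {}                       # prompt -> (timestamp, cached response)


def cached_chat(prompt):
    now = time.time()
    hit = _cache.get(prompt)
    if hit and now - hit[0] < CACHE_TTL_SECONDS:
        return hit[1]                    # still fresh: serve from cache
    response = chat_with_gpt(prompt)     # helper from section 3.2
    _cache[prompt] = (now, response)
    return response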

6. In-depth customization of business scenarios

6.1 Logistics Supply Chain Cases

Requirement: optimize multimodal scheduling decisions.
Implementation:

def analyze_logistics_data(data):
    # Use ChatGPT to parse unstructured logistics data
    prompt = f"Analyze the following logistics data:\n{data}"
    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": prompt}],
    )
    return response["choices"][0]["message"]["content"]

6.2 Education platform case

Requirement: recommend personalized learning paths.
Implementation:

def generate_study_plan(student_data):
    prompt = f"Create a study plan based on the following student data:\n{student_data}"
    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": prompt}],
    )
    return response["choices"][0]["message"]["content"]

7. Summary and Outlook

This article has demonstrated the complete development path from a basic chatbot to an enterprise-grade intelligent dialogue system by combining the Rasa framework with the ChatGPT API. As large-model capabilities continue to evolve, the following directions are worth watching:

  1. Multimodal interaction: integrating speech recognition, image understanding, and other capabilities;
  2. Reinforcement learning: optimizing dialogue strategies from user feedback;
  3. Edge computing: delivering low-latency responses on local devices.

Start hands-on practice now and build your own intelligent dialogue system!