
What is the difference between gRPC and traditional RPC? One article to make it clear!

Views: 764 · 2025-04-18 11:42:44

Nowadays, everyone working on system development likes the "microservice architecture": simply put, you break a large system into many small services, which makes it more flexible and easier to scale. So how do these services talk to each other? That depends on a technology called RPC (Remote Procedure Call). Today we will talk about its "evolved version", gRPC, and see how it differs from traditional RPC.

1. Understand a few concepts first

What is RPC?

You can think of it as "calling a function across machines". It looks like you are calling a local function, but it actually runs on another server. Traditional RPC has many implementations, such as XML-RPC, JSON-RPC, and SOAP, and the data format is usually XML or JSON.
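To make this concrete, here is a minimal sketch using Python's built-in XML-RPC support (the `add` function, addresses, and port handling are made up for illustration). The client calls `add` as if it were local, but the function actually runs inside the server:

```python
import threading
from xmlrpc.server import SimpleXMLRPCServer
from xmlrpc.client import ServerProxy

# A toy "remote" service (normally this runs on another machine)
server = SimpleXMLRPCServer(("127.0.0.1", 0), logRequests=False)
server.register_function(lambda a, b: a + b, "add")
threading.Thread(target=server.serve_forever, daemon=True).start()

# The client calls add() as if it were a local function,
# but the work actually happens in the server process
port = server.server_address[1]
proxy = ServerProxy(f"http://127.0.0.1:{port}")
result = proxy.add(2, 3)
print(result)  # 5
server.shutdown()
```

That "looks local, runs remote" illusion is the whole point of RPC.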

So what is gRPC?

A more efficient RPC framework from Google. It is based on the HTTP/2 protocol and uses Protocol Buffers (Protobuf for short) as the data format. Good performance, high efficiency, and automatic code generation: sounds great, right?

2. The major differences between gRPC and traditional RPC (in plain language)

| Comparison point | Traditional RPC | gRPC |
| --- | --- | --- |
| Transport protocol | Usually HTTP/1.1 or raw TCP | HTTP/2 with multiplexing, fast |
| Data format | XML/JSON: readable but bulky | Protobuf: small and fast to parse |
| Code generation | Usually written by hand | Client/server code generated automatically |
| Streaming | Generally not supported | Four calling modes, including bidirectional streaming |
| Cross-language support | Somewhat troublesome | Official support for many languages (Go, Python, etc.) |
| Error handling | Relies on HTTP status codes | Standard error codes with detailed descriptions |
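The table mentions that gRPC supports four calling modes. In a .proto file they look like this (a sketch with made-up service and message names):

```protobuf
service ChatService {
  rpc GetReply(Msg) returns (Msg) {}             // unary: one request, one response
  rpc Subscribe(Msg) returns (stream Msg) {}     // server streaming
  rpc Upload(stream Msg) returns (Msg) {}        // client streaming
  rpc Chat(stream Msg) returns (stream Msg) {}   // bidirectional streaming
}
```

The `stream` keyword on the request side, the response side, or both is all it takes to pick a mode.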

3. An example to make it more intuitive

Calling an interface with traditional JSON-RPC

{
  "jsonrpc": "2.0",
  "method": "getUserProfile",
  "params": {
    "userId": 123,
    "includeDetails": true
  },
  "id": 1
}

Humans can read it, but the payload is large and parsing is slow.
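To get a feel for the size gap, compare the JSON-RPC payload above with a crude fixed-layout binary encoding of the same two parameters. This is not real Protobuf (which uses field tags and varints), just a sketch of why binary formats are smaller:

```python
import json
import struct

# The JSON-RPC payload from the example above
payload = {
    "jsonrpc": "2.0",
    "method": "getUserProfile",
    "params": {"userId": 123, "includeDetails": True},
    "id": 1,
}
json_bytes = json.dumps(payload).encode("utf-8")

# A crude binary stand-in: a 4-byte int plus a 1-byte bool
binary_bytes = struct.pack("<i?", 123, True)

print(f"JSON: {len(json_bytes)} bytes, binary: {len(binary_bytes)} bytes")
```

The JSON version spends most of its bytes on field names and punctuation; the binary version carries only the values.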

Use gRPC + Protobuf

First define the protocol:

syntax = "proto3";

service UserService {
  rpc GetUserProfile(UserRequest) returns (UserProfile) {}
}

message UserRequest {
  int32 user_id = 1;
  bool include_details = 2;
}

message UserProfile {
  int32 user_id = 1;
  string username = 2;
  string email = 3;
}
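From this .proto file, gRPC's tooling generates the message classes and service stubs for you. With the official grpcio-tools package, the command looks like this (the file name `user.proto` is an assumption; it produces the `user_pb2` and `user_pb2_grpc` modules used below):

```shell
# Generates user_pb2.py (messages) and user_pb2_grpc.py (service stubs)
python -m grpc_tools.protoc -I. --python_out=. --grpc_python_out=. user.proto
```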

Then you can call it like this (stub is a UserServiceStub created from the generated code):

request = user_pb2.UserRequest(user_id=123, include_details=True)
response = stub.GetUserProfile(request)
print(f"Username: {response.username}")

The structure is clearer, the payload is smaller, and transmission is more efficient.

4. Comparison of request processing methods

Traditional RPC calling method

# XML-RPC example
import xmlrpc.client

# Create a client
server = xmlrpc.client.ServerProxy("http://localhost:8000")

# Every call sets up a new HTTP connection
result = server.get_user_info(123)
print(f"User information: {result}")

# And it has to reconnect again for the next call
another_result = server.get_product_details(456)

It is like having to redial the phone for every single sentence: a waste of time!

How to call gRPC

import grpc
import user_service_pb2
import user_service_pb2_grpc

# Create a connection channel
with grpc.insecure_channel('localhost:50051') as channel:
    # Create a stub (the call object)
    stub = user_service_pb2_grpc.UserServiceStub(channel)

    # Multiple methods can be called over the same connection
    response1 = stub.GetUser(user_service_pb2.GetUserRequest(user_id=123))
    response2 = stub.GetProduct(user_service_pb2.GetProductRequest(product_id=456))

    # Streaming calls receive data piece by piece, like watching a video
    # (method and field names here follow the request message names)
    for product in stub.ListProducts(user_service_pb2.ListProductsRequest(category="Mobile")):
        print(f"Product: {product.name}, Price: {product.price}")

It is like setting up a dedicated line: the call stays connected, and you can talk and listen at the same time. Very convenient!

5. How big is the performance gap?

Scenario: fetching information for 1,000 users

Traditional REST (HTTP/1 + JSON) version:

import requests
import time

start_time = time.time()
users = []

# Send 1000 independent HTTP requests; each one sets up a new connection
for i in range(1000):
    response = requests.get(f"http://localhost:8000/users/{i}")
    users.append(response.json())

duration = time.time() - start_time
print(f"REST API: fetched {len(users)} users in {duration:.2f} seconds")
# Example output: REST API: fetched 1000 users in 10.45 seconds

gRPC version:

import grpc
import time
import user_pb2
import user_pb2_grpc

start_time = time.time()

with grpc.insecure_channel('localhost:50051') as channel:
    stub = user_pb2_grpc.UserServiceStub(channel)

    # Fetch all users in one batched (server-streaming) call
    users = list(stub.GetUsers(user_pb2.GetUsersRequest(limit=1000)))

duration = time.time() - start_time
print(f"gRPC: fetched {len(users)} users in {duration:.2f} seconds")
# Example output: gRPC: fetched 1000 users in 1.23 seconds

To summarize, gRPC is faster because it:

  • Supports connection reuse (no need to reconnect for every call)

  • Uses Protobuf, so payloads are lighter and parse faster

  • Supports streaming, so batch transfers are efficient
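The connection-reuse point can be felt even with plain HTTP/1.1 keep-alive. Below, a throwaway local server (made up purely for the demo) answers three requests over one persistent http.client connection, which is the same idea as keeping one gRPC channel open:

```python
import http.client
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

# A tiny stand-in server, hypothetical and local-only
class Handler(BaseHTTPRequestHandler):
    protocol_version = "HTTP/1.1"  # keep-alive: one TCP connection serves many requests

    def do_GET(self):
        body = b'{"ok": true}'
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, format, *args):
        pass  # silence request logging

server = HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# One persistent connection, many calls: the same idea as one gRPC channel
conn = http.client.HTTPConnection("127.0.0.1", server.server_port)
bodies = []
for i in range(3):
    conn.request("GET", f"/users/{i}")
    bodies.append(conn.getresponse().read())
conn.close()
server.shutdown()
print(bodies)
```

gRPC goes further: HTTP/2 multiplexes many concurrent calls over that one connection instead of queuing them.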

6. Comparison of error handling methods

REST Error Handling:

Error returned by the server:

{
  "error": {
    "code": 404,
    "message": "User not found",
    "details": "The user with ID 12345 does not exist"
  }
}

Client processing:

fetch('/api/users/12345')
  .then(response => {
    if (!response.ok) {
      return response.json().then(err => {
        throw new Error(`${err.error.code}: ${err.error.message}`);
      });
    }
    return response.json();
  })
  .catch(error => console.error('Error:', error));

It relies on HTTP status codes, but the error body format is not standardized, so every client has to parse it by hand.

gRPC error handling:

The server sets the error:

def GetUser(self, request, context):
    user = database.find_user(request.user_id)
    if not user:
        context.set_code(grpc.StatusCode.NOT_FOUND)
        context.set_details(f"User not found: {request.user_id}")
        return user_pb2.UserProfile()  # Return an empty object
    return user

The client handles the error:

try:
    response = stub.GetUser(request)
    print(f"User information: {response}")
except grpc.RpcError as e:
    if e.code() == grpc.StatusCode.NOT_FOUND:
        print(f"Error: the user does not exist - {e.details()}")
    else:
        print(f"RPC error: {e.code()} - {e.details()}")

Standard error codes plus a description: the client can catch them directly, as conveniently as handling a local exception!

7. Which one should you use in practice?

When to use the traditional REST API?

  1. The front end calls the API directly

    // Calling a REST API from the browser is very convenient
    fetch('/api/products')
      .then(res => res.json())
      .then(products => console.log(products));

  2. Integrating with third-party platforms For example, the WeChat Pay and Alipay APIs are REST, so you have to follow their conventions.

  3. Simple systems Small projects do not chase performance, and REST is faster to develop.

When to use gRPC?

  1. Internal microservice communication With many services calling each other frequently, gRPC is fast and stable.

  2. Real-time data applications

    // Real-time push of stock prices (the pb message and field names are illustrative)
    func (s *StockServer) PriceStream(req *pb.PriceRequest, stream pb.StockService_PriceStreamServer) error {
      for {
        price := getLatestPrice()
        if err := stream.Send(&pb.PriceUpdate{
          Symbol:    req.Symbol,
          Price:     price,
          Timestamp: time.Now().Unix(),
        }); err != nil {
          return err
        }
        time.Sleep(1 * time.Second)
      }
    }
  3. Mobile applications Mobile data is expensive; gRPC payloads are small and save traffic.

  4. Multi-language systems A Python service calling a Go service, or a Java service calling a C# service: no problem at all.

8. One-sentence summary

REST API is like Mandarin: everyone can understand it. gRPC is like a highway: there is an entry barrier, but once you are on it, you go fast!

If you are building interfaces for ordinary users or a simple system, a REST API is enough.

But if you are building microservices and need high performance, multi-language support, and streaming, then go with gRPC without hesitation!