How to Create a Load Balancing Server using Node.js?
If your website or application does not receive many requests, you do not need load balancing. But once it becomes popular and starts receiving a lot of traffic, your underlying server may not be able to cope, because a single Node.js server cannot handle a very large volume of traffic on its own.
Adding more machines solves this, but to share the traffic across all of your application servers you need a load balancer.
Load balancer: A load balancer acts like a traffic cop in front of your application servers, routing client requests across all of the servers capable of fulfilling them in a way that maximizes speed and capacity utilization, and making sure that no single server is overworked, which could degrade performance.
How to set up a load-balanced server?
1. Using the cluster module: Node.js has a built-in cluster module that lets you take advantage of multi-core systems. With this module, you can start one Node.js instance for each core of the system. The master process listens on a port to accept client requests and distributes them across the workers in an intelligent way, so you can use the full processing capacity of your machine.
The following examples show the performance difference with and without the cluster module.
Without the cluster module:
Make sure you have installed the express module using the command below (the crypto module is built into Node.js, so it does not need to be installed separately):
npm install express
index.js
Javascript
const { generateKeyPair } = require('crypto');
const app = require('express')();

// API endpoint
// Send the public key as the response
app.get('/key', (req, res) => {
    generateKeyPair('rsa', {
        modulusLength: 2048,
        publicKeyEncoding: {
            type: 'spki',
            format: 'pem'
        },
        privateKeyEncoding: {
            type: 'pkcs8',
            format: 'pem',
            cipher: 'aes-256-cbc',
            passphrase: 'top secret'
        }
    }, (err, publicKey, privateKey) => {
        // Handle errors, then use the
        // generated key pair.
        if (err) return res.status(500).send('Key generation failed');
        res.send(publicKey);
    })
})

app.listen(3000, err => {
    err ?
        console.log("Error in server setup") :
        console.log('Server listening on PORT 3000')
});
Run the index.js file using the following command:
node index.js
Output: We will see the following output on the terminal screen:
Server listening on PORT 3000
Now open a browser and go to http://localhost:3000/key; you will see output like the following:
-----BEGIN PUBLIC KEY-----
MIIBIjANBgkqhkiG9w0BAQEFAAOCAQ8AMIIBCgKCAQEAwAneYp5HlT93Y3ZlPAHjZAnPFvBskQKKfo4an8jskcgEuG85KnZ7/16kQw2Q8/7Ksdm0sIF7qmAUOu0B773X1BXQ0liWh+ctHIq/C0e9eM1zOsX6vWwX5Y+WH610cpcb50ltmCeyRmD5Qvf+OE/CBqYrQxVRf4q9+029woF84Lk4tK6OXsdU+Gdqo2FSUzqhwwvYZJJXhW6Gt259m0wDYTZlactvfwhe2EHkHAdN8RdLqiJH9kZV47D6sLS9YG6Ai/HneBIjzTtdXQjqi5vFY+H+ixZGeShypVHVS119Mi+hnHs7SMzY0GmRleOpna58O1RKPGQg49E7Hr0dz8eh6QIDAQAB
-----END PUBLIC KEY-----
The code above listens on port 3000 and sends the public key as the response. Generating an RSA key pair is CPU-intensive work, and here a single Node.js instance is doing it on a single core. To measure the performance, we tested the server with the autocannon benchmarking tool.
Running 500 concurrent connections for 10 seconds, the server responded to about 2000 requests, averaging 190.1 requests per second.
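For reference, a test like this can be reproduced with the autocannon CLI (assuming it is installed globally, for example with npm install -g autocannon); the -c flag sets the number of concurrent connections and -d the test duration in seconds:
autocannon -c 500 -d 10 http://localhost:3000/key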
Using the cluster module:
Javascript
const express = require('express');
const cluster = require('cluster');
const { generateKeyPair } = require('crypto');

// Check the number of available CPU cores.
const numCPUs = require('os').cpus().length;

const app = express();
const PORT = 3000;

// For the Master process
if (cluster.isMaster) {
    console.log(`Master ${process.pid} is running`);

    // Fork workers.
    for (let i = 0; i < numCPUs; i++) {
        cluster.fork();
    }

    // This event fires when a worker dies
    cluster.on('exit', (worker, code, signal) => {
        console.log(`worker ${worker.process.pid} died`);
    });
}
// For the Worker processes
else {
    // Workers can share any TCP connection
    // In this case it is an HTTP server
    app.listen(PORT, err => {
        err ?
            console.log("Error in server setup") :
            console.log(`Worker ${process.pid} started`);
    });

    // API endpoint
    // Send the public key
    app.get('/key', (req, res) => {
        generateKeyPair('rsa', {
            modulusLength: 2048,
            publicKeyEncoding: {
                type: 'spki',
                format: 'pem'
            },
            privateKeyEncoding: {
                type: 'pkcs8',
                format: 'pem',
                cipher: 'aes-256-cbc',
                passphrase: 'top secret'
            }
        }, (err, publicKey, privateKey) => {
            // Handle errors, then use the
            // generated key pair.
            if (err) return res.status(500).send('Key generation failed');
            res.send(publicKey);
        })
    })
}
Run the index.js file using the following command:
node index.js
Output: We will see the following output on the terminal screen:
Master 16916 is running
Worker 6504 started
Worker 14824 started
Worker 20868 started
Worker 12312 started
Worker 9968 started
Worker 16544 started
Worker 8676 started
Worker 11064 started
Now open a browser and go to http://localhost:3000/key; you will see output like the following:
-----BEGIN PUBLIC KEY-----
MIIBIjANBgkqhkiG9w0BAQEFAAOCAQ8AMIIBCgKCAQEAzxMQp9y9MblP9dXWuQhfsdlEVnrgmCIyP7CAveYEkI6ua5PJFLRStKHTe3O8rxu+h6I2exXn92F/4RE9Yo8EOnrUCSlqy9bl9qY8D7uBMWir0I65xMZu3rM9Yxi+6gP8H4CMDiJhLoIEap+d9CzrOastDPwI+HF+6nmLkHvuq9X5aORvdiOBwMooIoiRpHbgcHovSerJIfQipGs74IiR107GbpznSUxMIuwV1fgc6mAULuGZl+Daj0SDxfAjk8KiHyXbfHe5stkPNOCWIsbAtCbGN0bCTR8ZJCLdZ4/VGr+eE0NOvOrElXdXLTDVVzO5dKadoEAtzZzzuQId2P/zJwIDAQAB
-----END PUBLIC KEY-----
The Node.js application above is started on every core of the system. The master process accepts the requests and distributes them across all the workers. Running the same autocannon test in this case gave the following results:
Running 500 concurrent connections for 10 seconds, the server responded to about 5000 requests, with a reported average of 162.06 requests per second.
So, with the cluster module, you can handle many more requests. Sometimes, however, that is still not enough; if that is your situation, the next option is horizontal scaling.
2. Using Nginx: If your system has more than one application server that can respond and you need to distribute client requests across all of them, you can use Nginx as a reverse proxy. Nginx sits in front of your server pool and distributes the requests in an intelligent way.
In the following example, there are 4 instances of the same Node.js application running on different ports; you could also use separate servers.
Filename: index.js
Javascript
const app = require('express')();

// API endpoint
app.get('/', (req, res) => {
    res.send("Welcome to GeeksforGeeks !");
})

// Launching the application on several ports
app.listen(3000);
app.listen(3001);
app.listen(3002);
app.listen(3003);
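Each app.listen call above starts a separate HTTP server on its own port from the same Express app. Assuming the file is saved as index.js, all four instances can be started with:
node index.js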
Now install Nginx on your machine and create a new file named your-domain.com.conf in /etc/nginx/conf.d/ with the following contents.
upstream my_http_servers {
    # httpServer1 listens to port 3000
    server 127.0.0.1:3000;

    # httpServer2 listens to port 3001
    server 127.0.0.1:3001;

    # httpServer3 listens to port 3002
    server 127.0.0.1:3002;

    # httpServer4 listens to port 3003
    server 127.0.0.1:3003;
}

server {
    listen 80;
    server_name your-domain.com www.your-domain.com;

    location / {
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header Host $http_host;
        proxy_pass http://my_http_servers;
    }
}
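After saving the file, Nginx normally needs to have its configuration checked and reloaded. The exact commands depend on your setup, but on most systemd-based Linux distributions they look like this:
sudo nginx -t
sudo systemctl reload nginx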
3. Using an Express web server: An Express web server has plenty of advantages. If you are comfortable with Node.js, you can implement your own basic Express load balancer, as the following example shows.
Step 1: Create an empty Node.js application.
mkdir LoadBalancer
cd LoadBalancer
npm init -y
Step 2: Install the required dependencies (express, axios, and concurrently) using the following commands.
npm i express axios
npm i concurrently -g
Step 3: Create two files: config.js for the load balancer server and index.js for the application servers.
Here, the filename is config.js
Javascript
const express = require('express');
const path = require('path');
const axios = require('axios');

const app = express();

// Application servers
const servers = [
    "http://localhost:3000",
    "http://localhost:3001"
]

// Track the current application server to send the request to
let current = 0;

// Receive a new request and
// forward it to an application server
const handler = async (req, res) => {
    // Destructure the following properties from the request object
    const { method, url, headers, body } = req;

    // Select the current server to forward the request to
    const server = servers[current];

    // Update the tracker to select the next server
    current === (servers.length - 1) ? current = 0 : current++;

    try {
        // Request the underlying application server
        const response = await axios({
            url: `${server}${url}`,
            method: method,
            headers: headers,
            data: body
        });

        // Send the response data from the
        // application server back to the client
        res.send(response.data);
    }
    catch (err) {
        // Send back an error message
        res.status(500).send("Server error!");
    }
}

// Serve the favicon.ico image, assuming one
// exists in the project folder
app.get('/favicon.ico', (req, res) =>
    res.sendFile(path.join(__dirname, 'favicon.ico')));

// When a new request is received,
// pass it to the handler method
app.use((req, res) => { handler(req, res) });

// Listen on PORT 8080
app.listen(8080, err => {
    err ?
        console.log("Failed to listen on PORT 8080") :
        console.log("Load Balancer Server listening on PORT 8080");
});
Here, the filename is index.js
Javascript
const express = require('express');

const app1 = express();
const app2 = express();

// Handler method
const handler = num => (req, res) => {
    const { method, url, headers, body } = req;
    res.send('Response from server ' + num);
}

// Only handle GET and POST requests
// Receive the request and pass it to the handler method
app1.get('*', handler(1)).post('*', handler(1));
app2.get('*', handler(2)).post('*', handler(2));

// Start a server on PORT 3000
app1.listen(3000, err => {
    err ?
        console.log("Failed to listen on PORT 3000") :
        console.log("Application Server listening on PORT 3000");
});

// Start a server on PORT 3001
app2.listen(3001, err => {
    err ?
        console.log("Failed to listen on PORT 3001") :
        console.log("Application Server listening on PORT 3001");
});
Explanation: The code above starts two Express applications, one on port 3000 and the other on port 3001. The separate load balancer process alternates between the two, sending one request to port 3000, the next to port 3001, and the one after that back to port 3000, and so on.
Step 4: Open a command prompt in the project folder and run both scripts in parallel using concurrently.
concurrently "node config.js" "node index.js"
Output:
We will see the startup messages of the load balancer and both application servers on the console.
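Based on the console.log calls in the two scripts, the logs should include lines like the following (concurrently interleaves the output of both processes, so the exact order may vary):
Load Balancer Server listening on PORT 8080
Application Server listening on PORT 3000
Application Server listening on PORT 3001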
Now open a browser, go to http://localhost:8080/, and make a few requests; the responses alternate between the two application servers.
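To see the round-robin behaviour without a browser, you can also send a couple of requests with curl while both scripts are running:
curl http://localhost:8080/
curl http://localhost:8080/
The first request returns "Response from server 1", the second returns "Response from server 2", and the pattern then repeats.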