
Interview Frequently Asked Questions - Front-end performance optimization of large file uploads

Popularity: 60 · 2024-07-24 11:51:00

Large file upload is a common requirement in front-end development, especially when handling HD images, videos, or other large files. Optimizing large file uploads not only improves the user experience but also reduces the load on the server. In this article we discuss several common optimization techniques for large file uploads, including file slicing and concurrent uploads, resumable uploads, back-end processing optimization, security considerations, and user experience optimization.

1. Preamble

In modern Web applications, letting users upload large files has become a common requirement. However, uploading a large file directly faces many challenges: the upload may be interrupted by an unstable network, long upload times hurt the user experience, and the server comes under heavy load. Optimizing the performance of large file uploads is therefore particularly important.

2. File slicing and concurrent uploads

2.1 File Slicing Principles

File slicing (Chunking) is a method of dividing a large file into a number of small segments, each of which is uploaded independently. This can effectively reduce the amount of data uploaded in a single upload and lower the probability of upload failure.

2.2 Implementation steps

  1. Front-end slicing: use the slice method of the Blob object to split the file into chunks.
  2. Concurrent upload: upload multiple slices concurrently (the example below uses Promise.all).
  3. Merge request: notify the server to merge the slices once all uploads are complete.
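The slicing step can be sketched as a pure function that computes the byte range each slice covers; in the browser, each range then maps to a Blob via file.slice(start, end). A minimal sketch (the function name is illustrative):

```javascript
// Compute the [start, end) byte ranges for slicing a file of `fileSize`
// bytes into chunks of at most `chunkSize` bytes.
function getChunkRanges(fileSize, chunkSize) {
  const ranges = [];
  for (let start = 0; start < fileSize; start += chunkSize) {
    ranges.push({ start, end: Math.min(start + chunkSize, fileSize) });
  }
  return ranges;
}
```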

3. Resumable uploads

Resumable uploads allow an interrupted upload to continue from the last completed slice, avoiding re-uploading the entire file.

3.1 Implementation steps

  1. Front-end progress logging: use localStorage to record which slices have already been uploaded.
  2. Resume after interruption: before uploading, check which slices are missing and upload only the unfinished parts.
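The "which slices are missing" check is just a set difference between all chunk indices and the recorded ones. A minimal sketch (the function name is illustrative; the uploaded list is what would be read back from localStorage):

```javascript
// Given the chunk indices already uploaded (as recorded in localStorage)
// and the total chunk count, return the indices still to be uploaded.
function pendingChunks(uploadedChunks, totalChunks) {
  const done = new Set(uploadedChunks);
  const pending = [];
  for (let i = 0; i < totalChunks; i++) {
    if (!done.has(i)) pending.push(i);
  }
  return pending;
}
```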

4. Back-end processing optimization

4.1 Chunk reception and merging

The server needs to accept slice-upload requests and merge the slices into the final file once all of them have arrived. This logic can be implemented with middleware or any server-side language.

 

5. Security considerations

5.1 File type validation

File types should be checked on both the front-end and back-end to ensure that the uploaded file types are as expected.

5.2 File size limitations

Limit the size of individual files and total uploaded files to prevent malicious users from uploading oversized files that cause server stress.
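Both checks can be expressed as a small server-side validation function run before a file is accepted. A sketch: the file object is assumed to expose mimetype and size (as multer's file object does), and the allowed types and 10 MB cap are illustrative choices, not values from the article:

```javascript
// Illustrative whitelist and size cap, not values mandated by the article.
const ALLOWED_TYPES = ['image/png', 'image/jpeg', 'video/mp4'];
const MAX_FILE_SIZE = 10 * 1024 * 1024; // 10MB

// Validate a candidate upload before accepting it.
function validateUpload(file) {
  if (!ALLOWED_TYPES.includes(file.mimetype)) {
    return { ok: false, reason: 'unexpected file type' };
  }
  if (file.size > MAX_FILE_SIZE) {
    return { ok: false, reason: 'file too large' };
  }
  return { ok: true };
}
```

With Express and multer, the same checks can instead be wired into the middleware via multer's fileFilter and limits options.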

6. User experience optimization

6.1 Progress display

Enhance user experience by displaying an upload progress bar to keep users informed of the upload progress.
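With chunked uploads, progress is naturally the fraction of chunks completed, which maps directly onto a progress element's value. A minimal sketch (function name is illustrative):

```javascript
// Progress as a percentage of chunks uploaded, suitable for driving
// a <progress> element's `value` attribute.
function uploadProgress(uploadedCount, totalChunks) {
  if (totalChunks === 0) return 0;
  return Math.round((uploadedCount / totalChunks) * 100);
}
```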

 

6.2 Handling of network fluctuations

Considering that users may upload files in environments with unstable networks, a failure retry mechanism can be added.
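A retry mechanism can be a small wrapper around any async operation, such as the fetch call that uploads a single chunk. A sketch (names and the retry/delay defaults are illustrative):

```javascript
// Retry an async operation up to `retries` extra times with a fixed
// delay between attempts; rethrows the last error if all attempts fail.
async function withRetry(fn, retries = 3, delayMs = 1000) {
  let lastError;
  for (let attempt = 0; attempt <= retries; attempt++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      if (attempt < retries) {
        await new Promise((r) => setTimeout(r, delayMs));
      }
    }
  }
  throw lastError;
}
```

A chunk upload would then be wrapped as withRetry(() => uploadChunk(chunk, i, fileName)), so a transient network failure does not abort the whole upload.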

Full Example

Back-end code (Node.js + Express)

Install dependencies:

npm init -y
npm install express multer body-parser

(fs and path are built into Node.js and do not need to be installed.)

 

Create the server file (server.js):

const express = require('express');
const multer = require('multer');
const fs = require('fs');
const path = require('path');
const bodyParser = require('body-parser');

const app = express();
const upload = multer({ dest: 'uploads/' });
app.use(bodyParser.json());

// Route: handle file chunk uploads
app.post('/upload', upload.single('chunk'), (req, res) => {
  const { index, fileName } = req.body;
  const chunkPath = path.join(__dirname, 'uploads', `${fileName}-${index}`);
  fs.renameSync(req.file.path, chunkPath);
  res.status(200).send('Chunk uploaded');
});

// Route: merge chunks
app.post('/merge', (req, res) => {
  const { totalChunks, fileName } = req.body;
  const filePath = path.join(__dirname, 'uploads', fileName);
  const writeStream = fs.createWriteStream(filePath);

  for (let i = 0; i < totalChunks; i++) {
    const chunkPath = path.join(__dirname, 'uploads', `${fileName}-${i}`);
    const data = fs.readFileSync(chunkPath);
    writeStream.write(data);
    fs.unlinkSync(chunkPath);
  }

  writeStream.end();
  res.status(200).send('File merged');
});

app.listen(3000, () => {
  console.log('Server started on http://localhost:3000');
});

 

Front-end code (HTML + JavaScript)

  1. Create the HTML file:
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="UTF-8">
  <meta name="viewport" content="width=device-width, initial-scale=1.0">
  <title>Large File Upload</title>
</head>
<body>
  <input type="file" id="fileInput">
  <progress id="progressBar" value="0" max="100"></progress>
  <button onclick="uploadFile()">Upload File</button>
  <!-- the JavaScript file from the next step; the name upload.js is assumed -->
  <script src="upload.js"></script>
</body>
</html>

 

  2. Create the JavaScript file referenced by the script tag:
const fileInput = document.getElementById('fileInput');
const progressBar = document.getElementById('progressBar');
const chunkSize = 5 * 1024 * 1024; // 5MB
let totalChunks = 0; // set in uploadFile, read by updateProgressBar

const uploadChunk = async (chunk, index, fileName) => {
  const formData = new FormData();
  formData.append('chunk', chunk);
  formData.append('index', index);
  formData.append('fileName', fileName);

  await fetch('/upload', {
    method: 'POST',
    body: formData
  });

  updateProgressBar(index);
};

const updateProgressBar = (index) => {
  const uploadedChunks = JSON.parse(localStorage.getItem('uploadedChunks')) || [];
  if (!uploadedChunks.includes(index)) {
    uploadedChunks.push(index);
    progressBar.value = (uploadedChunks.length / totalChunks) * 100;
    localStorage.setItem('uploadedChunks', JSON.stringify(uploadedChunks));
  }
};

const uploadFile = async () => {
  const file = fileInput.files[0];
  totalChunks = Math.ceil(file.size / chunkSize);
  const uploadedChunks = JSON.parse(localStorage.getItem('uploadedChunks')) || [];
  const promises = [];

  for (let i = 0; i < totalChunks; i++) {
    if (!uploadedChunks.includes(i)) {
      const chunk = file.slice(i * chunkSize, (i + 1) * chunkSize);
      promises.push(uploadChunk(chunk, i, file.name));
    }
  }

  await Promise.all(promises);

  await fetch('/merge', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ totalChunks, fileName: file.name })
  });

  localStorage.removeItem('uploadedChunks');
  alert('File uploaded successfully');
};

 

Start the back-end server:

node server.js

Open the front-end page in your browser

Open the HTML file in your browser, select a file, and click the "Upload File" button to watch the upload progress.