Pankaj Tanwar
I built a file upload service without any external library, just pure JavaScript 🤖

Third-party packages, compilers, and bundlers are magic behind the curtain. With limited time and heavy competition, we rarely dig into the low-level details to learn what's actually happening inside these third-party packages.

In this article, we are going to build a file upload service with vanilla JavaScript from scratch. The goal is to build this with no external libraries to understand some of JavaScript’s core concepts. We will be reading the file uploaded by the user on the frontend and streaming it in chunks to the backend, and storing it there.

Here's a quick look at what we will be making:

File Upload Demo

Let's dig in.

Table of contents

  1. Set up the Node.js server
  2. Set up the frontend
  3. Read the file content on the frontend
  4. Divide and stream the file in chunks to the backend
  5. Receive the chunks and store them on the server

Set up the Node.js server

We are going to make use of the beautiful built-in http module to set up the backend server.

First, we need to create a new folder for the project.

mkdir fileupload-service

After doing so, we need to create an index.js file, which will be the entry point of our backend server.

touch index.js

After this, create the HTTP server.

const http = require('http') // import the http module

const server = http.createServer() // create the server

server.listen(8080, () => {
  console.log('Server running on port 8080') // listen on port 8080
})

The above code is pretty self-explanatory. We have created an HTTP server, running on port 8080.

Set up the frontend

The next step is to set up the frontend. As we are not doing anything fancy, we will create a basic HTML file with a file input and an upload button, which will initiate the upload process when clicked. A tiny status text will show the state of the file upload.

In vanilla JS, to add an action on any button click, we can simply attach an event listener.

<!DOCTYPE html>
<html lang="en">
  <head>
    <meta charset="UTF-8" />
    <meta name="viewport" content="width=device-width, initial-scale=1.0" />
    <title>File Uploader</title>
  </head>
  <body>
    <h2>File Upload Service</h2>
    <input type="file" id="file" />
    <button id="upload">Upload</button>
    <small id="status"></small>
    <script>
      const file = document.getElementById('file')
      const upload = document.getElementById('upload')
      const status = document.getElementById('status')

      upload.addEventListener('click', () => {
        console.log('clicked the upload button!')
      })
    </script>
  </body>
</html>

Users can select the file and upload it by clicking on the upload button. Easy-peasy!

To serve this HTML file on the home route, we need to send it from the backend. The simplest approach is below.

const fs = require('fs') // needed to read the HTML file from disk

server.on('request', (req, res) => {
  if (req.url === '/' && req.method === 'GET') {
    return res.end(fs.readFileSync(__dirname + '/index.html'))
  }
})

The server.on('request') listener receives every HTTP request that hits the Node backend server.
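Conceptually, that listener is just a dispatcher on req.method and req.url. The idea can be sketched as a pure function (the route names here are hypothetical, for illustration only):

```javascript
// Sketch: the 'request' listener is essentially a dispatcher on method + URL.
// These route names are hypothetical, for illustration only.
function route(method, url) {
  if (method === 'GET' && url === '/') return 'serve-index'
  if (method === 'POST' && url.startsWith('/upload')) return 'append-chunk'
  return 'not-found'
}

console.log(route('GET', '/')) // serve-index
```

Every branch we add to the real listener below follows this same method + URL pattern.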

Read the file content on the frontend

As our backend server is up and running, we need a way to read the file on the frontend. To do so, we are going to use the FileReader object. It lets web applications asynchronously read the contents of files (or raw data buffers) stored on the user's computer, using File or Blob objects to specify the file or data to read.

The syntax to read a file on the client-side using FileReader object is the following.

const fileReader = new FileReader() // initialize the object
fileReader.readAsArrayBuffer(file) // read file as array buffer

We can access selected input files under the files field for the input. Currently, we are only building it for a single file upload, but later on, we can extend it for multiple file uploads as well.

const selectedFile = file.files[0]

To read a file, FileReader provides a few methods.

  1. FileReader.readAsArrayBuffer() — reads the file as an ArrayBuffer
  2. FileReader.readAsBinaryString() — reads the file as raw binary data
  3. FileReader.readAsDataURL() — reads the file and returns the result as a data URL
  4. FileReader.readAsText() — useful when we know the file is text

For our use case, we will be using the readAsArrayBuffer method to read the file in bytes and stream it to the backend over the network.

To track file reading on the client side, FileReader provides event listeners such as onload and onprogress.

Our goal is to read the file, split it into chunks, and upload it to the backend, so we will be using the onload event, which is triggered once the file reading is completed.

You might wonder why we are not using the onprogress event to make the upload fully streamable. The issue with onprogress is that it does not give just the newly read chunk; it gives all of the data read so far. So, we use the onload event instead.

Once the file is completely read, we split it into small chunks and stream it to the backend.

<!DOCTYPE html>
<html lang="en">
  <head>
    <meta charset="UTF-8" />
    <meta name="viewport" content="width=device-width, initial-scale=1.0" />
    <title>File Uploader</title>
  </head>
  <body>
    <h2>File Upload Service</h2>
    <input type="file" id="file" />
    <button id="upload">Upload</button>
    <small id="status"></small>
    <script>
      const file = document.getElementById('file');
      const upload = document.getElementById('upload');
      const status = document.getElementById('status');

      upload.addEventListener('click', () => {
        // set status to uploading
        status.innerHTML = 'uploading…';

        const fileReader = new FileReader();
        fileReader.readAsArrayBuffer(file.files[0]);
        fileReader.onload = (event) => {
          console.log('Complete file read successfully!')
        }
      });
    </script>
  </body>
</html>

You might have noticed that we use a <small> tag whose text changes to uploading… as the upload starts and to uploaded!!! once the file has been stored on the backend successfully!

Divide and stream the file in chunks to the backend

Sometimes the file can be large, so it's not good practice to send the complete file in a single request. Some proxy servers, such as Nginx, might even block it because it looks malicious.

So, we will split this file into chunks of ~5000 bytes and send them to the backend one by one.

If we look carefully at the event parameter, we find that once the file has been read, its content is available as an ArrayBuffer in the event.target.result field.
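An ArrayBuffer is just raw bytes, and its slice() method copies out a byte range, which is exactly the operation the chunking step relies on. A quick sketch of that (using TextEncoder to build a buffer, since FileReader is browser-only):

```javascript
// Sketch: an ArrayBuffer is raw bytes; slice(start, end) copies a byte range.
const buf = new TextEncoder().encode('hello world').buffer // 11 bytes
const firstFive = buf.slice(0, 5)

console.log(firstFive.byteLength) // 5
console.log(new TextDecoder().decode(firstFive)) // hello
```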

We are going to split the array buffer of this file into chunks of 5000 bytes.

// file content
const content = event.target.result
// fixed chunk size
const CHUNK_SIZE = 5000
// total chunks (round up so the final partial chunk is included)
const totalChunks = Math.ceil(content.byteLength / CHUNK_SIZE)
// loop over each chunk
for (let chunk = 0; chunk < totalChunks; chunk++) {
  // prepare the chunk; slice clamps the end index to the buffer length
  let CHUNK = content.slice(chunk * CHUNK_SIZE, (chunk + 1) * CHUNK_SIZE)
  // todo - send it to the backend
}
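The chunk arithmetic above can be factored into a small pure helper, which makes the boundary cases easy to check (the helper name is ours, not part of the original code):

```javascript
// Sketch: compute the [start, end) byte ranges for each chunk of a buffer.
// Math.ceil makes sure a trailing partial chunk is not dropped.
function chunkRanges(byteLength, chunkSize) {
  const total = Math.ceil(byteLength / chunkSize)
  const ranges = []
  for (let i = 0; i < total; i++) {
    ranges.push([i * chunkSize, Math.min((i + 1) * chunkSize, byteLength)])
  }
  return ranges
}

console.log(chunkRanges(12000, 5000)) // [ [ 0, 5000 ], [ 5000, 10000 ], [ 10000, 12000 ] ]
```

Note that a 12000-byte file yields three chunks, with the last one only 2000 bytes long.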

Now, we need to send these chunks to the backend, and my old friend fetch is here to the rescue.

Before sending the chunks to the backend, we need to make sure we send them in order; otherwise, the file will be corrupted.

The second thing is to await each upload, because we don't want to flood the backend server with requests.
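Both points come down to one pattern: awaiting inside the loop means the next chunk is not dispatched until the previous one resolves. A minimal sketch (send here is a hypothetical stand-in for the fetch call):

```javascript
// Sketch: awaiting each send keeps chunks strictly ordered; `send` is a
// hypothetical stand-in for the real fetch call.
async function uploadChunks(chunks, send) {
  for (const chunk of chunks) {
    // the next iteration does not start until this send resolves
    await send(chunk)
  }
}
```

Swapping the loop for Promise.all would fire every request at once; that is faster, but the order of arrival and the load on the server are no longer controlled.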

fileReader.onload = async (event) => {
  const content = event.target.result;
  const CHUNK_SIZE = 5000;
  const totalChunks = Math.ceil(content.byteLength / CHUNK_SIZE);
  // generate a unique file name
  const fileName = Math.random().toString(36).slice(-6) + file.files[0].name;

  for (let chunk = 0; chunk < totalChunks; chunk++) {
    let CHUNK = content.slice(chunk * CHUNK_SIZE, (chunk + 1) * CHUNK_SIZE)
    await fetch('/upload?fileName=' + fileName, {
      method: 'POST',
      headers: {
        'content-type': 'application/octet-stream',
        'content-length': CHUNK.byteLength
      },
      body: CHUNK
    })
  }

  status.innerHTML = 'uploaded!!!';
}

As you can see, we add the file name as a query parameter, and you might wonder why we send it at all. All API calls to the backend server are stateless, so to append each chunk to the same file, we need a unique identifier, which in our case is the file name.

Since a user might upload another file with the same name, we prefix the name with a unique random string to make sure the backend behaves as expected. For that, we use the below given beautiful one-liner -

Math.random().toString(36).slice(-6)
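It works by rendering the fractional part of Math.random() in base 36 (digits plus lowercase letters) and keeping the last six characters. A quick sketch:

```javascript
// Math.random() gives e.g. 0.8967...; toString(36) renders it in base 36
// (something like "0.wahi3l..."), and slice(-6) keeps the last six characters.
const uniqueSuffix = Math.random().toString(36).slice(-6)
console.log(uniqueSuffix)
```

Note this is convenient rather than collision-proof; for a production service, something like crypto.randomUUID() would be a safer choice.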

Ideally, we should not send any custom headers, because some proxies such as Nginx or HAProxy might block them.

Receive the chunks and store them on the server

Now that the frontend is fully set up, the next step is to listen for the file chunks and write them to disk on the server.

To extract the file name from the query params of the request, we use the below piece of code -

const query = new URLSearchParams(req.url.split('?')[1]);
const fileName = query.get('fileName');
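A quick worked example of that parsing, with a sample request URL (the file name value here is made up):

```javascript
// Worked example: extracting the fileName query param from a raw request URL.
const rawUrl = '/upload?fileName=k3j9qaphoto.png' // sample req.url value
const query = new URLSearchParams(rawUrl.split('?')[1])
const fileName = query.get('fileName')

console.log(fileName) // k3j9qaphoto.png
```

Splitting on '?' first matters: URLSearchParams expects just the query string, not the path.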

So, our final code looks like this -

const http = require('http')
const fs = require('fs')

const server = http.createServer()

server.on('request', (req, res) => {
  if (req.url === '/' && req.method === 'GET') {
    return res.end(fs.readFileSync(__dirname + '/index.html'))
  }

  if (req.url.startsWith('/upload') && req.method === 'POST') {
    const query = new URLSearchParams(req.url.split('?')[1])
    const fileName = query.get('fileName')

    req.on('data', chunk => {
      fs.appendFileSync(fileName, chunk) // append the chunk to a file on the disk
    })
    req.on('end', () => {
      res.end('Yay! file is uploaded.') // respond once the whole chunk has arrived
    })
  }
})

server.listen(8080, () => {
  console.log('Server running on port 8080')
})

Conclusion

We learned how to build a file upload service with vanilla JS. Obviously, it's not the most efficient implementation, but it's more than enough to give you a fair idea of a couple of core concepts.

We can extend it to have a progress bar while uploading, retry chunk upload in case of failure, upload multiple files, upload multiple chunks at once, and so on.

I’m active on Twitter as the2ndfloorguy and would love to hear your thoughts. And in case you are interested in my other articles, you can find them here.