"Out Of Memory" error and Swap Memory

  • Crispin CIRHUZA
  • 12/08/2023
Rhinoceros Software Blog - "Out Of Memory" error and Swap Memory

Deployed applications, and indeed all running processes, use your RAM to temporarily store the data they process. Some applications are very greedy with RAM; sometimes the processing they perform genuinely requires it, but often they are simply poorly written. In this article you will discover two ways to work around this problem.

During the development of an application, a piece of software or a web service, the pressure that the Product Owner puts on the developer sometimes becomes so intense that the developer loses control and makes delivery the only priority. Admittedly, every element of the specifications is covered, but the performance tests will go wrong, and in the worst case, if those tests are not part of the requirements defined by the CPO or his manager, everything slips through unnoticed.

The "Out Of Memory" error is one of the errors that often occur in production due to lack of attention from stakeholders (developers, CPOs and managers) but also by the simple fact that the production environment is different from that of development, different abilities etc.

In the code below we will see a simple script that copies the content of a file (a few gigabytes) to another file, and that has the audacity to pass a camel through the eye of a needle: it reads all those gigabytes in one go and copies them to the destination file, even though the code will run on a virtual private server (VPS) in the cloud with only 2 GB of RAM. Of course, the code runs without apparent problem on the developer's machine, although there is a serious one, because his PC has 16 GB of RAM that he burns through without realizing it.


      const { readFile, writeFile } = require("fs/promises");
      
      const DESTINATION_PATH = "./receiver.mp4";
      
      const copyFile = async ({ sourceFilePath }) => {
        let content;
        try {
          // readFile loads the ENTIRE source file into a single Buffer in RAM
          content = await readFile(sourceFilePath);
        } catch (error) {
          console.error(
            `an error occurred when reading the source file data at ${sourceFilePath}`,
            error
          );
          return;
        }
      
        try {
          await writeFile(DESTINATION_PATH, content);
          console.info("content copied successfully");
        } catch (error) {
          console.error(
            `an error occurred when writing the destination file ${DESTINATION_PATH}`,
            error
          );
        }
      };
      
      // executing the function
      copyFile({ sourceFilePath: "./bigfile.mp4" });
      

During its execution, the code above pushes all 2 GB of the video through RAM in order to copy them into the destination file. Yes, it works and the content is copied in full, but at what cost? Time and resources (CPU and RAM) are the victims, and once in production, with only 2 GB of RAM (used, say, at 35%), this code will crash. That is when the "Out Of Memory" error occurs, because only 65% of 2 GB remains, i.e. about 1.3 GB, which cannot hold the 2 GB needed.

An unaware IT administrator, consulting his Task Manager, will notice that the RAM is full and will rush to add more and more. That is admittedly a solution, but it is another crime, because the resources being wasted cost your business money, and it would be wiser to put that budget to good use.

There is nothing wrong with having more RAM, provided we do not waste it. So before considering that solution, we must first make sure the application code is optimized, and that is what we are going to do: before thinking about adding resources, we will first optimize the code above.

To do this, we will use Node.js streams. Yes, streams of data: instead of loading the 2 GB in one go, we will pour them little by little into the destination file, through a pipeline that connects the two.


      const { pipeline } = require("node:stream/promises");
      const { createReadStream, createWriteStream } = require("fs");
      
      const DESTINATION_PATH = "./receiver.mp4";
      
      const copyFile = async ({ sourceFilePath }) => {
        // Data flows through the pipeline in small chunks (64 KiB by default),
        // so memory usage stays roughly constant regardless of file size
        const readerStream = createReadStream(sourceFilePath);
        const writerStream = createWriteStream(DESTINATION_PATH);
      
        try {
          await pipeline(readerStream, writerStream);
          console.info("Data transferred successfully");
        } catch (error) {
          console.error("An error occurred when piping read/write streams", error);
        }
      };
      
      // Executing the function
      copyFile({ sourceFilePath: "./bigfile.mp4" });
      

With the previous code we have optimized our script, and it no longer consumes too many resources: an efficient solution to our problem. It should be noted that, yes, OOM errors occur when application code is not optimized, but it can also happen that a code is well optimized and, because it performs intensive computations, genuinely needs more RAM.

Apart from the solution everyone knows, adding RAM, which in a cloud environment can become very expensive if it was not budgeted for at the start, "swap memory" comes to the rescue. It is a technique sysadmins use to reserve a portion of hard-disk storage that the system falls back on when RAM usage passes a configured threshold. Roughly speaking: if my RAM is, say, 60% used, memory is handed over to the space reserved on the hard disk.

Below, I present the commands to type in order to configure and create swap memory for your RAM, thereby saving your cash from being spent on unnecessary upgrades.

It is also important to note that the read/write speed of random access memory (RAM) is much higher than that of mass storage (hard disk), so if processing time is a particular concern for you, consider adding RAM instead.


      # Create the swap file, specifying the desired size in GB
      sudo fallocate -l [SIZE]G /swapfile
      
      # Restrict write access to root only
      sudo chmod 600 /swapfile
      
      # Format the file as swap space
      sudo mkswap /swapfile
      
      # Enable the swap file
      sudo swapon /swapfile
      
      # Adjust the swappiness parameter (the kernel's tendency to use swap)
      sudo sysctl vm.swappiness=40
      
      # Persist the swappiness setting across reboots
      echo "vm.swappiness=40" | sudo tee -a /etc/sysctl.conf
      
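Note that `swapon` by itself does not survive a reboot. Assuming a standard Linux setup, the swap file can be registered in `/etc/fstab` and the result verified as follows (a sketch to adapt to your distribution; requires root):

```shell
# Make the swap file mount automatically at boot
echo "/swapfile none swap sw 0 0" | sudo tee -a /etc/fstab

# Verify that the swap space is active and check current memory usage
swapon --show
free -h
```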
Conclusion

Your infrastructure is your IT department's most precious asset, and it represents a significant part of your assets as a company. Applications that overload the RAM are a real danger to other applications, because if the RAM becomes saturated, no other application deployed on the same server can function. By entrusting us with the development of your IS, you have the guarantee that the applications delivered will pose no threat to your IT infrastructure, nor to the sister applications already in production with you.