Using newline-separated JSON to efficiently process large amounts of data with Node.js

February 21, 2020

We are all used to JSON as a data format for communication between services. To process data from a JSON string, a program needs to parse it as a whole, and that becomes problematic when a single JSON document contains a massive amount of data. Increasing the heap memory (in Node.js, via the --max-old-space-size flag) is only a temporary fix and can get you only so far. What we want instead is many small, interchangeable processes running in a cluster. Each process needs to serve many concurrent requests without the risk of one request starving the process of the memory it needs to serve the others, regardless of the number of requests or their size.
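
In newline-separated JSON (NDJSON), each line of the input is a complete, self-contained JSON value, which is what makes line-by-line streaming possible. Below is a minimal sketch of the approach using only Node's built-in fs and readline modules; the file name users.ndjson, the handleRecord callback, and the id field are hypothetical examples for illustration:

```js
const fs = require('fs');
const readline = require('readline');

// Read newline-separated JSON (NDJSON) and parse one record at a time,
// instead of buffering the whole payload and calling JSON.parse on it.
// "handleRecord" is a hypothetical callback supplied by the caller.
async function processNdjson(path, handleRecord) {
  const rl = readline.createInterface({
    input: fs.createReadStream(path),
    crlfDelay: Infinity, // treat \r\n as a single line break
  });

  // Only one line's worth of JSON is held in memory at a time,
  // so memory use stays flat no matter how large the input grows.
  for await (const line of rl) {
    if (line.trim() === '') continue; // tolerate blank lines
    handleRecord(JSON.parse(line));
  }
}

// Hypothetical usage: "users.ndjson" and the "id" field are made up.
processNdjson('users.ndjson', (record) => {
  console.log(record.id);
});
```

Because readline accepts any readable stream as its input, the same pattern applies to an incoming HTTP request body just as well as to a file on disk.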