Fastest way of reading huge files in Go with small RAM

Issue

I want to read data from different text, JSON, or CSV files. Which approach should I follow?

I have the blog posts File read and Read 2 GB text file with small RAM, which cover different approaches to file reading.

Different approaches:

  • Reading a file in chunks
  • Reading file chunks concurrently
  • Reading the entire file into memory
  • Splitting a long string into words
  • Scanning word by word

What is the fastest way to read a file with small RAM?

Solution

There are basically two different ways to approach parsing a file: document parsing and stream parsing.

Document parsing reads the data from the file and turns it into a big set of objects that you can query, like the HTML DOM in a browser. The advantage is that you have the complete data at your fingertips, which is often simpler. The disadvantage is that you have to store it all in memory.

dom = parse(stuff)

// Now do whatever you like with the DOM
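As a concrete Go sketch of the document approach (the file name people.json and the Person struct are hypothetical, used only for illustration), encoding/json can load an entire JSON array in one call:

// Document parsing: read the whole file into memory, then unmarshal it
// in a single call. people.json is assumed to hold a JSON array of objects.
type Person struct {
    Name string `json:"name"`
    Age  int    `json:"age"`
}

data, err := os.ReadFile("people.json")
if err != nil {
    log.Fatal(err)
}

var people []Person
if err := json.Unmarshal(data, &people); err != nil {
    log.Fatal(err)
}

// The complete data set is now in memory, so RAM use grows with file size.
fmt.Println(len(people), "records loaded")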

Stream parsing instead reads one element at a time, presents it to you for immediate use, and then moves on to the next one.

for element := range stream(stuff) {
    ...examine one element at a time...
}

The advantage is that you don't have to load the whole thing into memory. The disadvantage is that you must work with the data as it goes by. This is very useful for searches, or anything else that needs to process the data one element at a time.
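For a plain text file, like the 2 GB file mentioned in the question, the same streaming idea applies. Here is a minimal sketch using bufio.Scanner (the file name huge.txt is an assumption):

// Stream a text file line by line; only one line is held in memory at a time.
file, err := os.Open("huge.txt")
if err != nil {
    log.Fatal(err)
}
defer file.Close()

scanner := bufio.NewScanner(file)
// scanner.Split(bufio.ScanWords) would scan word by word instead of line by line.
for scanner.Scan() {
    line := scanner.Text()
    _ = line // examine one line at a time
}
if err := scanner.Err(); err != nil {
    log.Fatal(err)
}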


Fortunately, Go provides libraries to handle the common formats for you.

A simple example is handling a CSV file.

package main

import (
    "encoding/csv"
    "fmt"
    "io"
    "log"
    "os"
)

func main() {
    file, err := os.Open("test.csv")
    if err != nil {
        log.Fatal(err)
    }
    defer file.Close()

    parser := csv.NewReader(file)

    ...
}

We can slurp the whole thing into memory as a big [][]string.

records, err := parser.ReadAll()
if err != nil {
    log.Fatal(err)
}

for _, record := range records {
    fmt.Println(record)
}

Or we can save a bunch of memory and deal with the rows one at a time.

for {
    record, err := parser.Read()
    if err == io.EOF {
        break
    }
    if err != nil {
        log.Fatal(err)
    }

    fmt.Println(record)
}

Since every line of a CSV is functionally the same, processing it one row at a time makes the most sense.

JSON and XML are more complex because they can be large, nested structures, but they can also be streamed. There is an example of streaming in the encoding/json documentation.
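As a sketch of what that streaming looks like (following the json.Decoder pattern shown in the standard library documentation, and reusing the hypothetical people.json file and Person struct from the document-parsing sketch above), you can decode one array element at a time:

// Stream a JSON array with json.Decoder instead of unmarshalling it all at once.
file, err := os.Open("people.json")
if err != nil {
    log.Fatal(err)
}
defer file.Close()

dec := json.NewDecoder(file)

// Consume the array's opening bracket.
if _, err := dec.Token(); err != nil {
    log.Fatal(err)
}

// Decode one element at a time while more values remain in the array.
for dec.More() {
    var p Person
    if err := dec.Decode(&p); err != nil {
        log.Fatal(err)
    }
    fmt.Println(p.Name, p.Age)
}

// Consume the array's closing bracket.
if _, err := dec.Token(); err != nil {
    log.Fatal(err)
}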


What if your code isn't a simple loop? What if you want to take advantage of concurrency? Use a channel and a goroutine to feed it concurrently with the rest of the program.

records := make(chan []string)
go func() {
    parser := csv.NewReader(file)

    defer close(records)
    for {
        record, err := parser.Read()
        if err == io.EOF {
            break
        }
        if err != nil {
            log.Fatal(err)
        }

        records <- record
    }
}()

Now you can pass records to a function that processes them.

func printRecords(records chan []string) {
    for record := range records {
        fmt.Println(record)
    }
}
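Putting the two halves together is just a matter of starting the producer goroutine and then handing the channel to the consumer (a minimal sketch; the file is opened as in the first example):

// The goroutine above fills the channel; printRecords drains it and
// returns once the producer closes the channel.
printRecords(records)

Because the producer closes the channel when it hits io.EOF, the range loop in printRecords terminates cleanly and nothing blocks.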

Answered By – Schwern

Answer Checked By – Timothy Miller (GoLangFix Admin)
