Importing a Large .ndjson file - Mailing list pgsql-general

From Sankar P
Subject Importing a Large .ndjson file
Date
Msg-id CAMSEaH5SfyfXN_rSah41dOOA_aAik4hZED0qp52=1wqzjz-pMA@mail.gmail.com
Whole thread Raw
Responses Re: Importing a Large .ndjson file  (Tom Lane <tgl@sss.pgh.pa.us>)
List pgsql-general
Hi

I have a .ndjson file. It is a new-line-delimited JSON file. It is
about 10GB and has about 100,000 records.

Some sample records:
```
{ "key11": "value11", "key12": [ "value12.1", "value12.2"], "key13": { "k111": "v111" } } \n\r
{ "key21": "value21", "key22": [ "value22.1", "value22.2"] }
```
Now I want to INSERT these json records into my postgres table of the
following schema:

```
CREATE TABLE myTable (id BIGSERIAL, content JSONB);
```

where I want each record to be inserted into the `content` field of the table.

What is the best way to do this on a PostgreSQL database deployed in
Kubernetes with 1 GB of RAM allocated?

I can probably write a script that would read this file line-by-line
and INSERT each record into the database, within a single transaction.
But I believe that would generate a lot of network traffic, and I want
to know if there is a better way to do this.
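For reference, a minimal sketch of the line-by-line script described above, in Python. The parsing step is separated out so it can be shown without a live database; the commented part assumes psycopg2 and a table named `myTable` as in the schema above (driver choice and connection details are assumptions, not a recommendation):

```python
import json

def load_ndjson_lines(lines):
    """Parse newline-delimited JSON, skipping blank lines."""
    records = []
    for line in lines:
        line = line.strip()
        if not line:
            continue
        records.append(json.loads(line))
    return records

# With a real connection, the INSERTs would run in one transaction,
# e.g. (assuming psycopg2 and a DSN for the Kubernetes-hosted database):
#
# import psycopg2
# with psycopg2.connect(dsn) as conn, conn.cursor() as cur:
#     with open("data.ndjson") as f:
#         for rec in load_ndjson_lines(f):
#             cur.execute(
#                 "INSERT INTO myTable (content) VALUES (%s::jsonb)",
#                 (json.dumps(rec),),
#             )
```

Since the file is read one line at a time, memory use stays small regardless of the 10 GB file size; the concern raised above is the per-row round trip, not memory.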

Thanks.

-- 
Sankar P
http://psankar.blogspot.com


