Hi
I have a .ndjson file, i.e. a newline-delimited JSON file. It is
about 10 GB and has about 100,000 records.
Some sample records:
```
{ "key11": "value11", "key12": [ "value12.1", "value12.2"], "key13": {
"k111": "v111" } } \n\r
{ "key21": "value21", "key22": [ "value22.1", "value22.2"] }
```
Now I want to INSERT these JSON records into my Postgres table, which
has the following schema:
```
CREATE TABLE myTable (id BIGSERIAL, content JSONB);
```
I want each record to go into the `content` column of the table.
What is the best way to do this on a PostgreSQL database deployed in
Kubernetes, with 1 GB of RAM allocated?
I could probably write a program that reads the file line by line and
INSERTs each record into the database within a single transaction. But
I believe that would generate a lot of network traffic, and I want to
know if there is a better way to do this.
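For reference, this is roughly the kind of program I have in mind (a
rough sketch in Python with psycopg2; the connection string, file
name, and table name are placeholders):
```
import json
import psycopg2  # any Postgres client library would do

# Line-by-line approach: one INSERT per record, all inside a single
# transaction. Connection details below are placeholders.
conn = psycopg2.connect("dbname=mydb user=myuser")
with conn, conn.cursor() as cur:  # "with conn" commits on success
    with open("data.ndjson") as f:
        for line in f:
            line = line.strip()
            if not line:
                continue
            json.loads(line)  # fail fast on a malformed line
            cur.execute(
                "INSERT INTO myTable (content) VALUES (%s::jsonb)",
                (line,),
            )
conn.close()
```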
Thanks.
--
Sankar P
http://psankar.blogspot.com