Re: Importing a Large .ndjson file - Mailing list pgsql-general

From Michael Lewis
Subject Re: Importing a Large .ndjson file
Date
Msg-id CAHOFxGotx8i1U+B6bFyw_zviqN0sFk6xKFA_MHtwr-3G8ScyRg@mail.gmail.com
In response to Re: Importing a Large .ndjson file  (Sankar P <sankar.curiosity@gmail.com>)
List pgsql-general
> I spoke too soon. While this worked fine when there were no indexes
> and finished within 10 minutes, with GIN index on the jsonb column, it
> is taking hours and still not completing.

It is generally recommended to create indexes AFTER loading data. It can be faster to drop all indexes on the table, load the bulk data, and re-create the indexes afterwards, but there's no hard-and-fast rule. If you are adding 100k records to an empty or nearly empty table, I would remove all indexes and create them after the load. Be sure you have sufficient maintenance_work_mem as well.
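A rough sketch in psql of what that can look like. The table (events), column (doc), and index name below are placeholders, not from your schema, and the \copy line just stands in for however you are already loading the .ndjson file:

    -- Drop the GIN index before the bulk load (names here are made up).
    DROP INDEX IF EXISTS events_doc_gin;

    -- Give the index build more memory for this session only.
    SET maintenance_work_mem = '1GB';

    -- Load the newline-delimited JSON; each line becomes one jsonb row.
    \copy events (doc) FROM 'data.ndjson'

    -- Rebuild the GIN index in a single pass over the loaded data.
    CREATE INDEX events_doc_gin ON events USING gin (doc);

With the index gone, the load should behave like your earlier no-index run, and one CREATE INDEX pass at the end is usually much cheaper than maintaining the GIN index row by row during the insert.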
