[NOVICE] Bulk load billions of records into Postgres cluster - Mailing list pgsql-novice

From: balasubramaniam
Subject: [NOVICE] Bulk load billions of records into Postgres cluster
Date:
Msg-id: CACFhHyuehAgUm6cQ4RbELZC5HSnc9Zsi9hpQjo+g2q+kVW1i-Q@mail.gmail.com
Responses: Re: [NOVICE] Bulk load billions of records into Postgres cluster (Aleksey Tsalolikhin <atsaloli.tech@gmail.com>)
List: pgsql-novice
Hi All,

We have a proven NoSQL production setup with a few billion rows. We are planning to move towards a more structured data model with a few tables.

I am looking for a completely open-source, battle-tested database, and Postgres seems like the right place to start.

Due to our increasing scale demands, I am planning to start with a PostgreSQL cluster. The ability to ingest data at scale, on the order of a few TB, as fast as possible is critical for our use case. I have read through the official documentation, including the COPY FROM command, but none of it addresses a cluster setup specifically.
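For concreteness, here is a minimal sketch of the single-node load I have in mind; the table name, columns, and file path below are placeholders I made up, not our real schema:

    -- Sketch only: "events", its columns, and the file path are placeholders.
    BEGIN;
    -- Creating (or truncating) the table in the same transaction allows
    -- COPY ... FREEZE, which skips the post-load freezing work.
    CREATE TABLE events (
        id      bigint,
        ts      timestamptz,
        payload jsonb
    );
    COPY events FROM '/data/events_part_0001.csv' WITH (FORMAT csv, FREEZE);
    COMMIT;

My understanding from the docs is to load first and build indexes and constraints afterwards, but I don't see how any of this extends to a cluster.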

1) What is the standard and fastest way to ingest billions of records into Postgres at scale?
2) Is there a tool that generates the SQL script of COPY FROM commands, ready to use (see the sketch after these questions for the kind of script I mean)? I want to avoid writing and maintaining yet another custom tool.
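To illustrate question 2: the generated script I have in mind is just a list of \copy commands over pre-split chunk files (the paths below are hypothetical), run one psql session per chunk for parallelism:

    -- generated_load.sql: one \copy per pre-split chunk (hypothetical paths)
    \copy events FROM '/data/chunk_0000.csv' WITH (FORMAT csv)
    \copy events FROM '/data/chunk_0001.csv' WITH (FORMAT csv)
    \copy events FROM '/data/chunk_0002.csv' WITH (FORMAT csv)
    -- ...and so on, one line per chunk

Generating this is easy; the retries, progress tracking, and parallel session management around it are what I would rather not build and maintain myself.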

Thanks in advance,
bala
