PGDump question/issue - Mailing list pgsql-novice

From Ayden Gera
Subject PGDump question/issue
Date
Msg-id CANYJdW+Zq=CUbyMLiVqQ3nipS4S=W_Jn6J_bY=49qw1EDbBwyg@mail.gmail.com
Responses Re: PGDump question/issue
List pgsql-novice
Hi,
Hoping someone may have a solution to this problem.
We receive a daily pg_dump file (~3 GB) from our SaaS provider for BI purposes. It contains DROP TABLE IF EXISTS commands.
This file has no row-level security (RLS) configured.
We want to use the same file to populate Supabase and then add row-level security, but I believe the DROP TABLE will destroy the RLS each day, and manually adding it back (unless maybe scripted) isn't an option.
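For illustration, the kind of re-apply script we'd otherwise have to run after every import might look roughly like this (the table name, policy name, and connection URL are placeholders, not our real setup):

```shell
# Hypothetical RLS re-apply script -- run after each daily import.
# Table/policy names and $SUPABASE_DB_URL are placeholders.
psql "$SUPABASE_DB_URL" <<'SQL'
ALTER TABLE public.orders ENABLE ROW LEVEL SECURITY;
DROP POLICY IF EXISTS tenant_read ON public.orders;
CREATE POLICY tenant_read ON public.orders
    FOR SELECT
    USING (tenant_id::text = auth.uid()::text);
SQL
```

Maintaining a script like that for every table seems fragile, which is why we'd rather avoid dropping the tables at all.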

We also have an in-house PostgreSQL instance we could use to load the file first and then produce our own data-only pg_dump.

The other issue is that the source tables don't always have unique keys, as far as we can tell, so to be safe and avoid duplicating data we would prefer to delete each table's data entirely before inserting.

Does anyone have suggestions on how best to automate the daily loading of data into the Supabase tables without losing any RLS we configure on those tables?
Alternatively, what commands should we run on our own PostgreSQL to produce a data-only/INSERT dump, plus commands to delete all data in all tables before loading it?
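To make the question concrete, the pipeline we're imagining is roughly the following (database names, the table list, and the connection URL are placeholders):

```shell
# 1. Load the vendor dump into our in-house PostgreSQL
#    (its DROP TABLE IF EXISTS statements are harmless there).
psql -d staging -f vendor_dump.sql

# 2. Produce a data-only dump: no DDL, so nothing that could
#    drop tables or the RLS policies attached to them.
pg_dump -d staging --data-only --inserts -f data_only.sql

# 3. In Supabase, clear the tables first to avoid duplicates, then load.
psql "$SUPABASE_DB_URL" -c 'TRUNCATE public.table_a, public.table_b RESTART IDENTITY CASCADE;'
psql "$SUPABASE_DB_URL" -f data_only.sql
```

Is something along these lines the right approach, or is there a better-supported way to do it?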

I was also wondering whether we could restore the SaaS pg_dump into Supabase DB1 and then stream the data to DB2 (prod), but I'm unclear whether that is possible, and there is still a duplication risk unless we can somehow clear the prod tables just before streaming.

Thanks in advance!


