Re: Idea: INSERT INTO ... NATURAL SELECT ... - Mailing list pgsql-sql

From Sven Berkvens-Matthijsse
Subject Re: Idea: INSERT INTO ... NATURAL SELECT ...
Date
Msg-id 7edf1575-d544-f643-e331-d340b26e87b4@berkvens.net
In response to Idea: INSERT INTO ... NATURAL SELECT ...  (Sven Berkvens-Matthijsse <sven@postgresql.berkvens.net>)
List pgsql-sql
Hi Martin,

On 01/02/2019 19.41, Martin Stöcker wrote:
> Hi Sven,
>
> in many cases I prefer "COPY FROM" to import data. That way I can
> create my test data with a spreadsheet and CSV.

Sure, that works, but it would be even more useful if the COPY command 
actually read the first line of a CSV file and used it to determine the 
target columns in the table. It does not do that: the HEADER option only 
makes COPY skip the header line.
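For example (using your test table from further below; the file name is 
made up), the target columns still have to be listed by hand, because 
COPY merely skips the header line instead of interpreting it:

    -- HEADER only tells COPY to skip the first line of the file; the
    -- names in it are not used to select or reorder the target columns.
    COPY test (i, t) FROM '/tmp/test.csv' WITH (FORMAT csv, HEADER);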

> Sometimes it is helpful to do some shell scripting to create it, or to
> import it with psql.

Yes, that definitely works.
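For instance, something like this (the database and file names are made 
up) works nicely from a shell, since \copy runs client-side and reads 
the file with the permissions of the psql user:

    psql -d mydb -c "\copy test (i, t) from 'test.csv' with (format csv, header)"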

> But if you prefer to have the column names and the column data close
> together, why not use JSON?
>
> postgres=# create table test (i integer, t text);
> postgres=# insert into test (select * from
>     json_to_recordset('[{"i":1,"t":"foo"},{"i":"7","t":"bar"}]')
>     as x(i int, t text));

That also works, but it requires naming all the columns and their types 
in the "AS x" part, which makes the statement very verbose. My proposal 
would not require the types to be stated, and would only need the column 
names in the "data part" of the statement.
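For comparison, here is roughly what the proposed statement could look 
like for the same data (hypothetical syntax, not accepted by any current 
PostgreSQL version):

    -- Hypothetical: the SELECT's output column names, not their
    -- positions, would determine the target columns, so neither a
    -- column list nor the types have to be repeated.
    INSERT INTO test NATURAL
        SELECT 1 AS i, 'foo' AS t
        UNION ALL
        SELECT 7, 'bar';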

> INSERT 0 2
> postgres=# select * from test;
>  i |  t
> ---+-----
>  1 | foo
>  7 | bar
> (2 rows)
>
> Regards Martin

Thanks for your thoughts!

With kind regards,
Sven


