Inserting streamed data - Mailing list pgsql-general

From: Kevin Old
Subject: Inserting streamed data
Msg-id: 1036087909.3123.54.camel@oc
List: pgsql-general

Hello everyone,

I have data that is streamed to my server and stored in a text file.  I
need to get that data into my database as fast as possible.  There are
approximately 160,000 rows in this text file.  I understand I can use
the COPY command to insert large chunks of data from a text file, but I
can't use it in this situation.  Each record in the text file has 502
"fields".  I pull out 50 of those.  I haven't found a way to manipulate
the COPY command to pull out the values I need.  So that solution would
be out.
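
Unless it would be worth pre-filtering the 50 fields into a smaller
tab-delimited file first and loading that with a single COPY?  Something
like this rough sketch is what I had in mind (the field positions, the
'|' delimiter, and the table and database names are made up):

#!/usr/bin/perl
# Sketch only: extract the wanted fields, then load the trimmed file
# in one COPY instead of 160,000 individual INSERTs.
use strict;
use warnings;

my @wanted = (0, 3, 17, 42);   # indexes of the 50 fields to keep (examples)

open my $in,  '<', 'stream.txt'   or die "read: $!";
open my $out, '>', 'filtered.tsv' or die "write: $!";

while (my $line = <$in>) {
    chomp $line;
    my @fields = split /\|/, $line, -1;            # keep trailing empty fields
    print $out join("\t", @fields[@wanted]), "\n";
}
close $in;
close $out or die "close: $!";

# \copy reads the file on the client side, tab-delimited by default.
system(qq{psql -d mydb -c "\\copy stream_data from 'filtered.tsv'"}) == 0
    or die "psql copy failed";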

I have a Perl script that goes through the file, pulls out the 50
fields, and inserts them into the database, but it seems to be very
slow.  I think I just need some minor performance tuning, but I don't
know which variables to set in the postgresql.conf file to help with
the speed of the inserts.
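
Would wrapping the inserts in a single transaction with one prepared
statement make a real difference?  Roughly like this (the table name,
the '|' delimiter, and the batch size are just placeholders):

#!/usr/bin/perl
# Sketch only: turn off AutoCommit, reuse one prepared INSERT, and
# commit in batches instead of once per row.
use strict;
use warnings;
use DBI;

my $dbh = DBI->connect('dbi:Pg:dbname=mydb', 'kevin', '',
                       { AutoCommit => 0, RaiseError => 1 });

my $placeholders = join ',', ('?') x 50;   # one placeholder per kept field
my $sth = $dbh->prepare("INSERT INTO stream_data VALUES ($placeholders)");

open my $in, '<', 'stream.txt' or die "read: $!";
my $rows = 0;
while (my $line = <$in>) {
    chomp $line;
    my @fields = split /\|/, $line, -1;
    $sth->execute(@fields[0 .. 49]);       # the 50 fields actually needed
    $dbh->commit if ++$rows % 5000 == 0;   # commit in batches, not per row
}
close $in;

$dbh->commit;
$dbh->disconnect;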

Here's my postgresql.conf file now:

max_connections = 10
shared_buffers = 20
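
Would bumping these up help once the kernel limits below are raised?
Something like this is what I was guessing at for a 2GB machine (the
numbers are just guesses on my part):

shared_buffers = 2048        # 8kB buffers, so about 16MB; shmmax must allow it
wal_buffers = 16
checkpoint_segments = 10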


I'm running Solaris 2.7 with 2GB of RAM.

Also, I saw this at
http://developer.postgresql.org/docs/postgres/kernel-resources.html

[snip...]

Solaris

        At least in version 2.6, the default maximum size of a shared
        memory segment is too low for PostgreSQL. The relevant settings
        can be changed in /etc/system, for example:

        set shmsys:shminfo_shmmax=0x2000000
        set shmsys:shminfo_shmmin=1
        set shmsys:shminfo_shmmni=256
        set shmsys:shminfo_shmseg=256

        set semsys:seminfo_semmap=256
        set semsys:seminfo_semmni=512
        set semsys:seminfo_semmns=512
        set semsys:seminfo_semmsl=32

[snip...]

Should I do this?

Thanks,
Kevin

--
Kevin Old <kold@carolina.rr.com>

