Thread: Insufficient memory for this operation.
(WinXP, PG8b5, 1GB, 3.2GHz)

Hi, I regularly get the above error message when I run my applications in parallel. At minimum, 5 applications have to access the server in parallel, but in the production environment this number will be about 30-50 (with the additional clients). Each of them connects to the server (this is where the error message appears), runs a query and disconnects. If one of them is stopped, everything works well, but 5 applications already seem to be too many.

I monitor the pgAdmin Server Status window. It shows a maximum of 2 concurrent connections at the same time, not more.

My postgresql.conf file:

max_connections = 100
shared_buffers = 20000        # min 16, at least max_connections*2, 8KB each
work_mem = 16384              # min 64, size in KB
maintenance_work_mem = 16384  # min 1024, size in KB
max_stack_depth = 2048        # min 100, size in KB

(The NT Task Manager reports 769MB of memory usage.)

What should I increase/decrease to achieve the required performance? Or what am I doing wrong?

Many thanks,
-- Csaba Együd

---
Outgoing mail is certified Virus Free.
Checked by AVG anti-virus system (http://www.grisoft.com).
Version: 6.0.805 / Virus Database: 547 - Release Date: 2004.12.03.
Hi, it wasn't that! :) That brrr.,*.:$;,^%^%roaaggh BDE :{ on the client side... that was the problem. There were stuck connections in the BDE stack. After closing all BDE clients, the problem went away. Sorry for disturbing you... I discover again and again that Postgres is really GOOD. If you have a problem, you can be sure that the problem is somewhere else... Should I avoid using BDE? ... Maybe.

Bye,
-- Csaba

-----Original Message-----
From: pgsql-general-owner@postgresql.org [mailto:pgsql-general-owner@postgresql.org] On Behalf Of Együd Csaba
Sent: Tuesday, December 14, 2004 9:50 AM
To: pgsql-general@postgresql.org
Subject: [GENERAL] Insufficient memory for this operation.
Együd Csaba (Freemail) <csegyud@freemail.hu> writes:

> shared_buffers = 20000  # min 16, at least max_connections*2, 8KB each

You can lower this to 10,000 or even lower.

> max_connections = 100
> work_mem = 16384  # min 64, size in KB

That's 16MB per connection with a maximum of 100 connections. So that's up to 1.6GB that Postgres has been told it can grab. It's unlikely it would grab it all at once, though, unless lots of connections are running queries with big sorts.

-- greg
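Greg's arithmetic can be checked with a quick back-of-the-envelope calculation (a sketch only; the unit sizes come from the postgresql.conf comments quoted above, and strictly speaking work_mem is a per-sort/hash limit, so a single connection can use several multiples of it):

```python
# Rough worst-case memory budget from the settings quoted in the thread.
# shared_buffers is counted in 8KB pages; work_mem is in KB.

max_connections = 100
shared_buffers_pages = 20000   # 8KB each
work_mem_kb = 16384            # per sort/hash operation

shared_mb = shared_buffers_pages * 8 / 1024   # shared across all backends
per_conn_mb = work_mem_kb / 1024              # at least this much per busy backend
worst_case_mb = shared_mb + max_connections * per_conn_mb

print(f"shared buffers:          {shared_mb:.2f} MB")
print(f"work_mem x connections: {max_connections * per_conn_mb:.0f} MB")
print(f"worst case total:       {worst_case_mb:.2f} MB")
```

On the 1GB machine from the original post, that worst case comes to well over 1.7GB, which is consistent with the "Insufficient memory" errors once several backends run memory-hungry queries at once.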
Hi Greg, thank you, I realized the problem. I treated these values as per-server values, not per-connection. It's now working well with 20 or more concurrent connections.

bye,
-- Csaba

-----Original Message-----
From: pgsql-general-owner@postgresql.org [mailto:pgsql-general-owner@postgresql.org] On Behalf Of Greg Stark
Sent: Tuesday, December 14, 2004 6:25 PM
To: pgsql-general@postgresql.org
Subject: Re: [GENERAL] Insufficient memory for this operation.
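For reference, a configuration sized with per-connection accounting in mind might look like the fragment below. These are illustrative values for a 1GB machine following Greg's suggestion, not a tuned recommendation:

```
max_connections = 100
shared_buffers = 10000        # 8KB pages -> ~80MB shared, per Greg's suggestion
work_mem = 4096               # 4MB per sort/hash; ~400MB if all 100 backends sort at once
maintenance_work_mem = 16384  # only used by VACUUM, CREATE INDEX, etc.
max_stack_depth = 2048        # unchanged
```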