Re: Understanding memory usage - Mailing list psycopg

From Damiano Albani
Subject Re: Understanding memory usage
Date
Msg-id CAKys9514yvi9EcSccpLGtGzETfNE--jCwqBiJBy_6iFJ9-pwsA@mail.gmail.com
In response to Re: Understanding memory usage  (Daniele Varrazzo <daniele.varrazzo@gmail.com>)
List psycopg
On Thu, Oct 31, 2013 at 12:01 PM, Daniele Varrazzo <daniele.varrazzo@gmail.com> wrote:

I easily expect a much bigger overhead in building millions of Python
objects compared to building 20. Not only because of the 37 bytes of
overhead each string carries (sys.getsizeof()), but also because of the
cost for the GC of managing objects in the millions.
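
That per-string overhead is easy to see interactively, for what it's worth (the exact numbers depend on the Python version and build; the ones below are from a typical CPython 2.7):

    import sys

    # CPython 2.7: an empty str already costs around 37 bytes, and each
    # extra character adds roughly one more byte on top of that.
    print(sys.getsizeof(''))        # 37 on a typical CPython 2.7 build
    print(sys.getsizeof('x' * 10))  # 47

Multiply that by millions of strings per result set and the bookkeeping quickly dwarfs the actual data.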

For the record, I eventually settled on a solution using pgnumpy.
As far as I could see, it handles result sets of millions of rows with very little overhead.
Since my original goal was to feed the data into Pandas down the line, pgnumpy seems spot on.
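
In case it helps anyone else, here is a rough sketch of the pipeline. I'm glossing over the pgnumpy call itself (see its documentation for the exact API); the point is that once the query result arrives as a NumPy structured array, Pandas can take it directly without materialising millions of per-field Python objects:

    import numpy
    import pandas

    # Stand-in for a query result: pretend this structured array is what
    # came back from the database (two columns, a few rows for the example).
    records = numpy.array(
        [(1, 0.5), (2, 1.5), (3, 2.5)],
        dtype=[('id', 'i8'), ('value', 'f8')],
    )

    # A structured array maps straight onto a DataFrame; the column data
    # stays in contiguous NumPy buffers rather than per-row Python objects.
    frame = pandas.DataFrame(records)
    print(frame)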

--
Damiano Albani
