Thread: BUG #16316: The application has lost the database connection
The following bug has been logged on the website:

Bug reference:      16316
Logged by:          bo li
Email address:      boli9301@163.com
PostgreSQL version: 12.2
Operating system:   macOS 10.15
Description:

I use PostgreSQL 12 to store my data. I connect to the database from Python and run a query to export the result to a CSV file. When I query a table of about 500 MB, I get an error saying "The application has lost the database connection". If I truncate the table, it works. How can I fix this problem? Thank you.
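[For context, a minimal sketch - not from the report itself - of one way to export a large query result to CSV from Python without buffering all rows in the client, assuming the psycopg2 driver; the connection parameters and table name are placeholders, since the report does not say which driver or table is used.]

    # Hypothetical export sketch, assuming psycopg2; connection details and the
    # table name are placeholders (the report does not state them).
    import psycopg2

    conn = psycopg2.connect(dbname="mydb", user="myuser", host="localhost")
    try:
        with conn.cursor() as cur, open("export.csv", "w", newline="") as f:
            # COPY ... TO STDOUT streams rows from the server directly into the
            # file, so a 500 MB result never has to fit in the Python process.
            cur.copy_expert(
                "COPY (SELECT * FROM my_big_table) TO STDOUT WITH (FORMAT csv, HEADER)",
                f,
            )
    finally:
        conn.close()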
Hi,

On Tue, Mar 24, 2020 at 05:46:33PM +0000, PG Bug reporting form wrote:
> The following bug has been logged on the website:
>
> Bug reference:      16316
> Logged by:          bo li
> Email address:      boli9301@163.com
> PostgreSQL version: 12.2
> Operating system:   macOS 10.15
> Description:
>
> I use PostgreSQL 12 to store my data. I connect to the database from Python
> and run a query to export the result to a CSV file. When I query a table of
> about 500 MB, I get an error saying "The application has lost the database
> connection". If I truncate the table, it works. How can I fix this problem?
> Thank you.
>

It's impossible to answer this question, because it's impossible to say what
the root cause is - it might be a failure at the application level, it might
be a PostgreSQL issue, or something entirely different.

You have to check whether there's an error written to the PostgreSQL server
log, for example. If there is some relevant info, post it here. It also helps
to say which Python driver you're using, what exact error message you get, and
to show some code so that we can try reproducing the issue, etc.

regards

--
Tomas Vondra                  http://www.2ndQuadrant.com
PostgreSQL Development, 24x7 Support, Remote DBA, Training & Services
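[For reference, a hedged sketch - not part of the thread - of how one might collect the details asked for above: the driver version, the exact client-side error, and where the server log is written. It assumes psycopg2; the connection parameters and the query are placeholders, and SHOW data_directory / log_directory may require superuser privileges.]

    # Hypothetical diagnostic sketch, assuming psycopg2; connection details and
    # the query are placeholders. Some SHOW settings may require superuser rights.
    import psycopg2

    print("driver:", psycopg2.__version__)

    conn = psycopg2.connect(dbname="mydb", user="myuser", host="localhost")
    try:
        with conn.cursor() as cur:
            # Where (and whether) the server writes its log files.
            for setting in ("logging_collector", "log_destination",
                            "data_directory", "log_directory"):
                cur.execute("SHOW " + setting)
                print(setting, "=", cur.fetchone()[0])
            # The failing query would go here; this table name is made up.
            cur.execute("SELECT count(*) FROM my_big_table")
            print(cur.fetchone()[0])
    except psycopg2.Error as e:
        # The exact message and SQLSTATE are what is worth posting to the list.
        print("sqlstate:", e.pgcode)
        print("error   :", e.pgerror)
    finally:
        conn.close()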
On Thu, Mar 26, 2020 at 03:55:11AM +0800, Bo Li_Chongqing University wrote:
> Hi Tomas,
>
> Thank you very much.
>
> When I insert into a table, the connection is lost if the table is bigger
> than about 100 MB or more. I guess the local database is not powerful enough
> to process that much data. If I limit the result to a small number of rows
> it is OK; otherwise the connection is lost.
>
> I used this query in pgAdmin to insert into a table; if I add LIMIT 100 it
> works, but if I remove it the connection is lost. I just want to generate a
> new table in pgAdmin:
>
> insert into variable_capacity_factor
> SELECT v.generation_plant_id, t.raw_timepoint_id,
>        round(cast(capacity_factor as numeric), 7),
>        gen_tech, load_zone_id,
>        to_char(t.timestamp_utc, 'YYYYMMDD') as timestamp
> FROM a2020_new_variable_capacity_factors_historical_temp v
> JOIN projection_to_future_timepoint ON (v.raw_timepoint_id = historical_timepoint_id)
> JOIN a2020_new_generation_plant_scenario_member_temp USING (generation_plant_id)
> JOIN sampled_timepoint_temp as t ON (t.raw_timepoint_id = future_timepoint_id)
> JOIN a2020_generation_plant as t1 ON (t1.generation_plant_id = v.generation_plant_id)
> WHERE generation_plant_scenario_id = 10
> AND t.time_sample_id = 1
>

That's neat, but it doesn't really tell us anything. Did you check the server
log as I suggested? Are there any errors or interesting messages in the log?

regards

--
Tomas Vondra                  http://www.2ndQuadrant.com
PostgreSQL Development, 24x7 Support, Remote DBA, Training & Services
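[A purely hypothetical workaround sketch, not something suggested in the thread: since the statement succeeds when restricted to a small number of rows, the insert could be run in smaller pieces, for example one generation_plant_id at a time. It assumes psycopg2 and the table/column names quoted above, including that the scenario member table carries generation_plant_scenario_id; it only treats the symptom, and the server log is still needed to find the real cause.]

    # Hypothetical chunked-insert sketch, assuming psycopg2 and the tables quoted
    # above; connection details are placeholders. This only works around the
    # symptom - the server log is still needed to explain the dropped connection.
    import psycopg2

    INSERT_SQL = """
    insert into variable_capacity_factor
    SELECT v.generation_plant_id, t.raw_timepoint_id,
           round(cast(capacity_factor as numeric), 7),
           gen_tech, load_zone_id,
           to_char(t.timestamp_utc, 'YYYYMMDD') as timestamp
    FROM a2020_new_variable_capacity_factors_historical_temp v
    JOIN projection_to_future_timepoint ON (v.raw_timepoint_id = historical_timepoint_id)
    JOIN a2020_new_generation_plant_scenario_member_temp USING (generation_plant_id)
    JOIN sampled_timepoint_temp as t ON (t.raw_timepoint_id = future_timepoint_id)
    JOIN a2020_generation_plant as t1 ON (t1.generation_plant_id = v.generation_plant_id)
    WHERE generation_plant_scenario_id = 10
      AND t.time_sample_id = 1
      AND v.generation_plant_id = %s
    """

    conn = psycopg2.connect(dbname="mydb", user="myuser", host="localhost")
    try:
        with conn.cursor() as cur:
            # Assumes the scenario member table has generation_plant_scenario_id.
            cur.execute(
                "SELECT generation_plant_id "
                "FROM a2020_new_generation_plant_scenario_member_temp "
                "WHERE generation_plant_scenario_id = 10"
            )
            plant_ids = [row[0] for row in cur.fetchall()]
        for plant_id in plant_ids:
            with conn.cursor() as cur:
                cur.execute(INSERT_SQL, (plant_id,))
            conn.commit()  # commit per chunk so a failure loses at most one plant's rows
    finally:
        conn.close()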