We are getting the following error while running this query. As far as we can check, the query itself is fine. Please guide us on this.
Query :
with recursive ctee as (
    select *, 1 as level
    from kms_chunked_content
    where processid = '6598' and nodeid = 1
  union all
    select e.*, c.level + 1
    from ctee c
    join kms_chunked_content e
      on e.parentnodeid = cast(c.nodeid as text)
     and e.is_active = 'Y'
     and e.processid = '6598'
)
select * from ctee order by position;
Exception :
org.jkiss.dbeaver.model.sql.DBSQLException: SQL Error [53400]: ERROR: temporary file size exceeds temp_file_limit (52428800kB)
    at org.jkiss.dbeaver.model.impl.jdbc.exec.JDBCStatementImpl.executeStatement(JDBCStatementImpl.java:134)
    at org.jkiss.dbeaver.ui.editors.sql.execute.SQLQueryJob.executeStatement(SQLQueryJob.java:467)
    at org.jkiss.dbeaver.ui.editors.sql.execute.SQLQueryJob.lambda$0(SQLQueryJob.java:407)
    at org.jkiss.dbeaver.model.exec.DBExecUtils.tryExecuteRecover(DBExecUtils.java:146)
    at org.jkiss.dbeaver.ui.editors.sql.execute.SQLQueryJob.executeSingleQuery(SQLQueryJob.java:405)
    at org.jkiss.dbeaver.ui.editors.sql.execute.SQLQueryJob.extractData(SQLQueryJob.java:849)
Note: We have set temp_file_limit = 50GB, but the query still fails with this error.
This error is raised because the query used more than temp_file_limit of temporary file space (see https://www.postgresql.org/docs/current/runtime-config-resource.html#RUNTIME-CONFIG-RESOURCE-DISK). Your options are to raise that parameter (and possibly add disk capacity to back it) or to find a way for the query to consume less disk in temporary files. It's hard to say without knowing the underlying data, but rewriting the query or splitting it into multiple queries over a temporary table might work.
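One thing worth checking before raising the limit further: if the parent/child data contains a cycle (a row whose parentnodeid chain eventually points back to itself), a recursive CTE with UNION ALL never terminates and will keep spilling rows to temporary files until it hits temp_file_limit. A hedged sketch of a guarded variant, assuming the same table and columns as the original query; the CYCLE clause requires PostgreSQL 14 or later, and the level cap of 100 is an arbitrary placeholder to adjust to the real tree depth:

with recursive ctee as (
    select *, 1 as level
    from kms_chunked_content
    where processid = '6598' and nodeid = 1
  union all
    select e.*, c.level + 1
    from ctee c
    join kms_chunked_content e
      on e.parentnodeid = cast(c.nodeid as text)
     and e.is_active = 'Y'
     and e.processid = '6598'
    where c.level < 100   -- safety cap on recursion depth (works on any version)
) cycle nodeid set is_cycle using path   -- PostgreSQL 14+: stop when a node repeats
select * from ctee
where not is_cycle
order by position;

If the result set is genuinely that large and legitimate, the limit can also be raised for just the one session rather than server-wide, e.g. SET temp_file_limit = '100GB'; before running the query (subject to having the privilege to change it).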