So I tried one more thing. I executed the raw query on version 17 with a LIMIT 1 clause and it returned 1 record. LIMIT 100 returned 100 records, LIMIT 1000 returned 1000, and LIMIT 10000 returned 10000. With LIMIT 100000 it returned 19959 records, which is all there are, and it only took a few seconds. However, as soon as I removed the LIMIT clause, the query hung again and never returned.
Why would the query return with a LIMIT clause but not without one?
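For context, here is roughly what that experiment looks like, plus one way to compare the plans the server picks with and without the LIMIT (a LIMIT can steer the planner toward a different plan, which is one possible explanation for the difference). The name report.pool_trainee_report is a placeholder; the real query text inside report.GetReportPoolTrainees is not shown in this thread.

    -- Sketch of the experiment; report.pool_trainee_report stands in for the
    -- raw query inside report.GetReportPoolTrainees, which is not shown here.
    SELECT * FROM report.pool_trainee_report LIMIT 1;       -- 1 row, fast
    SELECT * FROM report.pool_trainee_report LIMIT 10000;   -- 10000 rows, fast
    SELECT * FROM report.pool_trainee_report LIMIT 100000;  -- all 19959 rows, a few seconds
    SELECT * FROM report.pool_trainee_report;                -- hangs, never returns

    -- Comparing the plans with and without the LIMIT shows whether version 17
    -- switches to a different (much slower) plan shape once the LIMIT is gone.
    EXPLAIN SELECT * FROM report.pool_trainee_report LIMIT 100000;
    EXPLAIN SELECT * FROM report.pool_trainee_report;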
I wrote a script to create all of the tables, views, and the function in an effort to recreate the issue. I ran the script on both version 16 and version 17 and executed the function on each. On both servers the function returned results, so the attempt to recreate the problem failed.

I then ran both versions of the server simultaneously on different ports and attempted a dump/restore from version 16 to version 17, using the pg_dump from version 17. Once again the restore to version 17 hung and did not finish. It hangs at the point where it attempts to REFRESH MATERIALIZED VIEW. The materialized view in question uses the function report.GetReportPoolTrainees that we have been discussing. I deleted the materialized view in the version 16 database, did a dump/restore to the version 17 database, ran ANALYZE, and attempted to execute the query that the function calls. No luck; it would not return.
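For anyone following along, a rough sketch of isolating the hanging step on the freshly restored version 17 database. The materialized view name report.pool_trainees_mv is an assumption, and the call assumes the function takes no arguments; only report.GetReportPoolTrainees is named in this thread.

    -- Sketch only: isolate the hanging step after restoring to version 17.
    -- report.pool_trainees_mv is an assumed name for the materialized view.
    ANALYZE;                                   -- rebuild planner statistics
    SET statement_timeout = '10min';           -- error out instead of hanging forever
    SELECT count(*) FROM report.GetReportPoolTrainees();
    REFRESH MATERIALIZED VIEW report.pool_trainees_mv;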
What is so puzzling to me is that if I do a fresh install of version 16, everything works as it should, but not when I do the exact same thing on version 17.
Lowell Hought <lowell.hought@gmail.com> writes:
> I can try. I am not sure how to go about that. I did not see on the bug
> report page where I could upload files, and I am afraid the file size of
> the tables needed might be too large for email.
No, uploading stuff to that webform doesn't work. But at this point we're just conversing on the pgsql-bugs mailing list, so anything you can squeeze into email is fine. Having said that, nobody likes multi-gigabyte emails.
> The entire database when written to an SQL dump file is about 20 GB, so not
> terribly large. I could attempt to dump the schema definition in one file
> and then the underlying tables in another. Would that work? Or would you
> also need the files for the function and any views the query relies upon?
Yeah, we'd need all the moving parts.
Usually people with this kind of problem don't want to expose their data anyway, for privacy and/or legal reasons. So what I'd suggest is trying to create some little script that generates fake data that's close enough to trigger the problem. Then you just need to provide that script and the DDL and function definitions.
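In case it helps, a minimal sketch of the kind of fake-data script being suggested, using generate_series. The table, columns, and row counts are made up and would need to be adjusted to match the real schema and data distribution.

    -- Minimal sketch of generating fake rows with generate_series.
    -- Table and column names are placeholders, not the real schema;
    -- this also assumes the "report" schema already exists.
    CREATE TABLE report.trainees (
        trainee_id  integer PRIMARY KEY,
        pool_id     integer NOT NULL,
        enrolled_on date    NOT NULL
    );

    INSERT INTO report.trainees (trainee_id, pool_id, enrolled_on)
    SELECT g,
           (g % 50) + 1,                    -- spread rows across 50 fake pools
           date '2020-01-01' + (g % 1000)   -- spread enrollment dates over ~3 years
    FROM generate_series(1, 20000) AS g;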