Thread: Getting TOAST errors
I am occasionally getting this kind of error when attempting a SELECT
statement:

PGError: ERROR: missing chunk number 0 for toast value 27143 in pg_toast_2619

What does this mean? Is some sort of corruption creeping into the
database?

Postgres 9.0, Linux.
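[Editorial note: the number in a pg_toast_NNNN name is the OID of the table that owns the out-of-line TOAST data, so the failing relation can be identified before debugging further. The sketch below is illustrative, not from the thread; `toast_owner_name` assumes a live connection via the `pg` gem (whose older versions raised the `PGError` seen above).]

```ruby
# require 'pg'  # assumed: the 'pg' gem, as suggested by the PGError class above

# The NNNN in a pg_toast_NNNN name is the OID of the owning table.
def toast_owner_oid(toast_name)
  toast_name[/\Apg_toast_(\d+)\z/, 1]&.to_i
end

# Hypothetical lookup against a live connection: resolve the OID to a
# relation name. OID 2619 is the system catalog pg_statistic on a
# standard installation, so the damaged value here is a statistics row.
def toast_owner_name(conn, toast_name)
  conn.exec_params("SELECT relname FROM pg_class WHERE oid = $1",
                   [toast_owner_oid(toast_name)]).getvalue(0, 0)
end
```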
Tim Uckun <timuckun@gmail.com> writes:
> I am occasionally getting this kind of error when attempting a SELECT
> statement.
> PGError: ERROR: missing chunk number 0 for toast value 27143 in pg_toast_2619
> What does this mean? Is some sort of corruption creeping into the database?

Hard to tell.  We've seen enough reports like that to make it seem like
there may be some bug buried there, but no one has provided anything to
do any debugging work with.  Can you create a reproducible test case?

> Postgres 9.0 linux.

9.0.what?

			regards, tom lane
> Hard to tell.  We've seen enough reports like that to make it seem like
> there may be some bug buried there, but no one has provided anything to
> do any debugging work with.  Can you create a reproducible test case?

Not really. I have a nightly process which downloads data and sticks it
into a text field. Afterwards another process reads that text data and
processes it, creating rows in another table. The problem occurs in the
last step, at seemingly random intervals. For example, one time it might
happen when creating row 1000, another time when creating row 2000.

>> Postgres 9.0 linux.
>
> 9.0.what?

9.0.4
Tim Uckun <timuckun@gmail.com> writes:
>> Hard to tell.  We've seen enough reports like that to make it seem like
>> there may be some bug buried there, but no one has provided anything to
>> do any debugging work with.  Can you create a reproducible test case?
> Not really. I have a nightly process which downloads data and sticks
> it into a text field. Afterwards another process reads that text data
> and processes it creating rows in another table. The problem occurs in
> the last step and at seemingly random intervals. For example one time
> it might happen when you are creating row 1000 another time it might
> be when you are creating row 2000.

Well, I'm not asking for perfect reproducibility --- a test case that
fails even 1% of the time would be great.

			regards, tom lane
>> Not really. I have a nightly process which downloads data and sticks
>> it into a text field. Afterwards another process reads that text data
>> and processes it creating rows in another table. The problem occurs in
>> the last step and at seemingly random intervals. For example one time
>> it might happen when you are creating row 1000 another time it might
>> be when you are creating row 2000.
>
> Well, I'm not asking for perfect reproducibility --- a test case that
> fails even 1% of the time would be great.

What exactly do you need? The database is not too large, but the data is
proprietary. Despite this, I am willing to provide a sampling of the data
in the two tables involved. The code itself is Ruby but has a lot of
library dependencies, so it might not be possible to give you a working
application. The idea is pretty simple, though: you fetch a text field,
it contains CSV data, and you iterate through the data, updating or
inserting records into the second table.
Tim Uckun <timuckun@gmail.com> writes:
>> Well, I'm not asking for perfect reproducibility --- a test case that
>> fails even 1% of the time would be great.
> What exactly do you need?

A self-contained test case (code and data) that triggers the error.
If it only does so probabilistically, once in every-so-many runs,
that's fine.

> The database is not too large but the data is proprietary. Despite
> this I am willing to provide a sampling of the data in the two tables
> involved.

Perhaps you could sanitize or anonymize the data?  It's unlikely that a
bug of this sort has all that much to do with the exact data content.
Whether the bug would still show for a "sample" is a different question,
and one you'd have to resolve by experiment.

			regards, tom lane
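[Editorial note: one way to anonymize CSV data as suggested above, while preserving the structural properties (row count, column count, field lengths) that could plausibly matter to a TOAST-level bug. A hedged sketch, not from the thread.]

```ruby
require 'csv'

# Replace every CSV field with random lowercase letters of the same
# length, keeping the shape of the data while discarding its content.
# nil fields (empty CSV cells) are preserved as nil.
def anonymize_csv(text)
  CSV.parse(text).map { |row|
    row.map { |field| field && Array.new(field.length) { [*'a'..'z'].sample }.join }
  }.map(&:to_csv).join
end
```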
> A self-contained test case (code and data) that triggers the error.
> If it only does so probabilistically, once in every-so-many runs,
> that's fine.

I'll see what I can do. Give me a few days.

Cheers.