This looks good to me. A little commentary on why pglz_maximum_compressed_size() returns a universally correct answer (there's no way the compressed size can ever be larger than this because...) would be nice for peasants like myself.
If you're looking to continue down this code path in your next patch, the next TODO item is a little more involved: a user-land (a la PG_DETOAST_DATUM) iterator API for accessing TOAST datums would allow optimizing searches over large objects like JSONB values, where the thing you are looking for is not at a known location in the object. Think of looking for a particular substring in a string, or for a particular key in a JSONB. "Iterate until you find the thing" would allow optimization of code paths that currently require fully decompressing the object.
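To make that concrete, here is a toy, self-contained sketch of the control flow I have in mind. It's plain C rather than a proposed API, and the names (ToyIterator, iter_next, and so on) are purely illustrative:

#include <stdbool.h>
#include <stddef.h>
#include <string.h>

/*
 * Toy illustration only, not a proposed PostgreSQL API.  Each call to
 * iter_next() makes one more chunk of the value visible; the caller
 * stops as soon as the needle turns up, so a match near the front of a
 * large value never pays for materializing the rest of it.
 */
typedef struct
{
    const char *data;       /* stand-in for the de-TOASTed bytes */
    size_t      total_len;  /* full length of the value */
    size_t      avail;      /* bytes made visible so far */
    size_t      chunk;      /* bytes produced per iteration */
} ToyIterator;

static bool
iter_next(ToyIterator *it)
{
    if (it->avail >= it->total_len)
        return false;               /* nothing left to produce */
    it->avail += it->chunk;
    if (it->avail > it->total_len)
        it->avail = it->total_len;
    return true;
}

static bool
contains(const char *haystack, size_t len, const char *needle, size_t nlen)
{
    for (size_t i = 0; nlen > 0 && i + nlen <= len; i++)
    {
        if (memcmp(haystack + i, needle, nlen) == 0)
            return true;
    }
    return false;
}

/* "Iterate until you find the thing": stop once the substring appears. */
static bool
early_exit_search(const char *value, size_t len, const char *needle)
{
    ToyIterator it = {value, len, 0, 8192};

    while (iter_next(&it))
    {
        if (contains(it.data, it.avail, needle, strlen(needle)))
            return true;            /* found; skip the remaining chunks */
    }
    return false;
}

The only point here is the shape of the loop: the caller asks for a bit more of the value at a time and bails out the moment the target shows up.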
P.
Thanks for your comment. I've updated the patch.
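For the archives, the intuition behind the bound, as a rough sketch rather than the exact arithmetic now in the patch: pglz output is a stream of control bytes, each governing up to 8 items, and an item is either a single literal byte or a 2-3 byte tag that expands to at least 3 output bytes. The worst case for compressed input consumed per decompressed output byte is therefore the all-literals case, one data byte plus one control bit per output byte; the only wrinkle is that the very last item may be a tag that has to be read in full even though only part of its output is needed.

#include <stdint.h>

/*
 * Rough sketch, not the function in the patch: how many compressed
 * bytes can the decompressor possibly need to read in order to
 * produce "rawsize" bytes of output?
 */
static int32_t
max_compressed_bytes_for_prefix(int32_t rawsize)
{
    /* all-literals worst case: one data byte plus one control bit each */
    int64_t     bound = ((int64_t) rawsize * 9 + 7) / 8;

    /*
     * The final item may be a tag that must be read in full although
     * only part of its output is wanted, so allow a little slop for it
     * (a fresh control byte plus a maximal 3-byte tag).  The exact
     * margin in the real function may differ.
     */
    return (int32_t) (bound + 4);
}

And of course the result never needs to exceed the datum's total compressed size, so it can be capped there.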
As for the iterator API, I've actually already implemented a de-TOAST iterator[0]. I'm now looking for more application scenarios for it and continuing to polish it.