Re: crashes due to setting max_parallel_workers=0 - Mailing list pgsql-hackers

From: Tomas Vondra
Subject: Re: crashes due to setting max_parallel_workers=0
Date:
Msg-id: c8d0e9a7-77a9-a161-2ece-8cabea340612@2ndquadrant.com
In response to: Re: crashes due to setting max_parallel_workers=0 (Robert Haas <robertmhaas@gmail.com>)
List: pgsql-hackers

On 03/27/2017 05:51 PM, Robert Haas wrote:
> On Mon, Mar 27, 2017 at 9:54 AM, Tom Lane <tgl@sss.pgh.pa.us> wrote:
>> Robert Haas <robertmhaas@gmail.com> writes:
>>> On Mon, Mar 27, 2017 at 1:29 AM, Rushabh Lathia
>>> <rushabh.lathia@gmail.com> wrote:
>>>> But it seems a bit futile to produce the parallel plan in the first place,
>>>> because with max_parallel_workers=0 we can't possibly get any parallel
>>>> workers ever. I wonder why compute_parallel_worker() only looks at
>>>> max_parallel_workers_per_gather, i.e. why shouldn't it do:
>>>> parallel_workers = Min(parallel_workers, max_parallel_workers);
>>>> Perhaps this was discussed and is actually intentional, though.
>>
>>> It was intentional.  See the last paragraph of
>>> https://www.postgresql.org/message-id/CA%2BTgmoaMSn6a1780VutfsarCu0LCr%3DCO2yi4vLUo-JQbn4YuRA@mail.gmail.com
>>
>> Since this has now come up twice, I suggest adding a comment there
>> that explains why we're intentionally ignoring max_parallel_workers.
> 
> Hey, imagine if the comments explained the logic behind the code!
> 
> Good idea.  How about the attached?
> 

Certainly an improvement. But perhaps we should also mention this at 
compute_parallel_worker, i.e. that not looking at max_parallel_workers 
is intentional.
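
For readers skimming the archive, here is a minimal, self-contained sketch of the clamp under discussion. This is stand-in C, not PostgreSQL's actual compute_parallel_worker(); the variable names, values, and the simplified function are placeholders used only to illustrate which limit is applied and which one is intentionally not:

/*
 * Sketch only: stand-ins for the GUCs discussed in the thread.
 * Values are arbitrary examples, not defaults.
 */
#include <stdio.h>

#define Min(x, y) ((x) < (y) ? (x) : (y))

static int max_parallel_workers_per_gather = 2;
static int max_parallel_workers = 0;

static int
compute_parallel_worker_sketch(int parallel_workers)
{
	/* The clamp that is applied: the per-Gather limit. */
	parallel_workers = Min(parallel_workers, max_parallel_workers_per_gather);

	/*
	 * The additional clamp proposed in the quoted mail.  Per the thread,
	 * omitting it is intentional, so it is left commented out here:
	 *
	 * parallel_workers = Min(parallel_workers, max_parallel_workers);
	 */

	return parallel_workers;
}

int
main(void)
{
	/* With the values above this prints 2, even though max_parallel_workers is 0. */
	printf("planned workers: %d\n", compute_parallel_worker_sketch(4));
	return 0;
}
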

regards

-- 
Tomas Vondra                  http://www.2ndQuadrant.com
PostgreSQL Development, 24x7 Support, Remote DBA, Training & Services


