Hi Aleksander,
On 3/2/18 7:18 AM, Aleksander Alekseev wrote:
>
>> You do realize we have the actual source database available, I hope? Since
>> it's our own system... There is no need to scrape the data back out -- if
>> we can just define what kind of reports we want, we can trivially run it on
>> the source database. Or if we want it more often, we can easily make a
>> webview for it. It's basically just a "map this URL to a SQL query"...
>
> I don't think commitfest.cputube.org has SQL data on whether a patch
> passes the tests; it just displays SVG images from travis-ci.org. Also,
> unfortunately, both commitfest.postgresql.org and commitfest.cputube.org
> currently don't have any kind of public API and don't allow exporting
> data, e.g. in CSV or JSON.
>
> I guess it would be nice if both services supported export in some
> format, so that anyone could build any kind of report or automation
> tool without parsing HTML with regular expressions or depending on
> other people.
Yes, that would be good. I just had a chance to look through the data
and the thing I was most hoping to do with it would be a bit complicated.
I would like to get, for each submitter, the total number of patches
they have submitted vs. the total number of patches they are reviewing.
In the past I have done this by eyeball.
Do you think you could put something like that together?
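To make the report concrete, here is a minimal sketch of that aggregation, assuming a hypothetical export where each patch record carries an "author" field and a "reviewers" list (the field names and sample data are made up; no such export exists today):

```python
import json
from collections import Counter

# Hypothetical dump format: one JSON object per patch, with an author
# and a list of reviewers. Field names are assumptions for illustration.
patches = json.loads("""
[
  {"name": "Patch A", "author": "alice", "reviewers": ["bob"]},
  {"name": "Patch B", "author": "bob",   "reviewers": ["alice", "carol"]},
  {"name": "Patch C", "author": "alice", "reviewers": ["carol"]}
]
""")

# Count patches submitted per person and patches reviewed per person.
submitted = Counter(p["author"] for p in patches)
reviewing = Counter(r for p in patches for r in p["reviewers"])

for person in sorted(set(submitted) | set(reviewing)):
    print(f"{person}: {submitted[person]} submitted, "
          f"{reviewing[person]} reviewing")
```

With the sample data above this prints one line per person, e.g. "alice: 2 submitted, 1 reviewing", which is exactly the submitted-vs-reviewing ratio described.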
> If I'm not mistaken, there was a discussion regarding public APIs.
> I wonder what prevents adding one, at least as a simple export of
> everything. After all, it is indeed just a matter of mapping a URL to
> a SQL query. For instance, this one:
>
> select array_to_json(array_agg(row_to_json(tbl))) from tbl;
I would be happy with a page that is simply a JSON dump of the
publicly-viewable fields in each table. Then I can do whatever I want
with the data.
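For what it's worth, a query like the one quoted above (array_to_json over array_agg of row_to_json) produces a single JSON array with one object per row, so such a dump page could be consumed with nothing but the standard library. A sketch, using a made-up row shape since the real schema isn't public:

```python
import json

# Hypothetical dump of one commitfest table; the field names are
# assumptions for illustration, not the real schema.
dump = '[{"id": 1, "name": "Patch A", "status": "Needs review"}]'

rows = json.loads(dump)   # one dict per source row
print(rows[0]["status"])  # -> Needs review
```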
Thanks,
--
-David
david@pgmasters.net