Re: Seeking Advice: PostgreSQL Performance Troubleshooting Without Third-Party Tools - Mailing list pgsql-admin

From Keith
Subject Re: Seeking Advice: PostgreSQL Performance Troubleshooting Without Third-Party Tools
Msg-id CAHw75vtLsBL-O5EGVTgzZR=MUXFG=k0r3vEYma7YVdaiaDZJ1Q@mail.gmail.com
In response to Seeking Advice: PostgreSQL Performance Troubleshooting Without Third-Party Tools  (mahamood hussain <hussain.ieg@gmail.com>)
List pgsql-admin
For log analysis, pgbadger (https://pgbadger.darold.net/) is the best tool I've come across. Just be sure to adjust your log_line_prefix to include information such as client host/IP, role, database, and error codes. Turn on other settings as well, such as logging for checkpoints, temp file generation, and lock waits. The more you log, the more pgbadger can help you analyze. Just keep an eye on your log volume, since heavy logging can itself affect the database. If you can write the logs to a different mount point, that isolates the logging I/O from the database's.
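As a rough illustration (exact values depend on your workload; see the pgbadger documentation for its recommended prefix), the settings mentioned above look something like this in postgresql.conf:

```ini
# A prefix pgbadger can parse: timestamp, pid, user, db, app, client host
log_line_prefix = '%t [%p]: user=%u,db=%d,app=%a,client=%h '
log_checkpoints = on                # log each checkpoint
log_lock_waits = on                 # log waits longer than deadlock_timeout
log_temp_files = 0                  # log every temp file created (threshold in kB)
log_autovacuum_min_duration = 0     # log all autovacuum activity
log_min_duration_statement = 1000   # log statements running >= 1s (illustrative)
```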

For trending other things, I'd recommend starting with something like Prometheus (https://prometheus.io/) & Grafana (https://grafana.com/). Get an exporter set up (sql_exporter is a good one - https://github.com/burningalchemist/sql_exporter) and set up alerting and trending on your metrics so you can go back and look at the history when those alerts fire. If you need to scale bigger later, you can start looking into something like Datadog, and since you'll already have a better idea of what you actually need to monitor by then, jumping to something like that won't seem nearly as overwhelming.
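To give a feel for the setup, a minimal sql_exporter collector might look like the following; the metric and query here are illustrative, and the exact file layout is documented in the sql_exporter README:

```yaml
# Illustrative collector file, e.g. pg_activity.collector.yml
collector_name: pg_activity
metrics:
  - metric_name: pg_backend_connections
    type: gauge
    help: 'Number of backends, by state'
    key_labels: [state]
    values: [count]
    query: |
      SELECT state, count(*) AS count
      FROM pg_stat_activity
      WHERE state IS NOT NULL
      GROUP BY state
```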

Keith

On Fri, Apr 10, 2026 at 1:49 PM mahamood hussain <hussain.ieg@gmail.com> wrote:

Hi Team,

We are currently working on a migration project from DB2 to PostgreSQL. Post-migration, we’re observing several performance issues such as long-running queries and occasional instance crashes. It also appears that some application-side workloads may not be optimized for PostgreSQL.

From a DBA perspective, I’m looking to proactively identify problem areas—such as:

  • Long-running queries
  • Jobs/stored procedures consuming high temp space
  • Queries resulting in sequential scans due to missing indexes
  • Lock waits, deadlocks, and memory-heavy operations
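(For reference, some of this is visible ad hoc from the catalogs; for example, sessions currently blocked on a lock can be listed with something like the query below, which relies on pg_blocking_pids(), available in PostgreSQL 9.6+:)

```sql
-- Sessions currently waiting on a lock, and which pids are blocking them
SELECT pid,
       pg_blocking_pids(pid) AS blocked_by,
       wait_event_type,
       wait_event,
       now() - query_start   AS waiting_for,
       left(query, 60)       AS query
FROM pg_stat_activity
WHERE cardinality(pg_blocking_pids(pid)) > 0;
```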

We already have key parameters enabled (pg_stat_statements, pg_buffercache, etc.), and PostgreSQL is generating logs in .csv format. However, the main challenge is efficiently analyzing these logs and identifying performance bottlenecks at scale (databases ranging from ~1TB to 15TB).
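(With pg_stat_statements already enabled, the heaviest statements by cumulative execution time and temp spill can be pulled with something like the sketch below; the column names assume PostgreSQL 13+, where total_time/mean_time were renamed to total_exec_time/mean_exec_time:)

```sql
-- Top 10 statements by cumulative execution time (PostgreSQL 13+)
SELECT left(query, 60)        AS query,
       calls,
       round(total_exec_time) AS total_ms,
       round(mean_exec_time)  AS mean_ms,
       temp_blks_written,     -- blocks spilled to temp files
       shared_blks_read       -- blocks read from disk/OS cache
FROM pg_stat_statements
ORDER BY total_exec_time DESC
LIMIT 10;
```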

We currently don’t have third-party monitoring tools like Datadog, so I’m looking for recommendations on free or lightweight tools and best practices to:

  • Parse and analyze PostgreSQL logs (especially CSV logs)
  • Identify top resource-consuming queries and patterns
  • Correlate temp usage, memory pressure, and query behavior
  • Generate actionable insights for the engineering team

Any suggestions on tools, scripts, or approaches that have worked well in similar large-scale environments would be greatly appreciated.
