"}},{"@type":"Question","name":"How do I export PostgreSQL slow query logs?","acceptedAnswer":{"@type":"Answer","text":"To capture slow queries in PostgreSQL: - Open the postgresql.conf
file and set the following parameters:
log_min_duration_statement = 1000 # log queries taking longer than 1 second log_statement = 'none' logging_collector = on log_directory = 'log' log_filename = 'postgresql-slow.log'
- Restart PostgreSQL for the changes to take effect. - To find the config file path, run:
psql -U postgres -c 'SHOW config_file'
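With logging_collector on and the threshold above, PostgreSQL appends lines such as `LOG:  duration: 1523.456 ms  statement: ...` to the log file. As a rough illustration of that format (the sample line and helper below are our own sketch, not part of the visualizer):

```python
import re

# Matches the "duration: <ms> ms  statement: <sql>" part that
# log_min_duration_statement adds to each logged query line.
DURATION_RE = re.compile(r"duration:\s+([\d.]+)\s+ms\s+statement:\s+(.*)")

def parse_duration(line):
    """Return (milliseconds, statement) or None if the line doesn't match."""
    m = DURATION_RE.search(line)
    if m is None:
        return None
    return float(m.group(1)), m.group(2)

# Illustrative line in the default stderr log format:
sample = "2024-05-01 12:00:00 UTC [1234] LOG:  duration: 1523.456 ms  statement: SELECT * FROM orders"
print(parse_duration(sample))  # (1523.456, 'SELECT * FROM orders')
```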
"}},{"@type":"Question","name":"How do I export slow query logs from MySQL?","acceptedAnswer":{"@type":"Answer","text":"To enable slow query logging in MySQL (self-hosted): - Edit the my.cnf
file and add:
slow_query_log = 1 long_query_time = 1 log_output = FILE slow_query_log_file = /var/lib/mysql/slow.log
- Restart MySQL to apply the changes.
To enable logging without a restart, run:
SET GLOBAL slow_query_log_file = '/var/lib/mysql/slow.log'; SET GLOBAL long_query_time = 1; SET GLOBAL slow_query_log = ON; FLUSH LOGS;
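Each entry MySQL writes to the slow log is preceded by a `# Query_time:` header line. A minimal parsing sketch (the helper and sample line are illustrative, not part of the tool):

```python
import re

# Matches the per-query header MySQL writes to the slow log, e.g.
# "# Query_time: 2.000000  Lock_time: 0.000010 Rows_sent: 1  Rows_examined: 100"
STATS_RE = re.compile(
    r"# Query_time: (?P<query_time>[\d.]+)\s+"
    r"Lock_time: (?P<lock_time>[\d.]+)\s+"
    r"Rows_sent: (?P<rows_sent>\d+)\s+"
    r"Rows_examined: (?P<rows_examined>\d+)"
)

def parse_stats(line):
    """Return the header fields as a dict, or None if the line doesn't match."""
    m = STATS_RE.match(line)
    if m is None:
        return None
    d = m.groupdict()
    return {
        "query_time": float(d["query_time"]),
        "lock_time": float(d["lock_time"]),
        "rows_sent": int(d["rows_sent"]),
        "rows_examined": int(d["rows_examined"]),
    }

header = "# Query_time: 2.000000  Lock_time: 0.000010 Rows_sent: 1  Rows_examined: 100"
print(parse_stats(header)["query_time"])  # 2.0
```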
"}},{"@type":"Question","name":"How do I export slow queries from MySQL or PostgreSQL on AWS RDS?","acceptedAnswer":{"@type":"Answer","text":"To enable slow query logs on AWS RDS-managed instances:
MySQL :
Update your DB parameter group:
slow_query_log = 1 long_query_time = 1 log_output = FILE
Download the logs from the Logs & events tab in the RDS Console.
PostgreSQL : Update your parameter group:
log_min_duration_statement = 500 logging_collector = on
Download the logs from the Logs & events tab in the RDS Console.
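Besides the console, RDS log files can also be fetched with the AWS CLI. A sketch, assuming an instance named `mydb` (a placeholder) and a MySQL-style slow-log file name; run `describe-db-log-files` first to see the actual names on your instance:

```shell
# List the log files available on the instance
aws rds describe-db-log-files --db-instance-identifier mydb

# Download one of them (file names vary by engine)
aws rds download-db-log-file-portion \
  --db-instance-identifier mydb \
  --log-file-name slowquery/mysql-slowquery.log \
  --output text > slow.log
```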
"}},{"@type":"Question","name":"How can I export slow queries from the MySQL slow_log table?","acceptedAnswer":{"@type":"Answer","text":"To convert slow query entries from the slow_log table into a file:
mysql -h <host> -u <user> -p --raw --skip-column-names --quick --silent --no-auto-rehash --compress -e " SELECT CONCAT( '# Time: ', DATE_FORMAT(start_time, '%y%m%d %H:%i:%s'), '\\n', '# User@Host: ', user_host, '\\n','# Query_time: ', TIME_TO_SEC(query_time), 'Rows_sent: ', rows_sent, ' Rows_examined: ', rows_examined, '\\n', 'SET timestamp=', UNIX_TIMESTAMP(start_time), ';', '\\n', sql_text, ';' ) FROM mysql.slow_log;" > slow_queries.log
Replace <host>
and <user>
with your values.
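Once exported, the file can be processed entry by entry: each entry produced by the CONCAT above starts with a `# Time:` line. A small sketch of splitting such a file (the split_entries helper and sample text are illustrative):

```python
def split_entries(log_text):
    """Split exported slow-log text into per-query entries.

    Each entry begins with a '# Time:' line, matching the format
    produced by the CONCAT export.
    """
    entries, current = [], []
    for line in log_text.splitlines():
        if line.startswith("# Time:") and current:
            entries.append("\n".join(current))
            current = []
        current.append(line)
    if current:
        entries.append("\n".join(current))
    return entries

sample = """# Time: 240501 12:00:00
# User@Host: app[app] @ localhost []
# Query_time: 2 Rows_sent: 1 Rows_examined: 100
SET timestamp=1714564800;
SELECT 1;
# Time: 240501 12:00:05
# User@Host: app[app] @ localhost []
# Query_time: 3 Rows_sent: 0 Rows_examined: 5000
SET timestamp=1714564805;
SELECT 2;"""

print(len(split_entries(sample)))  # 2
```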
"}},{"@type":"Question","name":"Why can't I see my queries in the log?","acceptedAnswer":{"@type":"Answer","text":"Check the following:
Slow query logging is enabled The threshold is set correctly:long_query_time
for MySQLlog_min_duration_statement
for PostgreSQL The log file path is set and accessible "}},{"@type":"Question","name":"What file formats are supported?","acceptedAnswer":{"@type":"Answer","text":"The log visualizer supports:
Plaintext logs (the default format for MySQL) CSV logs from PostgreSQL (if CSV logging is enabled) "}},{"@type":"Question","name":"What should I do if my log file does not upload or parse correctly?","acceptedAnswer":{"@type":"Answer","text":"Make sure: - The file follows the MySQL or PostgreSQL slow query log format - The file is complete and not truncated or corrupted - See the documentation for help with exporting and formatting log files
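When an upload fails, a quick sanity check is whether the first lines of the file look like either supported format. A heuristic sketch (our own, not the visualizer's actual validation logic):

```python
def detect_format(first_lines):
    """Rough guess at the log flavor from the opening lines.

    MySQL slow logs use '# Time:' / '# Query_time:' headers, while
    PostgreSQL stderr logs mark slow queries with 'LOG:  duration:'.
    """
    text = "\n".join(first_lines)
    if "# Query_time:" in text or text.lstrip().startswith("# Time:"):
        return "mysql"
    if "LOG:" in text and "duration:" in text:
        return "postgresql"
    return "unknown"

print(detect_format(["# Time: 240501 12:00:00"]))  # mysql
print(detect_format(["2024-05-01 12:00:00 UTC [99] LOG:  duration: 812.3 ms  statement: SELECT 1"]))  # postgresql
```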
"}}]}]Platform Solutions Resources Company Pricing
PostgreSQL and MySQL query log analyzer

Analyze slow query logs from PostgreSQL or MySQL and view them on a timeline to identify and fix performance issues.
Upload a slow query log file. No data is stored on our side: once you close the browser window, the log file is deleted. To learn how to export the slow query log, see the FAQ.
Slow queries log visualizer FAQ

- How do I export PostgreSQL slow query logs?
- How do I export slow query logs from MySQL?
- How do I export slow queries from MySQL or PostgreSQL on AWS RDS?
- How can I export slow queries from the MySQL slow_log table?
- Why can't I see my queries in the log?
- What file formats are supported?
- What should I do if my log file does not upload or parse correctly?
Aiven is an AI-ready open source data platform that combines open-choice services to rapidly stream, store and serve data across major cloud providers — simply and securely.
Copyright © Aiven 2016-2025. Apache, Apache Kafka, Kafka, Apache Flink, Flink, Apache Cassandra, and Cassandra are either registered trademarks or trademarks of the Apache Software Foundation in the United States and/or other countries. ClickHouse is a registered trademark of ClickHouse, Inc. https://clickhouse.com. M3, M3 Aggregator, OpenSearch, AlloyDB Omni, PostgreSQL, MySQL, InfluxDB, Grafana, Dragonfly, Valkey, Thanos, Terraform, and Kubernetes are trademarks and property of their respective owners. *Redis is a registered trademark of Redis Ltd. and the Redis box logo is a mark of Redis Ltd. Any rights therein are reserved to Redis Ltd. Any use by Aiven is for referential purposes only and does not indicate any sponsorship, endorsement or affiliation between Redis and Aiven. All product and service names used in this website are for identification purposes only and do not imply endorsement.