
Up to date access to postgres logs

May 5, 2017 / 2 Comments / in Andrew's PlanetPostgreSQL / by Andrew Dunstan

Some of my Italian colleagues have made a nifty little gadget called redislog for pushing postgres logs into Redis, the distributed in-memory cache. From there it can be fed into things like logstash. I thought it would be interesting instead to make the logs available via the Redis Foreign Data Wrapper as a Postgres table. That way we would have easy access to the running logs from Postgres with almost no effort. Here’s what I did.

First I built and installed redislog and redis_fdw. Then I added redislog to my server's shared_preload_libraries, set log_min_duration_statement to 0 so that every statement gets logged, and restarted.
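
Concretely, those two settings can be made along these lines (shown here via alter system as a minimal sketch; editing postgresql.conf by hand works just as well, and redislog's own Redis connection settings are described in its README):

-- redislog hooks into the logging system, so it must be preloaded;
-- append it to any existing shared_preload_libraries entries rather than replacing them
alter system set shared_preload_libraries = 'redislog';
-- log every statement, together with its duration
alter system set log_min_duration_statement = 0;
-- a full restart (not just a reload) is needed for shared_preload_libraries to take effect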

Then I created a database called logger and did this in it:

create extension redis_fdw;

-- the local Redis instance redislog is writing to (no address options given, so the FDW defaults apply)
create server localredis
  foreign data wrapper redis_fdw;

-- redislog pushes each log record onto a Redis list as a JSON document;
-- expose that list as a one-column foreign table
create foreign table redis_postgres_log(
    log_entry json
  )
  server localredis
  options (tabletype 'list', singleton_key 'postgres', database '0');

-- a composite type whose fields match the keys redislog puts in each JSON entry
create type log_type as (
  user_name text,
  database_name text,
  process_id int,
  remote_host text,
  session_id text,
  session_line_num int,
  command_tag text,
  session_start_time timestamptz,
  virtual_transaction_id text,
  transaction_id text,
  error_severity text,
  sql_state_code text,
  detail_log text,
  detail text,
  hint text,
  internal_query text,
  internal_query_pos text,
  context text,
  query text,
  query_pos text,
  file_location text,
  application_name text,
  message text,
  "@timestamp"  timestamptz);

-- expand each JSON entry into its typed columns
create view postgres_log as
  select (x).*
  from  redis_postgres_log r,
    json_populate_record(NULL::log_type, r.log_entry) as x;

-- "@timestamp" always needs quoting, so give it a friendlier name
alter table postgres_log
  rename "@timestamp" to log_timestamp;

After that I could select a random row from the view:

logger=# select * from public.postgres_log offset 10000 limit 1;
-[ RECORD 1 ]----------+-----------------------------------------------------------------------------------------------------
user_name              | andrew
database_name          | pgb3
process_id             | 27513
remote_host            | [local]
session_id             | 590b44c4.6b79
session_line_num       | 1936
command_tag            | UPDATE
session_start_time     | 2017-05-04 11:12:04-04
virtual_transaction_id | 3/2498
transaction_id         | 3216
error_severity         | LOG
sql_state_code         | 
detail_log             | 
detail                 | 
hint                   | 
internal_query         | 
internal_query_pos     | 
context                | 
query                  | 
query_pos              | 
file_location          | 
application_name       | pgbench
message                | duration: 0.235 ms  statement: UPDATE pgbench_tellers SET tbalance = tbalance + -4736 WHERE tid = 8;
log_timestamp          | 2017-05-04 11:12:09.593-04
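
Since postgres_log is just a view, ordinary SQL filtering and sorting work against it too. For illustration (this query isn't part of the setup above, but it uses only columns the view already exposes), recent errors could be pulled out like this:

select log_timestamp, user_name, database_name, message
  from postgres_log
 where error_severity = 'ERROR'
 order by log_timestamp desc
 limit 20;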

There’s a bunch of work to do to make this more scalable, for example by partitioning the log. It might also be that we need to enhance redislog and/or redis_fdw to make this work better. But in principle this is a pretty nice result: a painless way of getting up-to-the-second log entries as a postgres table.

2 replies

rbt says:
May 6, 2017 at 3:34 am

Does this perform better than using the file FDW to suck in the CSV file directly from the log directory?
Andrew Dunstan says:
May 6, 2017 at 1:40 pm

Probably not. But I was interested to see how well it might work anyway, since redislog is also useful for other purposes. Also, I have seen cases where extraneous matter seems to have got into CSV logs, making them unreadable by the file FDW. Redislog is pretty much guaranteed to have nothing but JSON.
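
(For reference, the CSV-plus-file FDW approach mentioned above follows the example in the file_fdw documentation, roughly like the sketch below; the filename is a placeholder and should point at whichever csvlog file the server is currently writing:)

create extension file_fdw;
create server pglog_server foreign data wrapper file_fdw;
create foreign table pglog (
  log_time timestamp(3) with time zone,
  user_name text,
  database_name text,
  process_id integer,
  connection_from text,
  session_id text,
  session_line_num bigint,
  command_tag text,
  session_start_time timestamp with time zone,
  virtual_transaction_id text,
  transaction_id bigint,
  error_severity text,
  sql_state_code text,
  message text,
  detail text,
  hint text,
  internal_query text,
  internal_query_pos integer,
  context text,
  query text,
  query_pos integer,
  location text,
  application_name text
) server pglog_server
  -- placeholder path: point this at your server's current csvlog file
  options ( filename 'log/postgresql.csv', format 'csv' );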
