Photo by Janita Top
It was the day I finally fell in love with Postgres. I got a challenging task: generate rows for a report that did not exist in any table. It turned out to be very easy to achieve in Postgres.
Clients can receive a service for a long time, and the report should show regular milestones to track progress: for example, a row for every three-week interval. Several events take place around each milestone, and the report should show whether the target number of KPIs was reached by that point. Service start and end dates are stored in one table, while the different types of events are stored in other tables.
I can’t say it’s a very difficult task. If I got it in a SQL Server environment, I would write a table-valued function that takes three parameters (start date, end date, and step) and adds the step to the start date in a loop until the end date is reached. The only complication is the step parameter: depending on whether it should be in days, weeks, or months, expressing it as a number of days would probably be simplest.
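A minimal sketch of such a function in T-SQL might look like this (the function, parameter, and column names are all hypothetical, invented for illustration):

```sql
-- Sketch of a table-valued function that emits one row per step.
-- dbo.GenerateDates, @StartDate, @EndDate, @StepDays are made-up names.
CREATE FUNCTION dbo.GenerateDates (
    @StartDate date,
    @EndDate   date,
    @StepDays  int
)
RETURNS @Dates TABLE (MilestoneDate date)
AS
BEGIN
    DECLARE @d date = @StartDate;
    WHILE @d <= @EndDate
    BEGIN
        INSERT INTO @Dates (MilestoneDate) VALUES (@d);
        -- Advance by the step; using days sidesteps the
        -- days/weeks/months ambiguity mentioned above.
        SET @d = DATEADD(day, @StepDays, @d);
    END;
    RETURN;
END;
```

A hand-rolled function like this would need its own tests for edge cases (end date before start date, zero or negative step), which is exactly the work Postgres saved me.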
Postgres already has this function: it is called generate_series. The version that generates timestamps takes start and end timestamps and a step in the interval format, so it is possible to pass any time interval, from seconds to years, without any extra conversions. It saved me some time, as I didn’t need to write or test any custom-made functions.
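For example, producing a row for every three-week milestone is a single call (the dates here are arbitrary examples):

```sql
-- One row per three-week milestone between the two timestamps.
SELECT milestone
FROM generate_series(
    timestamp '2023-01-01',
    timestamp '2023-12-31',
    interval '3 weeks'
) AS milestone;
```

In a report this can be driven by the stored dates, e.g. with something like `CROSS JOIN LATERAL generate_series(s.start_date, s.end_date, interval '3 weeks')` against the services table (column names assumed here), so each service gets its own set of milestone rows.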
It is not often that I face a task like this, yet Postgres already has a function for it. The range of well-thought-through built-in functions Postgres offers for all kinds of tasks is astonishing. I believe it shows how working closely with the community makes a product better.
Kate
Data masseuse