Posted to users@griffin.apache.org by Vikram Jain <vi...@enquero.com> on 2019/09/27 07:04:30 UTC

Running profiling job on data aggregated every 30 minutes

Hello All,
I have 2 tables whose date information includes hours and minutes along with the date. Sample schemas of the 2 tables are as follows -

Table 1 -
Column     Format / sample value
Day        yyyy-MM-dd
Hour       HH
Minute     mm
Sales      10001

Table 2 -
Column     Format / sample value
Day        yyyy-MM-dd : hh.mm
Sales      10001
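
For Table 1 I would first have to stitch the separate Day, Hour and Minute columns back into a single timestamp. A rough Spark (Scala) expression for that, assuming an existing SparkSession named spark and the table/column names from my example, would be:

    import org.apache.spark.sql.functions._

    // Table 1 keeps Day, Hour and Minute separately; combine them into one
    // timestamp column (table and column names are just from my example).
    val table1WithTs = spark.table("table1")
      .withColumn("ts",
        to_timestamp(
          concat_ws(" ", col("Day"), concat_ws(":", col("Hour"), col("Minute"))),
          "yyyy-MM-dd HH:mm"))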


Multiple records are written to both these tables every minute.
My task is to compute the average of the Sales column over every 30 minutes' worth of data, i.e. the average of the records from hh:01 to hh:30 and the average of the records from hh:31 to (hh+1):00.
I have not been able to work out how to model this with Griffin. Is there a way to express this scenario? A plain-Spark sketch of the computation I have in mind is below.
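
To make the requirement concrete, here is roughly what I want computed, written as plain Spark (Scala) rather than as a Griffin rule. The table and column names (table2, Day, Sales) are just placeholders from my example above, and the timestamp format string is my guess at the real data.

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions._

    object HalfHourSalesAverage {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("half-hour-sales-average")
          .getOrCreate()

        // Table 2 layout from my example: Day is a string like "2019-09-27 : 13.05",
        // Sales is numeric. The format string is an assumption about the real data.
        val table2 = spark.table("table2")
          .withColumn("ts", to_timestamp(col("Day"), "yyyy-MM-dd : HH.mm"))

        // Bucket each record into its half-hour window. Shifting back by one
        // minute before flooring puts a record stamped exactly hh:30 into the
        // window ending at hh:30, and a record stamped hh:31 into the window
        // ending at (hh+1):00, as described above.
        val windowed = table2.withColumn(
          "window_end",
          from_unixtime(floor((unix_timestamp(col("ts")) - 60) / 1800) * 1800 + 1800))

        // The profiling result I am after: average Sales per 30-minute window.
        val avgSales = windowed
          .groupBy(col("window_end"))
          .agg(avg(col("Sales")).as("avg_sales"))

        avgSales.show(truncate = false)
      }
    }

The same grouping would apply to Table 1 once its Day, Hour and Minute columns are combined as sketched above. If this kind of derived-window grouping can be expressed inside a Griffin profiling measure, that would solve my problem.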

Thanks a lot in advance for your help.

Regards,
Vikram