Posted to dev@griffin.apache.org by Vikram Jain <vi...@enquero.com> on 2019/09/27 05:50:03 UTC
Aggregating data every 30 minutes in a profiling job
Hello All,
I have two tables where the date information contains hours and minutes along with the date. Sample schemas of the two tables are as follows -

Table 1 -
  Day          Hour   Minute   Sales
  yyyy-MM-dd   HH     MM       10001

Table 2 -
  Day                  Sales
  yyyy-MM-dd : hh.mm   10001
Multiple records are written to both of these tables every minute.
I need to compute the average of the sales column over every 30 minutes' worth of data (i.e., the average of data from hh:01 to hh:30, and from hh:31 to (hh+1):00).
I am unable to think of a way to do this with Griffin. Is there a way we can model this scenario?
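Not a Griffin answer, but the bucketing logic itself can be sketched as follows: map each (hour, minute) pair to a 30-minute window key, with minutes 01-30 in the first half of the hour, minutes 31-59 in the second half, and minute 00 folded back into the previous hour's second half. A minimal Python sketch of that logic (column layout follows Table 1; names are assumptions):

```python
from collections import defaultdict

def window_key(day, hour, minute):
    """Map a timestamp to its 30-minute window key (day, hour, half).
    Half 0 covers hh:01-hh:30; half 1 covers hh:31-(hh+1):00.
    (Day rollover at midnight is not handled in this sketch.)"""
    if minute == 0:
        # hh:00 closes the (hh-1):31 .. hh:00 window
        return (day, (hour - 1) % 24, 1)
    return (day, hour, 0) if minute <= 30 else (day, hour, 1)

def average_sales(records):
    """records: iterable of (day, hour, minute, sales) tuples.
    Returns {window_key: average sales over that 30-minute window}."""
    sums, counts = defaultdict(float), defaultdict(int)
    for day, hour, minute, sales in records:
        k = window_key(day, hour, minute)
        sums[k] += sales
        counts[k] += 1
    return {k: sums[k] / counts[k] for k in sums}
```

Whatever tool runs the aggregation, the essential step is deriving this window key as a grouping column and averaging sales per key.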
Thanks a lot for your help in advance.
Regards,
Vikram
RE: Aggregating data every 30 minutes in a profiling job
Posted by Vikram Jain <vi...@enquero.com>.
I'm sorry - I realized the formatting was lost in the previous email. To avoid confusion, please find the ask again with the table schemas correctly formatted.
I have two tables where the date information contains hours and minutes along with the date. Sample schemas of the two tables are as follows -
Table 1 -
  Day          Hour   Minute   Sales
  yyyy-MM-dd   HH     MM       10001
Table 2 -
  Day                  Sales
  yyyy-MM-dd : hh.mm   10001
Multiple records are written to both of these tables every minute.
I need to compute the average of the sales column over every 30 minutes' worth of data (i.e., the average of data from hh:01 to hh:30, and from hh:31 to (hh+1):00). I am unable to think of a way to do this with Griffin. Is there a way we can model this scenario?
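Since Griffin's profiling measures are ultimately expressed as Spark-SQL rules, one possible direction (a sketch only, not a verified Griffin config; the table and column names are assumptions) is to derive the 30-minute window as computed columns and group on them. The same SQL shape, demonstrated here against an in-memory SQLite database for illustration:

```python
import sqlite3

# Hypothetical table mirroring Table 1: (day, hour, minute, sales).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales_t (day TEXT, hour INTEGER, minute INTEGER, sales REAL)")
conn.executemany(
    "INSERT INTO sales_t VALUES (?, ?, ?, ?)",
    [("2019-09-27", 10, 1, 100), ("2019-09-27", 10, 30, 200),
     ("2019-09-27", 10, 31, 300), ("2019-09-27", 11, 0, 500)],
)

# hh:01-hh:30 -> half 0 of hour hh; hh:31-(hh+1):00 -> half 1 of hour hh.
# A minute of 00 is folded back into the previous hour's second half.
query = """
SELECT day,
       CASE WHEN minute = 0 THEN (hour + 23) % 24 ELSE hour END AS win_hour,
       CASE WHEN minute BETWEEN 1 AND 30 THEN 0 ELSE 1 END     AS win_half,
       AVG(sales)                                              AS avg_sales
FROM sales_t
GROUP BY day, win_hour, win_half
"""
rows = conn.execute(query).fetchall()
```

A rule of this shape (minus the SQLite scaffolding) could be a starting point for a spark-sql-type rule in a Griffin measure, though I have not confirmed this against a running Griffin job.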
Thanks a lot for your help in advance.
Regards,
Vikram
-----Original Message-----
From: Vikram Jain <vi...@enquero.com>
Sent: Friday, September 27, 2019 11:20 AM
To: dev@griffin.apache.org
Subject: Aggregating data every 30 minutes in a profiling job