Posted to commits@druid.apache.org by GitBox <gi...@apache.org> on 2019/02/22 01:19:44 UTC

[GitHub] clintropolis commented on issue #7124: rework segment metadata query "size" analysis

URL: https://github.com/apache/incubator-druid/issues/7124#issuecomment-466235056
 
 
   > Can such a feature allow comparisons to https://cloud.google.com/bigquery/pricing ?
   
   I'm not sure; I think it would depend on whether "size data requires to be loaded on historicals" or "size data requires to be at rest in deep storage" is the more appropriate metric to compare against BigQuery's pricing. My suggestion would provide a means to get the former, which I think is probably the better value to use in the Druid world, since data needs to be loaded to be useful, so ... yes, I guess? 😅 That said, you might need both values if you also wanted to meter segments that are in deep storage and _not_ loaded, or are trying to bill for all costs.
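   
   For concreteness, here is a rough sketch (not part of the proposal itself) of pulling the existing segmentMetadata `size` analysis from a Broker; the Broker URL, datasource name, and interval below are placeholder assumptions, and the returned "size" is the estimate whose meaning this issue is reworking:
   
   ```python
   # Illustrative only: query the existing segmentMetadata "size" analysis.
   # Broker URL, datasource, and interval are placeholder assumptions.
   import requests
   
   BROKER_URL = "http://localhost:8082/druid/v2"  # assumed default Broker port
   
   query = {
       "queryType": "segmentMetadata",
       "dataSource": "wikipedia",                  # placeholder datasource
       "intervals": ["2019-01-01/2019-02-01"],     # placeholder interval
       "analysisTypes": ["size"],                  # request only the size analysis
   }
   
   resp = requests.post(BROKER_URL, json=query)
   resp.raise_for_status()
   
   # Each element describes one segment; "size" is the analyzed estimate,
   # not necessarily the loaded or deep-storage footprint.
   for segment in resp.json():
       print(segment["id"], segment["size"])
   ```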
