Posted to user@flink.apache.org by Philip Lee <ph...@gmail.com> on 2015/12/07 20:54:50 UTC

Hello a question about metrics

Hello, a question about metrics.

I want to evaluate some queries on Spark, Flink, and Hive for a comparison.

I am using 'vmstat' to collect metrics on memory usage, swap, I/O, and CPU.
Is my way of evaluating correct, given that these frameworks use the JVM's
resources for memory and CPU?
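
Concretely, what I do is roughly the following sketch (Python; it only
assumes 'vmstat' is on the PATH, and the duration and log file name are
placeholders), started in parallel with the query:

# Sketch: sample vmstat while the query runs elsewhere.
# Assumptions: 'vmstat' is on the PATH; duration and log file are placeholders.
import subprocess
import time

def sample_vmstat(duration_s=120, interval_s=1, logfile="vmstat.log"):
    """Run 'vmstat <interval>' for about duration_s seconds and keep the output."""
    with open(logfile, "w") as out:
        proc = subprocess.Popen(["vmstat", str(interval_s)],
                                stdout=out, stderr=subprocess.STDOUT)
        time.sleep(duration_s)  # the query runs in another shell meanwhile
        proc.terminate()
        proc.wait()

if __name__ == "__main__":
    sample_vmstat()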

Is there any Linux application you use for metrics?

Best,
Phil.

Re: Hello a question about metrics

Posted by Philip Lee <ph...@gmail.com>.
Thanks for your suggestion!

I will try it later!
My group members want to use Linux applications like cpustat, memstat, and vmstat.

The point is that Spark and Flink run on the JVM, right?
FYI, cpustat and memstat capture hardware resource usage, not the virtual machine's.
Do you think cpustat and memstat would give appropriate metrics for runs on
Flink and Spark?
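
To cross-check the machine-wide numbers, I am also thinking of reading
per-process figures from /proc, roughly like the sketch below (assuming I
know the pid of the JVM process, e.g. a Flink TaskManager or a Spark
executor; that pid is an assumption on my side):

# Sketch: read per-process memory from /proc to complement machine-wide tools.
# Assumption: 'pid' is the JVM process (e.g. a Flink TaskManager or Spark executor).
def jvm_memory_kib(pid):
    """Return (VmRSS, VmSwap) in KiB for the given process, from /proc/<pid>/status."""
    rss = swap = None
    with open("/proc/%d/status" % pid) as f:
        for line in f:
            if line.startswith("VmRSS:"):
                rss = int(line.split()[1])
            elif line.startswith("VmSwap:"):
                swap = int(line.split()[1])
    return rss, swap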

Regards,
Phil



On Tue, Dec 8, 2015 at 8:50 AM, Chiwan Park <ch...@apache.org> wrote:

> Hi Philip,
>
> As far as I know, ganglia[1] is widely used for monitoring the Hadoop
> ecosystem. If your Hadoop distribution was installed via Apache Ambari[2],
> you can fetch metrics easily from the pre-installed ganglia.
>
> [1]: http://ganglia.sourceforge.net
> [2]: https://ambari.apache.org
>
> > On Dec 8, 2015, at 4:54 AM, Philip Lee <ph...@gmail.com> wrote:
> >
> > Hello, a question about metrics.
> >
> > I want to evaluate some queries on Spark, Flink, and Hive for a
> comparison.
> >
> > I am using 'vmstat' to collect metrics on memory usage, swap, I/O, and
> CPU. Is my way of evaluating correct, given that these frameworks use the
> JVM's resources for memory and CPU?
> >
> > Is there any Linux application you use for metrics?
> >
> > Best,
> > Phil.
>
> Regards,
> Chiwan Park
>
>
>
>

Re: Hello a question about metrics

Posted by Chiwan Park <ch...@apache.org>.
Hi Philip,

As far as I know, ganglia[1] is widely used for monitoring the Hadoop ecosystem. If your Hadoop distribution was installed via Apache Ambari[2], you can fetch metrics easily from the pre-installed ganglia.

[1]: http://ganglia.sourceforge.net
[2]: https://ambari.apache.org
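
If you only need the raw numbers rather than the web UI, gmond also returns
an XML dump when you connect to its collection port; a rough sketch (the
default port 8649 and the host name are only assumptions here):

# Sketch: pull the raw metrics XML from a gmond daemon.
# Assumptions: gmond is reachable on the default port 8649; the host name is a placeholder.
import socket

def fetch_ganglia_xml(host="gmond-host", port=8649):
    """Connect to gmond and read the full XML metrics dump it sends on connect."""
    chunks = []
    with socket.create_connection((host, port), timeout=10) as sock:
        while True:
            data = sock.recv(65536)
            if not data:
                break
            chunks.append(data)
    return b"".join(chunks).decode("utf-8", errors="replace")

if __name__ == "__main__":
    print(fetch_ganglia_xml()[:500])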

> On Dec 8, 2015, at 4:54 AM, Philip Lee <ph...@gmail.com> wrote:
> 
> Hello, a question about metrics.
> 
> I want to evaluate some queries on Spark, Flink, and Hive for a comparison.
> 
> I am using 'vmstat' to collect metrics on memory usage, swap, I/O, and CPU. Is my way of evaluating correct, given that these frameworks use the JVM's resources for memory and CPU?
> 
> Is there any Linux application you use for metrics?
> 
> Best,
> Phil.

Regards,
Chiwan Park