Posted to dev@impala.apache.org by Todd Lipcon <to...@cloudera.com> on 2017/02/21 03:12:00 UTC

status-benchmark.cc compilation time

Hi all,

I've noticed that status-benchmark.cc takes 8+ minutes to compile on my dev
box. It seems this is due to the use of macro expansions to unroll a loop
1000x for the benchmarks.
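
For anyone who hasn't opened the file, the pattern at fault is roughly the
following sketch (the macro names and the Status stand-in are illustrative,
not the actual code in status-benchmark.cc):

    #include <cstdio>

    // Illustrative stand-in for impala::Status; the real class is richer.
    struct Status {
      bool ok() const { return true; }
    };

    Status DoWork() { return Status(); }

    // Preprocessor-based unrolling: each macro pastes its argument repeatedly,
    // so REPEAT_1000 emits 1000 textual copies of the statement.
    #define REPEAT_10(x) x x x x x x x x x x
    #define REPEAT_100(x) REPEAT_10(REPEAT_10(x))
    #define REPEAT_1000(x) REPEAT_100(REPEAT_10(x))

    int main() {
      int not_ok = 0;
      // The compiler has to parse and optimize 1000 separate copies of this
      // statement, which is what makes the translation unit so slow to build.
      REPEAT_1000(if (!DoWork().ok()) ++not_ok;)
      printf("not ok: %d\n", not_ok);
      return 0;
    }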

Whether that's actually a useful benchmark that reflects real-life usage of
Status, I'm not sure. But I'd wager that no one is looking at the results
of this benchmark on a regular basis, and maybe we could remove or disable
it for a normal build?

Any objections to a patch which either disables the unrolling (subject to
an #ifdef that could be easily re-enabled) or removes the benchmark from
CMakeLists.txt (also easy to re-enable)? This is by far the long pole in my
compilation.
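
The #ifdef variant could look something like this (guard name invented here
for illustration, reusing REPEAT_1000 from the sketch above):

    // Sketch only: opt in to the expensive expansion with
    // -DSTATUS_BENCHMARK_FULL_UNROLL; by default fall back to a plain loop
    // that compiles quickly but still makes the same number of calls.
    #ifdef STATUS_BENCHMARK_FULL_UNROLL
    #define BENCH_REPEAT_1000(x) REPEAT_1000(x)
    #else
    #define BENCH_REPEAT_1000(x) for (int bench_i = 0; bench_i < 1000; ++bench_i) { x }
    #endif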

-Todd
-- 
Todd Lipcon
Software Engineer, Cloudera

Re: status-benchmark.cc compilation time

Posted by Lars Volker <lv...@cloudera.com>.
I think -notests already skips the benchmarks. However, I understood the
proposal to be to disable building the benchmarks even without -notests,
i.e. they'd be disabled by default and you'd need to specify
-build_benchmarks to build them.

I'm in favor of doing that, including building them in exhaustive runs.

On Tue, Feb 21, 2017 at 10:50 AM, Alex Behm <al...@cloudera.com> wrote:

> +1 for not compiling the benchmarks in -notests

Re: status-benchmark.cc compilation time

Posted by Henry Robinson <he...@cloudera.com>.
Fair enough, always worth checking the easy things first :)

On 23 February 2017 at 10:20, Zachary Amsden <za...@cloudera.com> wrote:

> Yes.  If you take a look at the benchmark, you'll notice the JNI call to
> initialize the frontend doesn't even have the right signature anymore.
> That's one easy way to bitrot while still compiling.
>
> Even fixing that isn't enough to get it off the ground.
>
>  - Zach



-- 
Henry Robinson
Software Engineer
Cloudera
415-994-6679

Re: status-benchmark.cc compilation time

Posted by Tim Armstrong <ta...@cloudera.com>.
I think for status-benchmark.cc we should just reduce the unrolling - I
don't see a valid reason to unroll a loop that many times unless you're
just testing the compiler. There's no reason we can't unroll the loop, say,
10 times, run that 100 times, and get an equally valid result.
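
Concretely, that could look like the following sketch (again with
illustrative names and the DoWork/Status stand-ins from the earlier sketch,
not the real benchmark code):

    // 10x manual unrolling plus a 100-iteration runtime loop: the same 1000
    // calls per measurement, but the compiler only sees 10 copies of the body.
    #define REPEAT_10(x) x x x x x x x x x x

    int RunStatusBench() {
      int not_ok = 0;
      for (int i = 0; i < 100; ++i) {
        REPEAT_10(if (!DoWork().ok()) ++not_ok;)
      }
      return not_ok;
    }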

Todd's suggestion of just running the benchmark for a couple of iterations
is a reasonable idea, although I think it depends on whether the benchmarks
are once-off experiments (in which case it seems OK to let them bit-rot) or
whether they are actually likely to be reused.

I think if we're going to more actively maintain benchmarks we should also
consider more proactively disabling or removing once-off benchmarks that
aren't likely to be reused.

On Thu, Feb 23, 2017 at 10:26 AM, Henry Robinson <he...@cloudera.com> wrote:

> I think the main problem I want to avoid is paying the cost of linking,
> which is expensive for Impala as it often generates multi-hundred-MB
> binaries per benchmark or test.
>
> Building the benchmarks during GVO seems the best solution to that to me.

Re: status-benchmark.cc compilation time

Posted by Henry Robinson <he...@cloudera.com>.
I think the main problem I want to avoid is paying the cost of linking,
which is expensive for Impala since it often produces a multi-hundred-MB
binary per benchmark or test.

Building the benchmarks during GVO seems the best solution to that to me.




-- 
Henry Robinson
Software Engineer
Cloudera
415-994-6679

Re: status-benchmark.cc compilation time

Posted by Todd Lipcon <to...@cloudera.com>.
One thing we've found useful in Kudu to prevent bitrot of benchmarks is to
actually use gtest and gflags for the benchmark programs.

We set a flag like --benchmark_num_rows or --benchmark_num_iterations
with a default that's low enough to run for only a second or two, and run
it as part of our normal test suite. That rarely catches any bugs, but it
serves to make sure the code keeps working. Then, when a developer wants to
actually test a change for performance, they can run it with
--benchmark_num_iterations=<high number>.
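
A sketch of that pattern (the flag and test names are made up here; this is
the Kudu-style shape of the idea, not code from either repo):

    #include <gflags/gflags.h>
    #include <gtest/gtest.h>

    // Default is low enough that the "benchmark" finishes in a second or two
    // when run as an ordinary test; raise it on the command line to measure.
    DEFINE_int32(benchmark_num_iterations, 1000, "iterations per benchmark");

    struct Status { bool ok() const { return true; } };  // illustrative stand-in
    Status DoWork() { return Status(); }

    TEST(StatusBenchmark, ReturnOk) {
      int not_ok = 0;
      for (int i = 0; i < FLAGS_benchmark_num_iterations; ++i) {
        if (!DoWork().ok()) ++not_ok;
      }
      ASSERT_EQ(0, not_ok);  // correctness check; timing is observed separately
    }

    int main(int argc, char** argv) {
      ::testing::InitGoogleTest(&argc, argv);
      gflags::ParseCommandLineFlags(&argc, &argv, true);
      return RUN_ALL_TESTS();
    }

A performance run is then just the same binary invoked with, say,
--benchmark_num_iterations=100000000.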

That doesn't help the weird case of status-benchmark, where *compiling*
takes 10 minutes... but I think the manual unrolling of 1000 Status calls in
there is probably unrealistic anyway as a measure of how the different
options perform in a whole-program setting.

-Todd

On Thu, Feb 23, 2017 at 10:20 AM, Zachary Amsden <za...@cloudera.com>
wrote:

> Yes.  If you take a look at the benchmark, you'll notice the JNI call to
> initialize the frontend doesn't even have the right signature anymore.
> That's one easy way to bitrot while still compiling.
>
> Even fixing that isn't enough to get it off the ground.
>
>  - Zach



-- 
Todd Lipcon
Software Engineer, Cloudera

Re: status-benchmark.cc compilation time

Posted by Zachary Amsden <za...@cloudera.com>.
Yes.  If you take a look at the benchmark, you'll notice the JNI call to
initialize the frontend doesn't even have the right signature anymore.
That's one easy way to bitrot while still compiling.

Even fixing that isn't enough to get it off the ground.

 - Zach

On Tue, Feb 21, 2017 at 11:44 AM, Henry Robinson <he...@cloudera.com> wrote:

> Did you run . bin/set-classpath.sh before running expr-benchmark?

Re: status-benchmark.cc compilation time

Posted by Henry Robinson <he...@cloudera.com>.
Did you run . bin/set-classpath.sh before running expr-benchmark?

On 21 February 2017 at 11:30, Zachary Amsden <za...@cloudera.com> wrote:

> Unfortunately some of the benchmarks have actually bit-rotted.  For
> example, expr-benchmark compiles but immediately throws JNI exceptions.



-- 
Henry Robinson
Software Engineer
Cloudera
415-994-6679

Re: status-benchmark.cc compilation time

Posted by Zachary Amsden <za...@cloudera.com>.
Unfortunately some of the benchmarks have actually bit-rotted.  For
example, expr-benchmark compiles but immediately throws JNI exceptions.

On Tue, Feb 21, 2017 at 10:55 AM, Marcel Kornacker <ma...@cloudera.com>
wrote:

> I'm also in favor of not compiling it on the standard commandline.
>
> However, I'm very much against allowing the benchmarks to bitrot. As
> was pointed out, those benchmarks can be valuable tools during
> development, and keeping them in working order shouldn't really impact
> the development process.
>
> In other words, let's compile them as part of gvo.

Re: status-benchmark.cc compilation time

Posted by Marcel Kornacker <ma...@cloudera.com>.
I'm also in favor of not compiling it on the standard command line.

However, I'm very much against allowing the benchmarks to bitrot. As
was pointed out, those benchmarks can be valuable tools during
development, and keeping them in working order shouldn't really impact
the development process.

In other words, let's compile them as part of GVO.

On Tue, Feb 21, 2017 at 10:50 AM, Alex Behm <al...@cloudera.com> wrote:
> +1 for not compiling the benchmarks in -notests

Re: status-benchmark.cc compilation time

Posted by Alex Behm <al...@cloudera.com>.
+1 for not compiling the benchmarks in -notests


Re: status-benchmark.cc compilation time

Posted by Jim Apple <jb...@cloudera.com>.
> On which note, would anyone object if we disabled benchmark compilation by
> default when building the BE tests? I mean separating out -notests into
> -notests and -build_benchmarks (the latter false by default).

I think this is a great idea.

> I don't mind if the benchmarks bitrot as a result, because we don't run
> them regularly or pay attention to their output except when developing a
> feature. Of course, maybe an 'exhaustive' run should build the benchmarks
> as well just to keep us honest, but I'd be happy if 95% of Jenkins builds
> didn't bother.

The pre-merge (aka GVM aka GVO) testing runs the
http://jenkins.impala.io:8080/job/all-build-options job, which builds
without the "-notests" flag.

Re: status-benchmark.cc compilation time

Posted by Henry Robinson <he...@cloudera.com>.
On which note, would anyone object if we disabled benchmark compilation by
default when building the BE tests? I mean splitting -notests into
-notests and -build_benchmarks (the latter false by default).

I don't mind if the benchmarks bitrot as a result, because we don't run
them regularly or pay attention to their output except when developing a
feature. Of course, maybe an 'exhaustive' run should build the benchmarks
as well just to keep us honest, but I'd be happy if 95% of Jenkins builds
didn't bother.

On 20 February 2017 at 19:18, Jim Apple <jb...@cloudera.com> wrote:

> https://issues.cloudera.org/browse/IMPALA-3784
>
> I'd prefer it removed from git if it is removed from CMakeLists.txt,
> but I'm OK with either that or removing the unrolling.
>
> Also, as a reminder, "-notests" skips building the tests.



-- 
Henry Robinson
Software Engineer
Cloudera
415-994-6679

Re: status-benchmark.cc compilation time

Posted by Jim Apple <jb...@cloudera.com>.
https://issues.cloudera.org/browse/IMPALA-3784

I'd prefer it removed from git if it is removed from CMakeLists.txt,
but I'm OK with either that or removing the unrolling.

Also, as a reminder, "-notests" skips building the tests.
