Posted to dev@accumulo.apache.org by Russ Weeks <rw...@newbrightidea.com> on 2015/05/09 02:01:53 UTC

1.6 -> 1.7 compat problems with lexicoders?

Hey, folks,
I've built some code against Accumulo 1.6.1 (HDP 2.2) and I'm having a bit
of trouble running it against the Accumulo 1.7 jars (commit a8ef75e).

The error I'm seeing is:

Exception in thread "main" org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 5.0 failed 1 times, most recent failure: Lost task 0.0 in stage 5.0 (TID 4, localhost): java.lang.NoSuchMethodError: org.apache.accumulo.core.client.lexicoder.LongLexicoder.decode([B)Ljava/lang/Long;
    at com.phemi.agile.pidx.PartitionedIndexRDDPartitioner.getPartition(PartitionedIndexBuilder.java:164)
    at org.apache.spark.util.collection.ExternalSorter.org$apache$spark$util$collection$ExternalSorter$$getPartition(ExternalSorter.scala:113)
    at org.apache.spark.util.collection.ExternalSorter.insertAll(ExternalSorter.scala:220)
    at org.apache.spark.shuffle.sort.SortShuffleWriter.write(SortShuffleWriter.scala:63)
    at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:68)
    at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:41)
    at org.apache.spark.scheduler.Task.run(Task.scala:64)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:203)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)

The code which triggers the error is pretty straightforward:

    @Override
    public int getPartition(Object keyObj) {
        Key key = (Key) keyObj;
        // The row is a lexicoder-encoded long identifying the partition.
        long partId = longLex.decode(key.getRow().copyBytes());
        if (partId < 0 || partId >= numPartitions) {
            throw new IllegalArgumentException("Key " + key + " invalid partition: " + partId);
        }
        return (int) partId;
    }

The decode method exists in 1.7's LongLexicoder, of course, but its
implementation has been moved up into the abstract parent class
AbstractEncoder. It doesn't seem to me like that should break any method
signatures, and yet...
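
My working theory: this is a binary-compatibility break, not a source-level
one. When the typed decode implementation is declared directly on
LongLexicoder (as in 1.6.1), the compiled class contains a method with the
exact descriptor decode([B)Ljava/lang/Long;, which is what call sites
compiled against 1.6.1 reference. Once the implementation moves up into a
generic parent, erasure leaves only decode([B)Ljava/lang/Object; anywhere in
the hierarchy, and JVM method resolution matches descriptors exactly. Here's
a sketch with hypothetical stand-in classes (not the real Accumulo sources):

    // Version 1 (what the caller was compiled against):
    //
    //     public class LongCoder {
    //         public Long decode(byte[] b) { ... }  // descriptor: ([B)Ljava/lang/Long;
    //     }
    //
    // Version 2 (what's on the classpath at runtime):

    abstract class AbstractCoder<T> {
        // After erasure this is decode([B)Ljava/lang/Object; -- the typed
        // descriptor ([B)Ljava/lang/Long; no longer exists anywhere.
        public T decode(byte[] b) {
            return decodeUnchecked(b);
        }
        protected abstract T decodeUnchecked(byte[] b);
    }

    class LongCoder extends AbstractCoder<Long> {
        // No decode(byte[]) of its own any more, only the inherited generic one.
        @Override
        protected Long decodeUnchecked(byte[] b) {
            return java.nio.ByteBuffer.wrap(b).getLong();
        }
    }

    public class BinaryCompatSketch {
        public static void main(String[] args) {
            byte[] row = java.nio.ByteBuffer.allocate(8).putLong(42L).array();
            // Compiled against version 2, javac emits
            //   invokevirtual LongCoder.decode([B)Ljava/lang/Object;  (plus a cast)
            // and this works. A caller compiled against version 1 instead
            // carries
            //   invokevirtual LongCoder.decode([B)Ljava/lang/Long;
            // in its class file; the JVM finds no method with that exact
            // descriptor in LongCoder or its supers and throws
            // NoSuchMethodError.
            long partId = new LongCoder().decode(row);
            System.out.println(partId);
        }
    }

If that's what's happening, the same source still compiles cleanly against
1.7, so the failure only shows up at link time, exactly like the trace above.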

If this isn't a compatibility problem, what am I doing wrong here?

Thanks,
-Russ

Re: 1.6 -> 1.7 compat problems with lexicoders?

Posted by Russ Weeks <rw...@newbrightidea.com>.
Should have mentioned: switching back to the 1.6.1 jars at runtime fixes the
problem.


Re: 1.6 -> 1.7 compat problems with lexicoders?

Posted by Josh Elser <jo...@gmail.com>.
Thanks for raising an issue. Your assessment seems good. Great detective 
work on this one!

Russ Weeks wrote:
> I hope this is a false alarm but I put together a simple test case to take
> Spark out of the equation... the code still fails with a NoSuchMethodError.
> I've raised ACCUMULO-3789.

Re: 1.6 -> 1.7 compat problems with lexicoders?

Posted by Russ Weeks <rw...@newbrightidea.com>.
I was hoping this was a false alarm, but I put together a simple test case
to take Spark out of the equation and the code still fails with a
NoSuchMethodError. I've raised ACCUMULO-3789.
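
The test case isn't reproduced in this thread; a minimal sketch along the
same lines (hypothetical class name, standard Lexicoder API) would be to
compile the class below against the 1.6.1 client jar, then run it with only
the 1.7 jar on the classpath. It should fail on the decode call with the
same NoSuchMethodError, no Spark involved:

    import org.apache.accumulo.core.client.lexicoder.LongLexicoder;

    public class LexicoderCompatCheck {
        public static void main(String[] args) {
            LongLexicoder lex = new LongLexicoder();
            byte[] encoded = lex.encode(42L);
            // Compiled against 1.6.1, this call site references
            // LongLexicoder.decode([B)Ljava/lang/Long; -- run against the
            // 1.7 jars it throws NoSuchMethodError before printing anything.
            Long decoded = lex.decode(encoded);
            System.out.println(decoded);
        }
    }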
