Posted to users@zeppelin.apache.org by Muhammad Rezaul Karim <re...@yahoo.com> on 2017/02/14 16:10:03 UTC

NoSuchMethodException in SQL

Hi All,
I am receiving the following exception while executing SQL queries: 
 java.lang.NoSuchMethodException: org.apache.spark.io.LZ4CompressionCodec.<init>(org.apache.spark.SparkConf)
    at java.lang.Class.getConstructor0(Class.java:3082)
    at java.lang.Class.getConstructor(Class.java:1825)
    at org.apache.spark.io.CompressionCodec$.createCodec(CompressionCodec.scala:71)
    at org.apache.spark.io.CompressionCodec$.createCodec(CompressionCodec.scala:65)
    at org.apache.spark.sql.execution.SparkPlan.org$apache$spark$sql$execution$SparkPlan$$decodeUnsafeRows(SparkPlan.scala:250)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeCollect$1.apply(SparkPlan.scala:276)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeCollect$1.apply(SparkPlan.scala:275)
    at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
    at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:186)
    at org.apache.spark.sql.execution.SparkPlan.executeCollect(SparkPlan.scala:275)
    at org.apache.spark.sql.execution.exchange.BroadcastExchangeExec$$anonfun$relationFuture$1$$anonfun$apply$1.apply(BroadcastExchangeExec.scala:78)
    at org.apache.spark.sql.execution.exchange.BroadcastExchangeExec$$anonfun$relationFuture$1$$anonfun$apply$1.apply(BroadcastExchangeExec.scala:75)
    at org.apache.spark.sql.execution.SQLExecution$.withExecutionId(SQLExecution.scala:94)
    at org.apache.spark.sql.execution.exchange.BroadcastExchangeExec$$anonfun$relationFuture$1.apply(BroadcastExchangeExec.scala:74)
    at org.apache.spark.sql.execution.exchange.BroadcastExchangeExec$$anonfun$relationFuture$1.apply(BroadcastExchangeExec.scala:74)
    at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
    at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)

My SQL query is: 
%sql  select * from land where Price >= 10000 AND CLUSTER = 2 

I always get this exception on the first run, but when I re-execute the same query a second or third time, the error does not appear.

Am I doing something wrong? Someone, please help me out. 





Kind regards,
Reza

Re: NoSuchMethodException in SQL

Posted by Jeff Zhang <zj...@gmail.com>.
Zeppelin creates the SparkContext implicitly for users, so it is too late to
set this property after the interpreter has been opened. You can try setting
it in the interpreter settings instead.
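
For example (a sketch only: "spark.io.compression.codec" is the standard Spark
property, and "lz4" is just an assumed codec choice), you would open the
Interpreter menu, edit the spark interpreter, and add the property as a new
key/value pair:

    name:  spark.io.compression.codec
    value: lz4

Then save and restart the interpreter so that the freshly created SparkContext
picks the value up.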




Muhammad Rezaul Karim <re...@yahoo.com> wrote on Thursday, February 16, 2017 at 11:52 PM:

> Hi Lee,
>
> Thanks for the info; that really helped. I set the compression codec on
> the Spark side, i.e. inside SPARK_HOME, and now the problem is resolved.
> However, I was wondering if it's possible to set the same thing from the
> Zeppelin notebook.
>
> I tried it in the following way:
>
> %spark conf.set("spark.io.compression.codec", "lz4")
>
> But I am getting an error. Please advise.
>
> [earlier messages in the thread and the quoted stack trace snipped]

Re: NoSuchMethodException in SQL

Posted by Jongyoul Lee <jo...@gmail.com>.
Which version of Spark do you use? I've found that the current version
supports "lz4" by default, so it looks like you don't have to set anything.
If you want another type of compression, set it in the interpreter tab
instead of using "conf.set". Calling "conf.set" has no effect, because the
SparkInterpreter of Apache Zeppelin creates the SparkContext before you set
the configuration. Thus you should use the interpreter tab to pass such
settings to the SparkContext.
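
As a quick check, here is a minimal sketch (assuming the spark interpreter is
bound so that "sc" is available in a notebook paragraph) that prints the codec
the already-created SparkContext is actually using; calling "conf.set" at this
point would not change it:

%spark
// Show the effective compression codec of the running SparkContext.
// The fallback string below is only an assumption about the default.
println(sc.getConf.get("spark.io.compression.codec", "lz4 (Spark default)"))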

Hope this helps,
Jongyoul

On Fri, Feb 17, 2017 at 12:52 AM, Muhammad Rezaul Karim <reza_cse_du@yahoo.com> wrote:

> Hi Lee,
>
> Thanks for the info; that really helped. I set the compression codec on
> the Spark side, i.e. inside SPARK_HOME, and now the problem is resolved.
> However, I was wondering if it's possible to set the same thing from the
> Zeppelin notebook.
>
> I tried it in the following way:
>
> %spark conf.set("spark.io.compression.codec", "lz4")
>
> But I am getting an error. Please advise.
>
> [earlier messages in the thread and the quoted stack trace snipped]


-- 
이종열, Jongyoul Lee, 李宗烈
http://madeng.net

Re: NoSuchMethodException in SQL

Posted by Muhammad Rezaul Karim <re...@yahoo.com>.
Hi Lee,
Thanks for the info; that really helped. I set the compression codec on the Spark side, i.e. inside SPARK_HOME, and now the problem is resolved. However, I was wondering if it's possible to set the same thing from the Zeppelin notebook.
I tried it in the following way:

%spark conf.set("spark.io.compression.codec", "lz4") 

But I am getting an error. Please advise.



On Thursday, February 16, 2017 at 7:40 AM, Jongyoul Lee <jo...@gmail.com> wrote:

Hi, can you check whether the script passes in spark-shell or not? AFAIK, you have to add the compression codec yourself on the Spark side.

[quoted original message, stack trace, and signature snipped]

Re: NoSuchMethodException in SQL

Posted by Jongyoul Lee <jo...@gmail.com>.
Hi, can you check whether the script passes in spark-shell or not? AFAIK,
you have to add the compression codec yourself on the Spark side.
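
For example, a minimal sketch (assuming a standard Spark installation, and
using "lz4" purely as an illustration) would be to add the property to
$SPARK_HOME/conf/spark-defaults.conf:

    spark.io.compression.codec    lz4

or to pass it on the command line when testing in spark-shell:

    ./bin/spark-shell --conf spark.io.compression.codec=lz4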

On Wed, Feb 15, 2017 at 1:10 AM, Muhammad Rezaul Karim <reza_cse_du@yahoo.com> wrote:

>
> [quoted original message and stack trace snipped; see the first post in this thread]



-- 
이종열, Jongyoul Lee, 李宗烈
http://madeng.net