Posted to users@zeppelin.apache.org by mina lee <mi...@apache.org> on 2016/07/06 07:14:55 UTC

[ANNOUNCE] Apache Zeppelin 0.6.0 released

The Apache Zeppelin community is pleased to announce the availability of
the 0.6.0 release.

Zeppelin is a collaborative data analytics and visualization tool for
distributed, general-purpose data processing systems such as Apache Spark
and Apache Flink.

The community has put significant effort into improving Apache Zeppelin
since the last release, focusing on new backend support and on
authentication and authorization for enterprise use. More than 70
contributors provided over 360 patches for new features, improvements, and
bug fixes, and more than 200 issues have been resolved.

We encourage you to download the latest release from
http://zeppelin.apache.org/download.html

Release notes are available at
http://zeppelin.apache.org/releases/zeppelin-release-0.6.0.html

We welcome your help and feedback. For more information on the project and
how to get involved, visit our website at http://zeppelin.apache.org/

Thanks to all users and contributors who have helped to improve Apache
Zeppelin.

Regards,
The Apache Zeppelin community

Re: [ANNOUNCE] Apache Zeppelin 0.6.0 released

Posted by Jeff Steinmetz <je...@gmail.com>.
Great work - kudos to the 70+ contributors!

Best
Jeff Steinmetz





On 7/6/16, 12:14 AM, "mina lee" <mi...@apache.org> wrote:

>The Apache Zeppelin community is pleased to announce the availability of
>the 0.6.0 release.
>
>Zeppelin is a collaborative data analytics and visualization tool for
>distributed, general-purpose data processing systems such as Apache Spark
>and Apache Flink.
>
>The community has put significant effort into improving Apache Zeppelin
>since the last release, focusing on new backend support and on
>authentication and authorization for enterprise use. More than 70
>contributors provided over 360 patches for new features, improvements, and
>bug fixes, and more than 200 issues have been resolved.
>
>We encourage you to download the latest release from
>http://zeppelin.apache.org/download.html
>
>Release notes are available at
>http://zeppelin.apache.org/releases/zeppelin-release-0.6.0.html
>
>We welcome your help and feedback. For more information on the project and
>how to get involved, visit our website at http://zeppelin.apache.org/
>
>Thanks to all users and contributors who have helped to improve Apache
>Zeppelin.
>
>Regards,
>The Apache Zeppelin community


Re: Zeppelin 0.6.0 on CDH 5.7.1

Posted by Felix Cheung <fe...@hotmail.com>.
Spark needs hive-site.xml to be on its classpath, hence I'd suggest putting a copy under SPARK_HOME/conf. I don't recall zeppelin/conf being on the classpath, and that is probably why it doesn't work.
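
For reference, a minimal sketch of that fix for a typical CDH layout (the /etc/hive/conf source path and the SPARK_HOME and ZEPPELIN_HOME locations here are assumptions; adjust for your install):

    # Put the Hive metastore config on Spark's classpath by copying it
    # into Spark's conf directory (assumed CDH paths).
    cp /etc/hive/conf/hive-site.xml "$SPARK_HOME/conf/"

    # Restart Zeppelin so the interpreter processes pick up the change.
    "$ZEPPELIN_HOME/bin/zeppelin-daemon.sh" restart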

Do you have the exception stack for %sql not working?
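
(If the paragraph output doesn't show it, the full trace usually lands in the Spark interpreter's log; a sketch, assuming the default logs/ directory - the exact file name suffix varies by installation:)

    # Inspect the most recent Spark interpreter log for the %sql stack trace.
    tail -n 200 "$ZEPPELIN_HOME"/logs/zeppelin-interpreter-spark-*.log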


_____________________________
From: Benjamin Kim <bb...@gmail.com>
Sent: Monday, July 11, 2016 9:17 PM
Subject: Re: Zeppelin 0.6.0 on CDH 5.7.1
To: Felix Cheung <fe...@hotmail.com>
Cc: <de...@zeppelin.apache.org>


That fixed the problem with using SQLContext in Scala code; that error went away. But using Spark SQL (%spark.sql) still produces the error below. Is there somewhere else hive-site.xml is needed?

Thanks,
Ben

On Jul 11, 2016, at 8:56 PM, Felix Cheung <fe...@hotmail.com> wrote:

This error looks different: it doesn't have JavaDeserializationStream on the stack; instead it has hive.client.ClientWrapper.conf.

Do you still have hive-site.xml under SPARK_HOME/conf?
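
(A quick sanity check, assuming SPARK_HOME points at the Spark install Zeppelin uses:)

    # Verify the Hive config is still where Spark expects it.
    ls -l "$SPARK_HOME/conf/hive-site.xml"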


_____________________________
From: Benjamin Kim <bb...@gmail.com>
Sent: Monday, July 11, 2016 3:20 PM
Subject: Re: Zeppelin 0.6.0 on CDH 5.7.1
To: Felix Cheung <fe...@hotmail.com>
Cc: <de...@zeppelin.apache.org>


It doesn't look like mismatched versions. I tried local[*] mode and got the same error again, just for Spark SQL (SQLContext).

java.lang.NullPointerException
at org.apache.spark.sql.hive.client.ClientWrapper.conf(ClientWrapper.scala:205)
at org.apache.spark.sql.hive.client.ClientWrapper.client(ClientWrapper.scala:261)
at org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$withHiveState$1.apply(ClientWrapper.scala:273)
at org.apache.spark.sql.hive.client.ClientWrapper.liftedTree1$1(ClientWrapper.scala:228)
at org.apache.spark.sql.hive.client.ClientWrapper.retryLocked(ClientWrapper.scala:227)
at org.apache.spark.sql.hive.client.ClientWrapper.withHiveState(ClientWrapper.scala:270)
at org.apache.spark.sql.hive.HiveQLDialect.parse(HiveContext.scala:65)
at org.apache.spark.sql.SQLContext$$anonfun$2.apply(SQLContext.scala:211)
at org.apache.spark.sql.SQLContext$$anonfun$2.apply(SQLContext.scala:211)
at org.apache.spark.sql.execution.SparkSQLParser$$anonfun$org$apache$spark$sql$execution$SparkSQLParser$$others$1.apply(SparkSQLParser.scala:114)
at org.apache.spark.sql.execution.SparkSQLParser$$anonfun$org$apache$spark$sql$execution$SparkSQLParser$$others$1.apply(SparkSQLParser.scala:113)
at scala.util.parsing.combinator.Parsers$Success.map(Parsers.scala:136)
at scala.util.parsing.combinator.Parsers$Success.map(Parsers.scala:135)
at scala.util.parsing.combinator.Parsers$Parser$$anonfun$map$1.apply(Parsers.scala:242)
at scala.util.parsing.combinator.Parsers$Parser$$anonfun$map$1.apply(Parsers.scala:242)
at scala.util.parsing.combinator.Parsers$$anon$3.apply(Parsers.scala:222)
at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1$$anonfun$apply$2.apply(Parsers.scala:254)
at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1$$anonfun$apply$2.apply(Parsers.scala:254)
at scala.util.parsing.combinator.Parsers$Failure.append(Parsers.scala:202)
at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1.apply(Parsers.scala:254)
at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1.apply(Parsers.scala:254)
at scala.util.parsing.combinator.Parsers$$anon$3.apply(Parsers.scala:222)
at scala.util.parsing.combinator.Parsers$$anon$2$$anonfun$apply$14.apply(Parsers.scala:891)
at scala.util.parsing.combinator.Parsers$$anon$2$$anonfun$apply$14.apply(Parsers.scala:891)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
at scala.util.parsing.combinator.Parsers$$anon$2.apply(Parsers.scala:890)
at scala.util.parsing.combinator.PackratParsers$$anon$1.apply(PackratParsers.scala:110)
at org.apache.spark.sql.catalyst.AbstractSparkSQLParser.parse(AbstractSparkSQLParser.scala:34)
at org.apache.spark.sql.SQLContext$$anonfun$1.apply(SQLContext.scala:208)
at org.apache.spark.sql.SQLContext$$anonfun$1.apply(SQLContext.scala:208)
at org.apache.spark.sql.execution.datasources.DDLParser.parse(DDLParser.scala:43)
at org.apache.spark.sql.SQLContext.parseSql(SQLContext.scala:231)
at org.apache.spark.sql.hive.HiveContext.parseSql(HiveContext.scala:333)
at org.apache.spark.sql.SQLContext.sql(SQLContext.scala:817)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.zeppelin.spark.SparkSqlInterpreter.interpret(SparkSqlInterpreter.java:117)
at org.apache.zeppelin.interpreter.LazyOpenInterpreter.interpret(LazyOpenInterpreter.java:94)
at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:341)
at org.apache.zeppelin.scheduler.Job.run(Job.java:176)
at org.apache.zeppelin.scheduler.ParallelScheduler$JobRunner.run(ParallelScheduler.java:162)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)

Thanks,
Ben


On Jul 9, 2016, at 11:43 PM, Felix Cheung <fe...@hotmail.com> wrote:

Right, it is possible we might encounter this with Spark 2.0.

-users@ for now

This error seems to be serialization related. Commonly it is caused by mismatched versions. What is spark.master set to? Could you try local[*] instead of yarn-client to see if the Spark run by Zeppelin is somehow different?
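
(One way to try that, sketched against conf/zeppelin-env.sh; the same value can also be set on the Spark interpreter's master property in the Zeppelin UI:)

    # In $ZEPPELIN_HOME/conf/zeppelin-env.sh: run Spark locally instead of
    # on YARN to rule out cluster-side version mismatches.
    export MASTER='local[*]'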

java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
 at java.io.ObjectInputStream.readObject(ObjectInputStream.java:371)
 at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:76)
 at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:115)
 at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:64)


_____________________________
From: Benjamin Kim <bb...@gmail.com>
Sent: Saturday, July 9, 2016 10:54 PM
Subject: Re: [ANNOUNCE] Apache Zeppelin 0.6.0 released
To: <us...@zeppelin.apache.org>
Cc: <de...@zeppelin.apache.org>


Hi JL,

Spark is version 1.6.0 and Akka is 2.2.3. But Cloudera always backports things from newer versions; they told me they ported some bug fixes from Spark 2.0.
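
(A quick way to confirm what the cluster actually ships, using standard commands:)

    # Print the Spark version bundled with the cluster; on CDH the version
    # string includes the Cloudera build (e.g. 1.6.0-cdh5.7.1).
    spark-submit --version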

Please let me know if you need any more information.

Cheers,
Ben


On Jul 9, 2016, at 10:12 PM, Jongyoul Lee <jo...@gmail.com> wrote:

Hi all,

Could you check CDH's version of Spark? When I tested it a while ago, it was a little different from the vanilla one; for example, CDH's build has different versions of some dependencies, including Akka.

Regards,
JL

On Sat, Jul 9, 2016 at 11:47 PM, Benjamin Kim <bb...@gmail.com> wrote:
Felix,

I added hive-site.xml to the conf directory and restarted Zeppelin. Now, I get another error:

java.lang.ClassNotFoundException: line1631424043$24.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$anonfun$1
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.spark.serializer.JavaDeserializationStream$$anon$1.resolveClass(JavaSerializer.scala:68)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1613)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1518)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1774)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:371)
at scala.collection.immutable.$colon$colon.readObject(List.scala:362)
at sun.reflect.GeneratedMethodAccessor1.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1900)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:371)
at scala.collection.immutable.$colon$colon.readObject(List.scala:362)
at sun.reflect.GeneratedMethodAccessor1.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1900)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:371)
at scala.collection.immutable.$colon$colon.readObject(List.scala:362)
at sun.reflect.GeneratedMethodAccessor1.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1900)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:371)
at scala.collection.immutable.$colon$colon.readObject(List.scala:362)
at sun.reflect.GeneratedMethodAccessor1.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1900)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:371)
at scala.collection.immutable.$colon$colon.readObject(List.scala:362)
at sun.reflect.GeneratedMethodAccessor1.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1900)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:371)
at scala.collection.immutable.$colon$colon.readObject(List.scala:362)
at sun.reflect.GeneratedMethodAccessor1.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1900)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:371)
at scala.collection.immutable.$colon$colon.readObject(List.scala:362)
at sun.reflect.GeneratedMethodAccessor1.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1900)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:371)
at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:76)
at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:115)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:64)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:41)
at org.apache.spark.scheduler.Task.run(Task.scala:89)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)

Thanks for helping.

Ben


On Jul 8, 2016, at 10:47 PM, Felix Cheung <fe...@hotmail.com> wrote:

For #1, do you know if Spark can find the Hive metastore config (typically hive-site.xml)? Spark's log should indicate that.


_____________________________
From: Benjamin Kim <bb...@gmail.com>
Sent: Friday, July 8, 2016 6:53 AM
Subject: Re: [ANNOUNCE] Apache Zeppelin 0.6.0 released
To: <us...@zeppelin.apache.org>
Cc: <de...@zeppelin.apache.org>


Felix,

I forgot to add that I built Zeppelin from source (http://mirrors.ibiblio.org/apache/zeppelin/zeppelin-0.6.0/zeppelin-0.6.0.tgz) using this command: "mvn clean package -DskipTests -Pspark-1.6 -Phadoop-2.6 -Dspark.version=1.6.0-cdh5.7.1 -Dhadoop.version=2.6.0-cdh5.7.1 -Ppyspark -Pvendor-repo -Pbuild-distr -Dhbase.hbase.version=1.2.0-cdh5.7.1 -Dhbase.hadoop.version=2.6.0-cdh5.7.1".
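
(The same build command, wrapped across lines for readability:)

    mvn clean package -DskipTests \
      -Pspark-1.6 -Phadoop-2.6 \
      -Dspark.version=1.6.0-cdh5.7.1 \
      -Dhadoop.version=2.6.0-cdh5.7.1 \
      -Ppyspark -Pvendor-repo -Pbuild-distr \
      -Dhbase.hbase.version=1.2.0-cdh5.7.1 \
      -Dhbase.hadoop.version=2.6.0-cdh5.7.1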

I did this because we are using HBase 1.2 within CDH 5.7.1.

Hope this helps clarify.

Thanks,
Ben



On Jul 8, 2016, at 2:01 AM, Felix Cheung <fe...@hotmail.com> wrote:

Is this possibly caused by CDH requiring a build-from-source instead of the official binary releases?





On Thu, Jul 7, 2016 at 8:22 PM -0700, "Benjamin Kim" <bb...@gmail.com> wrote:

Moon,

My environment consists of an 18-node CentOS 6.7 cluster with 24 cores, 64GB of memory, and 12TB of storage per node:

  *   3 of those nodes are used as Zookeeper servers, HDFS name nodes, and a YARN resource manager
  *   15 are data nodes
  *   jdk1.8_60 and CDH 5.7.1 are installed

Another node is an app server with 24 cores, 128GB of memory, and 1TB of storage. It runs Zeppelin 0.6.0 and Livy 0.2.0, plus the Hive Metastore, HiveServer2, Hue, and Oozie from CDH 5.7.1.

This is our QA cluster where we are testing before deploying to production.

If you need more information, please let me know.

Thanks,
Ben



On Jul 7, 2016, at 7:54 PM, moon soo Lee <mo...@apache.org> wrote:

Randy,

Helium is not included in the 0.6.0 release. Could you check which version you are using?
I created a fix for the 500 errors from the Helium URL in the master branch: https://github.com/apache/zeppelin/pull/1150

Ben,
I cannot reproduce the error. Could you share how to reproduce it, or share your environment?

Thanks,
moon

On Thu, Jul 7, 2016 at 4:02 PM Randy Gelhausen <rg...@gmail.com> wrote:
I don't - I hoped providing that information might help find and fix the problem.

On Thu, Jul 7, 2016 at 5:53 PM, Benjamin Kim <bb...@gmail.com> wrote:
Hi Randy,

Do you know of any way to fix it or know of a workaround?

Thanks,
Ben

On Jul 7, 2016, at 2:08 PM, Randy Gelhausen <rg...@gmail.com> wrote:

HTTP 500 errors from a Helium URL










--
Jongyoul Lee
http://madeng.net










> at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
> at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
> at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
> at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
> at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
> at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
> at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
> at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
> at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
> at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
> at java.io.ObjectInputStream.readObject(ObjectInputStream.java:371)
> at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:76)
> at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:115)
> at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:64)
> at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:41)
> at org.apache.spark.scheduler.Task.run(Task.scala:89)
> at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> at java.lang.Thread.run(Thread.java:745)
> 
> Thanks for helping.
> 
> Ben
> 
> 
> On Jul 8, 2016, at 10:47 PM, Felix Cheung <felixcheung_m@hotmail.com <ma...@hotmail.com>> wrote:
> 
> For #1, do you know if Spark can find the Hive metastore config (typically in hive-site.xml) - Spark's log should indicate that.
> 
> 
> _____________________________
> From: Benjamin Kim <bbuild11@gmail.com <ma...@gmail.com>>
> Sent: Friday, July 8, 2016 6:53 AM
> Subject: Re: [ANNOUNCE] Apache Zeppelin 0.6.0 released
> To: <users@zeppelin.apache.org <ma...@zeppelin.apache.org>>
> Cc: <dev@zeppelin.apache.org <ma...@zeppelin.apache.org>>
> 
> 
> Felix,
> 
> I forgot to add that I built Zeppelin from the source tarball http://mirrors.ibiblio.org/apache/zeppelin/zeppelin-0.6.0/zeppelin-0.6.0.tgz <http://mirrors.ibiblio.org/apache/zeppelin/zeppelin-0.6.0/zeppelin-0.6.0.tgz> using this command "mvn clean package -DskipTests -Pspark-1.6 -Phadoop-2.6 -Dspark.version=1.6.0-cdh5.7.1 -Dhadoop.version=2.6.0-cdh5.7.1 -Ppyspark -Pvendor-repo -Pbuild-distr -Dhbase.hbase.version=1.2.0-cdh5.7.1 -Dhbase.hadoop.version=2.6.0-cdh5.7.1”.
> 
> I did this because we are using HBase 1.2 within CDH 5.7.1.
> 
> Hope this helps clarify.
> 
> Thanks,
> Ben
> 
> 
> 
> On Jul 8, 2016, at 2:01 AM, Felix Cheung <felixcheung_m@hotmail.com <ma...@hotmail.com>> wrote:
> 
> Is this possibly caused by CDH requiring a build-from-source instead of the official binary releases?
> 
> 
> 
> 
> 
> On Thu, Jul 7, 2016 at 8:22 PM -0700, "Benjamin Kim" <bbuild11@gmail.com <ma...@gmail.com>> wrote:
> 
> Moon,
> 
> My environment consists of an 18-node CentOS 6.7 cluster with 24 cores, 64GB of memory, and 12TB of storage per node:
> 3 of those nodes are used as Zookeeper servers, HDFS name nodes, and a YARN resource manager
> 15 are for data nodes
> jdk1.8_60 and CDH 5.7.1 installed
> 
> Another node is an app server with 24 cores, 128GB memory, and 1TB storage. It has Zeppelin 0.6.0 and Livy 0.2.0 running on it, and also runs the Hive Metastore, HiveServer2, Hue, and Oozie from CDH 5.7.1.
> 
> This is our QA cluster where we are testing before deploying to production.
> 
> If you need more information, please let me know.
> 
> Thanks,
> Ben
> 
>  
> 
> On Jul 7, 2016, at 7:54 PM, moon soo Lee <moon@apache.org <ma...@apache.org>> wrote:
> 
> Randy,
> 
> Helium is not included in the 0.6.0 release. Could you check which version you are using?
> I created a fix for the 500 errors from the Helium URL in the master branch: https://github.com/apache/zeppelin/pull/1150 <https://github.com/apache/zeppelin/pull/1150>
> 
> Ben,
> I cannot reproduce the error. Could you share how to reproduce it, or describe your environment?
> 
> Thanks,
> moon
> 
> On Thu, Jul 7, 2016 at 4:02 PM Randy Gelhausen <rgelhau@gmail.com <ma...@gmail.com>> wrote:
> I don't. I hoped providing that information might help with finding & fixing the problem.
> 
> On Thu, Jul 7, 2016 at 5:53 PM, Benjamin Kim <bbuild11@gmail.com <ma...@gmail.com>> wrote:
> Hi Randy,
> 
> Do you know of any way to fix it or know of a workaround?
> 
> Thanks,
> Ben
> 
> On Jul 7, 2016, at 2:08 PM, Randy Gelhausen <rgelhau@gmail.com <ma...@gmail.com>> wrote:
> 
> HTTP 500 errors from a Helium URL
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> -- 
> 이종열, Jongyoul Lee, 李宗烈
> http://madeng.net <http://madeng.net/>
> 
> 
> 


Zeppelin 0.6.0 on CDH 5.7.1

Posted by Felix Cheung <fe...@hotmail.com>.
Right, it is possible we will encounter this with Spark 2.0.

-users@ for now

This error seems to be serialization related. Commonly this is caused by mismatched versions. What is spark.master set to? Could you try local[*] instead of yarn-client to see if the Spark instance run by Zeppelin is somehow different?

java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
 at java.io.ObjectInputStream.readObject(ObjectInputStream.java:371)
 at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:76)
 at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:115)
 at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:64)
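
One way to try that (a sketch; MASTER in conf/zeppelin-env.sh is the standard knob, and the SPARK_HOME path below is an assumed CDH parcel location, not confirmed from this thread):

    # in ZEPPELIN_HOME/conf/zeppelin-env.sh, then restart Zeppelin
    export MASTER=local[*]                                  # instead of yarn-client
    export SPARK_HOME=/opt/cloudera/parcels/CDH/lib/spark   # assumed parcel path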


_____________________________
From: Benjamin Kim <bb...@gmail.com>>
Sent: Saturday, July 9, 2016 10:54 PM
Subject: Re: [ANNOUNCE] Apache Zeppelin 0.6.0 released
To: <us...@zeppelin.apache.org>>
Cc: <de...@zeppelin.apache.org>>


Hi JL,

Spark is version 1.6.0 and Akka is 2.2.3. But Cloudera always backports things from newer versions; they told me that they ported some bug fixes from Spark 2.0.

Please let me know if you need any more information.

Cheers,
Ben


On Jul 9, 2016, at 10:12 PM, Jongyoul Lee <jo...@gmail.com>> wrote:

Hi all,

Could you guys check the CDH version of Spark? When I tested it a while ago, it was a little different from the vanilla one; for example, the CDH build ships different versions of some dependencies, including Akka.

Regards,
JL
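
A quick way to see which build of a suspect dependency the Zeppelin interpreter actually loaded, using plain JVM reflection in a notebook paragraph (a sketch; any class from the dependency in question works the same way):

    %spark
    // print the jar this class was loaded from; the file name carries the
    // version, which shows whether it is the CDH or the vanilla artifact
    classOf[akka.actor.ActorSystem].getProtectionDomain.getCodeSource.getLocation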

On Sat, Jul 9, 2016 at 11:47 PM, Benjamin Kim <bb...@gmail.com>> wrote:
Felix,

I added hive-site.xml to the conf directory and restarted Zeppelin. Now, I get another error:

java.lang.ClassNotFoundException: line1631424043$24.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$anonfun$1
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.spark.serializer.JavaDeserializationStream$$anon$1.resolveClass(JavaSerializer.scala:68)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1613)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1518)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1774)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:371)
at scala.collection.immutable.$colon$colon.readObject(List.scala:362)
at sun.reflect.GeneratedMethodAccessor1.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1900)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:371)
at scala.collection.immutable.$colon$colon.readObject(List.scala:362)
at sun.reflect.GeneratedMethodAccessor1.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1900)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:371)
at scala.collection.immutable.$colon$colon.readObject(List.scala:362)
at sun.reflect.GeneratedMethodAccessor1.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1900)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:371)
at scala.collection.immutable.$colon$colon.readObject(List.scala:362)
at sun.reflect.GeneratedMethodAccessor1.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1900)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:371)
at scala.collection.immutable.$colon$colon.readObject(List.scala:362)
at sun.reflect.GeneratedMethodAccessor1.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1900)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:371)
at scala.collection.immutable.$colon$colon.readObject(List.scala:362)
at sun.reflect.GeneratedMethodAccessor1.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1900)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:371)
at scala.collection.immutable.$colon$colon.readObject(List.scala:362)
at sun.reflect.GeneratedMethodAccessor1.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1900)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:371)
at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:76)
at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:115)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:64)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:41)
at org.apache.spark.scheduler.Task.run(Task.scala:89)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)

Thanks for helping.

Ben
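
For context, the $$anonfun$1 class in the trace above is the kind the Scala REPL generates for a closure in a notebook paragraph. A minimal sketch of code that produces such a class (the table name is hypothetical):

    %spark
    val df = sqlContext.sql("SELECT * FROM some_table")  // hypothetical table
    // this lambda compiles to line...$$anonfun$1; executors must be able to
    // fetch that class from the driver's REPL class output, so a version or
    // classpath mismatch surfaces as exactly this ClassNotFoundException
    df.map(_.getString(0)).take(5)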


On Jul 8, 2016, at 10:47 PM, Felix Cheung <fe...@hotmail.com>> wrote:

For #1, do you know if Spark can find the Hive metastore config (typically in hive-site.xml) - Spark's log should indicate that.
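
A one-paragraph way to check this from a notebook (a sketch; with hive-site.xml on the classpath the listing should show the Hive metastore tables, while a fallback local metastore typically shows none):

    %spark
    // lists the tables visible to the HiveContext; an unexpectedly empty
    // list usually means Spark fell back to a local metastore
    sqlContext.sql("SHOW TABLES").show()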


_____________________________
From: Benjamin Kim <bb...@gmail.com>>
Sent: Friday, July 8, 2016 6:53 AM
Subject: Re: [ANNOUNCE] Apache Zeppelin 0.6.0 released
To: <us...@zeppelin.apache.org>>
Cc: <de...@zeppelin.apache.org>>


Felix,

I forgot to add that I built Zeppelin from the source tarball http://mirrors.ibiblio.org/apache/zeppelin/zeppelin-0.6.0/zeppelin-0.6.0.tgz using this command "mvn clean package -DskipTests -Pspark-1.6 -Phadoop-2.6 -Dspark.version=1.6.0-cdh5.7.1 -Dhadoop.version=2.6.0-cdh5.7.1 -Ppyspark -Pvendor-repo -Pbuild-distr -Dhbase.hbase.version=1.2.0-cdh5.7.1 -Dhbase.hadoop.version=2.6.0-cdh5.7.1”.

I did this because we are using HBase 1.2 within CDH 5.7.1.
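
One quick sanity check that the notebook is actually running this custom build (a sketch):

    %spark
    sc.version                      // expect 1.6.0, matching -Dspark.version above
    sc.getConf.get("spark.master")  // confirms yarn-client vs local[*]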

Hope this helps clarify.

Thanks,
Ben



On Jul 8, 2016, at 2:01 AM, Felix Cheung <fe...@hotmail.com>> wrote:

Is this possibly caused by CDH requiring a build-from-source instead of the official binary releases?





On Thu, Jul 7, 2016 at 8:22 PM -0700, "Benjamin Kim" <bb...@gmail.com>> wrote:

Moon,

My environment consists of an 18-node CentOS 6.7 cluster with 24 cores, 64GB of memory, and 12TB of storage per node:

  *   3 of those nodes are used as Zookeeper servers, HDFS name nodes, and a YARN resource manager
  *   15 are for data nodes
  *   jdk1.8_60 and CDH 5.7.1 installed

Another node is an app server with 24 cores, 128GB memory, and 1TB storage. It has Zeppelin 0.6.0 and Livy 0.2.0 running on it, and also runs the Hive Metastore, HiveServer2, Hue, and Oozie from CDH 5.7.1.

This is our QA cluster where we are testing before deploying to production.

If you need more information, please let me know.

Thanks,
Ben



On Jul 7, 2016, at 7:54 PM, moon soo Lee <mo...@apache.org>> wrote:

Randy,

Helium is not included in the 0.6.0 release. Could you check which version you are using?
I created a fix for the 500 errors from the Helium URL in the master branch: https://github.com/apache/zeppelin/pull/1150

Ben,
I cannot reproduce the error. Could you share how to reproduce it, or describe your environment?

Thanks,
moon

On Thu, Jul 7, 2016 at 4:02 PM Randy Gelhausen <rg...@gmail.com>> wrote:
I don't. I hoped providing that information might help with finding & fixing the problem.

On Thu, Jul 7, 2016 at 5:53 PM, Benjamin Kim <bb...@gmail.com>> wrote:
Hi Randy,

Do you know of any way to fix it or know of a workaround?

Thanks,
Ben

On Jul 7, 2016, at 2:08 PM, Randy Gelhausen <rg...@gmail.com>> wrote:

HTTP 500 errors from a Helium URL










--
이종열, Jongyoul Lee, 李宗烈
http://madeng.net <http://madeng.net/>




Re: [ANNOUNCE] Apache Zeppelin 0.6.0 released

Posted by Benjamin Kim <bb...@gmail.com>.
Hi JL,

Spark is version 1.6.0 and Akka is 2.2.3. But Cloudera always backports things from newer versions; they told me that they ported some bug fixes from Spark 2.0.

Please let me know if you need any more information.

Cheers,
Ben


> On Jul 9, 2016, at 10:12 PM, Jongyoul Lee <jo...@gmail.com> wrote:
> 
> Hi all,
> 
> Could you guys check the CDH version of Spark? When I tested it a while ago, it was a little different from the vanilla one; for example, the CDH build ships different versions of some dependencies, including Akka.
> 
> Regards,
> JL
> 
> On Sat, Jul 9, 2016 at 11:47 PM, Benjamin Kim <bbuild11@gmail.com <ma...@gmail.com>> wrote:
> Felix,
> 
> I added hive-site.xml to the conf directory and restarted Zeppelin. Now, I get another error:
> 
> java.lang.ClassNotFoundException: line1631424043$24.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$anonfun$1
> 	at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
> 	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
> 	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
> 	at java.lang.Class.forName0(Native Method)
> 	at java.lang.Class.forName(Class.java:348)
> 	at org.apache.spark.serializer.JavaDeserializationStream$$anon$1.resolveClass(JavaSerializer.scala:68)
> 	at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1613)
> 	at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1518)
> 	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1774)
> 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
> 	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
> 	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
> 	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
> 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
> 	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
> 	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
> 	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
> 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
> 	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
> 	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
> 	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
> 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
> 	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:371)
> 	at scala.collection.immutable.$colon$colon.readObject(List.scala:362)
> 	at sun.reflect.GeneratedMethodAccessor1.invoke(Unknown Source)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:497)
> 	at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
> 	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1900)
> 	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
> 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
> 	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
> 	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
> 	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
> 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
> 	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
> 	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
> 	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
> 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
> 	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:371)
> 	at scala.collection.immutable.$colon$colon.readObject(List.scala:362)
> 	at sun.reflect.GeneratedMethodAccessor1.invoke(Unknown Source)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:497)
> 	at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
> 	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1900)
> 	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
> 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
> 	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
> 	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
> 	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
> 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
> 	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
> 	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
> 	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
> 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
> 	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:371)
> 	at scala.collection.immutable.$colon$colon.readObject(List.scala:362)
> 	at sun.reflect.GeneratedMethodAccessor1.invoke(Unknown Source)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:497)
> 	at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
> 	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1900)
> 	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
> 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
> 	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
> 	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
> 	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
> 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
> 	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
> 	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
> 	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
> 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
> 	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:371)
> 	at scala.collection.immutable.$colon$colon.readObject(List.scala:362)
> 	at sun.reflect.GeneratedMethodAccessor1.invoke(Unknown Source)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:497)
> 	at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
> 	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1900)
> 	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
> 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
> 	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
> 	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
> 	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
> 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
> 	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
> 	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
> 	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
> 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
> 	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:371)
> 	at scala.collection.immutable.$colon$colon.readObject(List.scala:362)
> 	at sun.reflect.GeneratedMethodAccessor1.invoke(Unknown Source)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:497)
> 	at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
> 	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1900)
> 	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
> 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
> 	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
> 	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
> 	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
> 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
> 	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
> 	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
> 	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
> 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
> 	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:371)
> 	at scala.collection.immutable.$colon$colon.readObject(List.scala:362)
> 	at sun.reflect.GeneratedMethodAccessor1.invoke(Unknown Source)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:497)
> 	at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
> 	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1900)
> 	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
> 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
> 	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
> 	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
> 	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
> 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
> 	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
> 	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
> 	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
> 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
> 	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:371)
> 	at scala.collection.immutable.$colon$colon.readObject(List.scala:362)
> 	at sun.reflect.GeneratedMethodAccessor1.invoke(Unknown Source)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:497)
> 	at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
> 	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1900)
> 	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
> 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
> 	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
> 	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
> 	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
> 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
> 	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
> 	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
> 	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
> 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
> 	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:371)
> 	at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:76)
> 	at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:115)
> 	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:64)
> 	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:41)
> 	at org.apache.spark.scheduler.Task.run(Task.scala:89)
> 	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
> 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> 	at java.lang.Thread.run(Thread.java:745)
> 
> Thanks for helping.
> 
> Ben
> 
> 
>> On Jul 8, 2016, at 10:47 PM, Felix Cheung <felixcheung_m@hotmail.com <ma...@hotmail.com>> wrote:
>> 
>> For #1, do you know if Spark can find the Hive metastore config (typically in hive-site.xml) - Spark's log should indicate that.
>> 
>> 
>> _____________________________
>> From: Benjamin Kim <bbuild11@gmail.com <ma...@gmail.com>>
>> Sent: Friday, July 8, 2016 6:53 AM
>> Subject: Re: [ANNOUNCE] Apache Zeppelin 0.6.0 released
>> To: <users@zeppelin.apache.org <ma...@zeppelin.apache.org>>
>> Cc: <dev@zeppelin.apache.org <ma...@zeppelin.apache.org>>
>> 
>> 
>> Felix,
>> 
>> I forgot to add that I built Zeppelin from the source tarball http://mirrors.ibiblio.org/apache/zeppelin/zeppelin-0.6.0/zeppelin-0.6.0.tgz <http://mirrors.ibiblio.org/apache/zeppelin/zeppelin-0.6.0/zeppelin-0.6.0.tgz> using this command "mvn clean package -DskipTests -Pspark-1.6 -Phadoop-2.6 -Dspark.version=1.6.0-cdh5.7.1 -Dhadoop.version=2.6.0-cdh5.7.1 -Ppyspark -Pvendor-repo -Pbuild-distr -Dhbase.hbase.version=1.2.0-cdh5.7.1 -Dhbase.hadoop.version=2.6.0-cdh5.7.1”.
>> 
>> I did this because we are using HBase 1.2 within CDH 5.7.1.
>> 
>> Hope this helps clarify.
>> 
>> Thanks,
>> Ben
>> 
>> 
>> 
>> On Jul 8, 2016, at 2:01 AM, Felix Cheung <felixcheung_m@hotmail.com <ma...@hotmail.com>> wrote:
>> 
>> Is this possibly caused by CDH requiring a build-from-source instead of the official binary releases?
>> 
>> 
>> 
>> 
>> 
>> On Thu, Jul 7, 2016 at 8:22 PM -0700, "Benjamin Kim" <bbuild11@gmail.com <ma...@gmail.com>> wrote:
>> 
>> Moon,
>> 
>> My environment consists of an 18-node CentOS 6.7 cluster with 24 cores, 64GB of memory, and 12TB of storage per node:
>> 3 of those nodes are used as Zookeeper servers, HDFS name nodes, and a YARN resource manager
>> 15 are for data nodes
>> jdk1.8_60 and CDH 5.7.1 installed
>> 
>> Another node is an app server with 24 cores, 128GB memory, and 1TB storage. It has Zeppelin 0.6.0 and Livy 0.2.0 running on it, and also runs the Hive Metastore, HiveServer2, Hue, and Oozie from CDH 5.7.1.
>> 
>> This is our QA cluster where we are testing before deploying to production.
>> 
>> If you need more information, please let me know.
>> 
>> Thanks,
>> Ben
>> 
>>  
>> 
>> On Jul 7, 2016, at 7:54 PM, moon soo Lee <moon@apache.org <ma...@apache.org>> wrote:
>> 
>> Randy,
>> 
>> Helium is not included in the 0.6.0 release. Could you check which version you are using?
>> I created a fix for the 500 errors from the Helium URL in the master branch: https://github.com/apache/zeppelin/pull/1150 <https://github.com/apache/zeppelin/pull/1150>
>> 
>> Ben,
>> I cannot reproduce the error. Could you share how to reproduce it, or describe your environment?
>> 
>> Thanks,
>> moon
>> 
>> On Thu, Jul 7, 2016 at 4:02 PM Randy Gelhausen <rgelhau@gmail.com <ma...@gmail.com>> wrote:
>> I don't. I hoped providing that information might help with finding & fixing the problem.
>> 
>> On Thu, Jul 7, 2016 at 5:53 PM, Benjamin Kim <bbuild11@gmail.com <ma...@gmail.com>> wrote:
>> Hi Randy,
>> 
>> Do you know of any way to fix it or know of a workaround?
>> 
>> Thanks,
>> Ben
>> 
>> On Jul 7, 2016, at 2:08 PM, Randy Gelhausen <rgelhau@gmail.com <ma...@gmail.com>> wrote:
>> 
>> HTTP 500 errors from a Helium URL
>> 
>> 
>> 
>> 
>> 
>> 
> 
> 
> 
> 
> -- 
> 이종열, Jongyoul Lee, 李宗烈
> http://madeng.net <http://madeng.net/>


Re: [ANNOUNCE] Apache Zeppelin 0.6.0 released

Posted by Jongyoul Lee <jo...@gmail.com>.
Hi all,

Could you guys check the CDH version of Spark? When I tested it a while
ago, it was a little different from the vanilla one; for example, the CDH
build ships different versions of some dependencies, including Akka.

Regards,
JL

On Sat, Jul 9, 2016 at 11:47 PM, Benjamin Kim <bb...@gmail.com> wrote:

> Felix,
>
> I added hive-site.xml to the conf directory and restarted Zeppelin. Now, I
> get another error:
>
> java.lang.ClassNotFoundException:
> line1631424043$24.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$anonfun$1
> at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
> at java.lang.Class.forName0(Native Method)
> at java.lang.Class.forName(Class.java:348)
> at
> org.apache.spark.serializer.JavaDeserializationStream$$anon$1.resolveClass(JavaSerializer.scala:68)
> at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1613)
> at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1518)
> at
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1774)
> at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
> at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
> at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
> at
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
> at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
> at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
> at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
> at
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
> at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
> at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
> at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
> at
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
> at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
> at java.io.ObjectInputStream.readObject(ObjectInputStream.java:371)
> at scala.collection.immutable.$colon$colon.readObject(List.scala:362)
> at sun.reflect.GeneratedMethodAccessor1.invoke(Unknown Source)
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:497)
> at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
> at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1900)
> at
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
> at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
> at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
> at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
> at
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
> at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
> at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
> at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
> at
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
> at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
> at java.io.ObjectInputStream.readObject(ObjectInputStream.java:371)
> at scala.collection.immutable.$colon$colon.readObject(List.scala:362)
> at sun.reflect.GeneratedMethodAccessor1.invoke(Unknown Source)
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:497)
> at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
> at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1900)
> at
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
> at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
> at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
> at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
> at
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
> at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
> at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
> at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
> at
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
> at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
> at java.io.ObjectInputStream.readObject(ObjectInputStream.java:371)
> at scala.collection.immutable.$colon$colon.readObject(List.scala:362)
> at sun.reflect.GeneratedMethodAccessor1.invoke(Unknown Source)
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:497)
> at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
> at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1900)
> at
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
> at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
> at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
> at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
> at
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
> at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
> at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
> at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
> at
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
> at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
> at java.io.ObjectInputStream.readObject(ObjectInputStream.java:371)
> at scala.collection.immutable.$colon$colon.readObject(List.scala:362)
> at sun.reflect.GeneratedMethodAccessor1.invoke(Unknown Source)
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:497)
> at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
> at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1900)
> at
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
> at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
> at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
> at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
> at
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
> at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
> at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
> at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
> at
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
> at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
> at java.io.ObjectInputStream.readObject(ObjectInputStream.java:371)
> at scala.collection.immutable.$colon$colon.readObject(List.scala:362)
> at sun.reflect.GeneratedMethodAccessor1.invoke(Unknown Source)
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:497)
> at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
> at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1900)
> at
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
> at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
> at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
> at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
> at
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
> at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
> at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
> at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
> at
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
> at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
> at java.io.ObjectInputStream.readObject(ObjectInputStream.java:371)
> at scala.collection.immutable.$colon$colon.readObject(List.scala:362)
> at sun.reflect.GeneratedMethodAccessor1.invoke(Unknown Source)
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:497)
> at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
> at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1900)
> at
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
> at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
> at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
> at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
> at
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
> at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
> at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
> at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
> at
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
> at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
> at java.io.ObjectInputStream.readObject(ObjectInputStream.java:371)
> at scala.collection.immutable.$colon$colon.readObject(List.scala:362)
> at sun.reflect.GeneratedMethodAccessor1.invoke(Unknown Source)
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:497)
> at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
> at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1900)
> at
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
> at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
> at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
> at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
> at
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
> at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
> at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
> at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
> at
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
> at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
> at java.io.ObjectInputStream.readObject(ObjectInputStream.java:371)
> at
> org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:76)
> at
> org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:115)
> at
> org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:64)
> at
> org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:41)
> at org.apache.spark.scheduler.Task.run(Task.scala:89)
> at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
> at
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> at
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> at java.lang.Thread.run(Thread.java:745)
>
> Thanks for helping.
>
> Ben
>
>
> On Jul 8, 2016, at 10:47 PM, Felix Cheung <fe...@hotmail.com>
> wrote:
>
> For #1, do you know if Spark can find the Hive metastore config (typically
> in hive-site.xml) - Spark's log should indicate that.
>
>
> _____________________________
> From: Benjamin Kim <bb...@gmail.com>
> Sent: Friday, July 8, 2016 6:53 AM
> Subject: Re: [ANNOUNCE] Apache Zeppelin 0.6.0 released
> To: <us...@zeppelin.apache.org>
> Cc: <de...@zeppelin.apache.org>
>
>
> Felix,
>
> I forgot to add that I built Zeppelin from the source tarball
> http://mirrors.ibiblio.org/apache/zeppelin/zeppelin-0.6.0/zeppelin-0.6.0.tgz using
> this command "mvn clean package -DskipTests -Pspark-1.6 -Phadoop-2.6
> -Dspark.version=1.6.0-cdh5.7.1 -Dhadoop.version=2.6.0-cdh5.7.1 -Ppyspark
> -Pvendor-repo -Pbuild-distr -Dhbase.hbase.version=1.2.0-cdh5.7.1
> -Dhbase.hadoop.version=2.6.0-cdh5.7.1”.
>
> I did this because we are using HBase 1.2 within CDH 5.7.1.
>
> Hope this helps clarify.
>
> Thanks,
> Ben
>
>
>
> On Jul 8, 2016, at 2:01 AM, Felix Cheung <fe...@hotmail.com>
> wrote:
>
> Is this possibly caused by CDH requiring a build-from-source instead of
> the official binary releases?
>
>
>
>
>
> On Thu, Jul 7, 2016 at 8:22 PM -0700, "Benjamin Kim" <bb...@gmail.com>
> wrote:
>
> Moon,
>
> My environment consists of an 18-node CentOS 6.7 cluster with 24 cores,
> 64GB of memory, and 12TB of storage per node:
>
>    - 3 of those nodes are used as Zookeeper servers, HDFS name nodes, and
>    a YARN resource manager
>    - 15 are for data nodes
>    - jdk1.8_60 and CDH 5.7.1 installed
>
>
> Another node is an app server with 24 cores, 128GB memory, and 1TB storage.
> It has Zeppelin 0.6.0 and Livy 0.2.0 running on it, and also runs the Hive
> Metastore, HiveServer2, Hue, and Oozie from CDH 5.7.1.
>
> This is our QA cluster where we are testing before deploying to production.
>
> If you need more information, please let me know.
>
> Thanks,
> Ben
>
>
>
> On Jul 7, 2016, at 7:54 PM, moon soo Lee <mo...@apache.org> wrote:
>
> Randy,
>
> Helium is not included in the 0.6.0 release. Could you check which version
> you are using?
> I created a fix for the 500 errors from the Helium URL in the master branch:
> https://github.com/apache/zeppelin/pull/1150
>
> Ben,
> I cannot reproduce the error. Could you share how to reproduce it, or
> describe your environment?
>
> Thanks,
> moon
>
> On Thu, Jul 7, 2016 at 4:02 PM Randy Gelhausen <rg...@gmail.com> wrote:
>
>> I don't. I hoped providing that information might help with finding &
>> fixing the problem.
>>
>> On Thu, Jul 7, 2016 at 5:53 PM, Benjamin Kim <bb...@gmail.com> wrote:
>>
>>> Hi Randy,
>>>
>>> Do you know of any way to fix it or know of a workaround?
>>>
>>> Thanks,
>>> Ben
>>>
>>> On Jul 7, 2016, at 2:08 PM, Randy Gelhausen <rg...@gmail.com> wrote:
>>>
>>> HTTP 500 errors from a Helium URL
>>>
>>>
>>>
>>
>
>
>
>
>


-- 
이종열, Jongyoul Lee, 李宗烈
http://madeng.net


Re: [ANNOUNCE] Apache Zeppelin 0.6.0 released

Posted by Benjamin Kim <bb...@gmail.com>.
Felix,

I added hive-site.xml to the conf directory and restarted Zeppelin. Now, I get another error:

java.lang.ClassNotFoundException: line1631424043$24.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$anonfun$1
	at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at java.lang.Class.forName0(Native Method)
	at java.lang.Class.forName(Class.java:348)
	at org.apache.spark.serializer.JavaDeserializationStream$$anon$1.resolveClass(JavaSerializer.scala:68)
	at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1613)
	at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1518)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1774)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:371)
	at scala.collection.immutable.$colon$colon.readObject(List.scala:362)
	at sun.reflect.GeneratedMethodAccessor1.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:497)
	at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1900)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:371)
	at scala.collection.immutable.$colon$colon.readObject(List.scala:362)
	at sun.reflect.GeneratedMethodAccessor1.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:497)
	at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1900)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:371)
	at scala.collection.immutable.$colon$colon.readObject(List.scala:362)
	at sun.reflect.GeneratedMethodAccessor1.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:497)
	at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1900)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:371)
	at scala.collection.immutable.$colon$colon.readObject(List.scala:362)
	at sun.reflect.GeneratedMethodAccessor1.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:497)
	at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1900)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:371)
	at scala.collection.immutable.$colon$colon.readObject(List.scala:362)
	at sun.reflect.GeneratedMethodAccessor1.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:497)
	at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1900)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:371)
	at scala.collection.immutable.$colon$colon.readObject(List.scala:362)
	at sun.reflect.GeneratedMethodAccessor1.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:497)
	at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1900)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:371)
	at scala.collection.immutable.$colon$colon.readObject(List.scala:362)
	at sun.reflect.GeneratedMethodAccessor1.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:497)
	at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1900)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:371)
	at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:76)
	at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:115)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:64)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:41)
	at org.apache.spark.scheduler.Task.run(Task.scala:89)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
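
A quick way to narrow this down (a sketch that only uses the core Spark API, nothing specific to this setup) is to check whether any REPL-defined closure reaches the executors at all, and what version the driver reports:

println(sc.version)                          // driver-side Spark version
sc.parallelize(1 to 4).map(_ * 2).collect()  // ships a REPL-generated closure to the executors

If even that two-line job dies with the same ClassNotFoundException, the executors cannot load classes generated by the Zeppelin REPL, which usually points at a mismatch between the Spark build Zeppelin runs and the one on the cluster.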

Thanks for helping.

Ben


> On Jul 8, 2016, at 10:47 PM, Felix Cheung <fe...@hotmail.com> wrote:
> 
> For #1, do you know if Spark can find the Hive metastore config (typically in hive-site.xml) - Spark's log should indicate that.
> 
> 
> _____________________________
> From: Benjamin Kim <bbuild11@gmail.com>
> Sent: Friday, July 8, 2016 6:53 AM
> Subject: Re: [ANNOUNCE] Apache Zeppelin 0.6.0 released
> To: <users@zeppelin.apache.org>
> Cc: <dev@zeppelin.apache.org>
> 
> 
> Felix,
> 
> I forgot to add that I built Zeppelin from source http://mirrors.ibiblio.org/apache/zeppelin/zeppelin-0.6.0/zeppelin-0.6.0.tgz using this command "mvn clean package -DskipTests -Pspark-1.6 -Phadoop-2.6 -Dspark.version=1.6.0-cdh5.7.1 -Dhadoop.version=2.6.0-cdh5.7.1 -Ppyspark -Pvendor-repo -Pbuild-distr -Dhbase.hbase.version=1.2.0-cdh5.7.1 -Dhbase.hadoop.version=2.6.0-cdh5.7.1".
> 
> I did this because we are using HBase 1.2 within CDH 5.7.1.
> 
> Hope this helps clarify.
> 
> Thanks,
> Ben
> 
> 
> 
> On Jul 8, 2016, at 2:01 AM, Felix Cheung <felixcheung_m@hotmail.com> wrote:
> 
> Is this possibly caused by CDH requiring a build-from-source instead of the official binary releases?
> 
> 
> 
> 
> 
> On Thu, Jul 7, 2016 at 8:22 PM -0700, "Benjamin Kim" <bbuild11@gmail.com> wrote:
> 
> Moon,
> 
> My environmental setup consists of an 18 node CentOS 6.7 cluster with 24 cores, 64GB, 12TB storage each:
> 3 of those nodes are used as Zookeeper servers, HDFS name nodes, and a YARN resource manager
> 15 are for data nodes
> jdk1.8_60 and CDH 5.7.1 installed
> 
> Another node is an app server, 24 cores, 128GB memory, 1TB storage. It has Zeppelin 0.6.0 and Livy 0.2.0 running on it. Plus, Hive Metastore and HiveServer2, Hue, and Oozie are running on it from CDH 5.7.1.
> 
> This is our QA cluster where we are testing before deploying to production.
> 
> If you need more information, please let me know.
> 
> Thanks,
> Ben
> 
>  
> 
> On Jul 7, 2016, at 7:54 PM, moon soo Lee <moon@apache.org> wrote:
>
> Randy,
>
> Helium is not included in 0.6.0 release. Could you check which version are you using?
> I created a fix for 500 errors from Helium URL in master branch. https://github.com/apache/zeppelin/pull/1150
> 
> Ben,
> I can not reproduce the error, could you share how to reproduce error, or share your environment?
> 
> Thanks,
> moon
> 
> On Thu, Jul 7, 2016 at 4:02 PM Randy Gelhausen <rgelhau@gmail.com> wrote:
> I don't- I hoped providing that information may help finding & fixing the problem.
>
> On Thu, Jul 7, 2016 at 5:53 PM, Benjamin Kim <bbuild11@gmail.com> wrote:
> Hi Randy,
>
> Do you know of any way to fix it or know of a workaround?
>
> Thanks,
> Ben
>
> On Jul 7, 2016, at 2:08 PM, Randy Gelhausen <rgelhau@gmail.com> wrote:
> 
> HTTP 500 errors from a Helium URL
> 
> 
> 
> 
> 
> 


Re: [ANNOUNCE] Apache Zeppelin 0.6.0 released

Posted by Felix Cheung <fe...@hotmail.com>.
For #1, do you know if Spark can find the Hive metastore config (typically in hive-site.xml) - Spark's log should indicate that.
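
A quick check from a notebook paragraph (a sketch - it assumes only the stock Spark 1.6 API and that sqlContext is the HiveContext Zeppelin normally creates):

println(getClass.getClassLoader.getResource("hive-site.xml"))   // null means hive-site.xml is not on the classpath
println(sqlContext.getConf("hive.metastore.uris", "<not set>")) // unset usually means a local Derby metastore
sqlContext.sql("show databases").show()                         // should list the Hive databases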


_____________________________
From: Benjamin Kim <bb...@gmail.com>
Sent: Friday, July 8, 2016 6:53 AM
Subject: Re: [ANNOUNCE] Apache Zeppelin 0.6.0 released
To: <us...@zeppelin.apache.org>
Cc: <de...@zeppelin.apache.org>


Felix,

I forgot to add that I built Zeppelin from source http://mirrors.ibiblio.org/apache/zeppelin/zeppelin-0.6.0/zeppelin-0.6.0.tgz using this command "mvn clean package -DskipTests -Pspark-1.6 -Phadoop-2.6 -Dspark.version=1.6.0-cdh5.7.1 -Dhadoop.version=2.6.0-cdh5.7.1 -Ppyspark -Pvendor-repo -Pbuild-distr -Dhbase.hbase.version=1.2.0-cdh5.7.1 -Dhbase.hadoop.version=2.6.0-cdh5.7.1".

I did this because we are using HBase 1.2 within CDH 5.7.1.

Hope this helps clarify.

Thanks,
Ben



On Jul 8, 2016, at 2:01 AM, Felix Cheung <fe...@hotmail.com> wrote:

Is this possibly caused by CDH requiring a build-from-source instead of the official binary releases?





On Thu, Jul 7, 2016 at 8:22 PM -0700, "Benjamin Kim" <bb...@gmail.com> wrote:

Moon,

My environmental setup consists of an 18 node CentOS 6.7 cluster with 24 cores, 64GB, 12TB storage each:

  *   3 of those nodes are used as Zookeeper servers, HDFS name nodes, and a YARN resource manager
  *   15 are for data nodes
  *   jdk1.8_60 and CDH 5.7.1 installed

Another node is an app server, 24 cores, 128GB memory, 1TB storage. It has Zeppelin 0.6.0 and Livy 0.2.0 running on it. Plus, Hive Metastore and HiveServer2, Hue, and Oozie are running on it from CDH 5.7.1.

This is our QA cluster where we are testing before deploying to production.

If you need more information, please let me know.

Thanks,
Ben



On Jul 7, 2016, at 7:54 PM, moon soo Lee <mo...@apache.org> wrote:

Randy,

Helium is not included in 0.6.0 release. Could you check which version are you using?
I created a fix for 500 errors from Helium URL in master branch. https://github.com/apache/zeppelin/pull/1150

Ben,
I can not reproduce the error, could you share how to reproduce error, or share your environment?

Thanks,
moon

On Thu, Jul 7, 2016 at 4:02 PM Randy Gelhausen <rg...@gmail.com> wrote:
I don't- I hoped providing that information may help finding & fixing the problem.

On Thu, Jul 7, 2016 at 5:53 PM, Benjamin Kim <bb...@gmail.com> wrote:
Hi Randy,

Do you know of any way to fix it or know of a workaround?

Thanks,
Ben

On Jul 7, 2016, at 2:08 PM, Randy Gelhausen <rg...@gmail.com> wrote:

HTTP 500 errors from a Helium URL








Re: [ANNOUNCE] Apache Zeppelin 0.6.0 released

Posted by Benjamin Kim <bb...@gmail.com>.
Felix,

I forgot to add that I built Zeppelin from source http://mirrors.ibiblio.org/apache/zeppelin/zeppelin-0.6.0/zeppelin-0.6.0.tgz using this command "mvn clean package -DskipTests -Pspark-1.6 -Phadoop-2.6 -Dspark.version=1.6.0-cdh5.7.1 -Dhadoop.version=2.6.0-cdh5.7.1 -Ppyspark -Pvendor-repo -Pbuild-distr -Dhbase.hbase.version=1.2.0-cdh5.7.1 -Dhbase.hadoop.version=2.6.0-cdh5.7.1".
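
To confirm at runtime that the interpreter really picked up the CDH artifacts, a quick sketch (standard Spark and Hadoop version APIs only) is:

println(sc.version)                                     // e.g. 1.6.0
println(org.apache.hadoop.util.VersionInfo.getVersion)  // should report 2.6.0-cdh5.7.1 if the CDH build is in use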

I did this because we are using HBase 1.2 within CDH 5.7.1.

Hope this helps clarify.

Thanks,
Ben



> On Jul 8, 2016, at 2:01 AM, Felix Cheung <fe...@hotmail.com> wrote:
> 
> Is this possibly caused by CDH requiring a build-from-source instead of the official binary releases?
> 
> 
> 
> 
> 
> On Thu, Jul 7, 2016 at 8:22 PM -0700, "Benjamin Kim" <bbuild11@gmail.com> wrote:
> 
> Moon,
> 
> My environmental setup consists of an 18 node CentOS 6.7 cluster with 24 cores, 64GB, 12TB storage each:
> 3 of those nodes are used as Zookeeper servers, HDFS name nodes, and a YARN resource manager
> 15 are for data nodes
> jdk1.8_60 and CDH 5.7.1 installed
> 
> Another node is an app server, 24 cores, 128GB memory, 1TB storage. It has Zeppelin 0.6.0 and Livy 0.2.0 running on it. Plus, Hive Metastore and HiveServer2, Hue, and Oozie are running on it from CDH 5.7.1.
> 
> This is our QA cluster where we are testing before deploying to production.
> 
> If you need more information, please let me know.
> 
> Thanks,
> Ben
> 
>  
> 
>> On Jul 7, 2016, at 7:54 PM, moon soo Lee <moon@apache.org> wrote:
>> 
>> Randy,
>> 
>> Helium is not included in 0.6.0 release. Could you check which version are you using?
>> I created a fix for 500 errors from Helium URL in master branch. https://github.com/apache/zeppelin/pull/1150
>> 
>> Ben,
>> I can not reproduce the error, could you share how to reproduce error, or share your environment?
>> 
>> Thanks,
>> moon
>> 
>> On Thu, Jul 7, 2016 at 4:02 PM Randy Gelhausen <rgelhau@gmail.com> wrote:
>> I don't- I hoped providing that information may help finding & fixing the problem.
>> 
>> On Thu, Jul 7, 2016 at 5:53 PM, Benjamin Kim <bbuild11@gmail.com> wrote:
>> Hi Randy,
>> 
>> Do you know of any way to fix it or know of a workaround?
>> 
>> Thanks,
>> Ben
>> 
>>> On Jul 7, 2016, at 2:08 PM, Randy Gelhausen <rgelhau@gmail.com> wrote:
>>> 
>>> HTTP 500 errors from a Helium URL
>> 
>> 
> 


Re: [ANNOUNCE] Apache Zeppelin 0.6.0 released

Posted by Felix Cheung <fe...@hotmail.com>.
Is this possibly caused by CDH requiring a build-from-source instead of the official binary releases?





On Thu, Jul 7, 2016 at 8:22 PM -0700, "Benjamin Kim" <bb...@gmail.com> wrote:

Moon,

My environmental setup consists of an 18 node CentOS 6.7 cluster with 24 cores, 64GB, 12TB storage each:

  *   3 of those nodes are used as Zookeeper servers, HDFS name nodes, and a YARN resource manager
  *   15 are for data nodes
  *   jdk1.8_60 and CDH 5.7.1 installed

Another node is an app server, 24 cores, 128GB memory, 1TB storage. It has Zeppelin 0.6.0 and Livy 0.2.0 running on it. Plus, Hive Metastore and HiveServer2, Hue, and Oozie are running on it from CDH 5.7.1.

This is our QA cluster where we are testing before deploying to production.

If you need more information, please let me know.

Thanks,
Ben



On Jul 7, 2016, at 7:54 PM, moon soo Lee <mo...@apache.org> wrote:

Randy,

Helium is not included in 0.6.0 release. Could you check which version are you using?
I created a fix for 500 errors from Helium URL in master branch. https://github.com/apache/zeppelin/pull/1150

Ben,
I can not reproduce the error, could you share how to reproduce error, or share your environment?

Thanks,
moon

On Thu, Jul 7, 2016 at 4:02 PM Randy Gelhausen <rg...@gmail.com> wrote:
I don't- I hoped providing that information may help finding & fixing the problem.

On Thu, Jul 7, 2016 at 5:53 PM, Benjamin Kim <bb...@gmail.com> wrote:
Hi Randy,

Do you know of any way to fix it or know of a workaround?

Thanks,
Ben

On Jul 7, 2016, at 2:08 PM, Randy Gelhausen <rg...@gmail.com> wrote:

HTTP 500 errors from a Helium URL





Re: [ANNOUNCE] Apache Zeppelin 0.6.0 released

Posted by Benjamin Kim <bb...@gmail.com>.
Moon,

My environment consists of an 18-node CentOS 6.7 cluster, each node with 24 cores, 64GB memory, and 12TB storage:
  *   3 of those nodes are used as Zookeeper servers, HDFS name nodes, and a YARN resource manager
  *   15 are for data nodes
  *   jdk1.8_60 and CDH 5.7.1 installed

Another node is an app server, 24 cores, 128GB memory, 1TB storage. It has Zeppelin 0.6.0 and Livy 0.2.0 running on it. Plus, Hive Metastore and HiveServer2, Hue, and Oozie are running on it from CDH 5.7.1.

This is our QA cluster where we are testing before deploying to production.

If you need more information, please let me know.

Thanks,
Ben

 

> On Jul 7, 2016, at 7:54 PM, moon soo Lee <mo...@apache.org> wrote:
> 
> Randy,
> 
> Helium is not included in 0.6.0 release. Could you check which version are you using?
> I created a fix for 500 errors from Helium URL in master branch. https://github.com/apache/zeppelin/pull/1150
> 
> Ben,
> I can not reproduce the error, could you share how to reproduce error, or share your environment?
> 
> Thanks,
> moon
> 
> On Thu, Jul 7, 2016 at 4:02 PM Randy Gelhausen <rgelhau@gmail.com> wrote:
> I don't- I hoped providing that information may help finding & fixing the problem.
> 
> On Thu, Jul 7, 2016 at 5:53 PM, Benjamin Kim <bbuild11@gmail.com> wrote:
> Hi Randy,
> 
> Do you know of any way to fix it or know of a workaround?
> 
> Thanks,
> Ben
> 
>> On Jul 7, 2016, at 2:08 PM, Randy Gelhausen <rgelhau@gmail.com> wrote:
>> 
>> HTTP 500 errors from a Helium URL
> 
> 


Re: [ANNOUNCE] Apache Zeppelin 0.6.0 released

Posted by moon soo Lee <mo...@apache.org>.
Randy,

Helium is not included in the 0.6.0 release. Could you check which version
you are using?
I created a fix for the 500 errors from the Helium URL in master branch.
https://github.com/apache/zeppelin/pull/1150

Ben,
I cannot reproduce the error. Could you share how to reproduce it, or
share your environment?

Thanks,
moon

On Thu, Jul 7, 2016 at 4:02 PM Randy Gelhausen <rg...@gmail.com> wrote:

> I don't- I hoped providing that information may help finding & fixing the
> problem.
>
> On Thu, Jul 7, 2016 at 5:53 PM, Benjamin Kim <bb...@gmail.com> wrote:
>
>> Hi Randy,
>>
>> Do you know of any way to fix it or know of a workaround?
>>
>> Thanks,
>> Ben
>>
>> On Jul 7, 2016, at 2:08 PM, Randy Gelhausen <rg...@gmail.com> wrote:
>>
>> HTTP 500 errors from a Helium URL
>>
>>
>>
>


Re: [ANNOUNCE] Apache Zeppelin 0.6.0 released

Posted by Corneau Damien <co...@gmail.com>.
FYI, the Helium log messages in the console should only appear on master, not
on the 0.6.0 branch.
Those 500 errors should not have any effect when running.


On Fri, Jul 8, 2016 at 8:01 AM, Randy Gelhausen <rg...@gmail.com> wrote:

> I don't- I hoped providing that information may help finding & fixing the
> problem.
>
> On Thu, Jul 7, 2016 at 5:53 PM, Benjamin Kim <bb...@gmail.com> wrote:
>
>> Hi Randy,
>>
>> Do you know of any way to fix it or know of a workaround?
>>
>> Thanks,
>> Ben
>>
>> On Jul 7, 2016, at 2:08 PM, Randy Gelhausen <rg...@gmail.com> wrote:
>>
>> HTTP 500 errors from a Helium URL
>>
>>
>>
>


Re: [ANNOUNCE] Apache Zeppelin 0.6.0 released

Posted by Randy Gelhausen <rg...@gmail.com>.
I don't- I hoped providing that information might help in finding & fixing the
problem.

On Thu, Jul 7, 2016 at 5:53 PM, Benjamin Kim <bb...@gmail.com> wrote:

> Hi Randy,
>
> Do you know of any way to fix it or know of a workaround?
>
> Thanks,
> Ben
>
> On Jul 7, 2016, at 2:08 PM, Randy Gelhausen <rg...@gmail.com> wrote:
>
> HTTP 500 errors from a Helium URL
>
>
>


Re: [ANNOUNCE] Apache Zeppelin 0.6.0 released

Posted by Benjamin Kim <bb...@gmail.com>.
Hi Randy,

Do you know of any way to fix it or know of a workaround?

Thanks,
Ben

> On Jul 7, 2016, at 2:08 PM, Randy Gelhausen <rg...@gmail.com> wrote:
> 
> HTTP 500 errors from a Helium URL


Re: [ANNOUNCE] Apache Zeppelin 0.6.0 released

Posted by Randy Gelhausen <rg...@gmail.com>.
Ben, I have the same problem.

If you open your javascript console and watch while running any cell, you
will see several HTTP 500 errors from a Helium URL.

On Thu, Jul 7, 2016 at 4:27 PM, Benjamin Kim <bb...@gmail.com> wrote:

> To whom it may concern:
>
> After upgrading to Zeppelin 0.6.0, I am having a couple interpreter
> anomalies. Please look below, and I hope that there will be an easy fix for
> them.
>
> 1. Spark SQL gives me this error in the Zeppelin Tutorial notebook, but
> the Scala code to populate and register the temp table runs fine.
>
> java.lang.NullPointerException
> at
> org.apache.spark.sql.hive.client.ClientWrapper.conf(ClientWrapper.scala:205)
> at
> org.apache.spark.sql.hive.client.ClientWrapper.client(ClientWrapper.scala:261)
> at
> org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$withHiveState$1.apply(ClientWrapper.scala:273)
> at
> org.apache.spark.sql.hive.client.ClientWrapper.liftedTree1$1(ClientWrapper.scala:228)
> at
> org.apache.spark.sql.hive.client.ClientWrapper.retryLocked(ClientWrapper.scala:227)
> at
> org.apache.spark.sql.hive.client.ClientWrapper.withHiveState(ClientWrapper.scala:270)
> at org.apache.spark.sql.hive.HiveQLDialect.parse(HiveContext.scala:65)
> at org.apache.spark.sql.SQLContext$$anonfun$2.apply(SQLContext.scala:211)
> at org.apache.spark.sql.SQLContext$$anonfun$2.apply(SQLContext.scala:211)
> at
> org.apache.spark.sql.execution.SparkSQLParser$$anonfun$org$apache$spark$sql$execution$SparkSQLParser$$others$1.apply(SparkSQLParser.scala:114)
> at
> org.apache.spark.sql.execution.SparkSQLParser$$anonfun$org$apache$spark$sql$execution$SparkSQLParser$$others$1.apply(SparkSQLParser.scala:113)
> at scala.util.parsing.combinator.Parsers$Success.map(Parsers.scala:136)
> at scala.util.parsing.combinator.Parsers$Success.map(Parsers.scala:135)
> at
> scala.util.parsing.combinator.Parsers$Parser$$anonfun$map$1.apply(Parsers.scala:242)
> at
> scala.util.parsing.combinator.Parsers$Parser$$anonfun$map$1.apply(Parsers.scala:242)
> at scala.util.parsing.combinator.Parsers$$anon$3.apply(Parsers.scala:222)
> at
> scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1$$anonfun$apply$2.apply(Parsers.scala:254)
> at
> scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1$$anonfun$apply$2.apply(Parsers.scala:254)
> at scala.util.parsing.combinator.Parsers$Failure.append(Parsers.scala:202)
> at
> scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1.apply(Parsers.scala:254)
> at
> scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1.apply(Parsers.scala:254)
> at scala.util.parsing.combinator.Parsers$$anon$3.apply(Parsers.scala:222)
> at
> scala.util.parsing.combinator.Parsers$$anon$2$$anonfun$apply$14.apply(Parsers.scala:891)
> at
> scala.util.parsing.combinator.Parsers$$anon$2$$anonfun$apply$14.apply(Parsers.scala:891)
> at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
> at scala.util.parsing.combinator.Parsers$$anon$2.apply(Parsers.scala:890)
> at
> scala.util.parsing.combinator.PackratParsers$$anon$1.apply(PackratParsers.scala:110)
> at
> org.apache.spark.sql.catalyst.AbstractSparkSQLParser.parse(AbstractSparkSQLParser.scala:34)
> at org.apache.spark.sql.SQLContext$$anonfun$1.apply(SQLContext.scala:208)
> at org.apache.spark.sql.SQLContext$$anonfun$1.apply(SQLContext.scala:208)
> at
> org.apache.spark.sql.execution.datasources.DDLParser.parse(DDLParser.scala:43)
> at org.apache.spark.sql.SQLContext.parseSql(SQLContext.scala:231)
> at org.apache.spark.sql.hive.HiveContext.parseSql(HiveContext.scala:333)
> at org.apache.spark.sql.SQLContext.sql(SQLContext.scala:817)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:497)
> at
> org.apache.zeppelin.spark.SparkSqlInterpreter.interpret(SparkSqlInterpreter.java:117)
> at
> org.apache.zeppelin.interpreter.LazyOpenInterpreter.interpret(LazyOpenInterpreter.java:94)
> at
> org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:341)
> at org.apache.zeppelin.scheduler.Job.run(Job.java:176)
> at
> org.apache.zeppelin.scheduler.ParallelScheduler$JobRunner.run(ParallelScheduler.java:162)
> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
> at java.util.concurrent.FutureTask.run(FutureTask.java:266)
> at
> java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
> at
> java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
> at
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> at
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> at java.lang.Thread.run(Thread.java:745)
>
> 2. Livy Spark SQL does not work also.
>
> <console>:32: error: overloaded method value show with alternatives:
>   (truncate: Boolean)Unit <and>
>   (numRows: Int)Unit
>  cannot be applied to (Null)
>               sqlContext.sql("select age, count(1) value from bank  where
> age < 30  group by age  order by age").show(null)
>
> Thanks,
> Ben
>
> On Jul 6, 2016, at 12:14 AM, mina lee <mi...@apache.org> wrote:
>
> The Apache Zeppelin community is pleased to announce the availability of
> the 0.6.0 release.
>
> Zeppelin is a collaborative data analytics and visualization tool for
> distributed, general-purpose data processing system such as Apache Spark,
> Apache Flink, etc.
>
> The community put significant effort into improving Apache Zeppelin since
> the last release, focusing on having new backend support, implementing
> authentication and authorization for enterprise. More than 70+ contributors
> provided 360+ patches for new features, improvements and bug fixes. More
> than 200+ issues have been resolved.
>
> We encourage download the latest release from
> http://zeppelin.apache.org/download.html
>
> Release note is available at
> http://zeppelin.apache.org/releases/zeppelin-release-0.6.0.html
> <http://zeppelin.incubator.apache.org/releases/zeppelin-release-0.6.0.html>
>
> We welcome your help and feedback. For more information on the project and
> how to get involved, visit our website at http://zeppelin.apache.org/
>
> Thanks to all users and contributors who have helped to improve Apache
> Zeppelin.
>
> Regards,
> The Apache Zeppelin community
>
>
>
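
(On the second error quoted above: in Spark 1.6, DataFrame.show only has the overloads show(), show(numRows: Int), show(truncate: Boolean) and show(numRows: Int, truncate: Boolean), so passing null is ambiguous and rejected. A working variant of that call, as a sketch, is:)

sqlContext.sql("select age, count(1) value from bank where age < 30 group by age order by age").show(30, false)
// or simply .show() for the default of 20 rows with truncated columns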


Re: [ANNOUNCE] Apache Zeppelin 0.6.0 released

Posted by Benjamin Kim <bb...@gmail.com>.
To whom it may concern:

After upgrading to Zeppelin 0.6.0, I am running into a couple of interpreter anomalies. Please see below; I hope there is an easy fix for them.

1. Spark SQL gives me this error in the Zeppelin Tutorial notebook, even though the Scala code that populates and registers the temp table runs fine (a sketch of that code follows after item 2).

java.lang.NullPointerException
	at org.apache.spark.sql.hive.client.ClientWrapper.conf(ClientWrapper.scala:205)
	at org.apache.spark.sql.hive.client.ClientWrapper.client(ClientWrapper.scala:261)
	at org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$withHiveState$1.apply(ClientWrapper.scala:273)
	at org.apache.spark.sql.hive.client.ClientWrapper.liftedTree1$1(ClientWrapper.scala:228)
	at org.apache.spark.sql.hive.client.ClientWrapper.retryLocked(ClientWrapper.scala:227)
	at org.apache.spark.sql.hive.client.ClientWrapper.withHiveState(ClientWrapper.scala:270)
	at org.apache.spark.sql.hive.HiveQLDialect.parse(HiveContext.scala:65)
	at org.apache.spark.sql.SQLContext$$anonfun$2.apply(SQLContext.scala:211)
	at org.apache.spark.sql.SQLContext$$anonfun$2.apply(SQLContext.scala:211)
	at org.apache.spark.sql.execution.SparkSQLParser$$anonfun$org$apache$spark$sql$execution$SparkSQLParser$$others$1.apply(SparkSQLParser.scala:114)
	at org.apache.spark.sql.execution.SparkSQLParser$$anonfun$org$apache$spark$sql$execution$SparkSQLParser$$others$1.apply(SparkSQLParser.scala:113)
	at scala.util.parsing.combinator.Parsers$Success.map(Parsers.scala:136)
	at scala.util.parsing.combinator.Parsers$Success.map(Parsers.scala:135)
	at scala.util.parsing.combinator.Parsers$Parser$$anonfun$map$1.apply(Parsers.scala:242)
	at scala.util.parsing.combinator.Parsers$Parser$$anonfun$map$1.apply(Parsers.scala:242)
	at scala.util.parsing.combinator.Parsers$$anon$3.apply(Parsers.scala:222)
	at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1$$anonfun$apply$2.apply(Parsers.scala:254)
	at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1$$anonfun$apply$2.apply(Parsers.scala:254)
	at scala.util.parsing.combinator.Parsers$Failure.append(Parsers.scala:202)
	at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1.apply(Parsers.scala:254)
	at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1.apply(Parsers.scala:254)
	at scala.util.parsing.combinator.Parsers$$anon$3.apply(Parsers.scala:222)
	at scala.util.parsing.combinator.Parsers$$anon$2$$anonfun$apply$14.apply(Parsers.scala:891)
	at scala.util.parsing.combinator.Parsers$$anon$2$$anonfun$apply$14.apply(Parsers.scala:891)
	at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
	at scala.util.parsing.combinator.Parsers$$anon$2.apply(Parsers.scala:890)
	at scala.util.parsing.combinator.PackratParsers$$anon$1.apply(PackratParsers.scala:110)
	at org.apache.spark.sql.catalyst.AbstractSparkSQLParser.parse(AbstractSparkSQLParser.scala:34)
	at org.apache.spark.sql.SQLContext$$anonfun$1.apply(SQLContext.scala:208)
	at org.apache.spark.sql.SQLContext$$anonfun$1.apply(SQLContext.scala:208)
	at org.apache.spark.sql.execution.datasources.DDLParser.parse(DDLParser.scala:43)
	at org.apache.spark.sql.SQLContext.parseSql(SQLContext.scala:231)
	at org.apache.spark.sql.hive.HiveContext.parseSql(HiveContext.scala:333)
	at org.apache.spark.sql.SQLContext.sql(SQLContext.scala:817)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:497)
	at org.apache.zeppelin.spark.SparkSqlInterpreter.interpret(SparkSqlInterpreter.java:117)
	at org.apache.zeppelin.interpreter.LazyOpenInterpreter.interpret(LazyOpenInterpreter.java:94)
	at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:341)
	at org.apache.zeppelin.scheduler.Job.run(Job.java:176)
	at org.apache.zeppelin.scheduler.ParallelScheduler$JobRunner.run(ParallelScheduler.java:162)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

2. Livy Spark SQL does not work either (a corrected call also follows below).

<console>:32: error: overloaded method value show with alternatives:
  (truncate: Boolean)Unit <and>
  (numRows: Int)Unit
 cannot be applied to (Null)
              sqlContext.sql("select age, count(1) value from bank  where age < 30  group by age  order by age").show(null)

Thanks,
Ben
