Posted to users@zeppelin.apache.org by Scott Zelenka <sz...@cisco.com> on 2016/04/13 23:23:41 UTC

Install issue with CDH 5.7.0 & Spark 1.6.0

Hi,

I'm trying to build/install Zeppelin 0.6.0 (version 0.5.6 also has the 
same symptoms) on a new CDH cluster running Hadoop 2.6.0-cdh5.7.0 and 
Spark 1.6.0, but I'm getting this error when I use SPARK_HOME to point 
to the /opt/cloudera/parcels/CDH/lib/spark directory in zeppelin-env.sh:

java.lang.NoSuchMethodException: 
org.apache.spark.repl.SparkILoop$SparkILoopInterpreter.classServerUri()

Which seems to imply that there are no Interpreters available for Spark? 
Is there a way to get around this? I've tried deleting the build folder 
and pulling a fresh copy, but end up at the same place.
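
For reference, the relevant lines in my conf/zeppelin-env.sh look roughly 
like this (the HADOOP_CONF_DIR path and MASTER setting below are what I'd 
expect for a parcel install running Spark on YARN; adjust for your cluster):

export SPARK_HOME=/opt/cloudera/parcels/CDH/lib/spark
export HADOOP_CONF_DIR=/etc/hadoop/conf
export MASTER=yarn-client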

It built successfully on Ubuntu 14.04 LTS with Maven 3.3.3 using this 
command:

sudo mvn clean package -Dspark.version=1.6.0 -Pspark-1.6 \
  -Dhadoop.version=2.6.0-cdh5.6.0 -Phadoop-2.6 -Ppyspark -Pvendor-repo \
  -DskipTests
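
As a sanity check that the Spark interpreter actually got built, I verified 
that the interpreter jar showed up afterwards (the jar name is from my 
0.6.0-SNAPSHOT build; yours may differ):

ls interpreter/spark/
# should list something like zeppelin-spark-0.6.0-incubating-SNAPSHOT.jar
# plus its dep/ directory
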
However, if I leave the configuration at its defaults, then when I try 
to run the "Zeppelin Tutorial" it returns this error:

akka.ConfigurationException: Akka JAR version [2.2.3] does not match the 
provided config version [2.3.11]

Which makes sense, because CDH builds Spark against Akka version 2.2.3, 
but I'm not sure why the built-in Spark is picking up 2.2.3. Shouldn't I 
be able to run Zeppelin without any dependencies on CDH, or did 
-Pvendor-repo mess up this build?
http://www.cloudera.com/documentation/enterprise/release-notes/topics/cdh_rn_spark_ic.html
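
In case it helps anyone reproduce, this is roughly how I've been checking 
which Akka jars end up on the classpath (paths are from my install; adjust 
as needed):

find /opt/cloudera/parcels/CDH/lib/spark -name '*akka*'
find . -name '*akka*'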

Any guidance is welcome!

thx,
z
-- 
Scott Zelenka
Jabber Engineering - US
Phone: (+1) 919-392-1394
Email: szelenka@cisco.com

This email may contain confidential and privileged material for the sole 
use of the intended recipient. Any review, use, distribution or 
disclosure by others is strictly prohibited. If you are not the intended 
recipient (or authorized to receive for the recipient), please contact 
the sender by reply email and delete all copies of this message.

For corporate legal information go to:
http://www.cisco.com/web/about/doing_business/legal/cri/index.html

Re: Install issue with CDH 5.7.0 & Spark 1.6.0

Posted by Felix Cheung <fe...@hotmail.com>.
I assume you mean CDH 5.6.
Unfortunately I don't have 5.7 to test with, and they don't offer 5.7 as a QuickStart VM.
NoSuchMethodException would indicate some sort of breaking API change. I will try to dig into the 5.7 branch of the Cloudera fork of the Spark source code.
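
If anyone wants to look too, something like this should work (the branch 
name below is a guess at Cloudera's usual naming scheme; classServerUri 
lives under repl/ in the upstream 1.6 source):

git clone https://github.com/cloudera/spark.git
cd spark
git checkout cdh5-1.6.0_5.7.0
grep -rn classServerUri repl/

In the meantime, since the cluster is on 5.7, rebuilding Zeppelin with the 
matching version string might also be worth a try:

mvn clean package -Dspark.version=1.6.0 -Pspark-1.6 \
  -Dhadoop.version=2.6.0-cdh5.7.0 -Phadoop-2.6 -Ppyspark -Pvendor-repo \
  -DskipTests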



Re: Install issue with CDH 5.7.0 & Spark 1.6.0

Posted by Scott Zelenka <sz...@cisco.com>.
Logs from the failed Spark 1.6.0 on CDH 5.7.0 attached. I couldn't find 
anything useful in them.
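
For anyone who wants to dig through them, they came from the default 
locations under the Zeppelin install (file names vary with your user and 
host):

tail -n 200 logs/zeppelin-interpreter-spark-*.log
tail -n 200 logs/zeppelin-*.log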

It happens basically immediately after the install completes: I navigate 
to the web UI and try to run the Tutorial demo.

I reverted to Spark 1.5.0 on CDH 5.6.0, and Zeppelin is working fine on 
the same machine, integrated with Spark on YARN in CDH.

thx,
z

On 4/13/16 8:53 PM, Felix Cheung wrote:
> hi Scott
>
> Vendor-repo would be the way to go. It is possible in this case CDH 
> Spark 1.6 has some incompatible API changes, though I couldn't find it 
> yet. Do you have more from the logs on that NoSuchMethodException?


Re: Install issue with CDH 5.7.0 & Spark 1.6.0

Posted by Felix Cheung <fe...@hotmail.com>.
hi Scott
Vendor-repo would be the way to go. It is possible in this case CDH Spark 1.6 has some incompatible API changes, though I couldn't find it yet. Do you have more from the logs on that NoSuchMethodException?
