Posted to user@ambari.apache.org by Senthil <se...@gmail.com> on 2015/09/07 11:47:16 UTC

HDP 2.2 + Spark 1.4

Has anyone built HDP compatible RPMs (CentOS6) for Spark 1.4?

Using Ambari, I am trying to automate Hadoop cluster setup with Hadoop 2.6,
Hive 0.14, and Spark 1.4. Ambari supports HDP 2.2 and is able to install
Hadoop 2.6 and Hive. I would like to customize the Ambari stack to include
Spark 1.4 instead of the bundled 1.2.
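
For context, this is the kind of stack customization I have in mind. The
paths below assume a stock Ambari server layout and the version value is
only a placeholder, so treat it as a sketch rather than tested steps:

    # Stack definitions live under the Ambari server resources directory
    # (assuming a default Ambari install).
    cd /var/lib/ambari-server/resources/stacks/HDP/2.2/services/SPARK

    # Point the service definition at Spark 1.4 RPMs (placeholder
    # version), then restart the server to pick up the change.
    vi metainfo.xml        # e.g. change <version>1.2.x</version> to 1.4.x
    ambari-server restart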

Regards,

- Senthil

Re: HDP 2.2 + Spark 1.4

Posted by Senthil <se...@gmail.com>.
All,

I noticed Hortonworks has published Spark 1.4.1 Preview packages.
http://hortonworks.com/hadoop-tutorial/apache-spark-1-4-1-technical-preview-with-hdp/

It worked for me with HDP 2.2.
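
For anyone following along, this is roughly how I verified it. The client
path below is the standard HDP location and the examples jar name can vary
by build, so adjust as needed:

    # Assumes the preview RPMs install under the usual HDP client path.
    cd /usr/hdp/current/spark-client

    # Run the SparkPi example on YARN as a smoke test.
    ./bin/spark-submit --class org.apache.spark.examples.SparkPi \
        --master yarn-client --num-executors 3 --executor-memory 512m \
        lib/spark-examples*.jar 10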

Thanks

Senthil

On Wed, Sep 9, 2015 at 12:07 AM, Senthil <se...@gmail.com> wrote:

> Hi Christian,
>
> Thank you so much. The build meets my requirements. However, I have trouble
> running Spark jobs after installation; they fail with the error below. I
> get the same error when I run the SparkPi app or spark-shell. Also, I could
> not see the image illustrating how to modify yarn-site.xml (hdp.stack and
> hdp.version). Could you tell us how to set hdp.stack? Any help would be
> greatly appreciated.
>
>
> Here is the error from both the console and the job log:
>
> Application application_1441682258490_0008 failed 2 times due to AM
> Container for appattempt_1441682258490_0008_000002 exited with exitCode: 1
> For more detailed output, check application tracking page:
> http://ip-10-0-3-206.us-west-2.compute.internal:8088/proxy/application_1441682258490_0008/
> Then, click on links to logs of each attempt.
> Diagnostics: Exception from container-launch.
> Container id: container_1441682258490_0008_02_000001
> Exit code: 1
> Exception message:
> /hadoop/yarn/local/usercache/root/appcache/application_1441682258490_0008/container_1441682258490_0008_02_000001/launch_container.sh:
> line 27:
> $PWD:$PWD/__hadoop_conf__:$PWD/__spark__.jar:$HADOOP_CONF_DIR:/usr/hdp/current/hadoop-client/*:/usr/hdp/current/hadoop-client/lib/*:/usr/hdp/current/hadoop-hdfs-client/*:/usr/hdp/current/hadoop-hdfs-client/lib/*:/usr/hdp/current/hadoop-yarn-client/*:/usr/hdp/current/hadoop-yarn-client/lib/*:$PWD/mr-framework/hadoop/share/hadoop/mapreduce/*:$PWD/mr-framework/hadoop/share/hadoop/mapreduce/lib/*:$PWD/mr-framework/hadoop/share/hadoop/common/*:$PWD/mr-framework/hadoop/share/hadoop/common/lib/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/lib/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/lib/*:$PWD/mr-framework/hadoop/share/hadoop/tools/lib/*:/usr/hdp/${hdp.version}/hadoop/lib/hadoop-lzo-0.6.0.${hdp.version}.jar:/etc/hadoop/conf/secure:
> bad substitution
> Stack trace: ExitCodeException exitCode=1:
> /hadoop/yarn/local/usercache/root/appcache/application_1441682258490_0008/container_1441682258490_0008_02_000001/launch_container.sh:
> line 27:
> $PWD:$PWD/__hadoop_conf__:$PWD/__spark__.jar:$HADOOP_CONF_DIR:/usr/hdp/current/hadoop-client/*:/usr/hdp/current/hadoop-client/lib/*:/usr/hdp/current/hadoop-hdfs-client/*:/usr/hdp/current/hadoop-hdfs-client/lib/*:/usr/hdp/current/hadoop-yarn-client/*:/usr/hdp/current/hadoop-yarn-client/lib/*:$PWD/mr-framework/hadoop/share/hadoop/mapreduce/*:$PWD/mr-framework/hadoop/share/hadoop/mapreduce/lib/*:$PWD/mr-framework/hadoop/share/hadoop/common/*:$PWD/mr-framework/hadoop/share/hadoop/common/lib/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/lib/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/lib/*:$PWD/mr-framework/hadoop/share/hadoop/tools/lib/*:/usr/hdp/${hdp.version}/hadoop/lib/hadoop-lzo-0.6.0.${hdp.version}.jar:/etc/hadoop/conf/secure:
> bad substitution
> at org.apache.hadoop.util.Shell.runCommand(Shell.java:538)
> at org.apache.hadoop.util.Shell.run(Shell.java:455)
> at
> org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:715)
> at
> org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:212)
> at
> org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:302)
> at
> org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:82)
> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
> at
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> at
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> at java.lang.Thread.run(Thread.java:745)
> Container exited with a non-zero exit code 1
> Failing this attempt. Failing the application.
>
>
>
> Regards,
>
> Senthil
>
> On Mon, Sep 7, 2015 at 10:42 PM, Christian Tzolov <ct...@pivotal.io>
> wrote:
>
>> Hi Senthil,
>>
>> I've built Spark 1.4 RPMs using BigTop. Here are the instructions:
>> http://blog.tzolov.net/2015/07/how-to-install-spark-140-on-pivotalhd.html?view=sidebar
>>
>> The build has been tested with HDP 2.2 and PHD 3.0. Make sure to set the
>> right stack.name and stack.versions!
>>
>> Cheers,
>> Christian
>>
>> On 7 September 2015 at 11:47, Senthil <se...@gmail.com> wrote:
>>
>>>
>>> Has anyone built HDP compatible RPMs (CentOS6) for Spark 1.4?
>>>
>>> Using Ambari, I am trying to automate Hadoop cluster setup with Hadoop
>>> 2.6, Hive 0.14, and Spark 1.4. Ambari supports HDP 2.2 and is able to
>>> install Hadoop 2.6 and Hive. I would like to customize the Ambari stack
>>> to include Spark 1.4 instead of the bundled 1.2.
>>>
>>> Regards,
>>>
>>> - Senthil
>>>
>>
>>
>>
>> --
>> Christian Tzolov <http://www.linkedin.com/in/tzolov> | Solution
>> Architect, EMEA Practice Team | Pivotal <http://pivotal.io/>
>> ctzolov@pivotal.io | +31610285517
>>
>
>
>
> --
> - Senthil
>



-- 
- Senthil

Re: HDP 2.2 + Spark 1.4

Posted by Senthil <se...@gmail.com>.
Hi Christian,

Thank you so much. The build meets my requirements. However, I have trouble
running Spark jobs after installation; they fail with the error below. I
get the same error when I run the SparkPi app or spark-shell. Also, I could
not see the image illustrating how to modify yarn-site.xml (hdp.stack and
hdp.version). Could you tell us how to set hdp.stack? Any help would be
greatly appreciated.


Here is the error from both the console and the job log:

Application application_1441682258490_0008 failed 2 times due to AM
Container for appattempt_1441682258490_0008_000002 exited with exitCode: 1
For more detailed output, check application tracking page:
http://ip-10-0-3-206.us-west-2.compute.internal:8088/proxy/application_1441682258490_0008/
Then, click on links to logs of each attempt.
Diagnostics: Exception from container-launch.
Container id: container_1441682258490_0008_02_000001
Exit code: 1
Exception message:
/hadoop/yarn/local/usercache/root/appcache/application_1441682258490_0008/container_1441682258490_0008_02_000001/launch_container.sh:
line 27:
$PWD:$PWD/__hadoop_conf__:$PWD/__spark__.jar:$HADOOP_CONF_DIR:/usr/hdp/current/hadoop-client/*:/usr/hdp/current/hadoop-client/lib/*:/usr/hdp/current/hadoop-hdfs-client/*:/usr/hdp/current/hadoop-hdfs-client/lib/*:/usr/hdp/current/hadoop-yarn-client/*:/usr/hdp/current/hadoop-yarn-client/lib/*:$PWD/mr-framework/hadoop/share/hadoop/mapreduce/*:$PWD/mr-framework/hadoop/share/hadoop/mapreduce/lib/*:$PWD/mr-framework/hadoop/share/hadoop/common/*:$PWD/mr-framework/hadoop/share/hadoop/common/lib/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/lib/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/lib/*:$PWD/mr-framework/hadoop/share/hadoop/tools/lib/*:/usr/hdp/${hdp.version}/hadoop/lib/hadoop-lzo-0.6.0.${hdp.version}.jar:/etc/hadoop/conf/secure:
bad substitution
Stack trace: ExitCodeException exitCode=1:
/hadoop/yarn/local/usercache/root/appcache/application_1441682258490_0008/container_1441682258490_0008_02_000001/launch_container.sh:
line 27:
$PWD:$PWD/__hadoop_conf__:$PWD/__spark__.jar:$HADOOP_CONF_DIR:/usr/hdp/current/hadoop-client/*:/usr/hdp/current/hadoop-client/lib/*:/usr/hdp/current/hadoop-hdfs-client/*:/usr/hdp/current/hadoop-hdfs-client/lib/*:/usr/hdp/current/hadoop-yarn-client/*:/usr/hdp/current/hadoop-yarn-client/lib/*:$PWD/mr-framework/hadoop/share/hadoop/mapreduce/*:$PWD/mr-framework/hadoop/share/hadoop/mapreduce/lib/*:$PWD/mr-framework/hadoop/share/hadoop/common/*:$PWD/mr-framework/hadoop/share/hadoop/common/lib/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/lib/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/lib/*:$PWD/mr-framework/hadoop/share/hadoop/tools/lib/*:/usr/hdp/${hdp.version}/hadoop/lib/hadoop-lzo-0.6.0.${hdp.version}.jar:/etc/hadoop/conf/secure:
bad substitution
at org.apache.hadoop.util.Shell.runCommand(Shell.java:538)
at org.apache.hadoop.util.Shell.run(Shell.java:455)
at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:715)
at
org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:212)
at
org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:302)
at
org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:82)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
Container exited with a non-zero exit code 1
Failing this attempt. Failing the application.
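
From what I can tell, bash is choking on the literal ${hdp.version} left in
the classpath: a dot is not valid inside a bash parameter expansion, hence
the "bad substitution". Since Hadoop's Configuration class expands
${hdp.version} from a Java system property, one workaround I have seen
suggested (untested on my side; the version string below is only a
placeholder for the actual build reported by hdp-select) is to define it
for the submitting JVM in spark-defaults.conf:

    # conf/spark-defaults.conf -- replace the placeholder with the real
    # HDP build version so ${hdp.version} is resolved before launch.
    spark.driver.extraJavaOptions    -Dhdp.version=2.2.x.x-xxxx
    spark.yarn.am.extraJavaOptions   -Dhdp.version=2.2.x.x-xxxx

I am not sure whether this matches the yarn-site.xml change your post
illustrates, hence the question above.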



Regards,

Senthil

On Mon, Sep 7, 2015 at 10:42 PM, Christian Tzolov <ct...@pivotal.io>
wrote:

> Hi Senthil,
>
> I've built Spark 1.4 RPMs using BigTop. Here are the instructions:
> http://blog.tzolov.net/2015/07/how-to-install-spark-140-on-pivotalhd.html?view=sidebar
>
> The build has been tested with HDP 2.2 and PHD 3.0. Make sure to set the
> right stack.name and stack.versions!
>
> Cheers,
> Christian
>
> On 7 September 2015 at 11:47, Senthil <se...@gmail.com> wrote:
>
>>
>> Has anyone built HDP compatible RPMs (CentOS6) for Spark 1.4?
>>
>> Using Ambari, I am trying to automate Hadoop cluster setup with Hadoop
>> 2.6, Hive 0.14, and Spark 1.4. Ambari supports HDP 2.2 and is able to
>> install Hadoop 2.6 and Hive. I would like to customize the Ambari stack
>> to include Spark 1.4 instead of the bundled 1.2.
>>
>> Regards,
>>
>> - Senthil
>>
>
>
>
> --
> Christian Tzolov <http://www.linkedin.com/in/tzolov> | Solution
> Architect, EMEA Practice Team | Pivotal <http://pivotal.io/>
> ctzolov@pivotal.io | +31610285517
>



-- 
- Senthil

Re: HDP 2.2 + Spark 1.4

Posted by Christian Tzolov <ct...@pivotal.io>.
Hi Senthil,

I've built Spark 1.4 RPMs using BigTop. Here are the instructions:
http://blog.tzolov.net/2015/07/how-to-install-spark-140-on-pivotalhd.html?view=sidebar

The build has been tested with HDP 2.2 and PHD 3.0. Make sure to set the
right stack.name and stack.versions!
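
If it helps, the overall flow is roughly the following. The task name
follows BigTop's <component>-rpm convention, but passing stack.name and
stack.versions as -D properties is only my sketch of how the setup wires
them in, so check the post for the exact, tested commands:

    # Sketch only -- see the blog post above for the tested steps.
    git clone https://github.com/apache/bigtop.git
    cd bigtop

    # Build the Spark RPMs for the target stack (placeholder values).
    ./gradlew spark-rpm -Dstack.name=HDP -Dstack.versions=2.2.x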

Cheers,
Christian

On 7 September 2015 at 11:47, Senthil <se...@gmail.com> wrote:

>
> Has anyone built HDP compatible RPMs (CentOS6) for Spark 1.4?
>
> Using Ambari, I am trying to automate Hadoop cluster setup with Hadoop
> 2.6, Hive 0.14, and Spark 1.4. Ambari supports HDP 2.2 and is able to
> install Hadoop 2.6 and Hive. I would like to customize the Ambari stack
> to include Spark 1.4 instead of the bundled 1.2.
>
> Regards,
>
> - Senthil
>



-- 
Christian Tzolov <http://www.linkedin.com/in/tzolov> | Solution Architect,
EMEA Practice Team | Pivotal <http://pivotal.io/>
ctzolov@pivotal.io | +31610285517