Posted to user@spark.apache.org by lu...@sina.com on 2015/07/08 09:38:31 UTC

Reply: RE: Hibench build fail

Hi Ted and Grace,
     Retried with Spark 1.4.0, but it still failed with the same symptoms; a log is attached, FYI.
     What other details would help?
     BTW, is building HiBench a necessary step before I can test my Spark cluster? I also tried skipping the HiBench build and executing "bin/run-all.sh" directly, but that produced errors as well.
     Thanks.
      
       


--------------------------------

 

Thanks & Best regards!
San.Luo

----- Original Message -----
From: "Huang, Jie" <ji...@intel.com>
To: Ted Yu <yu...@gmail.com>, ?? <lu...@sina.com>
Cc: user <us...@spark.apache.org>
Subject: RE: Hibench build fail
Date: July 8, 2015, 09:20





Hi Hui,
 
Could you please add more descriptions (about the failure) in HiBench github Issues?
 
HiBench works with Spark 1.2 and above.
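As a quick sanity check against that minimum, a version string can be compared with `sort -V`. This is only a sketch: the `SPARK_VERSION` value below is a stand-in for whatever `spark-submit --version` reports on your cluster.

```shell
# Minimum Spark version HiBench supports, per the note above.
MIN="1.2"
# Stand-in value; substitute the version your cluster actually runs.
SPARK_VERSION="1.3.1"
# sort -V orders version strings numerically; if MIN sorts first,
# the installed version is at least the minimum.
if [ "$(printf '%s\n' "$MIN" "$SPARK_VERSION" | sort -V | head -n1)" = "$MIN" ]; then
  echo "Spark $SPARK_VERSION meets the HiBench minimum ($MIN)"
else
  echo "Spark $SPARK_VERSION is below the HiBench minimum ($MIN)"
fi
```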

 
Thank you && Best Regards,
Grace
(Huang Jie)
 
From: Ted Yu [mailto:yuzhihong@gmail.com]


Sent: Wednesday, July 8, 2015 12:50 AM

To: lu...@sina.com

Cc: user; Huang, Jie

Subject: Re: Hibench build fail
 

bq. Need I specify my spark version

Looks like the build used 1.4.0-SNAPSHOT. Please use the 1.4.0 release.
Cheers
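To illustrate the fix Ted suggests, here is a minimal sketch of swapping a `-SNAPSHOT` Spark version for the release in a Maven properties block. The file below is a throwaway fragment written for the demo, not HiBench's actual pom.xml; where the property lives in the real build may differ.

```shell
# Throwaway pom fragment for illustration only (not HiBench's real pom.xml).
cat > /tmp/pom-snippet.xml <<'EOF'
<properties>
  <spark.version>1.4.0-SNAPSHOT</spark.version>
</properties>
EOF
# A -SNAPSHOT version points at nightly artifacts that public Maven
# repositories may not carry; pin the released version instead.
sed -i 's/1\.4\.0-SNAPSHOT/1.4.0/' /tmp/pom-snippet.xml
grep spark.version /tmp/pom-snippet.xml
```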



 

On Mon, Jul 6, 2015 at 11:50 PM, <lu...@sina.com> wrote:

Hi grace,
     Recently I have been trying HiBench to evaluate my Spark cluster, but I ran into a problem building it. Would you help take a look? Thanks.
     It fails while building Sparkbench; please check the attached picture for more info.
     My versions: Spark 1.3.1, Hadoop 2.7.0, HiBench 4.0, Python 2.6.6. The build reportedly fails for Spark 1.4 and MR1, neither of which is installed on my cluster. Do I need to specify my Spark and Hadoop versions when running "bin/build-all.sh"?
     thanks.
 

--------------------------------



 

Thanks & Best regards!

San.Luo





---------------------------------------------------------------------

To unsubscribe, e-mail: user-unsubscribe@spark.apache.org

For additional commands, e-mail: user-help@spark.apache.org