Posted to issues@spark.apache.org by "WEI PENG (JIRA)" <ji...@apache.org> on 2018/09/15 01:42:00 UTC
[jira] [Updated] (SPARK-25434) failed to locate the winutils binary in the hadoop binary path
[ https://issues.apache.org/jira/browse/SPARK-25434?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
WEI PENG updated SPARK-25434:
-----------------------------
Component/s: PySpark
> failed to locate the winutils binary in the hadoop binary path
> --------------------------------------------------------------
>
> Key: SPARK-25434
> URL: https://issues.apache.org/jira/browse/SPARK-25434
> Project: Spark
> Issue Type: Bug
> Components: PySpark, Spark Shell
> Affects Versions: 2.3.1
> Reporter: WEI PENG
> Priority: Major
>
> C:\Users\WEI>pyspark
> Python 3.5.6 |Anaconda custom (64-bit)| (default, Aug 26 2018, 16:05:27) [MSC v.1900 64 bit (AMD64)] on win32
> Type "help", "copyright", "credits" or "license" for more information.
> 2018-09-14 21:12:39 ERROR Shell:397 - Failed to locate the winutils binary in the hadoop binary path
> java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries.
>     at org.apache.hadoop.util.Shell.getQualifiedBinPath(Shell.java:379)
>     at org.apache.hadoop.util.Shell.getWinUtilsPath(Shell.java:394)
>     at org.apache.hadoop.util.Shell.<clinit>(Shell.java:387)
>     at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:80)
>     at org.apache.hadoop.security.SecurityUtil.getAuthenticationMethod(SecurityUtil.java:611)
>     at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:273)
>     at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:261)
>     at org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:791)
>     at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:761)
>     at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:634)
>     at org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2467)
>     at org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2467)
>     at scala.Option.getOrElse(Option.scala:121)
>     at org.apache.spark.util.Utils$.getCurrentUserName(Utils.scala:2467)
>     at org.apache.spark.SecurityManager.<init>(SecurityManager.scala:220)
>     at org.apache.spark.deploy.SparkSubmit$.secMgr$lzycompute$1(SparkSubmit.scala:408)
>     at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$secMgr$1(SparkSubmit.scala:408)
>     at org.apache.spark.deploy.SparkSubmit$$anonfun$doPrepareSubmitEnvironment$7.apply(SparkSubmit.scala:416)
>     at org.apache.spark.deploy.SparkSubmit$$anonfun$doPrepareSubmitEnvironment$7.apply(SparkSubmit.scala:416)
>     at scala.Option.map(Option.scala:146)
>     at org.apache.spark.deploy.SparkSubmit$.doPrepareSubmitEnvironment(SparkSubmit.scala:415)
>     at org.apache.spark.deploy.SparkSubmit$.prepareSubmitEnvironment(SparkSubmit.scala:250)
>     at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:171)
>     at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:137)
>     at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> 2018-09-14 21:12:39 WARN NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
> Setting default log level to "WARN".
> To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
> Welcome to
>       ____              __
>      / __/__  ___ _____/ /__
>     _\ \/ _ \/ _ `/ __/ '_/
>    /__ / .__/\_,_/_/ /_/\_\   version 2.3.1
>       /_/
> Using Python version 3.5.6 (default, Aug 26 2018 16:05:27)
> SparkSession available as 'spark'.
> >>>
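The "null\bin\winutils.exe" path in the exception is the usual sign that HADOOP_HOME (or hadoop.home.dir) is not set: Hadoop's Shell.getQualifiedBinPath prepends the configured home directory to bin\winutils.exe, and when the variable is absent Java's getenv() yields null, which ends up in the path as the literal string "null". A minimal Python sketch of that path construction (winutils_path is a hypothetical helper for illustration, not Hadoop code):

```python
import os

def winutils_path(hadoop_home):
    # Mirrors the shape of Hadoop's Shell.getQualifiedBinPath: the home
    # directory is joined with bin\winutils.exe. An unset HADOOP_HOME
    # surfaces as the literal "null" seen in the reported exception.
    home = hadoop_home if hadoop_home is not None else "null"
    return home + r"\bin\winutils.exe"

# With HADOOP_HOME unset, this reproduces the path from the stack trace.
print(winutils_path(os.environ.get("HADOOP_HOME")))
```

The usual workaround on Windows is to obtain a winutils.exe built for the Hadoop version Spark ships with, place it under a directory such as C:\hadoop\bin, and set HADOOP_HOME=C:\hadoop (with %HADOOP_HOME%\bin on PATH) before launching pyspark.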
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org