Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2018/08/18 09:20:00 UTC

[jira] [Resolved] (SPARK-25137) NumberFormatException when starting spark-shell from Mac terminal

     [ https://issues.apache.org/jira/browse/SPARK-25137?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon resolved SPARK-25137.
----------------------------------
       Resolution: Fixed
    Fix Version/s: 2.4.0

Issue resolved by pull request 22130
[https://github.com/apache/spark/pull/22130]

> NumberFormatException when starting spark-shell from Mac terminal
> -----------------------------------------------------------------
>
>                 Key: SPARK-25137
>                 URL: https://issues.apache.org/jira/browse/SPARK-25137
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Shell
>    Affects Versions: 2.4.0
>         Environment: macOS High Sierra Version 10.13.6
>  
>            Reporter: Vinod KC
>            Assignee: Vinod KC
>            Priority: Trivial
>              Labels: easyfix
>             Fix For: 2.4.0
>
>
> NumberFormatException when starting spark-shell from Mac terminal
> ./bin/spark-shell
>  18/08/17 08:43:36 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
>  Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
>  Setting default log level to "WARN".
>  To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
>  Welcome to
>        ____              __
>       / __/__  ___ _____/ /__
>      _\ \/ _ \/ _ `/ __/  '_/
>     /___/ .__/\_,_/_/ /_/\_\   version 2.4.0-SNAPSHOT
>        /_/
>  Using Scala version 2.11.12 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_181)
>  Type in expressions to have them evaluated.
>  Type :help for more information.
>  [ERROR] Failed to construct terminal; falling back to unsupported
>  java.lang.NumberFormatException: For input string: "0x100"
>  at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
>  at java.lang.Integer.parseInt(Integer.java:580)
>  at java.lang.Integer.valueOf(Integer.java:766)
>  at jline.internal.InfoCmp.parseInfoCmp(InfoCmp.java:59)
>  at jline.UnixTerminal.parseInfoCmp(UnixTerminal.java:242)
>  at jline.UnixTerminal.<init>(UnixTerminal.java:65)
>  at jline.UnixTerminal.<init>(UnixTerminal.java:50)
>  at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>  at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>  at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>  at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>  at java.lang.Class.newInstance(Class.java:442)
>  at jline.TerminalFactory.getFlavor(TerminalFactory.java:211)
>  at jline.TerminalFactory.create(TerminalFactory.java:102)
>  at jline.TerminalFactory.get(TerminalFactory.java:186)
>  at jline.TerminalFactory.get(TerminalFactory.java:192)
>  at jline.console.ConsoleReader.<init>(ConsoleReader.java:243)
>  at jline.console.ConsoleReader.<init>(ConsoleReader.java:235)
>  at jline.console.ConsoleReader.<init>(ConsoleReader.java:223)
>  at scala.tools.nsc.interpreter.jline.JLineConsoleReader.<init>(JLineReader.scala:64)
>  at scala.tools.nsc.interpreter.jline.InteractiveReader.<init>(JLineReader.scala:33)
>  at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>  at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>  at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>  at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>  at scala.tools.nsc.interpreter.ILoop$$anonfun$scala$tools$nsc$interpreter$ILoop$$instantiater$1$1.apply(ILoop.scala:858)
>  at scala.tools.nsc.interpreter.ILoop$$anonfun$scala$tools$nsc$interpreter$ILoop$$instantiater$1$1.apply(ILoop.scala:855)
>  at scala.tools.nsc.interpreter.ILoop.scala$tools$nsc$interpreter$ILoop$$mkReader$1(ILoop.scala:862)
>  at scala.tools.nsc.interpreter.ILoop$$anonfun$22$$anonfun$apply$10.apply(ILoop.scala:873)
>  at scala.tools.nsc.interpreter.ILoop$$anonfun$22$$anonfun$apply$10.apply(ILoop.scala:873)
>  at scala.util.Try$.apply(Try.scala:192)
>  at scala.tools.nsc.interpreter.ILoop$$anonfun$22.apply(ILoop.scala:873)
>  at scala.tools.nsc.interpreter.ILoop$$anonfun$22.apply(ILoop.scala:873)
>  at scala.collection.immutable.Stream.map(Stream.scala:418)
>  at scala.tools.nsc.interpreter.ILoop.chooseReader(ILoop.scala:873)
>  at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1$$anonfun$newReader$1$1.apply(ILoop.scala:893)
>  at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.newReader$1(ILoop.scala:893)
>  at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.scala$tools$nsc$interpreter$ILoop$$anonfun$$preLoop$1(ILoop.scala:897)
>  at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1$$anonfun$startup$1$1.apply(ILoop.scala:964)
>  at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:990)
>  at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:891)
>  at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:891)
>  at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)
>  at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:891)
>  at org.apache.spark.repl.Main$.doMain(Main.scala:78)
>  at org.apache.spark.repl.Main$.main(Main.scala:58)
>  at org.apache.spark.repl.Main.main(Main.scala)
>  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>  at java.lang.reflect.Method.invoke(Method.java:498)
>  at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
>  at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:847)
>  at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)
>  at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)
>  at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
>  at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:922)
>  at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)
>  at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>  Spark context Web UI available at http://192.168.0.199:4040
>  Spark context available as 'sc' (master = local[*], app id = local-1534475624109).
>  Spark session available as 'spark'.
>  scala> spark.version
>  res0: String = 2.4.0-SNAPSHOT
> This issue is fixed in jline 2.14.4.
> Jline issue: [https://github.com/jline/jline2/issues/281]
> Spark needs to bump its jline dependency to 2.14.4 or later.
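For context, the stack trace shows the root cause: jline.internal.InfoCmp.parseInfoCmp calls Integer.valueOf on terminfo capability values, and High Sierra's terminfo reports the value as the hex literal "0x100", which Integer.valueOf cannot parse. A minimal Java sketch of that parsing difference (the class and variable names here are illustrative, not jline's actual code):

```java
// Sketch of the parsing difference behind the jline failure:
// Integer.valueOf accepts only plain decimal strings, while Integer.decode
// also understands "0x"/"#" hex and leading-zero octal prefixes.
public class HexParseDemo {
    public static void main(String[] args) {
        String capability = "0x100"; // hex value as reported by the terminfo entry

        try {
            Integer.valueOf(capability); // throws, as seen in the stack trace
        } catch (NumberFormatException e) {
            System.out.println("valueOf failed: " + e.getMessage());
        }

        // Integer.decode parses the hex prefix and yields 256.
        System.out.println("decode succeeded: " + Integer.decode(capability));
    }
}
```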



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org