Posted to user@spark.apache.org by Minnow Noir <mi...@gmail.com> on 2015/03/29 20:16:31 UTC
Arguments/parameters in Spark shell scripts?
How does one consume parameters passed to a Scala script via spark-shell -i?
1. If I use an object with a main() method, the println outputs nothing, as if main() were never called:
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf

object Test {
  def main(args: Array[String]) {
    println("args(0): " + args(0))
  }
}

System.exit(0)
spark-shell -i Test.scala pizza
=> no print output
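
My working theory is that -i just replays the file into the REPL line by line, so nothing ever invokes main. As a sanity check (a sketch; the explicit Test.main call and the hard-coded "pizza" are my additions), calling main by hand should print, but that defeats the point of passing arguments on the command line:

import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf

object Test {
  def main(args: Array[String]) {
    println("args(0): " + args(0))
  }
}

// Invoke main explicitly; -i alone only defines the object.
// The argument is hard-coded here, which is what I want to avoid.
Test.main(Array("pizza"))
System.exit(0)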
2. If I use the REPL's args instead, the compiler complains that the value args is not found:
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf

println("args(0): " + args(0))
System.exit(0)
spark-shell -i Test.scala pizza
=>
<console>:16: error: not found: value args
println("args(0): " + args(0))
Thanks,
Alec