Posted to user@spark.apache.org by "anthonyjschulte@gmail.com" <an...@gmail.com> on 2014/06/26 23:55:00 UTC
SparkSQL- saveAsParquetFile
Hi all:
I am attempting a simple test of Spark SQL's ability to persist data to
Parquet files...
My code is:
import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf()
  .setMaster("local[1]")
  .setAppName("test")
implicit val sc = new SparkContext(conf)
val sqlContext = new org.apache.spark.sql.SQLContext(sc)
// brings createSchemaRDD and the other implicit conversions into scope
import sqlContext._

case class Trivial(trivial: String = "trivial")

val rdd = sc.parallelize(Seq(Trivial("s"), Trivial("T")))
rdd.saveAsParquetFile("trivial.parquet")
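For what it's worth, if the save had succeeded I would expect to be able to read the data back in the same session, along these lines (a sketch, assuming the sqlContext.parquetFile / registerAsTable API from Spark 1.0):

```scala
// Load the Parquet directory back as a SchemaRDD
val loaded = sqlContext.parquetFile("trivial.parquet")
// Register it so it can be queried with SQL
loaded.registerAsTable("trivial")
// Print the rows that were persisted
sqlContext.sql("SELECT trivial FROM trivial").collect().foreach(println)
```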
When this code executes, a trivial.parquet directory is created, along with a
_temporary subdirectory, but they contain no data files, only directories.
Is there an obvious mistake in my code that would cause this execution to
fail?
Thank you--
Tony
--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/SparkSQL-saveAsParquetFile-tp8375.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.