Posted to user@spark.apache.org by cingram <ci...@gmail.com> on 2015/08/26 16:46:03 UTC

Spark 1.3.1 saveAsParquetFile hangs on app exit

I have a simple test that hangs on exit when using s3a with Spark 1.3.1. Is
there something I need to do to clean up the S3A file system? The write to S3
appears to have worked, but the job hangs both in the spark-shell and when run
with spark-submit. Any help would be greatly appreciated. TIA.

import sqlContext.implicits._
import com.datastax.spark.connector._

// Row type matching the Cassandra table
case class LU(userid: String, timestamp: Long, lat: Double, lon: Double)

val uid = "testuser"
// Pull this user's rows from Cassandra and convert them to a DataFrame
val lue = sc.cassandraTable[LU]("test", "foo").where("userid=?", uid).toDF
// Write the DataFrame to S3 as Parquet via the s3a filesystem
lue.saveAsParquetFile("s3a://twc-scratch/craig_lues")
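
For reference, the only explicit cleanup I can think of trying would look
roughly like this after the write (untested sketch; FileSystem.closeAll() and
sc.stop() are the only teardown hooks I'm aware of):

import org.apache.hadoop.fs.FileSystem

// Untested sketch: tear things down explicitly before the app exits, in case
// a lingering S3A/HTTP thread is what keeps the JVM from shutting down.
FileSystem.closeAll()   // close all cached Hadoop filesystems, including s3a
sc.stop()               // stop the SparkContext
// System.exit(0)       // heavy-handed fallback if a non-daemon thread still lingers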



Re: Spark 1.3.1 saveAsParquetFile hangs on app exit

Posted by cingram <ci...@gmail.com>.
Thread dump from the hung spark-shell:
spark-shell-hang-on-exit.tdump
<http://apache-spark-user-list.1001560.n3.nabble.com/file/n24461/spark-shell-hang-on-exit.tdump>







Re: Spark 1.3.1 saveAsParquetFile hangs on app exit

Posted by Cheng Lian <li...@gmail.com>.
Could you please share the jstack output of the hung process? Thanks!
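
If running jstack is not convenient, pasting something like this into the
stuck spark-shell should show roughly the same information (a quick sketch:
it just prints every live thread with its daemon flag and top stack frames,
which usually reveals which non-daemon thread is keeping the JVM alive):

import scala.collection.JavaConverters._

// Quick sketch: print all live threads and their top stack frames.
Thread.getAllStackTraces.asScala.foreach { case (t, frames) =>
  println(s"${t.getName} daemon=${t.isDaemon} state=${t.getState}")
  frames.take(10).foreach(f => println(s"    at $f"))
}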

Cheng

On 8/26/15 10:46 PM, cingram wrote:
> I have a simple test that hangs on exit when using s3a with Spark 1.3.1. Is
> there something I need to do to clean up the S3A file system? The write to S3
> appears to have worked, but the job hangs both in the spark-shell and when run
> with spark-submit. Any help would be greatly appreciated. TIA.

