Posted to issues@spark.apache.org by "Michael Armbrust (JIRA)" <ji...@apache.org> on 2015/05/12 21:00:00 UTC
[jira] [Resolved] (SPARK-7276) withColumn is very slow on dataframe with large number of columns
[ https://issues.apache.org/jira/browse/SPARK-7276?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Michael Armbrust resolved SPARK-7276.
-------------------------------------
Resolution: Pending Closed
Fix Version/s: 1.4.0
Issue resolved by pull request 5831
[https://github.com/apache/spark/pull/5831]
> withColumn is very slow on dataframe with large number of columns
> -----------------------------------------------------------------
>
> Key: SPARK-7276
> URL: https://issues.apache.org/jira/browse/SPARK-7276
> Project: Spark
> Issue Type: Sub-task
> Components: SQL
> Affects Versions: 1.3.1
> Reporter: Alexandre CLEMENT
> Assignee: Wenchen Fan
> Fix For: 1.4.0
>
>
> The code snippet demonstrates the problem.
> {code}
> import org.apache.spark.{SparkConf, SparkContext}
> import org.apache.spark.sql._
> import org.apache.spark.sql.types._
>
> val sparkConf = new SparkConf().setAppName("Spark Test").setMaster(System.getProperty("spark.master", "local[4]"))
> val sc = new SparkContext(sparkConf)
> val sqlContext = new SQLContext(sc)
> import sqlContext.implicits._
>
> val custs = Seq(
>   Row(1, "Bob", 21, 80.5),
>   Row(2, "Bobby", 21, 80.5),
>   Row(3, "Jean", 21, 80.5),
>   Row(4, "Fatime", 21, 80.5)
> )
>
> // Field order matches the Row values above: Int, String, Int, Double.
> val fields = List(
>   StructField("id", IntegerType, true),
>   StructField("b", StringType, true),
>   StructField("a", IntegerType, true),
>   StructField("target", DoubleType, false))
> val schema = StructType(fields)
>
> val rdd = sc.parallelize(custs)
> var df = sqlContext.createDataFrame(rdd, schema)
>
> // Each iteration gets slower: every withColumn call re-analyzes the
> // whole (growing) schema, so the per-iteration time printed below climbs.
> for (i <- 1 to 200) {
>   val now = System.currentTimeMillis
>   df = df.withColumn("a_new_col_" + i, df("a") + i)
>   println(s"$i -> " + (System.currentTimeMillis - now))
> }
> df.show()
> {code}
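> A workaround on affected versions is to build all of the derived columns
> up front and add them in a single select, so the plan is analyzed once
> rather than once per withColumn call. A minimal sketch, reusing the `df`,
> column names, and range from the repro above (`result` is an illustrative
> name, not from the original report):
> {code}
> import org.apache.spark.sql.functions.col
>
> // Build the 200 derived columns first, then project once:
> // one select means one round of analysis instead of 200.
> val newCols = (1 to 200).map(i => (col("a") + i).as("a_new_col_" + i))
> val result = df.select(col("*") +: newCols: _*)
> result.show()
> {code}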
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)