Posted to issues@spark.apache.org by "Jacek Tokar (JIRA)" <ji...@apache.org> on 2018/10/02 17:48:00 UTC
[jira] [Commented] (SPARK-7276) withColumn is very slow on dataframe with large number of columns
[ https://issues.apache.org/jira/browse/SPARK-7276?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16635891#comment-16635891 ]
Jacek Tokar commented on SPARK-7276:
------------------------------------
I confirm Barry's observation.
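A rough back-of-the-envelope model of the slowdown, purely illustrative (the helper below is hypothetical, not Spark API): each withColumn call rebuilds the projection over every column the frame already has, so adding n columns one at a time touches on the order of n^2 expressions in total.

```scala
// Hypothetical cost model: withColumn call number i re-projects the
// initialCols original columns plus the i columns added so far.
def touchedExpressions(initialCols: Int, added: Int): Int =
  (1 to added).map(i => initialCols + i).sum

// Adding 200 columns to a 4-column frame, as in the snippet below:
// sum of (4 + i) for i = 1..200 = 800 + 20100 = 20900 expressions.
println(touchedExpressions(4, 200))
```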
> withColumn is very slow on dataframe with large number of columns
> -----------------------------------------------------------------
>
> Key: SPARK-7276
> URL: https://issues.apache.org/jira/browse/SPARK-7276
> Project: Spark
> Issue Type: Sub-task
> Components: SQL
> Affects Versions: 1.3.1
> Reporter: Alexandre CLEMENT
> Assignee: Wenchen Fan
> Priority: Major
> Fix For: 1.4.0
>
>
> The code snippet demonstrates the problem.
> {code}
> import org.apache.spark.{SparkConf, SparkContext}
> import org.apache.spark.sql._
> import org.apache.spark.sql.types._
> val sparkConf = new SparkConf().setAppName("Spark Test").setMaster(System.getProperty("spark.master", "local[4]"))
> val sc = new SparkContext(sparkConf)
> val sqlContext = new SQLContext(sc)
> import sqlContext.implicits._
> // Rows are (Int, String, Int, Double), so the schema must list
> // the fields in that order for createDataFrame to succeed.
> val custs = Seq(
>   Row(1, "Bob", 21, 80.5),
>   Row(2, "Bobby", 21, 80.5),
>   Row(3, "Jean", 21, 80.5),
>   Row(4, "Fatime", 21, 80.5)
> )
> val fields = List(
>   StructField("id", IntegerType, true),
>   StructField("b", StringType, true),
>   StructField("a", IntegerType, true),
>   StructField("target", DoubleType, false))
> val schema = StructType(fields)
> val rdd = sc.parallelize(custs)
> var df = sqlContext.createDataFrame(rdd, schema)
> // Time each withColumn call; the per-call latency grows with the column count.
> for (i <- 1 to 200) {
>   val now = System.currentTimeMillis
>   df = df.withColumn("a_new_col_" + i, df("a") + i)
>   println(s"$i -> " + (System.currentTimeMillis - now))
> }
> df.show()
> {code}
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org