Posted to user@spark.apache.org by Devesh Raj Singh <ra...@gmail.com> on 2016/02/02 07:08:22 UTC

can we do column bind of 2 dataframes in spark R? similar to cbind in R?

Hi,

I want to merge two dataframes in SparkR column-wise, similar to cbind in R. We
have "unionAll" for rbind, but I could not find anything for cbind in SparkR.

-- 
Warm regards,
Devesh.

RE: can we do column bind of 2 dataframes in spark R? similar to cbind in R?

Posted by "Sun, Rui" <ru...@intel.com>.
Devesh,

A cbind-like operation is not supported by the Scala DataFrame API, so it is also not supported in SparkR.

You may try to work around this using the approach described in http://stackoverflow.com/questions/32882529/how-to-zip-twoor-more-dataframe-in-spark
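In case it helps, here is a rough Scala sketch of that zip-based idea (written against the Spark 1.6-era API, not taken verbatim from the linked answer). It assumes two DataFrames with the same number of rows and the same partitioning; all names (df1, df2, the columns, the app name) are illustrative.

  import org.apache.spark.{SparkConf, SparkContext}
  import org.apache.spark.sql.{Row, SQLContext}
  import org.apache.spark.sql.types.StructType

  object CbindSketch {
    def main(args: Array[String]): Unit = {
      val sc = new SparkContext(new SparkConf().setAppName("cbind-sketch").setMaster("local[*]"))
      val sqlContext = new SQLContext(sc)
      import sqlContext.implicits._

      // Two illustrative DataFrames with the same row count and partitioning.
      val df1 = sc.parallelize(Seq(("a", 1), ("b", 2)), 2).toDF("name", "x")
      val df2 = sc.parallelize(Seq(("u", 10.0), ("v", 20.0)), 2).toDF("tag", "y")

      // RDD.zip requires the same number of partitions and the same
      // number of elements per partition in both RDDs.
      val zippedRows = df1.rdd.zip(df2.rdd).map { case (left, right) =>
        Row.merge(left, right) // concatenate the two rows column-wise
      }

      // Rebuild a DataFrame whose schema is the concatenation of both schemas.
      val combined = sqlContext.createDataFrame(
        zippedRows,
        StructType(df1.schema.fields ++ df2.schema.fields)
      )
      combined.show() // columns: name, x, tag, y

      sc.stop()
    }
  }

Note that RDD.zip fails at runtime if the partition counts or per-partition element counts differ, so this only works when both DataFrames share a compatible lineage; adding an explicit row-index column to each side and joining on it is the more robust (if slower) alternative.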

You could also submit a JIRA to the Spark community requesting such a feature.

From: Devesh Raj Singh [mailto:raj.devesh99@gmail.com]
Sent: Tuesday, February 2, 2016 2:08 PM
To: user@spark.apache.org
Subject: can we do column bind of 2 dataframes in spark R? similar to cbind in R?

Hi,

I want to merge two dataframes in SparkR column-wise, similar to cbind in R. We have "unionAll" for rbind, but I could not find anything for cbind in SparkR.

--
Warm regards,
Devesh.