Posted to user@spark.apache.org by anbutech <an...@outlook.com> on 2020/05/05 17:19:33 UTC

Pyspark and Snowflake Column Mapping

Hi Team,

While working with JSON data, we flattened the unstructured data into a
structured format, so we now have Spark columns with data types like
ARRAY<STRUCT<key:value, ...>> and ARRAY<STRING> in a Databricks Delta
table.
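
For context, here is a minimal sketch of the kind of schema we mean
(the column names are just placeholders):

    from pyspark.sql.types import (
        StructType, StructField, StringType, ArrayType
    )

    # Flattened schema: one ARRAY<STRUCT<...>> column and one
    # ARRAY<STRING> column, as produced by flattening nested JSON.
    schema = StructType([
        StructField("id", StringType()),
        StructField("events", ArrayType(StructType([
            StructField("key", StringType()),
            StructField("value", StringType()),
        ]))),
        StructField("tags", ArrayType(StringType())),
    ])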

While loading the data from Databricks to Snowflake through the Spark
connector, we noticed that the ARRAY<STRUCT<...>> and ARRAY<STRING>
columns get mapped to the VARIANT type in Snowflake. We were actually
expecting them to arrive as the same ARRAY type in Snowflake.
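
For reference, this is roughly how we write the table (all connection
option values below are placeholders):

    # Write the flattened DataFrame to Snowflake via the Spark connector.
    # On Databricks the short format name "snowflake" is available;
    # elsewhere the full class name "net.snowflake.spark.snowflake" is used.
    sf_options = {
        "sfURL": "<account>.snowflakecomputing.com",
        "sfUser": "<user>",
        "sfPassword": "<password>",
        "sfDatabase": "<database>",
        "sfSchema": "<schema>",
        "sfWarehouse": "<warehouse>",
    }

    (df.write
        .format("snowflake")
        .options(**sf_options)
        .option("dbtable", "TARGET_TABLE")
        .mode("append")
        .save())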

How do we handle this case while loading into Snowflake?
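
One idea we are considering (an untested sketch; the view, table, and
column names are placeholders) is to keep the VARIANT columns in the
base table and cast them back to ARRAY on the Snowflake side through a
view, created via the connector's postactions option:

    (df.write
        .format("snowflake")
        .options(**sf_options)
        .option("dbtable", "TARGET_TABLE")
        # After the load, expose the VARIANT columns as ARRAY through a
        # view; in Snowflake, VARIANT values holding arrays can be cast
        # with ::ARRAY.
        .option("postactions", """
            CREATE OR REPLACE VIEW TARGET_TABLE_V AS
            SELECT id,
                   events::ARRAY AS events,
                   tags::ARRAY   AS tags
            FROM TARGET_TABLE
        """)
        .mode("append")
        .save())

But we are not sure this is the right approach, or whether the mapping
to VARIANT can be avoided in the first place.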

Please share your ideas.



--
Sent from: http://apache-spark-user-list.1001560.n3.nabble.com/

---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscribe@spark.apache.org