Posted to issues@spark.apache.org by "Nicholas Chammas (JIRA)" <ji...@apache.org> on 2015/05/11 19:55:59 UTC
[jira] [Commented] (SPARK-7507) pyspark.sql.types.StructType and Row should implement __iter__()
[ https://issues.apache.org/jira/browse/SPARK-7507?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14538283#comment-14538283 ]
Nicholas Chammas commented on SPARK-7507:
-----------------------------------------
On a related note, perhaps we should also offer a method to quickly turn Python dicts back into StructTypes or Rows.
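For illustration, that reverse direction could look something like the sketch below. The classes here are minimal stand-ins so the example is self-contained, not the real pyspark.sql.types classes, and struct_type_from_dict is a hypothetical helper name, not an existing PySpark API:

```python
# Toy stand-ins for the real pyspark.sql.types classes, kept minimal
# so this sketch runs on its own.
class StructField:
    def __init__(self, name, dataType, nullable=True):
        self.name = name
        self.dataType = dataType
        self.nullable = nullable

class StructType:
    def __init__(self, fields):
        self.fields = fields

def struct_type_from_dict(d):
    """Hypothetical helper: build a StructType from a mapping of
    field name -> data type (one StructField per entry)."""
    return StructType([StructField(name, dtype) for name, dtype in d.items()])

schema = struct_type_from_dict({"name": "StringType"})
print([f.name for f in schema.fields])  # ['name']
```

Together with an __iter__() on StructType, this would make dict(schema) and struct_type_from_dict(...) a round trip.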
> pyspark.sql.types.StructType and Row should implement __iter__()
> ----------------------------------------------------------------
>
> Key: SPARK-7507
> URL: https://issues.apache.org/jira/browse/SPARK-7507
> Project: Spark
> Issue Type: Sub-task
> Components: PySpark, SQL
> Reporter: Nicholas Chammas
> Priority: Minor
>
> {{StructType}} looks an awful lot like a Python dictionary.
> However, it doesn't implement {{\_\_iter\_\_()}}, so doing a quick conversion like this doesn't work:
> {code}
> >>> df = sqlContext.jsonRDD(sc.parallelize(['{"name": "El Magnifico"}']))
> >>> df.schema
> StructType(List(StructField(name,StringType,true)))
> >>> dict(df.schema)
> Traceback (most recent call last):
>   File "<stdin>", line 1, in <module>
> TypeError: 'StructType' object is not iterable
> {code}
> This would be super helpful for doing any custom schema manipulations without having to go through the whole {{.json() -> json.loads() -> manipulate() -> json.dumps() -> .fromJson()}} charade.
> Same goes for {{Row}}, which offers an [{{asDict()}}|https://spark.apache.org/docs/1.3.1/api/python/pyspark.sql.html#pyspark.sql.Row.asDict] method but doesn't support the more Pythonic {{dict(Row)}}.
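The mechanism being requested is small: dict() accepts any iterable of (key, value) pairs, so an __iter__() that yields (field name, data type) pairs is all StructType needs. A toy sketch with stand-in classes (not the real PySpark implementation) shows the idea:

```python
# Minimal stand-ins, not pyspark.sql.types; the point is the __iter__.
class StructField:
    def __init__(self, name, dataType, nullable=True):
        self.name = name
        self.dataType = dataType
        self.nullable = nullable

class StructType:
    def __init__(self, fields):
        self.fields = fields

    def __iter__(self):
        # Yielding (key, value) pairs is exactly what dict() consumes.
        for f in self.fields:
            yield f.name, f.dataType

schema = StructType([StructField("name", "StringType")])
print(dict(schema))  # {'name': 'StringType'}
```

With that in place, the dict(df.schema) call from the traceback above would succeed instead of raising TypeError.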
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org