Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2022/06/01 17:42:39 UTC

[GitHub] [spark] ueshin commented on a diff in pull request #36640: [SPARK-39262][PYTHON] Correct the behavior of creating DataFrame from an RDD

ueshin commented on code in PR #36640:
URL: https://github.com/apache/spark/pull/36640#discussion_r887134684


##########
python/pyspark/sql/session.py:
##########
@@ -611,8 +611,8 @@ def _inferSchema(
         :class:`pyspark.sql.types.StructType`
         """
         first = rdd.first()
-        if not first:
-            raise ValueError("The first row in RDD is empty, " "can not infer schema")
+        if first is None:

Review Comment:
   That's fine, but I feel the error messages are pretty inconsistent:
   
   1. For the case where the first value is `None`:
   
   ```py
   >>> spark.createDataFrame(spark._sc.parallelize([None, None]))
   ...
   TypeError: Can not infer schema for type: <class 'NoneType'>
   ```
   
   should be:
   
   ```py
   ValueError: The first row in RDD is empty, can not infer schema
   ```
   
   ?
   
   2. If the first value is an `int`:
   
   ```py
   >>> spark.createDataFrame(spark._sc.parallelize([0, 1]))
   ...
   TypeError: Can not infer schema for type: <class 'int'>
   ```
   
   then I think the error message for the string case:
   
   ```py
   >>> spark.createDataFrame(spark._sc.parallelize(["", "a"]))
   ...
   ValueError: The first row in RDD is empty, can not infer schema
   ```
   
   should be:
   
   ```py
   TypeError: Can not infer schema for type: <class 'str'>
   ```
   
   ?
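   
   A minimal sketch of how the two checks could be made consistent, along the lines suggested above. The helper name `check_first_row` is hypothetical (it is not Spark's actual API); it assumes row-like inputs are `dict`, `list`, `tuple`, or `Row`-style objects, and that scalars like `int` and `str` should all raise `TypeError`:
   
   ```python
   def check_first_row(first):
       """Illustrative helper (not the actual _inferSchema code): validate the
       first RDD element before schema inference, with consistent error types."""
       if first is None:
           # A missing first element means there is nothing to infer from.
           raise ValueError("The first row in RDD is empty, can not infer schema")
       if not isinstance(first, (dict, list, tuple)):
           # Scalars (int, str, ...) are not row-like; report the offending
           # type uniformly instead of a value-dependent message.
           raise TypeError("Can not infer schema for type: %s" % type(first))
       return first
   ```
   
   Under this sketch, `None` always raises `ValueError`, while `0` and `""` both raise `TypeError` naming their respective types, which would address both inconsistencies noted above.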



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org

