Posted to reviews@spark.apache.org by "itholic (via GitHub)" <gi...@apache.org> on 2023/03/08 02:45:59 UTC

[GitHub] [spark] itholic commented on a diff in pull request #40316: [SPARK-42679][CONNECT] createDataFrame doesn't work with non-nullable schema

itholic commented on code in PR #40316:
URL: https://github.com/apache/spark/pull/40316#discussion_r1128906598


##########
python/pyspark/sql/tests/connect/test_connect_basic.py:
##########
@@ -2876,6 +2876,13 @@ def test_unsupported_io_functions(self):
             with self.assertRaises(NotImplementedError):
                 getattr(df.write, f)()
 
+    def test_inferred_schema(self):

Review Comment:
   Can we enrich the test by also covering a nullable schema (`schema_true` below) and pandas DataFrame inputs?
   
   such as:
   ```python
   from pyspark.sql.types import IntegerType, StructField, StructType

   # Explicitly nullable schema.
   schema_true = StructType([StructField("id", IntegerType(), True)])
   cdf2 = self.connect.createDataFrame([[1]], schema=schema_true)
   sdf2 = self.spark.createDataFrame([[1]], schema=schema_true)
   self.assertEqual(cdf2.schema, sdf2.schema)
   self.assertEqual(cdf2.collect(), sdf2.collect())

   # pandas round-trip with the schema from cdf1
   # (the Connect DataFrame created earlier in this test).
   pdf1 = cdf1.toPandas()
   cdf3 = self.connect.createDataFrame(pdf1, cdf1.schema)
   sdf3 = self.spark.createDataFrame(pdf1, cdf1.schema)
   self.assertEqual(cdf3.schema, sdf3.schema)
   self.assertEqual(cdf3.collect(), sdf3.collect())

   # pandas round-trip with the nullable schema.
   pdf2 = cdf2.toPandas()
   cdf4 = self.connect.createDataFrame(pdf2, cdf2.schema)
   sdf4 = self.spark.createDataFrame(pdf2, cdf2.schema)
   self.assertEqual(cdf4.schema, sdf4.schema)
   self.assertEqual(cdf4.collect(), sdf4.collect())
   ```
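   
   For context, a minimal sketch of what the preceding `cdf1`/`sdf1` setup might look like, assuming the test under review builds them from a schema with `nullable=False` (the case SPARK-42679 addresses); the actual names and schema in the PR's diff may differ:
   
   ```python
   # Hypothetical setup mirroring what test_inferred_schema is assumed to do;
   # not taken verbatim from the diff under review.
   schema_false = StructType([StructField("id", IntegerType(), False)])  # non-nullable field
   cdf1 = self.connect.createDataFrame([[1]], schema=schema_false)  # Spark Connect session
   sdf1 = self.spark.createDataFrame([[1]], schema=schema_false)    # regular Spark session
   self.assertEqual(cdf1.schema, sdf1.schema)
   self.assertEqual(cdf1.collect(), sdf1.collect())
   ```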



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org

