Posted to dev@spark.apache.org by Alessandro Baretta <al...@gmail.com> on 2014/12/11 23:37:25 UTC

Where are the docs for the SparkSQL DataTypes?

Michael & other Spark SQL junkies,

As I read through the Spark API docs, in particular those for the
org.apache.spark.sql package, I can't seem to find details about the Scala
classes representing the various SparkSQL DataTypes, for instance
DecimalType. I find DataType classes in org.apache.spark.sql.api.java, but
they don't seem to match the similarly named Scala classes. For instance,
DecimalType is documented as having a nullary constructor, but if I try to
construct an instance of org.apache.spark.sql.DecimalType without any
parameters, the compiler complains about the lack of a precisionInfo field,
which I have discovered can be passed in as None. Where is all this stuff
documented?
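
Concretely, here is a minimal sketch of what I am running into, against a Spark
1.2-era build (the PrecisionInfo import path is a guess on my part; these
classes later moved to org.apache.spark.sql.types):

import org.apache.spark.sql._
import org.apache.spark.sql.catalyst.types.PrecisionInfo

// val broken = DecimalType()        // does not compile: precisionInfo is required
val unlimited = DecimalType(None)    // decimal with no fixed precision/scale
val alsoUnlimited = DecimalType.Unlimited            // companion shortcut, if your build exposes it
val fixed = DecimalType(Some(PrecisionInfo(10, 2)))  // precision 10, scale 2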

Alex

Re: Where are the docs for the SparkSQL DataTypes?

Posted by Michael Armbrust <mi...@databricks.com>.
I'd suggest looking at the reference in the programming guide:
http://spark.apache.org/docs/latest/sql-programming-guide.html#spark-sql-datatype-reference
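
The data type reference there maps each SQL type to its Scala value type and
API class. As a quick, illustrative sketch of how those classes fit together
when specifying a schema programmatically (field names are made up; this
assumes the 1.2-era Scala API imported via org.apache.spark.sql._):

import org.apache.spark.sql._

val schema = StructType(Seq(
  StructField("name",   StringType,        nullable = true),
  StructField("age",    IntegerType,       nullable = false),
  StructField("salary", DecimalType(None), nullable = true)  // decimal without a fixed precision/scale
))

// Given an RDD[Row], something like sqlContext.applySchema(rowRDD, schema) attaches
// these types; see the guide's "Programmatically Specifying the Schema" section.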


On Thu, Dec 11, 2014 at 6:45 PM, Alessandro Baretta <al...@gmail.com>
wrote:

> Thanks. This is useful.
>
> Alex
>
> On Thu, Dec 11, 2014 at 4:35 PM, Cheng, Hao <ha...@intel.com> wrote:
>>
>> Part of it can be found at:
>>
>> https://github.com/apache/spark/pull/3429/files#diff-f88c3e731fcb17b1323b778807c35b38R34
>>
>> Sorry, the PR is still under review, but it should still be informative.
>>
>> Cheng Hao
>>
>> -----Original Message-----
>> From: Alessandro Baretta [mailto:alexbaretta@gmail.com]
>> Sent: Friday, December 12, 2014 6:37 AM
>> To: Michael Armbrust; dev@spark.apache.org
>> Subject: Where are the docs for the SparkSQL DataTypes?
>>
>> Michael & other Spark SQL junkies,
>>
>> As I read through the Spark API docs, in particular those for the
>> org.apache.spark.sql package, I can't seem to find details about the Scala
>> classes representing the various SparkSQL DataTypes, for instance
>> DecimalType. I find DataType classes in org.apache.spark.sql.api.java, but
>> they don't seem to match the similarly named Scala classes. For instance,
>> DecimalType is documented as having a nullary constructor, but if I try to
>> construct an instance of org.apache.spark.sql.DecimalType without any
>> parameters, the compiler complains about the lack of a precisionInfo field,
>> which I have discovered can be passed in as None. Where is all this stuff
>> documented?
>>
>> Alex
>>
>

Re: Where are the docs for the SparkSQL DataTypes?

Posted by Alessandro Baretta <al...@gmail.com>.
Thanks. This is useful.

Alex

On Thu, Dec 11, 2014 at 4:35 PM, Cheng, Hao <ha...@intel.com> wrote:
>
> Part of it can be found at:
>
> https://github.com/apache/spark/pull/3429/files#diff-f88c3e731fcb17b1323b778807c35b38R34
>
> Sorry, the PR is still under review, but it should still be informative.
>
> Cheng Hao
>
> -----Original Message-----
> From: Alessandro Baretta [mailto:alexbaretta@gmail.com]
> Sent: Friday, December 12, 2014 6:37 AM
> To: Michael Armbrust; dev@spark.apache.org
> Subject: Where are the docs for the SparkSQL DataTypes?
>
> Michael & other Spark SQL junkies,
>
> As I read through the Spark API docs, in particular those for the
> org.apache.spark.sql package, I can't seem to find details about the Scala
> classes representing the various SparkSQL DataTypes, for instance
> DecimalType. I find DataType classes in org.apache.spark.sql.api.java, but
> they don't seem to match the similarly named Scala classes. For instance,
> DecimalType is documented as having a nullary constructor, but if I try to
> construct an instance of org.apache.spark.sql.DecimalType without any
> parameters, the compiler complains about the lack of a precisionInfo field,
> which I have discovered can be passed in as None. Where is all this stuff
> documented?
>
> Alex
>

RE: Where are the docs for the SparkSQL DataTypes?

Posted by "Cheng, Hao" <ha...@intel.com>.
Part of it can be found at:
https://github.com/apache/spark/pull/3429/files#diff-f88c3e731fcb17b1323b778807c35b38R34
 
Sorry, the PR is still under review, but it should still be informative.

Cheng Hao

-----Original Message-----
From: Alessandro Baretta [mailto:alexbaretta@gmail.com] 
Sent: Friday, December 12, 2014 6:37 AM
To: Michael Armbrust; dev@spark.apache.org
Subject: Where are the docs for the SparkSQL DataTypes?

Michael & other Spark SQL junkies,

As I read through the Spark API docs, in particular those for the org.apache.spark.sql package, I can't seem to find details about the Scala classes representing the various SparkSQL DataTypes, for instance DecimalType. I find DataType classes in org.apache.spark.sql.api.java, but they don't seem to match the similarly named Scala classes. For instance, DecimalType is documented as having a nullary constructor, but if I try to construct an instance of org.apache.spark.sql.DecimalType without any parameters, the compiler complains about the lack of a precisionInfo field, which I have discovered can be passed in as None. Where is all this stuff documented?

Alex
