Posted to user@spark.apache.org by Ravisankar Mani <rr...@gmail.com> on 2015/07/09 06:02:44 UTC

Spark query

Hi everyone,

I can't get 'day of year' when using a Spark query. Can you suggest any way to
achieve day of year?

Regards,
Ravi

Re: Spark query

Posted by Brandon White <bw...@gmail.com>.
Convert the column to a column of java.sql.Timestamp values. Then you can do the
following:

import java.sql.Timestamp
import java.util.Calendar
// Extracts the requested time field from a java.sql.Timestamp.
def date_trunc(timestamp: Timestamp, timeField: String): Int = {
  val cal = Calendar.getInstance()
  cal.setTimeInMillis(timestamp.getTime())
  timeField match {
    case "hour"      => cal.get(Calendar.HOUR_OF_DAY)
    case "day"       => cal.get(Calendar.DAY_OF_MONTH)
    case "dayofyear" => cal.get(Calendar.DAY_OF_YEAR) // day of year, 1-366
    case other       => throw new IllegalArgumentException(s"Unsupported time field: $other")
  }
}

sqlContext.udf.register("date_trunc", date_trunc _)
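
Once registered, the UDF can be called from SQL. A minimal sketch of the call
site, assuming a temporary table named "events" with a Timestamp column "ts"
(both names are hypothetical, not from the original post):

// "events" and "ts" are illustrative names only.
sqlContext.sql(
  "SELECT ts, date_trunc(ts, 'dayofyear') AS day_of_year FROM events"
).show()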

On Wed, Jul 8, 2015 at 9:23 PM, Harish Butani <rh...@gmail.com>
wrote:

> try the spark-datetime package:
> https://github.com/SparklineData/spark-datetime
> Follow this example
> https://github.com/SparklineData/spark-datetime#a-basic-example to get
> the different attributes of a DateTime.
>
> On Wed, Jul 8, 2015 at 9:11 PM, prosp4300 <pr...@163.com> wrote:
>
>> As mentioned in the Spark SQL programming guide, Spark SQL supports Hive UDFs.
>> Please take a look at Hive's built-in UDFs below; getting day of year should be
>> as simple as in an existing RDBMS.
>>
>> https://cwiki.apache.org/confluence/display/Hive/LanguageManual+UDF#LanguageManualUDF-DateFunctions
>>
>>
>> At 2015-07-09 12:02:44, "Ravisankar Mani" <rr...@gmail.com> wrote:
>>
>> Hi everyone,
>>
>> I can't get 'day of year' when using a Spark query. Can you suggest any way
>> to achieve day of year?
>>
>> Regards,
>> Ravi
>>
>>
>>
>>
>

Re: Spark query

Posted by Harish Butani <rh...@gmail.com>.
try the spark-datetime package:
https://github.com/SparklineData/spark-datetime
Follow this example
https://github.com/SparklineData/spark-datetime#a-basic-example to get the
different attributes of a DateTime.

On Wed, Jul 8, 2015 at 9:11 PM, prosp4300 <pr...@163.com> wrote:

> As mentioned in the Spark SQL programming guide, Spark SQL supports Hive UDFs.
> Please take a look at Hive's built-in UDFs below; getting day of year should be
> as simple as in an existing RDBMS.
>
> https://cwiki.apache.org/confluence/display/Hive/LanguageManual+UDF#LanguageManualUDF-DateFunctions
>
>
> At 2015-07-09 12:02:44, "Ravisankar Mani" <rr...@gmail.com> wrote:
>
> Hi everyone,
>
> I can't get 'day of year' when using a Spark query. Can you suggest any way to
> achieve day of year?
>
> Regards,
> Ravi
>
>
>
>

Re: Spark query

Posted by prosp4300 <pr...@163.com>.
As mentioned in the Spark SQL programming guide, Spark SQL supports Hive UDFs. Please take a look at the built-in UDFs of Hive below; getting day of year should be as simple as in an existing RDBMS.

https://cwiki.apache.org/confluence/display/Hive/LanguageManual+UDF#LanguageManualUDF-DateFunctions
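
For example, day of year can be derived from the built-in Hive functions year(),
concat(), and datediff() without writing a custom UDF. A minimal sketch, assuming
a table "events" with a date or timestamp column "dt" (both names are hypothetical):

// "events" and "dt" are illustrative names; datediff/year/concat/to_date are Hive built-ins.
sqlContext.sql("""
  SELECT dt,
         datediff(to_date(dt), concat(cast(year(dt) AS string), '-01-01')) + 1 AS day_of_year
  FROM events
""").show()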




At 2015-07-09 12:02:44, "Ravisankar Mani" <rr...@gmail.com> wrote:

Hi everyone,


I can't get 'day of year' when using a Spark query. Can you suggest any way to achieve day of year?


Regards,

Ravi