Posted to user@spark.apache.org by "ouruia@cnsuning.com" <ou...@cnsuning.com> on 2016/01/26 04:44:57 UTC

spark-sql [1.4.0] not compatible with Hive SQL when using IN with date_sub or regexp_replace

hi all,
        when migrating Hive SQL to Spark SQL, I encountered an incompatibility problem. Please give me some suggestions.


The Hive table definition and data format are as follows:

use spark;
drop table spark.test_or1;
CREATE TABLE `spark.test_or1`(
`statis_date` string,
`lbl_nm` string) row format delimited fields terminated by ',';

Example data (data.txt):
20160110 , item_XX_tab_buy03
20160114  , item_XX_tab_buy01   
20160115 , item_XX_tab_buy11
20160118 , item_XX_tab_buy01
20160101 , item_XX_tab_buy01
20160102 , item_XX_tab_buy03
20160103 , item_XX_tab_buy04

load data local inpath 'data.txt' into table  spark.test_or1; 
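
By the way, since the fields are terminated only by ',', the spaces around the commas in data.txt are loaded as part of the values (lbl_nm becomes ' item_XX_tab_buy03' with a leading space), which is presumably why every row below evaluates to 0. A quick check to make the spaces visible:

-- wrap values in brackets so leading/trailing spaces show up
select concat('[', statis_date, ']') as statis_date,
       concat('[', lbl_nm, ']')      as lbl_nm
from spark.test_or1
limit 3;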


When I execute this Hive command in spark-sql (1.4.0), I get UnresolvedException: Invalid call to dataType on unresolved object, tree: 'ta.lbl_nm. However, the same command executes successfully in the Hive shell:
select case when ta.lbl_nm in ('item_XX_tab_buy03','item_XX_gmq_buy01') 
then 1 
else 0 
end as  defaultVaule
from spark.test_or1 ta 
where ta.statis_date <= '20160118' 
and ta.statis_date > regexp_replace(date_sub(from_unixtime(to_unix_timestamp('20160118', 'yyyyMMdd'), 'yyyy-MM-dd'), 20), '-', '')  limit 10 ;  
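
From the stack trace below, the failure comes from the InConversion type-coercion rule, which calls dataType on the still-unresolved attribute 'ta.lbl_nm, so it seems specific to how Spark 1.4.0 analyzes `in`. Rewriting `in` as an `or` of equalities should be semantically equivalent and avoids the `in` expression entirely (a sketch, not yet verified on 1.4.0):

-- workaround sketch: replace IN (...) with an OR of equality tests
select case when ta.lbl_nm = 'item_XX_tab_buy03'
            or ta.lbl_nm = 'item_XX_gmq_buy01'
then 1
else 0
end as defaultVaule
from spark.test_or1 ta
where ta.statis_date <= '20160118'
and ta.statis_date > regexp_replace(date_sub(from_unixtime(to_unix_timestamp('20160118', 'yyyyMMdd'), 'yyyy-MM-dd'), 20), '-', '') limit 10 ;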



This variant fails with the same error, UnresolvedException: Invalid call to dataType on unresolved object, tree: 'ta.lbl_nm, but executes successfully in the Hive shell:
select case when ta.lbl_nm in ('item_XX_tab_buy03','item_XX_gmq_buy01') 
then 1 
else 0 
end as  defaultVaule
from spark.test_or1 ta 
where ta.statis_date <= '20160118' 
and ta.statis_date > date_sub(from_unixtime(to_unix_timestamp('20160118', 'yyyyMMdd'), 'yyyyMMdd'), 20)  limit 10 ;  
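
As an aside, date_sub expects its argument in 'yyyy-MM-dd' format, so feeding it the 'yyyyMMdd' output of from_unixtime here presumably yields NULL in Hive (and therefore no rows); the Spark failure happens earlier, in analysis, before anything is evaluated:

-- date_sub wants 'yyyy-MM-dd'; other formats likely parse to NULL
select date_sub('2016-01-18', 20);  -- 2015-12-29
select date_sub('20160118', 20);    -- presumably NULL: input not 'yyyy-MM-dd'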


This command also fails with UnresolvedException: Invalid call to dataType on unresolved object, tree: 'ta.lbl_nm, but executes successfully in the Hive shell:
select case when ta.lbl_nm in ('item_XX_tab_buy03','item_XX_gmq_buy01') 
then 1 
else 0 
end as  defaultVaule
from spark.test_or1 ta 
where ta.statis_date <= '20160118' 
and ta.statis_date > regexp_replace(2016-01-18, '-', '')  limit 10 ;   
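
Note that 2016-01-18 is unquoted here, so both engines evaluate it as integer arithmetic (2016 - 1 - 18 = 1997) rather than as a date string; the analyzer error in Spark is the same either way, but the literal should probably be quoted:

-- unquoted, the 'date' is integer subtraction
select 2016-01-18;                             -- 1997
select regexp_replace('2016-01-18', '-', '');  -- '20160118' (quoted form, as intended)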


After changing `in` to equality tests with `=`, the command executes successfully in both the Hive shell and Spark SQL (note that `or`, not `and`, is the faithful equivalent of `in`; `and` over two different values of the same column can never be true, but it still shows that the analyzer error disappears):
select case when ta.lbl_nm='item_XX_tab_buy03' and ta.lbl_nm = 'item_XX_gmq_buy01' 
then 1 
else 0 
end as  defaultVaule
from spark.test_or1 ta 
where ta.statis_date <= '20160118' 
and ta.statis_date > regexp_replace(date_sub(from_unixtime(to_unix_timestamp('20160118', 'yyyyMMdd'), 'yyyy-MM-dd'), 20), '-', '')  limit 10 ;  


spark-sql> 
> 
> select case when ta.lbl_nm='item_XX_tab_buy03' and ta.lbl_nm = 'item_XX_gmq_buy01' 
> then 1 
> else 0 
> end as defaultVaule 
> from spark.test_or1 ta 
> where ta.statis_date <= '20160118' 
> and ta.statis_date > regexp_replace(date_sub(from_unixtime(to_unix_timestamp('20160118', 'yyyyMMdd'), 'yyyy-MM-dd'), 20), '-', '') limit 10 ; 
0 
0 
0 
0 
0 
0 
Time taken: 3.725 seconds, Fetched 6 row(s)


Detailed error log:
16/01/26 11:10:36 INFO ParseDriver: Parse Completed 
16/01/26 11:10:36 INFO HiveMetaStore: 0: get_table : db=spark tbl=test_or1 
16/01/26 11:10:36 INFO audit: ugi=spark ip=unknown-ip-addr cmd=get_table : db=spark tbl=test_or1 
16/01/26 11:10:37 ERROR SparkSQLDriver: Failed in [ select case when ta.lbl_nm in ('item_XX_tab_buy03','item_XX_gmq_buy01') 
then 1 
else 0 
end as defaultVaule 
from spark.test_or1 ta 
where ta.statis_date <= '20160118' 
and ta.statis_date > regexp_replace(date_sub(from_unixtime(to_unix_timestamp('20160118', 'yyyyMMdd'), 'yyyy-MM-dd'), 20), '-', '') limit 10 ] 
org.apache.spark.sql.catalyst.analysis.UnresolvedException: Invalid call to dataType on unresolved object, tree: 'ta.lbl_nm 
at org.apache.spark.sql.catalyst.analysis.UnresolvedAttribute.dataType(unresolved.scala:59) 
at org.apache.spark.sql.catalyst.analysis.HiveTypeCoercion$InConversion$$anonfun$apply$5$$anonfun$applyOrElse$15.apply(HiveTypeCoercion.scala:299) 
at org.apache.spark.sql.catalyst.analysis.HiveTypeCoercion$InConversion$$anonfun$apply$5$$anonfun$applyOrElse$15.apply(HiveTypeCoercion.scala:299) 
at scala.collection.LinearSeqOptimized$class.exists(LinearSeqOptimized.scala:80) 
at scala.collection.immutable.List.exists(List.scala:84) 
at org.apache.spark.sql.catalyst.analysis.HiveTypeCoercion$InConversion$$anonfun$apply$5.applyOrElse(HiveTypeCoercion.scala:299) 
at org.apache.spark.sql.catalyst.analysis.HiveTypeCoercion$InConversion$$anonfun$apply$5.applyOrElse(HiveTypeCoercion.scala:298) 
at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$3.apply(TreeNode.scala:222) 
at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$3.apply(TreeNode.scala:222) 
at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:51) 
at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:221) 
at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$4$$anonfun$apply$7.apply(TreeNode.scala:261) 
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244) 
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244) 
at scala.collection.immutable.List.foreach(List.scala:318) 
at scala.collection.TraversableLike$class.map(TraversableLike.scala:244) 
at scala.collection.AbstractTraversable.map(Traversable.scala:105) 
at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$4.apply(TreeNode.scala:259) 
at scala.collection.Iterator$$anon$11.next(Iterator.scala:328) 
at scala.collection.Iterator$class.foreach(Iterator.scala:727) 
at scala.collection.AbstractIterator.foreach(Iterator.scala:1157) 
at scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:48) 
at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:103) 
at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:47) 
at scala.collection.TraversableOnce$class.to(TraversableOnce.scala:273) 
at scala.collection.AbstractIterator.to(Iterator.scala:1157) 
at scala.collection.TraversableOnce$class.toBuffer(TraversableOnce.scala:265) 
at scala.collection.AbstractIterator.toBuffer(Iterator.scala:1157) 
at scala.collection.TraversableOnce$class.toArray(TraversableOnce.scala:252) 
at scala.collection.AbstractIterator.toArray(Iterator.scala:1157) 
at org.apache.spark.sql.catalyst.trees.TreeNode.transformChildrenDown(TreeNode.scala:272) 
at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:227) 
at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$4.apply(TreeNode.scala:242) 
at scala.collection.Iterator$$anon$11.next(Iterator.scala:328) 
at scala.collection.Iterator$class.foreach(Iterator.scala:727) 
at scala.collection.AbstractIterator.foreach(Iterator.scala:1157) 
at scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:48) 
at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:103) 
at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:47) 
at scala.collection.TraversableOnce$class.to(TraversableOnce.scala:273) 
at scala.collection.AbstractIterator.to(Iterator.scala:1157) 
at scala.collection.TraversableOnce$class.toBuffer(TraversableOnce.scala:265) 
at scala.collection.AbstractIterator.toBuffer(Iterator.scala:1157) 
at scala.collection.TraversableOnce$class.toArray(TraversableOnce.scala:252) 
at scala.collection.AbstractIterator.toArray(Iterator.scala:1157) 
at org.apache.spark.sql.catalyst.trees.TreeNode.transformChildrenDown(TreeNode.scala:272) 
at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:227) 
at org.apache.spark.sql.catalyst.plans.QueryPlan.org$apache$spark$sql$catalyst$plans$QueryPlan$$transformExpressionDown$1(QueryPlan.scala:75) 
at org.apache.spark.sql.catalyst.plans.QueryPlan$$anonfun$1$$anonfun$apply$1.apply(QueryPlan.scala:90) 
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244) 
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244) 
at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59) 
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47) 
at scala.collection.TraversableLike$class.map(TraversableLike.scala:244) 
at scala.collection.AbstractTraversable.map(Traversable.scala:105) 
at org.apache.spark.sql.catalyst.plans.QueryPlan$$anonfun$1.apply(QueryPlan.scala:89) 
at scala.collection.Iterator$$anon$11.next(Iterator.scala:328) 
at scala.collection.Iterator$class.foreach(Iterator.scala:727) 
at scala.collection.AbstractIterator.foreach(Iterator.scala:1157) 
at scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:48) 
at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:103) 
at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:47) 
at scala.collection.TraversableOnce$class.to(TraversableOnce.scala:273) 
at scala.collection.AbstractIterator.to(Iterator.scala:1157) 
at scala.collection.TraversableOnce$class.toBuffer(TraversableOnce.scala:265) 
at scala.collection.AbstractIterator.toBuffer(Iterator.scala:1157) 
at scala.collection.TraversableOnce$class.toArray(TraversableOnce.scala:252) 
at scala.collection.AbstractIterator.toArray(Iterator.scala:1157) 
at org.apache.spark.sql.catalyst.plans.QueryPlan.transformExpressionsDown(QueryPlan.scala:94) 
at org.apache.spark.sql.catalyst.plans.QueryPlan.transformExpressions(QueryPlan.scala:64) 
at org.apache.spark.sql.catalyst.plans.QueryPlan$$anonfun$transformAllExpressions$1.applyOrElse(QueryPlan.scala:136) 
at org.apache.spark.sql.catalyst.plans.QueryPlan$$anonfun$transformAllExpressions$1.applyOrElse(QueryPlan.scala:135) 
at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$3.apply(TreeNode.scala:222) 
at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$3.apply(TreeNode.scala:222) 
at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:51) 
at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:221) 
at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$4.apply(TreeNode.scala:242) 
at scala.collection.Iterator$$anon$11.next(Iterator.scala:328) 
at scala.collection.Iterator$class.foreach(Iterator.scala:727) 
at scala.collection.AbstractIterator.foreach(Iterator.scala:1157) 
at scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:48) 
at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:103) 
at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:47) 
at scala.collection.TraversableOnce$class.to(TraversableOnce.scala:273) 
at scala.collection.AbstractIterator.to(Iterator.scala:1157) 
at scala.collection.TraversableOnce$class.toBuffer(TraversableOnce.scala:265) 
at scala.collection.AbstractIterator.toBuffer(Iterator.scala:1157) 
at scala.collection.TraversableOnce$class.toArray(TraversableOnce.scala:252) 
at scala.collection.AbstractIterator.toArray(Iterator.scala:1157) 
at org.apache.spark.sql.catalyst.trees.TreeNode.transformChildrenDown(TreeNode.scala:272) 
at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:227) 
at org.apache.spark.sql.catalyst.trees.TreeNode.transform(TreeNode.scala:212) 
at org.apache.spark.sql.catalyst.plans.QueryPlan.transformAllExpressions(QueryPlan.scala:135) 
at org.apache.spark.sql.catalyst.analysis.HiveTypeCoercion$InConversion$.apply(HiveTypeCoercion.scala:298) 
at org.apache.spark.sql.catalyst.analysis.HiveTypeCoercion$InConversion$.apply(HiveTypeCoercion.scala:297) 
at org.apache.spark.sql.catalyst.rules.RuleExecutor$$anonfun$execute$1$$anonfun$apply$1.apply(RuleExecutor.scala:61) 
at org.apache.spark.sql.catalyst.rules.RuleExecutor$$anonfun$execute$1$$anonfun$apply$1.apply(RuleExecutor.scala:59) 
at scala.collection.LinearSeqOptimized$class.foldLeft(LinearSeqOptimized.scala:111) 
at scala.collection.immutable.List.foldLeft(List.scala:84) 
at org.apache.spark.sql.catalyst.rules.RuleExecutor$$anonfun$execute$1.apply(RuleExecutor.scala:59) 
at org.apache.spark.sql.catalyst.rules.RuleExecutor$$anonfun$execute$1.apply(RuleExecutor.scala:51) 
at scala.collection.immutable.List.foreach(List.scala:318) 
at org.apache.spark.sql.catalyst.rules.RuleExecutor.execute(RuleExecutor.scala:51) 
at org.apache.spark.sql.SQLContext$QueryExecution.analyzed$lzycompute(SQLContext.scala:922) 
at org.apache.spark.sql.SQLContext$QueryExecution.analyzed(SQLContext.scala:922) 
at org.apache.spark.sql.SQLContext$QueryExecution.assertAnalyzed(SQLContext.scala:920) 
at org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:131) 
at org.apache.spark.sql.DataFrame$.apply(DataFrame.scala:51) 
at org.apache.spark.sql.SQLContext.sql(SQLContext.scala:744) 
at org.apache.spark.sql.hive.thriftserver.AbstractSparkSQLDriver.run(AbstractSparkSQLDriver.scala:57) 
at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.processCmd(SparkSQLCLIDriver.scala:283) 
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:423) 
at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver$.main(SparkSQLCLIDriver.scala:218) 
at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.main(SparkSQLCLIDriver.scala) 
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) 
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
at java.lang.reflect.Method.invoke(Method.java:606) 
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:666) 
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:169) 
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:194) 
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:111) 
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) 



Execution log in the Hive shell:
[spark@master1-dev ~]$ hive 

Logging initialized using configuration in file:/log/bigdata/software/apache-hive-0.13.1.3-bin/conf/hive-log4j.properties 
hive> 
> select case when ta.lbl_nm in ('item_XX_tab_buy03','item_XX_gmq_buy01') 
> then 1 
> else 0 
> end as defaultVaule 
> from spark.test_or1 ta 
> where ta.statis_date <= '20160118' 
> and ta.statis_date > regexp_replace(date_sub(from_unixtime(to_unix_timestamp('20160118', 'yyyyMMdd'), 'yyyy-MM-dd'), 20), '-', '') limit 10 ; 
OK 
0 
0 
0 
0 
0 
0 
Time taken: 2.337 seconds, Fetched: 6 row(s)