Posted to dev@spark.apache.org by 田毅 <ti...@asiainfo.com> on 2014/08/01 11:00:19 UTC

How to run specific sparkSQL test with maven

Hi everyone!

Could anyone tell me how to run a specific sparkSQL test with maven?

For example:

I want to test HiveCompatibilitySuite.

I ran “mvn test -Dtest=HiveCompatibilitySuite”

It did not work. 

BTW, is there any information about how to set up a test environment for sparkSQL?

I got this error when I ran the test.

It seems that HiveCompatibilitySuite needs a Hadoop and Hive environment, am I right?

"Relative path in absolute URI: file:$%7Bsystem:test.tmp.dir%7D/tmp_showcrt1"






Re: How to run specific sparkSQL test with maven

Posted by Cheng Lian <li...@gmail.com>.
It’s also useful to set hive.exec.mode.local.auto to true to accelerate the
test.
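
For example, one way this could be set is through a hive-site.xml on the test classpath (a sketch; the exact file location depends on your setup):

```xml
<!-- hive-site.xml snippet: let Hive auto-switch small test queries to local mode -->
<property>
  <name>hive.exec.mode.local.auto</name>
  <value>true</value>
</property>
```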


On Sat, Aug 2, 2014 at 1:36 AM, Michael Armbrust <mi...@databricks.com>
wrote:

> >
> > It seems that HiveCompatibilitySuite needs a Hadoop and Hive
> > environment, am I right?
> >
> > "Relative path in absolute URI:
> > file:$%7Bsystem:test.tmp.dir%7D/tmp_showcrt1”
> >
>
> You should only need Hadoop and Hive if you are creating new tests that we
> need to compute the answers for.  Existing tests are run with cached
> answers.  There are details about the configuration here:
> https://github.com/apache/spark/tree/master/sql
>

Re: How to run specific sparkSQL test with maven

Posted by Michael Armbrust <mi...@databricks.com>.
>
> It seems that HiveCompatibilitySuite needs a Hadoop and Hive
> environment, am I right?
>
> "Relative path in absolute URI:
> file:$%7Bsystem:test.tmp.dir%7D/tmp_showcrt1”
>

You should only need Hadoop and Hive if you are creating new tests that we
need to compute the answers for.  Existing tests are run with cached
answers.  There are details about the configuration here:
https://github.com/apache/spark/tree/master/sql

Re: Re: How to run specific sparkSQL test with maven

Posted by Jeremy Freeman <fr...@gmail.com>.
With maven you can run a particular test suite like this:

mvn -DwildcardSuites=org.apache.spark.sql.SQLQuerySuite test

see the note here (under "Spark Tests in Maven"):

http://spark.apache.org/docs/latest/building-with-maven.html
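
Applying this to the suite from the original question, something along these lines should work (the -Phive profile and the fully qualified suite name are assumptions based on the Spark source layout at the time):

```shell
# Build once with the Hive profile enabled, skipping tests.
mvn -Phive -DskipTests clean package

# Run only the named suite; -DwildcardSuites takes a fully
# qualified class name or a wildcard pattern.
mvn -Phive -DwildcardSuites=org.apache.spark.sql.hive.execution.HiveCompatibilitySuite test
```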




Re: How to run specific sparkSQL test with maven

Posted by witgo <wi...@qq.com>.
You can try these commands:

./sbt/sbt assembly
./sbt/sbt "test-only *.HiveCompatibilitySuite" -Phive






------------------ Original ------------------
From:  "田毅";<ti...@asiainfo.com>;
Date:  Fri, Aug 1, 2014 05:00 PM
To:  "dev"<de...@spark.apache.org>; 

Subject:  How to run specific sparkSQL test with maven



Hi everyone!

Could anyone tell me how to run a specific sparkSQL test with maven?

For example:

I want to test HiveCompatibilitySuite.

I ran “mvn test -Dtest=HiveCompatibilitySuite”

It did not work. 

BTW, is there any information about how to set up a test environment for sparkSQL?

I got this error when I ran the test.

It seems that HiveCompatibilitySuite needs a Hadoop and Hive environment, am I right?

"Relative path in absolute URI: file:$%7Bsystem:test.tmp.dir%7D/tmp_showcrt1"