Posted to dev@carbondata.apache.org by Mars Xu <xu...@gmail.com> on 2017/02/04 02:42:36 UTC

store location can't be found

Hello All,
 	I am hitting a "file does not exist" error; it looks like the store location can't be found. I have already set carbon.store.location=hdfs://localhost:9000/carbon/store in $SPARK_HOME/conf/carbon.properties, but when I start spark-shell with the following command and run a few commands, I get the error below:
spark-shell --master spark://localhost:7077 --jars ~/carbonlib/carbondata_2.11-1.0.0-incubating-shade-hadoop2.7.2.jar --conf spark.carbon.storepath=hdfs://localhost:9000/carbon/store

scala> import org.apache.spark.sql.SparkSession
scala> import org.apache.spark.sql.CarbonSession._
scala> val carbon = SparkSession.builder().config(sc.getConf).getOrCreateCarbonSession()
scala> carbon.sql("CREATE TABLE IF NOT EXISTS test_table(id string, name string, city string, age Int) STORED BY 'carbondata’")
scala> carbon.sql("load data inpath 'hdfs://localhost:9000/resources/sample.csv' into table test_table”)

scala> carbon.sql("select * from test_table").show()
java.io.FileNotFoundException: File /private/var/carbon.store/default/test_table/Fact/Part0/Segment_0 does not exist.
  at org.apache.hadoop.hdfs.DistributedFileSystem$DirListingIterator.<init>(DistributedFileSystem.java:948)
  at org.apache.hadoop.hdfs.DistributedFileSystem$DirListingIterator.<init>(DistributedFileSystem.java:927)

My CarbonData version is 1.0 and my Spark version is 2.1.

Re: store location can't be found

Posted by Mars Xu <xu...@gmail.com>.
Hi Liang,
  
	As Ravindra suggested, I created the CarbonSession with the store path as follows:
   val carbon = SparkSession.builder().config(sc.getConf).getOrCreateCarbonSession("hdfs://localhost:9000/carbon/store")
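
With that, the earlier query should be resolved against the HDFS store rather than the local default path, e.g.:

scala> // segments should now be read from hdfs://localhost:9000/carbon/store
scala> carbon.sql("select * from test_table").show()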





Re: store location can't be found

Posted by Liang Chen <ch...@gmail.com>.
Hi Mars

Can you share how you solved this issue?

Regards
Liang

2017-02-04 15:54 GMT+08:00 Mars Xu <xu...@gmail.com>:

> Hello Ravindra,
>
>         I have solved this problem. Thanks.
>


-- 
Regards
Liang

Re: store location can't be found

Posted by Mars Xu <xu...@gmail.com>.
Hello Ravindra,

     	I have solved this problem. Thanks.
  


Re: store location can't be found

Posted by Mars Xu <xu...@gmail.com>.
Hi Ravindra,

      I followed your suggestion and created the CarbonSession with the store path, but when I load data it throws another error, as follows:

scala> carbon.sql("LOAD DATA INPATH 'hdfs://localhost:9000/resources/sample.csv' INTO TABLE test_table")
org.apache.spark.sql.AnalysisException: LOAD DATA is not supported for datasource tables: `default`.`test_table`;
  at org.apache.spark.sql.execution.command.LoadDataCommand.run(tables.scala:194)
  at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:58)

Do you know what is wrong here?
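
One thing I may try, since the stack trace shows Spark's generic LoadDataCommand handling the statement (i.e. test_table is being treated as a plain datasource table rather than a Carbon table), is to drop and recreate the table through the CarbonSession and load again:

scala> // drop the stale definition and recreate it as a Carbon table
scala> carbon.sql("DROP TABLE IF EXISTS test_table")
scala> carbon.sql("CREATE TABLE IF NOT EXISTS test_table(id string, name string, city string, age Int) STORED BY 'carbondata'")
scala> // the LOAD DATA statement should now be routed to the Carbon loader
scala> carbon.sql("LOAD DATA INPATH 'hdfs://localhost:9000/resources/sample.csv' INTO TABLE test_table")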



Re: store location can't be found

Posted by Ravindra Pesala <ra...@gmail.com>.
Hi Mars,

Please try creating the CarbonSession with the store path as follows.

val carbon = SparkSession.builder().config(sc.getConf).getOrCreateCarbonSession("hdfs://localhost:9000/carbon/store")
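
For completeness, the whole spark-shell sequence with the store path passed explicitly (same paths and table name as in your mail) would look roughly like this:

scala> import org.apache.spark.sql.SparkSession
scala> import org.apache.spark.sql.CarbonSession._
scala> // pass the HDFS store path explicitly instead of relying on carbon.properties
scala> val carbon = SparkSession.builder().config(sc.getConf).getOrCreateCarbonSession("hdfs://localhost:9000/carbon/store")
scala> carbon.sql("CREATE TABLE IF NOT EXISTS test_table(id string, name string, city string, age Int) STORED BY 'carbondata'")
scala> carbon.sql("LOAD DATA INPATH 'hdfs://localhost:9000/resources/sample.csv' INTO TABLE test_table")
scala> carbon.sql("SELECT * FROM test_table").show()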


Regards,
Ravindra.





-- 
Thanks & Regards,
Ravi

Re: store location can't be found

Posted by Liang Chen <ch...@gmail.com>.
Hi

Have you configured it as per the guide:
https://github.com/apache/incubator-carbondata/blob/master/docs/installation-guide.md
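
For a local HDFS setup like yours, the store location entry in $SPARK_HOME/conf/carbon.properties would look roughly like this (property name as given in that guide; please double-check it against your version):

# HDFS location where CarbonData creates its store
carbon.storelocation=hdfs://localhost:9000/carbon/store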

Regards
Liang





-- 
Regards
Liang