Posted to dev@carbondata.apache.org by "sunerhan1992@sina.com" <su...@sina.com> on 2018/04/03 12:26:20 UTC
refresh table [CARBON-1.3.1]
hello,
I have a table that was created and loaded under carbon1.3.0, and I'm upgrading to carbon1.3.1 using refresh table.
Here are my steps:
1. copy the old table's hdfs location to a new directory: /user/xx/prod_inst_cab-->/user/xx/prod_inst_cab_backup
2. hive -e "drop table xx.prod_inst_cab"
3. cc.sql("drop table xx.prod_inst_cab")
(PS: the CarbonSession was initialized with the '/user/xx' location, like val cc = SparkSession.builder().config(sc.getConf).getOrCreateCarbonSession("hdfs://ns1/user/xx"))
4. hdfs dfs -rmr -skipTrash /user/xx/prod_inst_cab and copy /user/xx/prod_inst_cab_backup --> /user/xx/prod_inst_cab
5. cc.sql("refresh table xx.prod_inst_cab").show as well as cc.sql("refresh table xx.prod_inst_cab_backup").show
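The whole sequence above can be sketched as follows (a hedged sketch only, assuming a spark-shell with CarbonData 1.3.1 on the classpath; the paths and the database name xx are taken from this thread, and the hdfs/hive steps are shown as comments since they run outside Spark):

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.CarbonSession._  // provides getOrCreateCarbonSession

// Step 1 (outside Spark): hdfs dfs -cp /user/xx/prod_inst_cab /user/xx/prod_inst_cab_backup
// Step 2 (outside Spark): hive -e "drop table xx.prod_inst_cab"

// Step 3: session created with '/user/xx' as the store path, as in the original post
val cc = SparkSession.builder()
  .config(sc.getConf)
  .getOrCreateCarbonSession("hdfs://ns1/user/xx")

cc.sql("drop table if exists xx.prod_inst_cab")

// Step 4 (outside Spark): hdfs dfs -rmr -skipTrash /user/xx/prod_inst_cab
//                         hdfs dfs -cp /user/xx/prod_inst_cab_backup /user/xx/prod_inst_cab

// Step 5: register the on-disk table data with the new Carbon version
cc.sql("refresh table xx.prod_inst_cab").show()
cc.sql("refresh table xx.prod_inst_cab_backup").show()
```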
When I then run:
1. cc.sql("select count(1) from xx.prod_inst_cab").show — the result is 0
2. cc.sql("select count(1) from xx.prod_inst_cab_backup").show — the count is right, but every column of every record is null
Am I missing some detail?
sunerhan1992@sina.com
Re: refresh table [CARBON-1.3.1]
Posted by Mic Sun <su...@sina.com>.
hello mohdshahidkhan,
Sorry for the writing mistake.
Step 3: xx is the database, and what I actually did was
val cc = SparkSession.builder().config(sc.getConf).getOrCreateCarbonSession("hdfs://ns1/user")
Step 4: the two table data directories are:
/user/xx/prod_inst_cab_backup
/user/xx/prod_inst_cab
-----
FFCS Research Institute (FFCS研究院)
--
Sent from: http://apache-carbondata-dev-mailing-list-archive.1130556.n5.nabble.com/
Re: refresh table [CARBON-1.3.1]
Posted by Mohammad Shahid Khan <mo...@gmail.com>.
Hi Sunerhan,
If xx is the database, then step 3 requires a minor change:
val cc = SparkSession.builder().config(sc.getConf).getOrCreateCarbonSession("hdfs://ns1/user/")
Step 4:
The backup table data should be copied to the database location.
For more details, please refer below.
https://github.com/apache/carbondata/blob/master/docs/data-management-on-carbondata.md#refresh-table
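In other words, the store path passed to getOrCreateCarbonSession must be the parent of the database directory, so that Carbon resolves a table as <store>/<database>/<table>. A hedged sketch of the corrected setup (assuming spark-shell with CarbonData 1.3.1; paths and names are from this thread):

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.CarbonSession._  // provides getOrCreateCarbonSession

// Store path is /user, NOT /user/xx: with the store at /user, Carbon resolves
// xx.prod_inst_cab to hdfs://ns1/user/xx/prod_inst_cab, which is where the
// backup data must be copied before refreshing.
val cc = SparkSession.builder()
  .config(sc.getConf)
  .getOrCreateCarbonSession("hdfs://ns1/user/")

cc.sql("refresh table xx.prod_inst_cab").show()
```

With the original store path of hdfs://ns1/user/xx, the table xx.prod_inst_cab would instead be looked up under /user/xx/xx/prod_inst_cab, which does not exist — consistent with the empty count the original post reports.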
Regards,
Mohammad Shahid Khan