Posted to user@hive.apache.org by prakash sejwani <pr...@gmail.com> on 2010/02/28 06:46:42 UTC

Error in metadata while creating table in hive

I am using Hive with Hadoop 0.20.0.

When I create a table in Hive I get the error below; here is the log:
$HIVE_HOME/bin/hive -hiveconf hive.root.logger=INFO,console
Hive history file=/tmp/prakash/hive_job_
log_prakash_201002271718_1967794083.txt
10/02/27 17:18:34 INFO exec.HiveHistory: Hive history
file=/tmp/prakash/hive_job_log_prakash_201002271718_1967794083.txt
hive>  CREATE TABLE nginx_ref (foo INT, bar STRING);
10/02/27 17:18:37 INFO parse.ParseDriver: Parsing command:  CREATE TABLE
nginx_ref (foo INT, bar STRING)
10/02/27 17:18:37 INFO parse.ParseDriver: Parse Completed
10/02/27 17:18:37 INFO parse.SemanticAnalyzer: Starting Semantic Analysis
10/02/27 17:18:37 INFO parse.SemanticAnalyzer: Creating table nginx_ref
position=14
10/02/27 17:18:37 INFO ql.Driver: Semantic Analysis Completed
10/02/27 17:18:37 INFO ql.Driver: Returning Hive schema:
Schema(fieldSchemas:null, properties:null)
10/02/27 17:18:37 INFO ql.Driver: query plan =
file:/tmp/prakash/hive_2010-02-27_17-18-37_564_4339795143228998991/queryplan.xml
10/02/27 17:18:38 INFO ql.Driver: Starting command:  CREATE TABLE nginx_ref
(foo INT, bar STRING)
10/02/27 17:18:38 INFO exec.DDLTask: Default to LazySimpleSerDe for table
nginx_ref
10/02/27 17:18:38 INFO hive.log: DDL: struct nginx_ref { i32 foo, string
bar}
10/02/27 17:18:38 INFO metastore.HiveMetaStore: 0: Opening raw store with
implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
10/02/27 17:18:38 INFO metastore.ObjectStore: ObjectStore, initialize called
10/02/27 17:18:38 ERROR DataNucleus.Plugin: Bundle "org.eclipse.jdt.core"
requires "org.eclipse.core.resources" but it cannot be resolved.
10/02/27 17:18:38 ERROR DataNucleus.Plugin: Bundle "org.eclipse.jdt.core"
requires "org.eclipse.core.runtime" but it cannot be resolved.
10/02/27 17:18:38 ERROR DataNucleus.Plugin: Bundle "org.eclipse.jdt.core"
requires "org.eclipse.text" but it cannot be resolved.
10/02/27 17:18:42 INFO metastore.ObjectStore: Initialized ObjectStore
10/02/27 17:18:42 INFO metastore.HiveMetaStore: 0: create_table: db=default
tbl=nginx_ref
10/02/27 17:18:42 INFO metastore.HiveMetaStore: 0: get_table : db=default
tbl=nginx_ref
10/02/27 17:18:42 ERROR hive.log: Got exception:
java.io.FileNotFoundException File file:/user/hive/warehouse/nginx_ref does
not exist.
10/02/27 17:18:42 ERROR hive.log: java.io.FileNotFoundException: File
file:/user/hive/warehouse/nginx_ref does not exist.
    at
org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:372)
    at
org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:251)
    at org.apache.hadoop.hive.metastore.Warehouse.mkdirs(Warehouse.java:124)
    at
org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_table(HiveMetaStore.java:340)
    at
org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:265)
    at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:320)
    at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:1391)
    at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:123)
    at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:107)
    at
org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:55)
    at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:630)
    at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:504)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:382)
    at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:138)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:197)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:303)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:156)

FAILED: Error in metadata: MetaException(message:Got exception:
java.io.FileNotFoundException File file:/user/hive/warehouse/nginx_ref does
not exist.)
10/02/27 17:18:42 ERROR exec.DDLTask: FAILED: Error in metadata:
MetaException(message:Got exception: java.io.FileNotFoundException File
file:/user/hive/warehouse/nginx_ref does not exist.)
org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Got
exception: java.io.FileNotFoundException File
file:/user/hive/warehouse/nginx_ref does not exist.)
    at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:326)
    at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:1391)
    at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:123)
    at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:107)
    at
org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:55)
    at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:630)
    at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:504)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:382)
    at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:138)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:197)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:303)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
Caused by: MetaException(message:Got exception:
java.io.FileNotFoundException File file:/user/hive/warehouse/nginx_ref does
not exist.)
    at
org.apache.hadoop.hive.metastore.MetaStoreUtils.logAndThrowMetaException(MetaStoreUtils.java:746)
    at org.apache.hadoop.hive.metastore.Warehouse.mkdirs(Warehouse.java:126)
    at
org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_table(HiveMetaStore.java:340)
    at
org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:265)
    at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:320)
    ... 15 more

FAILED: Execution Error, return code 1 from
org.apache.hadoop.hive.ql.exec.DDLTask
10/02/27 17:18:42 ERROR ql.Driver: FAILED: Execution Error, return code 1
from org.apache.hadoop.hive.ql.exec.DDLTask
hive> version
    > ;
10/02/27 17:29:44 INFO parse.ParseDriver: Parsing command: version

FAILED: Parse Error: line 1:0 cannot recognize input 'version'

10/02/27 17:29:44 ERROR ql.Driver: FAILED: Parse Error: line 1:0 cannot
recognize input 'version'

org.apache.hadoop.hive.ql.parse.ParseException: line 1:0 cannot recognize
input 'version'

    at
org.apache.hadoop.hive.ql.parse.ParseDriver.parse(ParseDriver.java:401)
    at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:299)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:377)
    at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:138)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:197)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:303)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:156)


Kindly help with this; I am using the Ubuntu 9.10 (Karmic) OS.

Thanks,
prakash

Re: Error in metadata while creating table in hive

Posted by prakash sejwani <pr...@gmail.com>.
OK, it's working now. I configured MySQL as the metastore database in
hive-site.xml and also set hive.metastore.warehouse.dir there, roughly as
sketched below.
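
(A minimal sketch of the hive-site.xml properties I mean; the JDBC URL,
database name, user and password are placeholders for my own setup, not
defaults.)

  <!-- MySQL-backed metastore -->
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://localhost:3306/hive_metastore?createDatabaseIfNotExist=true</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.jdbc.Driver</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>hiveuser</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>hivepass</value>
  </property>
  <!-- Warehouse location on the default filesystem -->
  <property>
    <name>hive.metastore.warehouse.dir</name>
    <value>/user/hive/warehouse</value>
  </property>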

But when I run this query:

FROM (
  SELECT h.*,
    p.title AS product_sku, p.description AS product_name,
    c.name AS company_name,
    c2.id AS product_company_id,
    c2.name AS product_company_name
  FROM (
      -- Pull from the access_log
      SELECT ip, ident, user,
        -- Reformat the time from the access log
        from_unixtime(cast(unix_timestamp(time, "dd/MMM/yyyy:hh:mm:ss Z") AS INT)) AS time,
        method, resource, protocol, status, length, referer, agent,
        -- Extract the product_id for the hit from the URL
        cast(regexp_extract(resource, '/products/(\\d+)', 1) AS INT) AS product_id,
        -- Extract the company_id for the hit from the URL
        cast(regexp_extract(resource, '/companies/(\\d+)', 1) AS INT) AS company_id,
        -- Run our User Defined Function (see src/com/econify/geoip/IpToCountry.java).  Takes the IP of the hit and looks up its country
        -- ip_to_country(ip) AS ip_country
      FROM access_log
    ) h
    -- Join each hit with its product or company (if it has one)
    LEFT OUTER JOIN products p ON (h.product_id = p.id)
    LEFT OUTER JOIN companies c ON (h.company_id = c.id)
    -- If the hit was for a product, we probably didn't get the company_id in the hit subquery,
    -- so join products.company_id with another instance of the companies table
    LEFT OUTER JOIN companies c2 ON (p.company_id = c2.id)
    -- Filter out all hits that weren't for a company or a product
    WHERE h.product_id IS NOT NULL OR h.company_id IS NOT NULL
) hit
-- Insert the hit data into a separate product_hits table
INSERT OVERWRITE TABLE product_hits
  SELECT ip, ident, user, time,
    method, resource, protocol, status,
    length, referer, agent,
    product_id,
    product_company_id AS company_id,
    ip_country,
    product_name,
    product_company_name AS company_name
  WHERE product_name IS NOT NULL
-- Insert the hit data into a separate company_hits table
INSERT OVERWRITE TABLE company_hits
  SELECT ip, ident, user, time,
    method, resource, protocol, status,
    length, referer, agent,
    company_id,
    ip_country,
    company_name
  WHERE company_name IS NOT NULL;

FAILED: Parse Error: line 19:6 cannot recognize input 'FROM' in select
expression
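
Could the problem simply be the trailing comma that is left at the end of the
select list before FROM access_log (line 19) after I commented out the
ip_to_country(ip) AS ip_country column? An untested sketch of what I mean,
just to illustrate the dangling comma:

  -- I would expect this to fail the same way: the select list ends with a
  -- comma, so the parser chokes when it reaches FROM
  SELECT ip, ident,
    -- ip_to_country(ip) AS ip_country
  FROM access_log;

  -- Parses once the dangling comma is removed
  SELECT ip, ident
  FROM access_log;

(The INSERT ... SELECT clauses still reference ip_country, so I suppose I will
have to put that column back or drop it there as well.)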

Thanks,
prakash

Re: Error in metadata while creating table in hive

Posted by Zheng Shao <zs...@gmail.com>.
Did you set hive.metastore.warehouse.dir in hive-site.xml?
See http://wiki.apache.org/hadoop/Hive/GettingStarted
Look for http://wiki.apache.org/hadoop/Hive/GettingStarted#Running_Hive
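
In particular, the FileNotFoundException in your log usually means the
warehouse directory does not exist and could not be created (for example
because the parent directories or their permissions are missing). The
GettingStarted page has you create it (and /tmp) up front, roughly:

  $HADOOP_HOME/bin/hadoop fs -mkdir /tmp
  $HADOOP_HOME/bin/hadoop fs -mkdir /user/hive/warehouse
  $HADOOP_HOME/bin/hadoop fs -chmod g+w /tmp
  $HADOOP_HOME/bin/hadoop fs -chmod g+w /user/hive/warehouse

Also note the failing path is file:/user/hive/warehouse, so Hive is resolving
the warehouse against the local filesystem; if you expected HDFS, check
fs.default.name in your Hadoop configuration as well.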

Zheng

On Sat, Feb 27, 2010 at 9:46 PM, prakash sejwani
<pr...@gmail.com> wrote:
> I am using Hive with Hadoop 0.20.0.
>
> When I create a table in Hive I get the error below; here is the log:
> [full log snipped; see the original message above]



-- 
Yours,
Zheng