Posted to commits@hudi.apache.org by GitBox <gi...@apache.org> on 2022/05/26 10:09:51 UTC
[GitHub] [hudi] yanmushi opened a new issue, #5691: [SUPPORT]NoSuchObjectException: Failed to write data to Hive
yanmushi opened a new issue, #5691:
URL: https://github.com/apache/hudi/issues/5691
**Describe the problem you faced**
After compiling hudi-flink-bundle, I created a YARN session and tried to write data to Hive. A NoSuchObjectException shows up in the dashboard log panel.
**To Reproduce**
Steps to reproduce the behavior:
1. I use HDP 3.1.5, whose Hive is 3.1.0, so I updated the pom.xml of hudi-flink-bundle and set the hive.version property to 3.1.0, then ran 'mvn clean install -DskipTests -Dscala.version=2.12.10 -Dflink.version=1.13.6 -Dscala.binary.version=2.12 -Pflink-bundle-shade-hive3'
2. Create a YARN session and execute the SQL below
3. Go to the dashboard; the NoSuchObjectException appears in the log panel
```sql
-- SQL
create table tbl_ts8(
uuid VARCHAR(20),
name VARCHAR(10),
age INT,
ts TIMESTAMP(3),
`partition` VARCHAR(20),
primary key(uuid) NOT ENFORCED
)
with (
'connector' = 'hudi',
'path' = 'hdfs://10.12.4.2:8020/tmp/hudi/tbl_ts8',
'table.type' = 'COPY_ON_WRITE', -- If MERGE_ON_READ, Hive queries have no output until the Parquet files are generated
'hive_sync.enable' = 'true', -- Required. Enables Hive synchronization
'hive_sync.mode' = 'jdbc', -- Required. Hive sync mode; 'jdbc' is the default, 'hms' is also supported
'hive_sync.metastore.uris' = 'thrift://dl003:9083', -- Required. The port is set in hive-site.xml
'hive_sync.table'='tbl_ts7', -- Required. Hive table name
'hive_sync.db'='default', -- Required. Hive database name
'hive_sync.jdbc_url'='jdbc:hive2://dl003:10000', -- Required. HiveServer2 JDBC URL
'hive_sync.username'='hive', -- Required. JDBC username
'hive_sync.password'='123456' -- Required. JDBC password
);
INSERT INTO tbl_ts8 VALUES
('id1','Danny',23,TIMESTAMP '1970-01-01 00:00:01','par1'),
('id2','Stephen',33,TIMESTAMP '1970-01-01 00:00:02','par1'),
('id3','Julian',53,TIMESTAMP '1970-01-01 00:00:03','par2'),
('id4','Fabian',31,TIMESTAMP '1970-01-01 00:00:04','par2'),
('id5','Sophia',18,TIMESTAMP '1970-01-01 00:00:05','par3'),
('id6','Emma',20,TIMESTAMP '1970-01-01 00:00:06','par3'),
('id7','Bob',44,TIMESTAMP '1970-01-01 00:00:07','par4'),
('id8','Han',56,TIMESTAMP '1970-01-01 00:00:08','par4');
```
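(Not part of the original report.) The comment on hive_sync.mode above notes that 'hms' is also supported; for comparison, a minimal hms-mode sketch of the sync options, reusing the host and names from the table definition above, would drop the JDBC settings entirely:

```sql
-- Hypothetical hms-mode variant of the sync options above: synchronization
-- talks to the Hive metastore directly, so the jdbc_url/username/password
-- entries are not needed. Host and names reuse the values reported above.
'hive_sync.enable' = 'true',
'hive_sync.mode' = 'hms',
'hive_sync.metastore.uris' = 'thrift://dl003:9083',
'hive_sync.table' = 'tbl_ts8',
'hive_sync.db' = 'default'
```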
**Expected behavior**
The job runs and the table syncs to Hive without errors.
**Environment Description**
* Hudi version : 0.10.1
* Spark version :
* Hive version : 3.1.0.3.1.5.6091-7
* Hadoop version : 3.1.1.3.1.5.6091-7
* Storage (HDFS/S3/GCS..) : HDFS
* Running on Docker? (yes/no) : no
**Stacktrace**
```
2022-05-26 11:10:56,615 INFO org.apache.hudi.sink.StreamWriteOperatorCoordinator [] - Commit instant [20220526111054043] success!
2022-05-26 11:10:56,616 INFO org.apache.hudi.sink.StreamWriteOperatorCoordinator [] - Executor executes action [handle write metadata event for instant 20220526111054043] success!
2022-05-26 11:10:56,648 ERROR org.apache.hudi.sink.StreamWriteOperatorCoordinator [] - Executor executes action [sync hive metadata for instant 20220526111054043] error
java.lang.NoClassDefFoundError: org/apache/hudi/org/apache/hadoop/hive/metastore/api/NoSuchObjectException
at org.apache.hudi.hive.HiveSyncTool.<init>(HiveSyncTool.java:78) ~[hudi-flink-bundle_2.12-0.10.1.jar:0.10.1]
at org.apache.hudi.sink.utils.HiveSyncContext.hiveSyncTool(HiveSyncContext.java:51) ~[hudi-flink-bundle_2.12-0.10.1.jar:0.10.1]
at org.apache.hudi.sink.StreamWriteOperatorCoordinator.syncHive(StreamWriteOperatorCoordinator.java:302) ~[hudi-flink-bundle_2.12-0.10.1.jar:0.10.1]
at org.apache.hudi.sink.utils.NonThrownExecutor.lambda$execute$0(NonThrownExecutor.java:93) ~[hudi-flink-bundle_2.12-0.10.1.jar:0.10.1]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_221]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_221]
at java.lang.Thread.run(Thread.java:748) [?:1.8.0_221]
Caused by: java.lang.ClassNotFoundException: org.apache.hudi.org.apache.hadoop.hive.metastore.api.NoSuchObjectException
at java.net.URLClassLoader.findClass(URLClassLoader.java:382) ~[?:1.8.0_221]
at java.lang.ClassLoader.loadClass(ClassLoader.java:424) ~[?:1.8.0_221]
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349) ~[?:1.8.0_221]
at java.lang.ClassLoader.loadClass(ClassLoader.java:357) ~[?:1.8.0_221]
... 7 more
2022-05-26 11:10:56,657 INFO org.apache.hudi.client.AbstractHoodieClient [] - Stopping Timeline service !!
2022-05-26 11:10:56,657 INFO org.apache.hudi.client.embedded.EmbeddedTimelineService [] - Closing Timeline server
2022-05-26 11:10:56,657 INFO org.apache.hudi.timeline.service.TimelineService [] - Closing Timeline Service
```
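Worth noting when reading the stack trace: the missing class sits under the org.apache.hudi.org.apache.hadoop... prefix, i.e. it is the relocated (shaded) copy of the Hive metastore API class that the flink-bundle-shade-hive3 profile is supposed to package. A quick check (a sketch; the jar name is taken from the classpath entries above) is to list the jar contents:

```shell
# Sketch: look for the relocated Hive metastore classes inside the bundle.
# No output means the jar on the Flink classpath was built without the
# hive3 shading profile, or an older bundle jar is shadowing the new one.
jar tf hudi-flink-bundle_2.12-0.10.1.jar \
  | grep 'org/apache/hudi/org/apache/hadoop/hive/metastore/api/NoSuchObjectException'
```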
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: commits-unsubscribe@hudi.apache.org
For queries about this service, please contact Infrastructure at:
users@infra.apache.org
[GitHub] [hudi] codope commented on issue #5691: [SUPPORT]NoSuchObjectException: Failed to write data to Hive
Posted by GitBox <gi...@apache.org>.
codope commented on issue #5691:
URL: https://github.com/apache/hudi/issues/5691#issuecomment-1160577074
@yanmushi let us know about the question below:
> Did you use the bundle jar with hive profile packaged ?
[GitHub] [hudi] nsivabalan closed issue #5691: [SUPPORT]NoSuchObjectException: Failed to write data to Hive
Posted by GitBox <gi...@apache.org>.
nsivabalan closed issue #5691: [SUPPORT]NoSuchObjectException: Failed to write data to Hive
URL: https://github.com/apache/hudi/issues/5691
[GitHub] [hudi] danny0405 commented on issue #5691: [SUPPORT]NoSuchObjectException: Failed to write data to Hive
Posted by GitBox <gi...@apache.org>.
danny0405 commented on issue #5691:
URL: https://github.com/apache/hudi/issues/5691#issuecomment-1149375909
Did you use the bundle jar packaged with the hive profile?
[GitHub] [hudi] nsivabalan commented on issue #5691: [SUPPORT]NoSuchObjectException: Failed to write data to Hive
Posted by GitBox <gi...@apache.org>.
nsivabalan commented on issue #5691:
URL: https://github.com/apache/hudi/issues/5691#issuecomment-1149298252
@danny0405 @wangxianghu : Can either of you folks chime in here, please?
[GitHub] [hudi] nsivabalan commented on issue #5691: [SUPPORT]NoSuchObjectException: Failed to write data to Hive
Posted by GitBox <gi...@apache.org>.
nsivabalan commented on issue #5691:
URL: https://github.com/apache/hudi/issues/5691#issuecomment-1289936677
Closing due to no activity.