Posted to commits@hudi.apache.org by GitBox <gi...@apache.org> on 2019/11/01 19:51:00 UTC

[GitHub] [incubator-hudi] umehrot2 commented on issue #989: [HUDI-312] Make docker hdfs cluster ephemeral. This is needed to fix flakiness in integration tests. Also, Fix DeltaStreamer hanging issue due to uncaught exception

URL: https://github.com/apache/incubator-hudi/pull/989#issuecomment-548927738
 
 
   > @umehrot2 you could give master a shot after this merges?
   
   Sure, I will give it a shot now. What I observed yesterday is that my first run succeeded. Then I ran again, and my integration test failed with this error:
   
   ```
    ###### Stderr #######
   ERROR StatusLogger No log4j2 configuration file found. Using default configuration: logging only errors to the console.
   SLF4J: Class path contains multiple SLF4J bindings.
   SLF4J: Found binding in [jar:file:/var/hoodie/ws/hudi-spark/target/lib/log4j-slf4j-impl-2.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
   SLF4J: Found binding in [jar:file:/var/hoodie/ws/hudi-spark/target/lib/slf4j-log4j12-1.7.16.jar!/org/slf4j/impl/StaticLoggerBinder.class]
   SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
   SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
   SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".                
   SLF4J: Defaulting to no-operation (NOP) logger implementation
   SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
   Exception in thread "main" org.apache.hudi.hive.HoodieHiveSyncException: Cannot create hive connection jdbc:hive2://hiveserver:10000/
   	at org.apache.hudi.hive.HoodieHiveClient.createHiveConnection(HoodieHiveClient.java:547)
   	at org.apache.hudi.hive.HoodieHiveClient.<init>(HoodieHiveClient.java:106)
   	at org.apache.hudi.hive.HiveSyncTool.<init>(HiveSyncTool.java:60)
   	at org.apache.hudi.HoodieSparkSqlWriter$.syncHive(HoodieSparkSqlWriter.scala:235)
   	at org.apache.hudi.HoodieSparkSqlWriter$.write(HoodieSparkSqlWriter.scala:169)
   	at org.apache.hudi.DefaultSource.createRelation(DefaultSource.scala:91)
   	at org.apache.spark.sql.execution.datasources.DataSource.write(DataSource.scala:426)
   	at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:215)
   	at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:198)
   	at HoodieJavaApp.run(HoodieJavaApp.java:148)
   	at HoodieJavaApp.main(HoodieJavaApp.java:93)
   Caused by: java.sql.SQLException: Could not open client transport with JDBC Uri: jdbc:hive2://hiveserver:10000: java.net.ConnectException: Connection refused (Connection refused)
   	at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:224)
   	at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:107)
   	at java.sql.DriverManager.getConnection(DriverManager.java:664)
   	at java.sql.DriverManager.getConnection(DriverManager.java:247)
   	at org.apache.hudi.hive.HoodieHiveClient.createHiveConnection(HoodieHiveClient.java:544)
   	... 10 more
   Caused by: org.apache.thrift.transport.TTransportException: java.net.ConnectException: Connection refused (Connection refused)
   	at org.apache.thrift.transport.TSocket.open(TSocket.java:226)
   	at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:266)
   	at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
   	at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:311)
   	at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:196)
   	... 14 more
   Caused by: java.net.ConnectException: Connection refused (Connection refused)
   	at java.net.PlainSocketImpl.socketConnect(Native Method)
   	at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
   	at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
   	at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
   	at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
   	at java.net.Socket.connect(Socket.java:589)
   	at org.apache.thrift.transport.TSocket.open(TSocket.java:221)
   	... 18 more
   
   [ERROR] Tests run: 3, Failures: 3, Errors: 0, Skipped: 0, Time elapsed: 44.126 s <<< FAILURE! - in org.apache.hudi.integ.ITTestHoodieSanity
   [ERROR] testRunHoodieJavaAppOnSinglePartitionKeyCOWTable(org.apache.hudi.integ.ITTestHoodieSanity)  Time elapsed: 12.866 s  <<< FAILURE!
   java.lang.AssertionError: Command ([hive, --hiveconf, hive.input.format=org.apache.hadoop.hive.ql.io.HiveInputFormat, --hiveconf, hive.stats.autogather=false, -e, "add jar /var/hoodie/ws/docker/hoodie/hadoop/hive_base/target/hoodie-hadoop-mr-bundle.jar;drop table if exists docker_hoodie_single_partition_key_cow_test"]) expected to succeed. Exit (1)
   	at org.apache.hudi.integ.ITTestHoodieSanity.testRunHoodieJavaAppOnCOWTable(ITTestHoodieSanity.java:88)
   	at org.apache.hudi.integ.ITTestHoodieSanity.testRunHoodieJavaAppOnSinglePartitionKeyCOWTable(ITTestHoodieSanity.java:42)
   
   [ERROR] testRunHoodieJavaAppOnMultiPartitionKeysCOWTable(org.apache.hudi.integ.ITTestHoodieSanity)  Time elapsed: 11.897 s  <<< FAILURE!
   java.lang.AssertionError: Command ([hive, --hiveconf, hive.input.format=org.apache.hadoop.hive.ql.io.HiveInputFormat, --hiveconf, hive.stats.autogather=false, -e, "add jar /var/hoodie/ws/docker/hoodie/hadoop/hive_base/target/hoodie-hadoop-mr-bundle.jar;drop table if exists docker_hoodie_multi_partition_key_cow_test"]) expected to succeed. Exit (1)
   	at org.apache.hudi.integ.ITTestHoodieSanity.testRunHoodieJavaAppOnCOWTable(ITTestHoodieSanity.java:88)
   	at org.apache.hudi.integ.ITTestHoodieSanity.testRunHoodieJavaAppOnMultiPartitionKeysCOWTable(ITTestHoodieSanity.java:54)
   
   [ERROR] testRunHoodieJavaAppOnNonPartitionedCOWTable(org.apache.hudi.integ.ITTestHoodieSanity)  Time elapsed: 19.306 s  <<< FAILURE!
   java.lang.AssertionError: Command ([/var/hoodie/ws/hudi-spark/run_hoodie_app.sh, --hive-sync, --table-path, hdfs://namenode/docker_hoodie_non_partition_key_cow_test, --hive-url, jdbc:hive2://hiveserver:10000, --hive-table, docker_hoodie_non_partition_key_cow_test, --non-partitioned]) expected to succeed. Exit (1)
   	at org.apache.hudi.integ.ITTestHoodieSanity.testRunHoodieJavaAppOnCOWTable(ITTestHoodieSanity.java:107)
   	at org.apache.hudi.integ.ITTestHoodieSanity.testRunHoodieJavaAppOnNonPartitionedCOWTable(ITTestHoodieSanity.java:66)
   ```
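   The root cause in the trace above is a plain `java.net.ConnectException: Connection refused` when opening `jdbc:hive2://hiveserver:10000`, i.e. the hiveserver container was not yet (or no longer) accepting connections when the test ran. Not the fix from this PR, just a minimal sketch of the kind of readiness probe a test harness could use before attempting the JDBC connection; the class name `WaitForPort` and the host/port/timeout values are hypothetical:
   
   ```java
   import java.io.IOException;
   import java.net.InetSocketAddress;
   import java.net.Socket;
   
   public class WaitForPort {
       /**
        * Polls host:port until a TCP connection succeeds or the timeout expires.
        * Returns true once the port accepts a connection, false on timeout.
        */
       static boolean waitForPort(String host, int port, long timeoutMillis) {
           long deadline = System.currentTimeMillis() + timeoutMillis;
           while (System.currentTimeMillis() < deadline) {
               try (Socket s = new Socket()) {
                   // Short per-attempt connect timeout so we retry quickly.
                   s.connect(new InetSocketAddress(host, port), 1000);
                   return true;
               } catch (IOException e) {
                   // Port not open yet (e.g. hiveserver still starting); back off and retry.
                   try {
                       Thread.sleep(250);
                   } catch (InterruptedException ie) {
                       Thread.currentThread().interrupt();
                       return false;
                   }
               }
           }
           return false;
       }
   
       public static void main(String[] args) {
           // In the integration test this would be ("hiveserver", 10000).
           // Here we probe a port that is almost certainly closed, so the
           // call times out and returns false.
           System.out.println(waitForPort("localhost", 1, 1500));
       }
   }
   ```
   
   A probe like this only papers over ordering; if the cluster state carried over between runs is what kills hiveserver, making the HDFS cluster ephemeral (as this PR does) is the more direct fix.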
   
   And every run after these errors first occur hangs.
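   The hang is consistent with the second part of this PR's title: an exception thrown on a non-main thread is never caught, the thread dies, and the JVM sits there with nothing left to do instead of exiting with an error. A minimal sketch of the failure mode and a fail-fast mitigation via a default uncaught-exception handler; the class name `FailFast` and the `"boom"` message are illustrative only, not code from the PR:
   
   ```java
   public class FailFast {
       public static void main(String[] args) throws InterruptedException {
           // Without a handler, an exception on a worker thread is printed and
           // the thread dies silently; a driver waiting on its results can hang.
           // A default handler at least surfaces the failure; a real fix would
           // also terminate or signal the driver (e.g. System.exit) here.
           Thread.setDefaultUncaughtExceptionHandler((t, e) ->
               System.out.println("uncaught: " + e.getMessage()));
   
           Thread worker = new Thread(() -> {
               throw new RuntimeException("boom"); // simulated worker failure
           });
           worker.start();
           worker.join(); // handler runs before the worker terminates
           System.out.println("main continues");
       }
   }
   ```
   
   In a long-running job like DeltaStreamer, the handler would typically exit the process rather than just log, so a dead ingestion thread cannot leave the test run hanging indefinitely.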

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services