Posted to user@kylin.apache.org by liaifan <li...@aliyun.com> on 2020/04/24 07:28:36 UTC
kylin error
SLF4J: Found binding in [jar:file:/opt/hadoopclient/HDFS/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/hadoopclient/HBase/hbase/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/hadoopclient/Hive/HCatalog/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/hadoopclient/HBase/hbase/lib/jdbc/slf4j-log4j12-1.7.7.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
2020-04-24 10:37:52,111 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
2020-04-24 10:37:52,154 WARN sqoop.ConnFactory: Parameter --driver is set to an explicit driver however appropriate connection manager is not being set (via --connection-manager). Sqoop is going to fall back to org.apache.sqoop.manager.GenericJdbcManager. Please specify explicitly which connection manager should be used next time.
Loading class `com.mysql.jdbc.Driver'. This is deprecated. The new driver class is `com.mysql.cj.jdbc.Driver'. The driver is automatically registered via the SPI and manual loading of the driver class is generally unnecessary.
#
# java.lang.OutOfMemoryError: GC overhead limit exceeded
# -XX:OnOutOfMemoryError="kill -9 %p"
# Executing /bin/sh -c "kill -9 12032"...
sh: kill -9 12032: command not found (likely "未找到命令", mis-encoded GBK locale output)
[two further lines of mis-encoded GBK locale output, unrecoverable in this archive]
java.lang.OutOfMemoryError: GC overhead limit exceeded
at java.lang.StringCoding$StringDecoder.decode(StringCoding.java:149)
at java.lang.StringCoding.decode(StringCoding.java:193)
at java.lang.String.<init>(String.java:426)
at com.sun.tools.javac.file.ZipFileIndex$ZipDirectory.readEntry(ZipFileIndex.java:665)
at com.sun.tools.javac.file.ZipFileIndex$ZipDirectory.buildIndex(ZipFileIndex.java:576)
at com.sun.tools.javac.file.ZipFileIndex$ZipDirectory.access$000(ZipFileIndex.java:483)
at com.sun.tools.javac.file.ZipFileIndex.checkIndex(ZipFileIndex.java:191)
at com.sun.tools.javac.file.ZipFileIndex.<init>(ZipFileIndex.java:136)
at com.sun.tools.javac.file.ZipFileIndexCache.getZipFileIndex(ZipFileIndexCache.java:100)
at com.sun.tools.javac.file.JavacFileManager.openArchive(JavacFileManager.java:529)
at com.sun.tools.javac.file.JavacFileManager.openArchive(JavacFileManager.java:462)
at com.sun.tools.javac.file.JavacFileManager.listContainer(JavacFileManager.java:348)
at com.sun.tools.javac.file.JavacFileManager.list(JavacFileManager.java:624)
at com.sun.tools.javac.jvm.ClassReader.fillIn(ClassReader.java:2803)
at com.sun.tools.javac.jvm.ClassReader.complete(ClassReader.java:2446)
at com.sun.tools.javac.jvm.ClassReader.access$000(ClassReader.java:76)
at com.sun.tools.javac.jvm.ClassReader$1.complete(ClassReader.java:240)
at com.sun.tools.javac.code.Symbol.complete(Symbol.java:574)
at com.sun.tools.javac.comp.Enter.visitTopLevel(Enter.java:300)
at com.sun.tools.javac.tree.JCTree$JCCompilationUnit.accept(JCTree.java:518)
at com.sun.tools.javac.comp.Enter.classEnter(Enter.java:258)
at com.sun.tools.javac.comp.Enter.classEnter(Enter.java:272)
at com.sun.tools.javac.comp.Enter.complete(Enter.java:486)
at com.sun.tools.javac.comp.Enter.main(Enter.java:471)
at com.sun.tools.javac.main.JavaCompiler.enterTrees(JavaCompiler.java:982)
at com.sun.tools.javac.main.JavaCompiler.compile(JavaCompiler.java:857)
at com.sun.tools.javac.main.Main.compile(Main.java:523)
at com.sun.tools.javac.api.JavacTaskImpl.doCall(JavacTaskImpl.java:129)
at com.sun.tools.javac.api.JavacTaskImpl.call(JavacTaskImpl.java:138)
at org.apache.sqoop.orm.CompilationManager.compile(CompilationManager.java:224)
at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:107)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:501)
2020-04-24 10:37:57,393 ERROR tool.ImportTool: Import failed: java.io.IOException: Error returned by javac
at org.apache.sqoop.orm.CompilationManager.compile(CompilationManager.java:226)
at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:107)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:501)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:628)
at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
The command is:
/opt/sqoop-1.4.7/bin/sqoop import \
  -Dorg.apache.sqoop.splitter.allow_text_splitter=true \
  -Dmapreduce.job.queuename=default \
  --connect "jdbc:mysql://*.*.*.*:3306/eladmin" \
  --driver com.mysql.jdbc.Driver \
  --username root --password "123456" \
  --query "SELECT GAUSS_SYSTEM.SYS_CODE as GAUSS_SYSTEM_SYS_CODE ,GAUSS_SYSTEM.SYS_NAME as GAUSS_SYSTEM_SYS_NAME ,GAUSS_SYSTEM.SORT as GAUSS_SYSTEM_SORT ,GAUSS_SYSTEM.ADD_TIME as GAUSS_SYSTEM_ADD_TIME ,GAUSS_SYSTEM.ENABLED as GAUSS_SYSTEM_ENABLED FROM ELADMIN.GAUSS_SYSTEM as GAUSS_SYSTEM WHERE 1=1 AND \$CONDITIONS" \
  --target-dir hdfs://hacluster/kylin/kylin_metadata/kylin-23e5a7ff-87d9-6f81-097d-4e6f730930d2/kylin_intermediate_sdp_941046e2_b474_a478_e579_ffc8eb5784f3 \
  --split-by GAUSS_SYSTEM.SORT \
  --boundary-query "SELECT min(GAUSS_SYSTEM.SORT), max(GAUSS_SYSTEM.SORT) FROM ELADMIN.GAUSS_SYSTEM as GAUSS_SYSTEM" \
  --null-string '\\N' --null-non-string '\\N' \
  --fields-terminated-by '|' --num-mappers 4
at org.apache.kylin.common.util.CliCommandExecutor.execute(CliCommandExecutor.java:96)
at org.apache.kylin.source.jdbc.CmdStep.sqoopFlatHiveTable(CmdStep.java:50)
at org.apache.kylin.source.jdbc.CmdStep.doWork(CmdStep.java:61)
at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:167)
... 6 more
Kylin version is 2.6.5 and Sqoop version is 1.4.7. The error above occurs with both MySQL and PostgreSQL data sources.
Kylin JVM settings:
export KYLIN_JVM_SETTINGS="-Xms1024M -Xmx4096M -Xss1024K -XX:MaxPermSize=512M -verbose:gc -XX:+PrintGCDetails -XX:+PrintGCDateStamps -Xloggc:$KYLIN_HOME/logs/kylin.gc.$$ -XX:+UseGCLogFileRotation -XX:NumberOfGCLogFiles=10 -XX:GCLogFileSize=64M"
The sqoop import command. Run standalone, it executes successfully:
/opt/sqoop-1.4.7/bin/sqoop import \
  -Dorg.apache.sqoop.splitter.allow_text_splitter=true \
  -Dmapreduce.job.queuename=default \
  --connect "jdbc:mysql://10.102.0.4:3306/eladmin" \
  --driver com.mysql.jdbc.Driver \
  --username root --password "123456" \
  --query "SELECT GAUSS_SYSTEM.SYS_CODE as GAUSS_SYSTEM_SYS_CODE ,GAUSS_SYSTEM.SYS_NAME as GAUSS_SYSTEM_SYS_NAME ,GAUSS_SYSTEM.SORT as GAUSS_SYSTEM_SORT ,GAUSS_SYSTEM.ADD_TIME as GAUSS_SYSTEM_ADD_TIME ,GAUSS_SYSTEM.ENABLED as GAUSS_SYSTEM_ENABLED FROM ELADMIN.GAUSS_SYSTEM as GAUSS_SYSTEM WHERE 1=1 AND \$CONDITIONS" \
  --target-dir hdfs://hacluster/kylin/kylin_metadata/kylin-23e5a7ff-87d9-6f81-097d-4e6f730930d2/kylin_intermediate_sdp_941046e2_b474_a478_e579_ffc8eb5784f3 \
  --split-by GAUSS_SYSTEM.SORT \
  --boundary-query "SELECT min(GAUSS_SYSTEM.SORT), max(GAUSS_SYSTEM.SORT) FROM ELADMIN.GAUSS_SYSTEM as GAUSS_SYSTEM" \
  --null-string '\\N' --null-non-string '\\N' \
  --fields-terminated-by '|' --num-mappers 4
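Since the same command succeeds when run by hand but hits the OOM when launched from Kylin, the two environments may be passing different JVM options to Sqoop's launcher. A minimal check is to compare the client-side Hadoop variables in both places (the variable names are the standard Hadoop/Sqoop ones; whether your wrapper scripts set them is an assumption):

```shell
# Print the client-side JVM options that Sqoop's launcher JVM would inherit
# from this shell. Run this once in the interactive shell where the command
# works, and once in the environment of the Kylin process, then compare.
echo "HADOOP_CLIENT_OPTS=${HADOOP_CLIENT_OPTS:-<unset>}"
echo "HADOOP_OPTS=${HADOOP_OPTS:-<unset>}"
echo "HADOOP_HEAPSIZE=${HADOOP_HEAPSIZE:-<unset>}"
```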
Re: kylin error
Posted by ShaoFeng Shi <sh...@apache.org>.
This looks like a normal OOM issue; please try giving Sqoop more Java heap.
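One way to do that, sketched here under the assumption that Sqoop's bin scripts pick up the standard Hadoop client variables in your setup, is to export a larger client heap in the environment of the user that launches the build (the 2 GB value is illustrative, not a recommendation):

```shell
# Prepend a larger max heap for the local Sqoop/Hadoop client JVM.
# Sqoop delegates to the hadoop launcher, which honors HADOOP_CLIENT_OPTS,
# so the codegen/javac step shown in the stack trace gets more heap.
export HADOOP_CLIENT_OPTS="-Xmx2048m ${HADOOP_CLIENT_OPTS:-}"
```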
Best regards,
Shaofeng Shi 史少锋
Apache Kylin PMC
Email: shaofengshi@apache.org
Apache Kylin FAQ: https://kylin.apache.org/docs/gettingstarted/faq.html
Join Kylin user mail group: user-subscribe@kylin.apache.org
Join Kylin dev mail group: dev-subscribe@kylin.apache.org
liaifan <li...@aliyun.com> wrote on Fri, Apr 24, 2020 at 15:28:
> [original message quoted in full; trimmed]