Posted to issues@kylin.apache.org by "Lim Sing Yik (Jira)" <ji...@apache.org> on 2021/04/23 02:39:00 UTC

[jira] [Created] (KYLIN-4987) Unable to build cube for JDBC MS SQL Connection

Lim Sing Yik created KYLIN-4987:
-----------------------------------

             Summary: Unable to build cube for JDBC MS SQL Connection
                 Key: KYLIN-4987
                 URL: https://issues.apache.org/jira/browse/KYLIN-4987
             Project: Kylin
          Issue Type: Bug
          Components: Driver - JDBC
    Affects Versions: v3.1.1
         Environment: CentOS 7 with Hortonworks HDP 3.1
            Reporter: Lim Sing Yik
             Fix For: v3.1.1
         Attachments: image-2021-04-23-10-35-13-151.png, image-2021-04-23-10-35-29-989.png, image-2021-04-23-10-37-04-836.png

Currently, I am using the JDBC driver to connect Kylin to an external MS SQL Server database.

The configuration for kylin.properties as below:

  !image-2021-04-23-10-35-13-151.png!

I managed to load the external MS SQL tables, as shown below:

  !image-2021-04-23-10-35-29-989.png!

However, when I try to build a cube from this external MS SQL source, I encounter a credential error at step 1, "Sqoop To Flat Hive Table", as below:

21/04/23 09:26:02 ERROR manager.SqlManager: Error executing statement: com.microsoft.sqlserver.jdbc.SQLServerException: Login failed for user 'jpsagent'. ClientConnectionId:dc4f6df7-b44a-4983-ae23-ca61516d34c4
com.microsoft.sqlserver.jdbc.SQLServerException: Login failed for user 'jpsagent'. ClientConnectionId:dc4f6df7-b44a-4983-ae23-ca61516d34c4
    at com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDatabaseError(SQLServerException.java:258)
    at com.microsoft.sqlserver.jdbc.TDSTokenHandler.onEOF(tdsparser.java:256)
    at com.microsoft.sqlserver.jdbc.TDSParser.parse(tdsparser.java:108)
    at com.microsoft.sqlserver.jdbc.SQLServerConnection.sendLogon(SQLServerConnection.java:4290)
    at com.microsoft.sqlserver.jdbc.SQLServerConnection.logon(SQLServerConnection.java:3157)
    at com.microsoft.sqlserver.jdbc.SQLServerConnection.access$100(SQLServerConnection.java:82)
    at com.microsoft.sqlserver.jdbc.SQLServerConnection$LogonCommand.doExecute(SQLServerConnection.java:3121)
    at com.microsoft.sqlserver.jdbc.TDSCommand.execute(IOBuffer.java:7151)
    at com.microsoft.sqlserver.jdbc.SQLServerConnection.executeCommand(SQLServerConnection.java:2478)
    at com.microsoft.sqlserver.jdbc.SQLServerConnection.connectHelper(SQLServerConnection.java:2026)
    at com.microsoft.sqlserver.jdbc.SQLServerConnection.login(SQLServerConnection.java:1687)
    at com.microsoft.sqlserver.jdbc.SQLServerConnection.connectInternal(SQLServerConnection.java:1528)
    at com.microsoft.sqlserver.jdbc.SQLServerConnection.connect(SQLServerConnection.java:866)
    at com.microsoft.sqlserver.jdbc.SQLServerDriver.connect(SQLServerDriver.java:569)
    at java.sql.DriverManager.getConnection(DriverManager.java:664)
    at java.sql.DriverManager.getConnection(DriverManager.java:247)
    at org.apache.sqoop.manager.SqlManager.makeConnection(SqlManager.java:903)
    at org.apache.sqoop.manager.GenericJdbcManager.getConnection(GenericJdbcManager.java:59)
    at org.apache.sqoop.manager.SqlManager.execute(SqlManager.java:762)
    at org.apache.sqoop.manager.SqlManager.execute(SqlManager.java:785)
    at org.apache.sqoop.manager.SqlManager.getColumnInfoForRawQuery(SqlManager.java:288)
    at org.apache.sqoop.manager.SqlManager.getColumnTypesForRawQuery(SqlManager.java:259)
    at org.apache.sqoop.manager.SqlManager.getColumnTypesForQuery(SqlManager.java:252)
    at org.apache.sqoop.manager.ConnManager.getColumnTypes(ConnManager.java:343)
    at org.apache.sqoop.orm.ClassWriter.getColumnTypes(ClassWriter.java:1879)
    at org.apache.sqoop.orm.ClassWriter.generate(ClassWriter.java:1672)
    at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:106)
    at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:524)
    at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:655)
    at org.apache.sqoop.Sqoop.run(Sqoop.java:151)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
    at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:187)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:241)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:250)
    at org.apache.sqoop.Sqoop.main(Sqoop.java:259)
21/04/23 09:26:02 ERROR tool.ImportTool: Import failed: java.io.IOException: No columns to generate for ClassWriter
    at org.apache.sqoop.orm.ClassWriter.generate(ClassWriter.java:1678)
    at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:106)
    at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:524)
    at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:655)
    at org.apache.sqoop.Sqoop.run(Sqoop.java:151)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
    at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:187)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:241)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:250)
    at org.apache.sqoop.Sqoop.main(Sqoop.java:259)

So, I copied out the Sqoop command and ran it individually in the Kylin environment:

sqoop import -Dorg.apache.sqoop.splitter.allow_text_splitter=true  -Dmapreduce.job.queuename=default --connect "jdbc:sqlserver://xxx.xxx.xxx.xx7:1433;database=xxS" --driver com.microsoft.sqlserver.jdbc.SQLServerDriver --username xxxxxxxt --password "xxxxxxxd" --query "SELECT [BKOR_BIW_DETAILS_SUMMARY].[VOT] as [BKOR_BIW_DETAILS_SUMMARY_VOT] ,[BKOR_BIW_DETAILS_SUMMARY].[PROJECT_NAME] as [BKOR_BIW_DETAILS_SUMMARY_PROJECT_NAME] ,[BKOR_BIW_DETAILS_SUMMARY].[CONTINGENT] as [BKOR_BIW_DETAILS_SUMMARY_CONTINGENT] ,[BKOR_BIW_DETAILS_SUMMARY].[DEPARTMENT_LONG] as [BKOR_BIW_DETAILS_SUMMARY_DEPARTMENT_LONG] ,[BKOR_BIW_DETAILS_SUMMARY].[CASH_FLOW] as [BKOR_BIW_DETAILS_SUMMARY_CASH_FLOW] ,[BKOR_BIW_DETAILS_SUMMARY].[MONTH] as [BKOR_BIW_DETAILS_SUMMARY_MONTH] ,[BKOR_BIW_DETAILS_SUMMARY].[ESTIMATION] as [BKOR_BIW_DETAILS_SUMMARY_ESTIMATION] ,[BKOR_BIW_DETAILS_SUMMARY].[EXPENSES] as [BKOR_BIW_DETAILS_SUMMARY_EXPENSES] ,[BKOR_BIW_DETAILS_SUMMARY].[DEPENDENT] as [BKOR_BIW_DETAILS_SUMMARY_DEPENDENT] ,[BKOR_BIW_DETAILS_SUMMARY].[COMMITMENT] as [BKOR_BIW_DETAILS_SUMMARY_COMMITMENT] ,[BKOR_BIW_DETAILS_SUMMARY].[BALANCE] as [BKOR_BIW_DETAILS_SUMMARY_BALANCE]  FROM [dbo].[BKOR_BIW_DETAILS_SUMMARY] [BKOR_BIW_DETAILS_SUMMARY] WHERE 1=1 AND (IS_DELETED = 0)  AND \$CONDITIONS" --target-dir hdfs://ambari.local:8020/kylin/kylin_metadata/kylin-3dde066e-bbc0-96f9-4fc8-1aa4761f8d19/kylin_intermediate_jps_biw_b93a0f75_2ec9_bfa7_2d57_836bb3c03ad3 --split-by [VOT] --boundary-query "SELECT min([VOT]), max([VOT]) FROM [dbo].[BKOR_BIW_DETAILS_SUMMARY] " --null-string '\\N' --null-non-string '\\N' --fields-terminated-by '|' --num-mappers 4


However, I faced the same credential error as in Kylin.

Therefore, I changed the command, replacing --password "xxxxxxxd" with -P (prompt for the password), and the command ran successfully as below:

  !image-2021-04-23-10-37-04-836.png!
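Since the real password is masked in this report, the following is only a guess at the cause: if the password contains shell-special characters such as $, the double quotes around --password in the generated command let the shell expand them before Sqoop ever sees the value, whereas -P reads the password directly from the terminal with no shell involvement. A minimal sketch of that difference, using a hypothetical password:

```python
import subprocess

# Hypothetical password (the real one is masked in this report).
# What Sqoop would receive when the password is wrapped in DOUBLE quotes:
# the shell expands $ssw0rd as an undefined (hence empty) variable.
in_double_quotes = subprocess.run(
    ['sh', '-c', 'printf %s "p$ssw0rd!"'],
    capture_output=True, text=True,
).stdout

# Inside SINGLE quotes (or when typed at the -P prompt) the value is
# taken literally, so the password survives intact.
in_single_quotes = subprocess.run(
    ['sh', '-c', "printf %s 'p$ssw0rd!'"],
    capture_output=True, text=True,
).stdout

print(in_double_quotes)  # the middle of the password is silently lost
print(in_single_quotes)  # the full password comes through
```

If this is what happens, SQL Server receives a truncated password, which would explain the "Login failed for user 'jpsagent'" error even though the credentials look correct in kylin.properties.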

 

 

Is there any configuration I have missed?
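As a possible workaround I am also considering Sqoop's --password-file option, which keeps the password off the command line entirely. This is only a sketch; all paths and the password below are placeholders, not values from my environment:

```shell
# Sketch only: store the password in a restricted HDFS file and point
# Sqoop at it instead of passing --password on the command line.
# printf avoids a trailing newline, which Sqoop would otherwise
# treat as part of the password.
printf '%s' 'yourpassword' > sqoop.pwd
hdfs dfs -put sqoop.pwd /user/kylin/.sqoop.pwd
hdfs dfs -chmod 400 /user/kylin/.sqoop.pwd

sqoop import \
  --connect "jdbc:sqlserver://xxx.xxx.xxx.xx7:1433;database=xxS" \
  --driver com.microsoft.sqlserver.jdbc.SQLServerDriver \
  --username xxxxxxxt \
  --password-file /user/kylin/.sqoop.pwd \
  ...
```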

 



--
This message was sent by Atlassian Jira
(v8.3.4#803005)