Posted to dev@kylin.apache.org by "曹勇 (Jira)" <ji...@apache.org> on 2021/09/09 12:20:00 UTC

[jira] [Created] (KYLIN-5087) java.lang.UnsupportedOperationException: Spark version 2.0.0 not supported

曹勇 created KYLIN-5087:
-------------------------

             Summary: java.lang.UnsupportedOperationException: Spark version 2.0.0 not supported
                 Key: KYLIN-5087
                 URL: https://issues.apache.org/jira/browse/KYLIN-5087
             Project: Kylin
          Issue Type: Bug
          Components: Environment, Integration, Query Engine, Spark Engine, Web
    Affects Versions: v4.0.0
         Environment: hadoop-3.2.2
hive-2.3.9
kylin-4.0.0
scala-2.12.14
spark-3.1.2-bin-hadoop3.2
zookeeper-3.7.0
            Reporter: 曹勇
         Attachments: image-2021-09-09-20-11-08-870.png

When I ran the example SQL from the documentation, `select part_dt, sum(price) as total_selled, count(distinct seller_id) as sellers from kylin_sales group by part_dt order by part_dt`, against kylin_sales_cube, the following issue occurred:
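For context, Kylin 4.0.0's query engine rejects any Spark whose reported version falls outside its supported range, which is what produces the "Spark version ... not supported" message below. The sketch that follows is a hypothetical illustration of that kind of prefix gate, not Kylin's actual code, and the supported prefixes (2.4.x / 3.1.x) are an assumption based on the versions Kylin 4.0.0 ships against:

```shell
#!/bin/sh
# Hypothetical sketch of a supported-version prefix check, mirroring the
# kind of gate that raises "Spark version ... not supported".
# The accepted prefixes (2.4.*, 3.1.*) are assumptions, not Kylin internals.
is_supported_spark() {
  case "$1" in
    2.4.*|3.1.*) echo "supported" ;;
    *)           echo "Spark version $1 not supported" ;;
  esac
}

is_supported_spark 3.1.2   # prints "supported"
is_supported_spark 2.0.0   # prints "Spark version 2.0.0 not supported"
```

Note that the gate keys off the version string Spark reports at runtime, so a mismatched or mixed classpath can make a correctly installed Spark appear unsupported.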
{code:java}
2021-09-09 20:06:10,063 WARN  [Thread-7] config.package:422 : Can not load the default value of `spark.yarn.isHadoopProvided` from `org/apache/spark/deploy/yarn/config.properties` with error, java.lang.NullPointerException. Using `false` as a default value.
2021-09-09 20:06:10,482 INFO  [Thread-7] client.AHSProxy:42 : Connecting to Application History server at node1/192.168.111.49:10200
2021-09-09 20:06:10,601 INFO  [Thread-7] yarn.Client:57 : Requesting a new application from cluster with 3 NodeManagers
2021-09-09 20:06:11,337 INFO  [Thread-7] conf.Configuration:2795 : resource-types.xml not found
2021-09-09 20:06:11,338 INFO  [Thread-7] resource.ResourceUtils:442 : Unable to find 'resource-types.xml'.
2021-09-09 20:06:11,350 INFO  [Thread-7] yarn.Client:57 : Verifying our application has not requested more than the maximum memory capability of the cluster (8192 MB per container)
2021-09-09 20:06:11,350 INFO  [Thread-7] yarn.Client:57 : Will allocate AM container, with 896 MB memory including 384 MB overhead
2021-09-09 20:06:11,351 INFO  [Thread-7] yarn.Client:57 : Setting up container launch context for our AM
2021-09-09 20:06:11,351 INFO  [Thread-7] yarn.Client:57 : Setting up the launch environment for our AM container
2021-09-09 20:06:11,356 INFO  [Thread-7] yarn.Client:57 : Preparing resources for our AM container
2021-09-09 20:06:11,434 WARN  [Thread-7] yarn.Client:69 : Neither spark.yarn.jars nor spark.yarn.archive is set, falling back to uploading libraries under SPARK_HOME.
2021-09-09 20:06:13,436 INFO  [Thread-7] yarn.Client:57 : Uploading resource file:/home/hadoop/kylin-4.0.0/tomcat/temp/spark-d250f1d5-51aa-47d1-9aa5-b36d3b60d336/__spark_libs__3169881654091659892.zip -> hdfs://ns1/user/hadoop/.sparkStaging/application_1631008247268_0018/__spark_libs__3169881654091659892.zip
2021-09-09 20:06:14,657 INFO  [Thread-7] yarn.Client:57 : Uploading resource file:/home/hadoop/kylin-4.0.0/lib/kylin-parquet-job-4.0.0.jar -> hdfs://ns1/user/hadoop/.sparkStaging/application_1631008247268_0018/kylin-parquet-job-4.0.0.jar
2021-09-09 20:06:15,058 INFO  [Thread-7] yarn.Client:57 : Uploading resource file:/home/hadoop/kylin-4.0.0/conf/spark-executor-log4j.properties -> hdfs://ns1/user/hadoop/.sparkStaging/application_1631008247268_0018/spark-executor-log4j.properties
2021-09-09 20:06:15,238 INFO  [Thread-7] yarn.Client:57 : Uploading resource file:/home/hadoop/kylin-4.0.0/tomcat/temp/spark-d250f1d5-51aa-47d1-9aa5-b36d3b60d336/__spark_conf__4477603375905676389.zip -> hdfs://ns1/user/hadoop/.sparkStaging/application_1631008247268_0018/__spark_conf__.zip
2021-09-09 20:06:15,354 INFO  [Thread-7] spark.SecurityManager:57 : Changing view acls to: hadoop
2021-09-09 20:06:15,354 INFO  [Thread-7] spark.SecurityManager:57 : Changing modify acls to: hadoop
2021-09-09 20:06:15,354 INFO  [Thread-7] spark.SecurityManager:57 : Changing view acls groups to:
2021-09-09 20:06:15,354 INFO  [Thread-7] spark.SecurityManager:57 : Changing modify acls groups to:
2021-09-09 20:06:15,354 INFO  [Thread-7] spark.SecurityManager:57 : SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(hadoop); groups with view permissions: Set(); users  with modify permissions: Set(hadoop); groups with modify permissions: Set()
2021-09-09 20:06:15,378 INFO  [Thread-7] yarn.Client:57 : Submitting application application_1631008247268_0018 to ResourceManager
2021-09-09 20:06:15,637 INFO  [Thread-7] impl.YarnClientImpl:329 : Submitted application application_1631008247268_0018
2021-09-09 20:06:16,646 INFO  [Thread-7] yarn.Client:57 : Application report for application_1631008247268_0018 (state: ACCEPTED)
2021-09-09 20:06:16,656 INFO  [Thread-7] yarn.Client:57 :
         client token: N/A
         diagnostics: AM container is launched, waiting for AM container to Register with RM
         ApplicationMaster host: N/A
         ApplicationMaster RPC port: -1
         queue: default
         start time: 1631189175391
         final status: UNDEFINED
         tracking URL: http://node1:8088/proxy/application_1631008247268_0018/
         user: hadoop
2021-09-09 20:06:17,659 INFO  [Thread-7] yarn.Client:57 : Application report for application_1631008247268_0018 (state: ACCEPTED)
2021-09-09 20:06:18,662 INFO  [Thread-7] yarn.Client:57 : Application report for application_1631008247268_0018 (state: ACCEPTED)
2021-09-09 20:06:19,665 INFO  [Thread-7] yarn.Client:57 : Application report for application_1631008247268_0018 (state: ACCEPTED)
2021-09-09 20:06:19,947 INFO  [dispatcher-event-loop-7] cluster.YarnClientSchedulerBackend:57 : Add WebUI Filter. org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter, Map(PROXY_HOSTS -> node1,node2, PROXY_URI_BASES -> http://node1:8088/proxy/application_1631008247268_0018,http://node2:8088/proxy/application_1631008247268_0018, RM_HA_URLS -> node1:8088,node2:8088), /proxy/application_1631008247268_0018
2021-09-09 20:06:20,668 INFO  [Thread-7] yarn.Client:57 : Application report for application_1631008247268_0018 (state: RUNNING)
2021-09-09 20:06:20,669 INFO  [Thread-7] yarn.Client:57 :
         client token: N/A
         diagnostics: N/A
         ApplicationMaster host: 192.168.111.25
         ApplicationMaster RPC port: -1
         queue: default
         start time: 1631189175391
         final status: UNDEFINED
         tracking URL: http://node1:8088/proxy/application_1631008247268_0018/
         user: hadoop
2021-09-09 20:06:20,672 INFO  [Thread-7] cluster.YarnClientSchedulerBackend:57 : Application application_1631008247268_0018 has started running.
2021-09-09 20:06:20,693 INFO  [Thread-7] util.Utils:57 : Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 37707.
2021-09-09 20:06:20,693 INFO  [Thread-7] netty.NettyBlockTransferService:81 : Server created on node3:37707
2021-09-09 20:06:20,696 INFO  [Thread-7] storage.BlockManager:57 : Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
2021-09-09 20:06:20,706 INFO  [Thread-7] storage.BlockManagerMaster:57 : Registering BlockManager BlockManagerId(driver, node3, 37707, None)
2021-09-09 20:06:20,713 INFO  [dispatcher-BlockManagerMaster] storage.BlockManagerMasterEndpoint:57 : Registering block manager node3:37707 with 2004.6 MiB RAM, BlockManagerId(driver, node3, 37707, None)
2021-09-09 20:06:20,719 INFO  [Thread-7] storage.BlockManagerMaster:57 : Registered BlockManager BlockManagerId(driver, node3, 37707, None)
2021-09-09 20:06:20,721 INFO  [Thread-7] storage.BlockManager:57 : Initialized BlockManager: BlockManagerId(driver, node3, 37707, None)
2021-09-09 20:06:20,832 INFO  [dispatcher-event-loop-2] cluster.YarnSchedulerBackend$YarnSchedulerEndpoint:57 : ApplicationMaster registered as NettyRpcEndpointRef(spark-client://YarnAM)
2021-09-09 20:06:20,922 INFO  [Thread-7] ui.ServerInfo:57 : Adding filter to /metrics/json: org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter
2021-09-09 20:06:20,924 INFO  [Thread-7] handler.ContextHandler:916 : Started o.s.j.s.ServletContextHandler@71ddd2af{/metrics/json,null,AVAILABLE,@Spark}
2021-09-09 20:06:25,825 INFO  [dispatcher-CoarseGrainedScheduler] cluster.YarnSchedulerBackend$YarnDriverEndpoint:57 : Registered executor NettyRpcEndpointRef(spark-client://Executor) (192.168.111.50:36340) with ID 1,  ResourceProfileId 0
2021-09-09 20:06:25,872 INFO  [Thread-7] cluster.YarnClientSchedulerBackend:57 : SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.8
2021-09-09 20:06:25,949 ERROR [Thread-7] sql.SparderContext:94 : Error for initializing spark
java.lang.UnsupportedOperationException: Spark version 2.0.0 not supported
        at org.apache.spark.utils.KylinReflectUtils$.getSessionState(KylinReflectUtils.scala:42)
        at org.apache.spark.sql.KylinSession.sessionState$lzycompute(KylinSession.scala:47)
        at org.apache.spark.sql.KylinSession.sessionState(KylinSession.scala:46)
        at org.apache.kylin.query.UdfManager.$anonfun$registerBuiltInFunc$1(UdfManager.scala:35)
        at org.apache.kylin.query.UdfManager.$anonfun$registerBuiltInFunc$1$adapted(UdfManager.scala:34)
        at scala.collection.immutable.List.foreach(List.scala:392)
        at org.apache.kylin.query.UdfManager.org$apache$kylin$query$UdfManager$$registerBuiltInFunc(UdfManager.scala:34)
        at org.apache.kylin.query.UdfManager$.create(UdfManager.scala:85)
        at org.apache.spark.sql.KylinSession$KylinBuilder.getOrCreateKylinSession(KylinSession.scala:116)
        at org.apache.spark.sql.SparderContext$$anon$1.run(SparderContext.scala:146)
        at java.lang.Thread.run(Thread.java:748)
2021-09-09 20:06:25,951 INFO  [Thread-7] sql.SparderContext:57 : Setting initializing Spark thread to null.
2021-09-09 20:06:25,952 INFO  [Query 71e4c4e7-df0a-2b6b-ee48-1b0a8c1d6384-35] monitor.SparderContextCanary:68 : Start monitoring Sparder
2021-09-09 20:06:25,953 INFO  [Query 71e4c4e7-df0a-2b6b-ee48-1b0a8c1d6384-35] sql.SparderContext:57 : Init spark.
2021-09-09 20:06:25,954 INFO  [Query 71e4c4e7-df0a-2b6b-ee48-1b0a8c1d6384-35] sql.SparderContext:57 : Initializing Spark thread starting.
2021-09-09 20:06:25,954 INFO  [Query 71e4c4e7-df0a-2b6b-ee48-1b0a8c1d6384-35] sql.SparderContext:57 : Initializing Spark, waiting for done.
2021-09-09 20:06:25,955 INFO  [Thread-37] sql.SparderContext:57 : SparderContext deploy with spark master: yarn
2021-09-09 20:06:25,964 INFO  [Thread-37] sql.SparderContext:57 : Spark application id is application_1631008247268_0018
2021-09-09 20:06:25,965 INFO  [Thread-37] sql.SparderContext:57 : Spark context started successfully with stack trace:
2021-09-09 20:06:25,965 INFO  [Thread-37] sql.SparderContext:57 : java.lang.Thread.getStackTrace(Thread.java:1559)
org.apache.spark.sql.SparderContext$$anon$1.$anonfun$run$8(SparderContext.scala:171)
org.apache.spark.internal.Logging.logInfo(Logging.scala:57)
org.apache.spark.internal.Logging.logInfo$(Logging.scala:56)
org.apache.spark.sql.SparderContext$.logInfo(SparderContext.scala:45)
org.apache.spark.sql.SparderContext$$anon$1.run(SparderContext.scala:171)
java.lang.Thread.run(Thread.java:748)
2021-09-09 20:06:25,966 INFO  [Thread-37] sql.SparderContext:57 : Class loader: org.apache.kylin.spark.classloader.SparkClassLoader@3187093f
2021-09-09 20:06:25,969 INFO  [Thread-37] memory.MonitorEnv:57 : create driver monitor env
2021-09-09 20:06:25,976 INFO  [dispatcher-BlockManagerMaster] storage.BlockManagerMasterEndpoint:57 : Registering block manager node2:44050 with 2004.6 MiB RAM, BlockManagerId(1, node2, 44050, None)
2021-09-09 20:06:25,988 INFO  [Thread-37] sql.SparderContext:57 : setup master endpoint finished.hostPort:node3:40116
2021-09-09 20:06:26,025 INFO  [Thread-37] client.AHSProxy:42 : Connecting to Application History server at node1/192.168.111.49:10200
2021-09-09 20:06:26,029 INFO  [Thread-37] sql.SparderContext:57 : Setting initializing Spark thread to null.
2021-09-09 20:06:26,037 ERROR [Query 71e4c4e7-df0a-2b6b-ee48-1b0a8c1d6384-35] service.QueryService:576 : Exception while executing query
java.sql.SQLException: Error while executing SQL "select * from (select part_dt, sum(price) as total_selled, count(distinct seller_id) as sellers from kylin_sales group by part_dt order by part_dt) limit 50000": java.lang.UnsupportedOperationException: Spark version 2.0.0 not supported
        at org.apache.calcite.avatica.Helper.createException(Helper.java:56)
        at org.apache.calcite.avatica.Helper.createException(Helper.java:41)
        at org.apache.calcite.avatica.AvaticaStatement.executeInternal(AvaticaStatement.java:163)
        at org.apache.calcite.avatica.AvaticaStatement.executeQuery(AvaticaStatement.java:227)
        at org.apache.kylin.rest.service.QueryService.executeRequest(QueryService.java:1026)
        at org.apache.kylin.rest.service.QueryService.queryWithSqlMassage(QueryService.java:710)
        at org.apache.kylin.rest.service.QueryService.query(QueryService.java:221)
        at org.apache.kylin.rest.service.QueryService.queryAndUpdateCache(QueryService.java:514)
        at org.apache.kylin.rest.service.QueryService.doQueryWithCache(QueryService.java:469)
        at org.apache.kylin.rest.service.QueryService.doQueryWithCache(QueryService.java:405)
        at org.apache.kylin.rest.controller.QueryController.query(QueryController.java:93)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.springframework.web.method.support.InvocableHandlerMethod.doInvoke(InvocableHandlerMethod.java:205)
        at org.springframework.web.method.support.InvocableHandlerMethod.invokeForRequest(InvocableHandlerMethod.java:133)
        at org.springframework.web.servlet.mvc.method.annotation.ServletInvocableHandlerMethod.invokeAndHandle(ServletInvocableHandlerMethod.java:97)
        at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.invokeHandlerMethod(RequestMappingHandlerAdapter.java:854)
        at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.handleInternal(RequestMappingHandlerAdapter.java:765)
        at org.springframework.web.servlet.mvc.method.AbstractHandlerMethodAdapter.handle(AbstractHandlerMethodAdapter.java:85)
        at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:967)
        at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:901)
        at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:970)
        at org.springframework.web.servlet.FrameworkServlet.doPost(FrameworkServlet.java:872)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:647)
        at org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java:846)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:728)
        at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
        at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
        at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
        at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
        at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
        at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:317)
        at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:127)
        at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:91)
        at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
        at org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:114)
        at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
        at org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:137)
        at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
        at org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:111)
        at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
        at org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:170)
        at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
        at org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:63)
        at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
        at org.springframework.security.web.authentication.www.BasicAuthenticationFilter.doFilterInternal(BasicAuthenticationFilter.java:215)
        at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)
        at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
        at org.springframework.security.web.authentication.AbstractAuthenticationProcessingFilter.doFilter(AbstractAuthenticationProcessingFilter.java:200)
        at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
        at org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:116)
        at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
        at org.springframework.security.web.header.HeaderWriterFilter.doFilterInternal(HeaderWriterFilter.java:64)
        at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)
        at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
        at org.springframework.security.web.context.request.async.WebAsyncManagerIntegrationFilter.doFilterInternal(WebAsyncManagerIntegrationFilter.java:56)
        at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)
        at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
        at org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:105)
        at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
        at org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:214)
        at org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:177)
        at org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:347)
        at org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:263)
        at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
        at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
        at com.thetransactioncompany.cors.CORSFilter.doFilter(CORSFilter.java:209)
        at com.thetransactioncompany.cors.CORSFilter.doFilter(CORSFilter.java:244)
        at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
        at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
        at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:219)
        at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:110)
        at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:492)
        at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:165)
        at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:104)
        at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:1025)
        at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
        at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:452)
        at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1195)
        at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:654)
        at org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:317)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
        at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.RuntimeException: java.lang.UnsupportedOperationException: Spark version 2.0.0 not supported
        at org.apache.kylin.query.exec.SparkExec.collectToEnumerable(SparkExec.java:50)
        at Baz.bind(Unknown Source)
        at org.apache.calcite.jdbc.CalcitePrepare$CalciteSignature.enumerable(CalcitePrepare.java:365)
        at org.apache.calcite.jdbc.CalciteConnectionImpl.enumerable(CalciteConnectionImpl.java:301)
        at org.apache.calcite.jdbc.CalciteMetaImpl._createIterable(CalciteMetaImpl.java:559)
        at org.apache.calcite.jdbc.CalciteMetaImpl.createIterable(CalciteMetaImpl.java:550)
        at org.apache.calcite.avatica.AvaticaResultSet.execute(AvaticaResultSet.java:182)
        at org.apache.calcite.jdbc.CalciteResultSet.execute(CalciteResultSet.java:67)
        at org.apache.calcite.jdbc.CalciteResultSet.execute(CalciteResultSet.java:44)
        at org.apache.calcite.avatica.AvaticaConnection$1.execute(AvaticaConnection.java:667)
        at org.apache.calcite.jdbc.CalciteMetaImpl.prepareAndExecute(CalciteMetaImpl.java:619)
        at org.apache.calcite.avatica.AvaticaConnection.prepareAndExecuteInternal(AvaticaConnection.java:675)
        at org.apache.calcite.avatica.AvaticaStatement.executeInternal(AvaticaStatement.java:156)
        ... 84 more
Caused by: java.lang.UnsupportedOperationException: Spark version 2.0.0 not supported
        at org.apache.spark.utils.KylinReflectUtils$.getSessionState(KylinReflectUtils.scala:42)
        at org.apache.spark.sql.KylinSession.sessionState$lzycompute(KylinSession.scala:47)
        at org.apache.spark.sql.KylinSession.sessionState(KylinSession.scala:46)
        at org.apache.spark.sql.SQLContext.sessionState(SQLContext.scala:78)
        at org.apache.spark.sql.SQLContext.conf(SQLContext.scala:80)
        at org.apache.spark.sql.execution.datasources.FileStatusCache$.getOrCreate(FileStatusCache.scala:44)
        at org.apache.spark.sql.execution.datasource.ShardFileStatusCache$.getFileStatusCache(ShardFileStatusCache.scala:29)
        at org.apache.spark.sql.SparderContext$.$anonfun$initSpark$1(SparderContext.scala:214)
        at org.apache.spark.sql.SparderContext$.withClassLoad(SparderContext.scala:257)
        at org.apache.spark.sql.SparderContext$.initSpark(SparderContext.scala:129)
        at org.apache.spark.sql.SparderContext$.$anonfun$getOriginalSparkSession$1(SparderContext.scala:58)
        at org.apache.spark.sql.SparderContext$.withClassLoad(SparderContext.scala:257)
        at org.apache.spark.sql.SparderContext$.getOriginalSparkSession(SparderContext.scala:55)
        at org.apache.spark.sql.SparderContext$.$anonfun$initSpark$1(SparderContext.scala:214)
        at org.apache.spark.sql.SparderContext$.withClassLoad(SparderContext.scala:257)
        at org.apache.spark.sql.SparderContext$.initSpark(SparderContext.scala:129)
        at org.apache.spark.sql.SparderContext$.$anonfun$getOriginalSparkSession$1(SparderContext.scala:58)
        at org.apache.spark.sql.SparderContext$.withClassLoad(SparderContext.scala:257)
        at org.apache.spark.sql.SparderContext$.getOriginalSparkSession(SparderContext.scala:55)
        at org.apache.spark.sql.SparderContextFacade$.current(SparderContextFacade.scala:34)
        at org.apache.spark.sql.SparderContext$.getSparkSession(SparderContext.scala:65)
        at org.apache.kylin.query.runtime.plans.TableScanPlan$.$anonfun$createOLAPTable$1(TableScanPlan.scala:78)
        at org.apache.spark.utils.LogEx.logTime(LogEx.scala:40)
        at org.apache.spark.utils.LogEx.logTime$(LogEx.scala:38)
        at org.apache.kylin.query.runtime.plans.TableScanPlan$.logTime(TableScanPlan.scala:43)
        at org.apache.kylin.query.runtime.plans.TableScanPlan$.createOLAPTable(TableScanPlan.scala:51)
        at org.apache.kylin.query.runtime.CalciteToSparkPlaner.$anonfun$visit$2(CalciteToSparkPlaner.scala:58)
        at org.apache.kylin.query.runtime.CalciteToSparkPlaner.logTime(CalciteToSparkPlaner.scala:120)
        at org.apache.kylin.query.runtime.CalciteToSparkPlaner.visit(CalciteToSparkPlaner.scala:58)
        at org.apache.calcite.rel.SingleRel.childrenAccept(SingleRel.java:72)
        at org.apache.kylin.query.runtime.CalciteToSparkPlaner.visit(CalciteToSparkPlaner.scala:45)
        at org.apache.calcite.rel.SingleRel.childrenAccept(SingleRel.java:72)
        at org.apache.kylin.query.runtime.CalciteToSparkPlaner.visit(CalciteToSparkPlaner.scala:45)
        at org.apache.calcite.rel.SingleRel.childrenAccept(SingleRel.java:72)
        at org.apache.kylin.query.runtime.CalciteToSparkPlaner.visit(CalciteToSparkPlaner.scala:45)
        at org.apache.calcite.rel.SingleRel.childrenAccept(SingleRel.java:72)
        at org.apache.kylin.query.runtime.CalciteToSparkPlaner.visit(CalciteToSparkPlaner.scala:45)
        at org.apache.calcite.rel.RelVisitor.go(RelVisitor.java:61)
        at org.apache.kylin.query.runtime.SparkEngine.toSparkPlan(SparkEngine.java:60)
        at org.apache.kylin.query.runtime.SparkEngine.compute(SparkEngine.java:49)
        at org.apache.kylin.query.exec.QueryEngineFactory.compute(QueryEngineFactory.java:47)
        at org.apache.kylin.query.exec.SparkExec.collectToEnumerable(SparkExec.java:41)
        ... 96 more
2021-09-09 20:06:26,041 INFO  [Query 71e4c4e7-df0a-2b6b-ee48-1b0a8c1d6384-35] service.QueryService:1222 : Processed rows for each storageContext: 0
2021-09-09 20:06:26,048 INFO  [Query 71e4c4e7-df0a-2b6b-ee48-1b0a8c1d6384-35] service.QueryService:391 :
==========================[QUERY]===============================
Query Id: 71e4c4e7-df0a-2b6b-ee48-1b0a8c1d6384
SQL: select part_dt, sum(price) as total_selled, count(distinct seller_id) as sellers from kylin_sales group by part_dt order by part_dt
User: ADMIN
Success: false
Duration: 21.519
Project: learn_kylin
Realization Names: [CUBE[name=kylin_sales_cube]]
Cuboid Ids: [16384]
Is Exactly Matched: [false]
Total scan count: 0
Total scan files: 0
Total metadata time: 0ms
Total spark scan time: 0ms
Total scan bytes: -1
Result row count: 0
Storage cache used: false
Is Query Push-Down: false
Is Prepare: false
Used Spark pool: null
Trace URL: null
Message: java.lang.UnsupportedOperationException: Spark version 2.0.0 not supported
while executing SQL: "select * from (select part_dt, sum(price) as total_selled, count(distinct seller_id) as sellers from kylin_sales group by part_dt order by part_dt) limit 50000"
==========================[QUERY]===============================
2021-09-09 20:06:26,052 ERROR [http-bio-7070-exec-1] controller.BasicController:65 :
org.apache.kylin.rest.exception.InternalErrorException: java.lang.UnsupportedOperationException: Spark version 2.0.0 not supported
while executing SQL: "select * from (select part_dt, sum(price) as total_selled, count(distinct seller_id) as sellers from kylin_sales group by part_dt order by part_dt) limit 50000"
        at org.apache.kylin.rest.service.QueryService.doQueryWithCache(QueryService.java:486)
        at org.apache.kylin.rest.service.QueryService.doQueryWithCache(QueryService.java:405)
        at org.apache.kylin.rest.controller.QueryController.query(QueryController.java:93)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.springframework.web.method.support.InvocableHandlerMethod.doInvoke(InvocableHandlerMethod.java:205)
        at org.springframework.web.method.support.InvocableHandlerMethod.invokeForRequest(InvocableHandlerMethod.java:133)
        at org.springframework.web.servlet.mvc.method.annotation.ServletInvocableHandlerMethod.invokeAndHandle(ServletInvocableHandlerMethod.java:97)
        at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.invokeHandlerMethod(RequestMappingHandlerAdapter.java:854)
        at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.handleInternal(RequestMappingHandlerAdapter.java:765)
        at org.springframework.web.servlet.mvc.method.AbstractHandlerMethodAdapter.handle(AbstractHandlerMethodAdapter.java:85)
        at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:967)
        at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:901)
        at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:970)
        at org.springframework.web.servlet.FrameworkServlet.doPost(FrameworkServlet.java:872)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:647)
        at org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java:846)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:728)
        at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
        at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
        at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
        at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
        at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
        at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:317)
        at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:127)
        at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:91)
        at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
        at org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:114)
        at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
        at org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:137)
        at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
        at org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:111)
        at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
        at org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:170)
        at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
        at org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:63)
        at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
        at org.springframework.security.web.authentication.www.BasicAuthenticationFilter.doFilterInternal(BasicAuthenticationFilter.java:215)
        at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)
        at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
        at org.springframework.security.web.authentication.AbstractAuthenticationProcessingFilter.doFilter(AbstractAuthenticationProcessingFilter.java:200)
        at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
        at org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:116)
        at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
        at org.springframework.security.web.header.HeaderWriterFilter.doFilterInternal(HeaderWriterFilter.java:64)
        at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)
        at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
        at org.springframework.security.web.context.request.async.WebAsyncManagerIntegrationFilter.doFilterInternal(WebAsyncManagerIntegrationFilter.java:56)
        at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)
        at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
        at org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:105)
        at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
        at org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:214)
        at org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:177)
        at org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:347)
        at org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:263)
        at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
        at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
        at com.thetransactioncompany.cors.CORSFilter.doFilter(CORSFilter.java:209)
        at com.thetransactioncompany.cors.CORSFilter.doFilter(CORSFilter.java:244)
        at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
        at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
        at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:219)
        at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:110)
        at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:492)
        at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:165)
        at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:104)
        at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:1025)
        at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
        at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:452)
        at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1195)
        at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:654)
        at org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:317)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
        at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.UnsupportedOperationException: Spark version 2.0.0 not supported
        at org.apache.spark.utils.KylinReflectUtils$.getSessionState(KylinReflectUtils.scala:42)
        at org.apache.spark.sql.KylinSession.sessionState$lzycompute(KylinSession.scala:47)
        at org.apache.spark.sql.KylinSession.sessionState(KylinSession.scala:46)
        at org.apache.spark.sql.SQLContext.sessionState(SQLContext.scala:78)
        at org.apache.spark.sql.SQLContext.conf(SQLContext.scala:80)
        at org.apache.spark.sql.execution.datasources.FileStatusCache$.getOrCreate(FileStatusCache.scala:44)
        at org.apache.spark.sql.execution.datasource.ShardFileStatusCache$.getFileStatusCache(ShardFileStatusCache.scala:29)
        at org.apache.spark.sql.SparderContext$.$anonfun$initSpark$1(SparderContext.scala:214)
        at org.apache.spark.sql.SparderContext$.withClassLoad(SparderContext.scala:257)
        at org.apache.spark.sql.SparderContext$.initSpark(SparderContext.scala:129)
        at org.apache.spark.sql.SparderContext$.$anonfun$getOriginalSparkSession$1(SparderContext.scala:58)
        at org.apache.spark.sql.SparderContext$.withClassLoad(SparderContext.scala:257)
        at org.apache.spark.sql.SparderContext$.getOriginalSparkSession(SparderContext.scala:55)
        at org.apache.spark.sql.SparderContext$.$anonfun$initSpark$1(SparderContext.scala:214)
        at org.apache.spark.sql.SparderContext$.withClassLoad(SparderContext.scala:257)
        at org.apache.spark.sql.SparderContext$.initSpark(SparderContext.scala:129)
        at org.apache.spark.sql.SparderContext$.$anonfun$getOriginalSparkSession$1(SparderContext.scala:58)
        at org.apache.spark.sql.SparderContext$.withClassLoad(SparderContext.scala:257)
        at org.apache.spark.sql.SparderContext$.getOriginalSparkSession(SparderContext.scala:55)
        at org.apache.spark.sql.SparderContextFacade$.current(SparderContextFacade.scala:34)
        at org.apache.spark.sql.SparderContext$.getSparkSession(SparderContext.scala:65)
        at org.apache.kylin.query.runtime.plans.TableScanPlan$.$anonfun$createOLAPTable$1(TableScanPlan.scala:78)
        at org.apache.spark.utils.LogEx.logTime(LogEx.scala:40)
        at org.apache.spark.utils.LogEx.logTime$(LogEx.scala:38)
        at org.apache.kylin.query.runtime.plans.TableScanPlan$.logTime(TableScanPlan.scala:43)
        at org.apache.kylin.query.runtime.plans.TableScanPlan$.createOLAPTable(TableScanPlan.scala:51)
        at org.apache.kylin.query.runtime.CalciteToSparkPlaner.$anonfun$visit$2(CalciteToSparkPlaner.scala:58)
        at org.apache.kylin.query.runtime.CalciteToSparkPlaner.logTime(CalciteToSparkPlaner.scala:120)
        at org.apache.kylin.query.runtime.CalciteToSparkPlaner.visit(CalciteToSparkPlaner.scala:58)
        at org.apache.calcite.rel.SingleRel.childrenAccept(SingleRel.java:72)
        at org.apache.kylin.query.runtime.CalciteToSparkPlaner.visit(CalciteToSparkPlaner.scala:45)
        at org.apache.calcite.rel.SingleRel.childrenAccept(SingleRel.java:72)
        at org.apache.kylin.query.runtime.CalciteToSparkPlaner.visit(CalciteToSparkPlaner.scala:45)
        at org.apache.calcite.rel.SingleRel.childrenAccept(SingleRel.java:72)
        at org.apache.kylin.query.runtime.CalciteToSparkPlaner.visit(CalciteToSparkPlaner.scala:45)
        at org.apache.calcite.rel.SingleRel.childrenAccept(SingleRel.java:72)
        at org.apache.kylin.query.runtime.CalciteToSparkPlaner.visit(CalciteToSparkPlaner.scala:45)
        at org.apache.calcite.rel.RelVisitor.go(RelVisitor.java:61)
        at org.apache.kylin.query.runtime.SparkEngine.toSparkPlan(SparkEngine.java:60)
        at org.apache.kylin.query.runtime.SparkEngine.compute(SparkEngine.java:49)
        at org.apache.kylin.query.exec.QueryEngineFactory.compute(QueryEngineFactory.java:47)
        at org.apache.kylin.query.exec.SparkExec.collectToEnumerable(SparkExec.java:41)
        at Baz.bind(Unknown Source)
        at org.apache.calcite.jdbc.CalcitePrepare$CalciteSignature.enumerable(CalcitePrepare.java:365)
        at org.apache.calcite.jdbc.CalciteConnectionImpl.enumerable(CalciteConnectionImpl.java:301)
        at org.apache.calcite.jdbc.CalciteMetaImpl._createIterable(CalciteMetaImpl.java:559)
        at org.apache.calcite.jdbc.CalciteMetaImpl.createIterable(CalciteMetaImpl.java:550)
        at org.apache.calcite.avatica.AvaticaResultSet.execute(AvaticaResultSet.java:182)
        at org.apache.calcite.jdbc.CalciteResultSet.execute(CalciteResultSet.java:67)
        at org.apache.calcite.jdbc.CalciteResultSet.execute(CalciteResultSet.java:44)
        at org.apache.calcite.avatica.AvaticaConnection$1.execute(AvaticaConnection.java:667)
        at org.apache.calcite.jdbc.CalciteMetaImpl.prepareAndExecute(CalciteMetaImpl.java:619)
        at org.apache.calcite.avatica.AvaticaConnection.prepareAndExecuteInternal(AvaticaConnection.java:675)
        at org.apache.calcite.avatica.AvaticaStatement.executeInternal(AvaticaStatement.java:156)
        at org.apache.calcite.avatica.AvaticaStatement.executeQuery(AvaticaStatement.java:227)
        at org.apache.kylin.rest.service.QueryService.executeRequest(QueryService.java:1026)
        at org.apache.kylin.rest.service.QueryService.queryWithSqlMassage(QueryService.java:710)
        at org.apache.kylin.rest.service.QueryService.query(QueryService.java:221)
        at org.apache.kylin.rest.service.QueryService.queryAndUpdateCache(QueryService.java:514)
        at org.apache.kylin.rest.service.QueryService.doQueryWithCache(QueryService.java:469)
        ... 78 more
2021-09-09 20:06:35,790 INFO  [FetcherRunner 1341171673-29] threadpool.DefaultFetcherRunner:117 : Job Fetcher: 0 should running, 0 actual running, 0 stopped, 0 ready, 2 already succeed, 0 error, 0 discarded, 0 others

{code}
The web UI shows the following:

!image-2021-09-09-20-11-08-870.png!


My Spark installation in this environment is clearly spark-3.1.2-bin-hadoop3.2. Why does Kylin report Spark version 2.0.0?
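For context, the failure originates in KylinReflectUtils.getSessionState, which appears to accept only specific Spark release lines when picking a session-state builder. The sketch below is a hypothetical Java reconstruction of that kind of guard, not Kylin's actual source; the builder class name and the accepted version prefixes are illustrative assumptions:

```java
// Hypothetical sketch of a Spark version guard similar in spirit to
// KylinReflectUtils.getSessionState (NOT Kylin's actual code).
// Accepted prefixes and the returned class name are illustrative only.
public final class SparkVersionCheck {

    // Return the session-state builder class for a supported Spark line,
    // or fail the same way the log shows for anything else.
    static String sessionStateClass(String sparkVersion) {
        if (sparkVersion.startsWith("2.4") || sparkVersion.startsWith("3.1")) {
            // Illustrative class name, not necessarily what Kylin uses.
            return "org.apache.spark.sql.hive.KylinHiveSessionStateBuilder";
        }
        throw new UnsupportedOperationException(
                "Spark version " + sparkVersion + " not supported");
    }

    public static void main(String[] args) {
        // A 3.1.x version passes the guard; "2.0.0" would throw,
        // matching the exception in the stack trace above.
        System.out.println(sessionStateClass("3.1.2"));
    }
}
```

If a guard like this sees "2.0.0", the likely explanation is that the version string Kylin reads (Spark's SPARK_VERSION, loaded from build metadata on the classpath) does not come from the spark-3.1.2 jars you installed — e.g. a stale or shaded Spark jar elsewhere on Kylin's classpath — rather than Kylin deliberately calling Spark 2.0.0.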



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

Re:[jira] [Created] (KYLIN-5087) java.lang.UnsupportedOperationException: Spark version 2.0.0 not supported

Posted by Big Data Development Engineer 付德彬 <bi...@163.com>.
Unsubscribe




On 09/9/2021 20:20, 曹勇 (Jira) <ji...@apache.org> wrote:
[quoted original message trimmed; see above]
at org.apache.spark.sql.SparderContext$.withClassLoad(SparderContext.scala:257)
at org.apache.spark.sql.SparderContext$.initSpark(SparderContext.scala:129)
at org.apache.spark.sql.SparderContext$.$anonfun$getOriginalSparkSession$1(SparderContext.scala:58)
at org.apache.spark.sql.SparderContext$.withClassLoad(SparderContext.scala:257)
at org.apache.spark.sql.SparderContext$.getOriginalSparkSession(SparderContext.scala:55)
at org.apache.spark.sql.SparderContextFacade$.current(SparderContextFacade.scala:34)
at org.apache.spark.sql.SparderContext$.getSparkSession(SparderContext.scala:65)
at org.apache.kylin.query.runtime.plans.TableScanPlan$.$anonfun$createOLAPTable$1(TableScanPlan.scala:78)
at org.apache.spark.utils.LogEx.logTime(LogEx.scala:40)
at org.apache.spark.utils.LogEx.logTime$(LogEx.scala:38)
at org.apache.kylin.query.runtime.plans.TableScanPlan$.logTime(TableScanPlan.scala:43)
at org.apache.kylin.query.runtime.plans.TableScanPlan$.createOLAPTable(TableScanPlan.scala:51)
at org.apache.kylin.query.runtime.CalciteToSparkPlaner.$anonfun$visit$2(CalciteToSparkPlaner.scala:58)
at org.apache.kylin.query.runtime.CalciteToSparkPlaner.logTime(CalciteToSparkPlaner.scala:120)
at org.apache.kylin.query.runtime.CalciteToSparkPlaner.visit(CalciteToSparkPlaner.scala:58)
at org.apache.calcite.rel.SingleRel.childrenAccept(SingleRel.java:72)
at org.apache.kylin.query.runtime.CalciteToSparkPlaner.visit(CalciteToSparkPlaner.scala:45)
at org.apache.calcite.rel.SingleRel.childrenAccept(SingleRel.java:72)
at org.apache.kylin.query.runtime.CalciteToSparkPlaner.visit(CalciteToSparkPlaner.scala:45)
at org.apache.calcite.rel.SingleRel.childrenAccept(SingleRel.java:72)
at org.apache.kylin.query.runtime.CalciteToSparkPlaner.visit(CalciteToSparkPlaner.scala:45)
at org.apache.calcite.rel.SingleRel.childrenAccept(SingleRel.java:72)
at org.apache.kylin.query.runtime.CalciteToSparkPlaner.visit(CalciteToSparkPlaner.scala:45)
at org.apache.calcite.rel.RelVisitor.go(RelVisitor.java:61)
at org.apache.kylin.query.runtime.SparkEngine.toSparkPlan(SparkEngine.java:60)
at org.apache.kylin.query.runtime.SparkEngine.compute(SparkEngine.java:49)
at org.apache.kylin.query.exec.QueryEngineFactory.compute(QueryEngineFactory.java:47)
at org.apache.kylin.query.exec.SparkExec.collectToEnumerable(SparkExec.java:41)
... 96 more
2021-09-09 20:06:26,041 INFO  [Query 71e4c4e7-df0a-2b6b-ee48-1b0a8c1d6384-35] service.QueryService:1222 : Processed rows for each storageContext: 0
2021-09-09 20:06:26,048 INFO  [Query 71e4c4e7-df0a-2b6b-ee48-1b0a8c1d6384-35] service.QueryService:391 :
==========================[QUERY]===============================
Query Id: 71e4c4e7-df0a-2b6b-ee48-1b0a8c1d6384
SQL: select part_dt, sum(price) as total_selled, count(distinct seller_id) as sellers from kylin_sales group by part_dt order by part_dt
User: ADMIN
Success: false
Duration: 21.519
Project: learn_kylin
Realization Names: [CUBE[name=kylin_sales_cube]]
Cuboid Ids: [16384]
Is Exactly Matched: [false]
Total scan count: 0
Total scan files: 0
Total metadata time: 0ms
Total spark scan time: 0ms
Total scan bytes: -1
Result row count: 0
Storage cache used: false
Is Query Push-Down: false
Is Prepare: false
Used Spark pool: null
Trace URL: null
Message: java.lang.UnsupportedOperationException: Spark version 2.0.0 not supported
while executing SQL: "select * from (select part_dt, sum(price) as total_selled, count(distinct seller_id) as sellers from kylin_sales group by part_dt order by part_dt) limit 50000"
==========================[QUERY]===============================2021-09-09 20:06:26,052 ERROR [http-bio-7070-exec-1] controller.BasicController:65 :
org.apache.kylin.rest.exception.InternalErrorException: java.lang.UnsupportedOperationException: Spark version 2.0.0 not supported
while executing SQL: "select * from (select part_dt, sum(price) as total_selled, count(distinct seller_id) as sellers from kylin_sales group by part_dt order by part_dt) limit 50000"
at org.apache.kylin.rest.service.QueryService.doQueryWithCache(QueryService.java:486)
at org.apache.kylin.rest.service.QueryService.doQueryWithCache(QueryService.java:405)
at org.apache.kylin.rest.controller.QueryController.query(QueryController.java:93)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.springframework.web.method.support.InvocableHandlerMethod.doInvoke(InvocableHandlerMethod.java:205)
at org.springframework.web.method.support.InvocableHandlerMethod.invokeForRequest(InvocableHandlerMethod.java:133)
at org.springframework.web.servlet.mvc.method.annotation.ServletInvocableHandlerMethod.invokeAndHandle(ServletInvocableHandlerMethod.java:97)
at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.invokeHandlerMethod(RequestMappingHandlerAdapter.java:854)
at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.handleInternal(RequestMappingHandlerAdapter.java:765)
at org.springframework.web.servlet.mvc.method.AbstractHandlerMethodAdapter.handle(AbstractHandlerMethodAdapter.java:85)
at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:967)
at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:901)
at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:970)
at org.springframework.web.servlet.FrameworkServlet.doPost(FrameworkServlet.java:872)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:647)
at org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java:846)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:728)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:317)
at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:127)
at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:91)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
at org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:114)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
at org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:137)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
at org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:111)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
at org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:170)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
at org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:63)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
at org.springframework.security.web.authentication.www.BasicAuthenticationFilter.doFilterInternal(BasicAuthenticationFilter.java:215)
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
at org.springframework.security.web.authentication.AbstractAuthenticationProcessingFilter.doFilter(AbstractAuthenticationProcessingFilter.java:200)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
at org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:116)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
at org.springframework.security.web.header.HeaderWriterFilter.doFilterInternal(HeaderWriterFilter.java:64)
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
at org.springframework.security.web.context.request.async.WebAsyncManagerIntegrationFilter.doFilterInternal(WebAsyncManagerIntegrationFilter.java:56)
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
at org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:105)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
at org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:214)
at org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:177)
at org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:347)
at org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:263)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
at com.thetransactioncompany.cors.CORSFilter.doFilter(CORSFilter.java:209)
at com.thetransactioncompany.cors.CORSFilter.doFilter(CORSFilter.java:244)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:219)
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:110)
at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:492)
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:165)
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:104)
at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:1025)
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:452)
at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1195)
at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:654)
at org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:317)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.UnsupportedOperationException: Spark version 2.0.0 not supported
at org.apache.spark.utils.KylinReflectUtils$.getSessionState(KylinReflectUtils.scala:42)
at org.apache.spark.sql.KylinSession.sessionState$lzycompute(KylinSession.scala:47)
at org.apache.spark.sql.KylinSession.sessionState(KylinSession.scala:46)
at org.apache.spark.sql.SQLContext.sessionState(SQLContext.scala:78)
at org.apache.spark.sql.SQLContext.conf(SQLContext.scala:80)
at org.apache.spark.sql.execution.datasources.FileStatusCache$.getOrCreate(FileStatusCache.scala:44)
at org.apache.spark.sql.execution.datasource.ShardFileStatusCache$.getFileStatusCache(ShardFileStatusCache.scala:29)
at org.apache.spark.sql.SparderContext$.$anonfun$initSpark$1(SparderContext.scala:214)
at org.apache.spark.sql.SparderContext$.withClassLoad(SparderContext.scala:257)
at org.apache.spark.sql.SparderContext$.initSpark(SparderContext.scala:129)
at org.apache.spark.sql.SparderContext$.$anonfun$getOriginalSparkSession$1(SparderContext.scala:58)
at org.apache.spark.sql.SparderContext$.withClassLoad(SparderContext.scala:257)
at org.apache.spark.sql.SparderContext$.getOriginalSparkSession(SparderContext.scala:55)
at org.apache.spark.sql.SparderContext$.$anonfun$initSpark$1(SparderContext.scala:214)
at org.apache.spark.sql.SparderContext$.withClassLoad(SparderContext.scala:257)
at org.apache.spark.sql.SparderContext$.initSpark(SparderContext.scala:129)
at org.apache.spark.sql.SparderContext$.$anonfun$getOriginalSparkSession$1(SparderContext.scala:58)
at org.apache.spark.sql.SparderContext$.withClassLoad(SparderContext.scala:257)
at org.apache.spark.sql.SparderContext$.getOriginalSparkSession(SparderContext.scala:55)
at org.apache.spark.sql.SparderContextFacade$.current(SparderContextFacade.scala:34)
at org.apache.spark.sql.SparderContext$.getSparkSession(SparderContext.scala:65)
at org.apache.kylin.query.runtime.plans.TableScanPlan$.$anonfun$createOLAPTable$1(TableScanPlan.scala:78)
at org.apache.spark.utils.LogEx.logTime(LogEx.scala:40)
at org.apache.spark.utils.LogEx.logTime$(LogEx.scala:38)
at org.apache.kylin.query.runtime.plans.TableScanPlan$.logTime(TableScanPlan.scala:43)
at org.apache.kylin.query.runtime.plans.TableScanPlan$.createOLAPTable(TableScanPlan.scala:51)
at org.apache.kylin.query.runtime.CalciteToSparkPlaner.$anonfun$visit$2(CalciteToSparkPlaner.scala:58)
at org.apache.kylin.query.runtime.CalciteToSparkPlaner.logTime(CalciteToSparkPlaner.scala:120)
at org.apache.kylin.query.runtime.CalciteToSparkPlaner.visit(CalciteToSparkPlaner.scala:58)
at org.apache.calcite.rel.SingleRel.childrenAccept(SingleRel.java:72)
at org.apache.kylin.query.runtime.CalciteToSparkPlaner.visit(CalciteToSparkPlaner.scala:45)
at org.apache.calcite.rel.SingleRel.childrenAccept(SingleRel.java:72)
at org.apache.kylin.query.runtime.CalciteToSparkPlaner.visit(CalciteToSparkPlaner.scala:45)
at org.apache.calcite.rel.SingleRel.childrenAccept(SingleRel.java:72)
at org.apache.kylin.query.runtime.CalciteToSparkPlaner.visit(CalciteToSparkPlaner.scala:45)
at org.apache.calcite.rel.SingleRel.childrenAccept(SingleRel.java:72)
at org.apache.kylin.query.runtime.CalciteToSparkPlaner.visit(CalciteToSparkPlaner.scala:45)
at org.apache.calcite.rel.RelVisitor.go(RelVisitor.java:61)
at org.apache.kylin.query.runtime.SparkEngine.toSparkPlan(SparkEngine.java:60)
at org.apache.kylin.query.runtime.SparkEngine.compute(SparkEngine.java:49)
at org.apache.kylin.query.exec.QueryEngineFactory.compute(QueryEngineFactory.java:47)
at org.apache.kylin.query.exec.SparkExec.collectToEnumerable(SparkExec.java:41)
at Baz.bind(Unknown Source)
at org.apache.calcite.jdbc.CalcitePrepare$CalciteSignature.enumerable(CalcitePrepare.java:365)
at org.apache.calcite.jdbc.CalciteConnectionImpl.enumerable(CalciteConnectionImpl.java:301)
at org.apache.calcite.jdbc.CalciteMetaImpl._createIterable(CalciteMetaImpl.java:559)
at org.apache.calcite.jdbc.CalciteMetaImpl.createIterable(CalciteMetaImpl.java:550)
at org.apache.calcite.avatica.AvaticaResultSet.execute(AvaticaResultSet.java:182)
at org.apache.calcite.jdbc.CalciteResultSet.execute(CalciteResultSet.java:67)
at org.apache.calcite.jdbc.CalciteResultSet.execute(CalciteResultSet.java:44)
at org.apache.calcite.avatica.AvaticaConnection$1.execute(AvaticaConnection.java:667)
at org.apache.calcite.jdbc.CalciteMetaImpl.prepareAndExecute(CalciteMetaImpl.java:619)
at org.apache.calcite.avatica.AvaticaConnection.prepareAndExecuteInternal(AvaticaConnection.java:675)
at org.apache.calcite.avatica.AvaticaStatement.executeInternal(AvaticaStatement.java:156)
at org.apache.calcite.avatica.AvaticaStatement.executeQuery(AvaticaStatement.java:227)
at org.apache.kylin.rest.service.QueryService.executeRequest(QueryService.java:1026)
at org.apache.kylin.rest.service.QueryService.queryWithSqlMassage(QueryService.java:710)
at org.apache.kylin.rest.service.QueryService.query(QueryService.java:221)
at org.apache.kylin.rest.service.QueryService.queryAndUpdateCache(QueryService.java:514)
at org.apache.kylin.rest.service.QueryService.doQueryWithCache(QueryService.java:469)
... 78 more
2021-09-09 20:06:35,790 INFO  [FetcherRunner 1341171673-29] threadpool.DefaultFetcherRunner:117 : Job Fetcher: 0 should running, 0 actual running, 0 stopped, 0 ready, 2 already succeed, 0 error, 0 discarded, 0 others

{code}
The web UI shows the following:

!image-2021-09-09-20-11-08-870.png!

 

Obviously, the Spark installed in this environment is spark-3.1.2-bin-hadoop3.2. So why does Kylin detect Spark version 2.0.0?
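From the stack trace, the failure originates in {{KylinReflectUtils.getSessionState}}, which rejects the version string Spark reports at runtime rather than the version of the installed distribution. As a rough sketch (this is not Kylin's actual code; the supported-prefix list and class name are assumptions for illustration only), the gate behaves roughly like:

```java
// Hypothetical sketch of a Spark version gate similar in spirit to what
// KylinReflectUtils.getSessionState appears to do. The real class branches on
// the version string Spark reports at runtime (org.apache.spark.SPARK_VERSION);
// the "3.1" supported prefix below is an assumption for illustration.
public class SparkVersionGate {

    private static final String[] SUPPORTED_PREFIXES = {"3.1"};

    public static boolean isSupported(String sparkVersion) {
        for (String prefix : SUPPORTED_PREFIXES) {
            if (sparkVersion.startsWith(prefix)) {
                return true;
            }
        }
        return false;
    }

    public static void main(String[] args) {
        // If the runtime-reported version is "2.0.0" (as in this bug), the gate
        // rejects it even though spark-3.1.2 is installed on the machine, which
        // suggests Kylin is not picking up the version from SPARK_HOME.
        if (!isSupported("2.0.0")) {
            System.out.println("Spark version 2.0.0 not supported");
        }
        System.out.println(isSupported("3.1.2")); // prints "true"
    }
}
```

If this reading is right, the thing to verify is which Spark jars end up on Kylin's classpath (e.g. whether {{SPARK_HOME}} points at the spark-3.1.2 distribution), since the check is driven by the version string baked into those jars, not by what is installed elsewhere.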



--
This message was sent by Atlassian Jira
(v8.3.4#803005)