Posted to issues@iceberg.apache.org by "zhangjiuyang1993 (via GitHub)" <gi...@apache.org> on 2023/01/31 05:54:01 UTC

[GitHub] [iceberg] zhangjiuyang1993 opened a new issue, #6708: Quick start docker-compose demo doesn't work

zhangjiuyang1993 opened a new issue, #6708:
URL: https://github.com/apache/iceberg/issues/6708

   ### Apache Iceberg version
   
   0.14.1
   
   ### Query engine
   
   Spark
   
   ### Please describe the bug 🐞
   
   @nastra 
   version: "3"
   
   services:
   spark-iceberg:
   image: tabulario/spark-iceberg
   container_name: spark-iceberg
   build: spark/
   depends_on:
   - rest
   - minio
   volumes:
   - ./warehouse:/home/iceberg/warehouse
   - ./notebooks:/home/iceberg/notebooks/notebooks
   environment:
   - AWS_ACCESS_KEY_ID=admin
   - AWS_SECRET_ACCESS_KEY=password
   - AWS_REGION=us-east-1
   ports:
   - 8888:8888
   - 8080:8080
   links:
   - rest:rest
   - minio:minio
   rest:
   image: tabulario/iceberg-rest:0.1.0
   ports:
   - 8181:8181
   environment:
   - AWS_ACCESS_KEY_ID=admin
   - AWS_SECRET_ACCESS_KEY=password
   - AWS_REGION=us-east-1
   - CATALOG_WAREHOUSE=s3a://warehouse/wh/
   - CATALOG_IO__IMPL=org.apache.iceberg.aws.s3.S3FileIO
   - CATALOG_S3_ENDPOINT=http://minio:9000/
   minio:
   image: minio/minio
   container_name: minio
   environment:
   - MINIO_ROOT_USER=admin
   - MINIO_ROOT_PASSWORD=password
   ports:
   - 9001:9001
   - 9000:9000
   command: ["server", "/data", "--console-address", ":9001"]
   mc:
   depends_on:
   - minio
   image: minio/mc
   container_name: mc
   environment:
   - AWS_ACCESS_KEY_ID=admin
   - AWS_SECRET_ACCESS_KEY=password
   - AWS_REGION=us-east-1
   entrypoint: >
   /bin/sh -c "
   until (/usr/bin/mc config host add minio http://minio:9000/ admin password) do echo '...waiting...' && sleep 1; done;
   /usr/bin/mc rm -r --force minio/warehouse;
   /usr/bin/mc mb minio/warehouse;
   /usr/bin/mc policy set public minio/warehouse;
   exit 0;
   "
   With the configuration above, I got the following error:
   23/01/31 01:43:58 WARN RESTSessionCatalog: Failed to report metrics to REST endpoint for table nyc.taxis
   org.apache.iceberg.exceptions.BadRequestException: Malformed request: No route for request: POST v1/namespaces/nyc/tables/taxis/metrics
   at org.apache.iceberg.rest.ErrorHandlers$DefaultErrorHandler.accept(ErrorHandlers.java:152)
   at org.apache.iceberg.rest.ErrorHandlers$DefaultErrorHandler.accept(ErrorHandlers.java:135)
   at org.apache.iceberg.rest.HTTPClient.throwFailure(HTTPClient.java:150)
   at org.apache.iceberg.rest.HTTPClient.execute(HTTPClient.java:224)
   at org.apache.iceberg.rest.HTTPClient.post(HTTPClient.java:269)
   at org.apache.iceberg.rest.RESTClient.post(RESTClient.java:112)
   at org.apache.iceberg.rest.RESTSessionCatalog.reportMetrics(RESTSessionCatalog.java:321)
   at org.apache.iceberg.rest.RESTSessionCatalog.lambda$loadTable$2(RESTSessionCatalog.java:307)
   at org.apache.iceberg.BaseTableScan.lambda$planFiles$0(BaseTableScan.java:168)
   at org.apache.iceberg.io.CloseableIterable$3.close(CloseableIterable.java:95)
   at org.apache.iceberg.spark.source.SparkBatchQueryScan.files(SparkBatchQueryScan.java:125)
   at org.apache.iceberg.spark.source.SparkBatchQueryScan.tasks(SparkBatchQueryScan.java:138)
   at org.apache.iceberg.spark.source.SparkScan.toBatch(SparkScan.java:111)
   at org.apache.spark.sql.execution.datasources.v2.BatchScanExec.batch$lzycompute(BatchScanExec.scala:42)
   at org.apache.spark.sql.execution.datasources.v2.BatchScanExec.batch(BatchScanExec.scala:42)
   at org.apache.spark.sql.execution.datasources.v2.BatchScanExec.inputPartitions$lzycompute(BatchScanExec.scala:54)
   at org.apache.spark.sql.execution.datasources.v2.BatchScanExec.inputPartitions(BatchScanExec.scala:54)
   at org.apache.spark.sql.execution.datasources.v2.DataSourceV2ScanExecBase.supportsColumnar(DataSourceV2ScanExecBase.scala:142)
   at org.apache.spark.sql.execution.datasources.v2.DataSourceV2ScanExecBase.supportsColumnar$(DataSourceV2ScanExecBase.scala:141)
   at org.apache.spark.sql.execution.datasources.v2.BatchScanExec.supportsColumnar(BatchScanExec.scala:36)
   at org.apache.spark.sql.execution.datasources.v2.DataSourceV2Strategy.apply(DataSourceV2Strategy.scala:143)
   at org.apache.spark.sql.catalyst.planning.QueryPlanner.$anonfun$plan$1(QueryPlanner.scala:63)
   at scala.collection.Iterator$$anon$11.nextCur(Iterator.scala:486)
   at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:492)
   at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:491)
   at org.apache.spark.sql.catalyst.planning.QueryPlanner.plan(QueryPlanner.scala:93)
   at org.apache.spark.sql.execution.SparkStrategies.plan(SparkStrategies.scala:69)
   at org.apache.spark.sql.catalyst.planning.QueryPlanner.$anonfun$plan$3(QueryPlanner.scala:78)
   at scala.collection.TraversableOnce$folder$1.apply(TraversableOnce.scala:196)
   at scala.collection.TraversableOnce$folder$1.apply(TraversableOnce.scala:194)
   at scala.collection.Iterator.foreach(Iterator.scala:943)
   at scala.collection.Iterator.foreach$(Iterator.scala:943)
   at scala.collection.AbstractIterator.foreach(Iterator.scala:1431)
   at scala.collection.TraversableOnce.foldLeft(TraversableOnce.scala:199)
   at scala.collection.TraversableOnce.foldLeft$(TraversableOnce.scala:192)
   at scala.collection.AbstractIterator.foldLeft(Iterator.scala:1431)
   at org.apache.spark.sql.catalyst.planning.QueryPlanner.$anonfun$plan$2(QueryPlanner.scala:75)
   at scala.collection.Iterator$$anon$11.nextCur(Iterator.scala:486)
   at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:492)
   at org.apache.spark.sql.catalyst.planning.QueryPlanner.plan(QueryPlanner.scala:93)
   at org.apache.spark.sql.execution.SparkStrategies.plan(SparkStrategies.scala:69)
   at org.apache.spark.sql.catalyst.planning.QueryPlanner.$anonfun$plan$3(QueryPlanner.scala:78)
   at scala.collection.TraversableOnce$folder$1.apply(TraversableOnce.scala:196)
   at scala.collection.TraversableOnce$folder$1.apply(TraversableOnce.scala:194)
   at scala.collection.Iterator.foreach(Iterator.scala:943)
   at scala.collection.Iterator.foreach$(Iterator.scala:943)
   at scala.collection.AbstractIterator.foreach(Iterator.scala:1431)
   at scala.collection.TraversableOnce.foldLeft(TraversableOnce.scala:199)
   at scala.collection.TraversableOnce.foldLeft$(TraversableOnce.scala:192)
   at scala.collection.AbstractIterator.foldLeft(Iterator.scala:1431)
   at org.apache.spark.sql.catalyst.planning.QueryPlanner.$anonfun$plan$2(QueryPlanner.scala:75)
   at scala.collection.Iterator$$anon$11.nextCur(Iterator.scala:486)
   at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:492)
   at org.apache.spark.sql.catalyst.planning.QueryPlanner.plan(QueryPlanner.scala:93)
   at org.apache.spark.sql.execution.SparkStrategies.plan(SparkStrategies.scala:69)
   at org.apache.spark.sql.execution.QueryExecution$.createSparkPlan(QueryExecution.scala:459)
   at org.apache.spark.sql.execution.QueryExecution.$anonfun$sparkPlan$1(QueryExecution.scala:145)
   at org.apache.spark.sql.catalyst.QueryPlanningTracker.measurePhase(QueryPlanningTracker.scala:111)
   at org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$2(QueryExecution.scala:185)
   at org.apache.spark.sql.execution.QueryExecution$.withInternalError(QueryExecution.scala:510)
   at org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$1(QueryExecution.scala:185)
   at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:779)
   at org.apache.spark.sql.execution.QueryExecution.executePhase(QueryExecution.scala:184)
   at org.apache.spark.sql.execution.QueryExecution.sparkPlan$lzycompute(QueryExecution.scala:145)
   at org.apache.spark.sql.execution.QueryExecution.sparkPlan(QueryExecution.scala:138)
   at org.apache.spark.sql.execution.QueryExecution.$anonfun$executedPlan$1(QueryExecution.scala:158)
   at org.apache.spark.sql.catalyst.QueryPlanningTracker.measurePhase(QueryPlanningTracker.scala:111)
   at org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$2(QueryExecution.scala:185)
   at org.apache.spark.sql.execution.QueryExecution$.withInternalError(QueryExecution.scala:510)
   at org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$1(QueryExecution.scala:185)
   at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:779)
   at org.apache.spark.sql.execution.QueryExecution.executePhase(QueryExecution.scala:184)
   at org.apache.spark.sql.execution.QueryExecution.executedPlan$lzycompute(QueryExecution.scala:158)
   at org.apache.spark.sql.execution.QueryExecution.executedPlan(QueryExecution.scala:151)
   at org.apache.spark.sql.execution.QueryExecution.simpleString(QueryExecution.scala:204)
   at org.apache.spark.sql.execution.QueryExecution.org$apache$spark$sql$execution$QueryExecution$$explainString(QueryExecution.scala:249)
   at org.apache.spark.sql.execution.QueryExecution.explainString(QueryExecution.scala:218)
   at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$6(SQLExecution.scala:103)
   at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:169)
   at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:95)
   at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:779)
   at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:64)
   at org.apache.spark.sql.Dataset.withAction(Dataset.scala:3856)
   at org.apache.spark.sql.Dataset.collectToPython(Dataset.scala:3685)
   at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
   at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
   at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
   at java.base/java.lang.reflect.Method.invoke(Method.java:566)
   at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
   at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
   at py4j.Gateway.invoke(Gateway.java:282)
   at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
   at py4j.commands.CallCommand.execute(CallCommand.java:79)
   at py4j.ClientServerConnection.waitForCommands(ClientServerConnection.java:182)
   at py4j.ClientServerConnection.run(ClientServerConnection.java:106)
   at java.base/java.lang.Thread.run(Thread.java:829)
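
   As a side note, the failing route can be probed directly from the host with curl (a rough sketch; it assumes the rest service is published on localhost:8181 as in the compose file above, and the empty JSON body is just a placeholder):

   # Probe the metrics route that RESTSessionCatalog is POSTing to.
   # On tabulario/iceberg-rest:0.1.0 this should return the same
   # "No route for request" message, which would confirm the rejection
   # comes from the REST service itself rather than from Spark.
   curl -i -X POST \
     -H "Content-Type: application/json" \
     -d '{}' \
     http://localhost:8181/v1/namespaces/nyc/tables/taxis/metrics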
   Sorry if this is wrongly reported as a bug.


[GitHub] [iceberg] nastra commented on issue #6708: Quick start docker-compose demo doesn't work

Posted by "nastra (via GitHub)" <gi...@apache.org>.
nastra commented on issue #6708:
URL: https://github.com/apache/iceberg/issues/6708#issuecomment-1430868859

   @zhangjiuyang1993 glad it works for you now. I'm going to close this then.


[GitHub] [iceberg] zhangjiuyang1993 commented on issue #6708: Quick start docker-compose demo doesn't work

Posted by "zhangjiuyang1993 (via GitHub)" <gi...@apache.org>.
zhangjiuyang1993 commented on issue #6708:
URL: https://github.com/apache/iceberg/issues/6708#issuecomment-1411402841

   Sometimes when I run SQL scripts, the connection to the server is refused, for example:
   %%sql
   
   SELECT COUNT(*) as cnt
   FROM nyc.taxis
   
   ---------------------------------------------------------------------------
   ConnectionRefusedError                    Traceback (most recent call last)
   Cell In[15], line 1
   ----> 1 get_ipython().run_cell_magic('sql', '', '\nSELECT COUNT(*) as cnt\nFROM nyc.taxis\n')
   
   File /usr/local/lib/python3.9/site-packages/IPython/core/interactiveshell.py:2422, in InteractiveShell.run_cell_magic(self, magic_name, line, cell)
      2420 with self.builtin_trap:
      2421     args = (magic_arg_s, cell)
   -> 2422     result = fn(*args, **kwargs)
      2423 return result
   
   File ~/.ipython/profile_default/startup/00-prettytables.py:61, in sql(line, cell)
        59         return _to_table(df, num_rows=args.limit)
        60 else:
   ---> 61     return _to_table(spark.sql(cell))
   
   File /opt/spark/python/pyspark/sql/session.py:1034, in SparkSession.sql(self, sqlQuery, **kwargs)
      1032     sqlQuery = formatter.format(sqlQuery, **kwargs)
      1033 try:
   -> 1034     return DataFrame(self._jsparkSession.sql(sqlQuery), self)
      1035 finally:
      1036     if len(kwargs) > 0:
   
   File /opt/spark/python/lib/py4j-0.10.9.5-src.zip/py4j/java_gateway.py:1320, in JavaMember.__call__(self, *args)
      1313 args_command, temp_args = self._build_args(*args)
      1315 command = proto.CALL_COMMAND_NAME +\
      1316     self.command_header +\
      1317     args_command +\
      1318     proto.END_COMMAND_PART
   -> 1320 answer = self.gateway_client.send_command(command)
      1321 return_value = get_return_value(
      1322     answer, self.gateway_client, self.target_id, self.name)
      1324 for temp_arg in temp_args:
   
   File /opt/spark/python/lib/py4j-0.10.9.5-src.zip/py4j/java_gateway.py:1036, in GatewayClient.send_command(self, command, retry, binary)
      1015 def send_command(self, command, retry=True, binary=False):
      1016     """Sends a command to the JVM. This method is not intended to be
      1017        called directly by Py4J users. It is usually called by
      1018        :class:`JavaMember` instances.
      (...)
      1034      if `binary` is `True`.
      1035     """
   -> 1036     connection = self._get_connection()
      1037     try:
      1038         response = connection.send_command(command)
   
   File /opt/spark/python/lib/py4j-0.10.9.5-src.zip/py4j/clientserver.py:284, in JavaClient._get_connection(self)
       281     pass
       283 if connection is None or connection.socket is None:
   --> 284     connection = self._create_new_connection()
       285 return connection
   
   File /opt/spark/python/lib/py4j-0.10.9.5-src.zip/py4j/clientserver.py:291, in JavaClient._create_new_connection(self)
       287 def _create_new_connection(self):
       288     connection = ClientServerConnection(
       289         self.java_parameters, self.python_parameters,
       290         self.gateway_property, self)
   --> 291     connection.connect_to_java_server()
       292     self.set_thread_connection(connection)
       293     return connection
   
   File /opt/spark/python/lib/py4j-0.10.9.5-src.zip/py4j/clientserver.py:438, in ClientServerConnection.connect_to_java_server(self)
       435 if self.ssl_context:
       436     self.socket = self.ssl_context.wrap_socket(
       437         self.socket, server_hostname=self.java_address)
   --> 438 self.socket.connect((self.java_address, self.java_port))
       439 self.stream = self.socket.makefile("rb")
       440 self.is_connected = True
   
   ConnectionRefusedError: [Errno 111] Connection refused


[GitHub] [iceberg] nastra commented on issue #6708: Quick start docker-compose demo doesn't work

Posted by "nastra (via GitHub)" <gi...@apache.org>.
nastra commented on issue #6708:
URL: https://github.com/apache/iceberg/issues/6708#issuecomment-1410481495

   You need to upgrade `tabulario/iceberg-rest:0.1.0` (uses Iceberg 1.0.0) to `tabulario/iceberg-rest:0.2.0` (uses Iceberg 1.1.0). That should fix it.
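
   Something along these lines should do it (a sketch; it assumes the quickstart docker-compose.yml from this issue and GNU sed):

   # Bump the REST catalog image and recreate only that service.
   sed -i 's|tabulario/iceberg-rest:0.1.0|tabulario/iceberg-rest:0.2.0|' docker-compose.yml
   docker-compose pull rest
   docker-compose up -d rest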


[GitHub] [iceberg] nastra closed issue #6708: Quick start docker-compose demo doesn't work

Posted by "nastra (via GitHub)" <gi...@apache.org>.
nastra closed issue #6708: Quick start docker-compose demo doesn't work
URL: https://github.com/apache/iceberg/issues/6708


[GitHub] [iceberg] Fokko commented on issue #6708: Quick start docker-compose demo doesn't work

Posted by "Fokko (via GitHub)" <gi...@apache.org>.
Fokko commented on issue #6708:
URL: https://github.com/apache/iceberg/issues/6708#issuecomment-1419814726

   Can you run a `docker-compose pull` to make sure that you're running the latest version? We recently did some updates to the stack.
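
   For example (a sketch, run from the directory that contains the quickstart docker-compose.yml):

   docker-compose pull        # fetch the latest images for every service
   docker-compose up -d       # recreate any container whose image changed
   docker-compose ps          # spark-iceberg, rest, minio and mc should all be Up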


[GitHub] [iceberg] zhangjiuyang1993 commented on issue #6708: Quick start docker-compose demo doesn't work

Posted by "zhangjiuyang1993 (via GitHub)" <gi...@apache.org>.
zhangjiuyang1993 commented on issue #6708:
URL: https://github.com/apache/iceberg/issues/6708#issuecomment-1411367934

   When I run these commands:
   df = spark.read.parquet("/home/iceberg/data/yellow_tripdata_2021-04.parquet")
   df.write.saveAsTable("nyc.taxis")
   I get the following errors:
   ERROR:root:Exception while sending command.                         (0 + 4) / 4]
   Traceback (most recent call last):
     File "/opt/spark/python/lib/py4j-0.10.9.5-src.zip/py4j/clientserver.py", line 516, in send_command
       raise Py4JNetworkError("Answer from Java side is empty")
   py4j.protocol.Py4JNetworkError: Answer from Java side is empty
   
   During handling of the above exception, another exception occurred:
   
   Traceback (most recent call last):
     File "/opt/spark/python/lib/py4j-0.10.9.5-src.zip/py4j/java_gateway.py", line 1038, in send_command
       response = connection.send_command(command)
     File "/opt/spark/python/lib/py4j-0.10.9.5-src.zip/py4j/clientserver.py", line 539, in send_command
       raise Py4JNetworkError(
   py4j.protocol.Py4JNetworkError: Error while sending or receiving
   ERROR:root:Exception while sending command.
   Traceback (most recent call last):
     File "/opt/spark/python/lib/py4j-0.10.9.5-src.zip/py4j/clientserver.py", line 516, in send_command
       raise Py4JNetworkError("Answer from Java side is empty")
   py4j.protocol.Py4JNetworkError: Answer from Java side is empty
   
   During handling of the above exception, another exception occurred:
   
   Traceback (most recent call last):
     File "/opt/spark/python/lib/py4j-0.10.9.5-src.zip/py4j/java_gateway.py", line 1038, in send_command
       response = connection.send_command(command)
     File "/opt/spark/python/lib/py4j-0.10.9.5-src.zip/py4j/clientserver.py", line 539, in send_command
       raise Py4JNetworkError(
   py4j.protocol.Py4JNetworkError: Error while sending or receiving


[GitHub] [iceberg] zhangjiuyang1993 commented on issue #6708: Quick start docker-compose demo doesn't work

Posted by "zhangjiuyang1993 (via GitHub)" <gi...@apache.org>.
zhangjiuyang1993 commented on issue #6708:
URL: https://github.com/apache/iceberg/issues/6708#issuecomment-1430636016

   It works. Thanks a lot.


[GitHub] [iceberg] zhangjiuyang1993 commented on issue #6708: Quick start docker-compose demo doesn't work

Posted by "zhangjiuyang1993 (via GitHub)" <gi...@apache.org>.
zhangjiuyang1993 commented on issue #6708:
URL: https://github.com/apache/iceberg/issues/6708#issuecomment-1411368041

   @nastra 


[GitHub] [iceberg] zhangjiuyang1993 commented on issue #6708: Quick start docker-compose demo doesn't work

Posted by "zhangjiuyang1993 (via GitHub)" <gi...@apache.org>.
zhangjiuyang1993 commented on issue #6708:
URL: https://github.com/apache/iceberg/issues/6708#issuecomment-1411351099

   It works. Thanks a lot.


[GitHub] [iceberg] nastra commented on issue #6708: Quick start docker-compose demo doesn't work

Posted by "nastra (via GitHub)" <gi...@apache.org>.
nastra commented on issue #6708:
URL: https://github.com/apache/iceberg/issues/6708#issuecomment-1412280147

   This seems like a network issue. Are any of the components from the quickstart demo perhaps not reachable?
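
   A few quick checks (a sketch; container names and ports are taken from the compose file posted above):

   docker-compose ps                                  # every service should show "Up"
   docker-compose logs --tail 20 rest                 # recent REST catalog output
   curl -i http://localhost:8181/v1/config            # REST catalog answering on 8181
   curl -i http://localhost:9000/minio/health/live    # MinIO liveness endpoint on 9000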


[GitHub] [iceberg] Fokko commented on issue #6708: Quick start docker-compose demo doesn't work

Posted by "Fokko (via GitHub)" <gi...@apache.org>.
Fokko commented on issue #6708:
URL: https://github.com/apache/iceberg/issues/6708#issuecomment-1419813815

   > py4j.protocol.Py4JNetworkError: Answer from Java side is empty
   
   This means that the Spark process has died. Do you see anything in the logs?
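
   For example (a sketch; spark-iceberg is the container name from the compose file, and an out-of-memory kill of the JVM is a common cause of this symptom):

   docker logs --tail 100 spark-iceberg                          # look for a fatal JVM error or OutOfMemoryError
   docker inspect --format '{{.State.OOMKilled}}' spark-iceberg  # true if Docker killed the container for memory
   docker stats --no-stream spark-iceberg                        # current memory use against the container limit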


[GitHub] [iceberg] zhangjiuyang1993 commented on issue #6708: Quick start docker-compose demo doesn't work

Posted by "zhangjiuyang1993 (via GitHub)" <gi...@apache.org>.
zhangjiuyang1993 commented on issue #6708:
URL: https://github.com/apache/iceberg/issues/6708#issuecomment-1413050964

   No, but sometimes the quickstart demo is not reachable.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@iceberg.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@iceberg.apache.org
For additional commands, e-mail: issues-help@iceberg.apache.org