Posted to dev@pinot.apache.org by Pinot Slack Email Digest <ap...@gmail.com> on 2022/03/30 02:00:28 UTC

Apache Pinot Daily Email Digest (2022-03-29)

### _#general_

  
 **@manel.rhaiem92:** @manel.rhaiem92 has joined the channel  
 **@akumar:** @akumar has joined the channel  
 **@sboggavarapu:** I got one more question regarding lookup tables. I have created an offline dimension-only table with 3 records  
**@mark.needham:** if you navigate to that table, can you see what segments
are listed?  
**@sboggavarapu:** There are 4 segments listed, each with a configuration like  
**@sboggavarapu:** ```{
  "custom.map": "{\"input.data.file.uri\":\"file:/data/customers.csv\"}",
  "segment.crc": "3320463979",
  "segment.creation.time": "1648122612684",
  "segment.index.version": "v3",
  "segment.name": "merchants_OFFLINE_0",
  "segment.offline.download.url": "",
  "segment.offline.push.time": "1648122613365",
  "segment.table.name": "merchants",
  "segment.total.docs": "100",
  "segment.type": "OFFLINE"
}```  
**@mark.needham:** what are the others called?  
**@mark.needham:** I suspect some of them are invalid  
**@mark.needham:** but I dunno how they got there  
**@sboggavarapu:** FYI, I am running this locally in Docker.  
**@sboggavarapu:**  
**@sboggavarapu:** I tried with a couple of offline tables...every table has
some invalid data like this  
**@mark.needham:** in your ingestion file - it might be that the input
directory has multiple CSV files  
**@mark.needham:** and it's created one segment per file  
**@sboggavarapu:** Oh... it is possible that I have multiple CSV files in the directory... but shouldn't it read only the one with the relevant file name?  
**@mark.needham:** it should do  
**@sboggavarapu:** I had the configuration like this:  
**@sboggavarapu:** ```executionFrameworkSpec:
  name: 'standalone'
  segmentGenerationJobRunnerClassName: 'org.apache.pinot.plugin.ingestion.batch.standalone.SegmentGenerationJobRunner'
  segmentTarPushJobRunnerClassName: 'org.apache.pinot.plugin.ingestion.batch.standalone.SegmentTarPushJobRunner'
jobType: SegmentCreationAndTarPush
inputDirURI: '/data'
includeFileNamePattern: 'glob:**/*.csv'
outputDirURI: '/opt/pinot/data/members'
overwriteOutput: true
pinotFSSpecs:
  - scheme: file
    className: org.apache.pinot.spi.filesystem.LocalPinotFS
recordReaderSpec:
  dataFormat: 'csv'
  className: 'org.apache.pinot.plugin.inputformat.csv.CSVRecordReader'
  configClassName: 'org.apache.pinot.plugin.inputformat.csv.CSVRecordReaderConfig'
tableSpec:
  tableName: 'members'
pinotClusterSpecs:
  - controllerURI: ''```  
**@mark.needham:** *.csv will match every file though  
**@mark.needham:** `includeFileNamePattern: 'glob:**/*.csv'`  
**@sboggavarapu:** This is for an offline table called "members"  
**@mark.needham:** so it creates one segment per CSV file under /data/  
**@mark.needham:** I guess there are 4 files?  
**@mark.needham:** for other tables  
**@sboggavarapu:** I see... I did try specifying specific file name in that
property..like ```includeFileNamePattern: 'glob:**/members.csv'```  
**@sboggavarapu:** I believe ...it failed to read it ...  
**@mark.needham:** oh  
**@sboggavarapu:** But I can try once more to confirm that  
**@mark.needham:** it might be a bug somewhere if it's not reading the pattern
properly  
**@sboggavarapu:** I will also try reading from a directory with only one CSV file and confirm that it works without any invalid data.  
**@sboggavarapu:** Yeah... I will post the stack trace in case of an error  
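(For anyone hitting the same thing, a sketch of the two usual fixes against the job spec above - either narrow the glob to the one file, or give each table its own input directory; the paths here are assumptions:)
```# Option 1: match only the one file
includeFileNamePattern: 'glob:**/members.csv'
# Option 2: one input directory per table
inputDirURI: '/data/members/'
includeFileNamePattern: 'glob:**/*.csv'```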
 **@sboggavarapu:** However, after ingestion, along with my 3 records, Pinot actually shows a lot of documents with null values...  
 **@sboggavarapu:** The actual 3 docs are the last 3 records. But as we can see, there are other results with a null merchant name / Integer.MIN_VALUE merchantIds... In fact, it shows the total docs as 116 where I expected only 3 records.  
 **@ysuo:** I hit the following error when I query EXPLAIN PLAN FOR select * from streams_metrics_flat_data. Any idea what causes it?  
**@richard892:** hi, I believe this was fixed recently cc @amrish.k.lal  
**@ysuo:** thanks  
**@ysuo:** by the way which version fixed it?  
**@richard892:** if it has been fixed, the fix will be in 0.10.0 which will be
released soon  
**@richard892:** I tagged Amrish because he created the feature and has more
awareness than I do  
**@ysuo:** I see, thank you.  
**@amrish.k.lal:** Hi, this error happens in older versions of Pinot which
didn't have EXPLAIN PLAN functionality. Newer versions with EXPLAIN PLAN
functionality won't have this error.  
**@ysuo:** Got it. Thanks.  
 **@ysuo:** I created a star-tree index for a realtime table. How can I verify whether it works? And is there any tool to view this index data?  
**@mark.needham:** if you call: ```<YOURTABLENAME>/metadata```  
**@mark.needham:** ```{
  "baseballStats_OFFLINE_0": {
    "segmentName": "baseballStats_OFFLINE_0",
    "schemaName": null,
    "crc": 3449875305,
    "creationTimeMillis": 1648546440888,
    "creationTimeReadable": "2022-03-29T09:34:00:888 UTC",
    "timeGranularitySec": null,
    "startTimeMillis": null,
    "startTimeReadable": null,
    "endTimeMillis": null,
    "endTimeReadable": null,
    "segmentVersion": "v3",
    "creatorName": null,
    "custom": {
      "input.data.file.uri": "file:/opt/pinot/examples/batch/baseballStats/rawdata/baseballStats_data.csv"
    },
    "columns": [],
    "indexes": {},
    "star-tree-index": null
  }
}```  
**@mark.needham:** it will tell you if a star tree index has been defined  
**@mark.needham:** on that last property  
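(For reference, a hypothetical way to pull that field from the controller's REST API - `localhost:9000` and the table name `myTable` are assumptions; verify the exact endpoint against your Pinot version:)
```curl -s "http://localhost:9000/segments/myTable/metadata" | jq '.[]."star-tree-index"'```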
**@ysuo:** I checked the metadata this way and found star-tree-index is null. I had modified my table config to add a star-tree index.  
**@ysuo:** It seems the index didn't work. Any idea how I can make it work?  
**@ysuo:** I just tried “enableDynamicStarTreeCreation”: true, then it worked.
Thanks @mark.needham  
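(For anyone following along, that flag sits in `tableIndexConfig` alongside the star-tree definition; a hypothetical excerpt, with the column names invented for illustration - check the star-tree docs for your version:)
```"tableIndexConfig": {
  "enableDynamicStarTreeCreation": true,
  "starTreeIndexConfigs": [{
    "dimensionsSplitOrder": ["country", "browser"],
    "functionColumnPairs": ["SUM__clicks"],
    "maxLeafRecords": 10000
  }]
}```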
 **@ysuo:** Thanks  
 **@moradi.sajjad:** ```Hello Community,

We are pleased to announce that Apache Pinot 0.10.0 is released!

The release can be downloaded at
The release note is available at

Additional resources -
Project website:
Getting started:
Pinot developer blogs:
Intro to Pinot Video:

Best Regards,
Apache Pinot Team```  
**@diogo.baeder:** Awesome! Any ideas when a Docker image for it will be
released?  
**@mayanks:** Thanks a lot @moradi.sajjad for your work on releasing this
version.  
**@xiangfu0:** @diogo.baeder today  
**@karinwolok1:** Amazing!!! :heart:  
**@diogo.baeder:** @xiangfu0 nice, thanks! :slightly_smiling_face:  
**@snlee:** @moradi.sajjad Thank you for working on this!  
**@luisfernandez:** Can't wait to upgrade!  
**@xiangfu0:** docker image `apachepinot/pinot:0.10.0` is published  
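(i.e., to pick it up locally:)
```docker pull apachepinot/pinot:0.10.0```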
**@diogo.baeder:** Thanks! :heart:  
 **@pabraham.usa:** Has anyone faced the following error? Once this happens I have to restart the Pinot server, and the data during these errors is not consumed by the realtime table. ```16:20:24.624 SegmentColumnarIndexCreator - Caught exception
org.apache.lucene.store.AlreadyClosedException: this IndexWriter is closed
while refreshing realtime lucene reader for segment: mydata__1__25531__20220323T2114Z```  
**@tisantos:** I'm assuming you have text index enabled? cc: @steotia have you
seen this before?  
**@pabraham.usa:** Yes, it is a text index  

###  _#random_

  
 **@manel.rhaiem92:** @manel.rhaiem92 has joined the channel  
 **@akumar:** @akumar has joined the channel  

###  _#troubleshooting_

  
 **@mohammedgalalen056:** Hi team, I was trying to compile Pinot from source on a MacBook Pro M1 and hit two errors during compilation: one regarding `protoc-gen-grpc-java-1.4.0-osx-x86_64` and the other regarding `com.github.eirslett:frontend-maven-plugin:1.1`. I had to upgrade `com.github.eirslett:frontend-maven-plugin` to `1.11.0` and download `protoc-gen-grpc-java-1.4.0-osx-x86_64` manually. But I couldn't run the example, and I'm getting this error: ```Failed to start a Pinot [SERVER] at 15.16 since launch
java.lang.RuntimeException: java.util.concurrent.RejectedExecutionException: event executor terminated
    at org.apache.pinot.core.transport.QueryServer.start(QueryServer.java:136) ~[pinot-all-0.10.0-SNAPSHOT-jar-with-dependencies.jar:0.10.0-SNAPSHOT-649f5988d5746869ef6a690f4747ff4d6fb9c607]
    at org.apache.pinot.server.starter.ServerInstance.start(ServerInstance.java:165)
```  
**@ken:** There should be a log file in the `/logs` subdir of the directory containing the expanded Pinot distribution tarball that you built. If you look in the log file, you'll typically find more information about the root cause of the error that's being returned.  
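(Concretely, something like the following - the exact directory and log file names depend on your distribution and log4j2 config, so treat these paths as assumptions:)
```cd apache-pinot-0.10.0-SNAPSHOT-bin
ls logs/
tail -n 200 logs/pinot-all.log```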
 **@manel.rhaiem92:** @manel.rhaiem92 has joined the channel  
 **@akumar:** @akumar has joined the channel  
 **@diana.arnos:** I'm having this problem again. Servers are up and running, status is shown as `Alive`, but when I run a query the broker says `No server found for request`. If I try a rebalance: `"Instance reassigned, table is already balanced"`. If I try to run `/rebuildBrokerResourceFromHelixTags`: ```{ "status": "Broker resource is not rebuilt because ideal state is the same for table: <redacted>" }``` But I can see both servers that I have consuming data: ```Consumed 1410 events from (rate:20.137104/s), currentOffset=642595, numRowsConsumedSoFar=167595, numRowsIndexedSoFar=167595``` And the segments are being uploaded to the deepstore. I checked ZooKeeper, and the ideal state and external view are the same. Is there a way to force the broker to recognize the servers?  
 **@kchavda:** Is anyone using Tableau with Pinot? I'm getting this error when trying to connect to a hosted instance:  
**@xiangfu0:** I think you might need to drop the jars into the class path? cc: @kennybastani  
**@kennybastani:** @kchavda Please follow this guide to get it working:  
**@kchavda:** Hi @kenny, I actually followed that guide  
**@kchavda:** @xiangfu0, thanks for looking into this. I followed the link
Kenny shared already and have the jar files in the appropriate folders.  
**@kennybastani:** Which version of Pinot are you using?  
**@kennybastani:** Can you provide the output of `ls
~/Library/Tableau/Drivers`  
**@kchavda:** 0.10.0-SNAPSHOT  
**@kchavda:** `async-http-client-1.9.40.jar pinot-java-client-0.10.0-SNAPSHOT.jar pinot-jdbc-client-0.10.0-SNAPSHOT.jar`  
**@kennybastani:** Can you provide the output of `tail -n 50 ~/Documents/My\
Tableau\ Repository/Logs/jprotocolserver.log`  
**@kennybastani:** Can you also verify the output of `curl -i localhost:8000`
or if you are not running the Pinot broker locally, please replace the URL of
the broker in that command  
**@kennybastani:** Okay, I think that the async http client has been upgraded
to a newer version in 0.10.0  
**@kennybastani:** Run the command `cp ~/.m2/repository/com/ning/async-http-client/2.12.3/async-http-client-2.12.3.jar .`  
**@kennybastani:** To your Tableau driver directory  
**@kennybastani:** Then `rm ~/Library/Tableau/Drivers/async-http-client-1.9.21.jar`  
**@kchavda:** The curl command timed out  
**@kchavda:** It's hitting startree.cloud since we are on it.  
**@kennybastani:** Okay, so the issue you're having right now is the `async-http-client` version, so fix that first  
**@kennybastani:** Then we will likely see an issue connecting to the broker  
**@kchavda:** So I was getting an error when I tried building Pinot locally.
```Mar 28, 2022 1:56:51 PM com.diffplug.spotless.FormatExceptionPolicyLegacy error
SEVERE: Step 'removeUnusedImports' found problem in 'src/test/java/org/apache/pinot/spi/ingestion/batch/IngestionJobLauncherTest.java':
null
java.lang.reflect.InvocationTargetException
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
    at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.base/java.lang.reflect.Method.invoke(Method.java:568)
    at com.diffplug.spotless.java.GoogleJavaFormatStep$State.lambda$constructRemoveUnusedFunction$3(GoogleJavaFormatStep.java:190)
    at com.diffplug.spotless.java.GoogleJavaFormatStep$State.lambda$createRemoveUnusedImportsOnly$1(GoogleJavaFormatStep.java:167)
    at com.diffplug.spotless.FormatterFunc.apply(FormatterFunc.java:32)
    at com.diffplug.spotless.FormatterStepImpl$Standard.format(FormatterStepImpl.java:78)
    at com.diffplug.spotless.FormatterStep$Strict.format(FormatterStep.java:76)
    at com.diffplug.spotless.Formatter.compute(Formatter.java:230)
    at com.diffplug.spotless.PaddedCell.calculateDirtyState(PaddedCell.java:201)
    at com.diffplug.spotless.PaddedCell.calculateDirtyState(PaddedCell.java:188)
    at com.diffplug.spotless.maven.SpotlessCheckMojo.process(SpotlessCheckMojo.java:52)
    at com.diffplug.spotless.maven.AbstractSpotlessMojo.execute(AbstractSpotlessMojo.java:146)
    at com.diffplug.spotless.maven.AbstractSpotlessMojo.execute(AbstractSpotlessMojo.java:137)
    at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:137)
    at org.apache.maven.lifecycle.internal.MojoExecutor.doExecute(MojoExecutor.java:301)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:211)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:165)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:157)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:121)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:81)
    at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build(SingleThreadedBuilder.java:56)
    at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:127)
    at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:294)
    at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:192)
    at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:105)
    at org.apache.maven.cli.MavenCli.execute(MavenCli.java:960)
    at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:293)
    at org.apache.maven.cli.MavenCli.main(MavenCli.java:196)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
    at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.base/java.lang.reflect.Method.invoke(Method.java:568)
    at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:282)
    at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:225)
    at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:406)
    at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:347)
Caused by: java.lang.IllegalAccessError: class com.google.googlejavaformat.java.RemoveUnusedImports (in unnamed module @0x718198db) cannot access class com.sun.tools.javac.util.Context (in module jdk.compiler) because module jdk.compiler does not export com.sun.tools.javac.util to unnamed module @0x718198db
    at com.google.googlejavaformat.java.RemoveUnusedImports.removeUnusedImports(RemoveUnusedImports.java:187)
    ... 38 more
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary for Pinot 0.10.0-SNAPSHOT:
[INFO]
[INFO] Pinot .............................................. SUCCESS [01:26 min]
[INFO] Pinot Service Provider Interface ................... FAILURE [ 23.216 s]
[INFO] Pinot Segment Service Provider Interface ........... SKIPPED
[INFO] Pinot Plugins ...................................... SKIPPED
[INFO] Pinot Metrics ...................................... SKIPPED
[INFO] Pinot Yammer Metrics ............................... SKIPPED
[INFO] Pinot Common ....................................... SKIPPED
[INFO] Pinot Input Format ................................. SKIPPED
[INFO] Pinot Avro Base .................................... SKIPPED
[INFO] Pinot Avro ......................................... SKIPPED
[INFO] Pinot Csv .......................................... SKIPPED
[INFO] Pinot JSON ......................................... SKIPPED
[INFO] Pinot local segment implementations ................ SKIPPED
[INFO] Pinot Core ......................................... SKIPPED
[INFO] Pinot Server ....................................... SKIPPED
[INFO] Pinot Segment Uploader ............................. SKIPPED
[INFO] Pinot Segment Uploader Default ..................... SKIPPED
[INFO] Pinot Controller ................................... SKIPPED
[INFO] Pinot Broker ....................................... SKIPPED
[INFO] Pinot Clients ...................................... SKIPPED
[INFO] Pinot Java Client .................................. SKIPPED
[INFO] Pinot JDBC Client .................................. SKIPPED
[INFO] Pinot Batch Ingestion .............................. SKIPPED
[INFO] Pinot Batch Ingestion Common ....................... SKIPPED
[INFO] Pinot Minion ....................................... SKIPPED
[INFO] Pinot Confluent Avro ............................... SKIPPED
[INFO] Pinot ORC .......................................... SKIPPED
[INFO] Pinot Parquet ...................................... SKIPPED
[INFO] Pinot Thrift ....................................... SKIPPED
[INFO] Pinot Protocol Buffers ............................. SKIPPED
[INFO] Pluggable Pinot file system ........................ SKIPPED
[INFO] Pinot Azure Data Lake Storage ...................... SKIPPED
[INFO] Pinot Hadoop Filesystem ............................ SKIPPED
[INFO] Pinot Google Cloud Storage ......................... SKIPPED
[INFO] Pinot Amazon S3 .................................... SKIPPED
[INFO] Pinot Batch Ingestion for Spark .................... SKIPPED
[INFO] Pinot Batch Ingestion for Hadoop ................... SKIPPED
[INFO] Pinot Batch Ingestion Standalone ................... SKIPPED
[INFO] Pinot Batch Ingestion .............................. SKIPPED
[INFO] Pinot Ingestion Common ............................. SKIPPED
[INFO] Pinot Hadoop ....................................... SKIPPED
[INFO] Pinot Spark ........................................ SKIPPED
[INFO] Pinot Stream Ingestion ............................. SKIPPED
[INFO] Pinot Kafka Base ................................... SKIPPED
[INFO] Pinot Kafka 0.9 .................................... SKIPPED
[INFO] Pinot Kafka 2.x .................................... SKIPPED
[INFO] Pinot Kinesis ...................................... SKIPPED
[INFO] Pinot Pulsar ....................................... SKIPPED
[INFO] Pinot Minion Tasks ................................. SKIPPED
[INFO] Pinot Minion Built-In Tasks ........................ SKIPPED
[INFO] Pinot Dropwizard Metrics ........................... SKIPPED
[INFO] Pinot Segment Writer ............................... SKIPPED
[INFO] Pinot Segment Writer File Based .................... SKIPPED
[INFO] Pluggable Pinot Environment Provider ............... SKIPPED
[INFO] Pinot Azure Environment ............................ SKIPPED
[INFO] Pinot Tools ........................................ SKIPPED
[INFO] Pinot Test Utils ................................... SKIPPED
[INFO] Pinot Integration Tests ............................ SKIPPED
[INFO] Pinot Perf ......................................... SKIPPED
[INFO] Pinot Distribution ................................. SKIPPED
[INFO] Pinot Connectors ................................... SKIPPED
[INFO] Pinot Spark Connector .............................. SKIPPED
[INFO] Pinot Flink Connector .............................. SKIPPED
[INFO] Pinot Compatibility Verifier ....................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:54 min
[INFO] Finished at: 2022-03-28T13:56:51-04:00
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal com.diffplug.spotless:spotless-maven-plugin:2.9.0:check (default) on project pinot-spi: Execution default of goal com.diffplug.spotless:spotless-maven-plugin:2.9.0:check failed: java.lang.reflect.InvocationTargetException: class com.google.googlejavaformat.java.RemoveUnusedImports (in unnamed module @0x718198db) cannot access class com.sun.tools.javac.util.Context (in module jdk.compiler) because module jdk.compiler does not export com.sun.tools.javac.util to unnamed module @0x718198db -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1]
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <args> -rf :pinot-spi```  
**@kennybastani:** Is this the latest snapshot or from a release?  
**@kennybastani:** Looks like someone broke the build  
**@kchavda:** it pulled master I believe  
**@kennybastani:** Go ahead and use  
**@kchavda:** Trying now  
**@kennybastani:** That `cp` command is actually this one: `cp ~/.m2/repository/org/asynchttpclient/async-http-client/2.12.3/async-http-client-2.12.3.jar ~/Library/Tableau/Drivers/`  
**@kennybastani:** I'll get the docs updated to the latest release  
**@kchavda:** ah, that explains why nothing was there in the `ning` directory  
**@kchavda:** Not sure why, but I keep getting this error when I try to build with `mvn clean install -DskipTests -Pbin-dist`  
**@kchavda:** ```com.diffplug.spotless.FormatExceptionPolicyLegacy error
SEVERE: Step 'removeUnusedImports' found problem in 'src/test/java/org/apache/pinot/spi/ingestion/batch/IngestionJobLauncherTest.java':
null
java.lang.reflect.InvocationTargetException
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
    at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.base/java.lang.reflect.Method.invoke(Method.java:568)
    at com.diffplug.spotless.java.GoogleJavaFormatStep$State.lambda$constructRemoveUnusedFunction$3(GoogleJavaFormatStep.java:190)
    at com.diffplug.spotless.java.GoogleJavaFormatStep$State.lambda$createRemoveUnusedImportsOnly$1(GoogleJavaFormatStep.java:167)
    at com.diffplug.spotless.FormatterFunc.apply(FormatterFunc.java:32)
    at com.diffplug.spotless.FormatterStepImpl$Standard.format(FormatterStepImpl.java:78)
    at com.diffplug.spotless.FormatterStep$Strict.format(FormatterStep.java:76)
    at com.diffplug.spotless.Formatter.compute(Formatter.java:230)
    at com.diffplug.spotless.PaddedCell.calculateDirtyState(PaddedCell.java:201)
    at com.diffplug.spotless.PaddedCell.calculateDirtyState(PaddedCell.java:188)
    at com.diffplug.spotless.maven.SpotlessCheckMojo.process(SpotlessCheckMojo.java:52)
    at com.diffplug.spotless.maven.AbstractSpotlessMojo.execute(AbstractSpotlessMojo.java:146)
    at com.diffplug.spotless.maven.AbstractSpotlessMojo.execute(AbstractSpotlessMojo.java:137)
    at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:137)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:210)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:156)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:148)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:117)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:81)
    at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build(SingleThreadedBuilder.java:56)
    at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:128)
    at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:305)
    at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:192)
    at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:105)
    at org.apache.maven.cli.MavenCli.execute(MavenCli.java:972)
    at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:293)
    at org.apache.maven.cli.MavenCli.main(MavenCli.java:196)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
    at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.base/java.lang.reflect.Method.invoke(Method.java:568)
    at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:282)
    at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:225)
    at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:406)
    at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:347)
Caused by: java.lang.IllegalAccessError: class com.google.googlejavaformat.java.RemoveUnusedImports (in unnamed module @0x30d3f583) cannot access class com.sun.tools.javac.util.Context (in module jdk.compiler) because module jdk.compiler does not export com.sun.tools.javac.util to unnamed module @0x30d3f583
    at com.google.googlejavaformat.java.RemoveUnusedImports.removeUnusedImports(RemoveUnusedImports.java:187)
    ... 37 more```  
**@kennybastani:** I've seen this happen really weirdly when your JDK version
isn't set to baseline  
**@kchavda:** I had to create a jvm.config file for Maven to get it to build successfully  
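(For reference, the usual workaround for this google-java-format-on-JDK-16+ module error is an `.mvn/jvm.config` with `--add-exports` flags; a sketch based on the error above - verify the full list against the google-java-format docs:)
```--add-exports jdk.compiler/com.sun.tools.javac.api=ALL-UNNAMED
--add-exports jdk.compiler/com.sun.tools.javac.file=ALL-UNNAMED
--add-exports jdk.compiler/com.sun.tools.javac.parser=ALL-UNNAMED
--add-exports jdk.compiler/com.sun.tools.javac.tree=ALL-UNNAMED
--add-exports jdk.compiler/com.sun.tools.javac.util=ALL-UNNAMED```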
**@kchavda:** ```openjdk 17.0.2 2022-01-18 OpenJDK Runtime Environment (build
17.0.2+8-86) OpenJDK 64-Bit Server VM (build 17.0.2+8-86, mixed mode,
sharing)```  
**@kennybastani:** Oh yeah let's get that down to 11-14  
**@kennybastani:** Try `/usr/libexec/java_home -V`  
**@kennybastani:** Let me know the output  
**@kchavda:** ```Matching Java Virtual Machines (2):
    17, x86_64: "Java SE 17" /Library/Java/JavaVirtualMachines/jdk-17.jdk/Contents/Home
    11.0.13, x86_64: "Java SE 11.0.13" /Library/Java/JavaVirtualMachines/jdk-11.0.13.jdk/Contents/Home
/Library/Java/JavaVirtualMachines/jdk-17.jdk/Contents/Home```  
**@kennybastani:** `export JAVA_HOME=/Library/Java/JavaVirtualMachines/jdk-11.0.13.jdk/Contents/Home`  
**@kennybastani:** and rebuild  
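(i.e., in the same shell:)
```export JAVA_HOME=/Library/Java/JavaVirtualMachines/jdk-11.0.13.jdk/Contents/Home
mvn clean install -DskipTests -Pbin-dist```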
**@kchavda:** That did it. Moving along!  
**@kennybastani:** Sweet :slightly_smiling_face:  
**@kchavda:** Btw, between Flink and Spark Streaming, which one do you suggest?  
**@kennybastani:** I'm probably not the best person to answer that. I would
prefer Spark because I'm old school, but Flink has a lot more community uptake
and affinity.  
**@kchavda:** Build success! Copied over the jar files and trying Tableau
again :crossed_fingers:  
**@kchavda:**  
**@kchavda:** different error  
**@kchavda:** reactivestreams  
**@kchavda:** ```at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source) ~[?:?]
at java.lang.Thread.run(Unknown Source) [?:?]
Caused by: java.lang.ClassNotFoundException: org.reactivestreams.Publisher
    at java.net.URLClassLoader.findClass(Unknown Source) ~[?:?]
    at java.lang.ClassLoader.loadClass(Unknown Source) ~[?:?]
    at java.net.FactoryURLClassLoader.loadClass(Unknown Source) ~[?:?]
    at java.lang.ClassLoader.loadClass(Unknown Source) ~[?:?]
    ... 27 more
2022-03-29 18:05:35.158 -0400 (,,,,4,7) grpc-default-executor-1 : INFO com.tableau.connect.grpc.GrpcProtocolService - End local request 7 /constructProtocol.
2022-03-29 18:06:13.267 -0400 (,,,,5,9) grpc-default-executor-1 : INFO com.tableau.connect.grpc.GrpcProtocolService - Start local request 9 /constructProtocol.
2022-03-29 18:06:13.267 -0400 (,,,,5,9) grpc-default-executor-1 : INFO com.tableausoftware.jdbc.JDBCDriverManager - Get driver from isolatedDrivers.
2022-03-29 18:06:13.268 -0400 (,,,,5,9) grpc-default-executor-1 : INFO com.tableausoftware.jdbc.JDBCProtocolImpl - Connecting to jdbc:
2022-03-29 18:06:13.268 -0400 (,,,,5,9) grpc-default-executor-1 : INFO com.tableausoftware.jdbc.JDBCProtocolImpl - Connection properties {password=*******, user=}
2022-03-29 18:06:13.268 -0400 (,,,,5,9) grpc-default-executor-1 : INFO org.apache.pinot.client.PinotDriver - Initiating connection to database for url: jdbc:
2022-03-29 18:06:13.270 -0400 (,,,,5,9) grpc-default-executor-1 : ERROR com.tableau.connect.util.GrpcServiceHelper - Failed in constructProtocol.
java.lang.NoClassDefFoundError: org/reactivestreams/Publisher
    at org.asynchttpclient.netty.channel.ChannelManager.configureBootstraps(ChannelManager.java:212) ~[?:?]
    at org.asynchttpclient.DefaultAsyncHttpClient.<init>(DefaultAsyncHttpClient.java:94) ~[?:?]
    at org.asynchttpclient.Dsl.asyncHttpClient(Dsl.java:36) ~[?:?]
    at org.apache.pinot.client.JsonAsyncHttpPinotClientTransport.<init>(JsonAsyncHttpPinotClientTransport.java:80) ~[?:?]
    at org.apache.pinot.client.JsonAsyncHttpPinotClientTransportFactory.buildTransport(JsonAsyncHttpPinotClientTransportFactory.java:37) ~[?:?]
    at org.apache.pinot.client.PinotDriver.connect(PinotDriver.java:69) ~[?:?]
    at com.tableausoftware.jdbc.JDBCDriverManager.getConnection(JDBCDriverManager.java:271) ~[jdbcserver.jar:20214.0.17]
    at com.tableausoftware.jdbc.JDBCProtocolImpl.getConnection(JDBCProtocolImpl.java:325) ~[jdbcserver.jar:20214.0.17]
    at com.tableausoftware.jdbc.JDBCProtocolImpl.<init>(JDBCProtocolImpl.java:118) ~[jdbcserver.jar:20214.0.17]
    at com.tableau.connect.service.ProtocolPool.constructProtocol(ProtocolPool.java:48) ~[jdbcserver.jar:20214.0.17]
    at com.tableau.connect.service.ProtocolService.constructProtocol(ProtocolService.java:59) ~[jdbcserver.jar:20214.0.17]
    at com.tableau.connect.grpc.GrpcProtocolService.lambda$constructProtocol$0(GrpcProtocolService.java:63) ~[jdbcserver.jar:20214.0.17]
    at com.tableau.connect.grpc.GrpcProtocolService.wrap(GrpcProtocolService.java:289) ~[jdbcserver.jar:20214.0.17]
    at com.tableau.connect.grpc.GrpcProtocolService.constructProtocol(GrpcProtocolService.java:62) ~[jdbcserver.jar:20214.0.17]
    at com.tableau.connect.generated.ProtocolServiceGrpc$MethodHandlers.invoke(ProtocolServiceGrpc.java:1492) ~[jdbcserver.jar:20214.0.17]
    at io.grpc.stub.ServerCalls$UnaryServerCallHandler$UnaryServerCallListener.onHalfClose(ServerCalls.java:180) ~[jdbcserver.jar:20214.0.17]
    at io.grpc.PartialForwardingServerCallListener.onHalfClose(PartialForwardingServerCallListener.java:35) ~[jdbcserver.jar:20214.0.17]
    at io.grpc.ForwardingServerCallListener.onHalfClose(ForwardingServerCallListener.java:23) ~[jdbcserver.jar:20214.0.17]
    at io.grpc.ForwardingServerCallListener$SimpleForwardingServerCallListener.onHalfClose(ForwardingServerCallListener.java:40) ~[jdbcserver.jar:20214.0.17]
    at io.grpc.Contexts$ContextualizedServerCallListener.onHalfClose(Contexts.java:86) ~[jdbcserver.jar:20214.0.17]
    at io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.halfClosed(ServerCallImpl.java:331) ~[jdbcserver.jar:20214.0.17]
    at io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1HalfClosed.runInContext(ServerImpl.java:814) ~[jdbcserver.jar:20214.0.17]
    at io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37) ~[jdbcserver.jar:20214.0.17]
    at io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123) ~[jdbcserver.jar:20214.0.17]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source) ~[?:?]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source) ~[?:?]
    at java.lang.Thread.run(Unknown Source) [?:?]
Caused by: java.lang.ClassNotFoundException: org.reactivestreams.Publisher
    at java.net.URLClassLoader.findClass(Unknown Source) ~[?:?]
    at java.lang.ClassLoader.loadClass(Unknown Source) ~[?:?]
    at java.net.FactoryURLClassLoader.loadClass(Unknown Source) ~[?:?]
    at java.lang.ClassLoader.loadClass(Unknown Source) ~[?:?]
    ... 27 more
2022-03-29 18:06:13.270 -0400 (,,,,5,9) grpc-default-executor-1 : INFO com.tableau.connect.grpc.GrpcProtocolService - End local request 9 /constructProtocol.```  
**@kennybastani:** Looks like we added another transitive dependency  
**@kennybastani:** Let me see  
**@kennybastani:** `cp ~/.m2/repository/org/reactivestreams/reactive-streams/1.0.3/reactive-streams-1.0.3.jar ~/Library/Tableau/Drivers/`  
**@kchavda:** processing request...  
**@kennybastani:** I assume this will be the timeout issue  
**@kchavda:** yeah, it's sitting there  
**@kennybastani:** We'll need to figure out how to get you a connection to the broker from the desktop, whether that's a VPN or some other kind of proxy we build into the platform for Tableau users.  
**@kchavda:** Does the platform currently not support it?  
**@kennybastani:** I'm checking now  
**@kchavda:** Thanks.  
**@kennybastani:** Adding @mayanks to thread  
**@kchavda:** Thanks. I can create a support ticket if needed  
**@kennybastani:** That would be a good idea  
 **@diogo.baeder:** Hey guys, a few surprises I had with 0.10.0:
• The `segmentPartitionConfig` map doesn't accept the column-to-partition-config mapping directly, as the table configuration documentation says; it seems it can only contain a `columnPartitionMap` field, which in turn contains the mapping between column and partition config.
• The `segmentsConfig` seems to have had its old `replicasPerPartition` renamed to `replication`, if I understand correctly - or maybe I just don't understand where each should be used, if both are valid (although the config docs don't mention `replicasPerPartition` anymore).
Should I open a ticket on GitHub about these? Or am I getting something wrong perhaps?  
**@mayanks:** Do you see relevant info in the release notes? cc: @moradi.sajjad  
**@diogo.baeder:** Not sure if that question was for me, @mayanks - if yes, I don't see any mention of those changes  
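(For concreteness, the shape being described would look roughly like this hypothetical table-config excerpt - the column name and partition function are invented for illustration:)
```"tableIndexConfig": {
  "segmentPartitionConfig": {
    "columnPartitionMap": {
      "memberId": { "functionName": "Murmur", "numPartitions": 4 }
    }
  }
},
"segmentsConfig": {
  "replication": "2"
}```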

###  _#pinot-k8s-operator_

  
 **@manel.rhaiem92:** @manel.rhaiem92 has joined the channel  

###  _#pinot-dev_

  
 **@manel.rhaiem92:** @manel.rhaiem92 has joined the channel  
 **@apte.kaivalya:** Hey, a quick question - is there an HTTP API to do batch inserts into Pinot? Can we skip Kafka and ingest real-time batches of data using a direct API?  
**@mark.needham:** You can load data from S3 buckets or local fs:  
**@apte.kaivalya:** Yeah, I was looking for something more lightweight that can be plugged into a production app without much effort. An HTTP API call seems like one.  
**@mark.needham:** fair enough. The ingestion job is effectively wrapping
together creating segment(s), uploading them to Pinot's deep store, and then
having those segments downloaded to Pinot Servers  
**@mark.needham:** it does call the HTTP API to do some of those steps  
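(For example, segment upload goes through the controller; a hypothetical sketch assuming a controller at `localhost:9000` and a locally built segment tar - check the controller API docs for the exact parameters in your version:)
```curl -X POST -F segment=@members_OFFLINE_0.tar.gz "http://localhost:9000/v2/segments?tableName=members"```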
**@apte.kaivalya:** yeah - using those APIs might require a similar amount of work to what I'm trying to save :slightly_smiling_face:  
 **@mayanks:** You can push offline segments. Is that not what you are looking
for?  
 **@apte.kaivalya:** hmm, not exactly. I was wondering if there was an API to post events into Pinot. Pushing offline segments works, but I cannot invoke it from my pipeline. The reason I want to bypass Kafka ingestion is to avoid setting up a Kafka cluster (which is tricky in our prod infra, as we have rotating keys to connect to Kafka, etc.).  
 **@mayanks:** I see, there is one, but not in open source.  

###  _#community_

  
 **@manel.rhaiem92:** @manel.rhaiem92 has joined the channel  

###  _#pinot-realtime-table-rebalance_

  
 **@manel.rhaiem92:** @manel.rhaiem92 has joined the channel  

###  _#getting-started_

  
 **@manel.rhaiem92:** @manel.rhaiem92 has joined the channel  
 **@akumar:** @akumar has joined the channel  