Posted to commits@pinot.apache.org by GitBox <gi...@apache.org> on 2020/12/30 07:05:45 UTC
[GitHub] [incubator-pinot] fx19880617 opened a new pull request #6396: Adding ImportData sub command in pinot admin
fx19880617 opened a new pull request #6396:
URL: https://github.com/apache/incubator-pinot/pull/6396
## Description
Adding a Pinot-Admin sub command, ImportData, to help users import data into Pinot.
Usage:
```
bin/pinot-admin.sh ImportData -dataFilePath /path/to/my-local-data.gz.parquet -format parquet -table my-table
```
```
bin/pinot-admin.sh ImportData -dataFilePath s3://<my-bucket>/path/to/my-s3-data.gz.parquet -format parquet -table my-table
```
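Under the hood (per the ImportDataCommand diff quoted later in this thread), the command derives the ingestion job's input directory and file-matching glob from `-dataFilePath`. A standalone sketch of that derivation, using only the JDK — the class and method names here are mine, not part of the PR:

```java
import java.net.URI;

public class InputDirSketch {
    // Mirrors how ImportDataCommand#generateSegmentGenerationJobSpec derives
    // inputDirURI and includeFileNamePattern from -dataFilePath: resolving "."
    // against the file URI yields its parent directory, and the file itself is
    // then matched with a glob on its path.
    static String[] inputDirAndPattern(String dataFilePath) {
        URI dataFileURI = URI.create(dataFilePath);
        URI parent = dataFileURI.getPath().endsWith("/")
                ? dataFileURI.resolve("..") : dataFileURI.resolve(".");
        return new String[]{parent.toString(), "glob:**" + dataFileURI.getPath()};
    }

    public static void main(String[] args) {
        String[] r = inputDirAndPattern("s3://my-bucket/path/to/my-s3-data.gz.parquet");
        System.out.println(r[0]); // s3://my-bucket/path/to/
        System.out.println(r[1]); // glob:**/path/to/my-s3-data.gz.parquet
    }
}
```

The same resolution works for local paths (no scheme): `/tmp/data.csv` resolves to input dir `/tmp/` and pattern `glob:**/tmp/data.csv`.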
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
users@infra.apache.org
---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscribe@pinot.apache.org
For additional commands, e-mail: commits-help@pinot.apache.org
[GitHub] [incubator-pinot] codecov-io edited a comment on pull request #6396: Adding ImportData sub command in pinot admin
Posted by GitBox <gi...@apache.org>.
codecov-io edited a comment on pull request #6396:
URL: https://github.com/apache/incubator-pinot/pull/6396#issuecomment-752361098
# [Codecov](https://codecov.io/gh/apache/incubator-pinot/pull/6396?src=pr&el=h1) Report
> Merging [#6396](https://codecov.io/gh/apache/incubator-pinot/pull/6396?src=pr&el=desc) (f966c1e) into [master](https://codecov.io/gh/apache/incubator-pinot/commit/1beaab59b73f26c4e35f3b9bc856b03806cddf5a?el=desc) (1beaab5) will **decrease** coverage by `22.18%`.
> The diff coverage is `38.51%`.
[![Impacted file tree graph](https://codecov.io/gh/apache/incubator-pinot/pull/6396/graphs/tree.svg?width=650&height=150&src=pr&token=4ibza2ugkz)](https://codecov.io/gh/apache/incubator-pinot/pull/6396?src=pr&el=tree)
```diff
@@ Coverage Diff @@
## master #6396 +/- ##
===========================================
- Coverage 66.44% 44.26% -22.19%
===========================================
Files 1075 1319 +244
Lines 54773 64303 +9530
Branches 8168 9361 +1193
===========================================
- Hits 36396 28464 -7932
- Misses 15700 33482 +17782
+ Partials 2677 2357 -320
```
| Flag | Coverage Δ | |
|---|---|---|
| integration | `44.26% <38.51%> (?)` | |
Flags with carried forward coverage won't be shown. [Click here](https://docs.codecov.io/docs/carryforward-flags#carryforward-flags-in-the-pull-request-comment) to find out more.
| [Impacted Files](https://codecov.io/gh/apache/incubator-pinot/pull/6396?src=pr&el=tree) | Coverage Δ | |
|---|---|---|
| [...ot/broker/broker/AllowAllAccessControlFactory.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtYnJva2VyL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9icm9rZXIvYnJva2VyL0FsbG93QWxsQWNjZXNzQ29udHJvbEZhY3RvcnkuamF2YQ==) | `100.00% <ø> (ø)` | |
| [.../helix/BrokerUserDefinedMessageHandlerFactory.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtYnJva2VyL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9icm9rZXIvYnJva2VyL2hlbGl4L0Jyb2tlclVzZXJEZWZpbmVkTWVzc2FnZUhhbmRsZXJGYWN0b3J5LmphdmE=) | `52.83% <0.00%> (-13.84%)` | :arrow_down: |
| [...org/apache/pinot/broker/queryquota/HitCounter.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtYnJva2VyL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9icm9rZXIvcXVlcnlxdW90YS9IaXRDb3VudGVyLmphdmE=) | `0.00% <0.00%> (-100.00%)` | :arrow_down: |
| [...che/pinot/broker/queryquota/MaxHitRateTracker.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtYnJva2VyL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9icm9rZXIvcXVlcnlxdW90YS9NYXhIaXRSYXRlVHJhY2tlci5qYXZh) | `0.00% <0.00%> (ø)` | |
| [...ache/pinot/broker/queryquota/QueryQuotaEntity.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtYnJva2VyL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9icm9rZXIvcXVlcnlxdW90YS9RdWVyeVF1b3RhRW50aXR5LmphdmE=) | `0.00% <0.00%> (-50.00%)` | :arrow_down: |
| [...ker/routing/instanceselector/InstanceSelector.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtYnJva2VyL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9icm9rZXIvcm91dGluZy9pbnN0YW5jZXNlbGVjdG9yL0luc3RhbmNlU2VsZWN0b3IuamF2YQ==) | `100.00% <ø> (ø)` | |
| [...ceselector/StrictReplicaGroupInstanceSelector.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtYnJva2VyL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9icm9rZXIvcm91dGluZy9pbnN0YW5jZXNlbGVjdG9yL1N0cmljdFJlcGxpY2FHcm91cEluc3RhbmNlU2VsZWN0b3IuamF2YQ==) | `0.00% <0.00%> (ø)` | |
| [...roker/routing/segmentpruner/TimeSegmentPruner.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtYnJva2VyL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9icm9rZXIvcm91dGluZy9zZWdtZW50cHJ1bmVyL1RpbWVTZWdtZW50UHJ1bmVyLmphdmE=) | `0.00% <0.00%> (ø)` | |
| [...roker/routing/segmentpruner/interval/Interval.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtYnJva2VyL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9icm9rZXIvcm91dGluZy9zZWdtZW50cHJ1bmVyL2ludGVydmFsL0ludGVydmFsLmphdmE=) | `0.00% <0.00%> (ø)` | |
| [...r/routing/segmentpruner/interval/IntervalTree.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtYnJva2VyL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9icm9rZXIvcm91dGluZy9zZWdtZW50cHJ1bmVyL2ludGVydmFsL0ludGVydmFsVHJlZS5qYXZh) | `0.00% <0.00%> (ø)` | |
| ... and [1304 more](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree-more) | |
------
[Continue to review full report at Codecov](https://codecov.io/gh/apache/incubator-pinot/pull/6396?src=pr&el=continue).
> **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)
> `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
> Powered by [Codecov](https://codecov.io/gh/apache/incubator-pinot/pull/6396?src=pr&el=footer). Last update [7e0398b...f966c1e](https://codecov.io/gh/apache/incubator-pinot/pull/6396?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
[GitHub] [incubator-pinot] mayankshriv commented on a change in pull request #6396: Adding ImportData sub command in pinot admin
Posted by GitBox <gi...@apache.org>.
mayankshriv commented on a change in pull request #6396:
URL: https://github.com/apache/incubator-pinot/pull/6396#discussion_r559312533
##########
File path: pinot-tools/src/main/java/org/apache/pinot/tools/admin/command/ImportDataCommand.java
##########
@@ -0,0 +1,390 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements. See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership. The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License. You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied. See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+package org.apache.pinot.tools.admin.command;
+
+import com.google.common.base.Preconditions;
+import java.io.File;
+import java.io.IOException;
+import java.net.URI;
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+import org.apache.commons.codec.digest.DigestUtils;
+import org.apache.commons.io.FileUtils;
+import org.apache.pinot.controller.helix.ControllerRequestURLBuilder;
+import org.apache.pinot.spi.data.readers.FileFormat;
+import org.apache.pinot.spi.filesystem.PinotFSFactory;
+import org.apache.pinot.spi.ingestion.batch.BatchConfigProperties;
+import org.apache.pinot.spi.ingestion.batch.IngestionJobLauncher;
+import org.apache.pinot.spi.ingestion.batch.spec.ExecutionFrameworkSpec;
+import org.apache.pinot.spi.ingestion.batch.spec.PinotClusterSpec;
+import org.apache.pinot.spi.ingestion.batch.spec.PinotFSSpec;
+import org.apache.pinot.spi.ingestion.batch.spec.PushJobSpec;
+import org.apache.pinot.spi.ingestion.batch.spec.RecordReaderSpec;
+import org.apache.pinot.spi.ingestion.batch.spec.SegmentGenerationJobSpec;
+import org.apache.pinot.spi.ingestion.batch.spec.SegmentNameGeneratorSpec;
+import org.apache.pinot.spi.ingestion.batch.spec.TableSpec;
+import org.apache.pinot.spi.utils.IngestionConfigUtils;
+import org.apache.pinot.tools.Command;
+import org.kohsuke.args4j.Option;
+import org.kohsuke.args4j.spi.StringArrayOptionHandler;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+
+/**
+ * Class to implement ImportData command.
+ */
+@SuppressWarnings("unused")
+public class ImportDataCommand extends AbstractBaseAdminCommand implements Command {
+ private static final Logger LOGGER = LoggerFactory.getLogger(ImportDataCommand.class);
+ private static final String SEGMENT_NAME = "segment.name";
+
+ @Option(name = "-dataFilePath", required = true, metaVar = "<string>", usage = "Data file path.")
+ private String _dataFilePath;
+
+ @Option(name = "-format", required = true, metaVar = "<AVRO/CSV/JSON/THRIFT/PARQUET/ORC>", usage = "Input data format.")
+ private FileFormat _format;
+
+ @Option(name = "-segmentNameGeneratorType", metaVar = "<string>", usage = "Segment name generator type, default to FIXED type.")
+ private String _segmentNameGeneratorType = BatchConfigProperties.SegmentNameGeneratorType.FIXED;
+
+ @Option(name = "-table", required = true, metaVar = "<string>", usage = "Table name.")
+ private String _table;
+
+ @Option(name = "-controllerURI", metaVar = "<string>", usage = "Pinot Controller URI.")
+ private String _controllerURI = "http://localhost:9000";
+
+ @Option(name = "-tempDir", metaVar = "<string>", usage = "Temporary directory used to hold data during segment creation.")
+ private String _tempDir = new File(FileUtils.getTempDirectory(), getClass().getSimpleName()).getAbsolutePath();
+
+ @Option(name = "-extraConfigs", metaVar = "<extra configs>", handler = StringArrayOptionHandler.class, usage = "Extra configs to be set.")
Review comment:
`-additionalConfigs`?
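Whatever the flag ends up being called, each `-extraConfigs` argument is a `key=value` pair, split on the first `=` only so values may themselves contain `=`. A self-contained sketch of that parsing, mirroring the `getExtraConfigs` helper in the diff (the class name is mine, not from the PR):

```java
import java.util.Collections;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class ExtraConfigsSketch {
    // Mirrors ImportDataCommand#getExtraConfigs: split each pair on the
    // first '=' only, so a value such as "a=b" survives intact; entries
    // without any '=' are silently dropped.
    static Map<String, String> parse(List<String> extraConfigs) {
        if (extraConfigs == null) {
            return Collections.emptyMap();
        }
        Map<String, String> configs = new HashMap<>();
        for (String kvPair : extraConfigs) {
            String[] splits = kvPair.split("=", 2);
            if (splits.length == 2) {
                configs.put(splits[0], splits[1]);
            }
        }
        return configs;
    }
}
```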
##########
File path: pinot-tools/src/main/java/org/apache/pinot/tools/admin/command/ImportDataCommand.java
##########
@@ -0,0 +1,340 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements. See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership. The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License. You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied. See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+package org.apache.pinot.tools.admin.command;
+
+import com.google.common.base.Preconditions;
+import com.google.common.collect.ImmutableMap;
+import java.io.File;
+import java.io.IOException;
+import java.net.URI;
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+import org.apache.commons.codec.digest.DigestUtils;
+import org.apache.commons.io.FileUtils;
+import org.apache.pinot.controller.helix.ControllerRequestURLBuilder;
+import org.apache.pinot.spi.data.readers.FileFormat;
+import org.apache.pinot.spi.ingestion.batch.BatchConfigProperties;
+import org.apache.pinot.spi.ingestion.batch.IngestionJobLauncher;
+import org.apache.pinot.spi.ingestion.batch.spec.ExecutionFrameworkSpec;
+import org.apache.pinot.spi.ingestion.batch.spec.PinotClusterSpec;
+import org.apache.pinot.spi.ingestion.batch.spec.PinotFSSpec;
+import org.apache.pinot.spi.ingestion.batch.spec.PushJobSpec;
+import org.apache.pinot.spi.ingestion.batch.spec.RecordReaderSpec;
+import org.apache.pinot.spi.ingestion.batch.spec.SegmentGenerationJobSpec;
+import org.apache.pinot.spi.ingestion.batch.spec.SegmentNameGeneratorSpec;
+import org.apache.pinot.spi.ingestion.batch.spec.TableSpec;
+import org.apache.pinot.spi.utils.IngestionConfigUtils;
+import org.apache.pinot.tools.Command;
+import org.kohsuke.args4j.Option;
+import org.kohsuke.args4j.spi.StringArrayOptionHandler;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+
+/**
+ * Class to implement ImportData command.
+ */
+@SuppressWarnings("unused")
+public class ImportDataCommand extends AbstractBaseAdminCommand implements Command {
+ private static final Logger LOGGER = LoggerFactory.getLogger(ImportDataCommand.class);
+ private static final String SEGMENT_NAME = "segment.name";
+
+ @Option(name = "-dataFilePath", required = true, metaVar = "<string>", usage = "Data file path.")
+ private String _dataFilePath;
+
+ @Option(name = "-format", required = true, metaVar = "<AVRO/CSV/JSON/THRIFT/PARQUET/ORC>", usage = "Input data format.")
+ private FileFormat _format;
+
+ @Option(name = "-table", required = true, metaVar = "<string>", usage = "Table name.")
+ private String _table;
+
+ @Option(name = "-controllerURI", metaVar = "<string>", usage = "Pinot Controller URI.")
+ private String _controllerURI = "http://localhost:9000";
+
+ @Option(name = "-tempDir", metaVar = "<string>", usage = "Temporary directory used to hold data during segment creation.")
+ private String _tempDir = new File(FileUtils.getTempDirectory(), getClass().getSimpleName()).getAbsolutePath();
+
+ @Option(name = "-extraConfigs", metaVar = "<extra configs>", handler = StringArrayOptionHandler.class, usage = "Extra configs to be set.")
+ private List<String> _extraConfigs;
+
+ @SuppressWarnings("FieldCanBeLocal")
+ @Option(name = "-help", help = true, aliases = {"-h", "--h", "--help"}, usage = "Print this message.")
+ private boolean _help = false;
+
+ public ImportDataCommand setDataFilePath(String dataFilePath) {
+ _dataFilePath = dataFilePath;
+ return this;
+ }
+
+ public ImportDataCommand setFormat(FileFormat format) {
+ _format = format;
+ return this;
+ }
+
+ public ImportDataCommand setTable(String table) {
+ _table = table;
+ return this;
+ }
+
+ public ImportDataCommand setControllerURI(String controllerURI) {
+ _controllerURI = controllerURI;
+ return this;
+ }
+
+ public ImportDataCommand setTempDir(String tempDir) {
+ _tempDir = tempDir;
+ return this;
+ }
+
+ public List<String> getExtraConfigs() {
+ return _extraConfigs;
+ }
+
+ public ImportDataCommand setExtraConfigs(List<String> extraConfigs) {
+ _extraConfigs = extraConfigs;
+ return this;
+ }
+
+ public String getDataFilePath() {
+ return _dataFilePath;
+ }
+
+ public FileFormat getFormat() {
+ return _format;
+ }
+
+ public String getTable() {
+ return _table;
+ }
+
+ public String getControllerURI() {
+ return _controllerURI;
+ }
+
+ public String getTempDir() {
+ return _tempDir;
+ }
+
+ @Override
+ public String toString() {
+ String results = String
+ .format("ImportData -dataFilePath %s -format %s -table %s -controllerURI %s -tempDir %s", _dataFilePath,
+ _format, _table, _controllerURI, _tempDir);
+ if (_extraConfigs != null) {
+ results += " -extraConfigs " + Arrays.toString(_extraConfigs.toArray());
+ }
+ return results;
+ }
+
+ @Override
+ public final String getName() {
+ return "ImportData";
+ }
+
+ @Override
+ public String description() {
+ return "Import data into Pinot cluster.";
+ }
+
+ @Override
+ public boolean getHelp() {
+ return _help;
+ }
+
+ @Override
+ public boolean execute()
+ throws IOException {
+ LOGGER.info("Executing command: {}", toString());
+ Preconditions.checkArgument(_table != null, "'table' must be specified");
+ Preconditions.checkArgument(_format != null, "'format' must be specified");
+ Preconditions.checkArgument(_dataFilePath != null, "'dataFilePath' must be specified");
+
+ try {
+
+ URI dataFileURI = URI.create(_dataFilePath);
+ if (dataFileURI.getScheme() == null) {
+ File dataFile = new File(_dataFilePath);
+ Preconditions.checkArgument(dataFile.exists(), "'dataFile': '%s' doesn't exist", dataFile);
+ LOGGER.info("Found data file: {} of format: {}", dataFile, _format);
+ }
+
+ initTempDir();
+ IngestionJobLauncher.runIngestionJob(generateSegmentGenerationJobSpec());
+ LOGGER.info("Successfully loaded data from {} into Pinot.", _dataFilePath);
+ return true;
+ } finally {
+ FileUtils.deleteQuietly(new File(_tempDir));
+ }
+ }
+
+ private void initTempDir()
+ throws IOException {
+ File tempDir = new File(_tempDir);
+ if (tempDir.exists()) {
+ LOGGER.info("Deleting the existing 'tempDir': {}", tempDir);
+ FileUtils.forceDelete(tempDir);
+ }
+ FileUtils.forceMkdir(tempDir);
+ }
+
+ private SegmentGenerationJobSpec generateSegmentGenerationJobSpec() {
+ final Map<String, String> extraConfigs = getExtraConfigs(_extraConfigs);
+
+ SegmentGenerationJobSpec spec = new SegmentGenerationJobSpec();
+ URI dataFileURI = URI.create(_dataFilePath);
+ URI parent = dataFileURI.getPath().endsWith("/") ? dataFileURI.resolve("..") : dataFileURI.resolve(".");
+ spec.setInputDirURI(parent.toString());
+ spec.setIncludeFileNamePattern("glob:**" + dataFileURI.getPath());
+ spec.setOutputDirURI(_tempDir);
+ spec.setCleanUpOutputDir(true);
+ spec.setOverwriteOutput(true);
+ spec.setJobType("SegmentCreationAndTarPush");
+
+ // set ExecutionFrameworkSpec
+ ExecutionFrameworkSpec executionFrameworkSpec = new ExecutionFrameworkSpec();
+ executionFrameworkSpec.setName("standalone");
+ executionFrameworkSpec.setSegmentGenerationJobRunnerClassName(
+ "org.apache.pinot.plugin.ingestion.batch.standalone.SegmentGenerationJobRunner");
+ executionFrameworkSpec.setSegmentTarPushJobRunnerClassName(
+ "org.apache.pinot.plugin.ingestion.batch.standalone.SegmentTarPushJobRunner");
+ spec.setExecutionFrameworkSpec(executionFrameworkSpec);
+
+ // set PinotFSSpecs
+ List<PinotFSSpec> pinotFSSpecs = new ArrayList<>();
+ pinotFSSpecs.add(getPinotFSSpec("file", "org.apache.pinot.spi.filesystem.LocalPinotFS", Collections.emptyMap()));
+ pinotFSSpecs
+ .add(getPinotFSSpec("s3", "org.apache.pinot.plugin.filesystem.S3PinotFS", getS3PinotFSConfigs(extraConfigs)));
+ spec.setPinotFSSpecs(pinotFSSpecs);
+
+ // set RecordReaderSpec
+ RecordReaderSpec recordReaderSpec = new RecordReaderSpec();
+ recordReaderSpec.setDataFormat(_format.name());
+ recordReaderSpec.setClassName(getRecordReaderClass(_format));
+ recordReaderSpec.setConfigClassName(getRecordReaderConfigClass(_format));
+ recordReaderSpec.setConfigs(IngestionConfigUtils.getRecordReaderProps(extraConfigs));
+ spec.setRecordReaderSpec(recordReaderSpec);
+
+ // set TableSpec
+ TableSpec tableSpec = new TableSpec();
+ tableSpec.setTableName(_table);
+ tableSpec.setSchemaURI(ControllerRequestURLBuilder.baseUrl(_controllerURI).forTableSchemaGet(_table));
+ tableSpec.setTableConfigURI(ControllerRequestURLBuilder.baseUrl(_controllerURI).forTableGet(_table));
+ spec.setTableSpec(tableSpec);
+
+ // set SegmentNameGeneratorSpec
+ SegmentNameGeneratorSpec segmentNameGeneratorSpec = new SegmentNameGeneratorSpec();
+ segmentNameGeneratorSpec.setType(BatchConfigProperties.SegmentNameGeneratorType.FIXED);
+ String segmentName = (extraConfigs.containsKey(SEGMENT_NAME)) ? extraConfigs.get(SEGMENT_NAME)
+ : String.format("%s_%s", _table, DigestUtils.sha256Hex(_dataFilePath));
+ segmentNameGeneratorSpec.setConfigs(ImmutableMap.of(SEGMENT_NAME, segmentName));
+ spec.setSegmentNameGeneratorSpec(segmentNameGeneratorSpec);
+
+ // set PinotClusterSpecs
+ PinotClusterSpec pinotClusterSpec = new PinotClusterSpec();
+ pinotClusterSpec.setControllerURI(_controllerURI);
+ PinotClusterSpec[] pinotClusterSpecs = new PinotClusterSpec[]{pinotClusterSpec};
+ spec.setPinotClusterSpecs(pinotClusterSpecs);
+
+ // set PushJobSpec
+ PushJobSpec pushJobSpec = new PushJobSpec();
+ pushJobSpec.setPushAttempts(3);
+ pushJobSpec.setPushRetryIntervalMillis(10000);
+ spec.setPushJobSpec(pushJobSpec);
+
+ return spec;
+ }
+
+ private Map<String, String> getS3PinotFSConfigs(Map<String, String> extraConfigs) {
+ Map<String, String> s3PinotFSConfigs = new HashMap<>();
+ s3PinotFSConfigs.put("region", System.getProperty("AWS_REGION", "us-west-2"));
+ s3PinotFSConfigs.putAll(IngestionConfigUtils.getConfigMapWithPrefix(extraConfigs,
+ BatchConfigProperties.INPUT_FS_PROP_PREFIX + IngestionConfigUtils.DOT_SEPARATOR));
+ return s3PinotFSConfigs;
+ }
+
+ private PinotFSSpec getPinotFSSpec(String scheme, String className, Map<String, String> configs) {
+ PinotFSSpec pinotFSSpec = new PinotFSSpec();
+ pinotFSSpec.setScheme(scheme);
+ pinotFSSpec.setClassName(className);
+ pinotFSSpec.setConfigs(configs);
+ return pinotFSSpec;
+ }
+
+ private Map<String, String> getExtraConfigs(List<String> extraConfigs) {
+ if (extraConfigs == null) {
+ return Collections.emptyMap();
+ }
+ Map<String, String> recordReaderConfigs = new HashMap<>();
+ for (String kvPair : extraConfigs) {
+ String[] splits = kvPair.split("=", 2);
+ if ((splits.length == 2) && (splits[0] != null) && (splits[1] != null)) {
+ recordReaderConfigs.put(splits[0], splits[1]);
+ }
+ }
+ return recordReaderConfigs;
+ }
+
+ private String getRecordReaderConfigClass(FileFormat format) {
+ switch (format) {
+ case CSV:
+ return "org.apache.pinot.plugin.inputformat.csv.CSVRecordReaderConfig";
+ case PROTO:
+ return "org.apache.pinot.plugin.inputformat.protobuf.ProtoBufRecordReaderConfig";
+ case THRIFT:
+ return "org.apache.pinot.plugin.inputformat.thrift.ThriftRecordReaderConfig";
+ case ORC:
+ case JSON:
+ case AVRO:
+ case GZIPPED_AVRO:
+ case PARQUET:
+ return null;
+ default:
+ throw new IllegalArgumentException("Unsupported file format - " + format);
+ }
+ }
+
+ private String getRecordReaderClass(FileFormat format) {
Review comment:
Yeah, perhaps can be done outside of this PR.
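One more detail from the diff above: when `segment.name` is not supplied via `-extraConfigs`, the command defaults the segment name to `<table>_<sha256 of the data file path>`. A standalone sketch of that default using only the JDK (the PR itself uses commons-codec's `DigestUtils.sha256Hex`; the class name here is mine):

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

public class SegmentNameSketch {
    // Default segment name in ImportDataCommand#generateSegmentGenerationJobSpec:
    // "<table>_<sha256Hex(dataFilePath)>", giving a stable, path-derived name.
    static String defaultSegmentName(String table, String dataFilePath) {
        try {
            MessageDigest md = MessageDigest.getInstance("SHA-256");
            byte[] digest = md.digest(dataFilePath.getBytes(StandardCharsets.UTF_8));
            StringBuilder hex = new StringBuilder();
            for (byte b : digest) {
                hex.append(String.format("%02x", b));
            }
            return table + "_" + hex;
        } catch (NoSuchAlgorithmException e) {
            throw new IllegalStateException(e); // SHA-256 is always available
        }
    }
}
```

Because the name is derived from the path, re-importing the same file overwrites the same segment rather than creating a duplicate.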
[GitHub] [incubator-pinot] codecov-io edited a comment on pull request #6396: Adding ImportData sub command in pinot admin
Posted by GitBox <gi...@apache.org>.
codecov-io edited a comment on pull request #6396:
URL: https://github.com/apache/incubator-pinot/pull/6396#issuecomment-752361098
# [Codecov](https://codecov.io/gh/apache/incubator-pinot/pull/6396?src=pr&el=h1) Report
> Merging [#6396](https://codecov.io/gh/apache/incubator-pinot/pull/6396?src=pr&el=desc) (ff29061) into [master](https://codecov.io/gh/apache/incubator-pinot/commit/1beaab59b73f26c4e35f3b9bc856b03806cddf5a?el=desc) (1beaab5) will **increase** coverage by `6.86%`.
> The diff coverage is `73.16%`.
[![Impacted file tree graph](https://codecov.io/gh/apache/incubator-pinot/pull/6396/graphs/tree.svg?width=650&height=150&src=pr&token=4ibza2ugkz)](https://codecov.io/gh/apache/incubator-pinot/pull/6396?src=pr&el=tree)
```diff
@@ Coverage Diff @@
## master #6396 +/- ##
==========================================
+ Coverage 66.44% 73.30% +6.86%
==========================================
Files 1075 1318 +243
Lines 54773 64169 +9396
Branches 8168 9334 +1166
==========================================
+ Hits 36396 47042 +10646
+ Misses 15700 14067 -1633
- Partials 2677 3060 +383
```
| Flag | Coverage Δ | |
|---|---|---|
| integration | `44.17% <38.69%> (?)` | |
| unittests | `64.94% <56.80%> (?)` | |
Flags with carried forward coverage won't be shown. [Click here](https://docs.codecov.io/docs/carryforward-flags#carryforward-flags-in-the-pull-request-comment) to find out more.
| [Impacted Files](https://codecov.io/gh/apache/incubator-pinot/pull/6396?src=pr&el=tree) | Coverage Δ | |
|---|---|---|
| [...ot/broker/broker/AllowAllAccessControlFactory.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtYnJva2VyL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9icm9rZXIvYnJva2VyL0FsbG93QWxsQWNjZXNzQ29udHJvbEZhY3RvcnkuamF2YQ==) | `100.00% <ø> (ø)` | |
| [.../helix/BrokerUserDefinedMessageHandlerFactory.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtYnJva2VyL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9icm9rZXIvYnJva2VyL2hlbGl4L0Jyb2tlclVzZXJEZWZpbmVkTWVzc2FnZUhhbmRsZXJGYWN0b3J5LmphdmE=) | `52.83% <0.00%> (-13.84%)` | :arrow_down: |
| [...ker/routing/instanceselector/InstanceSelector.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtYnJva2VyL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9icm9rZXIvcm91dGluZy9pbnN0YW5jZXNlbGVjdG9yL0luc3RhbmNlU2VsZWN0b3IuamF2YQ==) | `100.00% <ø> (ø)` | |
| [.../main/java/org/apache/pinot/client/Connection.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtY2xpZW50cy9waW5vdC1qYXZhLWNsaWVudC9zcmMvbWFpbi9qYXZhL29yZy9hcGFjaGUvcGlub3QvY2xpZW50L0Nvbm5lY3Rpb24uamF2YQ==) | `44.44% <0.00%> (-4.40%)` | :arrow_down: |
| [...not/common/assignment/InstancePartitionsUtils.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtY29tbW9uL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9jb21tb24vYXNzaWdubWVudC9JbnN0YW5jZVBhcnRpdGlvbnNVdGlscy5qYXZh) | `78.57% <ø> (+5.40%)` | :arrow_up: |
| [...common/config/tuner/NoOpTableTableConfigTuner.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtY29tbW9uL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9jb21tb24vY29uZmlnL3R1bmVyL05vT3BUYWJsZVRhYmxlQ29uZmlnVHVuZXIuamF2YQ==) | `100.00% <ø> (ø)` | |
| [...ot/common/config/tuner/RealTimeAutoIndexTuner.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtY29tbW9uL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9jb21tb24vY29uZmlnL3R1bmVyL1JlYWxUaW1lQXV0b0luZGV4VHVuZXIuamF2YQ==) | `100.00% <ø> (ø)` | |
| [.../common/config/tuner/TableConfigTunerRegistry.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtY29tbW9uL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9jb21tb24vY29uZmlnL3R1bmVyL1RhYmxlQ29uZmlnVHVuZXJSZWdpc3RyeS5qYXZh) | `72.00% <ø> (ø)` | |
| [.../apache/pinot/common/exception/QueryException.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtY29tbW9uL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9jb21tb24vZXhjZXB0aW9uL1F1ZXJ5RXhjZXB0aW9uLmphdmE=) | `90.27% <ø> (+5.55%)` | :arrow_up: |
| [...pinot/common/function/AggregationFunctionType.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtY29tbW9uL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9jb21tb24vZnVuY3Rpb24vQWdncmVnYXRpb25GdW5jdGlvblR5cGUuamF2YQ==) | `100.00% <ø> (ø)` | |
| ... and [1106 more](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree-more) | |
------
[Continue to review full report at Codecov](https://codecov.io/gh/apache/incubator-pinot/pull/6396?src=pr&el=continue).
> **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)
> `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
> Powered by [Codecov](https://codecov.io/gh/apache/incubator-pinot/pull/6396?src=pr&el=footer). Last update [7e0398b...ff29061](https://codecov.io/gh/apache/incubator-pinot/pull/6396?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
[GitHub] [incubator-pinot] codecov-io edited a comment on pull request #6396: Adding ImportData sub command in pinot admin
Posted by GitBox <gi...@apache.org>.
codecov-io edited a comment on pull request #6396:
URL: https://github.com/apache/incubator-pinot/pull/6396#issuecomment-752361098
# [Codecov](https://codecov.io/gh/apache/incubator-pinot/pull/6396?src=pr&el=h1) Report
> Merging [#6396](https://codecov.io/gh/apache/incubator-pinot/pull/6396?src=pr&el=desc) (79feffc) into [master](https://codecov.io/gh/apache/incubator-pinot/commit/1beaab59b73f26c4e35f3b9bc856b03806cddf5a?el=desc) (1beaab5) will **decrease** coverage by `1.49%`.
> The diff coverage is `56.80%`.
[![Impacted file tree graph](https://codecov.io/gh/apache/incubator-pinot/pull/6396/graphs/tree.svg?width=650&height=150&src=pr&token=4ibza2ugkz)](https://codecov.io/gh/apache/incubator-pinot/pull/6396?src=pr&el=tree)
```diff
@@ Coverage Diff @@
## master #6396 +/- ##
==========================================
- Coverage 66.44% 64.95% -1.50%
==========================================
Files 1075 1318 +243
Lines 54773 64169 +9396
Branches 8168 9334 +1166
==========================================
+ Hits 36396 41680 +5284
- Misses 15700 19519 +3819
- Partials 2677 2970 +293
```
| Flag | Coverage Δ | |
|---|---|---|
| unittests | `64.95% <56.80%> (?)` | |
Flags with carried forward coverage won't be shown. [Click here](https://docs.codecov.io/docs/carryforward-flags#carryforward-flags-in-the-pull-request-comment) to find out more.
| [Impacted Files](https://codecov.io/gh/apache/incubator-pinot/pull/6396?src=pr&el=tree) | Coverage Δ | |
|---|---|---|
| [...e/pinot/broker/api/resources/PinotBrokerDebug.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtYnJva2VyL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9icm9rZXIvYXBpL3Jlc291cmNlcy9QaW5vdEJyb2tlckRlYnVnLmphdmE=) | `0.00% <0.00%> (-79.32%)` | :arrow_down: |
| [...ot/broker/broker/AllowAllAccessControlFactory.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtYnJva2VyL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9icm9rZXIvYnJva2VyL0FsbG93QWxsQWNjZXNzQ29udHJvbEZhY3RvcnkuamF2YQ==) | `71.42% <ø> (-28.58%)` | :arrow_down: |
| [.../helix/BrokerUserDefinedMessageHandlerFactory.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtYnJva2VyL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9icm9rZXIvYnJva2VyL2hlbGl4L0Jyb2tlclVzZXJEZWZpbmVkTWVzc2FnZUhhbmRsZXJGYWN0b3J5LmphdmE=) | `33.96% <0.00%> (-32.71%)` | :arrow_down: |
| [...ker/routing/instanceselector/InstanceSelector.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtYnJva2VyL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9icm9rZXIvcm91dGluZy9pbnN0YW5jZXNlbGVjdG9yL0luc3RhbmNlU2VsZWN0b3IuamF2YQ==) | `100.00% <ø> (ø)` | |
| [...ava/org/apache/pinot/client/AbstractResultSet.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtY2xpZW50cy9waW5vdC1qYXZhLWNsaWVudC9zcmMvbWFpbi9qYXZhL29yZy9hcGFjaGUvcGlub3QvY2xpZW50L0Fic3RyYWN0UmVzdWx0U2V0LmphdmE=) | `66.66% <0.00%> (+9.52%)` | :arrow_up: |
| [.../main/java/org/apache/pinot/client/Connection.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtY2xpZW50cy9waW5vdC1qYXZhLWNsaWVudC9zcmMvbWFpbi9qYXZhL29yZy9hcGFjaGUvcGlub3QvY2xpZW50L0Nvbm5lY3Rpb24uamF2YQ==) | `35.55% <0.00%> (-13.29%)` | :arrow_down: |
| [...inot/client/JsonAsyncHttpPinotClientTransport.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtY2xpZW50cy9waW5vdC1qYXZhLWNsaWVudC9zcmMvbWFpbi9qYXZhL29yZy9hcGFjaGUvcGlub3QvY2xpZW50L0pzb25Bc3luY0h0dHBQaW5vdENsaWVudFRyYW5zcG9ydC5qYXZh) | `10.90% <0.00%> (-51.10%)` | :arrow_down: |
| [...not/common/assignment/InstancePartitionsUtils.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtY29tbW9uL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9jb21tb24vYXNzaWdubWVudC9JbnN0YW5jZVBhcnRpdGlvbnNVdGlscy5qYXZh) | `73.80% <ø> (+0.63%)` | :arrow_up: |
| [...common/config/tuner/NoOpTableTableConfigTuner.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtY29tbW9uL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9jb21tb24vY29uZmlnL3R1bmVyL05vT3BUYWJsZVRhYmxlQ29uZmlnVHVuZXIuamF2YQ==) | `100.00% <ø> (ø)` | |
| [...ot/common/config/tuner/RealTimeAutoIndexTuner.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtY29tbW9uL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9jb21tb24vY29uZmlnL3R1bmVyL1JlYWxUaW1lQXV0b0luZGV4VHVuZXIuamF2YQ==) | `100.00% <ø> (ø)` | |
| ... and [1152 more](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree-more) | |
------
[Continue to review full report at Codecov](https://codecov.io/gh/apache/incubator-pinot/pull/6396?src=pr&el=continue).
> **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)
> `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
> Powered by [Codecov](https://codecov.io/gh/apache/incubator-pinot/pull/6396?src=pr&el=footer). Last update [7e0398b...79feffc](https://codecov.io/gh/apache/incubator-pinot/pull/6396?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
users@infra.apache.org
---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscribe@pinot.apache.org
For additional commands, e-mail: commits-help@pinot.apache.org
[GitHub] [incubator-pinot] codecov-io edited a comment on pull request #6396: Adding ImportData sub command in pinot admin
Posted by GitBox <gi...@apache.org>.
codecov-io edited a comment on pull request #6396:
URL: https://github.com/apache/incubator-pinot/pull/6396#issuecomment-752361098
# [Codecov](https://codecov.io/gh/apache/incubator-pinot/pull/6396?src=pr&el=h1) Report
> Merging [#6396](https://codecov.io/gh/apache/incubator-pinot/pull/6396?src=pr&el=desc) (ff29061) into [master](https://codecov.io/gh/apache/incubator-pinot/commit/1beaab59b73f26c4e35f3b9bc856b03806cddf5a?el=desc) (1beaab5) will **decrease** coverage by `1.50%`.
> The diff coverage is `56.80%`.
[![Impacted file tree graph](https://codecov.io/gh/apache/incubator-pinot/pull/6396/graphs/tree.svg?width=650&height=150&src=pr&token=4ibza2ugkz)](https://codecov.io/gh/apache/incubator-pinot/pull/6396?src=pr&el=tree)
```diff
@@             Coverage Diff              @@
##             master     #6396      +/-   ##
=============================================
- Coverage     66.44%    64.94%    -1.51%
=============================================
  Files          1075      1318     +243
  Lines         54773     64169    +9396
  Branches       8168      9334    +1166
=============================================
+ Hits          36396     41675    +5279
- Misses        15700     19527    +3827
- Partials       2677      2967     +290
```
| Flag | Coverage Δ | |
|---|---|---|
| unittests | `64.94% <56.80%> (?)` | |
Flags with carried forward coverage won't be shown. [Click here](https://docs.codecov.io/docs/carryforward-flags#carryforward-flags-in-the-pull-request-comment) to find out more.
| [Impacted Files](https://codecov.io/gh/apache/incubator-pinot/pull/6396?src=pr&el=tree) | Coverage Δ | |
|---|---|---|
| [...e/pinot/broker/api/resources/PinotBrokerDebug.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtYnJva2VyL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9icm9rZXIvYXBpL3Jlc291cmNlcy9QaW5vdEJyb2tlckRlYnVnLmphdmE=) | `0.00% <0.00%> (-79.32%)` | :arrow_down: |
| [...ot/broker/broker/AllowAllAccessControlFactory.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtYnJva2VyL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9icm9rZXIvYnJva2VyL0FsbG93QWxsQWNjZXNzQ29udHJvbEZhY3RvcnkuamF2YQ==) | `71.42% <ø> (-28.58%)` | :arrow_down: |
| [.../helix/BrokerUserDefinedMessageHandlerFactory.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtYnJva2VyL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9icm9rZXIvYnJva2VyL2hlbGl4L0Jyb2tlclVzZXJEZWZpbmVkTWVzc2FnZUhhbmRsZXJGYWN0b3J5LmphdmE=) | `33.96% <0.00%> (-32.71%)` | :arrow_down: |
| [...ker/routing/instanceselector/InstanceSelector.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtYnJva2VyL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9icm9rZXIvcm91dGluZy9pbnN0YW5jZXNlbGVjdG9yL0luc3RhbmNlU2VsZWN0b3IuamF2YQ==) | `100.00% <ø> (ø)` | |
| [...ava/org/apache/pinot/client/AbstractResultSet.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtY2xpZW50cy9waW5vdC1qYXZhLWNsaWVudC9zcmMvbWFpbi9qYXZhL29yZy9hcGFjaGUvcGlub3QvY2xpZW50L0Fic3RyYWN0UmVzdWx0U2V0LmphdmE=) | `66.66% <0.00%> (+9.52%)` | :arrow_up: |
| [.../main/java/org/apache/pinot/client/Connection.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtY2xpZW50cy9waW5vdC1qYXZhLWNsaWVudC9zcmMvbWFpbi9qYXZhL29yZy9hcGFjaGUvcGlub3QvY2xpZW50L0Nvbm5lY3Rpb24uamF2YQ==) | `35.55% <0.00%> (-13.29%)` | :arrow_down: |
| [...inot/client/JsonAsyncHttpPinotClientTransport.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtY2xpZW50cy9waW5vdC1qYXZhLWNsaWVudC9zcmMvbWFpbi9qYXZhL29yZy9hcGFjaGUvcGlub3QvY2xpZW50L0pzb25Bc3luY0h0dHBQaW5vdENsaWVudFRyYW5zcG9ydC5qYXZh) | `10.90% <0.00%> (-51.10%)` | :arrow_down: |
| [...not/common/assignment/InstancePartitionsUtils.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtY29tbW9uL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9jb21tb24vYXNzaWdubWVudC9JbnN0YW5jZVBhcnRpdGlvbnNVdGlscy5qYXZh) | `73.80% <ø> (+0.63%)` | :arrow_up: |
| [...common/config/tuner/NoOpTableTableConfigTuner.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtY29tbW9uL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9jb21tb24vY29uZmlnL3R1bmVyL05vT3BUYWJsZVRhYmxlQ29uZmlnVHVuZXIuamF2YQ==) | `100.00% <ø> (ø)` | |
| [...ot/common/config/tuner/RealTimeAutoIndexTuner.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtY29tbW9uL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9jb21tb24vY29uZmlnL3R1bmVyL1JlYWxUaW1lQXV0b0luZGV4VHVuZXIuamF2YQ==) | `100.00% <ø> (ø)` | |
| ... and [1152 more](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree-more) | |
------
[Continue to review full report at Codecov](https://codecov.io/gh/apache/incubator-pinot/pull/6396?src=pr&el=continue).
> **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)
> `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
> Powered by [Codecov](https://codecov.io/gh/apache/incubator-pinot/pull/6396?src=pr&el=footer). Last update [7e0398b...ff29061](https://codecov.io/gh/apache/incubator-pinot/pull/6396?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
[GitHub] [incubator-pinot] fx19880617 commented on a change in pull request #6396: Adding ImportData sub command in pinot admin
Posted by GitBox <gi...@apache.org>.
fx19880617 commented on a change in pull request #6396:
URL: https://github.com/apache/incubator-pinot/pull/6396#discussion_r552979185
##########
File path: pinot-tools/src/main/java/org/apache/pinot/tools/admin/command/ImportDataCommand.java
##########
@@ -0,0 +1,340 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements. See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership. The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License. You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied. See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+package org.apache.pinot.tools.admin.command;
+
+import com.google.common.base.Preconditions;
+import com.google.common.collect.ImmutableMap;
+import java.io.File;
+import java.io.IOException;
+import java.net.URI;
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+import org.apache.commons.codec.digest.DigestUtils;
+import org.apache.commons.io.FileUtils;
+import org.apache.pinot.controller.helix.ControllerRequestURLBuilder;
+import org.apache.pinot.spi.data.readers.FileFormat;
+import org.apache.pinot.spi.ingestion.batch.BatchConfigProperties;
+import org.apache.pinot.spi.ingestion.batch.IngestionJobLauncher;
+import org.apache.pinot.spi.ingestion.batch.spec.ExecutionFrameworkSpec;
+import org.apache.pinot.spi.ingestion.batch.spec.PinotClusterSpec;
+import org.apache.pinot.spi.ingestion.batch.spec.PinotFSSpec;
+import org.apache.pinot.spi.ingestion.batch.spec.PushJobSpec;
+import org.apache.pinot.spi.ingestion.batch.spec.RecordReaderSpec;
+import org.apache.pinot.spi.ingestion.batch.spec.SegmentGenerationJobSpec;
+import org.apache.pinot.spi.ingestion.batch.spec.SegmentNameGeneratorSpec;
+import org.apache.pinot.spi.ingestion.batch.spec.TableSpec;
+import org.apache.pinot.spi.utils.IngestionConfigUtils;
+import org.apache.pinot.tools.Command;
+import org.kohsuke.args4j.Option;
+import org.kohsuke.args4j.spi.StringArrayOptionHandler;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+
+/**
+ * Class to implement ImportData command.
+ */
+@SuppressWarnings("unused")
+public class ImportDataCommand extends AbstractBaseAdminCommand implements Command {
+ private static final Logger LOGGER = LoggerFactory.getLogger(ImportDataCommand.class);
+ private static final String SEGMENT_NAME = "segment.name";
+
+ @Option(name = "-dataFilePath", required = true, metaVar = "<string>", usage = "data file path.")
+ private String _dataFilePath;
+
+ @Option(name = "-format", required = true, metaVar = "<AVRO/CSV/JSON/THRIFT/PARQUET/ORC>", usage = "Input data format.")
+ private FileFormat _format;
+
+ @Option(name = "-table", required = true, metaVar = "<string>", usage = "Table name.")
+ private String _table;
+
+ @Option(name = "-controllerURI", metaVar = "<string>", usage = "Pinot Controller URI.")
+ private String _controllerURI = "http://localhost:9000";
+
+ @Option(name = "-tempDir", metaVar = "<string>", usage = "Temporary directory used to hold data during segment creation.")
+ private String _tempDir = new File(FileUtils.getTempDirectory(), getClass().getSimpleName()).getAbsolutePath();
+
+ @Option(name = "-extraConfigs", metaVar = "<extra configs>", handler = StringArrayOptionHandler.class, usage = "Extra configs to be set.")
+ private List<String> _extraConfigs;
+
+ @SuppressWarnings("FieldCanBeLocal")
+ @Option(name = "-help", help = true, aliases = {"-h", "--h", "--help"}, usage = "Print this message.")
+ private boolean _help = false;
+
+ public ImportDataCommand setDataFilePath(String dataFilePath) {
+ _dataFilePath = dataFilePath;
+ return this;
+ }
+
+ public ImportDataCommand setFormat(FileFormat format) {
+ _format = format;
+ return this;
+ }
+
+ public ImportDataCommand setTable(String table) {
+ _table = table;
+ return this;
+ }
+
+ public ImportDataCommand setControllerURI(String controllerURI) {
+ _controllerURI = controllerURI;
+ return this;
+ }
+
+ public ImportDataCommand setTempDir(String tempDir) {
+ _tempDir = tempDir;
+ return this;
+ }
+
+ public List<String> getExtraConfigs() {
+ return _extraConfigs;
+ }
+
+ public ImportDataCommand setExtraConfigs(List<String> extraConfigs) {
+ _extraConfigs = extraConfigs;
+ return this;
+ }
+
+ public String getDataFilePath() {
+ return _dataFilePath;
+ }
+
+ public FileFormat getFormat() {
+ return _format;
+ }
+
+ public String getTable() {
+ return _table;
+ }
+
+ public String getControllerURI() {
+ return _controllerURI;
+ }
+
+ public String getTempDir() {
+ return _tempDir;
+ }
+
+ @Override
+ public String toString() {
+ String results = String
+ .format("InsertData -dataFilePath %s -format %s -table %s -controllerURI %s -tempDir %s", _dataFilePath,
+ _format, _table, _controllerURI, _tempDir);
+ if (_extraConfigs != null) {
+ results += " -extraConfigs " + Arrays.toString(_extraConfigs.toArray());
+ }
+ return results;
+ }
+
+ @Override
+ public final String getName() {
+ return "InsertData";
+ }
+
+ @Override
+ public String description() {
+ return "Insert data into Pinot cluster.";
+ }
+
+ @Override
+ public boolean getHelp() {
+ return _help;
+ }
+
+ @Override
+ public boolean execute()
+ throws IOException {
+ LOGGER.info("Executing command: {}", toString());
+ Preconditions.checkArgument(_table != null, "'table' must be specified");
+ Preconditions.checkArgument(_format != null, "'format' must be specified");
+ Preconditions.checkArgument(_dataFilePath != null, "'dataFilePath' must be specified");
+
+ try {
+
+ URI dataFileURI = URI.create(_dataFilePath);
+ if ((dataFileURI.getScheme() == null)) {
+ File dataFile = new File(_dataFilePath);
+ Preconditions.checkArgument(dataFile.exists(), "'dataFile': '%s' doesn't exist", dataFile);
+ LOGGER.info("Found data files: {} of format: {}", dataFile, _format);
+ }
+
+ initTempDir();
+ IngestionJobLauncher.runIngestionJob(generateSegmentGenerationJobSpec());
+ LOGGER.info("Successfully loaded data from {} to Pinot.", _dataFilePath);
+ return true;
+ } catch (Exception e) {
+ throw e;
+ } finally {
+ FileUtils.deleteQuietly(new File(_tempDir));
+ }
+ }
+
+ private void initTempDir()
+ throws IOException {
+ File tempDir = new File(_tempDir);
+ if (tempDir.exists()) {
+ LOGGER.info("Deleting the existing 'tempDir': {}", tempDir);
+ FileUtils.forceDelete(tempDir);
+ }
+ FileUtils.forceMkdir(tempDir);
+ }
+
+ private SegmentGenerationJobSpec generateSegmentGenerationJobSpec() {
+ final Map<String, String> extraConfigs = getExtraConfigs(_extraConfigs);
+
+ SegmentGenerationJobSpec spec = new SegmentGenerationJobSpec();
+ URI dataFileURI = URI.create(_dataFilePath);
+ URI parent = dataFileURI.getPath().endsWith("/") ? dataFileURI.resolve("..") : dataFileURI.resolve(".");
+ spec.setInputDirURI(parent.toString());
+ spec.setIncludeFileNamePattern("glob:**" + dataFileURI.getPath());
+ spec.setOutputDirURI(_tempDir);
+ spec.setCleanUpOutputDir(true);
+ spec.setOverwriteOutput(true);
+ spec.setJobType("SegmentCreationAndTarPush");
+
+ // set ExecutionFrameworkSpec
+ ExecutionFrameworkSpec executionFrameworkSpec = new ExecutionFrameworkSpec();
+ executionFrameworkSpec.setName("standalone");
+ executionFrameworkSpec.setSegmentGenerationJobRunnerClassName(
+ "org.apache.pinot.plugin.ingestion.batch.standalone.SegmentGenerationJobRunner");
+ executionFrameworkSpec.setSegmentTarPushJobRunnerClassName(
+ "org.apache.pinot.plugin.ingestion.batch.standalone.SegmentTarPushJobRunner");
+ spec.setExecutionFrameworkSpec(executionFrameworkSpec);
+
+ // set PinotFSSpecs
+ List<PinotFSSpec> pinotFSSpecs = new ArrayList<>();
+ pinotFSSpecs.add(getPinotFSSpec("file", "org.apache.pinot.spi.filesystem.LocalPinotFS", Collections.emptyMap()));
+ pinotFSSpecs
+ .add(getPinotFSSpec("s3", "org.apache.pinot.plugin.filesystem.S3PinotFS", getS3PinotFSConfigs(extraConfigs)));
+ spec.setPinotFSSpecs(pinotFSSpecs);
+
+ // set RecordReaderSpec
+ RecordReaderSpec recordReaderSpec = new RecordReaderSpec();
+ recordReaderSpec.setDataFormat(_format.name());
+ recordReaderSpec.setClassName(getRecordReaderClass(_format));
+ recordReaderSpec.setConfigClassName(getRecordReaderConfigClass(_format));
+ recordReaderSpec.setConfigs(IngestionConfigUtils.getRecordReaderProps(extraConfigs));
+ spec.setRecordReaderSpec(recordReaderSpec);
+
+ // set TableSpec
+ TableSpec tableSpec = new TableSpec();
+ tableSpec.setTableName(_table);
+ tableSpec.setSchemaURI(ControllerRequestURLBuilder.baseUrl(_controllerURI).forTableSchemaGet(_table));
+ tableSpec.setTableConfigURI(ControllerRequestURLBuilder.baseUrl(_controllerURI).forTableGet(_table));
+ spec.setTableSpec(tableSpec);
+
+ // set SegmentNameGeneratorSpec
+ SegmentNameGeneratorSpec segmentNameGeneratorSpec = new SegmentNameGeneratorSpec();
+ segmentNameGeneratorSpec
+ .setType(org.apache.pinot.spi.ingestion.batch.BatchConfigProperties.SegmentNameGeneratorType.FIXED);
+ String segmentName = (extraConfigs.containsKey(SEGMENT_NAME)) ? extraConfigs.get(SEGMENT_NAME)
+ : String.format("%s_%s", _table, DigestUtils.sha256Hex(_dataFilePath));
+ segmentNameGeneratorSpec.setConfigs(ImmutableMap.of(SEGMENT_NAME, segmentName));
+ spec.setSegmentNameGeneratorSpec(segmentNameGeneratorSpec);
+
+ // set PinotClusterSpecs
+ PinotClusterSpec pinotClusterSpec = new PinotClusterSpec();
+ pinotClusterSpec.setControllerURI(_controllerURI);
+ PinotClusterSpec[] pinotClusterSpecs = new PinotClusterSpec[]{pinotClusterSpec};
+ spec.setPinotClusterSpecs(pinotClusterSpecs);
+
+ // set PushJobSpec
+ PushJobSpec pushJobSpec = new PushJobSpec();
+ pushJobSpec.setPushAttempts(3);
+ pushJobSpec.setPushRetryIntervalMillis(10000);
+ spec.setPushJobSpec(pushJobSpec);
+
+ return spec;
+ }
+
+ private Map<String, String> getS3PinotFSConfigs(Map<String, String> extraConfigs) {
+ Map<String, String> s3PinotFSConfigs = new HashMap<>();
+ s3PinotFSConfigs.put("region", System.getProperty("AWS_REGION", "us-west-2"));
+ s3PinotFSConfigs.putAll(IngestionConfigUtils.getConfigMapWithPrefix(extraConfigs,
+ BatchConfigProperties.INPUT_FS_PROP_PREFIX + IngestionConfigUtils.DOT_SEPARATOR));
+ return s3PinotFSConfigs;
+ }
+
+ private PinotFSSpec getPinotFSSpec(String scheme, String className, Map<String, String> configs) {
+ PinotFSSpec pinotFSSpec = new PinotFSSpec();
+ pinotFSSpec.setScheme(scheme);
+ pinotFSSpec.setClassName(className);
+ pinotFSSpec.setConfigs(configs);
+ return pinotFSSpec;
+ }
+
+ private Map<String, String> getExtraConfigs(List<String> extraConfigs) {
+ if (extraConfigs == null) {
+ return Collections.emptyMap();
+ }
+ Map<String, String> recordReaderConfigs = new HashMap<>();
+ for (String kvPair : extraConfigs) {
+ String[] splits = kvPair.split("=", 2);
+ if ((splits.length == 2) && (splits[0] != null) && (splits[1] != null)) {
+ recordReaderConfigs.put(splits[0], splits[1]);
+ }
+ }
+ return recordReaderConfigs;
+ }
+
+ private String getRecordReaderConfigClass(FileFormat format) {
+ switch (format) {
+ case CSV:
+ return "org.apache.pinot.plugin.inputformat.csv.CSVRecordReaderConfig";
+ case PROTO:
+ return "org.apache.pinot.plugin.inputformat.protobuf.ProtoBufRecordReaderConfig";
+ case THRIFT:
+ return "org.apache.pinot.plugin.inputformat.thrift.ThriftRecordReaderConfig";
+ case ORC:
+ case JSON:
+ case AVRO:
+ case GZIPPED_AVRO:
+ case PARQUET:
+ return null;
+ default:
+ throw new IllegalArgumentException("Unsupported file format - " + format);
+ }
+ }
+
+ private String getRecordReaderClass(FileFormat format) {
Review comment:
RecordReaderFactory is used to register reader classes, but it doesn't have default mappings from FileFormat to RecordReader class. Do you think we should move this default-mapping method into the factory class?
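For illustration, a factory-side default mapping could look roughly like the sketch below. The class and method names here (`RecordReaderDefaults`, `getDefaultRecordReaderClass`) are hypothetical, and the sketch keys on plain strings to stay self-contained; the real code would key on `org.apache.pinot.spi.data.readers.FileFormat` and live in `RecordReaderFactory`:

```java
import java.util.Map;

// Hypothetical sketch of a default FileFormat -> RecordReader class mapping
// that could live in RecordReaderFactory instead of ImportDataCommand.
public class RecordReaderDefaults {
  // Reader class names mirror the Pinot input-format plugins; verify against
  // the actual plugin modules before relying on them.
  private static final Map<String, String> DEFAULT_READERS = Map.of(
      "CSV", "org.apache.pinot.plugin.inputformat.csv.CSVRecordReader",
      "JSON", "org.apache.pinot.plugin.inputformat.json.JSONRecordReader",
      "AVRO", "org.apache.pinot.plugin.inputformat.avro.AvroRecordReader",
      "PARQUET", "org.apache.pinot.plugin.inputformat.parquet.ParquetRecordReader",
      "ORC", "org.apache.pinot.plugin.inputformat.orc.ORCRecordReader",
      "THRIFT", "org.apache.pinot.plugin.inputformat.thrift.ThriftRecordReader");

  public static String getDefaultRecordReaderClass(String format) {
    String className = DEFAULT_READERS.get(format);
    if (className == null) {
      // Mirrors the command's current behavior for unsupported formats.
      throw new IllegalArgumentException("Unsupported file format - " + format);
    }
    return className;
  }

  public static void main(String[] args) {
    System.out.println(getDefaultRecordReaderClass("CSV"));
  }
}
```

Centralizing the mapping would let other callers (not just ImportDataCommand) resolve a sensible default reader, while still allowing an explicit class name to override it.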
[GitHub] [incubator-pinot] codecov-io edited a comment on pull request #6396: Adding ImportData sub command in pinot admin
Posted by GitBox <gi...@apache.org>.
codecov-io edited a comment on pull request #6396:
URL: https://github.com/apache/incubator-pinot/pull/6396#issuecomment-752361098
# [Codecov](https://codecov.io/gh/apache/incubator-pinot/pull/6396?src=pr&el=h1) Report
> Merging [#6396](https://codecov.io/gh/apache/incubator-pinot/pull/6396?src=pr&el=desc) (79feffc) into [master](https://codecov.io/gh/apache/incubator-pinot/commit/1beaab59b73f26c4e35f3b9bc856b03806cddf5a?el=desc) (1beaab5) will **increase** coverage by `6.94%`.
> The diff coverage is `73.06%`.
[![Impacted file tree graph](https://codecov.io/gh/apache/incubator-pinot/pull/6396/graphs/tree.svg?width=650&height=150&src=pr&token=4ibza2ugkz)](https://codecov.io/gh/apache/incubator-pinot/pull/6396?src=pr&el=tree)
```diff
@@             Coverage Diff              @@
##             master     #6396      +/-   ##
=============================================
+ Coverage     66.44%    73.39%    +6.94%
=============================================
  Files          1075      1318     +243
  Lines         54773     64169    +9396
  Branches       8168      9334    +1166
=============================================
+ Hits          36396     47094   +10698
+ Misses        15700     14018    -1682
- Partials       2677      3057     +380
```
| Flag | Coverage Δ | |
|---|---|---|
| integration | `44.37% <38.60%> (?)` | |
| unittests | `64.95% <56.80%> (?)` | |
Flags with carried forward coverage won't be shown. [Click here](https://docs.codecov.io/docs/carryforward-flags#carryforward-flags-in-the-pull-request-comment) to find out more.
| [Impacted Files](https://codecov.io/gh/apache/incubator-pinot/pull/6396?src=pr&el=tree) | Coverage Δ | |
|---|---|---|
| [...ot/broker/broker/AllowAllAccessControlFactory.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtYnJva2VyL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9icm9rZXIvYnJva2VyL0FsbG93QWxsQWNjZXNzQ29udHJvbEZhY3RvcnkuamF2YQ==) | `100.00% <ø> (ø)` | |
| [.../helix/BrokerUserDefinedMessageHandlerFactory.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtYnJva2VyL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9icm9rZXIvYnJva2VyL2hlbGl4L0Jyb2tlclVzZXJEZWZpbmVkTWVzc2FnZUhhbmRsZXJGYWN0b3J5LmphdmE=) | `52.83% <0.00%> (-13.84%)` | :arrow_down: |
| [...ker/routing/instanceselector/InstanceSelector.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtYnJva2VyL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9icm9rZXIvcm91dGluZy9pbnN0YW5jZXNlbGVjdG9yL0luc3RhbmNlU2VsZWN0b3IuamF2YQ==) | `100.00% <ø> (ø)` | |
| [.../main/java/org/apache/pinot/client/Connection.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtY2xpZW50cy9waW5vdC1qYXZhLWNsaWVudC9zcmMvbWFpbi9qYXZhL29yZy9hcGFjaGUvcGlub3QvY2xpZW50L0Nvbm5lY3Rpb24uamF2YQ==) | `44.44% <0.00%> (-4.40%)` | :arrow_down: |
| [...not/common/assignment/InstancePartitionsUtils.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtY29tbW9uL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9jb21tb24vYXNzaWdubWVudC9JbnN0YW5jZVBhcnRpdGlvbnNVdGlscy5qYXZh) | `78.57% <ø> (+5.40%)` | :arrow_up: |
| [...common/config/tuner/NoOpTableTableConfigTuner.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtY29tbW9uL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9jb21tb24vY29uZmlnL3R1bmVyL05vT3BUYWJsZVRhYmxlQ29uZmlnVHVuZXIuamF2YQ==) | `100.00% <ø> (ø)` | |
| [...ot/common/config/tuner/RealTimeAutoIndexTuner.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtY29tbW9uL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9jb21tb24vY29uZmlnL3R1bmVyL1JlYWxUaW1lQXV0b0luZGV4VHVuZXIuamF2YQ==) | `100.00% <ø> (ø)` | |
| [.../common/config/tuner/TableConfigTunerRegistry.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtY29tbW9uL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9jb21tb24vY29uZmlnL3R1bmVyL1RhYmxlQ29uZmlnVHVuZXJSZWdpc3RyeS5qYXZh) | `72.00% <ø> (ø)` | |
| [.../apache/pinot/common/exception/QueryException.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtY29tbW9uL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9jb21tb24vZXhjZXB0aW9uL1F1ZXJ5RXhjZXB0aW9uLmphdmE=) | `90.27% <ø> (+5.55%)` | :arrow_up: |
| [...pinot/common/function/AggregationFunctionType.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtY29tbW9uL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9jb21tb24vZnVuY3Rpb24vQWdncmVnYXRpb25GdW5jdGlvblR5cGUuamF2YQ==) | `100.00% <ø> (ø)` | |
| ... and [1107 more](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree-more) | |
------
[Continue to review full report at Codecov](https://codecov.io/gh/apache/incubator-pinot/pull/6396?src=pr&el=continue).
> **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)
> `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
> Powered by [Codecov](https://codecov.io/gh/apache/incubator-pinot/pull/6396?src=pr&el=footer). Last update [7e0398b...79feffc](https://codecov.io/gh/apache/incubator-pinot/pull/6396?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
[GitHub] [incubator-pinot] fx19880617 commented on a change in pull request #6396: Adding ImportData sub command in pinot admin
Posted by GitBox <gi...@apache.org>.
fx19880617 commented on a change in pull request #6396:
URL: https://github.com/apache/incubator-pinot/pull/6396#discussion_r553065067
##########
File path: pinot-tools/src/main/java/org/apache/pinot/tools/admin/command/ImportDataCommand.java
##########
@@ -0,0 +1,340 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements. See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership. The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License. You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied. See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+package org.apache.pinot.tools.admin.command;
+
+import com.google.common.base.Preconditions;
+import com.google.common.collect.ImmutableMap;
+import java.io.File;
+import java.io.IOException;
+import java.net.URI;
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+import org.apache.commons.codec.digest.DigestUtils;
+import org.apache.commons.io.FileUtils;
+import org.apache.pinot.controller.helix.ControllerRequestURLBuilder;
+import org.apache.pinot.spi.data.readers.FileFormat;
+import org.apache.pinot.spi.ingestion.batch.BatchConfigProperties;
+import org.apache.pinot.spi.ingestion.batch.IngestionJobLauncher;
+import org.apache.pinot.spi.ingestion.batch.spec.ExecutionFrameworkSpec;
+import org.apache.pinot.spi.ingestion.batch.spec.PinotClusterSpec;
+import org.apache.pinot.spi.ingestion.batch.spec.PinotFSSpec;
+import org.apache.pinot.spi.ingestion.batch.spec.PushJobSpec;
+import org.apache.pinot.spi.ingestion.batch.spec.RecordReaderSpec;
+import org.apache.pinot.spi.ingestion.batch.spec.SegmentGenerationJobSpec;
+import org.apache.pinot.spi.ingestion.batch.spec.SegmentNameGeneratorSpec;
+import org.apache.pinot.spi.ingestion.batch.spec.TableSpec;
+import org.apache.pinot.spi.utils.IngestionConfigUtils;
+import org.apache.pinot.tools.Command;
+import org.kohsuke.args4j.Option;
+import org.kohsuke.args4j.spi.StringArrayOptionHandler;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+
+/**
+ * Class to implement ImportData command.
+ */
+@SuppressWarnings("unused")
+public class ImportDataCommand extends AbstractBaseAdminCommand implements Command {
+ private static final Logger LOGGER = LoggerFactory.getLogger(ImportDataCommand.class);
+ private static final String SEGMENT_NAME = "segment.name";
+
+ @Option(name = "-dataFilePath", required = true, metaVar = "<string>", usage = "data file path.")
+ private String _dataFilePath;
+
+ @Option(name = "-format", required = true, metaVar = "<AVRO/CSV/JSON/THRIFT/PARQUET/ORC>", usage = "Input data format.")
+ private FileFormat _format;
+
+ @Option(name = "-table", required = true, metaVar = "<string>", usage = "Table name.")
+ private String _table;
+
+ @Option(name = "-controllerURI", metaVar = "<string>", usage = "Pinot Controller URI.")
+ private String _controllerURI = "http://localhost:9000";
+
+ @Option(name = "-tempDir", metaVar = "<string>", usage = "Temporary directory used to hold data during segment creation.")
+ private String _tempDir = new File(FileUtils.getTempDirectory(), getClass().getSimpleName()).getAbsolutePath();
+
+ @Option(name = "-extraConfigs", metaVar = "<extra configs>", handler = StringArrayOptionHandler.class, usage = "Extra configs to be set.")
+ private List<String> _extraConfigs;
+
+ @SuppressWarnings("FieldCanBeLocal")
+ @Option(name = "-help", help = true, aliases = {"-h", "--h", "--help"}, usage = "Print this message.")
+ private boolean _help = false;
+
+ public ImportDataCommand setDataFilePath(String dataFilePath) {
+ _dataFilePath = dataFilePath;
+ return this;
+ }
+
+ public ImportDataCommand setFormat(FileFormat format) {
+ _format = format;
+ return this;
+ }
+
+ public ImportDataCommand setTable(String table) {
+ _table = table;
+ return this;
+ }
+
+ public ImportDataCommand setControllerURI(String controllerURI) {
+ _controllerURI = controllerURI;
+ return this;
+ }
+
+ public ImportDataCommand setTempDir(String tempDir) {
+ _tempDir = tempDir;
+ return this;
+ }
+
+ public List<String> getExtraConfigs() {
+ return _extraConfigs;
+ }
+
+ public ImportDataCommand setExtraConfigs(List<String> extraConfigs) {
+ _extraConfigs = extraConfigs;
+ return this;
+ }
+
+ public String getDataFilePath() {
+ return _dataFilePath;
+ }
+
+ public FileFormat getFormat() {
+ return _format;
+ }
+
+ public String getTable() {
+ return _table;
+ }
+
+ public String getControllerURI() {
+ return _controllerURI;
+ }
+
+ public String getTempDir() {
+ return _tempDir;
+ }
+
+ @Override
+ public String toString() {
+ String results = String
+ .format("InsertData -dataFilePath %s -format %s -table %s -controllerURI %s -tempDir %s", _dataFilePath,
+ _format, _table, _controllerURI, _tempDir);
+ if (_extraConfigs != null) {
+ results += " -extraConfigs " + Arrays.toString(_extraConfigs.toArray());
+ }
+ return results;
+ }
+
+ @Override
+ public final String getName() {
+ return "InsertData";
+ }
+
+ @Override
+ public String description() {
+ return "Insert data into Pinot cluster.";
+ }
+
+ @Override
+ public boolean getHelp() {
+ return _help;
+ }
+
+ @Override
+ public boolean execute()
+ throws IOException {
+ LOGGER.info("Executing command: {}", toString());
+ Preconditions.checkArgument(_table != null, "'table' must be specified");
+ Preconditions.checkArgument(_format != null, "'format' must be specified");
+ Preconditions.checkArgument(_dataFilePath != null, "'dataFilePath' must be specified");
+
+ try {
+ URI dataFileURI = URI.create(_dataFilePath);
+ if (dataFileURI.getScheme() == null) {
+ File dataFile = new File(_dataFilePath);
+ Preconditions.checkArgument(dataFile.exists(), "'dataFile': '%s' doesn't exist", dataFile);
+ LOGGER.info("Found data file: {} of format: {}", dataFile, _format);
+ }
+ initTempDir();
+ IngestionJobLauncher.runIngestionJob(generateSegmentGenerationJobSpec());
+ LOGGER.info("Successfully loaded data from {} into Pinot.", _dataFilePath);
+ return true;
+ } finally {
+ FileUtils.deleteQuietly(new File(_tempDir));
+ }
+ }
+
+ private void initTempDir()
+ throws IOException {
+ File tempDir = new File(_tempDir);
+ if (tempDir.exists()) {
+ LOGGER.info("Deleting the existing 'tempDir': {}", tempDir);
+ FileUtils.forceDelete(tempDir);
+ }
+ FileUtils.forceMkdir(tempDir);
+ }
+
+ private SegmentGenerationJobSpec generateSegmentGenerationJobSpec() {
+ final Map<String, String> extraConfigs = getExtraConfigs(_extraConfigs);
+
+ SegmentGenerationJobSpec spec = new SegmentGenerationJobSpec();
+ URI dataFileURI = URI.create(_dataFilePath);
+ URI parent = dataFileURI.getPath().endsWith("/") ? dataFileURI.resolve("..") : dataFileURI.resolve(".");
+ spec.setInputDirURI(parent.toString());
+ spec.setIncludeFileNamePattern("glob:**" + dataFileURI.getPath());
+ spec.setOutputDirURI(_tempDir);
+ spec.setCleanUpOutputDir(true);
+ spec.setOverwriteOutput(true);
+ spec.setJobType("SegmentCreationAndTarPush");
+
+ // set ExecutionFrameworkSpec
+ ExecutionFrameworkSpec executionFrameworkSpec = new ExecutionFrameworkSpec();
+ executionFrameworkSpec.setName("standalone");
+ executionFrameworkSpec.setSegmentGenerationJobRunnerClassName(
+ "org.apache.pinot.plugin.ingestion.batch.standalone.SegmentGenerationJobRunner");
+ executionFrameworkSpec.setSegmentTarPushJobRunnerClassName(
+ "org.apache.pinot.plugin.ingestion.batch.standalone.SegmentTarPushJobRunner");
+ spec.setExecutionFrameworkSpec(executionFrameworkSpec);
+
+ // set PinotFSSpecs
+ List<PinotFSSpec> pinotFSSpecs = new ArrayList<>();
+ pinotFSSpecs.add(getPinotFSSpec("file", "org.apache.pinot.spi.filesystem.LocalPinotFS", Collections.emptyMap()));
+ pinotFSSpecs
+ .add(getPinotFSSpec("s3", "org.apache.pinot.plugin.filesystem.S3PinotFS", getS3PinotFSConfigs(extraConfigs)));
+ spec.setPinotFSSpecs(pinotFSSpecs);
+
+ // set RecordReaderSpec
+ RecordReaderSpec recordReaderSpec = new RecordReaderSpec();
+ recordReaderSpec.setDataFormat(_format.name());
+ recordReaderSpec.setClassName(getRecordReaderClass(_format));
+ recordReaderSpec.setConfigClassName(getRecordReaderConfigClass(_format));
+ recordReaderSpec.setConfigs(IngestionConfigUtils.getRecordReaderProps(extraConfigs));
+ spec.setRecordReaderSpec(recordReaderSpec);
+
+ // set TableSpec
+ TableSpec tableSpec = new TableSpec();
+ tableSpec.setTableName(_table);
+ tableSpec.setSchemaURI(ControllerRequestURLBuilder.baseUrl(_controllerURI).forTableSchemaGet(_table));
+ tableSpec.setTableConfigURI(ControllerRequestURLBuilder.baseUrl(_controllerURI).forTableGet(_table));
+ spec.setTableSpec(tableSpec);
+
+ // set SegmentNameGeneratorSpec
+ SegmentNameGeneratorSpec segmentNameGeneratorSpec = new SegmentNameGeneratorSpec();
+ segmentNameGeneratorSpec
+ .setType(org.apache.pinot.spi.ingestion.batch.BatchConfigProperties.SegmentNameGeneratorType.FIXED);
Review comment:
Added an optional param
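The way `generateSegmentGenerationJobSpec()` above derives the job's `inputDirURI` and `includeFileNamePattern` from a single `-dataFilePath` can be sketched with plain JDK classes. This is a minimal illustration only: the class and method names below (`DataPathSpecDemo`, `inputDirOf`, `patternMatches`) are not part of the PR.

```java
import java.net.URI;
import java.nio.file.FileSystems;
import java.nio.file.PathMatcher;
import java.nio.file.Paths;

public class DataPathSpecDemo {

    // Containing directory of the data file, as in the command:
    // resolve(".") strips the last path segment of a file URI;
    // resolve("..") steps up from a directory URI (trailing slash).
    static String inputDirOf(String dataFilePath) {
        URI uri = URI.create(dataFilePath);
        URI parent = uri.getPath().endsWith("/") ? uri.resolve("..") : uri.resolve(".");
        return parent.toString();
    }

    // The include pattern "glob:**" + path then selects exactly that file
    // underneath the input directory ("**" crosses directory boundaries).
    static boolean patternMatches(String dataFilePath, String candidate) {
        String path = URI.create(dataFilePath).getPath();
        PathMatcher matcher = FileSystems.getDefault().getPathMatcher("glob:**" + path);
        return matcher.matches(Paths.get(candidate));
    }

    public static void main(String[] args) {
        System.out.println(inputDirOf("s3://bucket/path/to/data.gz.parquet")); // s3://bucket/path/to/
        System.out.println(patternMatches("/data/input/my-local-data.gz.parquet",
            "/data/input/my-local-data.gz.parquet")); // true
    }
}
```

The same two-step derivation works for both local paths and `s3://` URIs, which is why the command only needs the single `-dataFilePath` argument from the usage examples.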
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
users@infra.apache.org
---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscribe@pinot.apache.org
For additional commands, e-mail: commits-help@pinot.apache.org
[GitHub] [incubator-pinot] codecov-io edited a comment on pull request #6396: Adding ImportData sub command in pinot admin
Posted by GitBox <gi...@apache.org>.
codecov-io edited a comment on pull request #6396:
URL: https://github.com/apache/incubator-pinot/pull/6396#issuecomment-752361098
# [Codecov](https://codecov.io/gh/apache/incubator-pinot/pull/6396?src=pr&el=h1) Report
> Merging [#6396](https://codecov.io/gh/apache/incubator-pinot/pull/6396?src=pr&el=desc) (f924f2f) into [master](https://codecov.io/gh/apache/incubator-pinot/commit/1beaab59b73f26c4e35f3b9bc856b03806cddf5a?el=desc) (1beaab5) will **decrease** coverage by `1.45%`.
> The diff coverage is `56.80%`.
[![Impacted file tree graph](https://codecov.io/gh/apache/incubator-pinot/pull/6396/graphs/tree.svg?width=650&height=150&src=pr&token=4ibza2ugkz)](https://codecov.io/gh/apache/incubator-pinot/pull/6396?src=pr&el=tree)
```diff
@@             Coverage Diff              @@
##             master    #6396      +/-   ##
==========================================
- Coverage     66.44%   64.99%   -1.46%
==========================================
  Files         1075     1318     +243
  Lines        54773    64106    +9333
  Branches      8168     9329    +1161
==========================================
+ Hits         36396    41664    +5268
- Misses       15700    19476    +3776
- Partials      2677     2966     +289
```
| Flag | Coverage Δ | |
|---|---|---|
| unittests | `64.99% <56.80%> (?)` | |
Flags with carried forward coverage won't be shown. [Click here](https://docs.codecov.io/docs/carryforward-flags#carryforward-flags-in-the-pull-request-comment) to find out more.
| [Impacted Files](https://codecov.io/gh/apache/incubator-pinot/pull/6396?src=pr&el=tree) | Coverage Δ | |
|---|---|---|
| [...e/pinot/broker/api/resources/PinotBrokerDebug.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtYnJva2VyL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9icm9rZXIvYXBpL3Jlc291cmNlcy9QaW5vdEJyb2tlckRlYnVnLmphdmE=) | `0.00% <0.00%> (-79.32%)` | :arrow_down: |
| [...ot/broker/broker/AllowAllAccessControlFactory.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtYnJva2VyL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9icm9rZXIvYnJva2VyL0FsbG93QWxsQWNjZXNzQ29udHJvbEZhY3RvcnkuamF2YQ==) | `71.42% <ø> (-28.58%)` | :arrow_down: |
| [.../helix/BrokerUserDefinedMessageHandlerFactory.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtYnJva2VyL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9icm9rZXIvYnJva2VyL2hlbGl4L0Jyb2tlclVzZXJEZWZpbmVkTWVzc2FnZUhhbmRsZXJGYWN0b3J5LmphdmE=) | `33.96% <0.00%> (-32.71%)` | :arrow_down: |
| [...ker/routing/instanceselector/InstanceSelector.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtYnJva2VyL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9icm9rZXIvcm91dGluZy9pbnN0YW5jZXNlbGVjdG9yL0luc3RhbmNlU2VsZWN0b3IuamF2YQ==) | `100.00% <ø> (ø)` | |
| [...ava/org/apache/pinot/client/AbstractResultSet.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtY2xpZW50cy9waW5vdC1qYXZhLWNsaWVudC9zcmMvbWFpbi9qYXZhL29yZy9hcGFjaGUvcGlub3QvY2xpZW50L0Fic3RyYWN0UmVzdWx0U2V0LmphdmE=) | `66.66% <0.00%> (+9.52%)` | :arrow_up: |
| [.../main/java/org/apache/pinot/client/Connection.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtY2xpZW50cy9waW5vdC1qYXZhLWNsaWVudC9zcmMvbWFpbi9qYXZhL29yZy9hcGFjaGUvcGlub3QvY2xpZW50L0Nvbm5lY3Rpb24uamF2YQ==) | `35.55% <0.00%> (-13.29%)` | :arrow_down: |
| [...inot/client/JsonAsyncHttpPinotClientTransport.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtY2xpZW50cy9waW5vdC1qYXZhLWNsaWVudC9zcmMvbWFpbi9qYXZhL29yZy9hcGFjaGUvcGlub3QvY2xpZW50L0pzb25Bc3luY0h0dHBQaW5vdENsaWVudFRyYW5zcG9ydC5qYXZh) | `10.90% <0.00%> (-51.10%)` | :arrow_down: |
| [...not/common/assignment/InstancePartitionsUtils.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtY29tbW9uL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9jb21tb24vYXNzaWdubWVudC9JbnN0YW5jZVBhcnRpdGlvbnNVdGlscy5qYXZh) | `73.80% <ø> (+0.63%)` | :arrow_up: |
| [...common/config/tuner/NoOpTableTableConfigTuner.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtY29tbW9uL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9jb21tb24vY29uZmlnL3R1bmVyL05vT3BUYWJsZVRhYmxlQ29uZmlnVHVuZXIuamF2YQ==) | `100.00% <ø> (ø)` | |
| [...ot/common/config/tuner/RealTimeAutoIndexTuner.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtY29tbW9uL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9jb21tb24vY29uZmlnL3R1bmVyL1JlYWxUaW1lQXV0b0luZGV4VHVuZXIuamF2YQ==) | `100.00% <ø> (ø)` | |
| ... and [1148 more](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree-more) | |
------
[Continue to review full report at Codecov](https://codecov.io/gh/apache/incubator-pinot/pull/6396?src=pr&el=continue).
> **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)
> `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
> Powered by [Codecov](https://codecov.io/gh/apache/incubator-pinot/pull/6396?src=pr&el=footer). Last update [37f2e28...f924f2f](https://codecov.io/gh/apache/incubator-pinot/pull/6396?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
[GitHub] [incubator-pinot] fx19880617 commented on a change in pull request #6396: Adding ImportData sub command in pinot admin
Posted by GitBox <gi...@apache.org>.
fx19880617 commented on a change in pull request #6396:
URL: https://github.com/apache/incubator-pinot/pull/6396#discussion_r552977306
##########
File path: pinot-controller/src/main/java/org/apache/pinot/controller/helix/ControllerRequestURLBuilder.java
##########
@@ -211,6 +211,9 @@ public String forTableView(String tableName, String view, @Nullable String table
}
return url;
}
+ public String forTableSchemaGet(String tableName) {
+ return StringUtil.join("/", _baseUrl, "tables", tableName, "schema");
Review comment:
I feel it should be `/` since it's constructing a URL, not a filesystem path.
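The point of the review comment — URL segments are always joined with `/`, independent of the host platform's file separator — can be shown with a plain `String.join`. This is an illustrative sketch only; `UrlBuilderDemo` is not part of the PR, and the real code uses Pinot's `StringUtil.join`.

```java
public class UrlBuilderDemo {
    // Joins URL segments with '/', mirroring
    // StringUtil.join("/", _baseUrl, "tables", tableName, "schema").
    static String forTableSchemaGet(String baseUrl, String tableName) {
        return String.join("/", baseUrl, "tables", tableName, "schema");
    }

    public static void main(String[] args) {
        System.out.println(forTableSchemaGet("http://localhost:9000", "myTable"));
        // http://localhost:9000/tables/myTable/schema
    }
}
```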
[GitHub] [incubator-pinot] fx19880617 merged pull request #6396: Adding ImportData sub command in pinot admin
Posted by GitBox <gi...@apache.org>.
fx19880617 merged pull request #6396:
URL: https://github.com/apache/incubator-pinot/pull/6396
[GitHub] [incubator-pinot] codecov-io edited a comment on pull request #6396: Adding ImportData sub command in pinot admin
Posted by GitBox <gi...@apache.org>.
codecov-io edited a comment on pull request #6396:
URL: https://github.com/apache/incubator-pinot/pull/6396#issuecomment-752361098
# [Codecov](https://codecov.io/gh/apache/incubator-pinot/pull/6396?src=pr&el=h1) Report
> Merging [#6396](https://codecov.io/gh/apache/incubator-pinot/pull/6396?src=pr&el=desc) (6905653) into [master](https://codecov.io/gh/apache/incubator-pinot/commit/1beaab59b73f26c4e35f3b9bc856b03806cddf5a?el=desc) (1beaab5) will **decrease** coverage by `21.99%`.
> The diff coverage is `38.69%`.
[![Impacted file tree graph](https://codecov.io/gh/apache/incubator-pinot/pull/6396/graphs/tree.svg?width=650&height=150&src=pr&token=4ibza2ugkz)](https://codecov.io/gh/apache/incubator-pinot/pull/6396?src=pr&el=tree)
```diff
@@              Coverage Diff              @@
##             master    #6396       +/-   ##
===========================================
- Coverage     66.44%   44.45%   -22.00%
===========================================
  Files         1075     1314      +239
  Lines        54773    63652     +8879
  Branches      8168     9258     +1090
===========================================
- Hits         36396    28297     -8099
- Misses       15700    32999    +17299
+ Partials      2677     2356      -321
```
| Flag | Coverage Δ | |
|---|---|---|
| integration | `44.45% <38.69%> (?)` | |
Flags with carried forward coverage won't be shown. [Click here](https://docs.codecov.io/docs/carryforward-flags#carryforward-flags-in-the-pull-request-comment) to find out more.
| [Impacted Files](https://codecov.io/gh/apache/incubator-pinot/pull/6396?src=pr&el=tree) | Coverage Δ | |
|---|---|---|
| [...ot/broker/broker/AllowAllAccessControlFactory.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtYnJva2VyL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9icm9rZXIvYnJva2VyL0FsbG93QWxsQWNjZXNzQ29udHJvbEZhY3RvcnkuamF2YQ==) | `100.00% <ø> (ø)` | |
| [.../helix/BrokerUserDefinedMessageHandlerFactory.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtYnJva2VyL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9icm9rZXIvYnJva2VyL2hlbGl4L0Jyb2tlclVzZXJEZWZpbmVkTWVzc2FnZUhhbmRsZXJGYWN0b3J5LmphdmE=) | `52.83% <0.00%> (-13.84%)` | :arrow_down: |
| [...org/apache/pinot/broker/queryquota/HitCounter.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtYnJva2VyL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9icm9rZXIvcXVlcnlxdW90YS9IaXRDb3VudGVyLmphdmE=) | `0.00% <0.00%> (-100.00%)` | :arrow_down: |
| [...che/pinot/broker/queryquota/MaxHitRateTracker.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtYnJva2VyL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9icm9rZXIvcXVlcnlxdW90YS9NYXhIaXRSYXRlVHJhY2tlci5qYXZh) | `0.00% <0.00%> (ø)` | |
| [...ache/pinot/broker/queryquota/QueryQuotaEntity.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtYnJva2VyL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9icm9rZXIvcXVlcnlxdW90YS9RdWVyeVF1b3RhRW50aXR5LmphdmE=) | `0.00% <0.00%> (-50.00%)` | :arrow_down: |
| [...ker/routing/instanceselector/InstanceSelector.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtYnJva2VyL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9icm9rZXIvcm91dGluZy9pbnN0YW5jZXNlbGVjdG9yL0luc3RhbmNlU2VsZWN0b3IuamF2YQ==) | `100.00% <ø> (ø)` | |
| [...ceselector/StrictReplicaGroupInstanceSelector.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtYnJva2VyL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9icm9rZXIvcm91dGluZy9pbnN0YW5jZXNlbGVjdG9yL1N0cmljdFJlcGxpY2FHcm91cEluc3RhbmNlU2VsZWN0b3IuamF2YQ==) | `0.00% <0.00%> (ø)` | |
| [...roker/routing/segmentpruner/TimeSegmentPruner.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtYnJva2VyL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9icm9rZXIvcm91dGluZy9zZWdtZW50cHJ1bmVyL1RpbWVTZWdtZW50UHJ1bmVyLmphdmE=) | `0.00% <0.00%> (ø)` | |
| [...roker/routing/segmentpruner/interval/Interval.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtYnJva2VyL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9icm9rZXIvcm91dGluZy9zZWdtZW50cHJ1bmVyL2ludGVydmFsL0ludGVydmFsLmphdmE=) | `0.00% <0.00%> (ø)` | |
| [...r/routing/segmentpruner/interval/IntervalTree.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtYnJva2VyL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9icm9rZXIvcm91dGluZy9zZWdtZW50cHJ1bmVyL2ludGVydmFsL0ludGVydmFsVHJlZS5qYXZh) | `0.00% <0.00%> (ø)` | |
| ... and [1282 more](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree-more) | |
------
[Continue to review full report at Codecov](https://codecov.io/gh/apache/incubator-pinot/pull/6396?src=pr&el=continue).
> **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)
> `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
> Powered by [Codecov](https://codecov.io/gh/apache/incubator-pinot/pull/6396?src=pr&el=footer). Last update [6b43aef...6905653](https://codecov.io/gh/apache/incubator-pinot/pull/6396?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
[GitHub] [incubator-pinot] codecov-io edited a comment on pull request #6396: Adding ImportData sub command in pinot admin
Posted by GitBox <gi...@apache.org>.
codecov-io edited a comment on pull request #6396:
URL: https://github.com/apache/incubator-pinot/pull/6396#issuecomment-752361098
# [Codecov](https://codecov.io/gh/apache/incubator-pinot/pull/6396?src=pr&el=h1) Report
> Merging [#6396](https://codecov.io/gh/apache/incubator-pinot/pull/6396?src=pr&el=desc) (e2a0fc9) into [master](https://codecov.io/gh/apache/incubator-pinot/commit/1beaab59b73f26c4e35f3b9bc856b03806cddf5a?el=desc) (1beaab5) will **increase** coverage by `7.13%`.
> The diff coverage is `72.97%`.
[![Impacted file tree graph](https://codecov.io/gh/apache/incubator-pinot/pull/6396/graphs/tree.svg?width=650&height=150&src=pr&token=4ibza2ugkz)](https://codecov.io/gh/apache/incubator-pinot/pull/6396?src=pr&el=tree)
```diff
@@             Coverage Diff             @@
##             master    #6396      +/-   ##
==========================================
+ Coverage     66.44%   73.58%   +7.13%
==========================================
  Files         1075     1330     +255
  Lines        54773    64929   +10156
  Branches      8168     9469    +1301
==========================================
+ Hits         36396    47775   +11379
+ Misses       15700    14033    -1667
- Partials      2677     3121     +444
```
| Flag | Coverage Δ | |
|---|---|---|
| integration | `43.95% <38.51%> (?)` | |
| unittests | `65.23% <56.80%> (?)` | |
Flags with carried forward coverage won't be shown. [Click here](https://docs.codecov.io/docs/carryforward-flags#carryforward-flags-in-the-pull-request-comment) to find out more.
| [Impacted Files](https://codecov.io/gh/apache/incubator-pinot/pull/6396?src=pr&el=tree) | Coverage Δ | |
|---|---|---|
| [...ot/broker/broker/AllowAllAccessControlFactory.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtYnJva2VyL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9icm9rZXIvYnJva2VyL0FsbG93QWxsQWNjZXNzQ29udHJvbEZhY3RvcnkuamF2YQ==) | `100.00% <ø> (ø)` | |
| [.../helix/BrokerUserDefinedMessageHandlerFactory.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtYnJva2VyL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9icm9rZXIvYnJva2VyL2hlbGl4L0Jyb2tlclVzZXJEZWZpbmVkTWVzc2FnZUhhbmRsZXJGYWN0b3J5LmphdmE=) | `52.83% <0.00%> (-13.84%)` | :arrow_down: |
| [...ker/routing/instanceselector/InstanceSelector.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtYnJva2VyL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9icm9rZXIvcm91dGluZy9pbnN0YW5jZXNlbGVjdG9yL0luc3RhbmNlU2VsZWN0b3IuamF2YQ==) | `100.00% <ø> (ø)` | |
| [.../main/java/org/apache/pinot/client/Connection.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtY2xpZW50cy9waW5vdC1qYXZhLWNsaWVudC9zcmMvbWFpbi9qYXZhL29yZy9hcGFjaGUvcGlub3QvY2xpZW50L0Nvbm5lY3Rpb24uamF2YQ==) | `44.44% <0.00%> (-4.40%)` | :arrow_down: |
| [...not/common/assignment/InstancePartitionsUtils.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtY29tbW9uL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9jb21tb24vYXNzaWdubWVudC9JbnN0YW5jZVBhcnRpdGlvbnNVdGlscy5qYXZh) | `78.57% <ø> (+5.40%)` | :arrow_up: |
| [...common/config/tuner/NoOpTableTableConfigTuner.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtY29tbW9uL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9jb21tb24vY29uZmlnL3R1bmVyL05vT3BUYWJsZVRhYmxlQ29uZmlnVHVuZXIuamF2YQ==) | `100.00% <ø> (ø)` | |
| [...ot/common/config/tuner/RealTimeAutoIndexTuner.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtY29tbW9uL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9jb21tb24vY29uZmlnL3R1bmVyL1JlYWxUaW1lQXV0b0luZGV4VHVuZXIuamF2YQ==) | `100.00% <ø> (ø)` | |
| [.../common/config/tuner/TableConfigTunerRegistry.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtY29tbW9uL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9jb21tb24vY29uZmlnL3R1bmVyL1RhYmxlQ29uZmlnVHVuZXJSZWdpc3RyeS5qYXZh) | `72.00% <ø> (ø)` | |
| [.../apache/pinot/common/exception/QueryException.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtY29tbW9uL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9jb21tb24vZXhjZXB0aW9uL1F1ZXJ5RXhjZXB0aW9uLmphdmE=) | `90.27% <ø> (+5.55%)` | :arrow_up: |
| [...pinot/common/function/AggregationFunctionType.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtY29tbW9uL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9jb21tb24vZnVuY3Rpb24vQWdncmVnYXRpb25GdW5jdGlvblR5cGUuamF2YQ==) | `100.00% <ø> (ø)` | |
| ... and [1121 more](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree-more) | |
------
[Continue to review full report at Codecov](https://codecov.io/gh/apache/incubator-pinot/pull/6396?src=pr&el=continue).
> **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)
> `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
> Powered by [Codecov](https://codecov.io/gh/apache/incubator-pinot/pull/6396?src=pr&el=footer). Last update [950295a...e2a0fc9](https://codecov.io/gh/apache/incubator-pinot/pull/6396?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
[GitHub] [incubator-pinot] fx19880617 commented on a change in pull request #6396: Adding ImportData sub command in pinot admin
Posted by GitBox <gi...@apache.org>.
fx19880617 commented on a change in pull request #6396:
URL: https://github.com/apache/incubator-pinot/pull/6396#discussion_r553065273
##########
File path: pinot-tools/src/main/java/org/apache/pinot/tools/admin/command/ImportDataCommand.java
##########
@@ -0,0 +1,340 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements. See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership. The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License. You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied. See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+package org.apache.pinot.tools.admin.command;
+
+import com.google.common.base.Preconditions;
+import com.google.common.collect.ImmutableMap;
+import java.io.File;
+import java.io.IOException;
+import java.net.URI;
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+import org.apache.commons.codec.digest.DigestUtils;
+import org.apache.commons.io.FileUtils;
+import org.apache.pinot.controller.helix.ControllerRequestURLBuilder;
+import org.apache.pinot.spi.data.readers.FileFormat;
+import org.apache.pinot.spi.ingestion.batch.BatchConfigProperties;
+import org.apache.pinot.spi.ingestion.batch.IngestionJobLauncher;
+import org.apache.pinot.spi.ingestion.batch.spec.ExecutionFrameworkSpec;
+import org.apache.pinot.spi.ingestion.batch.spec.PinotClusterSpec;
+import org.apache.pinot.spi.ingestion.batch.spec.PinotFSSpec;
+import org.apache.pinot.spi.ingestion.batch.spec.PushJobSpec;
+import org.apache.pinot.spi.ingestion.batch.spec.RecordReaderSpec;
+import org.apache.pinot.spi.ingestion.batch.spec.SegmentGenerationJobSpec;
+import org.apache.pinot.spi.ingestion.batch.spec.SegmentNameGeneratorSpec;
+import org.apache.pinot.spi.ingestion.batch.spec.TableSpec;
+import org.apache.pinot.spi.utils.IngestionConfigUtils;
+import org.apache.pinot.tools.Command;
+import org.kohsuke.args4j.Option;
+import org.kohsuke.args4j.spi.StringArrayOptionHandler;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+
+/**
+ * Class to implement ImportData command.
+ */
+@SuppressWarnings("unused")
+public class ImportDataCommand extends AbstractBaseAdminCommand implements Command {
+ private static final Logger LOGGER = LoggerFactory.getLogger(ImportDataCommand.class);
+ private static final String SEGMENT_NAME = "segment.name";
+
+ @Option(name = "-dataFilePath", required = true, metaVar = "<string>", usage = "data file path.")
+ private String _dataFilePath;
+
+ @Option(name = "-format", required = true, metaVar = "<AVRO/CSV/JSON/THRIFT/PARQUET/ORC>", usage = "Input data format.")
+ private FileFormat _format;
+
+ @Option(name = "-table", required = true, metaVar = "<string>", usage = "Table name.")
+ private String _table;
+
+ @Option(name = "-controllerURI", metaVar = "<string>", usage = "Pinot Controller URI.")
+ private String _controllerURI = "http://localhost:9000";
+
+ @Option(name = "-tempDir", metaVar = "<string>", usage = "Temporary directory used to hold data during segment creation.")
+ private String _tempDir = new File(FileUtils.getTempDirectory(), getClass().getSimpleName()).getAbsolutePath();
+
+ @Option(name = "-extraConfigs", metaVar = "<extra configs>", handler = StringArrayOptionHandler.class, usage = "Extra configs to be set.")
+ private List<String> _extraConfigs;
+
+ @SuppressWarnings("FieldCanBeLocal")
+ @Option(name = "-help", help = true, aliases = {"-h", "--h", "--help"}, usage = "Print this message.")
+ private boolean _help = false;
+
+ public ImportDataCommand setDataFilePath(String dataFilePath) {
+ _dataFilePath = dataFilePath;
+ return this;
+ }
+
+ public ImportDataCommand setFormat(FileFormat format) {
+ _format = format;
+ return this;
+ }
+
+ public ImportDataCommand setTable(String table) {
+ _table = table;
+ return this;
+ }
+
+ public ImportDataCommand setControllerURI(String controllerURI) {
+ _controllerURI = controllerURI;
+ return this;
+ }
+
+ public ImportDataCommand setTempDir(String tempDir) {
+ _tempDir = tempDir;
+ return this;
+ }
+
+ public List<String> getExtraConfigs() {
+ return _extraConfigs;
+ }
+
+ public ImportDataCommand setExtraConfigs(List<String> extraConfigs) {
+ _extraConfigs = extraConfigs;
+ return this;
+ }
+
+ public String getDataFilePath() {
+ return _dataFilePath;
+ }
+
+ public FileFormat getFormat() {
+ return _format;
+ }
+
+ public String getTable() {
+ return _table;
+ }
+
+ public String getControllerURI() {
+ return _controllerURI;
+ }
+
+ public String getTempDir() {
+ return _tempDir;
+ }
+
+ @Override
+ public String toString() {
+ String results = String
+ .format("InsertData -dataFilePath %s -format %s -table %s -controllerURI %s -tempDir %s", _dataFilePath,
+ _format, _table, _controllerURI, _tempDir);
+ if (_extraConfigs != null) {
+ results += " -extraConfigs " + Arrays.toString(_extraConfigs.toArray());
+ }
+ return results;
+ }
+
+ @Override
+ public final String getName() {
+ return "InsertData";
+ }
+
+ @Override
+ public String description() {
+ return "Insert data into Pinot cluster.";
+ }
+
+ @Override
+ public boolean getHelp() {
+ return _help;
+ }
+
+ @Override
+ public boolean execute()
+ throws IOException {
+ LOGGER.info("Executing command: {}", toString());
+ Preconditions.checkArgument(_table != null, "'table' must be specified");
+ Preconditions.checkArgument(_format != null, "'format' must be specified");
+ Preconditions.checkArgument(_dataFilePath != null, "'dataFilePath' must be specified");
+
+ try {
+
+ URI dataFileURI = URI.create(_dataFilePath);
+ if (dataFileURI.getScheme() == null) {
+ File dataFile = new File(_dataFilePath);
+ Preconditions.checkArgument(dataFile.exists(), "'dataFile': '%s' doesn't exist", dataFile);
+ LOGGER.info("Found data file: {} of format: {}", dataFile, _format);
+ }
+
+ initTempDir();
+ IngestionJobLauncher.runIngestionJob(generateSegmentGenerationJobSpec());
+ LOGGER.info("Successfully loaded data from {} into Pinot.", _dataFilePath);
+ return true;
+ } finally {
+ FileUtils.deleteQuietly(new File(_tempDir));
+ }
+ }
+
+ private void initTempDir()
+ throws IOException {
+ File tempDir = new File(_tempDir);
+ if (tempDir.exists()) {
+ LOGGER.info("Deleting the existing 'tempDir': {}", tempDir);
+ FileUtils.forceDelete(tempDir);
+ }
+ FileUtils.forceMkdir(tempDir);
+ }
+
+ private SegmentGenerationJobSpec generateSegmentGenerationJobSpec() {
+ final Map<String, String> extraConfigs = getExtraConfigs(_extraConfigs);
+
+ SegmentGenerationJobSpec spec = new SegmentGenerationJobSpec();
+ URI dataFileURI = URI.create(_dataFilePath);
+ URI parent = dataFileURI.getPath().endsWith("/") ? dataFileURI.resolve("..") : dataFileURI.resolve(".");
+ spec.setInputDirURI(parent.toString());
+ spec.setIncludeFileNamePattern("glob:**" + dataFileURI.getPath());
+ spec.setOutputDirURI(_tempDir);
+ spec.setCleanUpOutputDir(true);
+ spec.setOverwriteOutput(true);
+ spec.setJobType("SegmentCreationAndTarPush");
+
+ // set ExecutionFrameworkSpec
+ ExecutionFrameworkSpec executionFrameworkSpec = new ExecutionFrameworkSpec();
+ executionFrameworkSpec.setName("standalone");
+ executionFrameworkSpec.setSegmentGenerationJobRunnerClassName(
+ "org.apache.pinot.plugin.ingestion.batch.standalone.SegmentGenerationJobRunner");
+ executionFrameworkSpec.setSegmentTarPushJobRunnerClassName(
+ "org.apache.pinot.plugin.ingestion.batch.standalone.SegmentTarPushJobRunner");
+ spec.setExecutionFrameworkSpec(executionFrameworkSpec);
+
+ // set PinotFSSpecs
+ List<PinotFSSpec> pinotFSSpecs = new ArrayList<>();
+ pinotFSSpecs.add(getPinotFSSpec("file", "org.apache.pinot.spi.filesystem.LocalPinotFS", Collections.emptyMap()));
+ pinotFSSpecs
+ .add(getPinotFSSpec("s3", "org.apache.pinot.plugin.filesystem.S3PinotFS", getS3PinotFSConfigs(extraConfigs)));
Review comment:
I want to infer it; users can set extra configs using `-extraConfigs`.
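For reference, `-extraConfigs` is declared with `StringArrayOptionHandler`, so it accepts a list of tokens. The `getExtraConfigs(List<String>)` helper that turns those tokens into a map is not shown in this hunk; the sketch below assumes a simple `key=value` token format, which is an assumption, not the confirmed implementation:

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class ExtraConfigsSketch {
  // Hypothetical parser: split each "key=value" token on the first '='.
  // The real getExtraConfigs(...) body is not visible in this hunk.
  public static Map<String, String> parse(List<String> extraConfigs) {
    Map<String, String> result = new HashMap<>();
    if (extraConfigs == null) {
      return result;
    }
    for (String token : extraConfigs) {
      int idx = token.indexOf('=');
      if (idx > 0) {
        result.put(token.substring(0, idx), token.substring(idx + 1));
      }
    }
    return result;
  }
}
```

Under that assumption, an invocation could look like `-extraConfigs region=us-east-1 segment.name=mySegment`.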
[GitHub] [incubator-pinot] codecov-io edited a comment on pull request #6396: Adding ImportData sub command in pinot admin
Posted by GitBox <gi...@apache.org>.
codecov-io edited a comment on pull request #6396:
URL: https://github.com/apache/incubator-pinot/pull/6396#issuecomment-752361098
# [Codecov](https://codecov.io/gh/apache/incubator-pinot/pull/6396?src=pr&el=h1) Report
> Merging [#6396](https://codecov.io/gh/apache/incubator-pinot/pull/6396?src=pr&el=desc) (e2a0fc9) into [master](https://codecov.io/gh/apache/incubator-pinot/commit/1beaab59b73f26c4e35f3b9bc856b03806cddf5a?el=desc) (1beaab5) will **decrease** coverage by `1.21%`.
> The diff coverage is `56.80%`.
[![Impacted file tree graph](https://codecov.io/gh/apache/incubator-pinot/pull/6396/graphs/tree.svg?width=650&height=150&src=pr&token=4ibza2ugkz)](https://codecov.io/gh/apache/incubator-pinot/pull/6396?src=pr&el=tree)
```diff
@@             Coverage Diff              @@
##             master    #6396      +/-   ##
============================================
- Coverage     66.44%   65.23%    -1.22%
============================================
  Files          1075     1330      +255
  Lines         54773    64929    +10156
  Branches       8168     9469     +1301
============================================
+ Hits          36396    42355     +5959
- Misses        15700    19544     +3844
- Partials       2677     3030      +353
```
| Flag | Coverage Δ | |
|---|---|---|
| unittests | `65.23% <56.80%> (?)` | |
Flags with carried forward coverage won't be shown. [Click here](https://docs.codecov.io/docs/carryforward-flags#carryforward-flags-in-the-pull-request-comment) to find out more.
| [Impacted Files](https://codecov.io/gh/apache/incubator-pinot/pull/6396?src=pr&el=tree) | Coverage Δ | |
|---|---|---|
| [...e/pinot/broker/api/resources/PinotBrokerDebug.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtYnJva2VyL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9icm9rZXIvYXBpL3Jlc291cmNlcy9QaW5vdEJyb2tlckRlYnVnLmphdmE=) | `0.00% <0.00%> (-79.32%)` | :arrow_down: |
| [...ot/broker/broker/AllowAllAccessControlFactory.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtYnJva2VyL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9icm9rZXIvYnJva2VyL0FsbG93QWxsQWNjZXNzQ29udHJvbEZhY3RvcnkuamF2YQ==) | `71.42% <ø> (-28.58%)` | :arrow_down: |
| [.../helix/BrokerUserDefinedMessageHandlerFactory.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtYnJva2VyL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9icm9rZXIvYnJva2VyL2hlbGl4L0Jyb2tlclVzZXJEZWZpbmVkTWVzc2FnZUhhbmRsZXJGYWN0b3J5LmphdmE=) | `33.96% <0.00%> (-32.71%)` | :arrow_down: |
| [...ker/routing/instanceselector/InstanceSelector.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtYnJva2VyL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9icm9rZXIvcm91dGluZy9pbnN0YW5jZXNlbGVjdG9yL0luc3RhbmNlU2VsZWN0b3IuamF2YQ==) | `100.00% <ø> (ø)` | |
| [...ava/org/apache/pinot/client/AbstractResultSet.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtY2xpZW50cy9waW5vdC1qYXZhLWNsaWVudC9zcmMvbWFpbi9qYXZhL29yZy9hcGFjaGUvcGlub3QvY2xpZW50L0Fic3RyYWN0UmVzdWx0U2V0LmphdmE=) | `66.66% <0.00%> (+9.52%)` | :arrow_up: |
| [.../main/java/org/apache/pinot/client/Connection.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtY2xpZW50cy9waW5vdC1qYXZhLWNsaWVudC9zcmMvbWFpbi9qYXZhL29yZy9hcGFjaGUvcGlub3QvY2xpZW50L0Nvbm5lY3Rpb24uamF2YQ==) | `35.55% <0.00%> (-13.29%)` | :arrow_down: |
| [...inot/client/JsonAsyncHttpPinotClientTransport.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtY2xpZW50cy9waW5vdC1qYXZhLWNsaWVudC9zcmMvbWFpbi9qYXZhL29yZy9hcGFjaGUvcGlub3QvY2xpZW50L0pzb25Bc3luY0h0dHBQaW5vdENsaWVudFRyYW5zcG9ydC5qYXZh) | `10.90% <0.00%> (-51.10%)` | :arrow_down: |
| [...not/common/assignment/InstancePartitionsUtils.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtY29tbW9uL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9jb21tb24vYXNzaWdubWVudC9JbnN0YW5jZVBhcnRpdGlvbnNVdGlscy5qYXZh) | `73.80% <ø> (+0.63%)` | :arrow_up: |
| [...common/config/tuner/NoOpTableTableConfigTuner.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtY29tbW9uL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9jb21tb24vY29uZmlnL3R1bmVyL05vT3BUYWJsZVRhYmxlQ29uZmlnVHVuZXIuamF2YQ==) | `100.00% <ø> (ø)` | |
| [...ot/common/config/tuner/RealTimeAutoIndexTuner.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtY29tbW9uL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9jb21tb24vY29uZmlnL3R1bmVyL1JlYWxUaW1lQXV0b0luZGV4VHVuZXIuamF2YQ==) | `100.00% <ø> (ø)` | |
| ... and [1166 more](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree-more) | |
------
[Continue to review full report at Codecov](https://codecov.io/gh/apache/incubator-pinot/pull/6396?src=pr&el=continue).
> **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)
> `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
> Powered by [Codecov](https://codecov.io/gh/apache/incubator-pinot/pull/6396?src=pr&el=footer). Last update [950295a...e2a0fc9](https://codecov.io/gh/apache/incubator-pinot/pull/6396?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
[GitHub] [incubator-pinot] fx19880617 commented on a change in pull request #6396: Adding ImportData sub command in pinot admin
Posted by GitBox <gi...@apache.org>.
fx19880617 commented on a change in pull request #6396:
URL: https://github.com/apache/incubator-pinot/pull/6396#discussion_r552978533
##########
File path: pinot-tools/src/main/java/org/apache/pinot/tools/admin/command/ImportDataCommand.java
##########
@@ -0,0 +1,340 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements. See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership. The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License. You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied. See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+package org.apache.pinot.tools.admin.command;
+
+import com.google.common.base.Preconditions;
+import com.google.common.collect.ImmutableMap;
+import java.io.File;
+import java.io.IOException;
+import java.net.URI;
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+import org.apache.commons.codec.digest.DigestUtils;
+import org.apache.commons.io.FileUtils;
+import org.apache.pinot.controller.helix.ControllerRequestURLBuilder;
+import org.apache.pinot.spi.data.readers.FileFormat;
+import org.apache.pinot.spi.ingestion.batch.BatchConfigProperties;
+import org.apache.pinot.spi.ingestion.batch.IngestionJobLauncher;
+import org.apache.pinot.spi.ingestion.batch.spec.ExecutionFrameworkSpec;
+import org.apache.pinot.spi.ingestion.batch.spec.PinotClusterSpec;
+import org.apache.pinot.spi.ingestion.batch.spec.PinotFSSpec;
+import org.apache.pinot.spi.ingestion.batch.spec.PushJobSpec;
+import org.apache.pinot.spi.ingestion.batch.spec.RecordReaderSpec;
+import org.apache.pinot.spi.ingestion.batch.spec.SegmentGenerationJobSpec;
+import org.apache.pinot.spi.ingestion.batch.spec.SegmentNameGeneratorSpec;
+import org.apache.pinot.spi.ingestion.batch.spec.TableSpec;
+import org.apache.pinot.spi.utils.IngestionConfigUtils;
+import org.apache.pinot.tools.Command;
+import org.kohsuke.args4j.Option;
+import org.kohsuke.args4j.spi.StringArrayOptionHandler;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+
+/**
+ * Class to implement ImportData command.
+ */
+@SuppressWarnings("unused")
+public class ImportDataCommand extends AbstractBaseAdminCommand implements Command {
+ private static final Logger LOGGER = LoggerFactory.getLogger(ImportDataCommand.class);
+ private static final String SEGMENT_NAME = "segment.name";
+
+ @Option(name = "-dataFilePath", required = true, metaVar = "<string>", usage = "data file path.")
Review comment:
Not for now; it can be as simple as writing a bash script to loop over all the files and call this command for each one.
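The suggested per-file wrapper can also be sketched in Java. This is a hypothetical example: the directory path and table name would be placeholders, and only the file-selection half is runnable here (the per-file `ImportDataCommand` invocation is shown as a comment, since it needs a live cluster):

```java
import java.io.File;
import java.io.IOException;
import java.nio.file.Files;

public class BatchImportSketch {
  // Collect the data files that a per-file import loop would feed to the command.
  static File[] listParquetFiles(File dir) {
    File[] files = dir.listFiles((d, name) -> name.endsWith(".parquet"));
    return files == null ? new File[0] : files;
  }

  public static void main(String[] args) throws IOException {
    for (File f : listParquetFiles(new File("/path/to/data"))) {
      // Each file would then be imported individually, e.g.:
      //   new ImportDataCommand().setDataFilePath(f.getAbsolutePath())
      //       .setFormat(FileFormat.PARQUET).setTable("my-table").execute();
      System.out.println("would import: " + f.getAbsolutePath());
    }
  }
}
```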
[GitHub] [incubator-pinot] fx19880617 commented on a change in pull request #6396: Adding ImportData sub command in pinot admin
Posted by GitBox <gi...@apache.org>.
fx19880617 commented on a change in pull request #6396:
URL: https://github.com/apache/incubator-pinot/pull/6396#discussion_r552980428
##########
File path: pinot-tools/src/main/java/org/apache/pinot/tools/admin/command/ImportDataCommand.java
##########
@@ -0,0 +1,340 @@
+public class ImportDataCommand extends AbstractBaseAdminCommand implements Command {
+ private static final Logger LOGGER = LoggerFactory.getLogger(ImportDataCommand.class);
+ private static final String SEGMENT_NAME = "segment.name";
+
+ @Option(name = "-dataFilePath", required = true, metaVar = "<string>", usage = "data file path.")
+ private String _dataFilePath;
+
+ @Option(name = "-format", required = true, metaVar = "<AVRO/CSV/JSON/THRIFT/PARQUET/ORC>", usage = "Input data format.")
+ private FileFormat _format;
+
+ @Option(name = "-table", required = true, metaVar = "<string>", usage = "Table name.")
+ private String _table;
+
+ @Option(name = "-controllerURI", metaVar = "<string>", usage = "Pinot Controller URI.")
+ private String _controllerURI = "http://localhost:9000";
+
+ @Option(name = "-tempDir", metaVar = "<string>", usage = "Temporary directory used to hold data during segment creation.")
+ private String _tempDir = new File(FileUtils.getTempDirectory(), getClass().getSimpleName()).getAbsolutePath();
+
+ @Option(name = "-extraConfigs", metaVar = "<extra configs>", handler = StringArrayOptionHandler.class, usage = "Extra configs to be set.")
+ private List<String> _extraConfigs;
+
+ @SuppressWarnings("FieldCanBeLocal")
+ @Option(name = "-help", help = true, aliases = {"-h", "--h", "--help"}, usage = "Print this message.")
+ private boolean _help = false;
+
+ public ImportDataCommand setDataFilePath(String dataFilePath) {
+ _dataFilePath = dataFilePath;
+ return this;
+ }
+
+ public ImportDataCommand setFormat(FileFormat format) {
+ _format = format;
+ return this;
+ }
+
+ spec.setPinotFSSpecs(pinotFSSpecs);
+
+ // set RecordReaderSpec
+ RecordReaderSpec recordReaderSpec = new RecordReaderSpec();
+ recordReaderSpec.setDataFormat(_format.name());
+ recordReaderSpec.setClassName(getRecordReaderClass(_format));
+ recordReaderSpec.setConfigClassName(getRecordReaderConfigClass(_format));
+ recordReaderSpec.setConfigs(IngestionConfigUtils.getRecordReaderProps(extraConfigs));
+ spec.setRecordReaderSpec(recordReaderSpec);
+
+ // set TableSpec
+ TableSpec tableSpec = new TableSpec();
+ tableSpec.setTableName(_table);
+ tableSpec.setSchemaURI(ControllerRequestURLBuilder.baseUrl(_controllerURI).forTableSchemaGet(_table));
+ tableSpec.setTableConfigURI(ControllerRequestURLBuilder.baseUrl(_controllerURI).forTableGet(_table));
+ spec.setTableSpec(tableSpec);
+
+ // set SegmentNameGeneratorSpec
+ SegmentNameGeneratorSpec segmentNameGeneratorSpec = new SegmentNameGeneratorSpec();
+ segmentNameGeneratorSpec
+ .setType(org.apache.pinot.spi.ingestion.batch.BatchConfigProperties.SegmentNameGeneratorType.FIXED);
+ String segmentName = (extraConfigs.containsKey(SEGMENT_NAME)) ? extraConfigs.get(SEGMENT_NAME)
+ : String.format("%s_%s", _table, DigestUtils.sha256Hex(_dataFilePath));
+ segmentNameGeneratorSpec.setConfigs(ImmutableMap.of(SEGMENT_NAME, segmentName));
+ spec.setSegmentNameGeneratorSpec(segmentNameGeneratorSpec);
+
+ // set PinotClusterSpecs
+ PinotClusterSpec pinotClusterSpec = new PinotClusterSpec();
+ pinotClusterSpec.setControllerURI(_controllerURI);
+ PinotClusterSpec[] pinotClusterSpecs = new PinotClusterSpec[]{pinotClusterSpec};
+ spec.setPinotClusterSpecs(pinotClusterSpecs);
+
+ // set PushJobSpec
+ PushJobSpec pushJobSpec = new PushJobSpec();
+ pushJobSpec.setPushAttempts(3);
+ pushJobSpec.setPushRetryIntervalMillis(10000);
+ spec.setPushJobSpec(pushJobSpec);
+
+ return spec;
+ }
+
+ private Map<String, String> getS3PinotFSConfigs(Map<String, String> extraConfigs) {
+ Map<String, String> s3PinotFSConfigs = new HashMap<>();
+ s3PinotFSConfigs.put("region", System.getProperty("AWS_REGION", "us-west-2"));
Review comment:
This is just a default value in case it is not set at all.
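The defaulting pattern being discussed reads a JVM system property and falls back to `us-west-2`. A minimal sketch is below; letting explicit extra configs override the property default is an assumption of this sketch, as is the exact set of keys:

```java
import java.util.HashMap;
import java.util.Map;

public class S3ConfigSketch {
  // Start from a JVM-property default, then let explicit extra configs win.
  // Override behavior and key names here are illustrative assumptions.
  static Map<String, String> s3Configs(Map<String, String> extraConfigs) {
    Map<String, String> configs = new HashMap<>();
    configs.put("region", System.getProperty("AWS_REGION", "us-west-2"));
    configs.putAll(extraConfigs); // explicit settings override the default
    return configs;
  }
}
```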
[GitHub] [incubator-pinot] mayankshriv commented on a change in pull request #6396: Adding ImportData sub command in pinot admin
Posted by GitBox <gi...@apache.org>.
mayankshriv commented on a change in pull request #6396:
URL: https://github.com/apache/incubator-pinot/pull/6396#discussion_r551413695
##########
File path: pinot-controller/src/main/java/org/apache/pinot/controller/helix/ControllerRequestURLBuilder.java
##########
@@ -211,6 +211,9 @@ public String forTableView(String tableName, String view, @Nullable String table
}
return url;
}
+ public String forTableSchemaGet(String tableName) {
+ return StringUtil.join("/", _baseUrl, "tables", tableName, "schema");
Review comment:
Better to use `File.separator` than `/`?
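One caveat for this suggestion: `File.separator` is platform-dependent (`\` on Windows), while URL paths always use `/`, so a literal `/` is arguably the right choice when building URLs. A minimal stand-in for `StringUtil.join` (whose real implementation is not shown here) illustrates the joined result:

```java
public class UrlJoinSketch {
  // Join URL path segments with a literal '/'; using File.separator here
  // would yield backslashes on Windows and break the resulting URL.
  static String join(String sep, String... parts) {
    return String.join(sep, parts);
  }

  public static void main(String[] args) {
    System.out.println(join("/", "http://localhost:9000", "tables", "myTable", "schema"));
  }
}
```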
##########
File path: pinot-tools/src/main/java/org/apache/pinot/tools/admin/command/ImportDataCommand.java
##########
@@ -0,0 +1,340 @@
+ // set SegmentNameGeneratorSpec
+ SegmentNameGeneratorSpec segmentNameGeneratorSpec = new SegmentNameGeneratorSpec();
+ segmentNameGeneratorSpec.setType(BatchConfigProperties.SegmentNameGeneratorType.FIXED);
Review comment:
SegmentNameGenerator should also come from input arg?
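For reference, the FIXED name the command currently falls back to is just the table name joined with a SHA-256 hex digest of the data file path (`DigestUtils.sha256Hex` in the diff). A self-contained sketch of that derivation using only the JDK (class and method names here are illustrative, not from the PR):

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;

public class SegmentNameSketch {
  // JDK-only stand-in for commons-codec's DigestUtils.sha256Hex.
  static String sha256Hex(String input) {
    try {
      byte[] digest = MessageDigest.getInstance("SHA-256").digest(input.getBytes(StandardCharsets.UTF_8));
      StringBuilder hex = new StringBuilder(digest.length * 2);
      for (byte b : digest) {
        hex.append(String.format("%02x", b));
      }
      return hex.toString();
    } catch (Exception e) {
      throw new RuntimeException(e);
    }
  }

  // Default segment name when "segment.name" is not in extraConfigs:
  // "<table>_<sha256(dataFilePath)>".
  static String defaultSegmentName(String table, String dataFilePath) {
    return String.format("%s_%s", table, sha256Hex(dataFilePath));
  }

  public static void main(String[] args) {
    System.out.println(defaultSegmentName("my-table", "/path/to/my-local-data.gz.parquet"));
  }
}
```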
##########
File path: pinot-tools/src/main/java/org/apache/pinot/tools/admin/command/ImportDataCommand.java
##########
@@ -0,0 +1,340 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements. See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership. The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License. You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied. See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+package org.apache.pinot.tools.admin.command;
+
+import com.google.common.base.Preconditions;
+import com.google.common.collect.ImmutableMap;
+import java.io.File;
+import java.io.IOException;
+import java.net.URI;
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+import org.apache.commons.codec.digest.DigestUtils;
+import org.apache.commons.io.FileUtils;
+import org.apache.pinot.controller.helix.ControllerRequestURLBuilder;
+import org.apache.pinot.spi.data.readers.FileFormat;
+import org.apache.pinot.spi.ingestion.batch.BatchConfigProperties;
+import org.apache.pinot.spi.ingestion.batch.IngestionJobLauncher;
+import org.apache.pinot.spi.ingestion.batch.spec.ExecutionFrameworkSpec;
+import org.apache.pinot.spi.ingestion.batch.spec.PinotClusterSpec;
+import org.apache.pinot.spi.ingestion.batch.spec.PinotFSSpec;
+import org.apache.pinot.spi.ingestion.batch.spec.PushJobSpec;
+import org.apache.pinot.spi.ingestion.batch.spec.RecordReaderSpec;
+import org.apache.pinot.spi.ingestion.batch.spec.SegmentGenerationJobSpec;
+import org.apache.pinot.spi.ingestion.batch.spec.SegmentNameGeneratorSpec;
+import org.apache.pinot.spi.ingestion.batch.spec.TableSpec;
+import org.apache.pinot.spi.utils.IngestionConfigUtils;
+import org.apache.pinot.tools.Command;
+import org.kohsuke.args4j.Option;
+import org.kohsuke.args4j.spi.StringArrayOptionHandler;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+
+/**
+ * Class to implement ImportData command.
+ */
+@SuppressWarnings("unused")
+public class ImportDataCommand extends AbstractBaseAdminCommand implements Command {
+ private static final Logger LOGGER = LoggerFactory.getLogger(ImportDataCommand.class);
+ private static final String SEGMENT_NAME = "segment.name";
+
+ @Option(name = "-dataFilePath", required = true, metaVar = "<string>", usage = "data file path.")
+ private String _dataFilePath;
+
+ @Option(name = "-format", required = true, metaVar = "<AVRO/CSV/JSON/THRIFT/PARQUET/ORC>", usage = "Input data format.")
+ private FileFormat _format;
+
+ @Option(name = "-table", required = true, metaVar = "<string>", usage = "Table name.")
+ private String _table;
+
+ @Option(name = "-controllerURI", metaVar = "<string>", usage = "Pinot Controller URI.")
+ private String _controllerURI = "http://localhost:9000";
+
+ @Option(name = "-tempDir", metaVar = "<string>", usage = "Temporary directory used to hold data during segment creation.")
+ private String _tempDir = new File(FileUtils.getTempDirectory(), getClass().getSimpleName()).getAbsolutePath();
+
+ @Option(name = "-extraConfigs", metaVar = "<extra configs>", handler = StringArrayOptionHandler.class, usage = "Extra configs to be set.")
+ private List<String> _extraConfigs;
+
+ @SuppressWarnings("FieldCanBeLocal")
+ @Option(name = "-help", help = true, aliases = {"-h", "--h", "--help"}, usage = "Print this message.")
+ private boolean _help = false;
+
+ public ImportDataCommand setDataFilePath(String dataFilePath) {
+ _dataFilePath = dataFilePath;
+ return this;
+ }
+
+ public ImportDataCommand setFormat(FileFormat format) {
+ _format = format;
+ return this;
+ }
+
+ public ImportDataCommand setTable(String table) {
+ _table = table;
+ return this;
+ }
+
+ public ImportDataCommand setControllerURI(String controllerURI) {
+ _controllerURI = controllerURI;
+ return this;
+ }
+
+ public ImportDataCommand setTempDir(String tempDir) {
+ _tempDir = tempDir;
+ return this;
+ }
+
+ public List<String> getExtraConfigs() {
+ return _extraConfigs;
+ }
+
+ public ImportDataCommand setExtraConfigs(List<String> extraConfigs) {
+ _extraConfigs = extraConfigs;
+ return this;
+ }
+
+ public String getDataFilePath() {
+ return _dataFilePath;
+ }
+
+ public FileFormat getFormat() {
+ return _format;
+ }
+
+ public String getTable() {
+ return _table;
+ }
+
+ public String getControllerURI() {
+ return _controllerURI;
+ }
+
+ public String getTempDir() {
+ return _tempDir;
+ }
+
+ @Override
+ public String toString() {
+ String results = String
+ .format("InsertData -dataFilePath %s -format %s -table %s -controllerURI %s -tempDir %s", _dataFilePath,
+ _format, _table, _controllerURI, _tempDir);
+ if (_extraConfigs != null) {
+ results += " -extraConfigs " + Arrays.toString(_extraConfigs.toArray());
+ }
+ return results;
+ }
+
+ @Override
+ public final String getName() {
+ return "InsertData";
+ }
+
+ @Override
+ public String description() {
+ return "Insert data into Pinot cluster.";
+ }
+
+ @Override
+ public boolean getHelp() {
+ return _help;
+ }
+
+ @Override
+ public boolean execute()
+ throws IOException {
+ LOGGER.info("Executing command: {}", toString());
+ Preconditions.checkArgument(_table != null, "'table' must be specified");
+ Preconditions.checkArgument(_format != null, "'format' must be specified");
+ Preconditions.checkArgument(_dataFilePath != null, "'dataFilePath' must be specified");
+
+ try {
+
+ URI dataFileURI = URI.create(_dataFilePath);
+ if (dataFileURI.getScheme() == null) {
+ File dataFile = new File(_dataFilePath);
+ Preconditions.checkArgument(dataFile.exists(), "'dataFile': '%s' doesn't exist", dataFile);
+ LOGGER.info("Found data file: {} of format: {}", dataFile, _format);
+ }
+
+ initTempDir();
+ IngestionJobLauncher.runIngestionJob(generateSegmentGenerationJobSpec());
+ LOGGER.info("Successfully loaded data from {} into Pinot.", _dataFilePath);
+ return true;
+ } finally {
+ FileUtils.deleteQuietly(new File(_tempDir));
+ }
+ }
+
+ private void initTempDir()
+ throws IOException {
+ File tempDir = new File(_tempDir);
+ if (tempDir.exists()) {
+ LOGGER.info("Deleting the existing 'tempDir': {}", tempDir);
+ FileUtils.forceDelete(tempDir);
+ }
+ FileUtils.forceMkdir(tempDir);
+ }
+
+ private SegmentGenerationJobSpec generateSegmentGenerationJobSpec() {
+ final Map<String, String> extraConfigs = getExtraConfigs(_extraConfigs);
+
+ SegmentGenerationJobSpec spec = new SegmentGenerationJobSpec();
+ URI dataFileURI = URI.create(_dataFilePath);
+ URI parent = dataFileURI.getPath().endsWith("/") ? dataFileURI.resolve("..") : dataFileURI.resolve(".");
+ spec.setInputDirURI(parent.toString());
+ spec.setIncludeFileNamePattern("glob:**" + dataFileURI.getPath());
+ spec.setOutputDirURI(_tempDir);
+ spec.setCleanUpOutputDir(true);
+ spec.setOverwriteOutput(true);
+ spec.setJobType("SegmentCreationAndTarPush");
+
+ // set ExecutionFrameworkSpec
+ ExecutionFrameworkSpec executionFrameworkSpec = new ExecutionFrameworkSpec();
+ executionFrameworkSpec.setName("standalone");
+ executionFrameworkSpec.setSegmentGenerationJobRunnerClassName(
+ "org.apache.pinot.plugin.ingestion.batch.standalone.SegmentGenerationJobRunner");
+ executionFrameworkSpec.setSegmentTarPushJobRunnerClassName(
+ "org.apache.pinot.plugin.ingestion.batch.standalone.SegmentTarPushJobRunner");
+ spec.setExecutionFrameworkSpec(executionFrameworkSpec);
+
+ // set PinotFSSpecs
+ List<PinotFSSpec> pinotFSSpecs = new ArrayList<>();
+ pinotFSSpecs.add(getPinotFSSpec("file", "org.apache.pinot.spi.filesystem.LocalPinotFS", Collections.emptyMap()));
+ pinotFSSpecs
+ .add(getPinotFSSpec("s3", "org.apache.pinot.plugin.filesystem.S3PinotFS", getS3PinotFSConfigs(extraConfigs)));
+ spec.setPinotFSSpecs(pinotFSSpecs);
+
+ // set RecordReaderSpec
+ RecordReaderSpec recordReaderSpec = new RecordReaderSpec();
+ recordReaderSpec.setDataFormat(_format.name());
+ recordReaderSpec.setClassName(getRecordReaderClass(_format));
+ recordReaderSpec.setConfigClassName(getRecordReaderConfigClass(_format));
+ recordReaderSpec.setConfigs(IngestionConfigUtils.getRecordReaderProps(extraConfigs));
+ spec.setRecordReaderSpec(recordReaderSpec);
+
+ // set TableSpec
+ TableSpec tableSpec = new TableSpec();
+ tableSpec.setTableName(_table);
+ tableSpec.setSchemaURI(ControllerRequestURLBuilder.baseUrl(_controllerURI).forTableSchemaGet(_table));
+ tableSpec.setTableConfigURI(ControllerRequestURLBuilder.baseUrl(_controllerURI).forTableGet(_table));
+ spec.setTableSpec(tableSpec);
+
+ // set SegmentNameGeneratorSpec
+ SegmentNameGeneratorSpec segmentNameGeneratorSpec = new SegmentNameGeneratorSpec();
+ segmentNameGeneratorSpec.setType(BatchConfigProperties.SegmentNameGeneratorType.FIXED);
+ String segmentName = (extraConfigs.containsKey(SEGMENT_NAME)) ? extraConfigs.get(SEGMENT_NAME)
+ : String.format("%s_%s", _table, DigestUtils.sha256Hex(_dataFilePath));
+ segmentNameGeneratorSpec.setConfigs(ImmutableMap.of(SEGMENT_NAME, segmentName));
+ spec.setSegmentNameGeneratorSpec(segmentNameGeneratorSpec);
+
+ // set PinotClusterSpecs
+ PinotClusterSpec pinotClusterSpec = new PinotClusterSpec();
+ pinotClusterSpec.setControllerURI(_controllerURI);
+ PinotClusterSpec[] pinotClusterSpecs = new PinotClusterSpec[]{pinotClusterSpec};
+ spec.setPinotClusterSpecs(pinotClusterSpecs);
+
+ // set PushJobSpec
+ PushJobSpec pushJobSpec = new PushJobSpec();
+ pushJobSpec.setPushAttempts(3);
+ pushJobSpec.setPushRetryIntervalMillis(10000);
+ spec.setPushJobSpec(pushJobSpec);
+
+ return spec;
+ }
+
+ private Map<String, String> getS3PinotFSConfigs(Map<String, String> extraConfigs) {
+ Map<String, String> s3PinotFSConfigs = new HashMap<>();
+ s3PinotFSConfigs.put("region", System.getProperty("AWS_REGION", "us-west-2"));
+ s3PinotFSConfigs.putAll(IngestionConfigUtils.getConfigMapWithPrefix(extraConfigs,
+ BatchConfigProperties.INPUT_FS_PROP_PREFIX + IngestionConfigUtils.DOT_SEPARATOR));
+ return s3PinotFSConfigs;
+ }
+
+ private PinotFSSpec getPinotFSSpec(String scheme, String className, Map<String, String> configs) {
+ PinotFSSpec pinotFSSpec = new PinotFSSpec();
+ pinotFSSpec.setScheme(scheme);
+ pinotFSSpec.setClassName(className);
+ pinotFSSpec.setConfigs(configs);
+ return pinotFSSpec;
+ }
+
+ private Map<String, String> getExtraConfigs(List<String> extraConfigs) {
+ if (extraConfigs == null) {
+ return Collections.emptyMap();
+ }
+ Map<String, String> recordReaderConfigs = new HashMap<>();
+ for (String kvPair : extraConfigs) {
+ String[] splits = kvPair.split("=", 2);
+ if ((splits.length == 2) && (splits[0] != null) && (splits[1] != null)) {
+ recordReaderConfigs.put(splits[0], splits[1]);
+ }
+ }
+ return recordReaderConfigs;
+ }
+
+ private String getRecordReaderConfigClass(FileFormat format) {
+ switch (format) {
+ case CSV:
+ return "org.apache.pinot.plugin.inputformat.csv.CSVRecordReaderConfig";
+ case PROTO:
+ return "org.apache.pinot.plugin.inputformat.protobuf.ProtoBufRecordReaderConfig";
+ case THRIFT:
+ return "org.apache.pinot.plugin.inputformat.thrift.ThriftRecordReaderConfig";
+ case ORC:
+ case JSON:
+ case AVRO:
+ case GZIPPED_AVRO:
+ case PARQUET:
+ return null;
+ default:
+ throw new IllegalArgumentException("Unsupported file format - " + format);
+ }
+ }
+
+ private String getRecordReaderClass(FileFormat format) {
Review comment:
Do we not have a RecordReaderFactory?
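Separately, a note on how `-extraConfigs` values reach these readers: the command splits each token on the first `=` only, so values may themselves contain `=`, and malformed tokens are silently skipped. A standalone sketch of that parse step (class name is illustrative):

```java
import java.util.Arrays;
import java.util.Collections;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class ExtraConfigsSketch {
  // Mirrors the getExtraConfigs parse: "key=value" tokens -> map,
  // splitting on the first '=' only and skipping malformed tokens.
  static Map<String, String> parse(List<String> extraConfigs) {
    if (extraConfigs == null) {
      return Collections.emptyMap();
    }
    Map<String, String> configs = new HashMap<>();
    for (String kvPair : extraConfigs) {
      String[] splits = kvPair.split("=", 2);
      if (splits.length == 2) {
        configs.put(splits[0], splits[1]);
      }
    }
    return configs;
  }

  public static void main(String[] args) {
    Map<String, String> configs =
        parse(Arrays.asList("segment.name=mySegment", "key=a=b", "malformed"));
    System.out.println(configs); // "key" maps to "a=b"; "malformed" is dropped
  }
}
```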
##########
File path: pinot-tools/src/main/java/org/apache/pinot/tools/admin/command/ImportDataCommand.java
##########
@@ -0,0 +1,340 @@
+public class ImportDataCommand extends AbstractBaseAdminCommand implements Command {
+ private static final Logger LOGGER = LoggerFactory.getLogger(ImportDataCommand.class);
+ private static final String SEGMENT_NAME = "segment.name";
+
+ @Option(name = "-dataFilePath", required = true, metaVar = "<string>", usage = "data file path.")
Review comment:
Does this also support dataDir that contains multiple data files to be imported?
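Worth noting against this question: today the include pattern is derived from the single `-dataFilePath`, with the parent "directory" resolved via `URI.resolve`. A standalone sketch of that derivation (the bucket and path below are illustrative):

```java
import java.net.URI;

public class InputDirSketch {
  // Derive the parent "directory" URI the way the command does:
  // resolving "." strips the last path segment; ".." handles trailing-slash inputs.
  static String inputDirURI(String dataFilePath) {
    URI dataFileURI = URI.create(dataFilePath);
    URI parent = dataFileURI.getPath().endsWith("/")
        ? dataFileURI.resolve("..") : dataFileURI.resolve(".");
    return parent.toString();
  }

  // Include pattern matching only the given file's path.
  static String includeFileNamePattern(String dataFilePath) {
    return "glob:**" + URI.create(dataFilePath).getPath();
  }

  public static void main(String[] args) {
    String path = "s3://my-bucket/path/to/data.gz.parquet";
    System.out.println(inputDirURI(path));            // s3://my-bucket/path/to/
    System.out.println(includeFileNamePattern(path)); // glob:**/path/to/data.gz.parquet
  }
}
```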
##########
File path: pinot-tools/src/main/java/org/apache/pinot/tools/admin/command/ImportDataCommand.java
##########
@@ -0,0 +1,340 @@
+
+ private Map<String, String> getS3PinotFSConfigs(Map<String, String> extraConfigs) {
+ Map<String, String> s3PinotFSConfigs = new HashMap<>();
+ s3PinotFSConfigs.put("region", System.getProperty("AWS_REGION", "us-west-2"));
Review comment:
Hmm, should these be hardcoded?
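One subtlety behind this question: `System.getProperty` reads a JVM system property (set with `-DAWS_REGION=...`), not the `AWS_REGION` environment variable that most AWS tooling honors. A minimal illustration of the two lookups (class name is illustrative):

```java
public class RegionLookupSketch {
  // JVM system property with a fallback default, as the command does today.
  static String regionDefault() {
    return System.getProperty("AWS_REGION", "us-west-2");
  }

  public static void main(String[] args) {
    System.out.println("property: " + regionDefault());
    // Environment-variable lookup, which is what AWS_REGION usually means:
    String fromEnv = System.getenv("AWS_REGION"); // null when unset
    System.out.println("env:      " + fromEnv);
  }
}
```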
##########
File path: pinot-tools/src/main/java/org/apache/pinot/tools/admin/command/ImportDataCommand.java
##########
@@ -0,0 +1,340 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements. See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership. The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License. You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied. See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+package org.apache.pinot.tools.admin.command;
+
+import com.google.common.base.Preconditions;
+import com.google.common.collect.ImmutableMap;
+import java.io.File;
+import java.io.IOException;
+import java.net.URI;
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+import org.apache.commons.codec.digest.DigestUtils;
+import org.apache.commons.io.FileUtils;
+import org.apache.pinot.controller.helix.ControllerRequestURLBuilder;
+import org.apache.pinot.spi.data.readers.FileFormat;
+import org.apache.pinot.spi.ingestion.batch.BatchConfigProperties;
+import org.apache.pinot.spi.ingestion.batch.IngestionJobLauncher;
+import org.apache.pinot.spi.ingestion.batch.spec.ExecutionFrameworkSpec;
+import org.apache.pinot.spi.ingestion.batch.spec.PinotClusterSpec;
+import org.apache.pinot.spi.ingestion.batch.spec.PinotFSSpec;
+import org.apache.pinot.spi.ingestion.batch.spec.PushJobSpec;
+import org.apache.pinot.spi.ingestion.batch.spec.RecordReaderSpec;
+import org.apache.pinot.spi.ingestion.batch.spec.SegmentGenerationJobSpec;
+import org.apache.pinot.spi.ingestion.batch.spec.SegmentNameGeneratorSpec;
+import org.apache.pinot.spi.ingestion.batch.spec.TableSpec;
+import org.apache.pinot.spi.utils.IngestionConfigUtils;
+import org.apache.pinot.tools.Command;
+import org.kohsuke.args4j.Option;
+import org.kohsuke.args4j.spi.StringArrayOptionHandler;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+
+/**
+ * Class to implement ImportData command.
+ */
+@SuppressWarnings("unused")
+public class ImportDataCommand extends AbstractBaseAdminCommand implements Command {
+ private static final Logger LOGGER = LoggerFactory.getLogger(ImportDataCommand.class);
+ private static final String SEGMENT_NAME = "segment.name";
+
+ @Option(name = "-dataFilePath", required = true, metaVar = "<string>", usage = "data file path.")
+ private String _dataFilePath;
+
+ @Option(name = "-format", required = true, metaVar = "<AVRO/CSV/JSON/THRIFT/PARQUET/ORC>", usage = "Input data format.")
+ private FileFormat _format;
+
+ @Option(name = "-table", required = true, metaVar = "<string>", usage = "Table name.")
+ private String _table;
+
+ @Option(name = "-controllerURI", metaVar = "<string>", usage = "Pinot Controller URI.")
+ private String _controllerURI = "http://localhost:9000";
+
+ @Option(name = "-tempDir", metaVar = "<string>", usage = "Temporary directory used to hold data during segment creation.")
+ private String _tempDir = new File(FileUtils.getTempDirectory(), getClass().getSimpleName()).getAbsolutePath();
+
+ @Option(name = "-extraConfigs", metaVar = "<extra configs>", handler = StringArrayOptionHandler.class, usage = "Extra configs to be set.")
+ private List<String> _extraConfigs;
+
+ @SuppressWarnings("FieldCanBeLocal")
+ @Option(name = "-help", help = true, aliases = {"-h", "--h", "--help"}, usage = "Print this message.")
+ private boolean _help = false;
+
+ public ImportDataCommand setDataFilePath(String dataFilePath) {
+ _dataFilePath = dataFilePath;
+ return this;
+ }
+
+ public ImportDataCommand setFormat(FileFormat format) {
+ _format = format;
+ return this;
+ }
+
+ public ImportDataCommand setTable(String table) {
+ _table = table;
+ return this;
+ }
+
+ public ImportDataCommand setControllerURI(String controllerURI) {
+ _controllerURI = controllerURI;
+ return this;
+ }
+
+ public ImportDataCommand setTempDir(String tempDir) {
+ _tempDir = tempDir;
+ return this;
+ }
+
+ public List<String> getExtraConfigs() {
+ return _extraConfigs;
+ }
+
+ public ImportDataCommand setExtraConfigs(List<String> extraConfigs) {
+ _extraConfigs = extraConfigs;
+ return this;
+ }
+
+ public String getDataFilePath() {
+ return _dataFilePath;
+ }
+
+ public FileFormat getFormat() {
+ return _format;
+ }
+
+ public String getTable() {
+ return _table;
+ }
+
+ public String getControllerURI() {
+ return _controllerURI;
+ }
+
+ public String getTempDir() {
+ return _tempDir;
+ }
+
+ @Override
+ public String toString() {
+ String results = String
+ .format("InsertData -dataFilePath %s -format %s -table %s -controllerURI %s -tempDir %s", _dataFilePath,
+ _format, _table, _controllerURI, _tempDir);
+ if (_extraConfigs != null) {
+ results += " -extraConfigs " + Arrays.toString(_extraConfigs.toArray());
+ }
+ return results;
+ }
+
+ @Override
+ public final String getName() {
+ return "InsertData";
+ }
+
+ @Override
+ public String description() {
+ return "Insert data into Pinot cluster.";
+ }
+
+ @Override
+ public boolean getHelp() {
+ return _help;
+ }
+
+ @Override
+ public boolean execute()
+ throws IOException {
+ LOGGER.info("Executing command: {}", toString());
+ Preconditions.checkArgument(_table != null, "'table' must be specified");
+ Preconditions.checkArgument(_format != null, "'format' must be specified");
+ Preconditions.checkArgument(_dataFilePath != null, "'dataFilePath' must be specified");
+
+ try {
+ URI dataFileURI = URI.create(_dataFilePath);
+ if (dataFileURI.getScheme() == null) {
+ File dataFile = new File(_dataFilePath);
+ Preconditions.checkArgument(dataFile.exists(), "'dataFile': '%s' doesn't exist", dataFile);
+ LOGGER.info("Found data file: {} of format: {}", dataFile, _format);
+ }
+
+ initTempDir();
+ IngestionJobLauncher.runIngestionJob(generateSegmentGenerationJobSpec());
+ LOGGER.info("Successfully loaded data from {} into Pinot.", _dataFilePath);
+ return true;
+ } finally {
+ FileUtils.deleteQuietly(new File(_tempDir));
+ }
+ }
+
+ private void initTempDir()
+ throws IOException {
+ File tempDir = new File(_tempDir);
+ if (tempDir.exists()) {
+ LOGGER.info("Deleting the existing 'tempDir': {}", tempDir);
+ FileUtils.forceDelete(tempDir);
+ }
+ FileUtils.forceMkdir(tempDir);
+ }
+
+ private SegmentGenerationJobSpec generateSegmentGenerationJobSpec() {
+ final Map<String, String> extraConfigs = getExtraConfigs(_extraConfigs);
+
+ SegmentGenerationJobSpec spec = new SegmentGenerationJobSpec();
+ URI dataFileURI = URI.create(_dataFilePath);
+ URI parent = dataFileURI.getPath().endsWith("/") ? dataFileURI.resolve("..") : dataFileURI.resolve(".");
+ spec.setInputDirURI(parent.toString());
+ spec.setIncludeFileNamePattern("glob:**" + dataFileURI.getPath());
+ spec.setOutputDirURI(_tempDir);
+ spec.setCleanUpOutputDir(true);
+ spec.setOverwriteOutput(true);
+ spec.setJobType("SegmentCreationAndTarPush");
+
+ // set ExecutionFrameworkSpec
+ ExecutionFrameworkSpec executionFrameworkSpec = new ExecutionFrameworkSpec();
+ executionFrameworkSpec.setName("standalone");
+ executionFrameworkSpec.setSegmentGenerationJobRunnerClassName(
+ "org.apache.pinot.plugin.ingestion.batch.standalone.SegmentGenerationJobRunner");
+ executionFrameworkSpec.setSegmentTarPushJobRunnerClassName(
+ "org.apache.pinot.plugin.ingestion.batch.standalone.SegmentTarPushJobRunner");
+ spec.setExecutionFrameworkSpec(executionFrameworkSpec);
+
+ // set PinotFSSpecs
+ List<PinotFSSpec> pinotFSSpecs = new ArrayList<>();
+ pinotFSSpecs.add(getPinotFSSpec("file", "org.apache.pinot.spi.filesystem.LocalPinotFS", Collections.emptyMap()));
+ pinotFSSpecs
+ .add(getPinotFSSpec("s3", "org.apache.pinot.plugin.filesystem.S3PinotFS", getS3PinotFSConfigs(extraConfigs)));
Review comment:
Should the file system to use come from an input argument?
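The reviewer's suggestion could look roughly like the sketch below: derive the PinotFS implementation from the input URI's scheme instead of unconditionally registering both the local and S3 filesystems. `pinotFsClassForScheme` is a hypothetical helper, not part of this patch; the class names are the ones already used in the diff.

```java
import java.net.URI;

public class FsSchemeSketch {
  // Hypothetical: pick a PinotFS implementation class from the URI scheme,
  // rather than always registering both "file" and "s3" specs.
  static String pinotFsClassForScheme(String dataFilePath) {
    URI uri = URI.create(dataFilePath);
    // A path with no scheme (e.g. /path/to/data.parquet) is treated as local.
    String scheme = uri.getScheme() == null ? "file" : uri.getScheme();
    switch (scheme) {
      case "s3":
        return "org.apache.pinot.plugin.filesystem.S3PinotFS";
      case "file":
      default:
        return "org.apache.pinot.spi.filesystem.LocalPinotFS";
    }
  }

  public static void main(String[] args) {
    // prints "org.apache.pinot.plugin.filesystem.S3PinotFS"
    System.out.println(pinotFsClassForScheme("s3://my-bucket/path/data.parquet"));
  }
}
```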
[GitHub] [incubator-pinot] fx19880617 commented on a change in pull request #6396: Adding ImportData sub command in pinot admin
Posted by GitBox <gi...@apache.org>.
fx19880617 commented on a change in pull request #6396:
URL: https://github.com/apache/incubator-pinot/pull/6396#discussion_r559315182
##########
File path: pinot-tools/src/main/java/org/apache/pinot/tools/admin/command/ImportDataCommand.java
##########
@@ -0,0 +1,390 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements. See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership. The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License. You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied. See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+package org.apache.pinot.tools.admin.command;
+
+import com.google.common.base.Preconditions;
+import java.io.File;
+import java.io.IOException;
+import java.net.URI;
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+import org.apache.commons.codec.digest.DigestUtils;
+import org.apache.commons.io.FileUtils;
+import org.apache.pinot.controller.helix.ControllerRequestURLBuilder;
+import org.apache.pinot.spi.data.readers.FileFormat;
+import org.apache.pinot.spi.filesystem.PinotFSFactory;
+import org.apache.pinot.spi.ingestion.batch.BatchConfigProperties;
+import org.apache.pinot.spi.ingestion.batch.IngestionJobLauncher;
+import org.apache.pinot.spi.ingestion.batch.spec.ExecutionFrameworkSpec;
+import org.apache.pinot.spi.ingestion.batch.spec.PinotClusterSpec;
+import org.apache.pinot.spi.ingestion.batch.spec.PinotFSSpec;
+import org.apache.pinot.spi.ingestion.batch.spec.PushJobSpec;
+import org.apache.pinot.spi.ingestion.batch.spec.RecordReaderSpec;
+import org.apache.pinot.spi.ingestion.batch.spec.SegmentGenerationJobSpec;
+import org.apache.pinot.spi.ingestion.batch.spec.SegmentNameGeneratorSpec;
+import org.apache.pinot.spi.ingestion.batch.spec.TableSpec;
+import org.apache.pinot.spi.utils.IngestionConfigUtils;
+import org.apache.pinot.tools.Command;
+import org.kohsuke.args4j.Option;
+import org.kohsuke.args4j.spi.StringArrayOptionHandler;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+
+/**
+ * Class to implement ImportData command.
+ */
+@SuppressWarnings("unused")
+public class ImportDataCommand extends AbstractBaseAdminCommand implements Command {
+ private static final Logger LOGGER = LoggerFactory.getLogger(ImportDataCommand.class);
+ private static final String SEGMENT_NAME = "segment.name";
+
+ @Option(name = "-dataFilePath", required = true, metaVar = "<string>", usage = "Data file path.")
+ private String _dataFilePath;
+
+ @Option(name = "-format", required = true, metaVar = "<AVRO/CSV/JSON/THRIFT/PARQUET/ORC>", usage = "Input data format.")
+ private FileFormat _format;
+
+ @Option(name = "-segmentNameGeneratorType", metaVar = "<string>", usage = "Segment name generator type, defaults to FIXED type.")
+ private String _segmentNameGeneratorType = BatchConfigProperties.SegmentNameGeneratorType.FIXED;
+
+ @Option(name = "-table", required = true, metaVar = "<string>", usage = "Table name.")
+ private String _table;
+
+ @Option(name = "-controllerURI", metaVar = "<string>", usage = "Pinot Controller URI.")
+ private String _controllerURI = "http://localhost:9000";
+
+ @Option(name = "-tempDir", metaVar = "<string>", usage = "Temporary directory used to hold data during segment creation.")
+ private String _tempDir = new File(FileUtils.getTempDirectory(), getClass().getSimpleName()).getAbsolutePath();
+
+ @Option(name = "-extraConfigs", metaVar = "<extra configs>", handler = StringArrayOptionHandler.class, usage = "Extra configs to be set.")
Review comment:
done
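The `-extraConfigs` option collects raw strings via `StringArrayOptionHandler`; the helper that turns them into a map (`getExtraConfigs(_extraConfigs)`, called later in the diff) is not shown in this excerpt. A minimal sketch of how such parsing might work, assuming each entry is a `key=value` pair and splitting on the first `=` only:

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class ExtraConfigsSketch {
  // Hypothetical parser: each entry is expected to look like "key=value".
  static Map<String, String> parseExtraConfigs(List<String> entries) {
    Map<String, String> configs = new HashMap<>();
    if (entries == null) {
      return configs;
    }
    for (String entry : entries) {
      int idx = entry.indexOf('=');
      if (idx > 0) {
        // Split on the first '=' only, so values may themselves contain '='.
        configs.put(entry.substring(0, idx), entry.substring(idx + 1));
      }
    }
    return configs;
  }

  public static void main(String[] args) {
    Map<String, String> configs = parseExtraConfigs(
        List.of("input.fs.className=org.apache.pinot.plugin.filesystem.S3PinotFS", "region=us-west-2"));
    System.out.println(configs.get("region")); // prints "us-west-2"
  }
}
```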
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
users@infra.apache.org
---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscribe@pinot.apache.org
For additional commands, e-mail: commits-help@pinot.apache.org
[GitHub] [incubator-pinot] codecov-io edited a comment on pull request #6396: Adding ImportData sub command in pinot admin
Posted by GitBox <gi...@apache.org>.
codecov-io edited a comment on pull request #6396:
URL: https://github.com/apache/incubator-pinot/pull/6396#issuecomment-752361098
# [Codecov](https://codecov.io/gh/apache/incubator-pinot/pull/6396?src=pr&el=h1) Report
> Merging [#6396](https://codecov.io/gh/apache/incubator-pinot/pull/6396?src=pr&el=desc) (c3d16d1) into [master](https://codecov.io/gh/apache/incubator-pinot/commit/1beaab59b73f26c4e35f3b9bc856b03806cddf5a?el=desc) (1beaab5) will **decrease** coverage by `21.98%`.
> The diff coverage is `38.69%`.
[![Impacted file tree graph](https://codecov.io/gh/apache/incubator-pinot/pull/6396/graphs/tree.svg?width=650&height=150&src=pr&token=4ibza2ugkz)](https://codecov.io/gh/apache/incubator-pinot/pull/6396?src=pr&el=tree)
```diff
@@             Coverage Diff             @@
##           master    #6396       +/-   ##
===========================================
- Coverage   66.44%   44.46%   -21.99%
===========================================
  Files        1075     1314      +239
  Lines       54773    63652     +8879
  Branches     8168     9258     +1090
===========================================
- Hits        36396    28305     -8091
- Misses      15700    32991    +17291
+ Partials     2677     2356      -321
```
| Flag | Coverage Δ | |
|---|---|---|
| integration | `44.46% <38.69%> (?)` | |
Flags with carried forward coverage won't be shown. [Click here](https://docs.codecov.io/docs/carryforward-flags#carryforward-flags-in-the-pull-request-comment) to find out more.
| [Impacted Files](https://codecov.io/gh/apache/incubator-pinot/pull/6396?src=pr&el=tree) | Coverage Δ | |
|---|---|---|
| [...ot/broker/broker/AllowAllAccessControlFactory.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtYnJva2VyL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9icm9rZXIvYnJva2VyL0FsbG93QWxsQWNjZXNzQ29udHJvbEZhY3RvcnkuamF2YQ==) | `100.00% <ø> (ø)` | |
| [.../helix/BrokerUserDefinedMessageHandlerFactory.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtYnJva2VyL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9icm9rZXIvYnJva2VyL2hlbGl4L0Jyb2tlclVzZXJEZWZpbmVkTWVzc2FnZUhhbmRsZXJGYWN0b3J5LmphdmE=) | `52.83% <0.00%> (-13.84%)` | :arrow_down: |
| [...org/apache/pinot/broker/queryquota/HitCounter.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtYnJva2VyL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9icm9rZXIvcXVlcnlxdW90YS9IaXRDb3VudGVyLmphdmE=) | `0.00% <0.00%> (-100.00%)` | :arrow_down: |
| [...che/pinot/broker/queryquota/MaxHitRateTracker.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtYnJva2VyL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9icm9rZXIvcXVlcnlxdW90YS9NYXhIaXRSYXRlVHJhY2tlci5qYXZh) | `0.00% <0.00%> (ø)` | |
| [...ache/pinot/broker/queryquota/QueryQuotaEntity.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtYnJva2VyL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9icm9rZXIvcXVlcnlxdW90YS9RdWVyeVF1b3RhRW50aXR5LmphdmE=) | `0.00% <0.00%> (-50.00%)` | :arrow_down: |
| [...ker/routing/instanceselector/InstanceSelector.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtYnJva2VyL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9icm9rZXIvcm91dGluZy9pbnN0YW5jZXNlbGVjdG9yL0luc3RhbmNlU2VsZWN0b3IuamF2YQ==) | `100.00% <ø> (ø)` | |
| [...ceselector/StrictReplicaGroupInstanceSelector.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtYnJva2VyL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9icm9rZXIvcm91dGluZy9pbnN0YW5jZXNlbGVjdG9yL1N0cmljdFJlcGxpY2FHcm91cEluc3RhbmNlU2VsZWN0b3IuamF2YQ==) | `0.00% <0.00%> (ø)` | |
| [...roker/routing/segmentpruner/TimeSegmentPruner.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtYnJva2VyL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9icm9rZXIvcm91dGluZy9zZWdtZW50cHJ1bmVyL1RpbWVTZWdtZW50UHJ1bmVyLmphdmE=) | `0.00% <0.00%> (ø)` | |
| [...roker/routing/segmentpruner/interval/Interval.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtYnJva2VyL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9icm9rZXIvcm91dGluZy9zZWdtZW50cHJ1bmVyL2ludGVydmFsL0ludGVydmFsLmphdmE=) | `0.00% <0.00%> (ø)` | |
| [...r/routing/segmentpruner/interval/IntervalTree.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtYnJva2VyL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9icm9rZXIvcm91dGluZy9zZWdtZW50cHJ1bmVyL2ludGVydmFsL0ludGVydmFsVHJlZS5qYXZh) | `0.00% <0.00%> (ø)` | |
| ... and [1283 more](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree-more) | |
------
[Continue to review full report at Codecov](https://codecov.io/gh/apache/incubator-pinot/pull/6396?src=pr&el=continue).
> **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)
> `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
> Powered by [Codecov](https://codecov.io/gh/apache/incubator-pinot/pull/6396?src=pr&el=footer). Last update [6b43aef...6905653](https://codecov.io/gh/apache/incubator-pinot/pull/6396?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
[GitHub] [incubator-pinot] codecov-io commented on pull request #6396: Adding ImportData sub command in pinot admin
Posted by GitBox <gi...@apache.org>.
codecov-io commented on pull request #6396:
URL: https://github.com/apache/incubator-pinot/pull/6396#issuecomment-752361098
# [Codecov](https://codecov.io/gh/apache/incubator-pinot/pull/6396?src=pr&el=h1) Report
> Merging [#6396](https://codecov.io/gh/apache/incubator-pinot/pull/6396?src=pr&el=desc) (5985dc5) into [master](https://codecov.io/gh/apache/incubator-pinot/commit/1beaab59b73f26c4e35f3b9bc856b03806cddf5a?el=desc) (1beaab5) will **decrease** coverage by `1.02%`.
> The diff coverage is `56.80%`.
[![Impacted file tree graph](https://codecov.io/gh/apache/incubator-pinot/pull/6396/graphs/tree.svg?width=650&height=150&src=pr&token=4ibza2ugkz)](https://codecov.io/gh/apache/incubator-pinot/pull/6396?src=pr&el=tree)
```diff
@@            Coverage Diff            @@
##           master    #6396      +/-   ##
==========================================
- Coverage   66.44%   65.42%   -1.03%
==========================================
  Files        1075     1314     +239
  Lines       54773    63652    +8879
  Branches     8168     9258    +1090
==========================================
+ Hits        36396    41646    +5250
- Misses      15700    19040    +3340
- Partials     2677     2966     +289
```
| Flag | Coverage Δ | |
|---|---|---|
| unittests | `65.42% <56.80%> (?)` | |
Flags with carried forward coverage won't be shown. [Click here](https://docs.codecov.io/docs/carryforward-flags#carryforward-flags-in-the-pull-request-comment) to find out more.
| [Impacted Files](https://codecov.io/gh/apache/incubator-pinot/pull/6396?src=pr&el=tree) | Coverage Δ | |
|---|---|---|
| [...e/pinot/broker/api/resources/PinotBrokerDebug.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtYnJva2VyL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9icm9rZXIvYXBpL3Jlc291cmNlcy9QaW5vdEJyb2tlckRlYnVnLmphdmE=) | `0.00% <0.00%> (-79.32%)` | :arrow_down: |
| [...ot/broker/broker/AllowAllAccessControlFactory.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtYnJva2VyL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9icm9rZXIvYnJva2VyL0FsbG93QWxsQWNjZXNzQ29udHJvbEZhY3RvcnkuamF2YQ==) | `71.42% <ø> (-28.58%)` | :arrow_down: |
| [.../helix/BrokerUserDefinedMessageHandlerFactory.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtYnJva2VyL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9icm9rZXIvYnJva2VyL2hlbGl4L0Jyb2tlclVzZXJEZWZpbmVkTWVzc2FnZUhhbmRsZXJGYWN0b3J5LmphdmE=) | `33.96% <0.00%> (-32.71%)` | :arrow_down: |
| [...ker/routing/instanceselector/InstanceSelector.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtYnJva2VyL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9icm9rZXIvcm91dGluZy9pbnN0YW5jZXNlbGVjdG9yL0luc3RhbmNlU2VsZWN0b3IuamF2YQ==) | `100.00% <ø> (ø)` | |
| [...ava/org/apache/pinot/client/AbstractResultSet.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtY2xpZW50cy9waW5vdC1qYXZhLWNsaWVudC9zcmMvbWFpbi9qYXZhL29yZy9hcGFjaGUvcGlub3QvY2xpZW50L0Fic3RyYWN0UmVzdWx0U2V0LmphdmE=) | `66.66% <0.00%> (+9.52%)` | :arrow_up: |
| [.../main/java/org/apache/pinot/client/Connection.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtY2xpZW50cy9waW5vdC1qYXZhLWNsaWVudC9zcmMvbWFpbi9qYXZhL29yZy9hcGFjaGUvcGlub3QvY2xpZW50L0Nvbm5lY3Rpb24uamF2YQ==) | `35.55% <0.00%> (-13.29%)` | :arrow_down: |
| [...inot/client/JsonAsyncHttpPinotClientTransport.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtY2xpZW50cy9waW5vdC1qYXZhLWNsaWVudC9zcmMvbWFpbi9qYXZhL29yZy9hcGFjaGUvcGlub3QvY2xpZW50L0pzb25Bc3luY0h0dHBQaW5vdENsaWVudFRyYW5zcG9ydC5qYXZh) | `10.90% <0.00%> (-51.10%)` | :arrow_down: |
| [...not/common/assignment/InstancePartitionsUtils.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtY29tbW9uL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9jb21tb24vYXNzaWdubWVudC9JbnN0YW5jZVBhcnRpdGlvbnNVdGlscy5qYXZh) | `73.80% <ø> (+0.63%)` | :arrow_up: |
| [...common/config/tuner/NoOpTableTableConfigTuner.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtY29tbW9uL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9jb21tb24vY29uZmlnL3R1bmVyL05vT3BUYWJsZVRhYmxlQ29uZmlnVHVuZXIuamF2YQ==) | `100.00% <ø> (ø)` | |
| [...ot/common/config/tuner/RealTimeAutoIndexTuner.java](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree#diff-cGlub3QtY29tbW9uL3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9waW5vdC9jb21tb24vY29uZmlnL3R1bmVyL1JlYWxUaW1lQXV0b0luZGV4VHVuZXIuamF2YQ==) | `100.00% <ø> (ø)` | |
| ... and [1140 more](https://codecov.io/gh/apache/incubator-pinot/pull/6396/diff?src=pr&el=tree-more) | |
------
[Continue to review full report at Codecov](https://codecov.io/gh/apache/incubator-pinot/pull/6396?src=pr&el=continue).
> **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)
> `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
> Powered by [Codecov](https://codecov.io/gh/apache/incubator-pinot/pull/6396?src=pr&el=footer). Last update [6b43aef...6905653](https://codecov.io/gh/apache/incubator-pinot/pull/6396?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).