Posted to commits@nifi.apache.org by jo...@apache.org on 2021/12/14 17:15:18 UTC

[nifi] branch support/nifi-1.15 updated (200538a -> 4e88943)

This is an automated email from the ASF dual-hosted git repository.

joewitt pushed a change to branch support/nifi-1.15
in repository https://gitbox.apache.org/repos/asf/nifi.git.


    from 200538a  NIFI-9399 Apply Secure Processing to TransformXml XSLT
     new ffebbab  NIFI-9473 - Upgrade Jackson from 2.12.3 to 2.12.5
     new e66c21a  NIFI-9468 - Bump Kafka client from 2.6.0 to 2.6.3
     new c7be58a  NIFI-9426 Removed unused jackson-mapper-asl from MiNiFi
     new 95159eb  NIFI-9420 Upgraded MiNiFi Guava from 26.0 to 31.0.1
     new 25a2738  NIFI-9419 ParseCEF - Upgraded parcefone and supported empty extensions
     new bd16b93  NIFI-9408 - MiNiFi - remove Ignite dependencies
     new b79714c  NIFI-9396 - MiNiFi - bump junit to 4.13.2
     new 94fee5b  NIFI-9395 - MiNiFi - bump httpclient to 4.5.13
     new 73b3246  NIFI-9394 Removed RequestLogger and TimerFilter
     new 3955b88  NIFI-9393 Set Port Scheduled State for Flow Definitions
     new 7208cc8  Revert "NIFI-9394 Removed RequestLogger and TimerFilter"
     new 534e6ea  NIFI-9260 Making the 'write and rename' behaviour optional for PutHDFS
     new 2273fe5  NIFI-9194: Upsert for Oracle12+
     new 2f9963a  NIFI-9185 Add Avro logical type to SelectHive3QL processor
     new 4e88943  NIFI-5821 Added Engine Name to Script Engine property descriptions

The 15 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 .../provider/nifi/rest/TemplatesIteratorTest.java  |  14 +--
 .../minifi-framework/minifi-framework-core/pom.xml |   5 -
 minifi/pom.xml                                     |  26 +----
 .../apache/nifi/groups/StandardProcessGroup.java   |   7 ++
 .../flow/mapping/NiFiRegistryFlowMapper.java       |  11 +-
 .../flow/mapping/NiFiRegistryFlowMapperTest.java   |   6 +
 .../org/apache/nifi/processors/hadoop/PutHDFS.java |  35 +++++-
 .../apache/nifi/processors/hadoop/PutHDFSTest.java |  74 +++++++++---
 .../apache/nifi/processors/hive/SelectHive3QL.java |  19 +++-
 .../org/apache/nifi/util/hive/HiveJdbcCommon.java  |  72 ++++++++++--
 .../nifi/processors/hive/TestSelectHive3QL.java    | 124 +++++++++++++++++++++
 nifi-nar-bundles/nifi-kafka-bundle/pom.xml         |   2 +-
 .../nifi/script/ScriptingComponentHelper.java      |  15 ++-
 .../script/TestScriptingComponentHelper.java       |  54 +++++++++
 .../apache/nifi/processors/standard/ParseCEF.java  |  20 +++-
 .../standard/db/impl/Oracle12DatabaseAdapter.java  | 105 ++++++++++++++++-
 .../nifi/processors/standard/TestParseCEF.java     |  32 ++++++
 .../db/impl/TestOracle12DatabaseAdapter.java       |  79 +++++++++++++
 nifi-nar-bundles/nifi-standard-bundle/pom.xml      |   2 +-
 pom.xml                                            |   2 +-
 20 files changed, 622 insertions(+), 82 deletions(-)
 create mode 100644 nifi-nar-bundles/nifi-scripting-bundle/nifi-scripting-processors/src/test/java/org/apache/nifi/processors/script/TestScriptingComponentHelper.java

[nifi] 10/15: NIFI-9393 Set Port Scheduled State for Flow Definitions

Posted by jo...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

joewitt pushed a commit to branch support/nifi-1.15
in repository https://gitbox.apache.org/repos/asf/nifi.git

commit 3955b88302b5f60e104c12b656fed64b5169b8b1
Author: exceptionfactory <ex...@apache.org>
AuthorDate: Thu Nov 18 14:04:17 2021 -0600

    NIFI-9393 Set Port Scheduled State for Flow Definitions
    
    - Set Scheduled State for Versioned Port and Versioned Remote Port when mapping Flow Definition
    - Updated StandardProcessGroup to disable a Port when its Scheduled State is DISABLED
    - Updated StandardProcessGroup to set a Remote Port transmitting when its Scheduled State is ENABLED
    
    Signed-off-by: Nathan Gough <th...@gmail.com>
    
    This closes #5534.
---
 .../java/org/apache/nifi/groups/StandardProcessGroup.java     |  7 +++++++
 .../nifi/registry/flow/mapping/NiFiRegistryFlowMapper.java    | 11 +++++++++--
 .../registry/flow/mapping/NiFiRegistryFlowMapperTest.java     |  6 ++++++
 3 files changed, 22 insertions(+), 2 deletions(-)

diff --git a/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-framework-components/src/main/java/org/apache/nifi/groups/StandardProcessGroup.java b/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-framework-components/src/main/java/org/apache/nifi/groups/StandardProcessGroup.java
index 3583444..8618bd7 100644
--- a/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-framework-components/src/main/java/org/apache/nifi/groups/StandardProcessGroup.java
+++ b/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-framework-components/src/main/java/org/apache/nifi/groups/StandardProcessGroup.java
@@ -4930,6 +4930,9 @@ public final class StandardProcessGroup implements ProcessGroup {
         port.setName(name);
         port.setPosition(new Position(proposed.getPosition().getX(), proposed.getPosition().getY()));
         port.setMaxConcurrentTasks(proposed.getConcurrentlySchedulableTaskCount());
+        if (org.apache.nifi.flow.ScheduledState.DISABLED == proposed.getScheduledState()) {
+            port.disable();
+        }
     }
 
     private Port addInputPort(final ProcessGroup destination, final VersionedPort proposed, final String componentIdSeed, final String temporaryName) {
@@ -5185,6 +5188,10 @@ public final class StandardProcessGroup implements ProcessGroup {
         descriptor.setId(generateUuid(proposed.getIdentifier(), rpgId, componentIdSeed));
         descriptor.setName(proposed.getName());
         descriptor.setUseCompression(proposed.isUseCompression());
+
+        final boolean transmitting = org.apache.nifi.flow.ScheduledState.ENABLED == proposed.getScheduledState();
+        descriptor.setTransmitting(transmitting);
+
         return descriptor;
     }
 
diff --git a/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-framework-components/src/main/java/org/apache/nifi/registry/flow/mapping/NiFiRegistryFlowMapper.java b/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-framework-components/src/main/java/org/apache/nifi/registry/flow/mapping/NiFiRegistryFlowMapper.java
index b0b3334..889f3f1 100644
--- a/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-framework-components/src/main/java/org/apache/nifi/registry/flow/mapping/NiFiRegistryFlowMapper.java
+++ b/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-framework-components/src/main/java/org/apache/nifi/registry/flow/mapping/NiFiRegistryFlowMapper.java
@@ -581,6 +581,7 @@ public class NiFiRegistryFlowMapper {
         versionedPort.setName(port.getName());
         versionedPort.setPosition(mapPosition(port.getPosition()));
         versionedPort.setType(PortType.valueOf(port.getConnectableType().name()));
+        versionedPort.setScheduledState(mapScheduledState(port.getScheduledState()));
 
         if (port instanceof PublicPort) {
             versionedPort.setAllowRemoteAccess(true);
@@ -621,8 +622,7 @@ public class NiFiRegistryFlowMapper {
         processor.setSchedulingStrategy(procNode.getSchedulingStrategy().name());
         processor.setStyle(procNode.getStyle());
         processor.setYieldDuration(procNode.getYieldPeriod());
-        processor.setScheduledState(procNode.getScheduledState() == ScheduledState.DISABLED ? org.apache.nifi.flow.ScheduledState.DISABLED
-            : org.apache.nifi.flow.ScheduledState.ENABLED);
+        processor.setScheduledState(mapScheduledState(procNode.getScheduledState()));
 
         return processor;
     }
@@ -664,6 +664,7 @@ public class NiFiRegistryFlowMapper {
         port.setBatchSize(mapBatchSettings(remotePort));
         port.setTargetId(remotePort.getTargetIdentifier());
         port.setComponentType(componentType);
+        port.setScheduledState(mapScheduledState(remotePort.getScheduledState()));
         return port;
     }
 
@@ -730,4 +731,10 @@ public class NiFiRegistryFlowMapper {
         versionedParameter.setValue(descriptor.isSensitive() ? null : parameter.getValue());
         return versionedParameter;
     }
+
+    private org.apache.nifi.flow.ScheduledState mapScheduledState(final ScheduledState scheduledState) {
+         return scheduledState == ScheduledState.DISABLED
+                 ? org.apache.nifi.flow.ScheduledState.DISABLED
+                 : org.apache.nifi.flow.ScheduledState.ENABLED;
+    }
 }
diff --git a/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-framework-core/src/test/java/org/apache/nifi/registry/flow/mapping/NiFiRegistryFlowMapperTest.java b/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-framework-core/src/test/java/org/apache/nifi/registry/flow/mapping/NiFiRegistryFlowMapperTest.java
index bdbf6c1..b5882b6 100644
--- a/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-framework-core/src/test/java/org/apache/nifi/registry/flow/mapping/NiFiRegistryFlowMapperTest.java
+++ b/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-framework-core/src/test/java/org/apache/nifi/registry/flow/mapping/NiFiRegistryFlowMapperTest.java
@@ -34,6 +34,7 @@ import org.apache.nifi.connectable.Size;
 import org.apache.nifi.controller.ControllerService;
 import org.apache.nifi.controller.ProcessorNode;
 import org.apache.nifi.controller.PropertyConfiguration;
+import org.apache.nifi.controller.ScheduledState;
 import org.apache.nifi.controller.label.Label;
 import org.apache.nifi.controller.queue.FlowFileQueue;
 import org.apache.nifi.controller.queue.LoadBalanceCompression;
@@ -436,6 +437,7 @@ public class NiFiRegistryFlowMapperTest {
         prepareComponentAuthorizable(port, processGroupId);
         preparePositionable(port);
         prepareConnectable(port, ConnectableType.valueOf(portType.name()));
+        when(port.getScheduledState()).thenReturn(ScheduledState.RUNNING);
         return port;
     }
 
@@ -532,6 +534,7 @@ public class NiFiRegistryFlowMapperTest {
         prepareComponentAuthorizable(remoteGroupPort, remoteProcessGroup.getIdentifier());
         when(remoteGroupPort.getName()).thenReturn("remotePort" + (counter++));
         when(remoteGroupPort.getRemoteProcessGroup()).thenReturn(remoteProcessGroup);
+        when(remoteGroupPort.getScheduledState()).thenReturn(ScheduledState.DISABLED);
         return remoteGroupPort;
     }
 
@@ -751,6 +754,7 @@ public class NiFiRegistryFlowMapperTest {
             assertEquals(port.getPosition().getY(), versionedPort.getPosition().getY(), 0);
             assertEquals(port.getName(), versionedPort.getName());
             assertEquals(portType, versionedPort.getType());
+            assertEquals(org.apache.nifi.flow.ScheduledState.ENABLED, versionedPort.getScheduledState());
         }
     }
 
@@ -767,6 +771,8 @@ public class NiFiRegistryFlowMapperTest {
             assertEquals(expectedPortGroupIdentifier, versionedRemotePort.getGroupIdentifier());
             assertEquals(remotePort.getName(), versionedRemotePort.getName());
             assertEquals(componentType, versionedRemotePort.getComponentType());
+            assertNotNull(versionedRemotePort.getScheduledState());
+            assertEquals(remotePort.getScheduledState().name(), versionedRemotePort.getScheduledState().name());
         }
     }
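
One nuance worth noting in the mapping above: mapScheduledState collapses every
non-DISABLED state to ENABLED, so only the disabled flag survives a versioned
round trip. A minimal, self-contained sketch of that behavior (the sketch class
is illustrative; the enum values come from the NiFi API):

    import org.apache.nifi.controller.ScheduledState;

    public class ScheduledStateMappingSketch {
        // Mirrors the private mapScheduledState method added to NiFiRegistryFlowMapper above
        static org.apache.nifi.flow.ScheduledState map(final ScheduledState scheduledState) {
            return scheduledState == ScheduledState.DISABLED
                    ? org.apache.nifi.flow.ScheduledState.DISABLED
                    : org.apache.nifi.flow.ScheduledState.ENABLED;
        }

        public static void main(final String[] args) {
            System.out.println(map(ScheduledState.RUNNING));   // ENABLED
            System.out.println(map(ScheduledState.STOPPED));   // ENABLED - stopped and running are not distinguished
            System.out.println(map(ScheduledState.DISABLED));  // DISABLED
        }
    }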
 

[nifi] 13/15: NIFI-9194: Upsert for Oracle12+

Posted by jo...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

joewitt pushed a commit to branch support/nifi-1.15
in repository https://gitbox.apache.org/repos/asf/nifi.git

commit 2273fe57660cad05943e018abcbeb2fec8d4325c
Author: Roberto Santos <rs...@gmail.com>
AuthorDate: Sat Sep 4 08:40:16 2021 -0300

    NIFI-9194: Upsert for Oracle12+
    
    Fixes PR #5366. Replaced tab characters with spaces and removed unnecessary comments.
    
    Signed-off-by: Matthew Burgess <ma...@apache.org>
    
    This closes #5366
---
 .../standard/db/impl/Oracle12DatabaseAdapter.java  | 105 ++++++++++++++++++++-
 .../db/impl/TestOracle12DatabaseAdapter.java       |  79 ++++++++++++++++
 2 files changed, 179 insertions(+), 5 deletions(-)

diff --git a/nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/db/impl/Oracle12DatabaseAdapter.java b/nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/db/impl/Oracle12DatabaseAdapter.java
index 18f3ceb..63e7379 100644
--- a/nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/db/impl/Oracle12DatabaseAdapter.java
+++ b/nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/db/impl/Oracle12DatabaseAdapter.java
@@ -16,12 +16,14 @@
  */
 package org.apache.nifi.processors.standard.db.impl;
 
+import java.util.ArrayList;
+import java.util.Collection;
+import java.util.List;
+import java.util.stream.Collectors;
+
 import org.apache.commons.lang3.StringUtils;
 import org.apache.nifi.processors.standard.db.DatabaseAdapter;
 
-/**
- * A database adapter that generates MS SQL Compatible SQL.
- */
 public class Oracle12DatabaseAdapter implements DatabaseAdapter {
     @Override
     public String getName() {
@@ -34,12 +36,14 @@ public class Oracle12DatabaseAdapter implements DatabaseAdapter {
     }
 
     @Override
-    public String getSelectStatement(String tableName, String columnNames, String whereClause, String orderByClause, Long limit, Long offset) {
+    public String getSelectStatement(String tableName, String columnNames, String whereClause, String orderByClause,
+            Long limit, Long offset) {
         return getSelectStatement(tableName, columnNames, whereClause, orderByClause, limit, offset, null);
     }
 
     @Override
-    public String getSelectStatement(String tableName, String columnNames, String whereClause, String orderByClause, Long limit, Long offset, String columnForPartitioning) {
+    public String getSelectStatement(String tableName, String columnNames, String whereClause, String orderByClause,
+            Long limit, Long offset, String columnForPartitioning) {
         if (StringUtils.isEmpty(tableName)) {
             throw new IllegalArgumentException("Table name cannot be null or empty");
         }
@@ -93,4 +97,95 @@ public class Oracle12DatabaseAdapter implements DatabaseAdapter {
     public String getTableAliasClause(String tableName) {
         return tableName;
     }
+
+    @Override
+    public boolean supportsUpsert() {
+        return true;
+    }
+
+    @Override
+    public String getUpsertStatement(String table, List<String> columnNames, Collection<String> uniqueKeyColumnNames)
+            throws IllegalArgumentException {
+        if (StringUtils.isEmpty(table)) {
+            throw new IllegalArgumentException("Table name cannot be null or blank");
+        }
+        if (columnNames == null || columnNames.isEmpty()) {
+            throw new IllegalArgumentException("Column names cannot be null or empty");
+        }
+        if (uniqueKeyColumnNames == null || uniqueKeyColumnNames.isEmpty()) {
+            throw new IllegalArgumentException("Key column names cannot be null or empty");
+        }
+
+        String newValuesAlias = "n";
+
+        String columns = columnNames.stream().collect(Collectors.joining(", ? "));
+
+        columns = "? " + columns;
+
+        List<String> columnsAssignment = getColumnsAssignment(columnNames, newValuesAlias, table);
+
+        List<String> conflictColumnsClause = getConflictColumnsClause(uniqueKeyColumnNames, columnsAssignment, table,
+                newValuesAlias);
+        String conflictClause = "(" + conflictColumnsClause.stream().collect(Collectors.joining(" AND ")) + ")";
+
+        String insertStatement = columnNames.stream().collect(Collectors.joining(", "));
+        String insertValues = newValuesAlias + "."
+                + columnNames.stream().collect(Collectors.joining(", " + newValuesAlias + "."));
+
+        columnsAssignment.removeAll(conflictColumnsClause);
+        String updateStatement = columnsAssignment.stream().collect(Collectors.joining(", "));
+
+        StringBuilder statementStringBuilder = new StringBuilder("MERGE INTO ").append(table).append(" USING (SELECT ")
+                .append(columns).append(" FROM DUAL) ").append(newValuesAlias).append(" ON ").append(conflictClause)
+                .append(" WHEN NOT MATCHED THEN INSERT (").append(insertStatement).append(") VALUES (")
+                .append(insertValues).append(")").append(" WHEN MATCHED THEN UPDATE SET ").append(updateStatement);
+
+        return statementStringBuilder.toString();
+    }
+
+    private List<String> getConflictColumnsClause(Collection<String> uniqueKeyColumnNames, List<String> conflictColumns,
+            String table, String newTableAlias) {
+        List<String> conflictColumnsClause = conflictColumns.stream()
+                .filter(column -> uniqueKeyColumnNames.stream().anyMatch(
+                        uniqueKey -> column.equalsIgnoreCase(getColumnAssignment(table, uniqueKey, newTableAlias))))
+                .collect(Collectors.toList());
+
+        if (conflictColumnsClause.isEmpty()) {
+
+            // Try it with normalized columns
+            conflictColumnsClause = conflictColumns.stream()
+                    .filter((column -> uniqueKeyColumnNames.stream()
+                            .anyMatch(uniqueKey -> normalizeColumnName(column).equalsIgnoreCase(
+                                    normalizeColumnName(getColumnAssignment(table, uniqueKey, newTableAlias))))))
+                    .collect(Collectors.toList());
+        }
+
+        return conflictColumnsClause;
+
+    }
+
+    private String normalizeColumnName(final String colName) {
+        return colName == null ? null : colName.toUpperCase().replace("_", "");
+    }
+
+    private List<String> getColumnsAssignment(Collection<String> columnsNames, String newTableAlias, String table) {
+        List<String> conflictClause = new ArrayList<>();
+
+        for (String columnName : columnsNames) {
+
+            StringBuilder statementStringBuilder = new StringBuilder();
+
+            statementStringBuilder.append(getColumnAssignment(table, columnName, newTableAlias));
+
+            conflictClause.add(statementStringBuilder.toString());
+
+        }
+
+        return conflictClause;
+    }
+
+    private String getColumnAssignment(String table, String columnName, String newTableAlias) {
+        return table + "." + columnName + " = " + newTableAlias + "." + columnName;
+    }
+
 }
diff --git a/nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/test/java/org/apache/nifi/processors/standard/db/impl/TestOracle12DatabaseAdapter.java b/nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/test/java/org/apache/nifi/processors/standard/db/impl/TestOracle12DatabaseAdapter.java
index 2315e98..99d625a 100644
--- a/nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/test/java/org/apache/nifi/processors/standard/db/impl/TestOracle12DatabaseAdapter.java
+++ b/nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/test/java/org/apache/nifi/processors/standard/db/impl/TestOracle12DatabaseAdapter.java
@@ -16,6 +16,15 @@
  */
 package org.apache.nifi.processors.standard.db.impl;
 
+import static org.junit.Assert.assertEquals;
+import static org.junit.Assert.assertTrue;
+import static org.junit.Assert.fail;
+
+import java.util.Arrays;
+import java.util.Collection;
+import java.util.Collections;
+import java.util.List;
+
 import org.apache.nifi.processors.standard.db.DatabaseAdapter;
 import org.junit.Assert;
 import org.junit.Test;
@@ -86,4 +95,74 @@ public class TestOracle12DatabaseAdapter {
         String expected4 = "SELECT some(set),of(columns),that,might,contain,methods,a.* FROM database.tablename";
         Assert.assertEquals(expected4, sql4);
     }
+
+    @Test
+    public void testSupportsUpsert() throws Exception {
+        assertTrue(db.getClass().getSimpleName() + " should support upsert", db.supportsUpsert());
+    }
+
+    @Test
+    public void testGetUpsertStatementWithNullTableName() throws Exception {
+        testGetUpsertStatement(null, Arrays.asList("notEmpty"), Arrays.asList("notEmpty"), new IllegalArgumentException("Table name cannot be null or blank"));
+    }
+
+    @Test
+    public void testGetUpsertStatementWithBlankTableName() throws Exception {
+        testGetUpsertStatement("", Arrays.asList("notEmpty"), Arrays.asList("notEmpty"), new IllegalArgumentException("Table name cannot be null or blank"));
+    }
+
+    @Test
+    public void testGetUpsertStatementWithNullColumnNames() throws Exception {
+        testGetUpsertStatement("notEmpty", null, Arrays.asList("notEmpty"), new IllegalArgumentException("Column names cannot be null or empty"));
+    }
+
+    @Test
+    public void testGetUpsertStatementWithEmptyColumnNames() throws Exception {
+        testGetUpsertStatement("notEmpty", Collections.emptyList(), Arrays.asList("notEmpty"), new IllegalArgumentException("Column names cannot be null or empty"));
+    }
+
+    @Test
+    public void testGetUpsertStatementWithNullKeyColumnNames() throws Exception {
+        testGetUpsertStatement("notEmpty", Arrays.asList("notEmpty"), null, new IllegalArgumentException("Key column names cannot be null or empty"));
+    }
+
+    @Test
+    public void testGetUpsertStatementWithEmptyKeyColumnNames() throws Exception {
+        testGetUpsertStatement("notEmpty", Arrays.asList("notEmpty"), Collections.emptyList(), new IllegalArgumentException("Key column names cannot be null or empty"));
+    }
+
+    @Test
+    public void testGetUpsertStatement() throws Exception {
+        // GIVEN
+        String tableName = "table";
+        List<String> columnNames = Arrays.asList("column1","column2", "column3", "column4");
+        Collection<String> uniqueKeyColumnNames = Arrays.asList("column2","column4");
+
+        String expected = "MERGE INTO table USING (SELECT ? column1, ? column2, ? column3, ? column4 FROM DUAL) n" +
+        " ON (table.column2 = n.column2 AND table.column4 = n.column4) WHEN NOT MATCHED THEN" +
+        " INSERT (column1, column2, column3, column4) VALUES (n.column1, n.column2, n.column3, n.column4)" +
+        " WHEN MATCHED THEN UPDATE SET table.column1 = n.column1, table.column3 = n.column3";
+
+        // WHEN
+        // THEN
+        testGetUpsertStatement(tableName, columnNames, uniqueKeyColumnNames, expected);
+    }
+
+    private void testGetUpsertStatement(String tableName, List<String> columnNames, Collection<String> uniqueKeyColumnNames, IllegalArgumentException expected) {
+        try {
+            testGetUpsertStatement(tableName, columnNames, uniqueKeyColumnNames, (String)null);
+            fail();
+        } catch (IllegalArgumentException e) {
+            assertEquals(expected.getMessage(), e.getMessage());
+        }
+    }
+
+    private void testGetUpsertStatement(String tableName, List<String> columnNames, Collection<String> uniqueKeyColumnNames, String expected) {
+        // WHEN
+        String actual = db.getUpsertStatement(tableName, columnNames, uniqueKeyColumnNames);
+
+        // THEN
+        assertEquals(expected, actual);
+    }
+
 }
\ No newline at end of file
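
For context on how the generated MERGE statement is consumed, a hypothetical
JDBC sketch follows. The adapter and the resulting SQL come from the commit
above; the table, columns, and dataSource are illustrative (in NiFi this
statement is normally driven by a processor such as PutDatabaseRecord):

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.util.Arrays;
    import java.util.Collections;

    import javax.sql.DataSource;

    import org.apache.nifi.processors.standard.db.DatabaseAdapter;
    import org.apache.nifi.processors.standard.db.impl.Oracle12DatabaseAdapter;

    public class OracleUpsertSketch {
        // dataSource is assumed to exist; obtaining one is outside the scope of this sketch
        static void upsertUser(final DataSource dataSource) throws Exception {
            final DatabaseAdapter adapter = new Oracle12DatabaseAdapter();
            final String sql = adapter.getUpsertStatement(
                    "users",                              // hypothetical table
                    Arrays.asList("id", "name"),          // all columns
                    Collections.singletonList("id"));     // unique key column(s)
            // Generated SQL:
            // MERGE INTO users USING (SELECT ? id, ? name FROM DUAL) n
            //   ON (users.id = n.id)
            //   WHEN NOT MATCHED THEN INSERT (id, name) VALUES (n.id, n.name)
            //   WHEN MATCHED THEN UPDATE SET users.name = n.name
            try (Connection conn = dataSource.getConnection();
                 PreparedStatement ps = conn.prepareStatement(sql)) {
                ps.setLong(1, 42L);        // binds the ? for id in the USING subquery
                ps.setString(2, "Alice");  // binds the ? for name
                ps.executeUpdate();
            }
        }
    }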

[nifi] 08/15: NIFI-9395 - MiNiFi - bump httpclient to 4.5.13

Posted by jo...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

joewitt pushed a commit to branch support/nifi-1.15
in repository https://gitbox.apache.org/repos/asf/nifi.git

commit 94fee5b289ad86d8d7d56f4f3e52c23dab00d5e7
Author: Pierre Villard <pi...@gmail.com>
AuthorDate: Fri Nov 19 12:54:52 2021 +0100

    NIFI-9395 - MiNiFi - bump httpclient to 4.5.13
    
    Signed-off-by: Matthew Burgess <ma...@apache.org>
    
    This closes #5537
---
 minifi/pom.xml | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/minifi/pom.xml b/minifi/pom.xml
index 54d4f32..4fc3c28 100644
--- a/minifi/pom.xml
+++ b/minifi/pom.xml
@@ -580,7 +580,7 @@ limitations under the License.
             <dependency>
                 <groupId>org.apache.httpcomponents</groupId>
                 <artifactId>httpclient</artifactId>
-                <version>4.5.3</version>
+                <version>4.5.13</version>
             </dependency>
             <dependency>
                 <groupId>javax.mail</groupId>

[nifi] 04/15: NIFI-9420 Upgraded MiNiFi Guava from 26.0 to 31.0.1

Posted by jo...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

joewitt pushed a commit to branch support/nifi-1.15
in repository https://gitbox.apache.org/repos/asf/nifi.git

commit 95159ebe4823e401aad57b2669f50be798a47d38
Author: exceptionfactory <ex...@apache.org>
AuthorDate: Mon Nov 29 11:51:47 2021 -0600

    NIFI-9420 Upgraded MiNiFi Guava from 26.0 to 31.0.1
    
    - Replaced usage of Guava Lists with standard Java classes in TemplatesIteratorTest
    
    Signed-off-by: Matthew Burgess <ma...@apache.org>
    
    This closes #5556
---
 .../c2/provider/nifi/rest/TemplatesIteratorTest.java       | 14 +++++++-------
 minifi/pom.xml                                             |  2 +-
 2 files changed, 8 insertions(+), 8 deletions(-)

diff --git a/minifi/minifi-c2/minifi-c2-provider/minifi-c2-provider-nifi-rest/src/test/java/org/apache/nifi/minifi/c2/provider/nifi/rest/TemplatesIteratorTest.java b/minifi/minifi-c2/minifi-c2-provider/minifi-c2-provider-nifi-rest/src/test/java/org/apache/nifi/minifi/c2/provider/nifi/rest/TemplatesIteratorTest.java
index c8e19b2..3e2fe9d 100644
--- a/minifi/minifi-c2/minifi-c2-provider/minifi-c2-provider-nifi-rest/src/test/java/org/apache/nifi/minifi/c2/provider/nifi/rest/TemplatesIteratorTest.java
+++ b/minifi/minifi-c2/minifi-c2-provider/minifi-c2-provider-nifi-rest/src/test/java/org/apache/nifi/minifi/c2/provider/nifi/rest/TemplatesIteratorTest.java
@@ -18,7 +18,6 @@
 package org.apache.nifi.minifi.c2.provider.nifi.rest;
 
 import com.fasterxml.jackson.core.JsonFactory;
-import com.google.common.collect.Lists;
 import org.apache.nifi.minifi.c2.api.ConfigurationProviderException;
 import org.apache.nifi.minifi.c2.api.util.Pair;
 import org.apache.nifi.minifi.c2.provider.util.HttpConnector;
@@ -27,6 +26,7 @@ import org.junit.Test;
 
 import java.io.IOException;
 import java.net.HttpURLConnection;
+import java.util.ArrayList;
 import java.util.List;
 import java.util.NoSuchElementException;
 
@@ -64,9 +64,9 @@ public class TemplatesIteratorTest {
     @Test
     public void testIteratorNoTemplates() throws ConfigurationProviderException, IOException {
         when(httpURLConnection.getInputStream()).thenReturn(TemplatesIteratorTest.class.getClassLoader().getResourceAsStream("noTemplates.json"));
-        List<Pair<String, String>> idToNameList;
+        List<Pair<String, String>> idToNameList = new ArrayList<>();
         try (TemplatesIterator templatesIterator = new TemplatesIterator(httpConnector, jsonFactory)) {
-            idToNameList = Lists.newArrayList(templatesIterator);
+            templatesIterator.forEachRemaining(idToNameList::add);
         }
         assertEquals(0, idToNameList.size());
 
@@ -76,9 +76,9 @@ public class TemplatesIteratorTest {
     @Test
     public void testIteratorSingleTemplate() throws ConfigurationProviderException, IOException {
         when(httpURLConnection.getInputStream()).thenReturn(TemplatesIteratorTest.class.getClassLoader().getResourceAsStream("oneTemplate.json"));
-        List<Pair<String, String>> idToNameList;
+        List<Pair<String, String>> idToNameList = new ArrayList<>();
         try (TemplatesIterator templatesIterator = new TemplatesIterator(httpConnector, jsonFactory)) {
-            idToNameList = Lists.newArrayList(templatesIterator);
+            templatesIterator.forEachRemaining(idToNameList::add);
         }
         assertEquals(1, idToNameList.size());
         Pair<String, String> idNamePair = idToNameList.get(0);
@@ -91,9 +91,9 @@ public class TemplatesIteratorTest {
     @Test
     public void testIteratorTwoTemplates() throws ConfigurationProviderException, IOException {
         when(httpURLConnection.getInputStream()).thenReturn(TemplatesIteratorTest.class.getClassLoader().getResourceAsStream("twoTemplates.json"));
-        List<Pair<String, String>> idToNameList;
+        List<Pair<String, String>> idToNameList = new ArrayList<>();
         try (TemplatesIterator templatesIterator = new TemplatesIterator(httpConnector, jsonFactory)) {
-            idToNameList = Lists.newArrayList(templatesIterator);
+            templatesIterator.forEachRemaining(idToNameList::add);
         }
         assertEquals(2, idToNameList.size());
         Pair<String, String> idNamePair = idToNameList.get(0);
diff --git a/minifi/pom.xml b/minifi/pom.xml
index 0eb96ac..6a6e122 100644
--- a/minifi/pom.xml
+++ b/minifi/pom.xml
@@ -948,7 +948,7 @@ limitations under the License.
             <dependency>
                 <groupId>com.google.guava</groupId>
                 <artifactId>guava</artifactId>
-                <version>26.0-jre</version>
+                <version>31.0.1-jre</version>
             </dependency>
             <dependency>
                 <groupId>com.h2database</groupId>
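
The test change above replaces Guava's Lists.newArrayList(Iterator) with plain
Java 8 collection code. A standalone illustration of the idiom (class and
values are illustrative):

    import java.util.ArrayList;
    import java.util.Arrays;
    import java.util.Iterator;
    import java.util.List;

    public class IteratorToListSketch {
        public static void main(final String[] args) {
            final Iterator<String> iterator = Arrays.asList("a", "b", "c").iterator();

            // Standard-library equivalent of Guava's Lists.newArrayList(iterator)
            final List<String> collected = new ArrayList<>();
            iterator.forEachRemaining(collected::add);

            System.out.println(collected); // [a, b, c]
        }
    }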

[nifi] 11/15: Revert "NIFI-9394 Removed RequestLogger and TimerFilter"

Posted by jo...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

joewitt pushed a commit to branch support/nifi-1.15
in repository https://gitbox.apache.org/repos/asf/nifi.git

commit 7208cc8b2287927a58bc16262114365e1fbe1900
Author: Joe Witt <jo...@apache.org>
AuthorDate: Tue Dec 14 10:07:40 2021 -0700

    Revert "NIFI-9394 Removed RequestLogger and TimerFilter"
    
    This reverts commit 73b32464303bb75e89b01a7c817f059e52f03d5b.
---
 .../src/main/resources/conf/logback.xml            |  3 +
 .../org/apache/nifi/web/filter/RequestLogger.java  | 77 ++++++++++++++++++++++
 .../org/apache/nifi/web/filter/TimerFilter.java    | 72 ++++++++++++++++++++
 .../nifi-web-api/src/main/webapp/WEB-INF/web.xml   | 16 +++++
 .../resources/conf/clustered/node1/logback.xml     |  3 +
 .../resources/conf/clustered/node2/logback.xml     |  3 +
 .../src/test/resources/conf/default/logback.xml    |  3 +
 .../src/test/resources/conf/logback.xml            |  3 +
 .../src/test/resources/upgrade/conf/logback.xml    |  3 +
 9 files changed, 183 insertions(+)

diff --git a/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-resources/src/main/resources/conf/logback.xml b/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-resources/src/main/resources/conf/logback.xml
index e6e50e4..93a9afa 100644
--- a/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-resources/src/main/resources/conf/logback.xml
+++ b/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-resources/src/main/resources/conf/logback.xml
@@ -157,6 +157,9 @@
     <logger name="org.apache.nifi.cluster.authorization" level="INFO" additivity="false">
         <appender-ref ref="USER_FILE"/>
     </logger>
+    <logger name="org.apache.nifi.web.filter.RequestLogger" level="INFO" additivity="false">
+        <appender-ref ref="USER_FILE"/>
+    </logger>
     <logger name="org.apache.nifi.web.api.AccessResource" level="INFO" additivity="false">
         <appender-ref ref="USER_FILE"/>
     </logger>
diff --git a/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-web/nifi-web-api/src/main/java/org/apache/nifi/web/filter/RequestLogger.java b/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-web/nifi-web-api/src/main/java/org/apache/nifi/web/filter/RequestLogger.java
new file mode 100644
index 0000000..bb30a1e
--- /dev/null
+++ b/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-web/nifi-web-api/src/main/java/org/apache/nifi/web/filter/RequestLogger.java
@@ -0,0 +1,77 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.web.filter;
+
+import java.io.IOException;
+
+import javax.servlet.Filter;
+import javax.servlet.FilterChain;
+import javax.servlet.FilterConfig;
+import javax.servlet.ServletException;
+import javax.servlet.ServletRequest;
+import javax.servlet.ServletResponse;
+import javax.servlet.http.HttpServletRequest;
+
+import org.apache.nifi.authorization.user.NiFiUserUtils;
+import org.apache.nifi.logging.NiFiLog;
+import org.apache.nifi.authorization.user.NiFiUser;
+
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+/**
+ * A filter to log requests.
+ *
+ */
+public class RequestLogger implements Filter {
+
+    private static final Logger logger = new NiFiLog(LoggerFactory.getLogger(RequestLogger.class));
+
+    @Override
+    public void doFilter(final ServletRequest req, final ServletResponse resp, final FilterChain filterChain)
+            throws IOException, ServletException {
+
+        final HttpServletRequest request = (HttpServletRequest) req;
+
+        // only log http requests as https requests are logged elsewhere
+        if ("http".equalsIgnoreCase(request.getScheme())) {
+            final NiFiUser user = NiFiUserUtils.getNiFiUser();
+
+            // get the user details for the log message
+            String identity = "<no user found>";
+            if (user != null) {
+                identity = user.getIdentity();
+            }
+
+            // log the request attempt - response details will be logged later
+            logger.info(String.format("Attempting request for (%s) %s %s (source ip: %s)", identity, request.getMethod(),
+                    request.getRequestURL().toString(), request.getRemoteAddr()));
+        }
+
+        // continue the filter chain
+        filterChain.doFilter(req, resp);
+    }
+
+    @Override
+    public void init(final FilterConfig config) {
+    }
+
+    @Override
+    public void destroy() {
+    }
+
+}
diff --git a/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-web/nifi-web-api/src/main/java/org/apache/nifi/web/filter/TimerFilter.java b/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-web/nifi-web-api/src/main/java/org/apache/nifi/web/filter/TimerFilter.java
new file mode 100644
index 0000000..a522fa5
--- /dev/null
+++ b/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-web/nifi-web-api/src/main/java/org/apache/nifi/web/filter/TimerFilter.java
@@ -0,0 +1,72 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.web.filter;
+
+import java.io.IOException;
+import java.util.concurrent.TimeUnit;
+
+import javax.servlet.Filter;
+import javax.servlet.FilterChain;
+import javax.servlet.FilterConfig;
+import javax.servlet.ServletException;
+import javax.servlet.ServletRequest;
+import javax.servlet.ServletResponse;
+import javax.servlet.http.HttpServletRequest;
+
+import org.apache.nifi.cluster.coordination.http.replication.RequestReplicator;
+import org.apache.nifi.logging.NiFiLog;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+/**
+ * A filter to time requests.
+ *
+ */
+public class TimerFilter implements Filter {
+
+    private static final Logger logger = new NiFiLog(LoggerFactory.getLogger(TimerFilter.class));
+
+    @Override
+    public void doFilter(final ServletRequest req, final ServletResponse resp, final FilterChain filterChain)
+            throws IOException, ServletException {
+
+        final HttpServletRequest request = (HttpServletRequest) req;
+
+        final long start = System.nanoTime();
+        try {
+            filterChain.doFilter(req, resp);
+        } finally {
+            final long stop = System.nanoTime();
+            final String requestId = ((HttpServletRequest) req).getHeader(RequestReplicator.REQUEST_TRANSACTION_ID_HEADER);
+            final String replicationHeader = ((HttpServletRequest) req).getHeader(RequestReplicator.REPLICATION_INDICATOR_HEADER);
+            final boolean validationPhase = RequestReplicator.NODE_CONTINUE.equals(replicationHeader);
+            final String requestDescription = validationPhase ? "Validation Phase of Request " + requestId : "Request ID " + requestId;
+
+            logger.debug("{} {} from {} duration for {}: {} millis", request.getMethod(), request.getRequestURL().toString(),
+                req.getRemoteHost(), requestDescription, TimeUnit.MILLISECONDS.convert(stop - start, TimeUnit.NANOSECONDS));
+        }
+    }
+
+    @Override
+    public void init(final FilterConfig config) {
+    }
+
+    @Override
+    public void destroy() {
+    }
+
+}
diff --git a/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-web/nifi-web-api/src/main/webapp/WEB-INF/web.xml b/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-web/nifi-web-api/src/main/webapp/WEB-INF/web.xml
index 894798e..8ec3fa4 100644
--- a/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-web/nifi-web-api/src/main/webapp/WEB-INF/web.xml
+++ b/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-web/nifi-web-api/src/main/webapp/WEB-INF/web.xml
@@ -51,6 +51,14 @@
         <url-pattern>/*</url-pattern>
     </filter-mapping>
     <filter>
+        <filter-name>timer</filter-name>
+        <filter-class>org.apache.nifi.web.filter.TimerFilter</filter-class>
+    </filter>
+    <filter-mapping>
+        <filter-name>timer</filter-name>
+        <url-pattern>/*</url-pattern>
+    </filter-mapping>
+    <filter>
         <filter-name>springSecurityFilterChain</filter-name>
         <filter-class>org.springframework.web.filter.DelegatingFilterProxy</filter-class>
     </filter>
@@ -58,4 +66,12 @@
         <filter-name>springSecurityFilterChain</filter-name>
         <url-pattern>/*</url-pattern>
     </filter-mapping>
+    <filter>
+        <filter-name>requestLogger</filter-name>
+        <filter-class>org.apache.nifi.web.filter.RequestLogger</filter-class>
+    </filter>
+    <filter-mapping>
+        <filter-name>requestLogger</filter-name>
+        <url-pattern>/*</url-pattern>
+    </filter-mapping>
 </web-app>
diff --git a/nifi-system-tests/nifi-system-test-suite/src/test/resources/conf/clustered/node1/logback.xml b/nifi-system-tests/nifi-system-test-suite/src/test/resources/conf/clustered/node1/logback.xml
index 6c3f0bb..ecda00a 100644
--- a/nifi-system-tests/nifi-system-test-suite/src/test/resources/conf/clustered/node1/logback.xml
+++ b/nifi-system-tests/nifi-system-test-suite/src/test/resources/conf/clustered/node1/logback.xml
@@ -145,6 +145,9 @@
     <logger name="org.apache.nifi.cluster.authorization" level="INFO" additivity="false">
         <appender-ref ref="USER_FILE"/>
     </logger>
+    <logger name="org.apache.nifi.web.filter.RequestLogger" level="INFO" additivity="false">
+        <appender-ref ref="USER_FILE"/>
+    </logger>
     <logger name="org.apache.nifi.web.api.AccessResource" level="INFO" additivity="false">
         <appender-ref ref="USER_FILE"/>
     </logger>
diff --git a/nifi-system-tests/nifi-system-test-suite/src/test/resources/conf/clustered/node2/logback.xml b/nifi-system-tests/nifi-system-test-suite/src/test/resources/conf/clustered/node2/logback.xml
index a24bb4b..4de6225 100644
--- a/nifi-system-tests/nifi-system-test-suite/src/test/resources/conf/clustered/node2/logback.xml
+++ b/nifi-system-tests/nifi-system-test-suite/src/test/resources/conf/clustered/node2/logback.xml
@@ -147,6 +147,9 @@
     <logger name="org.apache.nifi.cluster.authorization" level="INFO" additivity="false">
         <appender-ref ref="USER_FILE"/>
     </logger>
+    <logger name="org.apache.nifi.web.filter.RequestLogger" level="INFO" additivity="false">
+        <appender-ref ref="USER_FILE"/>
+    </logger>
     <logger name="org.apache.nifi.web.api.AccessResource" level="INFO" additivity="false">
         <appender-ref ref="USER_FILE"/>
     </logger>
diff --git a/nifi-system-tests/nifi-system-test-suite/src/test/resources/conf/default/logback.xml b/nifi-system-tests/nifi-system-test-suite/src/test/resources/conf/default/logback.xml
index c42b3be..cc69039 100644
--- a/nifi-system-tests/nifi-system-test-suite/src/test/resources/conf/default/logback.xml
+++ b/nifi-system-tests/nifi-system-test-suite/src/test/resources/conf/default/logback.xml
@@ -144,6 +144,9 @@
     <logger name="org.apache.nifi.cluster.authorization" level="INFO" additivity="false">
         <appender-ref ref="USER_FILE"/>
     </logger>
+    <logger name="org.apache.nifi.web.filter.RequestLogger" level="INFO" additivity="false">
+        <appender-ref ref="USER_FILE"/>
+    </logger>
     <logger name="org.apache.nifi.web.api.AccessResource" level="INFO" additivity="false">
         <appender-ref ref="USER_FILE"/>
     </logger>
diff --git a/nifi-toolkit/nifi-toolkit-admin/src/test/resources/conf/logback.xml b/nifi-toolkit/nifi-toolkit-admin/src/test/resources/conf/logback.xml
index c38e0d4..042ee48 100644
--- a/nifi-toolkit/nifi-toolkit-admin/src/test/resources/conf/logback.xml
+++ b/nifi-toolkit/nifi-toolkit-admin/src/test/resources/conf/logback.xml
@@ -133,6 +133,9 @@
     <logger name="org.apache.nifi.cluster.authorization" level="INFO" additivity="false">
         <appender-ref ref="USER_FILE"/>
     </logger>
+    <logger name="org.apache.nifi.web.filter.RequestLogger" level="INFO" additivity="false">
+        <appender-ref ref="USER_FILE"/>
+    </logger>
 
 
     <!--
diff --git a/nifi-toolkit/nifi-toolkit-admin/src/test/resources/upgrade/conf/logback.xml b/nifi-toolkit/nifi-toolkit-admin/src/test/resources/upgrade/conf/logback.xml
index 710f1dc..d3ace7a 100644
--- a/nifi-toolkit/nifi-toolkit-admin/src/test/resources/upgrade/conf/logback.xml
+++ b/nifi-toolkit/nifi-toolkit-admin/src/test/resources/upgrade/conf/logback.xml
@@ -134,6 +134,9 @@
     <logger name="org.apache.nifi.cluster.authorization" level="INFO" additivity="false">
         <appender-ref ref="USER_FILE"/>
     </logger>
+    <logger name="org.apache.nifi.web.filter.RequestLogger" level="INFO" additivity="false">
+        <appender-ref ref="USER_FILE"/>
+    </logger>
 
 
     <!--
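
Since the revert restores RequestLogger and TimerFilter as plain servlet
filters, they can be exercised in isolation with mocked servlet objects. A
hedged sketch using Mockito and JUnit 4 (this test class is ours, not part of
the commit):

    import static org.mockito.Mockito.mock;
    import static org.mockito.Mockito.verify;
    import static org.mockito.Mockito.when;

    import javax.servlet.FilterChain;
    import javax.servlet.ServletResponse;
    import javax.servlet.http.HttpServletRequest;

    import org.apache.nifi.web.filter.RequestLogger;
    import org.junit.Test;

    public class RequestLoggerSketchTest {
        @Test
        public void testChainContinues() throws Exception {
            final HttpServletRequest request = mock(HttpServletRequest.class);
            final ServletResponse response = mock(ServletResponse.class);
            final FilterChain chain = mock(FilterChain.class);
            // https requests skip the log statement entirely (they are logged elsewhere)
            when(request.getScheme()).thenReturn("https");

            new RequestLogger().doFilter(request, response, chain);

            // Whatever the scheme, the filter must always continue the chain
            verify(chain).doFilter(request, response);
        }
    }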

[nifi] 14/15: NIFI-9185 Add Avro logical type to SelectHive3QL processor

Posted by jo...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

joewitt pushed a commit to branch support/nifi-1.15
in repository https://gitbox.apache.org/repos/asf/nifi.git

commit 2f9963a534d55df2608fe4efaa0841bee7bba393
Author: Timea Barna <ti...@gmail.com>
AuthorDate: Tue Aug 31 13:44:34 2021 +0200

    NIFI-9185 Add Avro logical type to SelectHive3QL processor
    
    Modifying unit test to avoid system-default timezone usage
    
    NIFI-9185 Applying review recommendations, removing duplicate dependency from pom.xml
    
    Signed-off-by: Matthew Burgess <ma...@apache.org>
    
    This closes #5358
---
 .../apache/nifi/processors/hive/SelectHive3QL.java |  19 +++-
 .../org/apache/nifi/util/hive/HiveJdbcCommon.java  |  72 ++++++++++--
 .../nifi/processors/hive/TestSelectHive3QL.java    | 124 +++++++++++++++++++++
 3 files changed, 202 insertions(+), 13 deletions(-)

diff --git a/nifi-nar-bundles/nifi-hive-bundle/nifi-hive3-processors/src/main/java/org/apache/nifi/processors/hive/SelectHive3QL.java b/nifi-nar-bundles/nifi-hive-bundle/nifi-hive3-processors/src/main/java/org/apache/nifi/processors/hive/SelectHive3QL.java
index f124c73..af87bd1 100644
--- a/nifi-nar-bundles/nifi-hive-bundle/nifi-hive3-processors/src/main/java/org/apache/nifi/processors/hive/SelectHive3QL.java
+++ b/nifi-nar-bundles/nifi-hive-bundle/nifi-hive3-processors/src/main/java/org/apache/nifi/processors/hive/SelectHive3QL.java
@@ -236,6 +236,21 @@ public class SelectHive3QL extends AbstractHive3QLProcessor {
             .expressionLanguageSupported(ExpressionLanguageScope.NONE)
             .build();
 
+    public static final PropertyDescriptor USE_AVRO_LOGICAL_TYPES = new PropertyDescriptor.Builder()
+            .name("use-logical-types")
+            .displayName("Use Avro Logical Types")
+            .description("Whether to use Avro Logical Types for DECIMAL, DATE and TIMESTAMP columns. "
+                    + "If disabled, written as string. "
+                    + "If enabled, Logical types are used and written as its underlying type, specifically, "
+                    + "DECIMAL as logical 'decimal': written as bytes with additional precision and scale meta data, "
+                    + "DATE as logical 'date': written as int denoting days since Unix epoch (1970-01-01), "
+                    + "and TIMESTAMP as logical 'timestamp-millis': written as long denoting milliseconds since Unix epoch. "
+                    + "If a reader of written Avro records also knows these logical types, then these values can be deserialized with more context depending on reader implementation.")
+            .allowableValues("true", "false")
+            .defaultValue("false")
+            .required(true)
+            .build();
+
     private final static List<PropertyDescriptor> propertyDescriptors;
     private final static Set<Relationship> relationships;
 
@@ -255,6 +270,7 @@ public class SelectHive3QL extends AbstractHive3QLProcessor {
         _propertyDescriptors.add(MAX_FRAGMENTS);
         _propertyDescriptors.add(HIVEQL_OUTPUT_FORMAT);
         _propertyDescriptors.add(NORMALIZE_NAMES_FOR_AVRO);
+        _propertyDescriptors.add(USE_AVRO_LOGICAL_TYPES);
         _propertyDescriptors.add(HIVEQL_CSV_HEADER);
         _propertyDescriptors.add(HIVEQL_CSV_ALT_HEADER);
         _propertyDescriptors.add(HIVEQL_CSV_DELIMITER);
@@ -344,6 +360,7 @@ public class SelectHive3QL extends AbstractHive3QLProcessor {
         final String delimiter = context.getProperty(HIVEQL_CSV_DELIMITER).evaluateAttributeExpressions(fileToProcess).getValue();
         final boolean quote = context.getProperty(HIVEQL_CSV_QUOTE).asBoolean();
         final boolean escape = context.getProperty(HIVEQL_CSV_HEADER).asBoolean();
+        final boolean useLogicalTypes = context.getProperty(USE_AVRO_LOGICAL_TYPES).asBoolean();
         final String fragmentIdentifier = UUID.randomUUID().toString();
 
         try (final Connection con = dbcpService.getConnection(fileToProcess == null ? Collections.emptyMap() : fileToProcess.getAttributes());
@@ -411,7 +428,7 @@ public class SelectHive3QL extends AbstractHive3QLProcessor {
                         flowfile = session.write(flowfile, out -> {
                             try {
                                 if (AVRO.equals(outputFormat)) {
-                                    nrOfRows.set(HiveJdbcCommon.convertToAvroStream(resultSet, out, maxRowsPerFlowFile, convertNamesForAvro));
+                                    nrOfRows.set(HiveJdbcCommon.convertToAvroStream(resultSet, out, maxRowsPerFlowFile, convertNamesForAvro, useLogicalTypes));
                                 } else if (CSV.equals(outputFormat)) {
                                     CsvOutputOptions options = new CsvOutputOptions(header, altHeader, delimiter, quote, escape, maxRowsPerFlowFile);
                                     nrOfRows.set(HiveJdbcCommon.convertToCsvStream(resultSet, out, options));
diff --git a/nifi-nar-bundles/nifi-hive-bundle/nifi-hive3-processors/src/main/java/org/apache/nifi/util/hive/HiveJdbcCommon.java b/nifi-nar-bundles/nifi-hive-bundle/nifi-hive3-processors/src/main/java/org/apache/nifi/util/hive/HiveJdbcCommon.java
index 09eecce..2a704c0 100644
--- a/nifi-nar-bundles/nifi-hive-bundle/nifi-hive3-processors/src/main/java/org/apache/nifi/util/hive/HiveJdbcCommon.java
+++ b/nifi-nar-bundles/nifi-hive-bundle/nifi-hive3-processors/src/main/java/org/apache/nifi/util/hive/HiveJdbcCommon.java
@@ -16,6 +16,7 @@
  */
 package org.apache.nifi.util.hive;
 
+import org.apache.avro.LogicalTypes;
 import org.apache.avro.Schema;
 import org.apache.avro.SchemaBuilder;
 import org.apache.avro.SchemaBuilder.FieldAssembler;
@@ -29,6 +30,7 @@ import org.apache.commons.lang3.StringUtils;
 import org.apache.hadoop.conf.Configuration;
 import org.apache.hadoop.fs.Path;
 import org.apache.hadoop.hive.conf.HiveConf;
+import org.apache.nifi.avro.AvroTypeUtil;
 import org.apache.nifi.components.PropertyDescriptor;
 
 import java.io.IOException;
@@ -88,6 +90,10 @@ public class HiveJdbcCommon {
     public static final String MIME_TYPE_AVRO_BINARY = "application/avro-binary";
     public static final String CSV_MIME_TYPE = "text/csv";
 
+    private static final Schema TIMESTAMP_MILLIS_SCHEMA = LogicalTypes.timestampMillis().addToSchema(Schema.create(Schema.Type.LONG));
+    private static final Schema DATE_SCHEMA = LogicalTypes.date().addToSchema(Schema.create(Schema.Type.INT));
+    private static final int DEFAULT_PRECISION = 10;
+    private static final int DEFAULT_SCALE = 0;
 
     public static final PropertyDescriptor NORMALIZE_NAMES_FOR_AVRO = new PropertyDescriptor.Builder()
             .name("hive-normalize-avro")
@@ -99,14 +105,15 @@ public class HiveJdbcCommon {
             .required(true)
             .build();
 
-    public static long convertToAvroStream(final ResultSet rs, final OutputStream outStream, final int maxRows, boolean convertNames) throws SQLException, IOException {
-        return convertToAvroStream(rs, outStream, null, maxRows, convertNames, null);
+    public static long convertToAvroStream(final ResultSet rs, final OutputStream outStream, final int maxRows, boolean convertNames, final boolean useLogicalTypes) throws SQLException, IOException {
+        return convertToAvroStream(rs, outStream, null, maxRows, convertNames, null, useLogicalTypes);
     }
 
 
-    public static long convertToAvroStream(final ResultSet rs, final OutputStream outStream, String recordName, final int maxRows, boolean convertNames, ResultSetRowCallback callback)
+    public static long convertToAvroStream(final ResultSet rs, final OutputStream outStream, String recordName, final int maxRows, boolean convertNames,
+                                           ResultSetRowCallback callback, final boolean useLogicalTypes)
             throws SQLException, IOException {
-        final Schema schema = createSchema(rs, recordName, convertNames);
+        final Schema schema = createSchema(rs, recordName, convertNames, useLogicalTypes);
         final GenericRecord rec = new GenericData.Record(schema);
 
         final DatumWriter<GenericRecord> datumWriter = new GenericDatumWriter<>(schema);
@@ -149,7 +156,16 @@ public class HiveJdbcCommon {
                         // org.apache.avro.AvroRuntimeException: Unknown datum type java.lang.Byte
                         rec.put(i - 1, ((Byte) value).intValue());
 
-                    } else if (value instanceof BigDecimal || value instanceof BigInteger) {
+                    } else if (value instanceof BigDecimal) {
+                        if (useLogicalTypes) {
+                            final int precision = getPrecision(meta.getPrecision(i));
+                            final int scale = getScale(meta.getScale(i));
+                            rec.put(i - 1, AvroTypeUtil.convertToAvroObject(value, LogicalTypes.decimal(precision, scale).addToSchema(Schema.create(Schema.Type.BYTES))));
+                        } else {
+                            rec.put(i - 1, value.toString());
+                        }
+
+                    } else if (value instanceof BigInteger) {
                         // Avro can't handle BigDecimal and BigInteger as numbers - it will throw an AvroRuntimeException such as: "Unknown datum type: java.math.BigDecimal: 38"
                         rec.put(i - 1, value.toString());
 
@@ -170,10 +186,14 @@ public class HiveJdbcCommon {
                         rec.put(i - 1, value);
                     } else if (value instanceof java.sql.SQLXML) {
                         rec.put(i - 1, ((java.sql.SQLXML) value).getString());
+                    } else if (useLogicalTypes && javaSqlType == DATE) {
+                        rec.put(i - 1, AvroTypeUtil.convertToAvroObject(value, DATE_SCHEMA));
+                    } else if (useLogicalTypes && javaSqlType == TIMESTAMP) {
+                        rec.put(i - 1, AvroTypeUtil.convertToAvroObject(value, TIMESTAMP_MILLIS_SCHEMA));
                     } else {
                         // The different types that we support are numbers (int, long, double, float),
-                        // as well as boolean values and Strings. Since Avro doesn't provide
-                        // timestamp types, we want to convert those to Strings. So we will cast anything other
+                        // as well as boolean values, decimal, date, timestamp and Strings. Since Avro doesn't provide
+                        // a time type, we want to convert those to Strings. So we will cast anything other
                         // than numbers or booleans to strings by using the toString() method.
                         rec.put(i - 1, value.toString());
                     }
@@ -190,7 +210,7 @@ public class HiveJdbcCommon {
     }
 
     public static Schema createSchema(final ResultSet rs, boolean convertNames) throws SQLException {
-        return createSchema(rs, null, false);
+        return createSchema(rs, null, false, false);
     }
 
     /**
@@ -203,7 +223,7 @@ public class HiveJdbcCommon {
      * @return A Schema object representing the result set converted to an Avro record
      * @throws SQLException if any error occurs during conversion
      */
-    public static Schema createSchema(final ResultSet rs, String recordName, boolean convertNames) throws SQLException {
+    public static Schema createSchema(final ResultSet rs, String recordName, boolean convertNames, final boolean useLogicalTypes) throws SQLException {
         final ResultSetMetaData meta = rs.getMetaData();
         final int nrOfColumns = meta.getColumnCount();
         String tableName = StringUtils.isEmpty(recordName) ? "NiFi_SelectHiveQL_Record" : recordName;
@@ -298,14 +318,32 @@ public class HiveJdbcCommon {
                 // Did not find direct suitable type, need to be clarified!!!!
                 case DECIMAL:
                 case NUMERIC:
-                    builder.name(columnName).type().unionOf().nullBuilder().endNull().and().stringType().endUnion().noDefault();
+                    if (useLogicalTypes) {
+                        final int precision = getPrecision(meta.getPrecision(i));
+                        final int scale = getScale(meta.getScale(i));
+                        builder.name(columnName).type().unionOf().nullBuilder().endNull().and()
+                                .type(LogicalTypes.decimal(precision, scale).addToSchema(Schema.create(Schema.Type.BYTES))).endUnion().noDefault();
+                    } else {
+                        builder.name(columnName).type().unionOf().nullBuilder().endNull().and().stringType().endUnion().noDefault();
+                    }
                     break;
 
-                // Did not find direct suitable type, need to be clarified!!!!
+                // Dates were introduced in Hive 0.12.0
                 case DATE:
+                    if (useLogicalTypes) {
+                        builder.name(columnName).type().unionOf().nullBuilder().endNull().and().type(DATE_SCHEMA).endUnion().noDefault();
+                    } else {
+                        builder.name(columnName).type().unionOf().nullBuilder().endNull().and().stringType().endUnion().noDefault();
+                    }
+                    break;
+                // Did not find direct suitable type, need to be clarified!!!!
                 case TIME:
                 case TIMESTAMP:
-                    builder.name(columnName).type().unionOf().nullBuilder().endNull().and().stringType().endUnion().noDefault();
+                    if (useLogicalTypes) {
+                        builder.name(columnName).type().unionOf().nullBuilder().endNull().and().type(TIMESTAMP_MILLIS_SCHEMA).endUnion().noDefault();
+                    } else {
+                        builder.name(columnName).type().unionOf().nullBuilder().endNull().and().stringType().endUnion().noDefault();
+                    }
                     break;
 
                 case BINARY:
@@ -461,4 +499,14 @@ public class HiveJdbcCommon {
         }
         return hiveConfig;
     }
+
+    // If the result set contains an invalid precision value, use the Hive default precision.
+    private static int getPrecision(int precision) {
+        return precision > 1 ? precision : DEFAULT_PRECISION;
+    }
+
+    // If the result set contains an invalid scale value, use the Hive default scale.
+    private static int getScale(int scale) {
+        return scale > 0 ? scale : DEFAULT_SCALE;
+    }
 }
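
For readers following the schema changes above, the sketch below shows, outside of any diff context, how the three Avro logical types involved are constructed: SQL DECIMAL/NUMERIC maps to bytes with a decimal logical type, DATE to int with a date logical type, and TIMESTAMP to long with timestamp-millis. It is a minimal standalone example; the record and field names are illustrative and not part of the patch.

    import org.apache.avro.LogicalTypes;
    import org.apache.avro.Schema;
    import org.apache.avro.SchemaBuilder;

    public class LogicalTypeSchemaSketch {
        public static void main(String[] args) {
            // Same mappings as the useLogicalTypes branch of createSchema():
            final Schema decimalSchema = LogicalTypes.decimal(10, 2).addToSchema(Schema.create(Schema.Type.BYTES));
            final Schema dateSchema = LogicalTypes.date().addToSchema(Schema.create(Schema.Type.INT));
            final Schema timestampSchema = LogicalTypes.timestampMillis().addToSchema(Schema.create(Schema.Type.LONG));

            // Each column becomes a nullable union, exactly as the builder chains above do.
            final Schema record = SchemaBuilder.record("Example").fields()
                    .name("BIG_NUMBER").type().unionOf().nullBuilder().endNull().and().type(decimalSchema).endUnion().noDefault()
                    .name("BIRTH_DATE").type().unionOf().nullBuilder().endNull().and().type(dateSchema).endUnion().noDefault()
                    .name("CREATED_ON").type().unionOf().nullBuilder().endNull().and().type(timestampSchema).endUnion().noDefault()
                    .endRecord();

            System.out.println(record.toString(true));
        }
    }
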
diff --git a/nifi-nar-bundles/nifi-hive-bundle/nifi-hive3-processors/src/test/java/org/apache/nifi/processors/hive/TestSelectHive3QL.java b/nifi-nar-bundles/nifi-hive-bundle/nifi-hive3-processors/src/test/java/org/apache/nifi/processors/hive/TestSelectHive3QL.java
index 356106f..d319742 100644
--- a/nifi-nar-bundles/nifi-hive-bundle/nifi-hive3-processors/src/test/java/org/apache/nifi/processors/hive/TestSelectHive3QL.java
+++ b/nifi-nar-bundles/nifi-hive-bundle/nifi-hive3-processors/src/test/java/org/apache/nifi/processors/hive/TestSelectHive3QL.java
@@ -16,6 +16,9 @@
  */
 package org.apache.nifi.processors.hive;
 
+import org.apache.avro.LogicalTypes;
+import org.apache.avro.Schema;
+import org.apache.avro.SchemaBuilder;
 import org.apache.avro.file.DataFileStream;
 import org.apache.avro.generic.GenericDatumReader;
 import org.apache.avro.generic.GenericRecord;
@@ -44,11 +47,15 @@ import java.io.File;
 import java.io.IOException;
 import java.io.InputStream;
 import java.io.InputStreamReader;
+import java.math.BigDecimal;
+import java.nio.ByteBuffer;
 import java.sql.Connection;
 import java.sql.DriverManager;
 import java.sql.SQLException;
 import java.sql.Statement;
+import java.sql.Timestamp;
 import java.sql.Types;
+import java.time.LocalDate;
 import java.util.HashMap;
 import java.util.List;
 import java.util.Map;
@@ -67,6 +74,11 @@ public class TestSelectHive3QL {
     private static final Logger LOGGER;
     private final static String MAX_ROWS_KEY = "maxRows";
     private final int NUM_OF_ROWS = 100;
+    private static final int ID = 1;
+    private static final String NAME = "Joe Smith";
+    private static final String BIRTH_DATE = "1956-11-22";
+    private static final String BIG_NUMBER = "12345678.12";
+    private static final String CREATED_ON = "1962-09-23 03:23:34.234";
 
 
     static {
@@ -690,6 +702,111 @@ public class TestSelectHive3QL {
         runner.clearTransferState();
     }
 
+    @Test
+    public void testAvroRecordCreatedWithoutLogicalTypesByDefault() throws SQLException, IOException {
+        final Schema expectedSchema = SchemaBuilder.record("NiFi_SelectHiveQL_Record").namespace("any.data").fields()
+                .name("ID").type().unionOf().nullBuilder().endNull().and().intType().endUnion().noDefault()
+                .name("NAME").type().unionOf().nullBuilder().endNull().and().stringType().endUnion().noDefault()
+                .name("BIRTH_DATE").type().unionOf().nullBuilder().endNull().and().stringType().endUnion().noDefault()
+                .name("BIG_NUMBER").type().unionOf().nullBuilder().endNull().and().stringType().endUnion().noDefault()
+                .name("CREATED_ON").type().unionOf().nullBuilder().endNull().and().stringType().endUnion().noDefault()
+                .endRecord();
+
+        // load test data to database
+        final Connection con = ((DBCPService) runner.getControllerService("dbcp")).getConnection();
+        final Statement stmt = con.createStatement();
+        final InputStream in;
+        final MockFlowFile mff;
+
+        try {
+            stmt.execute("drop table TEST_QUERY_DB_TABLE");
+        } catch (final SQLException sqle) {
+            // Ignore this error, probably a "table does not exist" since Derby doesn't yet support DROP IF EXISTS [DERBY-4842]
+        }
+
+        stmt.execute("create table TEST_QUERY_DB_TABLE (id integer not null, name varchar(100), birth_date date, big_number decimal(10,2),created_on timestamp)");
+        stmt.execute("insert into TEST_QUERY_DB_TABLE (id, name, birth_date, big_number, created_on) VALUES (" +
+                ID + ", '" + NAME + "', '" + BIRTH_DATE + "', " + BIG_NUMBER + ", '" + CREATED_ON + "')");
+
+        runner.setIncomingConnection(false);
+        runner.setProperty(SelectHive3QL.HIVEQL_SELECT_QUERY, "SELECT * FROM TEST_QUERY_DB_TABLE");
+
+        runner.run();
+
+        runner.assertAllFlowFilesTransferred(SelectHive3QL.REL_SUCCESS, 1);
+        mff = runner.getFlowFilesForRelationship(SelectHive3QL.REL_SUCCESS).get(0);
+        in = new ByteArrayInputStream(mff.toByteArray());
+
+        final GenericRecord record = getFirstRecordFromStream(in);
+
+        assertEquals(expectedSchema, record.getSchema());
+        assertEquals(ID, record.get("ID"));
+        assertEquals(NAME, record.get("NAME").toString());
+        assertEquals(BIRTH_DATE, record.get("BIRTH_DATE").toString());
+        assertEquals(BIG_NUMBER, record.get("BIG_NUMBER").toString());
+        assertEquals(CREATED_ON, record.get("CREATED_ON").toString());
+
+        runner.clearTransferState();
+    }
+
+    @Test
+    public void testAvroRecordCreatedWithLogicalTypesWhenSet() throws SQLException, IOException {
+        final Schema expectedSchema = SchemaBuilder.record("NiFi_SelectHiveQL_Record").namespace("any.data").fields()
+                .name("ID").type().unionOf().nullBuilder().endNull().and().intType().endUnion().noDefault()
+                .name("NAME").type().unionOf().nullBuilder().endNull().and().stringType().endUnion().noDefault()
+                .name("BIRTH_DATE").type().unionOf().nullBuilder().endNull().and()
+                .type(LogicalTypes.date().addToSchema(Schema.create(Schema.Type.INT))).endUnion().noDefault()
+                .name("BIG_NUMBER").type().unionOf().nullBuilder().endNull().and()
+                .type(LogicalTypes.decimal(10, 2).addToSchema(Schema.create(Schema.Type.BYTES))).endUnion().noDefault()
+                .name("CREATED_ON").type().unionOf().nullBuilder().endNull().and()
+                .type(LogicalTypes.timestampMillis().addToSchema(Schema.create(Schema.Type.LONG))).endUnion().noDefault()
+                .endRecord();
+
+        final int expectedBirthDate = (int) LocalDate.parse(BIRTH_DATE).toEpochDay();
+        final BigDecimal decimal = new BigDecimal(BIG_NUMBER).setScale(2, BigDecimal.ROUND_HALF_EVEN);
+        final ByteBuffer expectedBigNumber = ByteBuffer.wrap(decimal.unscaledValue().toByteArray());
+        final Timestamp timestamp = Timestamp.valueOf(CREATED_ON);
+        final long expectedCreatedOn = timestamp.getTime();
+
+        // load test data to database
+        final Connection con = ((DBCPService) runner.getControllerService("dbcp")).getConnection();
+        final Statement stmt = con.createStatement();
+        final InputStream in;
+        final MockFlowFile mff;
+
+        try {
+            stmt.execute("drop table TEST_QUERY_DB_TABLE");
+        } catch (final SQLException sqle) {
+            // Ignore this error, probably a "table does not exist" since Derby doesn't yet support DROP IF EXISTS [DERBY-4842]
+        }
+
+        stmt.execute("create table TEST_QUERY_DB_TABLE (id integer not null, name varchar(100), birth_date date, big_number decimal(10,2),created_on timestamp)");
+        stmt.execute("insert into TEST_QUERY_DB_TABLE (id, name, birth_date, big_number, created_on) VALUES (" +
+                ID + ", '" + NAME + "', '" + BIRTH_DATE + "', " + BIG_NUMBER + ", '" + CREATED_ON + "')");
+
+        runner.setIncomingConnection(false);
+        runner.setProperty(SelectHive3QL.HIVEQL_SELECT_QUERY, "SELECT * FROM TEST_QUERY_DB_TABLE");
+        runner.setProperty(SelectHive3QL.USE_AVRO_LOGICAL_TYPES, "true");
+
+        runner.run();
+
+        runner.assertAllFlowFilesTransferred(SelectHive3QL.REL_SUCCESS, 1);
+        mff = runner.getFlowFilesForRelationship(SelectHive3QL.REL_SUCCESS).get(0);
+        in = new ByteArrayInputStream(mff.toByteArray());
+
+        final GenericRecord record = getFirstRecordFromStream(in);
+
+        assertEquals(expectedSchema, record.getSchema());
+        assertEquals(ID, record.get("ID"));
+        assertEquals(NAME, record.get("NAME").toString());
+        assertEquals(expectedBirthDate, record.get("BIRTH_DATE"));
+        assertEquals(expectedBigNumber, record.get("BIG_NUMBER"));
+        assertEquals(expectedCreatedOn, record.get("CREATED_ON"));
+
+
+    }
+
     private long getNumberOfRecordsFromStream(InputStream in) throws IOException {
         final DatumReader<GenericRecord> datumReader = new GenericDatumReader<>();
         try (DataFileStream<GenericRecord> dataFileReader = new DataFileStream<>(in, datumReader)) {
@@ -707,6 +824,13 @@ public class TestSelectHive3QL {
         }
     }
 
+    private GenericRecord getFirstRecordFromStream(InputStream in) throws IOException {
+        final DatumReader<GenericRecord> datumReader = new GenericDatumReader<>();
+        try (DataFileStream<GenericRecord> dataFileReader = new DataFileStream<>(in, datumReader)) {
+            return dataFileReader.next();
+        }
+    }
+
     /**
      * Simple implementation only for SelectHive3QL processor testing.
      */

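The new test asserts the raw encoded values (epoch day as an int, unscaled decimal bytes, epoch milliseconds as a long). The following hedged sketch shows how a consumer might turn those raw values back into Java types; here `record` stands for a GenericRecord read as in getFirstRecordFromStream above, and the decimal parameters (10, 2) are the ones from the test table, so adjust both to your own data.

    import java.math.BigDecimal;
    import java.nio.ByteBuffer;
    import java.time.Instant;
    import java.time.LocalDate;

    import org.apache.avro.Conversions;
    import org.apache.avro.LogicalTypes;
    import org.apache.avro.Schema;
    import org.apache.avro.generic.GenericRecord;

    public class LogicalTypeDecodeSketch {
        static void decode(final GenericRecord record) {
            // date: days since the epoch, stored as int
            final LocalDate birthDate = LocalDate.ofEpochDay((Integer) record.get("BIRTH_DATE"));

            // decimal: unscaled value as bytes; decode with Avro's built-in conversion
            final Schema decimalSchema = LogicalTypes.decimal(10, 2).addToSchema(Schema.create(Schema.Type.BYTES));
            final BigDecimal bigNumber = new Conversions.DecimalConversion()
                    .fromBytes((ByteBuffer) record.get("BIG_NUMBER"), decimalSchema, decimalSchema.getLogicalType());

            // timestamp-millis: epoch milliseconds, stored as long
            final Instant createdOn = Instant.ofEpochMilli((Long) record.get("CREATED_ON"));

            System.out.println(birthDate + " " + bigNumber + " " + createdOn);
        }
    }
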
[nifi] 06/15: NIFI-9408 - MiNiFi - remove Ignite dependencies

Posted by jo...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

joewitt pushed a commit to branch support/nifi-1.15
in repository https://gitbox.apache.org/repos/asf/nifi.git

commit bd16b93da6ee5911872f3e868d43464d70da0621
Author: Pierre Villard <pi...@gmail.com>
AuthorDate: Tue Nov 23 14:13:57 2021 +0100

    NIFI-9408 - MiNiFi - remove Ignite dependencies
    
    Signed-off-by: Matthew Burgess <ma...@apache.org>
    
    This closes #5546
---
 minifi/pom.xml | 15 ---------------
 1 file changed, 15 deletions(-)

diff --git a/minifi/pom.xml b/minifi/pom.xml
index 6a6e122..1dd9afb 100644
--- a/minifi/pom.xml
+++ b/minifi/pom.xml
@@ -558,21 +558,6 @@ limitations under the License.
                 <version>1.5.3-M1</version>
             </dependency>
             <dependency>
-                <groupId>org.apache.ignite</groupId>
-                <artifactId>ignite-core</artifactId>
-                <version>1.6.0</version>
-            </dependency>
-            <dependency>
-                <groupId>org.apache.ignite</groupId>
-                <artifactId>ignite-spring</artifactId>
-                <version>1.6.0</version>
-            </dependency>
-            <dependency>
-                <groupId>org.apache.ignite</groupId>
-                <artifactId>ignite-log4j2</artifactId>
-                <version>1.6.0</version>
-            </dependency>
-            <dependency>
                 <groupId>commons-cli</groupId>
                 <artifactId>commons-cli</artifactId>
                 <version>1.3.1</version>

[nifi] 07/15: NIFI-9396 - MiNiFi - bump junit to 4.13.2

Posted by jo...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

joewitt pushed a commit to branch support/nifi-1.15
in repository https://gitbox.apache.org/repos/asf/nifi.git

commit b79714c6f6e1273c7a83511d639389779c6506ef
Author: Pierre Villard <pi...@gmail.com>
AuthorDate: Fri Nov 19 13:04:27 2021 +0100

    NIFI-9396 - MiNiFi - bump junit to 4.13.2
    
    This closes #5538
    
    Signed-off-by: David Handermann <ex...@apache.org>
---
 minifi/pom.xml | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/minifi/pom.xml b/minifi/pom.xml
index 1dd9afb..54d4f32 100644
--- a/minifi/pom.xml
+++ b/minifi/pom.xml
@@ -508,7 +508,7 @@ limitations under the License.
             <dependency>
                 <groupId>junit</groupId>
                 <artifactId>junit</artifactId>
-                <version>4.12</version>
+                <version>4.13.2</version>
             </dependency>
             <dependency>
                 <groupId>org.mockito</groupId>

[nifi] 09/15: NIFI-9394 Removed RequestLogger and TimerFilter

Posted by jo...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

joewitt pushed a commit to branch support/nifi-1.15
in repository https://gitbox.apache.org/repos/asf/nifi.git

commit 73b32464303bb75e89b01a7c817f059e52f03d5b
Author: exceptionfactory <ex...@apache.org>
AuthorDate: Fri Nov 19 14:58:04 2021 -0600

    NIFI-9394 Removed RequestLogger and TimerFilter
    
    - Removed logger references from logback.xml
    
    Signed-off-by: Joe Gresock <jg...@gmail.com>
    
    This closes #5543.
---
 .../src/main/resources/conf/logback.xml            |  3 -
 .../org/apache/nifi/web/filter/RequestLogger.java  | 77 ----------------------
 .../org/apache/nifi/web/filter/TimerFilter.java    | 72 --------------------
 .../nifi-web-api/src/main/webapp/WEB-INF/web.xml   | 16 -----
 .../resources/conf/clustered/node1/logback.xml     |  3 -
 .../resources/conf/clustered/node2/logback.xml     |  3 -
 .../src/test/resources/conf/default/logback.xml    |  3 -
 .../src/test/resources/conf/logback.xml            |  3 -
 .../src/test/resources/upgrade/conf/logback.xml    |  3 -
 9 files changed, 183 deletions(-)

diff --git a/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-resources/src/main/resources/conf/logback.xml b/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-resources/src/main/resources/conf/logback.xml
index 93a9afa..e6e50e4 100644
--- a/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-resources/src/main/resources/conf/logback.xml
+++ b/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-resources/src/main/resources/conf/logback.xml
@@ -157,9 +157,6 @@
     <logger name="org.apache.nifi.cluster.authorization" level="INFO" additivity="false">
         <appender-ref ref="USER_FILE"/>
     </logger>
-    <logger name="org.apache.nifi.web.filter.RequestLogger" level="INFO" additivity="false">
-        <appender-ref ref="USER_FILE"/>
-    </logger>
     <logger name="org.apache.nifi.web.api.AccessResource" level="INFO" additivity="false">
         <appender-ref ref="USER_FILE"/>
     </logger>
diff --git a/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-web/nifi-web-api/src/main/java/org/apache/nifi/web/filter/RequestLogger.java b/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-web/nifi-web-api/src/main/java/org/apache/nifi/web/filter/RequestLogger.java
deleted file mode 100644
index bb30a1e..0000000
--- a/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-web/nifi-web-api/src/main/java/org/apache/nifi/web/filter/RequestLogger.java
+++ /dev/null
@@ -1,77 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one or more
- * contributor license agreements.  See the NOTICE file distributed with
- * this work for additional information regarding copyright ownership.
- * The ASF licenses this file to You under the Apache License, Version 2.0
- * (the "License"); you may not use this file except in compliance with
- * the License.  You may obtain a copy of the License at
- *
- *     http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-package org.apache.nifi.web.filter;
-
-import java.io.IOException;
-
-import javax.servlet.Filter;
-import javax.servlet.FilterChain;
-import javax.servlet.FilterConfig;
-import javax.servlet.ServletException;
-import javax.servlet.ServletRequest;
-import javax.servlet.ServletResponse;
-import javax.servlet.http.HttpServletRequest;
-
-import org.apache.nifi.authorization.user.NiFiUserUtils;
-import org.apache.nifi.logging.NiFiLog;
-import org.apache.nifi.authorization.user.NiFiUser;
-
-import org.slf4j.Logger;
-import org.slf4j.LoggerFactory;
-
-/**
- * A filter to log requests.
- *
- */
-public class RequestLogger implements Filter {
-
-    private static final Logger logger = new NiFiLog(LoggerFactory.getLogger(RequestLogger.class));
-
-    @Override
-    public void doFilter(final ServletRequest req, final ServletResponse resp, final FilterChain filterChain)
-            throws IOException, ServletException {
-
-        final HttpServletRequest request = (HttpServletRequest) req;
-
-            // only log http requests as https requests are logged elsewhere
-        if ("http".equalsIgnoreCase(request.getScheme())) {
-            final NiFiUser user = NiFiUserUtils.getNiFiUser();
-
-            // get the user details for the log message
-            String identity = "<no user found>";
-            if (user != null) {
-                identity = user.getIdentity();
-            }
-
-            // log the request attempt - response details will be logged later
-            logger.info(String.format("Attempting request for (%s) %s %s (source ip: %s)", identity, request.getMethod(),
-                    request.getRequestURL().toString(), request.getRemoteAddr()));
-        }
-
-        // continue the filter chain
-        filterChain.doFilter(req, resp);
-    }
-
-    @Override
-    public void init(final FilterConfig config) {
-    }
-
-    @Override
-    public void destroy() {
-    }
-
-}
diff --git a/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-web/nifi-web-api/src/main/java/org/apache/nifi/web/filter/TimerFilter.java b/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-web/nifi-web-api/src/main/java/org/apache/nifi/web/filter/TimerFilter.java
deleted file mode 100644
index a522fa5..0000000
--- a/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-web/nifi-web-api/src/main/java/org/apache/nifi/web/filter/TimerFilter.java
+++ /dev/null
@@ -1,72 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one or more
- * contributor license agreements.  See the NOTICE file distributed with
- * this work for additional information regarding copyright ownership.
- * The ASF licenses this file to You under the Apache License, Version 2.0
- * (the "License"); you may not use this file except in compliance with
- * the License.  You may obtain a copy of the License at
- *
- *     http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-package org.apache.nifi.web.filter;
-
-import java.io.IOException;
-import java.util.concurrent.TimeUnit;
-
-import javax.servlet.Filter;
-import javax.servlet.FilterChain;
-import javax.servlet.FilterConfig;
-import javax.servlet.ServletException;
-import javax.servlet.ServletRequest;
-import javax.servlet.ServletResponse;
-import javax.servlet.http.HttpServletRequest;
-
-import org.apache.nifi.cluster.coordination.http.replication.RequestReplicator;
-import org.apache.nifi.logging.NiFiLog;
-import org.slf4j.Logger;
-import org.slf4j.LoggerFactory;
-
-/**
- * A filter to time requests.
- *
- */
-public class TimerFilter implements Filter {
-
-    private static final Logger logger = new NiFiLog(LoggerFactory.getLogger(TimerFilter.class));
-
-    @Override
-    public void doFilter(final ServletRequest req, final ServletResponse resp, final FilterChain filterChain)
-            throws IOException, ServletException {
-
-        final HttpServletRequest request = (HttpServletRequest) req;
-
-        final long start = System.nanoTime();
-        try {
-            filterChain.doFilter(req, resp);
-        } finally {
-            final long stop = System.nanoTime();
-            final String requestId = ((HttpServletRequest) req).getHeader(RequestReplicator.REQUEST_TRANSACTION_ID_HEADER);
-            final String replicationHeader = ((HttpServletRequest) req).getHeader(RequestReplicator.REPLICATION_INDICATOR_HEADER);
-            final boolean validationPhase = RequestReplicator.NODE_CONTINUE.equals(replicationHeader);
-            final String requestDescription = validationPhase ? "Validation Phase of Request " + requestId : "Request ID " + requestId;
-
-            logger.debug("{} {} from {} duration for {}: {} millis", request.getMethod(), request.getRequestURL().toString(),
-                req.getRemoteHost(), requestDescription, TimeUnit.MILLISECONDS.convert(stop - start, TimeUnit.NANOSECONDS));
-        }
-    }
-
-    @Override
-    public void init(final FilterConfig config) {
-    }
-
-    @Override
-    public void destroy() {
-    }
-
-}
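
Deployments that relied on the removed TimerFilter for request timing can reproduce it with a few lines of servlet code. A minimal hedged sketch along the same lines follows; the class name and log format are illustrative, and the NiFi-specific replication headers are omitted.

    import java.io.IOException;
    import java.util.concurrent.TimeUnit;

    import javax.servlet.Filter;
    import javax.servlet.FilterChain;
    import javax.servlet.FilterConfig;
    import javax.servlet.ServletException;
    import javax.servlet.ServletRequest;
    import javax.servlet.ServletResponse;
    import javax.servlet.http.HttpServletRequest;

    import org.slf4j.Logger;
    import org.slf4j.LoggerFactory;

    public class SimpleTimerFilter implements Filter {
        private static final Logger logger = LoggerFactory.getLogger(SimpleTimerFilter.class);

        @Override
        public void doFilter(final ServletRequest req, final ServletResponse resp, final FilterChain chain)
                throws IOException, ServletException {
            final long start = System.nanoTime();
            try {
                chain.doFilter(req, resp);
            } finally {
                // log elapsed wall-clock time for the request
                final long millis = TimeUnit.NANOSECONDS.toMillis(System.nanoTime() - start);
                final HttpServletRequest request = (HttpServletRequest) req;
                logger.debug("{} {} took {} ms", request.getMethod(), request.getRequestURL(), millis);
            }
        }

        @Override
        public void init(final FilterConfig config) {
        }

        @Override
        public void destroy() {
        }
    }
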
diff --git a/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-web/nifi-web-api/src/main/webapp/WEB-INF/web.xml b/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-web/nifi-web-api/src/main/webapp/WEB-INF/web.xml
index 8ec3fa4..894798e 100644
--- a/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-web/nifi-web-api/src/main/webapp/WEB-INF/web.xml
+++ b/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-web/nifi-web-api/src/main/webapp/WEB-INF/web.xml
@@ -51,14 +51,6 @@
         <url-pattern>/*</url-pattern>
     </filter-mapping>
     <filter>
-        <filter-name>timer</filter-name>
-        <filter-class>org.apache.nifi.web.filter.TimerFilter</filter-class>
-    </filter>
-    <filter-mapping>
-        <filter-name>timer</filter-name>
-        <url-pattern>/*</url-pattern>
-    </filter-mapping>
-    <filter>
         <filter-name>springSecurityFilterChain</filter-name>
         <filter-class>org.springframework.web.filter.DelegatingFilterProxy</filter-class>
     </filter>
@@ -66,12 +58,4 @@
         <filter-name>springSecurityFilterChain</filter-name>
         <url-pattern>/*</url-pattern>
     </filter-mapping>
-    <filter>
-        <filter-name>requestLogger</filter-name>
-        <filter-class>org.apache.nifi.web.filter.RequestLogger</filter-class>
-    </filter>
-    <filter-mapping>
-        <filter-name>requestLogger</filter-name>
-        <url-pattern>/*</url-pattern>
-    </filter-mapping>
 </web-app>
diff --git a/nifi-system-tests/nifi-system-test-suite/src/test/resources/conf/clustered/node1/logback.xml b/nifi-system-tests/nifi-system-test-suite/src/test/resources/conf/clustered/node1/logback.xml
index ecda00a..6c3f0bb 100644
--- a/nifi-system-tests/nifi-system-test-suite/src/test/resources/conf/clustered/node1/logback.xml
+++ b/nifi-system-tests/nifi-system-test-suite/src/test/resources/conf/clustered/node1/logback.xml
@@ -145,9 +145,6 @@
     <logger name="org.apache.nifi.cluster.authorization" level="INFO" additivity="false">
         <appender-ref ref="USER_FILE"/>
     </logger>
-    <logger name="org.apache.nifi.web.filter.RequestLogger" level="INFO" additivity="false">
-        <appender-ref ref="USER_FILE"/>
-    </logger>
     <logger name="org.apache.nifi.web.api.AccessResource" level="INFO" additivity="false">
         <appender-ref ref="USER_FILE"/>
     </logger>
diff --git a/nifi-system-tests/nifi-system-test-suite/src/test/resources/conf/clustered/node2/logback.xml b/nifi-system-tests/nifi-system-test-suite/src/test/resources/conf/clustered/node2/logback.xml
index 4de6225..a24bb4b 100644
--- a/nifi-system-tests/nifi-system-test-suite/src/test/resources/conf/clustered/node2/logback.xml
+++ b/nifi-system-tests/nifi-system-test-suite/src/test/resources/conf/clustered/node2/logback.xml
@@ -147,9 +147,6 @@
     <logger name="org.apache.nifi.cluster.authorization" level="INFO" additivity="false">
         <appender-ref ref="USER_FILE"/>
     </logger>
-    <logger name="org.apache.nifi.web.filter.RequestLogger" level="INFO" additivity="false">
-        <appender-ref ref="USER_FILE"/>
-    </logger>
     <logger name="org.apache.nifi.web.api.AccessResource" level="INFO" additivity="false">
         <appender-ref ref="USER_FILE"/>
     </logger>
diff --git a/nifi-system-tests/nifi-system-test-suite/src/test/resources/conf/default/logback.xml b/nifi-system-tests/nifi-system-test-suite/src/test/resources/conf/default/logback.xml
index cc69039..c42b3be 100644
--- a/nifi-system-tests/nifi-system-test-suite/src/test/resources/conf/default/logback.xml
+++ b/nifi-system-tests/nifi-system-test-suite/src/test/resources/conf/default/logback.xml
@@ -144,9 +144,6 @@
     <logger name="org.apache.nifi.cluster.authorization" level="INFO" additivity="false">
         <appender-ref ref="USER_FILE"/>
     </logger>
-    <logger name="org.apache.nifi.web.filter.RequestLogger" level="INFO" additivity="false">
-        <appender-ref ref="USER_FILE"/>
-    </logger>
     <logger name="org.apache.nifi.web.api.AccessResource" level="INFO" additivity="false">
         <appender-ref ref="USER_FILE"/>
     </logger>
diff --git a/nifi-toolkit/nifi-toolkit-admin/src/test/resources/conf/logback.xml b/nifi-toolkit/nifi-toolkit-admin/src/test/resources/conf/logback.xml
index 042ee48..c38e0d4 100644
--- a/nifi-toolkit/nifi-toolkit-admin/src/test/resources/conf/logback.xml
+++ b/nifi-toolkit/nifi-toolkit-admin/src/test/resources/conf/logback.xml
@@ -133,9 +133,6 @@
     <logger name="org.apache.nifi.cluster.authorization" level="INFO" additivity="false">
         <appender-ref ref="USER_FILE"/>
     </logger>
-    <logger name="org.apache.nifi.web.filter.RequestLogger" level="INFO" additivity="false">
-        <appender-ref ref="USER_FILE"/>
-    </logger>
 
 
     <!--
diff --git a/nifi-toolkit/nifi-toolkit-admin/src/test/resources/upgrade/conf/logback.xml b/nifi-toolkit/nifi-toolkit-admin/src/test/resources/upgrade/conf/logback.xml
index d3ace7a..710f1dc 100644
--- a/nifi-toolkit/nifi-toolkit-admin/src/test/resources/upgrade/conf/logback.xml
+++ b/nifi-toolkit/nifi-toolkit-admin/src/test/resources/upgrade/conf/logback.xml
@@ -134,9 +134,6 @@
     <logger name="org.apache.nifi.cluster.authorization" level="INFO" additivity="false">
         <appender-ref ref="USER_FILE"/>
     </logger>
-    <logger name="org.apache.nifi.web.filter.RequestLogger" level="INFO" additivity="false">
-        <appender-ref ref="USER_FILE"/>
-    </logger>
 
 
     <!--

[nifi] 15/15: NIFI-5821 Added Engine Name to Script Engine property descriptions

Posted by jo...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

joewitt pushed a commit to branch support/nifi-1.15
in repository https://gitbox.apache.org/repos/asf/nifi.git

commit 4e88943e60817aac6d779cf0bf2c970e9cfdb16b
Author: exceptionfactory <ex...@apache.org>
AuthorDate: Wed Nov 17 08:53:14 2021 -0600

    NIFI-5821 Added Engine Name to Script Engine property descriptions
    
    Signed-off-by: Matthew Burgess <ma...@apache.org>
    
    This closes #5529
---
 .../nifi/script/ScriptingComponentHelper.java      | 15 +++++-
 .../script/TestScriptingComponentHelper.java       | 54 ++++++++++++++++++++++
 2 files changed, 68 insertions(+), 1 deletion(-)

diff --git a/nifi-nar-bundles/nifi-scripting-bundle/nifi-scripting-processors/src/main/java/org/apache/nifi/script/ScriptingComponentHelper.java b/nifi-nar-bundles/nifi-scripting-bundle/nifi-scripting-processors/src/main/java/org/apache/nifi/script/ScriptingComponentHelper.java
index cf92340..1db75bf 100644
--- a/nifi-nar-bundles/nifi-scripting-bundle/nifi-scripting-processors/src/main/java/org/apache/nifi/script/ScriptingComponentHelper.java
+++ b/nifi-nar-bundles/nifi-scripting-bundle/nifi-scripting-processors/src/main/java/org/apache/nifi/script/ScriptingComponentHelper.java
@@ -48,10 +48,13 @@ import java.util.concurrent.BlockingQueue;
 import java.util.concurrent.LinkedBlockingQueue;
 import java.util.concurrent.atomic.AtomicBoolean;
 
+import static org.apache.commons.lang3.StringUtils.defaultIfBlank;
+
 /**
  * This class contains variables and methods common to scripting processors, reporting tasks, etc.
  */
 public class ScriptingComponentHelper {
+    private static final String UNKNOWN_VERSION = "UNKNOWN";
 
     public PropertyDescriptor SCRIPT_ENGINE;
 
@@ -155,7 +158,8 @@ public class ScriptingComponentHelper {
             List<AllowableValue> engineList = new LinkedList<>();
             for (ScriptEngineFactory factory : scriptEngineFactories) {
                 if (!requireInvocable || factory.getScriptEngine() instanceof Invocable) {
-                    engineList.add(new AllowableValue(factory.getLanguageName()));
+                    final AllowableValue scriptEngineAllowableValue = getScriptLanguageAllowableValue(factory);
+                    engineList.add(scriptEngineAllowableValue);
                     scriptEngineFactoryMap.put(factory.getLanguageName(), factory);
                 }
             }
@@ -269,4 +273,13 @@ public class ScriptingComponentHelper {
             scriptRunnerQ.clear();
         }
     }
+
+    private AllowableValue getScriptLanguageAllowableValue(final ScriptEngineFactory factory) {
+        final String languageName = factory.getLanguageName();
+        final String languageVersion = defaultIfBlank(factory.getLanguageVersion(), UNKNOWN_VERSION);
+        final String engineVersion = defaultIfBlank(factory.getEngineVersion(), UNKNOWN_VERSION);
+
+        final String description = String.format("%s %s [%s %s]", languageName, languageVersion, factory.getEngineName(), engineVersion);
+        return new AllowableValue(languageName, languageName, description);
+    }
 }
diff --git a/nifi-nar-bundles/nifi-scripting-bundle/nifi-scripting-processors/src/test/java/org/apache/nifi/processors/script/TestScriptingComponentHelper.java b/nifi-nar-bundles/nifi-scripting-bundle/nifi-scripting-processors/src/test/java/org/apache/nifi/processors/script/TestScriptingComponentHelper.java
new file mode 100644
index 0000000..459e673
--- /dev/null
+++ b/nifi-nar-bundles/nifi-scripting-bundle/nifi-scripting-processors/src/test/java/org/apache/nifi/processors/script/TestScriptingComponentHelper.java
@@ -0,0 +1,54 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.script;
+
+import org.apache.nifi.components.AllowableValue;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.script.ScriptingComponentHelper;
+import org.junit.jupiter.api.Test;
+
+import java.util.List;
+import java.util.Optional;
+
+import static org.junit.jupiter.api.Assertions.assertFalse;
+import static org.junit.jupiter.api.Assertions.assertNotNull;
+import static org.junit.jupiter.api.Assertions.assertTrue;
+
+public class TestScriptingComponentHelper {
+    private static final String SCRIPT_ENGINE_PROPERTY = "Script Engine";
+
+    @Test
+    public void testScriptEngineAllowableValuesWithDescriptions() {
+        final ScriptingComponentHelper helper = new ScriptingComponentHelper();
+        helper.createResources();
+
+        final List<PropertyDescriptor> descriptors = helper.getDescriptors();
+        final Optional<PropertyDescriptor> optionalScriptEngine = descriptors.stream().filter(
+                descriptor -> descriptor.getName().equals(SCRIPT_ENGINE_PROPERTY)
+        ).findFirst();
+
+        assertTrue(optionalScriptEngine.isPresent());
+        final PropertyDescriptor scriptEngineDescriptor = optionalScriptEngine.get();
+
+        final List<AllowableValue> allowableValues = scriptEngineDescriptor.getAllowableValues();
+        assertFalse(allowableValues.isEmpty());
+
+        for (final AllowableValue allowableValue : allowableValues) {
+            assertNotNull(allowableValue.getDescription());
+        }
+    }
+}

[nifi] 01/15: NIFI-9473 - Upgrade Jackson from 2.12.3 to 2.12.5

Posted by jo...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

joewitt pushed a commit to branch support/nifi-1.15
in repository https://gitbox.apache.org/repos/asf/nifi.git

commit ffebbab4bc983fef2bc42666520bce1555bf8629
Author: Pierre Villard <pi...@gmail.com>
AuthorDate: Fri Dec 10 12:52:22 2021 +0100

    NIFI-9473 - Upgrade Jackson from 2.12.3 to 2.12.5
    
    This closes #5591
    
    Signed-off-by: David Handermann <ex...@apache.org>
---
 pom.xml | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/pom.xml b/pom.xml
index daba91e..7794014 100644
--- a/pom.xml
+++ b/pom.xml
@@ -97,7 +97,7 @@
         <org.slf4j.version>1.7.32</org.slf4j.version>
         <ranger.version>2.1.0</ranger.version>
         <jetty.version>9.4.44.v20210927</jetty.version>
-        <jackson.version>2.12.3</jackson.version>
+        <jackson.version>2.12.5</jackson.version>
         <jaxb.runtime.version>2.3.5</jaxb.runtime.version>
         <jakarta.xml.bind-api.version>2.3.3</jakarta.xml.bind-api.version>
         <nifi.groovy.version>2.5.14</nifi.groovy.version>

[nifi] 12/15: NIFI-9260 Making the 'write and rename' behaviour optional for PutHDFS

Posted by jo...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

joewitt pushed a commit to branch support/nifi-1.15
in repository https://gitbox.apache.org/repos/asf/nifi.git

commit 534e6eafe7b8622e6d141802a40d7c6e01a09361
Author: Bence Simon <si...@gmail.com>
AuthorDate: Thu Sep 30 15:57:41 2021 +0200

    NIFI-9260 Making the 'write and rename' behaviour optional for PutHDFS
    
    This closes #5423.
    
    Signed-off-by: Peter Turcsanyi <tu...@apache.org>
---
 .../org/apache/nifi/processors/hadoop/PutHDFS.java | 35 ++++++++--
 .../apache/nifi/processors/hadoop/PutHDFSTest.java | 74 +++++++++++++++++-----
 2 files changed, 90 insertions(+), 19 deletions(-)

diff --git a/nifi-nar-bundles/nifi-hadoop-bundle/nifi-hdfs-processors/src/main/java/org/apache/nifi/processors/hadoop/PutHDFS.java b/nifi-nar-bundles/nifi-hadoop-bundle/nifi-hdfs-processors/src/main/java/org/apache/nifi/processors/hadoop/PutHDFS.java
index 62b7996..d5d85fc 100644
--- a/nifi-nar-bundles/nifi-hadoop-bundle/nifi-hdfs-processors/src/main/java/org/apache/nifi/processors/hadoop/PutHDFS.java
+++ b/nifi-nar-bundles/nifi-hadoop-bundle/nifi-hdfs-processors/src/main/java/org/apache/nifi/processors/hadoop/PutHDFS.java
@@ -125,6 +125,9 @@ public class PutHDFS extends AbstractHadoopProcessor {
     protected static final String FAIL_RESOLUTION = "fail";
     protected static final String APPEND_RESOLUTION = "append";
 
+    protected static final String WRITE_AND_RENAME = "writeAndRename";
+    protected static final String SIMPLE_WRITE = "simpleWrite";
+
     protected static final AllowableValue REPLACE_RESOLUTION_AV = new AllowableValue(REPLACE_RESOLUTION,
             REPLACE_RESOLUTION, "Replaces the existing file if any.");
     protected static final AllowableValue IGNORE_RESOLUTION_AV = new AllowableValue(IGNORE_RESOLUTION, IGNORE_RESOLUTION,
@@ -134,6 +137,11 @@ public class PutHDFS extends AbstractHadoopProcessor {
     protected static final AllowableValue APPEND_RESOLUTION_AV = new AllowableValue(APPEND_RESOLUTION, APPEND_RESOLUTION,
             "Appends to the existing file if any, creates a new file otherwise.");
 
+    protected static final AllowableValue WRITE_AND_RENAME_AV = new AllowableValue(WRITE_AND_RENAME, "Write and rename",
+            "The processor writes FlowFile data into a temporary file and renames it after completion. This prevents other processes from reading partially written files.");
+    protected static final AllowableValue SIMPLE_WRITE_AV = new AllowableValue(SIMPLE_WRITE, "Simple write",
+            "The processor writes FlowFile data directly to the destination file. In some cases this might cause reading partially written files.");
+
     protected static final PropertyDescriptor CONFLICT_RESOLUTION = new PropertyDescriptor.Builder()
             .name("Conflict Resolution Strategy")
             .description("Indicates what should happen when a file with the same name already exists in the output directory")
@@ -142,6 +150,15 @@ public class PutHDFS extends AbstractHadoopProcessor {
             .allowableValues(REPLACE_RESOLUTION_AV, IGNORE_RESOLUTION_AV, FAIL_RESOLUTION_AV, APPEND_RESOLUTION_AV)
             .build();
 
+    protected static final PropertyDescriptor WRITING_STRATEGY = new PropertyDescriptor.Builder()
+            .name("writing-strategy")
+            .displayName("Writing Strategy")
+            .description("Defines the approach for writing the FlowFile data.")
+            .required(true)
+            .defaultValue(WRITE_AND_RENAME_AV.getValue())
+            .allowableValues(WRITE_AND_RENAME_AV, SIMPLE_WRITE_AV)
+            .build();
+
     public static final PropertyDescriptor BLOCK_SIZE = new PropertyDescriptor.Builder()
             .name("Block Size")
             .description("Size of each block as written to HDFS. This overrides the Hadoop Configuration")
@@ -219,6 +236,7 @@ public class PutHDFS extends AbstractHadoopProcessor {
                 .description("The parent HDFS directory to which files should be written. The directory will be created if it doesn't exist.")
                 .build());
         props.add(CONFLICT_RESOLUTION);
+        props.add(WRITING_STRATEGY);
         props.add(BLOCK_SIZE);
         props.add(BUFFER_SIZE);
         props.add(REPLICATION_FACTOR);
@@ -280,6 +298,7 @@ public class PutHDFS extends AbstractHadoopProcessor {
                 Path tempDotCopyFile = null;
                 FlowFile putFlowFile = flowFile;
                 try {
+                    final String writingStrategy = context.getProperty(WRITING_STRATEGY).getValue();
                     final Path dirPath = getNormalizedPath(context, DIRECTORY, putFlowFile);
                     final String conflictResponse = context.getProperty(CONFLICT_RESOLUTION).getValue();
                     final long blockSize = getBlockSize(context, session, putFlowFile, dirPath);
@@ -295,6 +314,11 @@ public class PutHDFS extends AbstractHadoopProcessor {
                     final Path tempCopyFile = new Path(dirPath, "." + filename);
                     final Path copyFile = new Path(dirPath, filename);
 
+                    // Depending on the writing strategy, we might need a temporary file
+                    final Path actualCopyFile = (writingStrategy.equals(WRITE_AND_RENAME))
+                            ? tempCopyFile
+                            : copyFile;
+
                     // Create destination directory if it does not exist
                     boolean targetDirCreated = false;
                     try {
@@ -361,7 +385,7 @@ public class PutHDFS extends AbstractHadoopProcessor {
                                         cflags.add(CreateFlag.IGNORE_CLIENT_LOCALITY);
                                     }
 
-                                    fos = hdfs.create(tempCopyFile, FsCreateModes.applyUMask(FsPermission.getFileDefault(),
+                                    fos = hdfs.create(actualCopyFile, FsCreateModes.applyUMask(FsPermission.getFileDefault(),
                                             FsPermission.getUMask(hdfs.getConf())), cflags, bufferSize, replication, blockSize,
                                             null, null);
                                 }
@@ -369,7 +393,7 @@ public class PutHDFS extends AbstractHadoopProcessor {
                                 if (codec != null) {
                                     fos = codec.createOutputStream(fos);
                                 }
-                                createdFile = tempCopyFile;
+                                createdFile = actualCopyFile;
                                 BufferedInputStream bis = new BufferedInputStream(in);
                                 StreamUtils.copy(bis, fos);
                                 bis = null;
@@ -399,9 +423,12 @@ public class PutHDFS extends AbstractHadoopProcessor {
                     final long millis = stopWatch.getDuration(TimeUnit.MILLISECONDS);
                     tempDotCopyFile = tempCopyFile;
 
-                    if (!conflictResponse.equals(APPEND_RESOLUTION)
-                            || (conflictResponse.equals(APPEND_RESOLUTION) && !destinationExists)) {
+                    if (
+                        writingStrategy.equals(WRITE_AND_RENAME)
+                        && (!conflictResponse.equals(APPEND_RESOLUTION) || (conflictResponse.equals(APPEND_RESOLUTION) && !destinationExists))
+                    ) {
                         boolean renamed = false;
+
                         for (int i = 0; i < 10; i++) { // try to rename multiple times.
                             if (hdfs.rename(tempCopyFile, copyFile)) {
                                 renamed = true;
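
The two strategies differ only in whether the data is first written to a dot-prefixed temporary file. Below is a minimal hedged sketch of the default write-and-rename pattern against the Hadoop FileSystem API; the method shape and error handling are illustrative, not the processor's actual code.

    import org.apache.hadoop.fs.FSDataOutputStream;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class WriteAndRenameSketch {
        public static void write(final FileSystem fs, final Path dir, final String name, final byte[] data) throws Exception {
            final Path tempFile = new Path(dir, "." + name);  // dot-prefixed, hidden while being written
            final Path finalFile = new Path(dir, name);
            try (final FSDataOutputStream out = fs.create(tempFile)) {
                out.write(data);  // readers listing dir never see a partially written finalFile
            }
            // publish by renaming; a rename within one directory is atomic on HDFS
            if (!fs.rename(tempFile, finalFile)) {
                throw new IllegalStateException("Could not rename " + tempFile + " to " + finalFile);
            }
        }
    }
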
diff --git a/nifi-nar-bundles/nifi-hadoop-bundle/nifi-hdfs-processors/src/test/java/org/apache/nifi/processors/hadoop/PutHDFSTest.java b/nifi-nar-bundles/nifi-hadoop-bundle/nifi-hdfs-processors/src/test/java/org/apache/nifi/processors/hadoop/PutHDFSTest.java
index c41114b..22b3ec2 100644
--- a/nifi-nar-bundles/nifi-hadoop-bundle/nifi-hdfs-processors/src/test/java/org/apache/nifi/processors/hadoop/PutHDFSTest.java
+++ b/nifi-nar-bundles/nifi-hadoop-bundle/nifi-hdfs-processors/src/test/java/org/apache/nifi/processors/hadoop/PutHDFSTest.java
@@ -71,6 +71,9 @@ import static org.mockito.Mockito.times;
 import static org.mockito.Mockito.verify;
 
 public class PutHDFSTest {
+    private final static String TARGET_DIRECTORY = "target/test-classes";
+    private final static String SOURCE_DIRECTORY = "src/test/resources/testdata";
+    private final static String FILE_NAME = "randombytes-1";
 
     private KerberosProperties kerberosProperties;
     private FileSystem mockFileSystem;
@@ -197,27 +200,32 @@ public class PutHDFSTest {
 
     @Test
     public void testPutFile() throws IOException {
-        PutHDFS proc = new TestablePutHDFS(kerberosProperties, mockFileSystem);
-        TestRunner runner = TestRunners.newTestRunner(proc);
-        runner.setProperty(PutHDFS.DIRECTORY, "target/test-classes");
-        runner.setProperty(PutHDFS.CONFLICT_RESOLUTION, "replace");
-        try (FileInputStream fis = new FileInputStream("src/test/resources/testdata/randombytes-1")) {
-            Map<String, String> attributes = new HashMap<>();
-            attributes.put(CoreAttributes.FILENAME.key(), "randombytes-1");
+        // given
+        final FileSystem spyFileSystem = Mockito.spy(mockFileSystem);
+        final PutHDFS proc = new TestablePutHDFS(kerberosProperties, spyFileSystem);
+        final TestRunner runner = TestRunners.newTestRunner(proc);
+        runner.setProperty(PutHDFS.DIRECTORY, TARGET_DIRECTORY);
+        runner.setProperty(PutHDFS.CONFLICT_RESOLUTION, PutHDFS.REPLACE_RESOLUTION);
+
+        // when
+        try (final FileInputStream fis = new FileInputStream(SOURCE_DIRECTORY + "/" + FILE_NAME)) {
+            final Map<String, String> attributes = new HashMap<>();
+            attributes.put(CoreAttributes.FILENAME.key(), FILE_NAME);
             runner.enqueue(fis, attributes);
             runner.run();
         }
 
-        List<MockFlowFile> failedFlowFiles = runner
-                .getFlowFilesForRelationship(new Relationship.Builder().name("failure").build());
+        // then
+        final List<MockFlowFile> failedFlowFiles = runner.getFlowFilesForRelationship(PutHDFS.REL_FAILURE);
         assertTrue(failedFlowFiles.isEmpty());
 
-        List<MockFlowFile> flowFiles = runner.getFlowFilesForRelationship(PutHDFS.REL_SUCCESS);
+        final List<MockFlowFile> flowFiles = runner.getFlowFilesForRelationship(PutHDFS.REL_SUCCESS);
         assertEquals(1, flowFiles.size());
-        MockFlowFile flowFile = flowFiles.get(0);
-        assertTrue(mockFileSystem.exists(new Path("target/test-classes/randombytes-1")));
-        assertEquals("randombytes-1", flowFile.getAttribute(CoreAttributes.FILENAME.key()));
-        assertEquals("target/test-classes", flowFile.getAttribute(PutHDFS.ABSOLUTE_HDFS_PATH_ATTRIBUTE));
+
+        final MockFlowFile flowFile = flowFiles.get(0);
+        assertTrue(spyFileSystem.exists(new Path(TARGET_DIRECTORY + "/" + FILE_NAME)));
+        assertEquals(FILE_NAME, flowFile.getAttribute(CoreAttributes.FILENAME.key()));
+        assertEquals(TARGET_DIRECTORY, flowFile.getAttribute(PutHDFS.ABSOLUTE_HDFS_PATH_ATTRIBUTE));
         assertEquals("true", flowFile.getAttribute(PutHDFS.TARGET_HDFS_DIR_CREATED_ATTRIBUTE));
 
         final List<ProvenanceEventRecord> provenanceEvents = runner.getProvenanceEvents();
@@ -225,7 +233,43 @@ public class PutHDFSTest {
         final ProvenanceEventRecord sendEvent = provenanceEvents.get(0);
         assertEquals(ProvenanceEventType.SEND, sendEvent.getEventType());
         // If it runs with a real HDFS, the protocol will be "hdfs://", but with a local filesystem, just assert the filename.
-        assertTrue(sendEvent.getTransitUri().endsWith("target/test-classes/randombytes-1"));
+        assertTrue(sendEvent.getTransitUri().endsWith(TARGET_DIRECTORY + "/" + FILE_NAME));
+
+        Mockito.verify(spyFileSystem, Mockito.times(1)).rename(Mockito.any(Path.class), Mockito.any(Path.class));
+    }
+
+    @Test
+    public void testPutFileWithSimpleWrite() throws IOException {
+        // given
+        final FileSystem spyFileSystem = Mockito.spy(mockFileSystem);
+        final PutHDFS proc = new TestablePutHDFS(kerberosProperties, spyFileSystem);
+        final TestRunner runner = TestRunners.newTestRunner(proc);
+        runner.setProperty(PutHDFS.DIRECTORY, TARGET_DIRECTORY);
+        runner.setProperty(PutHDFS.CONFLICT_RESOLUTION, PutHDFS.REPLACE_RESOLUTION);
+        runner.setProperty(PutHDFS.WRITING_STRATEGY, PutHDFS.SIMPLE_WRITE);
+
+        // when
+        try (final FileInputStream fis = new FileInputStream(SOURCE_DIRECTORY + "/" + FILE_NAME)) {
+            final Map<String, String> attributes = new HashMap<>();
+            attributes.put(CoreAttributes.FILENAME.key(), FILE_NAME);
+            runner.enqueue(fis, attributes);
+            runner.run();
+        }
+
+        // then
+        final List<MockFlowFile> failedFlowFiles = runner.getFlowFilesForRelationship(PutHDFS.REL_FAILURE);
+        assertTrue(failedFlowFiles.isEmpty());
+
+        final List<MockFlowFile> flowFiles = runner.getFlowFilesForRelationship(PutHDFS.REL_SUCCESS);
+        assertEquals(1, flowFiles.size());
+
+        final MockFlowFile flowFile = flowFiles.get(0);
+        assertTrue(spyFileSystem.exists(new Path(TARGET_DIRECTORY + "/" + FILE_NAME)));
+        assertEquals(FILE_NAME, flowFile.getAttribute(CoreAttributes.FILENAME.key()));
+        assertEquals(TARGET_DIRECTORY, flowFile.getAttribute(PutHDFS.ABSOLUTE_HDFS_PATH_ATTRIBUTE));
+        assertEquals("true", flowFile.getAttribute(PutHDFS.TARGET_HDFS_DIR_CREATED_ATTRIBUTE));
+
+        Mockito.verify(spyFileSystem, Mockito.never()).rename(Mockito.any(Path.class), Mockito.any(Path.class));
     }
 
     @Test

[nifi] 05/15: NIFI-9419 ParseCEF - Upgraded parcefone and supported empty extensions

Posted by jo...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

joewitt pushed a commit to branch support/nifi-1.15
in repository https://gitbox.apache.org/repos/asf/nifi.git

commit 25a273834ec76032e426a6fa3f12ba570925b7c9
Author: Pierre Villard <pi...@gmail.com>
AuthorDate: Mon Nov 29 18:45:54 2021 +0100

    NIFI-9419 ParseCEF - Upgraded parcefone and supported empty extensions
    
    - Upgraded com.fluenda:parcefone from 2.0.0 to 2.1.0
    - Added Accept empty extensions property to ParseCEF
    
    This closes #5555
    
    Co-authored-by: David Handermann <ex...@apache.org>
    Signed-off-by: David Handermann <ex...@apache.org>
---
 .../apache/nifi/processors/standard/ParseCEF.java  | 20 +++++++++++---
 .../nifi/processors/standard/TestParseCEF.java     | 32 ++++++++++++++++++++++
 nifi-nar-bundles/nifi-standard-bundle/pom.xml      |  2 +-
 3 files changed, 49 insertions(+), 5 deletions(-)

diff --git a/nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/ParseCEF.java b/nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/ParseCEF.java
index 4d8c7f6..de486a2 100644
--- a/nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/ParseCEF.java
+++ b/nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/ParseCEF.java
@@ -147,6 +147,16 @@ public class ParseCEF extends AbstractProcessor {
             .allowableValues("true", "false")
             .build();
 
+    public static final PropertyDescriptor ACCEPT_EMPTY_EXTENSIONS = new PropertyDescriptor.Builder()
+            .name("ACCEPT_EMPTY_EXTENSIONS")
+            .displayName("Accept empty extensions")
+            .description("If set to true, empty extensions will be accepted and will be associated to a null value.")
+            .addValidator(StandardValidators.BOOLEAN_VALIDATOR)
+            .required(true)
+            .defaultValue("false")
+            .allowableValues("true", "false")
+            .build();
+
     public static final PropertyDescriptor VALIDATE_DATA = new PropertyDescriptor.Builder()
             .name("VALIDATE_DATA")
             .displayName("Validate the CEF event")
@@ -200,6 +210,7 @@ public class ParseCEF extends AbstractProcessor {
         properties.add(FIELDS_DESTINATION);
         properties.add(APPEND_RAW_MESSAGE_TO_JSON);
         properties.add(INCLUDE_CUSTOM_EXTENSIONS);
+        properties.add(ACCEPT_EMPTY_EXTENSIONS);
         properties.add(VALIDATE_DATA);
         properties.add(TIME_REPRESENTATION);
         properties.add(DATETIME_REPRESENTATION);
@@ -262,12 +273,13 @@ public class ParseCEF extends AbstractProcessor {
             // validator failed to identify an invalid Locale
             final Locale parcefoneLocale = Locale.forLanguageTag(context.getProperty(DATETIME_REPRESENTATION).getValue());
             final boolean validateData = context.getProperty(VALIDATE_DATA).asBoolean();
-            event = parser.parse(buffer, validateData, parcefoneLocale);
+            final boolean acceptEmptyExtensions = context.getProperty(ACCEPT_EMPTY_EXTENSIONS).asBoolean();
+            event = parser.parse(buffer, validateData, acceptEmptyExtensions, parcefoneLocale);
 
         } catch (Exception e) {
             // This should never trigger but adding in here as a fencing mechanism to
             // address possible ParCEFone bugs.
-            getLogger().error("Parser returned unexpected Exception {} while processing {}; routing to failure", new Object[] {e, flowFile});
+            getLogger().error("CEF Parsing Failed: {}", flowFile, e);
             session.transfer(flowFile, REL_FAILURE);
             return;
         }
@@ -339,7 +351,7 @@ public class ParseCEF extends AbstractProcessor {
             session.transfer(flowFile, REL_SUCCESS);
         } catch (CEFHandlingException e) {
             // The flowfile has failed parsing & validation, routing to failure and committing
-            getLogger().error("Failed to parse {} as a CEF message due to {}; routing to failure", new Object[] {flowFile, e});
+            getLogger().error("Reading CEF Event Failed: {}", flowFile, e);
             // Create a provenance event recording the routing to failure
             session.getProvenanceReporter().route(flowFile, REL_FAILURE);
             session.transfer(flowFile, REL_FAILURE);
@@ -379,6 +391,7 @@ public class ParseCEF extends AbstractProcessor {
                 return new ValidationResult.Builder().subject(subject).input(input).valid(false)
                         .explanation(subject + " cannot be empty").build();
             }
+
             final Locale testLocale = Locale.forLanguageTag(input);
             final Locale[] availableLocales = Locale.getAvailableLocales();
 
@@ -389,7 +402,6 @@ public class ParseCEF extends AbstractProcessor {
                         .explanation(input + " is not a valid locale format.").build();
             } else {
                 return new ValidationResult.Builder().subject(subject).input(input).valid(true).build();
-
             }
 
         }
diff --git a/nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/test/java/org/apache/nifi/processors/standard/TestParseCEF.java b/nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/test/java/org/apache/nifi/processors/standard/TestParseCEF.java
index 94c61ca..9ec2e87 100644
--- a/nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/test/java/org/apache/nifi/processors/standard/TestParseCEF.java
+++ b/nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/test/java/org/apache/nifi/processors/standard/TestParseCEF.java
@@ -19,6 +19,7 @@ package org.apache.nifi.processors.standard;
 
 import com.fasterxml.jackson.databind.JsonNode;
 import com.fasterxml.jackson.databind.ObjectMapper;
+
 import org.apache.nifi.util.MockFlowFile;
 import org.apache.nifi.util.TestRunner;
 import org.apache.nifi.util.TestRunners;
@@ -343,6 +344,37 @@ public class TestParseCEF {
     }
 
     @Test
+    public void testAcceptEmptyExtensions() throws Exception {
+        String sample3 = "CEF:0|TestVendor|TestProduct|TestVersion|TestEventClassID|TestName|Low|" +
+                "rt=Feb 09 2015 00:27:43 UTC cn3Label=Test Long cn3= " +
+                "cfp1=1.234 cfp1Label=Test FP Number smac=00:00:0c:07:ac:00 " +
+                "c6a3=2001:cdba::3257:9652 c6a3Label=Test IPv6 cs1Label=Test String cs1=test test test chocolate " +
+                "destinationTranslatedAddress=123.123.123.123 " +
+                "deviceCustomDate1=Feb 06 2015 13:27:43 " +
+                "dpt= agt= dlat=";
+
+        final TestRunner runner = TestRunners.newTestRunner(new ParseCEF());
+        runner.setProperty(ParseCEF.FIELDS_DESTINATION, ParseCEF.DESTINATION_CONTENT);
+        runner.setProperty(ParseCEF.TIME_REPRESENTATION, ParseCEF.UTC);
+        runner.setProperty(ParseCEF.INCLUDE_CUSTOM_EXTENSIONS, "true");
+        runner.setProperty(ParseCEF.ACCEPT_EMPTY_EXTENSIONS, "true");
+        runner.setProperty(ParseCEF.VALIDATE_DATA, "false");
+        runner.enqueue(sample3.getBytes());
+        runner.run();
+
+        runner.assertAllFlowFilesTransferred(ParseCEF.REL_SUCCESS, 1);
+        final MockFlowFile mff = runner.getFlowFilesForRelationship(ParseCEF.REL_SUCCESS).get(0);
+
+        byte [] rawJson = mff.toByteArray();
+
+        JsonNode results = new ObjectMapper().readTree(rawJson);
+
+        JsonNode extensions = results.get("extension");
+        Assert.assertTrue(extensions.has("cn3"));
+        Assert.assertTrue(extensions.get("cn3").isNull());
+    }
+
+    @Test
     public void testDataValidation() throws Exception {
         String invalidEvent = sample1 + " proto=ICMP"; // according to the standard, proto can be either tcp or udp.
 
diff --git a/nifi-nar-bundles/nifi-standard-bundle/pom.xml b/nifi-nar-bundles/nifi-standard-bundle/pom.xml
index f69c089..a3aad04 100644
--- a/nifi-nar-bundles/nifi-standard-bundle/pom.xml
+++ b/nifi-nar-bundles/nifi-standard-bundle/pom.xml
@@ -316,7 +316,7 @@
             <dependency>
                 <groupId>com.fluenda</groupId>
                 <artifactId>parcefone</artifactId>
-                <version>2.0.0</version>
+                <version>2.1.0</version>
             </dependency>
             <dependency>
                 <groupId>com.github.wnameless.json</groupId>
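
Together with the parcefone 2.1.0 upgrade above, the new testAcceptEmptyExtensions() case shows that an extension supplied with no value (cn3= followed by a space) is serialized as an explicit JSON null rather than being dropped. A minimal sketch of how a downstream consumer might tell "present but empty" apart from "absent entirely", using the same Jackson API as the test (class and method names are illustrative):

    import com.fasterxml.jackson.databind.JsonNode;
    import com.fasterxml.jackson.databind.ObjectMapper;

    final class EmptyExtensionCheck {

        // With ACCEPT_EMPTY_EXTENSIONS enabled, "cn3=" arrives as a null
        // JSON node; a key that was never sent is simply missing.
        static String describe(final byte[] rawJson, final String key) throws Exception {
            final JsonNode extensions = new ObjectMapper().readTree(rawJson).get("extension");
            if (extensions == null || !extensions.has(key)) {
                return key + ": absent from the event";
            }
            final JsonNode value = extensions.get(key);
            return value.isNull() ? key + ": present but empty" : key + ": " + value.asText();
        }
    }
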

[nifi] 03/15: NIFI-9426 Removed unused jackson-mapper-asl from MiNiFi

Posted by jo...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

joewitt pushed a commit to branch support/nifi-1.15
in repository https://gitbox.apache.org/repos/asf/nifi.git

commit c7be58abe6023d52ad52d2bda40dbd29019e9d60
Author: exceptionfactory <ex...@apache.org>
AuthorDate: Tue Nov 30 14:32:29 2021 -0600

    NIFI-9426 Removed unused jackson-mapper-asl from MiNiFi
    
    Signed-off-by: Matthew Burgess <ma...@apache.org>
    
    This closes #5561
---
 .../minifi-framework/minifi-framework-core/pom.xml                   | 5 -----
 minifi/pom.xml                                                       | 5 -----
 2 files changed, 10 deletions(-)

diff --git a/minifi/minifi-nar-bundles/minifi-framework-bundle/minifi-framework/minifi-framework-core/pom.xml b/minifi/minifi-nar-bundles/minifi-framework-bundle/minifi-framework/minifi-framework-core/pom.xml
index 91b43c1..6b147dd 100644
--- a/minifi/minifi-nar-bundles/minifi-framework-bundle/minifi-framework/minifi-framework-core/pom.xml
+++ b/minifi/minifi-nar-bundles/minifi-framework-bundle/minifi-framework/minifi-framework-core/pom.xml
@@ -38,11 +38,6 @@ limitations under the License.
             <scope>provided</scope>
         </dependency>
         <dependency>
-            <groupId>org.codehaus.jackson</groupId>
-            <artifactId>jackson-mapper-asl</artifactId>
-        </dependency>
-
-        <dependency>
             <groupId>org.eclipse.jetty</groupId>
             <artifactId>jetty-servlet</artifactId>
             <version>${jetty.version}</version>
diff --git a/minifi/pom.xml b/minifi/pom.xml
index 068ae01..0eb96ac 100644
--- a/minifi/pom.xml
+++ b/minifi/pom.xml
@@ -956,11 +956,6 @@ limitations under the License.
                 <version>1.4.199</version>
             </dependency>
             <dependency>
-                <groupId>org.codehaus.jackson</groupId>
-                <artifactId>jackson-mapper-asl</artifactId>
-                <version>1.9.13</version>
-            </dependency>
-            <dependency>
                 <groupId>com.fasterxml.jackson.core</groupId>
                 <artifactId>jackson-databind</artifactId>
                 <version>${jackson.version}</version>
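
The removed artifact is the legacy Codehaus Jackson 1.x mapper; the jackson-databind dependency retained above is its Jackson 2 successor, and mvn dependency:analyze is the usual way to confirm an artifact really is unused before deleting it. For reference, a minimal Jackson 2 round-trip with the retained library (values are illustrative only):

    import com.fasterxml.jackson.databind.ObjectMapper;

    final class Jackson2RoundTrip {

        public static void main(final String[] args) throws Exception {
            final ObjectMapper mapper = new ObjectMapper();
            // Serialize a single-entry map, then read the value back out.
            final String json = mapper.writeValueAsString(
                    java.util.Collections.singletonMap("status", "ok"));
            System.out.println(mapper.readTree(json).get("status").asText()); // ok
        }
    }
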

[nifi] 02/15: NIFI-9468 - Bump Kafka client from 2.6.0 to 2.6.3

Posted by jo...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

joewitt pushed a commit to branch support/nifi-1.15
in repository https://gitbox.apache.org/repos/asf/nifi.git

commit e66c21a38042f433d5c9bd7ca8eadbf29ba5f7b7
Author: Pierre Villard <pi...@gmail.com>
AuthorDate: Thu Dec 9 18:13:53 2021 +0100

    NIFI-9468 - Bump Kafka client from 2.6.0 to 2.6.3
    
    This closes #5588
    
    Signed-off-by: David Handermann <ex...@apache.org>
---
 nifi-nar-bundles/nifi-kafka-bundle/pom.xml | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/nifi-nar-bundles/nifi-kafka-bundle/pom.xml b/nifi-nar-bundles/nifi-kafka-bundle/pom.xml
index fcddda0..921f594 100644
--- a/nifi-nar-bundles/nifi-kafka-bundle/pom.xml
+++ b/nifi-nar-bundles/nifi-kafka-bundle/pom.xml
@@ -29,7 +29,7 @@
       <kafka11.version>0.11.0.3</kafka11.version>
       <kafka1.0.version>1.0.2</kafka1.0.version>
       <kafka2.0.version>2.0.0</kafka2.0.version>
-      <kafka2.6.version>2.6.0</kafka2.6.version>
+      <kafka2.6.version>2.6.3</kafka2.6.version>
     </properties>
 
     <modules>
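
Kafka 2.6.3 is a patch release on the 2.6 line, so the bump is a drop-in change confined to the kafka2.6.version property. One quick way to verify which client version an assembly actually resolved, assuming kafka-clients is on the classpath (the class name here is illustrative):

    import org.apache.kafka.common.utils.AppInfoParser;

    final class KafkaClientVersionCheck {

        public static void main(final String[] args) {
            // Prints the version baked into the kafka-clients jar that won
            // dependency resolution, e.g. 2.6.3 after this change.
            System.out.println("kafka-clients version: " + AppInfoParser.getVersion());
        }
    }
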