Posted to issues@nifi.apache.org by trixpan <gi...@git.apache.org> on 2017/03/24 10:24:25 UTC

[GitHub] nifi pull request #1619: NIFI-2747 - Introduce FuzzyHashContent processor

GitHub user trixpan opened a pull request:

    https://github.com/apache/nifi/pull/1619

    NIFI-2747 - Introduce FuzzyHashContent processor

    Thank you for submitting a contribution to Apache NiFi.
    
    In order to streamline the review of the contribution we ask you
    to ensure the following steps have been taken:
    
    ### For all changes:
    - [X] Is there a JIRA ticket associated with this PR? Is it referenced 
         in the commit message?
    
    - [X] Does your PR title start with NIFI-XXXX where XXXX is the JIRA number you are trying to resolve? Pay particular attention to the hyphen "-" character.
    
    - [X] Has your PR been rebased against the latest commit within the target branch (typically master)?
    
    - [X] ~~~Is your initial contribution a single, squashed commit?~~~
    
    ### For code changes:
    - [X] Have you ensured that the full suite of tests is executed via mvn -Pcontrib-check clean install at the root nifi folder?
    - [X] Have you written or updated unit tests to verify your changes?
    - [X] If adding new dependencies to the code, are these dependencies licensed in a way that is compatible for inclusion under [ASF 2.0](http://www.apache.org/legal/resolved.html#category-a)? 
    - [X] If applicable, have you updated the LICENSE file, including the main LICENSE file under nifi-assembly?
    - [X] If applicable, have you updated the NOTICE file, including the main NOTICE file found under nifi-assembly?
    - [X] If adding new Properties, have you added .displayName in addition to .name (programmatic access) for each of the new properties?
    
    ### For documentation related changes:
    - [X] Have you ensured that format looks appropriate for the output in which it is rendered?
    
    ### Note:
    Please ensure that once the PR is submitted, you check travis-ci for build issues and submit an update to your PR as soon as possible.


You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/trixpan/nifi NIFI-2747

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/nifi/pull/1619.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #1619
    
----
commit 0f810d513d57d799e976f797d4c2062ab3e59ee2
Author: Andre F de Miranda <tr...@users.noreply.github.com>
Date:   2017-03-24T06:53:58Z

    NIFI-2747 - Introduce FuzzyHashContent processor

commit 6f47cb52fcae254782cb47364f51ffcd999bc22a
Author: Andre F de Miranda <tr...@users.noreply.github.com>
Date:   2017-03-24T10:22:23Z

    NIFI-3466 - Minor typo missed during devel/review

----


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastructure@apache.org or file a JIRA ticket
with INFRA.
---

[GitHub] nifi pull request #1619: NIFI-2747 - Introduce FuzzyHashContent processor

Posted by apiri <gi...@git.apache.org>.
Github user apiri commented on a diff in the pull request:

    https://github.com/apache/nifi/pull/1619#discussion_r108526050
  
    --- Diff: nifi-nar-bundles/nifi-cybersecurity-bundle/nifi-cybersecurity-processors/src/main/java/org/apache/nifi/processors/cybersecurity/FuzzyHashContent.java ---
    @@ -0,0 +1,192 @@
    +/*
    + * Licensed to the Apache Software Foundation (ASF) under one or more
    + * contributor license agreements.  See the NOTICE file distributed with
    + * this work for additional information regarding copyright ownership.
    + * The ASF licenses this file to You under the Apache License, Version 2.0
    + * (the "License"); you may not use this file except in compliance with
    + * the License.  You may obtain a copy of the License at
    + *
    + *     http://www.apache.org/licenses/LICENSE-2.0
    + *
    + * Unless required by applicable law or agreed to in writing, software
    + * distributed under the License is distributed on an "AS IS" BASIS,
    + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    + * See the License for the specific language governing permissions and
    + * limitations under the License.
    + */
    +package org.apache.nifi.processors.cybersecurity;
    +
    +import com.idealista.tlsh.TLSH;
    +import com.idealista.tlsh.exceptions.InsufficientComplexityException;
    +import info.debatty.java.spamsum.SpamSum;
    +
    +import org.apache.nifi.annotation.behavior.EventDriven;
    +import org.apache.nifi.annotation.behavior.InputRequirement;
    +import org.apache.nifi.annotation.behavior.SideEffectFree;
    +import org.apache.nifi.annotation.behavior.SupportsBatching;
    +import org.apache.nifi.annotation.behavior.ReadsAttribute;
    +import org.apache.nifi.annotation.behavior.ReadsAttributes;
    +import org.apache.nifi.annotation.behavior.WritesAttribute;
    +import org.apache.nifi.annotation.behavior.WritesAttributes;
    +import org.apache.nifi.annotation.documentation.CapabilityDescription;
    +import org.apache.nifi.annotation.documentation.SeeAlso;
    +import org.apache.nifi.annotation.documentation.Tags;
    +import org.apache.nifi.annotation.lifecycle.OnScheduled;
    +
    +import org.apache.nifi.components.AllowableValue;
    +import org.apache.nifi.components.PropertyDescriptor;
    +import org.apache.nifi.flowfile.FlowFile;
    +import org.apache.nifi.logging.ComponentLog;
    +import org.apache.nifi.processor.exception.ProcessException;
    +import org.apache.nifi.processor.AbstractProcessor;
    +import org.apache.nifi.processor.ProcessContext;
    +import org.apache.nifi.processor.ProcessSession;
    +import org.apache.nifi.processor.ProcessorInitializationContext;
    +import org.apache.nifi.processor.Relationship;
    +import org.apache.nifi.processor.io.InputStreamCallback;
    +import org.apache.nifi.processor.util.StandardValidators;
    +import org.apache.nifi.processors.standard.HashContent;
    +
    +import org.apache.nifi.stream.io.StreamUtils;
    +
    +import java.io.ByteArrayOutputStream;
    +import java.io.IOException;
    +import java.io.InputStream;
    +import java.util.ArrayList;
    +import java.util.Collections;
    +import java.util.HashSet;
    +import java.util.List;
    +import java.util.Set;
    +import java.util.concurrent.atomic.AtomicReference;
    +
    +
    +@EventDriven
    +@SideEffectFree
    +@SupportsBatching
    +@InputRequirement(InputRequirement.Requirement.INPUT_REQUIRED)
    +@Tags({"hashing", "fuzzy-hashing", "cyber-security"})
    +@CapabilityDescription("Calculates a fuzzy/locality-sensitive hash value for the Content of a FlowFile and puts that " +
    +        "hash value on the FlowFile as an attribute whose name is determined by the <Hash Attribute Name> property." +
    +        "Note: this processor only offers non-criptographic hash algorithms. And it should be not be " +
    +        "seen as a replacement to the HashContent processor")
    +
    +@SeeAlso({HashContent.class})
    +@ReadsAttributes({@ReadsAttribute(attribute="", description="")})
    +@WritesAttributes({@WritesAttribute(attribute = "<Hash Attribute Name>", description = "This Processor adds an attribute whose value is the result of Hashing the "
    +        + "existing FlowFile content. The name of this attribute is specified by the <Hash Attribute Name> property")})
    +
    +public class FuzzyHashContent extends AbstractProcessor {
    +
    +    public static final AllowableValue allowableValueSSDEEP = new AllowableValue(
    +            "ssdeep",
    +            "ssdeep",
    +            "Uses ssdeep / SpamSum 'context triggered piecewise hash'.");
    +    public static final AllowableValue allowableValueTLSH = new AllowableValue(
    +            "tlsh",
    +            "tlsh",
    +            "Uses TLSH (Trend 'Locality Sensitive Hash'). Note: FlowFile Content must be at least 512 characters long");
    +
    +    public static final PropertyDescriptor ATTRIBUTE_NAME = new PropertyDescriptor.Builder()
    +            .name("ATTRIBUTE_NAME")
    +            .displayName("Hash Attribute Name")
    +            .description("The name of the FlowFile Attribute into which the Hash Value should be written. " +
    +                    "If the value already exists, it will be overwritten")
    +            .required(true)
    +            .addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
    +            .defaultValue("fuzzyhash.value")
    +            .build();
    +
    +    public static final PropertyDescriptor HASH_ALGORITHM = new PropertyDescriptor.Builder()
    --- End diff --
    
    Does it make sense to allow this to be driven via EL?
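    
    For context, in the NiFi 1.x API of this period a property descriptor opts into Expression Language via the builder, roughly as sketched below. The descriptor fields are copied from the quoted diff; the EL wiring itself is illustrative only, and whether EL combines cleanly with a fixed allowable-value list is part of what the question is probing.
    
        // Hedged sketch only -- not code from this PR.
        public static final PropertyDescriptor HASH_ALGORITHM = new PropertyDescriptor.Builder()
                .name("HASH_ALGORITHM")
                .displayName("Hashing Algorithm")
                .description("The hashing algorithm utilised")
                .allowableValues(allowableValueSSDEEP, allowableValueTLSH)
                .required(true)
                .expressionLanguageSupported(true)   // the switch the question is about
                .addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
                .build();
    
        // The value would then be resolved per FlowFile in onTrigger:
        // final String algorithm = context.getProperty(HASH_ALGORITHM)
        //         .evaluateAttributeExpressions(flowFile)
        //         .getValue();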


---

[GitHub] nifi pull request #1619: NIFI-2747 - Introduce FuzzyHashContent processor

Posted by apiri <gi...@git.apache.org>.
Github user apiri commented on a diff in the pull request:

    https://github.com/apache/nifi/pull/1619#discussion_r108526366
  
    --- Diff: nifi-nar-bundles/nifi-cybersecurity-bundle/nifi-cybersecurity-processors/src/main/java/org/apache/nifi/processors/cybersecurity/FuzzyHashContent.java ---
    @@ -0,0 +1,192 @@
    +/*
    + * Licensed to the Apache Software Foundation (ASF) under one or more
    + * contributor license agreements.  See the NOTICE file distributed with
    + * this work for additional information regarding copyright ownership.
    + * The ASF licenses this file to You under the Apache License, Version 2.0
    + * (the "License"); you may not use this file except in compliance with
    + * the License.  You may obtain a copy of the License at
    + *
    + *     http://www.apache.org/licenses/LICENSE-2.0
    + *
    + * Unless required by applicable law or agreed to in writing, software
    + * distributed under the License is distributed on an "AS IS" BASIS,
    + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    + * See the License for the specific language governing permissions and
    + * limitations under the License.
    + */
    +package org.apache.nifi.processors.cybersecurity;
    +
    +import com.idealista.tlsh.TLSH;
    +import com.idealista.tlsh.exceptions.InsufficientComplexityException;
    +import info.debatty.java.spamsum.SpamSum;
    +
    +import org.apache.nifi.annotation.behavior.EventDriven;
    +import org.apache.nifi.annotation.behavior.InputRequirement;
    +import org.apache.nifi.annotation.behavior.SideEffectFree;
    +import org.apache.nifi.annotation.behavior.SupportsBatching;
    +import org.apache.nifi.annotation.behavior.ReadsAttribute;
    +import org.apache.nifi.annotation.behavior.ReadsAttributes;
    +import org.apache.nifi.annotation.behavior.WritesAttribute;
    +import org.apache.nifi.annotation.behavior.WritesAttributes;
    +import org.apache.nifi.annotation.documentation.CapabilityDescription;
    +import org.apache.nifi.annotation.documentation.SeeAlso;
    +import org.apache.nifi.annotation.documentation.Tags;
    +import org.apache.nifi.annotation.lifecycle.OnScheduled;
    +
    +import org.apache.nifi.components.AllowableValue;
    +import org.apache.nifi.components.PropertyDescriptor;
    +import org.apache.nifi.flowfile.FlowFile;
    +import org.apache.nifi.logging.ComponentLog;
    +import org.apache.nifi.processor.exception.ProcessException;
    +import org.apache.nifi.processor.AbstractProcessor;
    +import org.apache.nifi.processor.ProcessContext;
    +import org.apache.nifi.processor.ProcessSession;
    +import org.apache.nifi.processor.ProcessorInitializationContext;
    +import org.apache.nifi.processor.Relationship;
    +import org.apache.nifi.processor.io.InputStreamCallback;
    +import org.apache.nifi.processor.util.StandardValidators;
    +import org.apache.nifi.processors.standard.HashContent;
    +
    +import org.apache.nifi.stream.io.StreamUtils;
    +
    +import java.io.ByteArrayOutputStream;
    +import java.io.IOException;
    +import java.io.InputStream;
    +import java.util.ArrayList;
    +import java.util.Collections;
    +import java.util.HashSet;
    +import java.util.List;
    +import java.util.Set;
    +import java.util.concurrent.atomic.AtomicReference;
    +
    +
    +@EventDriven
    +@SideEffectFree
    +@SupportsBatching
    +@InputRequirement(InputRequirement.Requirement.INPUT_REQUIRED)
    +@Tags({"hashing", "fuzzy-hashing", "cyber-security"})
    +@CapabilityDescription("Calculates a fuzzy/locality-sensitive hash value for the Content of a FlowFile and puts that " +
    +        "hash value on the FlowFile as an attribute whose name is determined by the <Hash Attribute Name> property." +
    +        "Note: this processor only offers non-criptographic hash algorithms. And it should be not be " +
    +        "seen as a replacement to the HashContent processor")
    +
    +@SeeAlso({HashContent.class})
    +@ReadsAttributes({@ReadsAttribute(attribute="", description="")})
    +@WritesAttributes({@WritesAttribute(attribute = "<Hash Attribute Name>", description = "This Processor adds an attribute whose value is the result of Hashing the "
    +        + "existing FlowFile content. The name of this attribute is specified by the <Hash Attribute Name> property")})
    +
    +public class FuzzyHashContent extends AbstractProcessor {
    +
    +    public static final AllowableValue allowableValueSSDEEP = new AllowableValue(
    +            "ssdeep",
    +            "ssdeep",
    +            "Uses ssdeep / SpamSum 'context triggered piecewise hash'.");
    +    public static final AllowableValue allowableValueTLSH = new AllowableValue(
    +            "tlsh",
    +            "tlsh",
    +            "Uses TLSH (Trend 'Locality Sensitive Hash'). Note: FlowFile Content must be at least 512 characters long");
    +
    +    public static final PropertyDescriptor ATTRIBUTE_NAME = new PropertyDescriptor.Builder()
    +            .name("ATTRIBUTE_NAME")
    +            .displayName("Hash Attribute Name")
    +            .description("The name of the FlowFile Attribute into which the Hash Value should be written. " +
    +                    "If the value already exists, it will be overwritten")
    +            .required(true)
    +            .addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
    +            .defaultValue("fuzzyhash.value")
    +            .build();
    +
    +    public static final PropertyDescriptor HASH_ALGORITHM = new PropertyDescriptor.Builder()
    +            .name("HASH_ALGORITHM")
    +            .displayName("Hashing Algorithm")
    +            .description("The hashing algorithm utilised")
    +            .allowableValues(allowableValueSSDEEP, allowableValueTLSH)
    +            .required(true)
    +            .addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
    +            .build();
    +
    +    public static final Relationship REL_SUCCESS = new Relationship.Builder()
    +            .name("Success")
    +            .description("Any FlowFile that is successfully hashed will be sent to this Relationship.")
    +            .build();
    +
    +    public static final Relationship REL_FAILURE = new Relationship.Builder()
    +            .name("Failure")
    +            .description("Any FlowFile that is successfully hashed will be sent to this Relationship.")
    +            .build();
    +
    +    private List<PropertyDescriptor> descriptors;
    +
    +    private Set<Relationship> relationships;
    +
    +    @Override
    +    protected void init(final ProcessorInitializationContext context) {
    +        final List<PropertyDescriptor> descriptors = new ArrayList<PropertyDescriptor>();
    +        descriptors.add(ATTRIBUTE_NAME);
    +        descriptors.add(HASH_ALGORITHM);
    +        this.descriptors = Collections.unmodifiableList(descriptors);
    +
    +        final Set<Relationship> relationships = new HashSet<Relationship>();
    +        relationships.add(REL_SUCCESS);
    +        relationships.add(REL_FAILURE);
    +        this.relationships = Collections.unmodifiableSet(relationships);
    +    }
    +
    +    @Override
    +    public Set<Relationship> getRelationships() {
    +        return this.relationships;
    +    }
    +
    +    @Override
    +    public final List<PropertyDescriptor> getSupportedPropertyDescriptors() {
    +        return descriptors;
    +    }
    +
    +    @OnScheduled
    +    public void onScheduled(final ProcessContext context) {
    +
    +    }
    +
    +    @Override
    +    public void onTrigger(final ProcessContext context, final ProcessSession session) throws ProcessException {
    +        FlowFile flowFile = session.get();
    +        if (flowFile == null) {
    +            return;
    +        }
    +
    +        final ComponentLog logger = getLogger();
    +        final String algorithm = context.getProperty(HASH_ALGORITHM).getValue();
    --- End diff --
    
    If EL is not applied to the associated descriptor, perhaps it makes more sense to check/cache this in onScheduled?
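    
    For reference, the alternative being suggested would look roughly like the sketch below: resolve the property once when the processor is scheduled and reuse the cached value in onTrigger. This only illustrates the suggestion, not code from the PR; the field name is hypothetical.
    
        // Hedged sketch of caching the algorithm at schedule time (field name is hypothetical).
        private volatile String algorithm;
    
        @OnScheduled
        public void onScheduled(final ProcessContext context) {
            // HASH_ALGORITHM has no EL support, so its value cannot change per FlowFile
            // and can safely be read once here.
            this.algorithm = context.getProperty(HASH_ALGORITHM).getValue();
        }
    
        // onTrigger would then use the cached field instead of re-reading the property
        // on every invocation.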


---

[GitHub] nifi pull request #1619: NIFI-2747 - Introduce FuzzyHashContent processor

Posted by apiri <gi...@git.apache.org>.
Github user apiri commented on a diff in the pull request:

    https://github.com/apache/nifi/pull/1619#discussion_r108933894
  
    --- Diff: nifi-nar-bundles/nifi-cybersecurity-bundle/nifi-cybersecurity-processors/src/main/java/org/apache/nifi/processors/cybersecurity/FuzzyHashContent.java ---
    @@ -0,0 +1,192 @@
    +/*
    + * Licensed to the Apache Software Foundation (ASF) under one or more
    + * contributor license agreements.  See the NOTICE file distributed with
    + * this work for additional information regarding copyright ownership.
    + * The ASF licenses this file to You under the Apache License, Version 2.0
    + * (the "License"); you may not use this file except in compliance with
    + * the License.  You may obtain a copy of the License at
    + *
    + *     http://www.apache.org/licenses/LICENSE-2.0
    + *
    + * Unless required by applicable law or agreed to in writing, software
    + * distributed under the License is distributed on an "AS IS" BASIS,
    + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    + * See the License for the specific language governing permissions and
    + * limitations under the License.
    + */
    +package org.apache.nifi.processors.cybersecurity;
    +
    +import com.idealista.tlsh.TLSH;
    +import com.idealista.tlsh.exceptions.InsufficientComplexityException;
    +import info.debatty.java.spamsum.SpamSum;
    +
    +import org.apache.nifi.annotation.behavior.EventDriven;
    +import org.apache.nifi.annotation.behavior.InputRequirement;
    +import org.apache.nifi.annotation.behavior.SideEffectFree;
    +import org.apache.nifi.annotation.behavior.SupportsBatching;
    +import org.apache.nifi.annotation.behavior.ReadsAttribute;
    +import org.apache.nifi.annotation.behavior.ReadsAttributes;
    +import org.apache.nifi.annotation.behavior.WritesAttribute;
    +import org.apache.nifi.annotation.behavior.WritesAttributes;
    +import org.apache.nifi.annotation.documentation.CapabilityDescription;
    +import org.apache.nifi.annotation.documentation.SeeAlso;
    +import org.apache.nifi.annotation.documentation.Tags;
    +import org.apache.nifi.annotation.lifecycle.OnScheduled;
    +
    +import org.apache.nifi.components.AllowableValue;
    +import org.apache.nifi.components.PropertyDescriptor;
    +import org.apache.nifi.flowfile.FlowFile;
    +import org.apache.nifi.logging.ComponentLog;
    +import org.apache.nifi.processor.exception.ProcessException;
    +import org.apache.nifi.processor.AbstractProcessor;
    +import org.apache.nifi.processor.ProcessContext;
    +import org.apache.nifi.processor.ProcessSession;
    +import org.apache.nifi.processor.ProcessorInitializationContext;
    +import org.apache.nifi.processor.Relationship;
    +import org.apache.nifi.processor.io.InputStreamCallback;
    +import org.apache.nifi.processor.util.StandardValidators;
    +import org.apache.nifi.processors.standard.HashContent;
    +
    +import org.apache.nifi.stream.io.StreamUtils;
    +
    +import java.io.ByteArrayOutputStream;
    +import java.io.IOException;
    +import java.io.InputStream;
    +import java.util.ArrayList;
    +import java.util.Collections;
    +import java.util.HashSet;
    +import java.util.List;
    +import java.util.Set;
    +import java.util.concurrent.atomic.AtomicReference;
    +
    +
    +@EventDriven
    +@SideEffectFree
    +@SupportsBatching
    +@InputRequirement(InputRequirement.Requirement.INPUT_REQUIRED)
    +@Tags({"hashing", "fuzzy-hashing", "cyber-security"})
    +@CapabilityDescription("Calculates a fuzzy/locality-sensitive hash value for the Content of a FlowFile and puts that " +
    +        "hash value on the FlowFile as an attribute whose name is determined by the <Hash Attribute Name> property." +
    +        "Note: this processor only offers non-criptographic hash algorithms. And it should be not be " +
    +        "seen as a replacement to the HashContent processor")
    +
    +@SeeAlso({HashContent.class})
    +@ReadsAttributes({@ReadsAttribute(attribute="", description="")})
    +@WritesAttributes({@WritesAttribute(attribute = "<Hash Attribute Name>", description = "This Processor adds an attribute whose value is the result of Hashing the "
    +        + "existing FlowFile content. The name of this attribute is specified by the <Hash Attribute Name> property")})
    +
    +public class FuzzyHashContent extends AbstractProcessor {
    +
    +    public static final AllowableValue allowableValueSSDEEP = new AllowableValue(
    +            "ssdeep",
    +            "ssdeep",
    +            "Uses ssdeep / SpamSum 'context triggered piecewise hash'.");
    +    public static final AllowableValue allowableValueTLSH = new AllowableValue(
    +            "tlsh",
    +            "tlsh",
    +            "Uses TLSH (Trend 'Locality Sensitive Hash'). Note: FlowFile Content must be at least 512 characters long");
    +
    +    public static final PropertyDescriptor ATTRIBUTE_NAME = new PropertyDescriptor.Builder()
    +            .name("ATTRIBUTE_NAME")
    +            .displayName("Hash Attribute Name")
    +            .description("The name of the FlowFile Attribute into which the Hash Value should be written. " +
    +                    "If the value already exists, it will be overwritten")
    +            .required(true)
    +            .addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
    +            .defaultValue("fuzzyhash.value")
    +            .build();
    +
    +    public static final PropertyDescriptor HASH_ALGORITHM = new PropertyDescriptor.Builder()
    --- End diff --
    
    Sounds fair. Not sure how old HashContent is; it may predate EL. This was more of a comment as to whether or not this would be useful, but it has no bearing on the overall flow and how it would be used.


---

[GitHub] nifi issue #1619: NIFI-2747 - Introduce FuzzyHashContent processor

Posted by apiri <gi...@git.apache.org>.
Github user apiri commented on the issue:

    https://github.com/apache/nifi/pull/1619
  
    Changes look good; L&N looks correct for the added deps.
    
    Will get this merged, thanks!


---

[GitHub] nifi pull request #1619: NIFI-2747 - Introduce FuzzyHashContent processor

Posted by trixpan <gi...@git.apache.org>.
Github user trixpan commented on a diff in the pull request:

    https://github.com/apache/nifi/pull/1619#discussion_r108919106
  
    --- Diff: nifi-nar-bundles/nifi-cybersecurity-bundle/nifi-cybersecurity-processors/src/main/java/org/apache/nifi/processors/cybersecurity/FuzzyHashContent.java ---
    @@ -0,0 +1,192 @@
    +/*
    + * Licensed to the Apache Software Foundation (ASF) under one or more
    + * contributor license agreements.  See the NOTICE file distributed with
    + * this work for additional information regarding copyright ownership.
    + * The ASF licenses this file to You under the Apache License, Version 2.0
    + * (the "License"); you may not use this file except in compliance with
    + * the License.  You may obtain a copy of the License at
    + *
    + *     http://www.apache.org/licenses/LICENSE-2.0
    + *
    + * Unless required by applicable law or agreed to in writing, software
    + * distributed under the License is distributed on an "AS IS" BASIS,
    + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    + * See the License for the specific language governing permissions and
    + * limitations under the License.
    + */
    +package org.apache.nifi.processors.cybersecurity;
    +
    +import com.idealista.tlsh.TLSH;
    +import com.idealista.tlsh.exceptions.InsufficientComplexityException;
    +import info.debatty.java.spamsum.SpamSum;
    +
    +import org.apache.nifi.annotation.behavior.EventDriven;
    +import org.apache.nifi.annotation.behavior.InputRequirement;
    +import org.apache.nifi.annotation.behavior.SideEffectFree;
    +import org.apache.nifi.annotation.behavior.SupportsBatching;
    +import org.apache.nifi.annotation.behavior.ReadsAttribute;
    +import org.apache.nifi.annotation.behavior.ReadsAttributes;
    +import org.apache.nifi.annotation.behavior.WritesAttribute;
    +import org.apache.nifi.annotation.behavior.WritesAttributes;
    +import org.apache.nifi.annotation.documentation.CapabilityDescription;
    +import org.apache.nifi.annotation.documentation.SeeAlso;
    +import org.apache.nifi.annotation.documentation.Tags;
    +import org.apache.nifi.annotation.lifecycle.OnScheduled;
    +
    +import org.apache.nifi.components.AllowableValue;
    +import org.apache.nifi.components.PropertyDescriptor;
    +import org.apache.nifi.flowfile.FlowFile;
    +import org.apache.nifi.logging.ComponentLog;
    +import org.apache.nifi.processor.exception.ProcessException;
    +import org.apache.nifi.processor.AbstractProcessor;
    +import org.apache.nifi.processor.ProcessContext;
    +import org.apache.nifi.processor.ProcessSession;
    +import org.apache.nifi.processor.ProcessorInitializationContext;
    +import org.apache.nifi.processor.Relationship;
    +import org.apache.nifi.processor.io.InputStreamCallback;
    +import org.apache.nifi.processor.util.StandardValidators;
    +import org.apache.nifi.processors.standard.HashContent;
    +
    +import org.apache.nifi.stream.io.StreamUtils;
    +
    +import java.io.ByteArrayOutputStream;
    +import java.io.IOException;
    +import java.io.InputStream;
    +import java.util.ArrayList;
    +import java.util.Collections;
    +import java.util.HashSet;
    +import java.util.List;
    +import java.util.Set;
    +import java.util.concurrent.atomic.AtomicReference;
    +
    +
    +@EventDriven
    +@SideEffectFree
    +@SupportsBatching
    +@InputRequirement(InputRequirement.Requirement.INPUT_REQUIRED)
    +@Tags({"hashing", "fuzzy-hashing", "cyber-security"})
    +@CapabilityDescription("Calculates a fuzzy/locality-sensitive hash value for the Content of a FlowFile and puts that " +
    +        "hash value on the FlowFile as an attribute whose name is determined by the <Hash Attribute Name> property." +
    +        "Note: this processor only offers non-criptographic hash algorithms. And it should be not be " +
    +        "seen as a replacement to the HashContent processor")
    +
    +@SeeAlso({HashContent.class})
    +@ReadsAttributes({@ReadsAttribute(attribute="", description="")})
    +@WritesAttributes({@WritesAttribute(attribute = "<Hash Attribute Name>", description = "This Processor adds an attribute whose value is the result of Hashing the "
    +        + "existing FlowFile content. The name of this attribute is specified by the <Hash Attribute Name> property")})
    +
    +public class FuzzyHashContent extends AbstractProcessor {
    +
    +    public static final AllowableValue allowableValueSSDEEP = new AllowableValue(
    +            "ssdeep",
    +            "ssdeep",
    +            "Uses ssdeep / SpamSum 'context triggered piecewise hash'.");
    +    public static final AllowableValue allowableValueTLSH = new AllowableValue(
    +            "tlsh",
    +            "tlsh",
    +            "Uses TLSH (Trend 'Locality Sensitive Hash'). Note: FlowFile Content must be at least 512 characters long");
    +
    +    public static final PropertyDescriptor ATTRIBUTE_NAME = new PropertyDescriptor.Builder()
    +            .name("ATTRIBUTE_NAME")
    +            .displayName("Hash Attribute Name")
    +            .description("The name of the FlowFile Attribute into which the Hash Value should be written. " +
    +                    "If the value already exists, it will be overwritten")
    +            .required(true)
    +            .addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
    +            .defaultValue("fuzzyhash.value")
    +            .build();
    +
    +    public static final PropertyDescriptor HASH_ALGORITHM = new PropertyDescriptor.Builder()
    +            .name("HASH_ALGORITHM")
    +            .displayName("Hashing Algorithm")
    +            .description("The hashing algorithm utilised")
    +            .allowableValues(allowableValueSSDEEP, allowableValueTLSH)
    +            .required(true)
    +            .addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
    +            .build();
    +
    +    public static final Relationship REL_SUCCESS = new Relationship.Builder()
    +            .name("Success")
    +            .description("Any FlowFile that is successfully hashed will be sent to this Relationship.")
    +            .build();
    +
    +    public static final Relationship REL_FAILURE = new Relationship.Builder()
    +            .name("Failure")
    +            .description("Any FlowFile that is successfully hashed will be sent to this Relationship.")
    +            .build();
    +
    +    private List<PropertyDescriptor> descriptors;
    +
    +    private Set<Relationship> relationships;
    +
    +    @Override
    +    protected void init(final ProcessorInitializationContext context) {
    +        final List<PropertyDescriptor> descriptors = new ArrayList<PropertyDescriptor>();
    +        descriptors.add(ATTRIBUTE_NAME);
    +        descriptors.add(HASH_ALGORITHM);
    +        this.descriptors = Collections.unmodifiableList(descriptors);
    +
    +        final Set<Relationship> relationships = new HashSet<Relationship>();
    +        relationships.add(REL_SUCCESS);
    +        relationships.add(REL_FAILURE);
    +        this.relationships = Collections.unmodifiableSet(relationships);
    +    }
    +
    +    @Override
    +    public Set<Relationship> getRelationships() {
    +        return this.relationships;
    +    }
    +
    +    @Override
    +    public final List<PropertyDescriptor> getSupportedPropertyDescriptors() {
    +        return descriptors;
    +    }
    +
    +    @OnScheduled
    +    public void onScheduled(final ProcessContext context) {
    +
    +    }
    +
    +    @Override
    +    public void onTrigger(final ProcessContext context, final ProcessSession session) throws ProcessException {
    +        FlowFile flowFile = session.get();
    +        if (flowFile == null) {
    +            return;
    +        }
    +
    +        final ComponentLog logger = getLogger();
    +        final String algorithm = context.getProperty(HASH_ALGORITHM).getValue();
    +
    +
    +        final AtomicReference<String> hashValueHolder = new AtomicReference<>(null);
    +
    +        try {
    +            session.read(flowFile, new InputStreamCallback() {
    +                @Override
    +                public void process(final InputStream in) throws IOException {
    +                    try (ByteArrayOutputStream holder = new ByteArrayOutputStream()) {
    +                        StreamUtils.copy(in,holder);
    +
    +                        if (algorithm.equals(allowableValueSSDEEP.getValue())) {
    --- End diff --
    
    well noted. fixed


---

[GitHub] nifi issue #1619: NIFI-2747 - Introduce FuzzyHashContent processor

Posted by apiri <gi...@git.apache.org>.
Github user apiri commented on the issue:

    https://github.com/apache/nifi/pull/1619
  
    @trixpan the absence of streaming impls is unfortunate 🙁
    
    Could we please note this in a fashion similar to EvaluateJsonPath?  Just would like it documented for those that make use of it.  I'm going to link this issue to the PerformanceConsider annotation conversation.
    
    Otherwise, looks good to go.
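    
    For reference, the kind of caveat being requested would presumably live in the processor's @CapabilityDescription, similar to how EvaluateJsonPath warns about loading content into memory. The wording below is illustrative only, not the text that was actually committed.
    
        // Illustrative wording only -- the documentation actually added may differ.
        @CapabilityDescription("Calculates a fuzzy/locality-sensitive hash value for the Content of a FlowFile and writes it " +
                "to an attribute named by the <Hash Attribute Name> property. " +
                "Note: both the ssdeep and TLSH implementations load the entire FlowFile content into memory, " +
                "so take care when routing very large FlowFiles through this processor.")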
    



---

[GitHub] nifi pull request #1619: NIFI-2747 - Introduce FuzzyHashContent processor

Posted by apiri <gi...@git.apache.org>.
Github user apiri commented on a diff in the pull request:

    https://github.com/apache/nifi/pull/1619#discussion_r108529113
  
    --- Diff: nifi-nar-bundles/nifi-cybersecurity-bundle/nifi-cybersecurity-processors/src/main/java/org/apache/nifi/processors/cybersecurity/FuzzyHashContent.java ---
    @@ -0,0 +1,192 @@
    +/*
    + * Licensed to the Apache Software Foundation (ASF) under one or more
    + * contributor license agreements.  See the NOTICE file distributed with
    + * this work for additional information regarding copyright ownership.
    + * The ASF licenses this file to You under the Apache License, Version 2.0
    + * (the "License"); you may not use this file except in compliance with
    + * the License.  You may obtain a copy of the License at
    + *
    + *     http://www.apache.org/licenses/LICENSE-2.0
    + *
    + * Unless required by applicable law or agreed to in writing, software
    + * distributed under the License is distributed on an "AS IS" BASIS,
    + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    + * See the License for the specific language governing permissions and
    + * limitations under the License.
    + */
    +package org.apache.nifi.processors.cybersecurity;
    +
    +import com.idealista.tlsh.TLSH;
    +import com.idealista.tlsh.exceptions.InsufficientComplexityException;
    +import info.debatty.java.spamsum.SpamSum;
    +
    +import org.apache.nifi.annotation.behavior.EventDriven;
    +import org.apache.nifi.annotation.behavior.InputRequirement;
    +import org.apache.nifi.annotation.behavior.SideEffectFree;
    +import org.apache.nifi.annotation.behavior.SupportsBatching;
    +import org.apache.nifi.annotation.behavior.ReadsAttribute;
    +import org.apache.nifi.annotation.behavior.ReadsAttributes;
    +import org.apache.nifi.annotation.behavior.WritesAttribute;
    +import org.apache.nifi.annotation.behavior.WritesAttributes;
    +import org.apache.nifi.annotation.documentation.CapabilityDescription;
    +import org.apache.nifi.annotation.documentation.SeeAlso;
    +import org.apache.nifi.annotation.documentation.Tags;
    +import org.apache.nifi.annotation.lifecycle.OnScheduled;
    +
    +import org.apache.nifi.components.AllowableValue;
    +import org.apache.nifi.components.PropertyDescriptor;
    +import org.apache.nifi.flowfile.FlowFile;
    +import org.apache.nifi.logging.ComponentLog;
    +import org.apache.nifi.processor.exception.ProcessException;
    +import org.apache.nifi.processor.AbstractProcessor;
    +import org.apache.nifi.processor.ProcessContext;
    +import org.apache.nifi.processor.ProcessSession;
    +import org.apache.nifi.processor.ProcessorInitializationContext;
    +import org.apache.nifi.processor.Relationship;
    +import org.apache.nifi.processor.io.InputStreamCallback;
    +import org.apache.nifi.processor.util.StandardValidators;
    +import org.apache.nifi.processors.standard.HashContent;
    +
    +import org.apache.nifi.stream.io.StreamUtils;
    +
    +import java.io.ByteArrayOutputStream;
    +import java.io.IOException;
    +import java.io.InputStream;
    +import java.util.ArrayList;
    +import java.util.Collections;
    +import java.util.HashSet;
    +import java.util.List;
    +import java.util.Set;
    +import java.util.concurrent.atomic.AtomicReference;
    +
    +
    +@EventDriven
    +@SideEffectFree
    +@SupportsBatching
    +@InputRequirement(InputRequirement.Requirement.INPUT_REQUIRED)
    +@Tags({"hashing", "fuzzy-hashing", "cyber-security"})
    +@CapabilityDescription("Calculates a fuzzy/locality-sensitive hash value for the Content of a FlowFile and puts that " +
    +        "hash value on the FlowFile as an attribute whose name is determined by the <Hash Attribute Name> property." +
    +        "Note: this processor only offers non-criptographic hash algorithms. And it should be not be " +
    +        "seen as a replacement to the HashContent processor")
    +
    +@SeeAlso({HashContent.class})
    +@ReadsAttributes({@ReadsAttribute(attribute="", description="")})
    +@WritesAttributes({@WritesAttribute(attribute = "<Hash Attribute Name>", description = "This Processor adds an attribute whose value is the result of Hashing the "
    +        + "existing FlowFile content. The name of this attribute is specified by the <Hash Attribute Name> property")})
    +
    +public class FuzzyHashContent extends AbstractProcessor {
    +
    +    public static final AllowableValue allowableValueSSDEEP = new AllowableValue(
    +            "ssdeep",
    +            "ssdeep",
    +            "Uses ssdeep / SpamSum 'context triggered piecewise hash'.");
    +    public static final AllowableValue allowableValueTLSH = new AllowableValue(
    +            "tlsh",
    +            "tlsh",
    +            "Uses TLSH (Trend 'Locality Sensitive Hash'). Note: FlowFile Content must be at least 512 characters long");
    +
    +    public static final PropertyDescriptor ATTRIBUTE_NAME = new PropertyDescriptor.Builder()
    +            .name("ATTRIBUTE_NAME")
    +            .displayName("Hash Attribute Name")
    +            .description("The name of the FlowFile Attribute into which the Hash Value should be written. " +
    +                    "If the value already exists, it will be overwritten")
    +            .required(true)
    +            .addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
    +            .defaultValue("fuzzyhash.value")
    +            .build();
    +
    +    public static final PropertyDescriptor HASH_ALGORITHM = new PropertyDescriptor.Builder()
    +            .name("HASH_ALGORITHM")
    +            .displayName("Hashing Algorithm")
    +            .description("The hashing algorithm utilised")
    +            .allowableValues(allowableValueSSDEEP, allowableValueTLSH)
    +            .required(true)
    +            .addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
    +            .build();
    +
    +    public static final Relationship REL_SUCCESS = new Relationship.Builder()
    +            .name("Success")
    +            .description("Any FlowFile that is successfully hashed will be sent to this Relationship.")
    +            .build();
    +
    +    public static final Relationship REL_FAILURE = new Relationship.Builder()
    +            .name("Failure")
    +            .description("Any FlowFile that is successfully hashed will be sent to this Relationship.")
    +            .build();
    +
    +    private List<PropertyDescriptor> descriptors;
    +
    +    private Set<Relationship> relationships;
    +
    +    @Override
    +    protected void init(final ProcessorInitializationContext context) {
    +        final List<PropertyDescriptor> descriptors = new ArrayList<PropertyDescriptor>();
    +        descriptors.add(ATTRIBUTE_NAME);
    +        descriptors.add(HASH_ALGORITHM);
    +        this.descriptors = Collections.unmodifiableList(descriptors);
    +
    +        final Set<Relationship> relationships = new HashSet<Relationship>();
    +        relationships.add(REL_SUCCESS);
    +        relationships.add(REL_FAILURE);
    +        this.relationships = Collections.unmodifiableSet(relationships);
    +    }
    +
    +    @Override
    +    public Set<Relationship> getRelationships() {
    +        return this.relationships;
    +    }
    +
    +    @Override
    +    public final List<PropertyDescriptor> getSupportedPropertyDescriptors() {
    +        return descriptors;
    +    }
    +
    +    @OnScheduled
    +    public void onScheduled(final ProcessContext context) {
    +
    +    }
    +
    +    @Override
    +    public void onTrigger(final ProcessContext context, final ProcessSession session) throws ProcessException {
    +        FlowFile flowFile = session.get();
    +        if (flowFile == null) {
    +            return;
    +        }
    +
    +        final ComponentLog logger = getLogger();
    +        final String algorithm = context.getProperty(HASH_ALGORITHM).getValue();
    +
    +
    +        final AtomicReference<String> hashValueHolder = new AtomicReference<>(null);
    +
    +        try {
    +            session.read(flowFile, new InputStreamCallback() {
    +                @Override
    +                public void process(final InputStream in) throws IOException {
    +                    try (ByteArrayOutputStream holder = new ByteArrayOutputStream()) {
    +                        StreamUtils.copy(in,holder);
    +
    +                        if (algorithm.equals(allowableValueSSDEEP.getValue())) {
    +                            hashValueHolder.set(new SpamSum().HashString(holder.toString()));
    +                        }
    +
    +                        if (algorithm.equals(allowableValueTLSH.getValue())) {
    --- End diff --
    
    Same as above


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastructure@apache.org or file a JIRA ticket
with INFRA.
---

[GitHub] nifi issue #1619: NIFI-2747 - Introduce FuzzyHashContent processor

Posted by apiri <gi...@git.apache.org>.
Github user apiri commented on the issue:

    https://github.com/apache/nifi/pull/1619
  
    @trixpan sorry, I've been traveling and have had a bit of a hectic schedule. Would you please rebase, and then I will look to get this closed out in the next day or so?


---

[GitHub] nifi pull request #1619: NIFI-2747 - Introduce FuzzyHashContent processor

Posted by trixpan <gi...@git.apache.org>.
Github user trixpan commented on a diff in the pull request:

    https://github.com/apache/nifi/pull/1619#discussion_r108917562
  
    --- Diff: nifi-nar-bundles/nifi-cybersecurity-bundle/nifi-cybersecurity-processors/src/main/resources/META-INF/services/org.apache.nifi.processor.Processor ---
    @@ -0,0 +1,15 @@
    +# Licensed to the Apache Software Foundation (ASF) under one or more
    +# contributor license agreements.  See the NOTICE file distributed with
    +# this work for additional information regarding copyright ownership.
    +# The ASF licenses this file to You under the Apache License, Version 2.0
    +# (the "License"); you may not use this file except in compliance with
    +# the License.  You may obtain a copy of the License at
    +#
    +#     http://www.apache.org/licenses/LICENSE-2.0
    +#
    +# Unless required by applicable law or agreed to in writing, software
    +# distributed under the License is distributed on an "AS IS" BASIS,
    +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    +# See the License for the specific language governing permissions and
    +# limitations under the License.
    +org.apache.nifi.processors.cybersecurity.MyProcessor
    --- End diff --
    
    oopsie
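    
    Presumably the stray entry is the leftover archetype placeholder; the corrected services file would be expected to register the new processor class instead, along the lines of:
    
        org.apache.nifi.processors.cybersecurity.FuzzyHashContent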


---

[GitHub] nifi issue #1619: NIFI-2747 - Introduce FuzzyHashContent processor

Posted by trixpan <gi...@git.apache.org>.
Github user trixpan commented on the issue:

    https://github.com/apache/nifi/pull/1619
  
    also, regarding streaming: both underlying implementations use in-memory objects. I found some pseudo-streaming ones, but despite offering a method that allows streaming, the underlying calls still loaded the whole content into memory prior to hashing... so I'd rather keep it honest. :-)


---

[GitHub] nifi pull request #1619: NIFI-2747 - Introduce FuzzyHashContent processor

Posted by trixpan <gi...@git.apache.org>.
Github user trixpan commented on a diff in the pull request:

    https://github.com/apache/nifi/pull/1619#discussion_r108918795
  
    --- Diff: nifi-nar-bundles/nifi-cybersecurity-bundle/nifi-cybersecurity-processors/src/main/java/org/apache/nifi/processors/cybersecurity/FuzzyHashContent.java ---
    @@ -0,0 +1,192 @@
    +/*
    + * Licensed to the Apache Software Foundation (ASF) under one or more
    + * contributor license agreements.  See the NOTICE file distributed with
    + * this work for additional information regarding copyright ownership.
    + * The ASF licenses this file to You under the Apache License, Version 2.0
    + * (the "License"); you may not use this file except in compliance with
    + * the License.  You may obtain a copy of the License at
    + *
    + *     http://www.apache.org/licenses/LICENSE-2.0
    + *
    + * Unless required by applicable law or agreed to in writing, software
    + * distributed under the License is distributed on an "AS IS" BASIS,
    + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    + * See the License for the specific language governing permissions and
    + * limitations under the License.
    + */
    +package org.apache.nifi.processors.cybersecurity;
    +
    +import com.idealista.tlsh.TLSH;
    +import com.idealista.tlsh.exceptions.InsufficientComplexityException;
    +import info.debatty.java.spamsum.SpamSum;
    +
    +import org.apache.nifi.annotation.behavior.EventDriven;
    +import org.apache.nifi.annotation.behavior.InputRequirement;
    +import org.apache.nifi.annotation.behavior.SideEffectFree;
    +import org.apache.nifi.annotation.behavior.SupportsBatching;
    +import org.apache.nifi.annotation.behavior.ReadsAttribute;
    +import org.apache.nifi.annotation.behavior.ReadsAttributes;
    +import org.apache.nifi.annotation.behavior.WritesAttribute;
    +import org.apache.nifi.annotation.behavior.WritesAttributes;
    +import org.apache.nifi.annotation.documentation.CapabilityDescription;
    +import org.apache.nifi.annotation.documentation.SeeAlso;
    +import org.apache.nifi.annotation.documentation.Tags;
    +import org.apache.nifi.annotation.lifecycle.OnScheduled;
    +
    +import org.apache.nifi.components.AllowableValue;
    +import org.apache.nifi.components.PropertyDescriptor;
    +import org.apache.nifi.flowfile.FlowFile;
    +import org.apache.nifi.logging.ComponentLog;
    +import org.apache.nifi.processor.exception.ProcessException;
    +import org.apache.nifi.processor.AbstractProcessor;
    +import org.apache.nifi.processor.ProcessContext;
    +import org.apache.nifi.processor.ProcessSession;
    +import org.apache.nifi.processor.ProcessorInitializationContext;
    +import org.apache.nifi.processor.Relationship;
    +import org.apache.nifi.processor.io.InputStreamCallback;
    +import org.apache.nifi.processor.util.StandardValidators;
    +import org.apache.nifi.processors.standard.HashContent;
    +
    +import org.apache.nifi.stream.io.StreamUtils;
    +
    +import java.io.ByteArrayOutputStream;
    +import java.io.IOException;
    +import java.io.InputStream;
    +import java.util.ArrayList;
    +import java.util.Collections;
    +import java.util.HashSet;
    +import java.util.List;
    +import java.util.Set;
    +import java.util.concurrent.atomic.AtomicReference;
    +
    +
    +@EventDriven
    +@SideEffectFree
    +@SupportsBatching
    +@InputRequirement(InputRequirement.Requirement.INPUT_REQUIRED)
    +@Tags({"hashing", "fuzzy-hashing", "cyber-security"})
    +@CapabilityDescription("Calculates a fuzzy/locality-sensitive hash value for the Content of a FlowFile and puts that " +
    +        "hash value on the FlowFile as an attribute whose name is determined by the <Hash Attribute Name> property." +
    +        "Note: this processor only offers non-criptographic hash algorithms. And it should be not be " +
    --- End diff --
    
    fixed


---

[GitHub] nifi pull request #1619: NIFI-2747 - Introduce FuzzyHashContent processor

Posted by apiri <gi...@git.apache.org>.
Github user apiri commented on a diff in the pull request:

    https://github.com/apache/nifi/pull/1619#discussion_r108528798
  
    --- Diff: nifi-nar-bundles/nifi-cybersecurity-bundle/nifi-cybersecurity-processors/src/main/java/org/apache/nifi/processors/cybersecurity/FuzzyHashContent.java ---
    @@ -0,0 +1,192 @@
    +/*
    + * Licensed to the Apache Software Foundation (ASF) under one or more
    + * contributor license agreements.  See the NOTICE file distributed with
    + * this work for additional information regarding copyright ownership.
    + * The ASF licenses this file to You under the Apache License, Version 2.0
    + * (the "License"); you may not use this file except in compliance with
    + * the License.  You may obtain a copy of the License at
    + *
    + *     http://www.apache.org/licenses/LICENSE-2.0
    + *
    + * Unless required by applicable law or agreed to in writing, software
    + * distributed under the License is distributed on an "AS IS" BASIS,
    + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    + * See the License for the specific language governing permissions and
    + * limitations under the License.
    + */
    +package org.apache.nifi.processors.cybersecurity;
    +
    +import com.idealista.tlsh.TLSH;
    +import com.idealista.tlsh.exceptions.InsufficientComplexityException;
    +import info.debatty.java.spamsum.SpamSum;
    +
    +import org.apache.nifi.annotation.behavior.EventDriven;
    +import org.apache.nifi.annotation.behavior.InputRequirement;
    +import org.apache.nifi.annotation.behavior.SideEffectFree;
    +import org.apache.nifi.annotation.behavior.SupportsBatching;
    +import org.apache.nifi.annotation.behavior.ReadsAttribute;
    +import org.apache.nifi.annotation.behavior.ReadsAttributes;
    +import org.apache.nifi.annotation.behavior.WritesAttribute;
    +import org.apache.nifi.annotation.behavior.WritesAttributes;
    +import org.apache.nifi.annotation.documentation.CapabilityDescription;
    +import org.apache.nifi.annotation.documentation.SeeAlso;
    +import org.apache.nifi.annotation.documentation.Tags;
    +import org.apache.nifi.annotation.lifecycle.OnScheduled;
    +
    +import org.apache.nifi.components.AllowableValue;
    +import org.apache.nifi.components.PropertyDescriptor;
    +import org.apache.nifi.flowfile.FlowFile;
    +import org.apache.nifi.logging.ComponentLog;
    +import org.apache.nifi.processor.exception.ProcessException;
    +import org.apache.nifi.processor.AbstractProcessor;
    +import org.apache.nifi.processor.ProcessContext;
    +import org.apache.nifi.processor.ProcessSession;
    +import org.apache.nifi.processor.ProcessorInitializationContext;
    +import org.apache.nifi.processor.Relationship;
    +import org.apache.nifi.processor.io.InputStreamCallback;
    +import org.apache.nifi.processor.util.StandardValidators;
    +import org.apache.nifi.processors.standard.HashContent;
    +
    +import org.apache.nifi.stream.io.StreamUtils;
    +
    +import java.io.ByteArrayOutputStream;
    +import java.io.IOException;
    +import java.io.InputStream;
    +import java.util.ArrayList;
    +import java.util.Collections;
    +import java.util.HashSet;
    +import java.util.List;
    +import java.util.Set;
    +import java.util.concurrent.atomic.AtomicReference;
    +
    +
    +@EventDriven
    +@SideEffectFree
    +@SupportsBatching
    +@InputRequirement(InputRequirement.Requirement.INPUT_REQUIRED)
    +@Tags({"hashing", "fuzzy-hashing", "cyber-security"})
    +@CapabilityDescription("Calculates a fuzzy/locality-sensitive hash value for the Content of a FlowFile and puts that " +
    +        "hash value on the FlowFile as an attribute whose name is determined by the <Hash Attribute Name> property." +
    +        "Note: this processor only offers non-criptographic hash algorithms. And it should be not be " +
    --- End diff --
    
    typo: criptographic -> cryptographic


---

[GitHub] nifi pull request #1619: NIFI-2747 - Introduce FuzzyHashContent processor

Posted by trixpan <gi...@git.apache.org>.
Github user trixpan commented on a diff in the pull request:

    https://github.com/apache/nifi/pull/1619#discussion_r108919701
  
    --- Diff: nifi-nar-bundles/nifi-cybersecurity-bundle/nifi-cybersecurity-processors/src/main/java/org/apache/nifi/processors/cybersecurity/FuzzyHashContent.java ---
    @@ -0,0 +1,192 @@
    +/*
    + * Licensed to the Apache Software Foundation (ASF) under one or more
    + * contributor license agreements.  See the NOTICE file distributed with
    + * this work for additional information regarding copyright ownership.
    + * The ASF licenses this file to You under the Apache License, Version 2.0
    + * (the "License"); you may not use this file except in compliance with
    + * the License.  You may obtain a copy of the License at
    + *
    + *     http://www.apache.org/licenses/LICENSE-2.0
    + *
    + * Unless required by applicable law or agreed to in writing, software
    + * distributed under the License is distributed on an "AS IS" BASIS,
    + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    + * See the License for the specific language governing permissions and
    + * limitations under the License.
    + */
    +package org.apache.nifi.processors.cybersecurity;
    +
    +import com.idealista.tlsh.TLSH;
    +import com.idealista.tlsh.exceptions.InsufficientComplexityException;
    +import info.debatty.java.spamsum.SpamSum;
    +
    +import org.apache.nifi.annotation.behavior.EventDriven;
    +import org.apache.nifi.annotation.behavior.InputRequirement;
    +import org.apache.nifi.annotation.behavior.SideEffectFree;
    +import org.apache.nifi.annotation.behavior.SupportsBatching;
    +import org.apache.nifi.annotation.behavior.ReadsAttribute;
    +import org.apache.nifi.annotation.behavior.ReadsAttributes;
    +import org.apache.nifi.annotation.behavior.WritesAttribute;
    +import org.apache.nifi.annotation.behavior.WritesAttributes;
    +import org.apache.nifi.annotation.documentation.CapabilityDescription;
    +import org.apache.nifi.annotation.documentation.SeeAlso;
    +import org.apache.nifi.annotation.documentation.Tags;
    +import org.apache.nifi.annotation.lifecycle.OnScheduled;
    +
    +import org.apache.nifi.components.AllowableValue;
    +import org.apache.nifi.components.PropertyDescriptor;
    +import org.apache.nifi.flowfile.FlowFile;
    +import org.apache.nifi.logging.ComponentLog;
    +import org.apache.nifi.processor.exception.ProcessException;
    +import org.apache.nifi.processor.AbstractProcessor;
    +import org.apache.nifi.processor.ProcessContext;
    +import org.apache.nifi.processor.ProcessSession;
    +import org.apache.nifi.processor.ProcessorInitializationContext;
    +import org.apache.nifi.processor.Relationship;
    +import org.apache.nifi.processor.io.InputStreamCallback;
    +import org.apache.nifi.processor.util.StandardValidators;
    +import org.apache.nifi.processors.standard.HashContent;
    +
    +import org.apache.nifi.stream.io.StreamUtils;
    +
    +import java.io.ByteArrayOutputStream;
    +import java.io.IOException;
    +import java.io.InputStream;
    +import java.util.ArrayList;
    +import java.util.Collections;
    +import java.util.HashSet;
    +import java.util.List;
    +import java.util.Set;
    +import java.util.concurrent.atomic.AtomicReference;
    +
    +
    +@EventDriven
    +@SideEffectFree
    +@SupportsBatching
    +@InputRequirement(InputRequirement.Requirement.INPUT_REQUIRED)
    +@Tags({"hashing", "fuzzy-hashing", "cyber-security"})
    +@CapabilityDescription("Calculates a fuzzy/locality-sensitive hash value for the Content of a FlowFile and puts that " +
    +        "hash value on the FlowFile as an attribute whose name is determined by the <Hash Attribute Name> property." +
    +        "Note: this processor only offers non-criptographic hash algorithms. And it should be not be " +
    +        "seen as a replacement to the HashContent processor")
    +
    +@SeeAlso({HashContent.class})
    +@ReadsAttributes({@ReadsAttribute(attribute="", description="")})
    +@WritesAttributes({@WritesAttribute(attribute = "<Hash Attribute Name>", description = "This Processor adds an attribute whose value is the result of Hashing the "
    +        + "existing FlowFile content. The name of this attribute is specified by the <Hash Attribute Name> property")})
    +
    +public class FuzzyHashContent extends AbstractProcessor {
    +
    +    public static final AllowableValue allowableValueSSDEEP = new AllowableValue(
    +            "ssdeep",
    +            "ssdeep",
    +            "Uses ssdeep / SpamSum 'context triggered piecewise hash'.");
    +    public static final AllowableValue allowableValueTLSH = new AllowableValue(
    +            "tlsh",
    +            "tlsh",
    +            "Uses TLSH (Trend 'Locality Sensitive Hash'). Note: FlowFile Content must be at least 512 characters long");
    +
    +    public static final PropertyDescriptor ATTRIBUTE_NAME = new PropertyDescriptor.Builder()
    +            .name("ATTRIBUTE_NAME")
    +            .displayName("Hash Attribute Name")
    +            .description("The name of the FlowFile Attribute into which the Hash Value should be written. " +
    +                    "If the value already exists, it will be overwritten")
    +            .required(true)
    +            .addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
    +            .defaultValue("fuzzyhash.value")
    +            .build();
    +
    +    public static final PropertyDescriptor HASH_ALGORITHM = new PropertyDescriptor.Builder()
    +            .name("HASH_ALGORITHM")
    +            .displayName("Hashing Algorithm")
    +            .description("The hashing algorithm utilised")
    +            .allowableValues(allowableValueSSDEEP, allowableValueTLSH)
    +            .required(true)
    +            .addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
    +            .build();
    +
    +    public static final Relationship REL_SUCCESS = new Relationship.Builder()
    +            .name("Success")
    +            .description("Any FlowFile that is successfully hashed will be sent to this Relationship.")
    +            .build();
    +
    +    public static final Relationship REL_FAILURE = new Relationship.Builder()
    +            .name("Failure")
    +            .description("Any FlowFile that is successfully hashed will be sent to this Relationship.")
    +            .build();
    +
    +    private List<PropertyDescriptor> descriptors;
    +
    +    private Set<Relationship> relationships;
    +
    +    @Override
    +    protected void init(final ProcessorInitializationContext context) {
    +        final List<PropertyDescriptor> descriptors = new ArrayList<PropertyDescriptor>();
    +        descriptors.add(ATTRIBUTE_NAME);
    +        descriptors.add(HASH_ALGORITHM);
    +        this.descriptors = Collections.unmodifiableList(descriptors);
    +
    +        final Set<Relationship> relationships = new HashSet<Relationship>();
    +        relationships.add(REL_SUCCESS);
    +        relationships.add(REL_FAILURE);
    +        this.relationships = Collections.unmodifiableSet(relationships);
    +    }
    +
    +    @Override
    +    public Set<Relationship> getRelationships() {
    +        return this.relationships;
    +    }
    +
    +    @Override
    +    public final List<PropertyDescriptor> getSupportedPropertyDescriptors() {
    +        return descriptors;
    +    }
    +
    +    @OnScheduled
    +    public void onScheduled(final ProcessContext context) {
    +
    +    }
    +
    +    @Override
    +    public void onTrigger(final ProcessContext context, final ProcessSession session) throws ProcessException {
    +        FlowFile flowFile = session.get();
    +        if (flowFile == null) {
    +            return;
    +        }
    +
    +        final ComponentLog logger = getLogger();
    +        final String algorithm = context.getProperty(HASH_ALGORITHM).getValue();
    +
    +
    +        final AtomicReference<String> hashValueHolder = new AtomicReference<>(null);
    +
    +        try {
    +            session.read(flowFile, new InputStreamCallback() {
    +                @Override
    +                public void process(final InputStream in) throws IOException {
    +                    try (ByteArrayOutputStream holder = new ByteArrayOutputStream()) {
    +                        StreamUtils.copy(in,holder);
    +
    +                        if (algorithm.equals(allowableValueSSDEEP.getValue())) {
    +                            hashValueHolder.set(new SpamSum().HashString(holder.toString()));
    +                        }
    +
    +                        if (algorithm.equals(allowableValueTLSH.getValue())) {
    +                            hashValueHolder.set(new TLSH(holder.toString()).hash());
    +                        }
    +
    +                    }
    +                }
    +            });
    +
    +            final String attributeName = context.getProperty(ATTRIBUTE_NAME).getValue();
    +            flowFile = session.putAttribute(flowFile, attributeName, hashValueHolder.get());
    --- End diff --
    
    I would discourage this. Instead I would rather implicitly push the user to name ATTRIBUTE_NAME accordingly (as we do with the HashContent processor)



[GitHub] nifi pull request #1619: NIFI-2747 - Introduce FuzzyHashContent processor

Posted by trixpan <gi...@git.apache.org>.
Github user trixpan commented on a diff in the pull request:

    https://github.com/apache/nifi/pull/1619#discussion_r108919804
  
    --- Diff: nifi-nar-bundles/nifi-cybersecurity-bundle/nifi-cybersecurity-processors/src/main/java/org/apache/nifi/processors/cybersecurity/FuzzyHashContent.java ---
    @@ -0,0 +1,192 @@
    +/*
    + * Licensed to the Apache Software Foundation (ASF) under one or more
    + * contributor license agreements.  See the NOTICE file distributed with
    + * this work for additional information regarding copyright ownership.
    + * The ASF licenses this file to You under the Apache License, Version 2.0
    + * (the "License"); you may not use this file except in compliance with
    + * the License.  You may obtain a copy of the License at
    + *
    + *     http://www.apache.org/licenses/LICENSE-2.0
    + *
    + * Unless required by applicable law or agreed to in writing, software
    + * distributed under the License is distributed on an "AS IS" BASIS,
    + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    + * See the License for the specific language governing permissions and
    + * limitations under the License.
    + */
    +package org.apache.nifi.processors.cybersecurity;
    +
    +import com.idealista.tlsh.TLSH;
    +import com.idealista.tlsh.exceptions.InsufficientComplexityException;
    +import info.debatty.java.spamsum.SpamSum;
    +
    +import org.apache.nifi.annotation.behavior.EventDriven;
    +import org.apache.nifi.annotation.behavior.InputRequirement;
    +import org.apache.nifi.annotation.behavior.SideEffectFree;
    +import org.apache.nifi.annotation.behavior.SupportsBatching;
    +import org.apache.nifi.annotation.behavior.ReadsAttribute;
    +import org.apache.nifi.annotation.behavior.ReadsAttributes;
    +import org.apache.nifi.annotation.behavior.WritesAttribute;
    +import org.apache.nifi.annotation.behavior.WritesAttributes;
    +import org.apache.nifi.annotation.documentation.CapabilityDescription;
    +import org.apache.nifi.annotation.documentation.SeeAlso;
    +import org.apache.nifi.annotation.documentation.Tags;
    +import org.apache.nifi.annotation.lifecycle.OnScheduled;
    +
    +import org.apache.nifi.components.AllowableValue;
    +import org.apache.nifi.components.PropertyDescriptor;
    +import org.apache.nifi.flowfile.FlowFile;
    +import org.apache.nifi.logging.ComponentLog;
    +import org.apache.nifi.processor.exception.ProcessException;
    +import org.apache.nifi.processor.AbstractProcessor;
    +import org.apache.nifi.processor.ProcessContext;
    +import org.apache.nifi.processor.ProcessSession;
    +import org.apache.nifi.processor.ProcessorInitializationContext;
    +import org.apache.nifi.processor.Relationship;
    +import org.apache.nifi.processor.io.InputStreamCallback;
    +import org.apache.nifi.processor.util.StandardValidators;
    +import org.apache.nifi.processors.standard.HashContent;
    +
    +import org.apache.nifi.stream.io.StreamUtils;
    +
    +import java.io.ByteArrayOutputStream;
    +import java.io.IOException;
    +import java.io.InputStream;
    +import java.util.ArrayList;
    +import java.util.Collections;
    +import java.util.HashSet;
    +import java.util.List;
    +import java.util.Set;
    +import java.util.concurrent.atomic.AtomicReference;
    +
    +
    +@EventDriven
    +@SideEffectFree
    +@SupportsBatching
    +@InputRequirement(InputRequirement.Requirement.INPUT_REQUIRED)
    +@Tags({"hashing", "fuzzy-hashing", "cyber-security"})
    +@CapabilityDescription("Calculates a fuzzy/locality-sensitive hash value for the Content of a FlowFile and puts that " +
    +        "hash value on the FlowFile as an attribute whose name is determined by the <Hash Attribute Name> property." +
    +        "Note: this processor only offers non-criptographic hash algorithms. And it should be not be " +
    +        "seen as a replacement to the HashContent processor")
    +
    +@SeeAlso({HashContent.class})
    +@ReadsAttributes({@ReadsAttribute(attribute="", description="")})
    +@WritesAttributes({@WritesAttribute(attribute = "<Hash Attribute Name>", description = "This Processor adds an attribute whose value is the result of Hashing the "
    +        + "existing FlowFile content. The name of this attribute is specified by the <Hash Attribute Name> property")})
    +
    +public class FuzzyHashContent extends AbstractProcessor {
    +
    +    public static final AllowableValue allowableValueSSDEEP = new AllowableValue(
    +            "ssdeep",
    +            "ssdeep",
    +            "Uses ssdeep / SpamSum 'context triggered piecewise hash'.");
    +    public static final AllowableValue allowableValueTLSH = new AllowableValue(
    +            "tlsh",
    +            "tlsh",
    +            "Uses TLSH (Trend 'Locality Sensitive Hash'). Note: FlowFile Content must be at least 512 characters long");
    +
    +    public static final PropertyDescriptor ATTRIBUTE_NAME = new PropertyDescriptor.Builder()
    +            .name("ATTRIBUTE_NAME")
    +            .displayName("Hash Attribute Name")
    +            .description("The name of the FlowFile Attribute into which the Hash Value should be written. " +
    +                    "If the value already exists, it will be overwritten")
    +            .required(true)
    +            .addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
    +            .defaultValue("fuzzyhash.value")
    +            .build();
    +
    +    public static final PropertyDescriptor HASH_ALGORITHM = new PropertyDescriptor.Builder()
    +            .name("HASH_ALGORITHM")
    +            .displayName("Hashing Algorithm")
    +            .description("The hashing algorithm utilised")
    +            .allowableValues(allowableValueSSDEEP, allowableValueTLSH)
    +            .required(true)
    +            .addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
    +            .build();
    +
    +    public static final Relationship REL_SUCCESS = new Relationship.Builder()
    +            .name("Success")
    +            .description("Any FlowFile that is successfully hashed will be sent to this Relationship.")
    +            .build();
    +
    +    public static final Relationship REL_FAILURE = new Relationship.Builder()
    +            .name("Failure")
    +            .description("Any FlowFile that is successfully hashed will be sent to this Relationship.")
    +            .build();
    +
    +    private List<PropertyDescriptor> descriptors;
    +
    +    private Set<Relationship> relationships;
    +
    +    @Override
    +    protected void init(final ProcessorInitializationContext context) {
    +        final List<PropertyDescriptor> descriptors = new ArrayList<PropertyDescriptor>();
    +        descriptors.add(ATTRIBUTE_NAME);
    +        descriptors.add(HASH_ALGORITHM);
    +        this.descriptors = Collections.unmodifiableList(descriptors);
    +
    +        final Set<Relationship> relationships = new HashSet<Relationship>();
    +        relationships.add(REL_SUCCESS);
    +        relationships.add(REL_FAILURE);
    +        this.relationships = Collections.unmodifiableSet(relationships);
    +    }
    +
    +    @Override
    +    public Set<Relationship> getRelationships() {
    +        return this.relationships;
    +    }
    +
    +    @Override
    +    public final List<PropertyDescriptor> getSupportedPropertyDescriptors() {
    +        return descriptors;
    +    }
    +
    +    @OnScheduled
    +    public void onScheduled(final ProcessContext context) {
    +
    +    }
    +
    +    @Override
    +    public void onTrigger(final ProcessContext context, final ProcessSession session) throws ProcessException {
    +        FlowFile flowFile = session.get();
    --- End diff --
    
    Great idea. Addressed.



[GitHub] nifi issue #1619: NIFI-2747 - Introduce FuzzyHashContent processor

Posted by trixpan <gi...@git.apache.org>.
Github user trixpan commented on the issue:

    https://github.com/apache/nifi/pull/1619
  
    Note: This PR also contains a minor typo correction to the "logic independent" part of the ParseCEF processor.



[GitHub] nifi pull request #1619: NIFI-2747 - Introduce FuzzyHashContent processor

Posted by apiri <gi...@git.apache.org>.
Github user apiri commented on a diff in the pull request:

    https://github.com/apache/nifi/pull/1619#discussion_r108529084
  
    --- Diff: nifi-nar-bundles/nifi-cybersecurity-bundle/nifi-cybersecurity-processors/src/main/java/org/apache/nifi/processors/cybersecurity/FuzzyHashContent.java ---
    @@ -0,0 +1,192 @@
    +/*
    + * Licensed to the Apache Software Foundation (ASF) under one or more
    + * contributor license agreements.  See the NOTICE file distributed with
    + * this work for additional information regarding copyright ownership.
    + * The ASF licenses this file to You under the Apache License, Version 2.0
    + * (the "License"); you may not use this file except in compliance with
    + * the License.  You may obtain a copy of the License at
    + *
    + *     http://www.apache.org/licenses/LICENSE-2.0
    + *
    + * Unless required by applicable law or agreed to in writing, software
    + * distributed under the License is distributed on an "AS IS" BASIS,
    + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    + * See the License for the specific language governing permissions and
    + * limitations under the License.
    + */
    +package org.apache.nifi.processors.cybersecurity;
    +
    +import com.idealista.tlsh.TLSH;
    +import com.idealista.tlsh.exceptions.InsufficientComplexityException;
    +import info.debatty.java.spamsum.SpamSum;
    +
    +import org.apache.nifi.annotation.behavior.EventDriven;
    +import org.apache.nifi.annotation.behavior.InputRequirement;
    +import org.apache.nifi.annotation.behavior.SideEffectFree;
    +import org.apache.nifi.annotation.behavior.SupportsBatching;
    +import org.apache.nifi.annotation.behavior.ReadsAttribute;
    +import org.apache.nifi.annotation.behavior.ReadsAttributes;
    +import org.apache.nifi.annotation.behavior.WritesAttribute;
    +import org.apache.nifi.annotation.behavior.WritesAttributes;
    +import org.apache.nifi.annotation.documentation.CapabilityDescription;
    +import org.apache.nifi.annotation.documentation.SeeAlso;
    +import org.apache.nifi.annotation.documentation.Tags;
    +import org.apache.nifi.annotation.lifecycle.OnScheduled;
    +
    +import org.apache.nifi.components.AllowableValue;
    +import org.apache.nifi.components.PropertyDescriptor;
    +import org.apache.nifi.flowfile.FlowFile;
    +import org.apache.nifi.logging.ComponentLog;
    +import org.apache.nifi.processor.exception.ProcessException;
    +import org.apache.nifi.processor.AbstractProcessor;
    +import org.apache.nifi.processor.ProcessContext;
    +import org.apache.nifi.processor.ProcessSession;
    +import org.apache.nifi.processor.ProcessorInitializationContext;
    +import org.apache.nifi.processor.Relationship;
    +import org.apache.nifi.processor.io.InputStreamCallback;
    +import org.apache.nifi.processor.util.StandardValidators;
    +import org.apache.nifi.processors.standard.HashContent;
    +
    +import org.apache.nifi.stream.io.StreamUtils;
    +
    +import java.io.ByteArrayOutputStream;
    +import java.io.IOException;
    +import java.io.InputStream;
    +import java.util.ArrayList;
    +import java.util.Collections;
    +import java.util.HashSet;
    +import java.util.List;
    +import java.util.Set;
    +import java.util.concurrent.atomic.AtomicReference;
    +
    +
    +@EventDriven
    +@SideEffectFree
    +@SupportsBatching
    +@InputRequirement(InputRequirement.Requirement.INPUT_REQUIRED)
    +@Tags({"hashing", "fuzzy-hashing", "cyber-security"})
    +@CapabilityDescription("Calculates a fuzzy/locality-sensitive hash value for the Content of a FlowFile and puts that " +
    +        "hash value on the FlowFile as an attribute whose name is determined by the <Hash Attribute Name> property." +
    +        "Note: this processor only offers non-criptographic hash algorithms. And it should be not be " +
    +        "seen as a replacement to the HashContent processor")
    +
    +@SeeAlso({HashContent.class})
    +@ReadsAttributes({@ReadsAttribute(attribute="", description="")})
    +@WritesAttributes({@WritesAttribute(attribute = "<Hash Attribute Name>", description = "This Processor adds an attribute whose value is the result of Hashing the "
    +        + "existing FlowFile content. The name of this attribute is specified by the <Hash Attribute Name> property")})
    +
    +public class FuzzyHashContent extends AbstractProcessor {
    +
    +    public static final AllowableValue allowableValueSSDEEP = new AllowableValue(
    +            "ssdeep",
    +            "ssdeep",
    +            "Uses ssdeep / SpamSum 'context triggered piecewise hash'.");
    +    public static final AllowableValue allowableValueTLSH = new AllowableValue(
    +            "tlsh",
    +            "tlsh",
    +            "Uses TLSH (Trend 'Locality Sensitive Hash'). Note: FlowFile Content must be at least 512 characters long");
    +
    +    public static final PropertyDescriptor ATTRIBUTE_NAME = new PropertyDescriptor.Builder()
    +            .name("ATTRIBUTE_NAME")
    +            .displayName("Hash Attribute Name")
    +            .description("The name of the FlowFile Attribute into which the Hash Value should be written. " +
    +                    "If the value already exists, it will be overwritten")
    +            .required(true)
    +            .addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
    +            .defaultValue("fuzzyhash.value")
    +            .build();
    +
    +    public static final PropertyDescriptor HASH_ALGORITHM = new PropertyDescriptor.Builder()
    +            .name("HASH_ALGORITHM")
    +            .displayName("Hashing Algorithm")
    +            .description("The hashing algorithm utilised")
    +            .allowableValues(allowableValueSSDEEP, allowableValueTLSH)
    +            .required(true)
    +            .addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
    +            .build();
    +
    +    public static final Relationship REL_SUCCESS = new Relationship.Builder()
    +            .name("Success")
    +            .description("Any FlowFile that is successfully hashed will be sent to this Relationship.")
    +            .build();
    +
    +    public static final Relationship REL_FAILURE = new Relationship.Builder()
    +            .name("Failure")
    +            .description("Any FlowFile that is successfully hashed will be sent to this Relationship.")
    +            .build();
    +
    +    private List<PropertyDescriptor> descriptors;
    +
    +    private Set<Relationship> relationships;
    +
    +    @Override
    +    protected void init(final ProcessorInitializationContext context) {
    +        final List<PropertyDescriptor> descriptors = new ArrayList<PropertyDescriptor>();
    +        descriptors.add(ATTRIBUTE_NAME);
    +        descriptors.add(HASH_ALGORITHM);
    +        this.descriptors = Collections.unmodifiableList(descriptors);
    +
    +        final Set<Relationship> relationships = new HashSet<Relationship>();
    +        relationships.add(REL_SUCCESS);
    +        relationships.add(REL_FAILURE);
    +        this.relationships = Collections.unmodifiableSet(relationships);
    +    }
    +
    +    @Override
    +    public Set<Relationship> getRelationships() {
    +        return this.relationships;
    +    }
    +
    +    @Override
    +    public final List<PropertyDescriptor> getSupportedPropertyDescriptors() {
    +        return descriptors;
    +    }
    +
    +    @OnScheduled
    +    public void onScheduled(final ProcessContext context) {
    +
    +    }
    +
    +    @Override
    +    public void onTrigger(final ProcessContext context, final ProcessSession session) throws ProcessException {
    +        FlowFile flowFile = session.get();
    +        if (flowFile == null) {
    +            return;
    +        }
    +
    +        final ComponentLog logger = getLogger();
    +        final String algorithm = context.getProperty(HASH_ALGORITHM).getValue();
    +
    +
    +        final AtomicReference<String> hashValueHolder = new AtomicReference<>(null);
    +
    +        try {
    +            session.read(flowFile, new InputStreamCallback() {
    +                @Override
    +                public void process(final InputStream in) throws IOException {
    +                    try (ByteArrayOutputStream holder = new ByteArrayOutputStream()) {
    +                        StreamUtils.copy(in,holder);
    +
    +                        if (algorithm.equals(allowableValueSSDEEP.getValue())) {
    --- End diff --
    
    With usage of the property descriptor instead of a string, we can just do algorithm.equals(allowableValueSSDEEP)
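
    A minimal sketch of that suggestion (illustrative only, not the code as
    merged), reusing the constants and imports from the diff above; here the
    constants are kept on the left-hand side of the comparison, and content is a
    hypothetical String variable holding the FlowFile content already copied out
    of the input stream:

        final String algorithm = context.getProperty(HASH_ALGORITHM).getValue();

        final String hash;
        if (allowableValueSSDEEP.getValue().equals(algorithm)) {
            // ssdeep / SpamSum context triggered piecewise hash
            hash = new SpamSum().HashString(content);
        } else if (allowableValueTLSH.getValue().equals(algorithm)) {
            // TLSH locality sensitive hash
            hash = new TLSH(content).hash();
        } else {
            // defensive branch; allowableValues() already restricts the property
            throw new ProcessException("Unknown hashing algorithm: " + algorithm);
        }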



[GitHub] nifi pull request #1619: NIFI-2747 - Introduce FuzzyHashContent processor

Posted by asfgit <gi...@git.apache.org>.
Github user asfgit closed the pull request at:

    https://github.com/apache/nifi/pull/1619



[GitHub] nifi pull request #1619: NIFI-2747 - Introduce FuzzyHashContent processor

Posted by apiri <gi...@git.apache.org>.
Github user apiri commented on a diff in the pull request:

    https://github.com/apache/nifi/pull/1619#discussion_r108525905
  
    --- Diff: nifi-nar-bundles/nifi-cybersecurity-bundle/nifi-cybersecurity-processors/src/main/resources/META-INF/services/org.apache.nifi.processor.Processor ---
    @@ -0,0 +1,15 @@
    +# Licensed to the Apache Software Foundation (ASF) under one or more
    +# contributor license agreements.  See the NOTICE file distributed with
    +# this work for additional information regarding copyright ownership.
    +# The ASF licenses this file to You under the Apache License, Version 2.0
    +# (the "License"); you may not use this file except in compliance with
    +# the License.  You may obtain a copy of the License at
    +#
    +#     http://www.apache.org/licenses/LICENSE-2.0
    +#
    +# Unless required by applicable law or agreed to in writing, software
    +# distributed under the License is distributed on an "AS IS" BASIS,
    +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    +# See the License for the specific language governing permissions and
    +# limitations under the License.
    +org.apache.nifi.processors.cybersecurity.MyProcessor
    --- End diff --
    
    This needs to be updated to reflect your processor.
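
    For reference, the corrected services file would contain a single entry naming
    the processor class introduced by this PR:

        org.apache.nifi.processors.cybersecurity.FuzzyHashContent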



[GitHub] nifi pull request #1619: NIFI-2747 - Introduce FuzzyHashContent processor

Posted by apiri <gi...@git.apache.org>.
Github user apiri commented on a diff in the pull request:

    https://github.com/apache/nifi/pull/1619#discussion_r108527276
  
    --- Diff: nifi-nar-bundles/nifi-cybersecurity-bundle/nifi-cybersecurity-processors/src/main/java/org/apache/nifi/processors/cybersecurity/FuzzyHashContent.java ---
    @@ -0,0 +1,192 @@
    +/*
    + * Licensed to the Apache Software Foundation (ASF) under one or more
    + * contributor license agreements.  See the NOTICE file distributed with
    + * this work for additional information regarding copyright ownership.
    + * The ASF licenses this file to You under the Apache License, Version 2.0
    + * (the "License"); you may not use this file except in compliance with
    + * the License.  You may obtain a copy of the License at
    + *
    + *     http://www.apache.org/licenses/LICENSE-2.0
    + *
    + * Unless required by applicable law or agreed to in writing, software
    + * distributed under the License is distributed on an "AS IS" BASIS,
    + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    + * See the License for the specific language governing permissions and
    + * limitations under the License.
    + */
    +package org.apache.nifi.processors.cybersecurity;
    +
    +import com.idealista.tlsh.TLSH;
    +import com.idealista.tlsh.exceptions.InsufficientComplexityException;
    +import info.debatty.java.spamsum.SpamSum;
    +
    +import org.apache.nifi.annotation.behavior.EventDriven;
    +import org.apache.nifi.annotation.behavior.InputRequirement;
    +import org.apache.nifi.annotation.behavior.SideEffectFree;
    +import org.apache.nifi.annotation.behavior.SupportsBatching;
    +import org.apache.nifi.annotation.behavior.ReadsAttribute;
    +import org.apache.nifi.annotation.behavior.ReadsAttributes;
    +import org.apache.nifi.annotation.behavior.WritesAttribute;
    +import org.apache.nifi.annotation.behavior.WritesAttributes;
    +import org.apache.nifi.annotation.documentation.CapabilityDescription;
    +import org.apache.nifi.annotation.documentation.SeeAlso;
    +import org.apache.nifi.annotation.documentation.Tags;
    +import org.apache.nifi.annotation.lifecycle.OnScheduled;
    +
    +import org.apache.nifi.components.AllowableValue;
    +import org.apache.nifi.components.PropertyDescriptor;
    +import org.apache.nifi.flowfile.FlowFile;
    +import org.apache.nifi.logging.ComponentLog;
    +import org.apache.nifi.processor.exception.ProcessException;
    +import org.apache.nifi.processor.AbstractProcessor;
    +import org.apache.nifi.processor.ProcessContext;
    +import org.apache.nifi.processor.ProcessSession;
    +import org.apache.nifi.processor.ProcessorInitializationContext;
    +import org.apache.nifi.processor.Relationship;
    +import org.apache.nifi.processor.io.InputStreamCallback;
    +import org.apache.nifi.processor.util.StandardValidators;
    +import org.apache.nifi.processors.standard.HashContent;
    +
    +import org.apache.nifi.stream.io.StreamUtils;
    +
    +import java.io.ByteArrayOutputStream;
    +import java.io.IOException;
    +import java.io.InputStream;
    +import java.util.ArrayList;
    +import java.util.Collections;
    +import java.util.HashSet;
    +import java.util.List;
    +import java.util.Set;
    +import java.util.concurrent.atomic.AtomicReference;
    +
    +
    +@EventDriven
    +@SideEffectFree
    +@SupportsBatching
    +@InputRequirement(InputRequirement.Requirement.INPUT_REQUIRED)
    +@Tags({"hashing", "fuzzy-hashing", "cyber-security"})
    +@CapabilityDescription("Calculates a fuzzy/locality-sensitive hash value for the Content of a FlowFile and puts that " +
    +        "hash value on the FlowFile as an attribute whose name is determined by the <Hash Attribute Name> property." +
    +        "Note: this processor only offers non-criptographic hash algorithms. And it should be not be " +
    +        "seen as a replacement to the HashContent processor")
    +
    +@SeeAlso({HashContent.class})
    +@ReadsAttributes({@ReadsAttribute(attribute="", description="")})
    +@WritesAttributes({@WritesAttribute(attribute = "<Hash Attribute Name>", description = "This Processor adds an attribute whose value is the result of Hashing the "
    +        + "existing FlowFile content. The name of this attribute is specified by the <Hash Attribute Name> property")})
    +
    +public class FuzzyHashContent extends AbstractProcessor {
    +
    +    public static final AllowableValue allowableValueSSDEEP = new AllowableValue(
    +            "ssdeep",
    +            "ssdeep",
    +            "Uses ssdeep / SpamSum 'context triggered piecewise hash'.");
    +    public static final AllowableValue allowableValueTLSH = new AllowableValue(
    +            "tlsh",
    +            "tlsh",
    +            "Uses TLSH (Trend 'Locality Sensitive Hash'). Note: FlowFile Content must be at least 512 characters long");
    +
    +    public static final PropertyDescriptor ATTRIBUTE_NAME = new PropertyDescriptor.Builder()
    +            .name("ATTRIBUTE_NAME")
    +            .displayName("Hash Attribute Name")
    +            .description("The name of the FlowFile Attribute into which the Hash Value should be written. " +
    +                    "If the value already exists, it will be overwritten")
    +            .required(true)
    +            .addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
    +            .defaultValue("fuzzyhash.value")
    +            .build();
    +
    +    public static final PropertyDescriptor HASH_ALGORITHM = new PropertyDescriptor.Builder()
    +            .name("HASH_ALGORITHM")
    +            .displayName("Hashing Algorithm")
    +            .description("The hashing algorithm utilised")
    +            .allowableValues(allowableValueSSDEEP, allowableValueTLSH)
    +            .required(true)
    +            .addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
    +            .build();
    +
    +    public static final Relationship REL_SUCCESS = new Relationship.Builder()
    +            .name("Success")
    +            .description("Any FlowFile that is successfully hashed will be sent to this Relationship.")
    +            .build();
    +
    +    public static final Relationship REL_FAILURE = new Relationship.Builder()
    +            .name("Failure")
    +            .description("Any FlowFile that is successfully hashed will be sent to this Relationship.")
    +            .build();
    +
    +    private List<PropertyDescriptor> descriptors;
    +
    +    private Set<Relationship> relationships;
    +
    +    @Override
    +    protected void init(final ProcessorInitializationContext context) {
    +        final List<PropertyDescriptor> descriptors = new ArrayList<PropertyDescriptor>();
    +        descriptors.add(ATTRIBUTE_NAME);
    +        descriptors.add(HASH_ALGORITHM);
    +        this.descriptors = Collections.unmodifiableList(descriptors);
    +
    +        final Set<Relationship> relationships = new HashSet<Relationship>();
    +        relationships.add(REL_SUCCESS);
    +        relationships.add(REL_FAILURE);
    +        this.relationships = Collections.unmodifiableSet(relationships);
    +    }
    +
    +    @Override
    +    public Set<Relationship> getRelationships() {
    +        return this.relationships;
    +    }
    +
    +    @Override
    +    public final List<PropertyDescriptor> getSupportedPropertyDescriptors() {
    +        return descriptors;
    +    }
    +
    +    @OnScheduled
    +    public void onScheduled(final ProcessContext context) {
    +
    +    }
    +
    +    @Override
    +    public void onTrigger(final ProcessContext context, final ProcessSession session) throws ProcessException {
    +        FlowFile flowFile = session.get();
    +        if (flowFile == null) {
    +            return;
    +        }
    +
    +        final ComponentLog logger = getLogger();
    +        final String algorithm = context.getProperty(HASH_ALGORITHM).getValue();
    --- End diff --
    
    Also, maybe just get the PropertyDescriptor and use that for the comparison in your #read instead of comparing strings? It doesn't seem like this is being used anywhere.


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastructure@apache.org or file a JIRA ticket
with INFRA.
---

[GitHub] nifi pull request #1619: NIFI-2747 - Introduce FuzzyHashContent processor

Posted by trixpan <gi...@git.apache.org>.
Github user trixpan commented on a diff in the pull request:

    https://github.com/apache/nifi/pull/1619#discussion_r108918786
  
    --- Diff: nifi-nar-bundles/nifi-cybersecurity-bundle/nifi-cybersecurity-processors/src/main/resources/META-INF/services/org.apache.nifi.processor.Processor ---
    @@ -0,0 +1,15 @@
    +# Licensed to the Apache Software Foundation (ASF) under one or more
    +# contributor license agreements.  See the NOTICE file distributed with
    +# this work for additional information regarding copyright ownership.
    +# The ASF licenses this file to You under the Apache License, Version 2.0
    +# (the "License"); you may not use this file except in compliance with
    +# the License.  You may obtain a copy of the License at
    +#
    +#     http://www.apache.org/licenses/LICENSE-2.0
    +#
    +# Unless required by applicable law or agreed to in writing, software
    +# distributed under the License is distributed on an "AS IS" BASIS,
    +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    +# See the License for the specific language governing permissions and
    +# limitations under the License.
    +org.apache.nifi.processors.cybersecurity.MyProcessor
    --- End diff --
    
    fixed



[GitHub] nifi issue #1619: NIFI-2747 - Introduce FuzzyHashContent processor

Posted by apiri <gi...@git.apache.org>.
Github user apiri commented on the issue:

    https://github.com/apache/nifi/pull/1619
  
    reviewing changes



[GitHub] nifi issue #1619: NIFI-2747 - Introduce FuzzyHashContent processor

Posted by trixpan <gi...@git.apache.org>.
Github user trixpan commented on the issue:

    https://github.com/apache/nifi/pull/1619
  
    @apiri? 



[GitHub] nifi issue #1619: NIFI-2747 - Introduce FuzzyHashContent processor

Posted by trixpan <gi...@git.apache.org>.
Github user trixpan commented on the issue:

    https://github.com/apache/nifi/pull/1619
  
    @apiri hopefully feedback has been addressed



[GitHub] nifi pull request #1619: NIFI-2747 - Introduce FuzzyHashContent processor

Posted by apiri <gi...@git.apache.org>.
Github user apiri commented on a diff in the pull request:

    https://github.com/apache/nifi/pull/1619#discussion_r108526752
  
    --- Diff: nifi-nar-bundles/nifi-cybersecurity-bundle/nifi-cybersecurity-processors/src/main/java/org/apache/nifi/processors/cybersecurity/FuzzyHashContent.java ---
    @@ -0,0 +1,192 @@
    +/*
    + * Licensed to the Apache Software Foundation (ASF) under one or more
    + * contributor license agreements.  See the NOTICE file distributed with
    + * this work for additional information regarding copyright ownership.
    + * The ASF licenses this file to You under the Apache License, Version 2.0
    + * (the "License"); you may not use this file except in compliance with
    + * the License.  You may obtain a copy of the License at
    + *
    + *     http://www.apache.org/licenses/LICENSE-2.0
    + *
    + * Unless required by applicable law or agreed to in writing, software
    + * distributed under the License is distributed on an "AS IS" BASIS,
    + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    + * See the License for the specific language governing permissions and
    + * limitations under the License.
    + */
    +package org.apache.nifi.processors.cybersecurity;
    +
    +import com.idealista.tlsh.TLSH;
    +import com.idealista.tlsh.exceptions.InsufficientComplexityException;
    +import info.debatty.java.spamsum.SpamSum;
    +
    +import org.apache.nifi.annotation.behavior.EventDriven;
    +import org.apache.nifi.annotation.behavior.InputRequirement;
    +import org.apache.nifi.annotation.behavior.SideEffectFree;
    +import org.apache.nifi.annotation.behavior.SupportsBatching;
    +import org.apache.nifi.annotation.behavior.ReadsAttribute;
    +import org.apache.nifi.annotation.behavior.ReadsAttributes;
    +import org.apache.nifi.annotation.behavior.WritesAttribute;
    +import org.apache.nifi.annotation.behavior.WritesAttributes;
    +import org.apache.nifi.annotation.documentation.CapabilityDescription;
    +import org.apache.nifi.annotation.documentation.SeeAlso;
    +import org.apache.nifi.annotation.documentation.Tags;
    +import org.apache.nifi.annotation.lifecycle.OnScheduled;
    +
    +import org.apache.nifi.components.AllowableValue;
    +import org.apache.nifi.components.PropertyDescriptor;
    +import org.apache.nifi.flowfile.FlowFile;
    +import org.apache.nifi.logging.ComponentLog;
    +import org.apache.nifi.processor.exception.ProcessException;
    +import org.apache.nifi.processor.AbstractProcessor;
    +import org.apache.nifi.processor.ProcessContext;
    +import org.apache.nifi.processor.ProcessSession;
    +import org.apache.nifi.processor.ProcessorInitializationContext;
    +import org.apache.nifi.processor.Relationship;
    +import org.apache.nifi.processor.io.InputStreamCallback;
    +import org.apache.nifi.processor.util.StandardValidators;
    +import org.apache.nifi.processors.standard.HashContent;
    +
    +import org.apache.nifi.stream.io.StreamUtils;
    +
    +import java.io.ByteArrayOutputStream;
    +import java.io.IOException;
    +import java.io.InputStream;
    +import java.util.ArrayList;
    +import java.util.Collections;
    +import java.util.HashSet;
    +import java.util.List;
    +import java.util.Set;
    +import java.util.concurrent.atomic.AtomicReference;
    +
    +
    +@EventDriven
    +@SideEffectFree
    +@SupportsBatching
    +@InputRequirement(InputRequirement.Requirement.INPUT_REQUIRED)
    +@Tags({"hashing", "fuzzy-hashing", "cyber-security"})
    +@CapabilityDescription("Calculates a fuzzy/locality-sensitive hash value for the Content of a FlowFile and puts that " +
    +        "hash value on the FlowFile as an attribute whose name is determined by the <Hash Attribute Name> property." +
    +        "Note: this processor only offers non-criptographic hash algorithms. And it should be not be " +
    +        "seen as a replacement to the HashContent processor")
    +
    +@SeeAlso({HashContent.class})
    +@ReadsAttributes({@ReadsAttribute(attribute="", description="")})
    +@WritesAttributes({@WritesAttribute(attribute = "<Hash Attribute Name>", description = "This Processor adds an attribute whose value is the result of Hashing the "
    +        + "existing FlowFile content. The name of this attribute is specified by the <Hash Attribute Name> property")})
    +
    +public class FuzzyHashContent extends AbstractProcessor {
    +
    +    public static final AllowableValue allowableValueSSDEEP = new AllowableValue(
    +            "ssdeep",
    +            "ssdeep",
    +            "Uses ssdeep / SpamSum 'context triggered piecewise hash'.");
    +    public static final AllowableValue allowableValueTLSH = new AllowableValue(
    +            "tlsh",
    +            "tlsh",
    +            "Uses TLSH (Trend 'Locality Sensitive Hash'). Note: FlowFile Content must be at least 512 characters long");
    +
    +    public static final PropertyDescriptor ATTRIBUTE_NAME = new PropertyDescriptor.Builder()
    +            .name("ATTRIBUTE_NAME")
    +            .displayName("Hash Attribute Name")
    +            .description("The name of the FlowFile Attribute into which the Hash Value should be written. " +
    +                    "If the value already exists, it will be overwritten")
    +            .required(true)
    +            .addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
    +            .defaultValue("fuzzyhash.value")
    +            .build();
    +
    +    public static final PropertyDescriptor HASH_ALGORITHM = new PropertyDescriptor.Builder()
    +            .name("HASH_ALGORITHM")
    +            .displayName("Hashing Algorithm")
    +            .description("The hashing algorithm utilised")
    +            .allowableValues(allowableValueSSDEEP, allowableValueTLSH)
    +            .required(true)
    +            .addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
    +            .build();
    +
    +    public static final Relationship REL_SUCCESS = new Relationship.Builder()
    +            .name("Success")
    +            .description("Any FlowFile that is successfully hashed will be sent to this Relationship.")
    +            .build();
    +
    +    public static final Relationship REL_FAILURE = new Relationship.Builder()
    +            .name("Failure")
    +            .description("Any FlowFile that is successfully hashed will be sent to this Relationship.")
    +            .build();
    +
    +    private List<PropertyDescriptor> descriptors;
    +
    +    private Set<Relationship> relationships;
    +
    +    @Override
    +    protected void init(final ProcessorInitializationContext context) {
    +        final List<PropertyDescriptor> descriptors = new ArrayList<PropertyDescriptor>();
    +        descriptors.add(ATTRIBUTE_NAME);
    +        descriptors.add(HASH_ALGORITHM);
    +        this.descriptors = Collections.unmodifiableList(descriptors);
    +
    +        final Set<Relationship> relationships = new HashSet<Relationship>();
    +        relationships.add(REL_SUCCESS);
    +        relationships.add(REL_FAILURE);
    +        this.relationships = Collections.unmodifiableSet(relationships);
    +    }
    +
    +    @Override
    +    public Set<Relationship> getRelationships() {
    +        return this.relationships;
    +    }
    +
    +    @Override
    +    public final List<PropertyDescriptor> getSupportedPropertyDescriptors() {
    +        return descriptors;
    +    }
    +
    +    @OnScheduled
    +    public void onScheduled(final ProcessContext context) {
    +
    +    }
    +
    +    @Override
    +    public void onTrigger(final ProcessContext context, final ProcessSession session) throws ProcessException {
    +        FlowFile flowFile = session.get();
    --- End diff --
    
    Based on the algorithm used, perhaps we should evaluate the content length to avoid the less-than-512-bytes issue for TLSH and route such files to a suitable relationship.
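
    A minimal sketch of such a guard (illustrative only, not the merged code),
    placed near the top of onTrigger and using the FlowFile size in bytes as a
    proxy for the 512-character minimum quoted in the TLSH description:

        if (allowableValueTLSH.getValue().equals(context.getProperty(HASH_ALGORITHM).getValue())
                && flowFile.getSize() < 512) {
            // TLSH cannot produce a hash for such small content; route it out early
            logger.error("FlowFile {} is below the 512 byte minimum required by TLSH; routing to failure",
                    new Object[]{flowFile});
            session.transfer(flowFile, REL_FAILURE);
            return;
        }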



[GitHub] nifi issue #1619: NIFI-2747 - Introduce FuzzyHashContent processor

Posted by trixpan <gi...@git.apache.org>.
Github user trixpan commented on the issue:

    https://github.com/apache/nifi/pull/1619
  
    @apiri - Added the note. Please certainly link to PerformanceConsider. As you may recall, earlier this week while coding this processor I posted a message around this very same issue: it would be handier to have a consistent way of flagging those memory-intensive processors.



[GitHub] nifi pull request #1619: NIFI-2747 - Introduce FuzzyHashContent processor

Posted by apiri <gi...@git.apache.org>.
Github user apiri commented on a diff in the pull request:

    https://github.com/apache/nifi/pull/1619#discussion_r108527388
  
    --- Diff: nifi-nar-bundles/nifi-cybersecurity-bundle/nifi-cybersecurity-processors/src/main/java/org/apache/nifi/processors/cybersecurity/FuzzyHashContent.java ---
    @@ -0,0 +1,192 @@
    +/*
    + * Licensed to the Apache Software Foundation (ASF) under one or more
    + * contributor license agreements.  See the NOTICE file distributed with
    + * this work for additional information regarding copyright ownership.
    + * The ASF licenses this file to You under the Apache License, Version 2.0
    + * (the "License"); you may not use this file except in compliance with
    + * the License.  You may obtain a copy of the License at
    + *
    + *     http://www.apache.org/licenses/LICENSE-2.0
    + *
    + * Unless required by applicable law or agreed to in writing, software
    + * distributed under the License is distributed on an "AS IS" BASIS,
    + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    + * See the License for the specific language governing permissions and
    + * limitations under the License.
    + */
    +package org.apache.nifi.processors.cybersecurity;
    +
    +import com.idealista.tlsh.TLSH;
    +import com.idealista.tlsh.exceptions.InsufficientComplexityException;
    +import info.debatty.java.spamsum.SpamSum;
    +
    +import org.apache.nifi.annotation.behavior.EventDriven;
    +import org.apache.nifi.annotation.behavior.InputRequirement;
    +import org.apache.nifi.annotation.behavior.SideEffectFree;
    +import org.apache.nifi.annotation.behavior.SupportsBatching;
    +import org.apache.nifi.annotation.behavior.ReadsAttribute;
    +import org.apache.nifi.annotation.behavior.ReadsAttributes;
    +import org.apache.nifi.annotation.behavior.WritesAttribute;
    +import org.apache.nifi.annotation.behavior.WritesAttributes;
    +import org.apache.nifi.annotation.documentation.CapabilityDescription;
    +import org.apache.nifi.annotation.documentation.SeeAlso;
    +import org.apache.nifi.annotation.documentation.Tags;
    +import org.apache.nifi.annotation.lifecycle.OnScheduled;
    +
    +import org.apache.nifi.components.AllowableValue;
    +import org.apache.nifi.components.PropertyDescriptor;
    +import org.apache.nifi.flowfile.FlowFile;
    +import org.apache.nifi.logging.ComponentLog;
    +import org.apache.nifi.processor.exception.ProcessException;
    +import org.apache.nifi.processor.AbstractProcessor;
    +import org.apache.nifi.processor.ProcessContext;
    +import org.apache.nifi.processor.ProcessSession;
    +import org.apache.nifi.processor.ProcessorInitializationContext;
    +import org.apache.nifi.processor.Relationship;
    +import org.apache.nifi.processor.io.InputStreamCallback;
    +import org.apache.nifi.processor.util.StandardValidators;
    +import org.apache.nifi.processors.standard.HashContent;
    +
    +import org.apache.nifi.stream.io.StreamUtils;
    +
    +import java.io.ByteArrayOutputStream;
    +import java.io.IOException;
    +import java.io.InputStream;
    +import java.util.ArrayList;
    +import java.util.Collections;
    +import java.util.HashSet;
    +import java.util.List;
    +import java.util.Set;
    +import java.util.concurrent.atomic.AtomicReference;
    +
    +
    +@EventDriven
    +@SideEffectFree
    +@SupportsBatching
    +@InputRequirement(InputRequirement.Requirement.INPUT_REQUIRED)
    +@Tags({"hashing", "fuzzy-hashing", "cyber-security"})
    +@CapabilityDescription("Calculates a fuzzy/locality-sensitive hash value for the Content of a FlowFile and puts that " +
    +        "hash value on the FlowFile as an attribute whose name is determined by the <Hash Attribute Name> property." +
    +        "Note: this processor only offers non-criptographic hash algorithms. And it should be not be " +
    +        "seen as a replacement to the HashContent processor")
    +
    +@SeeAlso({HashContent.class})
    +@ReadsAttributes({@ReadsAttribute(attribute="", description="")})
    +@WritesAttributes({@WritesAttribute(attribute = "<Hash Attribute Name>", description = "This Processor adds an attribute whose value is the result of Hashing the "
    +        + "existing FlowFile content. The name of this attribute is specified by the <Hash Attribute Name> property")})
    +
    +public class FuzzyHashContent extends AbstractProcessor {
    +
    +    public static final AllowableValue allowableValueSSDEEP = new AllowableValue(
    +            "ssdeep",
    +            "ssdeep",
    +            "Uses ssdeep / SpamSum 'context triggered piecewise hash'.");
    +    public static final AllowableValue allowableValueTLSH = new AllowableValue(
    +            "tlsh",
    +            "tlsh",
    +            "Uses TLSH (Trend 'Locality Sensitive Hash'). Note: FlowFile Content must be at least 512 characters long");
    +
    +    public static final PropertyDescriptor ATTRIBUTE_NAME = new PropertyDescriptor.Builder()
    +            .name("ATTRIBUTE_NAME")
    +            .displayName("Hash Attribute Name")
    +            .description("The name of the FlowFile Attribute into which the Hash Value should be written. " +
    +                    "If the value already exists, it will be overwritten")
    +            .required(true)
    +            .addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
    +            .defaultValue("fuzzyhash.value")
    +            .build();
    +
    +    public static final PropertyDescriptor HASH_ALGORITHM = new PropertyDescriptor.Builder()
    +            .name("HASH_ALGORITHM")
    +            .displayName("Hashing Algorithm")
    +            .description("The hashing algorithm utilised")
    +            .allowableValues(allowableValueSSDEEP, allowableValueTLSH)
    +            .required(true)
    +            .addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
    +            .build();
    +
    +    public static final Relationship REL_SUCCESS = new Relationship.Builder()
    +            .name("Success")
    +            .description("Any FlowFile that is successfully hashed will be sent to this Relationship.")
    +            .build();
    +
    +    public static final Relationship REL_FAILURE = new Relationship.Builder()
    +            .name("Failure")
    +            .description("Any FlowFile that cannot be hashed will be sent to this Relationship.")
    +            .build();
    +
    +    private List<PropertyDescriptor> descriptors;
    +
    +    private Set<Relationship> relationships;
    +
    +    @Override
    +    protected void init(final ProcessorInitializationContext context) {
    +        final List<PropertyDescriptor> descriptors = new ArrayList<PropertyDescriptor>();
    +        descriptors.add(ATTRIBUTE_NAME);
    +        descriptors.add(HASH_ALGORITHM);
    +        this.descriptors = Collections.unmodifiableList(descriptors);
    +
    +        final Set<Relationship> relationships = new HashSet<Relationship>();
    +        relationships.add(REL_SUCCESS);
    +        relationships.add(REL_FAILURE);
    +        this.relationships = Collections.unmodifiableSet(relationships);
    +    }
    +
    +    @Override
    +    public Set<Relationship> getRelationships() {
    +        return this.relationships;
    +    }
    +
    +    @Override
    +    public final List<PropertyDescriptor> getSupportedPropertyDescriptors() {
    +        return descriptors;
    +    }
    +
    +    @OnScheduled
    +    public void onScheduled(final ProcessContext context) {
    +
    +    }
    +
    +    @Override
    +    public void onTrigger(final ProcessContext context, final ProcessSession session) throws ProcessException {
    +        FlowFile flowFile = session.get();
    +        if (flowFile == null) {
    +            return;
    +        }
    +
    +        final ComponentLog logger = getLogger();
    +        final String algorithm = context.getProperty(HASH_ALGORITHM).getValue();
    +
    +
    +        final AtomicReference<String> hashValueHolder = new AtomicReference<>(null);
    +
    +        try {
    +            session.read(flowFile, new InputStreamCallback() {
    +                @Override
    +                public void process(final InputStream in) throws IOException {
    +                    try (ByteArrayOutputStream holder = new ByteArrayOutputStream()) {
    +                        StreamUtils.copy(in,holder);
    +
    +                        if (algorithm.equals(allowableValueSSDEEP.getValue())) {
    +                            hashValueHolder.set(new SpamSum().HashString(holder.toString()));
    +                        }
    +
    +                        if (algorithm.equals(allowableValueTLSH.getValue())) {
    +                            hashValueHolder.set(new TLSH(holder.toString()).hash());
    +                        }
    +
    +                    }
    +                }
    +            });
    +
    +            final String attributeName = context.getProperty(ATTRIBUTE_NAME).getValue();
    +            flowFile = session.putAttribute(flowFile, attributeName, hashValueHolder.get());
    --- End diff --
    
    Does it make sense to also include an attribute that specifies the algorithm applied?
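
    For illustration, a minimal sketch of that idea (the ".algorithm" suffix
    is an assumption for illustration only, not something the PR defines)
    could be slotted in right where the hash attribute is written:

        // All of these variables already exist in onTrigger; only the last
        // putAttribute call is new. It records which algorithm produced the
        // hash so downstream flows can tell an ssdeep digest from a tlsh one.
        final String attributeName = context.getProperty(ATTRIBUTE_NAME).getValue();
        flowFile = session.putAttribute(flowFile, attributeName, hashValueHolder.get());
        flowFile = session.putAttribute(flowFile, attributeName + ".algorithm", algorithm);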



[GitHub] nifi pull request #1619: NIFI-2747 - Introduce FuzzyHashContent processor

Posted by trixpan <gi...@git.apache.org>.
Github user trixpan commented on a diff in the pull request:

    https://github.com/apache/nifi/pull/1619#discussion_r108919134
  
    --- Diff: nifi-nar-bundles/nifi-cybersecurity-bundle/nifi-cybersecurity-processors/src/main/java/org/apache/nifi/processors/cybersecurity/FuzzyHashContent.java ---
    @@ -0,0 +1,192 @@
    +/*
    + * Licensed to the Apache Software Foundation (ASF) under one or more
    + * contributor license agreements.  See the NOTICE file distributed with
    + * this work for additional information regarding copyright ownership.
    + * The ASF licenses this file to You under the Apache License, Version 2.0
    + * (the "License"); you may not use this file except in compliance with
    + * the License.  You may obtain a copy of the License at
    + *
    + *     http://www.apache.org/licenses/LICENSE-2.0
    + *
    + * Unless required by applicable law or agreed to in writing, software
    + * distributed under the License is distributed on an "AS IS" BASIS,
    + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    + * See the License for the specific language governing permissions and
    + * limitations under the License.
    + */
    +package org.apache.nifi.processors.cybersecurity;
    +
    +import com.idealista.tlsh.TLSH;
    +import com.idealista.tlsh.exceptions.InsufficientComplexityException;
    +import info.debatty.java.spamsum.SpamSum;
    +
    +import org.apache.nifi.annotation.behavior.EventDriven;
    +import org.apache.nifi.annotation.behavior.InputRequirement;
    +import org.apache.nifi.annotation.behavior.SideEffectFree;
    +import org.apache.nifi.annotation.behavior.SupportsBatching;
    +import org.apache.nifi.annotation.behavior.ReadsAttribute;
    +import org.apache.nifi.annotation.behavior.ReadsAttributes;
    +import org.apache.nifi.annotation.behavior.WritesAttribute;
    +import org.apache.nifi.annotation.behavior.WritesAttributes;
    +import org.apache.nifi.annotation.documentation.CapabilityDescription;
    +import org.apache.nifi.annotation.documentation.SeeAlso;
    +import org.apache.nifi.annotation.documentation.Tags;
    +import org.apache.nifi.annotation.lifecycle.OnScheduled;
    +
    +import org.apache.nifi.components.AllowableValue;
    +import org.apache.nifi.components.PropertyDescriptor;
    +import org.apache.nifi.flowfile.FlowFile;
    +import org.apache.nifi.logging.ComponentLog;
    +import org.apache.nifi.processor.exception.ProcessException;
    +import org.apache.nifi.processor.AbstractProcessor;
    +import org.apache.nifi.processor.ProcessContext;
    +import org.apache.nifi.processor.ProcessSession;
    +import org.apache.nifi.processor.ProcessorInitializationContext;
    +import org.apache.nifi.processor.Relationship;
    +import org.apache.nifi.processor.io.InputStreamCallback;
    +import org.apache.nifi.processor.util.StandardValidators;
    +import org.apache.nifi.processors.standard.HashContent;
    +
    +import org.apache.nifi.stream.io.StreamUtils;
    +
    +import java.io.ByteArrayOutputStream;
    +import java.io.IOException;
    +import java.io.InputStream;
    +import java.util.ArrayList;
    +import java.util.Collections;
    +import java.util.HashSet;
    +import java.util.List;
    +import java.util.Set;
    +import java.util.concurrent.atomic.AtomicReference;
    +
    +
    +@EventDriven
    +@SideEffectFree
    +@SupportsBatching
    +@InputRequirement(InputRequirement.Requirement.INPUT_REQUIRED)
    +@Tags({"hashing", "fuzzy-hashing", "cyber-security"})
    +@CapabilityDescription("Calculates a fuzzy/locality-sensitive hash value for the Content of a FlowFile and puts that " +
    +        "hash value on the FlowFile as an attribute whose name is determined by the <Hash Attribute Name> property." +
    +        "Note: this processor only offers non-criptographic hash algorithms. And it should be not be " +
    +        "seen as a replacement to the HashContent processor")
    +
    +@SeeAlso({HashContent.class})
    +@ReadsAttributes({@ReadsAttribute(attribute="", description="")})
    +@WritesAttributes({@WritesAttribute(attribute = "<Hash Attribute Name>", description = "This Processor adds an attribute whose value is the result of Hashing the "
    +        + "existing FlowFile content. The name of this attribute is specified by the <Hash Attribute Name> property")})
    +
    +public class FuzzyHashContent extends AbstractProcessor {
    +
    +    public static final AllowableValue allowableValueSSDEEP = new AllowableValue(
    +            "ssdeep",
    +            "ssdeep",
    +            "Uses ssdeep / SpamSum 'context triggered piecewise hash'.");
    +    public static final AllowableValue allowableValueTLSH = new AllowableValue(
    +            "tlsh",
    +            "tlsh",
    +            "Uses TLSH (Trend 'Locality Sensitive Hash'). Note: FlowFile Content must be at least 512 characters long");
    +
    +    public static final PropertyDescriptor ATTRIBUTE_NAME = new PropertyDescriptor.Builder()
    +            .name("ATTRIBUTE_NAME")
    +            .displayName("Hash Attribute Name")
    +            .description("The name of the FlowFile Attribute into which the Hash Value should be written. " +
    +                    "If the value already exists, it will be overwritten")
    +            .required(true)
    +            .addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
    +            .defaultValue("fuzzyhash.value")
    +            .build();
    +
    +    public static final PropertyDescriptor HASH_ALGORITHM = new PropertyDescriptor.Builder()
    +            .name("HASH_ALGORITHM")
    +            .displayName("Hashing Algorithm")
    +            .description("The hashing algorithm utilised")
    +            .allowableValues(allowableValueSSDEEP, allowableValueTLSH)
    +            .required(true)
    +            .addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
    +            .build();
    +
    +    public static final Relationship REL_SUCCESS = new Relationship.Builder()
    +            .name("Success")
    +            .description("Any FlowFile that is successfully hashed will be sent to this Relationship.")
    +            .build();
    +
    +    public static final Relationship REL_FAILURE = new Relationship.Builder()
    +            .name("Failure")
    +            .description("Any FlowFile that cannot be hashed will be sent to this Relationship.")
    +            .build();
    +
    +    private List<PropertyDescriptor> descriptors;
    +
    +    private Set<Relationship> relationships;
    +
    +    @Override
    +    protected void init(final ProcessorInitializationContext context) {
    +        final List<PropertyDescriptor> descriptors = new ArrayList<PropertyDescriptor>();
    +        descriptors.add(ATTRIBUTE_NAME);
    +        descriptors.add(HASH_ALGORITHM);
    +        this.descriptors = Collections.unmodifiableList(descriptors);
    +
    +        final Set<Relationship> relationships = new HashSet<Relationship>();
    +        relationships.add(REL_SUCCESS);
    +        relationships.add(REL_FAILURE);
    +        this.relationships = Collections.unmodifiableSet(relationships);
    +    }
    +
    +    @Override
    +    public Set<Relationship> getRelationships() {
    +        return this.relationships;
    +    }
    +
    +    @Override
    +    public final List<PropertyDescriptor> getSupportedPropertyDescriptors() {
    +        return descriptors;
    +    }
    +
    +    @OnScheduled
    +    public void onScheduled(final ProcessContext context) {
    +
    +    }
    +
    +    @Override
    +    public void onTrigger(final ProcessContext context, final ProcessSession session) throws ProcessException {
    +        FlowFile flowFile = session.get();
    +        if (flowFile == null) {
    +            return;
    +        }
    +
    +        final ComponentLog logger = getLogger();
    +        final String algorithm = context.getProperty(HASH_ALGORITHM).getValue();
    +
    +
    +        final AtomicReference<String> hashValueHolder = new AtomicReference<>(null);
    +
    +        try {
    +            session.read(flowFile, new InputStreamCallback() {
    +                @Override
    +                public void process(final InputStream in) throws IOException {
    +                    try (ByteArrayOutputStream holder = new ByteArrayOutputStream()) {
    +                        StreamUtils.copy(in,holder);
    +
    +                        if (algorithm.equals(allowableValueSSDEEP.getValue())) {
    +                            hashValueHolder.set(new SpamSum().HashString(holder.toString()));
    +                        }
    +
    +                        if (algorithm.equals(allowableValueTLSH.getValue())) {
    --- End diff --
    
    fixed
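
    For context, the TLSH implementation imported here throws
    InsufficientComplexityException when the content does not reach the
    512-character minimum mentioned in the allowable value description. A
    guarded version of that branch could look roughly like the sketch below
    (a sketch of the shape only; the actual change pushed to the PR may
    differ):

        if (algorithm.equals(allowableValueTLSH.getValue())) {
            try {
                hashValueHolder.set(new TLSH(holder.toString()).hash());
            } catch (final InsufficientComplexityException ice) {
                // Leave the holder unset so onTrigger can log the problem and
                // route the FlowFile to REL_FAILURE instead of failing the session.
                logger.error("Content not complex enough to compute a TLSH hash: {}",
                        new Object[]{ice.getMessage()});
            }
        }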



[GitHub] nifi pull request #1619: NIFI-2747 - Introduce FuzzyHashContent processor

Posted by trixpan <gi...@git.apache.org>.
Github user trixpan commented on a diff in the pull request:

    https://github.com/apache/nifi/pull/1619#discussion_r108918981
  
    --- Diff: nifi-nar-bundles/nifi-cybersecurity-bundle/nifi-cybersecurity-processors/src/main/java/org/apache/nifi/processors/cybersecurity/FuzzyHashContent.java ---
    @@ -0,0 +1,192 @@
    +/*
    + * Licensed to the Apache Software Foundation (ASF) under one or more
    + * contributor license agreements.  See the NOTICE file distributed with
    + * this work for additional information regarding copyright ownership.
    + * The ASF licenses this file to You under the Apache License, Version 2.0
    + * (the "License"); you may not use this file except in compliance with
    + * the License.  You may obtain a copy of the License at
    + *
    + *     http://www.apache.org/licenses/LICENSE-2.0
    + *
    + * Unless required by applicable law or agreed to in writing, software
    + * distributed under the License is distributed on an "AS IS" BASIS,
    + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    + * See the License for the specific language governing permissions and
    + * limitations under the License.
    + */
    +package org.apache.nifi.processors.cybersecurity;
    +
    +import com.idealista.tlsh.TLSH;
    +import com.idealista.tlsh.exceptions.InsufficientComplexityException;
    +import info.debatty.java.spamsum.SpamSum;
    +
    +import org.apache.nifi.annotation.behavior.EventDriven;
    +import org.apache.nifi.annotation.behavior.InputRequirement;
    +import org.apache.nifi.annotation.behavior.SideEffectFree;
    +import org.apache.nifi.annotation.behavior.SupportsBatching;
    +import org.apache.nifi.annotation.behavior.ReadsAttribute;
    +import org.apache.nifi.annotation.behavior.ReadsAttributes;
    +import org.apache.nifi.annotation.behavior.WritesAttribute;
    +import org.apache.nifi.annotation.behavior.WritesAttributes;
    +import org.apache.nifi.annotation.documentation.CapabilityDescription;
    +import org.apache.nifi.annotation.documentation.SeeAlso;
    +import org.apache.nifi.annotation.documentation.Tags;
    +import org.apache.nifi.annotation.lifecycle.OnScheduled;
    +
    +import org.apache.nifi.components.AllowableValue;
    +import org.apache.nifi.components.PropertyDescriptor;
    +import org.apache.nifi.flowfile.FlowFile;
    +import org.apache.nifi.logging.ComponentLog;
    +import org.apache.nifi.processor.exception.ProcessException;
    +import org.apache.nifi.processor.AbstractProcessor;
    +import org.apache.nifi.processor.ProcessContext;
    +import org.apache.nifi.processor.ProcessSession;
    +import org.apache.nifi.processor.ProcessorInitializationContext;
    +import org.apache.nifi.processor.Relationship;
    +import org.apache.nifi.processor.io.InputStreamCallback;
    +import org.apache.nifi.processor.util.StandardValidators;
    +import org.apache.nifi.processors.standard.HashContent;
    +
    +import org.apache.nifi.stream.io.StreamUtils;
    +
    +import java.io.ByteArrayOutputStream;
    +import java.io.IOException;
    +import java.io.InputStream;
    +import java.util.ArrayList;
    +import java.util.Collections;
    +import java.util.HashSet;
    +import java.util.List;
    +import java.util.Set;
    +import java.util.concurrent.atomic.AtomicReference;
    +
    +
    +@EventDriven
    +@SideEffectFree
    +@SupportsBatching
    +@InputRequirement(InputRequirement.Requirement.INPUT_REQUIRED)
    +@Tags({"hashing", "fuzzy-hashing", "cyber-security"})
    +@CapabilityDescription("Calculates a fuzzy/locality-sensitive hash value for the Content of a FlowFile and puts that " +
    +        "hash value on the FlowFile as an attribute whose name is determined by the <Hash Attribute Name> property." +
    +        "Note: this processor only offers non-criptographic hash algorithms. And it should be not be " +
    +        "seen as a replacement to the HashContent processor")
    +
    +@SeeAlso({HashContent.class})
    +@ReadsAttributes({@ReadsAttribute(attribute="", description="")})
    +@WritesAttributes({@WritesAttribute(attribute = "<Hash Attribute Name>", description = "This Processor adds an attribute whose value is the result of Hashing the "
    +        + "existing FlowFile content. The name of this attribute is specified by the <Hash Attribute Name> property")})
    +
    +public class FuzzyHashContent extends AbstractProcessor {
    +
    +    public static final AllowableValue allowableValueSSDEEP = new AllowableValue(
    +            "ssdeep",
    +            "ssdeep",
    +            "Uses ssdeep / SpamSum 'context triggered piecewise hash'.");
    +    public static final AllowableValue allowableValueTLSH = new AllowableValue(
    +            "tlsh",
    +            "tlsh",
    +            "Uses TLSH (Trend 'Locality Sensitive Hash'). Note: FlowFile Content must be at least 512 characters long");
    +
    +    public static final PropertyDescriptor ATTRIBUTE_NAME = new PropertyDescriptor.Builder()
    +            .name("ATTRIBUTE_NAME")
    +            .displayName("Hash Attribute Name")
    +            .description("The name of the FlowFile Attribute into which the Hash Value should be written. " +
    +                    "If the value already exists, it will be overwritten")
    +            .required(true)
    +            .addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
    +            .defaultValue("fuzzyhash.value")
    +            .build();
    +
    +    public static final PropertyDescriptor HASH_ALGORITHM = new PropertyDescriptor.Builder()
    --- End diff --
    
    I am mimicking the HashContent processor and IIRC we don't allow that over there.



[GitHub] nifi issue #1619: NIFI-2747 - Introduce FuzzyHashContent processor

Posted by trixpan <gi...@git.apache.org>.
Github user trixpan commented on the issue:

    https://github.com/apache/nifi/pull/1619
  
    no worries. rebased



[GitHub] nifi pull request #1619: NIFI-2747 - Introduce FuzzyHashContent processor

Posted by trixpan <gi...@git.apache.org>.
Github user trixpan commented on a diff in the pull request:

    https://github.com/apache/nifi/pull/1619#discussion_r108931643
  
    --- Diff: nifi-nar-bundles/nifi-cybersecurity-bundle/nifi-cybersecurity-processors/src/main/java/org/apache/nifi/processors/cybersecurity/FuzzyHashContent.java ---
    @@ -0,0 +1,192 @@
    +/*
    + * Licensed to the Apache Software Foundation (ASF) under one or more
    + * contributor license agreements.  See the NOTICE file distributed with
    + * this work for additional information regarding copyright ownership.
    + * The ASF licenses this file to You under the Apache License, Version 2.0
    + * (the "License"); you may not use this file except in compliance with
    + * the License.  You may obtain a copy of the License at
    + *
    + *     http://www.apache.org/licenses/LICENSE-2.0
    + *
    + * Unless required by applicable law or agreed to in writing, software
    + * distributed under the License is distributed on an "AS IS" BASIS,
    + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    + * See the License for the specific language governing permissions and
    + * limitations under the License.
    + */
    +package org.apache.nifi.processors.cybersecurity;
    +
    +import com.idealista.tlsh.TLSH;
    +import com.idealista.tlsh.exceptions.InsufficientComplexityException;
    +import info.debatty.java.spamsum.SpamSum;
    +
    +import org.apache.nifi.annotation.behavior.EventDriven;
    +import org.apache.nifi.annotation.behavior.InputRequirement;
    +import org.apache.nifi.annotation.behavior.SideEffectFree;
    +import org.apache.nifi.annotation.behavior.SupportsBatching;
    +import org.apache.nifi.annotation.behavior.ReadsAttribute;
    +import org.apache.nifi.annotation.behavior.ReadsAttributes;
    +import org.apache.nifi.annotation.behavior.WritesAttribute;
    +import org.apache.nifi.annotation.behavior.WritesAttributes;
    +import org.apache.nifi.annotation.documentation.CapabilityDescription;
    +import org.apache.nifi.annotation.documentation.SeeAlso;
    +import org.apache.nifi.annotation.documentation.Tags;
    +import org.apache.nifi.annotation.lifecycle.OnScheduled;
    +
    +import org.apache.nifi.components.AllowableValue;
    +import org.apache.nifi.components.PropertyDescriptor;
    +import org.apache.nifi.flowfile.FlowFile;
    +import org.apache.nifi.logging.ComponentLog;
    +import org.apache.nifi.processor.exception.ProcessException;
    +import org.apache.nifi.processor.AbstractProcessor;
    +import org.apache.nifi.processor.ProcessContext;
    +import org.apache.nifi.processor.ProcessSession;
    +import org.apache.nifi.processor.ProcessorInitializationContext;
    +import org.apache.nifi.processor.Relationship;
    +import org.apache.nifi.processor.io.InputStreamCallback;
    +import org.apache.nifi.processor.util.StandardValidators;
    +import org.apache.nifi.processors.standard.HashContent;
    +
    +import org.apache.nifi.stream.io.StreamUtils;
    +
    +import java.io.ByteArrayOutputStream;
    +import java.io.IOException;
    +import java.io.InputStream;
    +import java.util.ArrayList;
    +import java.util.Collections;
    +import java.util.HashSet;
    +import java.util.List;
    +import java.util.Set;
    +import java.util.concurrent.atomic.AtomicReference;
    +
    +
    +@EventDriven
    +@SideEffectFree
    +@SupportsBatching
    +@InputRequirement(InputRequirement.Requirement.INPUT_REQUIRED)
    +@Tags({"hashing", "fuzzy-hashing", "cyber-security"})
    +@CapabilityDescription("Calculates a fuzzy/locality-sensitive hash value for the Content of a FlowFile and puts that " +
    +        "hash value on the FlowFile as an attribute whose name is determined by the <Hash Attribute Name> property." +
    +        "Note: this processor only offers non-criptographic hash algorithms. And it should be not be " +
    +        "seen as a replacement to the HashContent processor")
    +
    +@SeeAlso({HashContent.class})
    +@ReadsAttributes({@ReadsAttribute(attribute="", description="")})
    +@WritesAttributes({@WritesAttribute(attribute = "<Hash Attribute Name>", description = "This Processor adds an attribute whose value is the result of Hashing the "
    +        + "existing FlowFile content. The name of this attribute is specified by the <Hash Attribute Name> property")})
    +
    +public class FuzzyHashContent extends AbstractProcessor {
    +
    +    public static final AllowableValue allowableValueSSDEEP = new AllowableValue(
    +            "ssdeep",
    +            "ssdeep",
    +            "Uses ssdeep / SpamSum 'context triggered piecewise hash'.");
    +    public static final AllowableValue allowableValueTLSH = new AllowableValue(
    +            "tlsh",
    +            "tlsh",
    +            "Uses TLSH (Trend 'Locality Sensitive Hash'). Note: FlowFile Content must be at least 512 characters long");
    +
    +    public static final PropertyDescriptor ATTRIBUTE_NAME = new PropertyDescriptor.Builder()
    +            .name("ATTRIBUTE_NAME")
    +            .displayName("Hash Attribute Name")
    +            .description("The name of the FlowFile Attribute into which the Hash Value should be written. " +
    +                    "If the value already exists, it will be overwritten")
    +            .required(true)
    +            .addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
    +            .defaultValue("fuzzyhash.value")
    +            .build();
    +
    +    public static final PropertyDescriptor HASH_ALGORITHM = new PropertyDescriptor.Builder()
    +            .name("HASH_ALGORITHM")
    +            .displayName("Hashing Algorithm")
    +            .description("The hashing algorithm utilised")
    +            .allowableValues(allowableValueSSDEEP, allowableValueTLSH)
    +            .required(true)
    +            .addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
    +            .build();
    +
    +    public static final Relationship REL_SUCCESS = new Relationship.Builder()
    +            .name("Success")
    +            .description("Any FlowFile that is successfully hashed will be sent to this Relationship.")
    +            .build();
    +
    +    public static final Relationship REL_FAILURE = new Relationship.Builder()
    +            .name("Failure")
    +            .description("Any FlowFile that cannot be hashed will be sent to this Relationship.")
    +            .build();
    +
    +    private List<PropertyDescriptor> descriptors;
    +
    +    private Set<Relationship> relationships;
    +
    +    @Override
    +    protected void init(final ProcessorInitializationContext context) {
    +        final List<PropertyDescriptor> descriptors = new ArrayList<PropertyDescriptor>();
    +        descriptors.add(ATTRIBUTE_NAME);
    +        descriptors.add(HASH_ALGORITHM);
    +        this.descriptors = Collections.unmodifiableList(descriptors);
    +
    +        final Set<Relationship> relationships = new HashSet<Relationship>();
    +        relationships.add(REL_SUCCESS);
    +        relationships.add(REL_FAILURE);
    +        this.relationships = Collections.unmodifiableSet(relationships);
    +    }
    +
    +    @Override
    +    public Set<Relationship> getRelationships() {
    +        return this.relationships;
    +    }
    +
    +    @Override
    +    public final List<PropertyDescriptor> getSupportedPropertyDescriptors() {
    +        return descriptors;
    +    }
    +
    +    @OnScheduled
    +    public void onScheduled(final ProcessContext context) {
    +
    +    }
    +
    +    @Override
    +    public void onTrigger(final ProcessContext context, final ProcessSession session) throws ProcessException {
    +        FlowFile flowFile = session.get();
    +        if (flowFile == null) {
    +            return;
    +        }
    +
    +        final ComponentLog logger = getLogger();
    +        final String algorithm = context.getProperty(HASH_ALGORITHM).getValue();
    +
    +
    +        final AtomicReference<String> hashValueHolder = new AtomicReference<>(null);
    +
    +        try {
    +            session.read(flowFile, new InputStreamCallback() {
    +                @Override
    +                public void process(final InputStream in) throws IOException {
    +                    try (ByteArrayOutputStream holder = new ByteArrayOutputStream()) {
    +                        StreamUtils.copy(in,holder);
    +
    +                        if (algorithm.equals(allowableValueSSDEEP.getValue())) {
    --- End diff --
    
    Just as a note: I did change the code to equals(allowableValueSSDEEP), but it failed to behave as expected, so I rolled back to a similar approach, only without the separate declaration of the algorithm string.
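
    In other words, the direction of the comparison is what matters here:
    String.equals() only ever returns true for another String, so calling
    equals(allowableValueSSDEEP) on the property value can never match, while
    comparing against allowableValueSSDEEP.getValue() does. A small
    illustration, reusing the values from the descriptors above:

        // algorithm is what context.getProperty(HASH_ALGORITHM).getValue() returns
        String algorithm = "ssdeep";
        AllowableValue ssdeep = new AllowableValue("ssdeep", "ssdeep",
                "Uses ssdeep / SpamSum 'context triggered piecewise hash'.");

        boolean never = algorithm.equals(ssdeep);            // false: a String never equals an AllowableValue
        boolean works = algorithm.equals(ssdeep.getValue()); // true: compares the two "ssdeep" Strings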

