Posted to users@nifi.apache.org by Madhukar Thota <ma...@gmail.com> on 2016/05/17 15:11:38 UTC

FileSize

Friends,

Is it possible to set a file size, like 500 MB or 1 GB, before writing to HDFS?
I want to write large files instead of lots of smaller files to HDFS.


If possible, what processor do I need to use to achieve that size?

-Madhu

Re: FileSize

Posted by Madhukar Thota <ma...@gmail.com>.
Thanks Joe for quick response. I will follow your suggestion.

On Tue, May 17, 2016 at 11:20 AM, Joe Witt <jo...@gmail.com> wrote:

> Madhu,
>
> Absolutely.  You can use MergeContent, for example, to pack together a
> bunch of smaller files to create a larger bundle.  If you will bundle
> tens of thousands of things or more, I'd recommend using two MergeContent
> processors in series, where the first merges at most 10,000 items and the
> second then merges those bundles.  Hope that helps
>
> Joe
>
> On Tue, May 17, 2016 at 11:11 AM, Madhukar Thota
> <ma...@gmail.com> wrote:
> > Friends,
> >
> > Is it possible to set a file size, like 500 MB or 1 GB, before writing to
> > HDFS?
> > I want to write large files instead of lots of smaller files to HDFS.
> >
> >
> > If possible, what processor do I need to use to achieve that size?
> >
> > -Madhu
>
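For reference, a two-stage MergeContent setup along the lines Joe describes might be configured like this. The property names come from the standard MergeContent processor; the specific values (entry cap, size threshold, bin ages) are illustrative, not prescriptive:

```
First MergeContent (cap the entry count):
  Merge Strategy            = Bin-Packing Algorithm
  Merge Format              = Binary Concatenation
  Maximum Number of Entries = 10000
  Max Bin Age               = 5 min

Second MergeContent (reach the target size):
  Merge Strategy            = Bin-Packing Algorithm
  Merge Format              = Binary Concatenation
  Minimum Group Size        = 500 MB
  Max Bin Age               = 15 min
```

A Max Bin Age on each stage keeps a partially filled bin from waiting forever when the incoming flow slows down; the output of the second stage then feeds PutHDFS.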

Re: FileSize

Posted by Thad Guidry <th...@gmail.com>.
On Tue, May 17, 2016 at 10:20 AM, Joe Witt <jo...@gmail.com> wrote:

> Madhu,
>
> Absolutely.  You can use MergeContent, for example, to pack together a
> bunch of smaller files to create a larger bundle.  If you will bundle
> tens of thousands of things or more, I'd recommend using two MergeContent
> processors in series, where the first merges at most 10,000 items and the
> second then merges those bundles.  Hope that helps
>
> Joe
>
>
Hi Joe,

I think this is the kind of information that should be captured in a "Best
Practices with NiFi" section on the wiki.  How about I help maintain a
section like that?  (It would hold the nuances that are too long to explain
in a help tip within the tool, or that the official documentation glosses
over.)

I signed up on your wiki, but it looks like I need access to edit and create
pages?  (BTW, I also maintain the OpenRefine wiki.)

Thad
+ThadGuidry <https://www.google.com/+ThadGuidry>

Re: FileSize

Posted by Joe Witt <jo...@gmail.com>.
Madhu,

Absolutely.  You can use MergeContent, for example, to pack together a
bunch of smaller files to create a larger bundle.  If you will bundle
tens of thousands of things or more, I'd recommend using two MergeContent
processors in series, where the first merges at most 10,000 items and the
second then merges those bundles.  Hope that helps

Joe

On Tue, May 17, 2016 at 11:11 AM, Madhukar Thota
<ma...@gmail.com> wrote:
> Friends,
>
> Is it possible to set a file size, like 500 MB or 1 GB, before writing to HDFS?
> I want to write large files instead of lots of smaller files to HDFS.
>
>
> If possible, what processor do I need to use to achieve that size?
>
> -Madhu
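The two-stage idea in Joe's reply can be sketched in plain Python: the first stage caps the number of entries per bundle, and the second stage packs those bundles until a minimum size is reached. This is only an illustration of the arithmetic, not NiFi code; the function names and thresholds are made up for the example.

```python
def merge_by_count(sizes, max_entries):
    """Stage 1: group incoming flowfile sizes into bundles of at most
    max_entries items; each bundle's size is the sum of its members."""
    bundles = []
    for i in range(0, len(sizes), max_entries):
        bundles.append(sum(sizes[i:i + max_entries]))
    return bundles

def merge_by_size(bundle_sizes, min_group_size):
    """Stage 2: pack bundles together until each merged file reaches
    at least min_group_size bytes (the last group may fall short)."""
    merged, current = [], 0
    for size in bundle_sizes:
        current += size
        if current >= min_group_size:
            merged.append(current)
            current = 0
    if current:
        merged.append(current)
    return merged

# 50,000 flowfiles of 64 KB each -> five 10,000-entry bundles of 640 MB;
# each bundle is already past a 500 MB threshold, so five HDFS files result.
sizes = [64 * 1024] * 50_000
stage1 = merge_by_count(sizes, 10_000)
stage2 = merge_by_size(stage1, 500 * 1024 * 1024)
print(len(stage1), len(stage2))  # -> 5 5
```

The point of splitting the work is that the first stage never has to track more than 10,000 items per bin, which keeps MergeContent's bookkeeping bounded even when hundreds of thousands of small flowfiles arrive.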