Posted to common-user@hadoop.apache.org by unmesha sreeveni <un...@gmail.com> on 2014/12/18 05:12:22 UTC

Run a C++ program using OpenCV libraries in Hadoop

Hi

How can I run C++ programs that use OpenCV libraries in Hadoop?

So far I have written MapReduce jobs only in Java, where external jars
can be supplied on the command line. I have also tried Python, running
the scripts through the Hadoop Streaming API. But I am not sure how to
run C++ programs that use OpenCV libraries.

Thanks in Advance.

-- 
*Thanks & Regards *


*Unmesha Sreeveni U.B*
*Hadoop, Bigdata Developer*
*Centre for Cyber Security | Amrita Vishwa Vidyapeetham*
http://www.unmeshasreeveni.blogspot.in/

Re: Run a C++ program using OpenCV libraries in Hadoop

Posted by Kevin <ke...@gmail.com>.
You could run it as a shell action using Oozie. Write a shell script that
runs your application, and put all of the application's dependencies
(e.g., *.so files) into a lib directory. Place the shell script in the
parent directory of that lib directory. Create a simple Oozie workflow
(workflow.xml) that runs the shell script, and save it alongside the
script. Finally, create a directory on HDFS and upload all of it there.
Oozie will run the shell script as a single-map, no-reducer job.
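As a sketch of that layout (all names here are illustrative, not
prescribed by Oozie), the HDFS application directory might look like:

app/
  script.sh
  workflow.xml
  lib/
    libopencv_core.so
    ...

and a minimal shell-action workflow.xml along these lines:

```xml
<!-- Minimal Oozie shell-action workflow; names are illustrative. -->
<workflow-app xmlns="uri:oozie:workflow:0.4" name="opencv-shell-wf">
    <start to="run-shell"/>
    <action name="run-shell">
        <shell xmlns="uri:oozie:shell-action:0.2">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <exec>script.sh</exec>
            <!-- ship the script to the task's working directory -->
            <file>script.sh</file>
        </shell>
        <ok to="end"/>
        <error to="fail"/>
    </action>
    <kill name="fail">
        <message>Shell action failed</message>
    </kill>
    <end name="end"/>
</workflow-app>
```

Files under the workflow's lib/ directory are distributed to the task
alongside the script, which is how the *.so dependencies reach the node
that runs it.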

