Posted to user@syncope.apache.org by Edwin van der Elst <ed...@finalist.nl> on 2013/10/02 17:31:42 UTC

Force job to execute on specific host

Hi all,

I have a clustering problem.

In my cluster, I have 2 syncope servers.
On only 1 machine, I have a directory where I put .csv files that should be
imported.
The import is scheduled to run every night, and that works.

But.... How can I force the job to execute on a specific instance? The csv
files are only visible on one host, and I can see that the job is executed
on the other host.

Any tips?
I am looking into NFS to make the files available on both servers, but that
introduces more configuration issues.
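
(For the NFS route, a minimal /etc/fstab entry on the second node would look roughly like the following; the hostname and paths are placeholders, not taken from this thread.)

```
# Hypothetical fstab entry: mount node1's CSV export read-only on node2.
# Host and directory names are illustrative placeholders.
node1:/export/csv-import  /opt/csv-import  nfs  ro,noatime  0  0
```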

Cheers,

Edwin

Re: Force job to execute on specific host

Posted by Edwin van der Elst <ed...@finalist.com>.
Thank you Francesco & Elf
This looks like a good solution.

Regards,

Edwin




Re: Force job to execute on specific host

Posted by Francesco Chicchiriccò <il...@apache.org>.
Hi Edwin and Elf,
to reply to Edwin's original question: Syncope tasks - managed via 
Quartz jobs - are designed not to be executed concurrently, by means 
of the @DisallowConcurrentExecution [1] Quartz annotation; moreover, 
Quartz is configured by default in Syncope to work in clusters [2].
This means that you don't have to worry about your job being executed 
on both cluster nodes at the same time.
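
(As an illustration of what such clustering support looks like, the property names below come from the Quartz configuration reference; the exact values Syncope ships with may differ - see [2] for the authoritative file.)

```properties
# Illustrative quartz.properties for a clustered JDBC job store --
# not necessarily the exact values used by Syncope.
org.quartz.scheduler.instanceId = AUTO
org.quartz.jobStore.class = org.quartz.impl.jdbcjobstore.JobStoreTX
org.quartz.jobStore.isClustered = true
org.quartz.jobStore.clusterCheckinInterval = 20000
```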

Now, to the specific issue of having the CSV files residing on a single 
node: I think that the idea of using a ConnId connector server is 
perfect for the job.
Using ConnId connector servers from Syncope has been supported since 
1.1.0 (consider that the standalone distribution [3] features a local 
connector server, for example).

Basically, you need to set up a ConnId connector server, either as a 
standalone Java process [4] or embedded in a web application [5], 
configure it with the desired connector bundles and finally enable it 
in the Syncope configuration [6].
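
(As a rough sketch of the standalone setup: the Java connector server is driven by a properties file along the following lines. Treat the key names and values as illustrative and check [4] for the authoritative list.)

```properties
# Illustrative ConnectorServer.properties for a standalone ConnId
# connector server -- verify key names against the docs in [4].
connectorserver.port=8759
connectorserver.usessl=false
# Hashed shared secret; normally generated with the server's own
# key-setting option rather than written by hand.
connectorserver.key=<generated key>
```

The CSV connector bundle would then be deployed on the node holding the files, and Syncope pointed at that host and port.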

Hope this helps.
Regards.

[1] http://quartz-scheduler.org/api/2.1.7/org/quartz/DisallowConcurrentExecution.html
[2] http://svn.apache.org/repos/asf/syncope/branches/1_1_X/core/src/main/resources/schedulingContext.xml
[3] https://cwiki.apache.org/confluence/display/SYNCOPE/Run+Syncope+standalone+distribution
[4] https://connid.atlassian.net/wiki/display/BASE/Connector+Servers#ConnectorServers-InstallingaJavaConnectorServer
[5] http://svn.apache.org/repos/asf/syncope/branches/1_1_X/build-tools/src/main/java/org/apache/syncope/buildtools/ConnIdStartStopListener.java
[6] https://cwiki.apache.org/confluence/display/SYNCOPE/Configure+ConnId+locations


-- 
Francesco Chicchiriccò

ASF Member, Apache Syncope PMC chair, Apache Cocoon PMC Member
http://people.apache.org/~ilgrosso/


Re: Force job to execute on specific host

Posted by Smlacc1 <sm...@gmail.com>.
I believe the Java connector server does exactly that - it allows you to have the connector bundles on a separate server from the Syncope server - but in this case you would install it on just one of your Syncope nodes and then point Syncope at that specific node.

There may be a better way to do it, but I think this should work.

Elf

Sent from my iPad


Re: Force job to execute on specific host

Posted by Edwin van der Elst <ed...@finalist.com>.
I will have a look at that; it looks promising.
I think the connectors are executed by Syncope; I don't know whether it is possible to execute them on an 'external' server.

Edwin



Re: Force job to execute on specific host

Posted by Smlacc1 <sm...@gmail.com>.
Could you run the Java connector server as a separate instance on just the server where the CSV files are located? Then you could point Syncope at that. I'm assuming you can run the connector server on the same host as an existing Syncope instance, of course! This seems to suggest you can:

https://connid.atlassian.net/wiki/plugins/servlet/mobile#content/view/360477

Elf
