Posted to user@predictionio.apache.org by Michael Zhou <zh...@gmail.com> on 2019/03/20 17:57:44 UTC

Wrong FS: file:/home/aml/ur/engine.json expected: hdfs://localhost:9000

I'm trying to run the integration test for the Universal Recommender.
However, I've been getting this error when doing "pio deploy":

2019-03-20 17:44:32,856 ERROR akka.actor.OneForOneStrategy [pio-server-akka.actor.default-dispatcher-2] - Wrong FS: file:/home/aml/ur/engine.json, expected: hdfs://localhost:9000
java.lang.IllegalArgumentException: Wrong FS: file:/home/aml/ur/engine.json, expected: hdfs://localhost:9000
        at org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:649)
        at org.apache.hadoop.hdfs.DistributedFileSystem.getPathName(DistributedFileSystem.java:194)
        at org.apache.hadoop.hdfs.DistributedFileSystem.access$000(DistributedFileSystem.java:106)
        at org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1305)
        at org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1301)
        at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
        at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1317)
        at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1426)
        at org.apache.predictionio.workflow.EngineServerPluginContext$.stringFromFile(EngineServerPluginContext.scala:85)
        at org.apache.predictionio.workflow.EngineServerPluginContext$.apply(EngineServerPluginContext.scala:58)
        at org.apache.predictionio.workflow.PredictionServer.<init>(CreateServer.scala:424)
        at org.apache.predictionio.workflow.CreateServer$.createPredictionServerWithEngine(CreateServer.scala:237)
        at org.apache.predictionio.workflow.MasterActor.createServer(CreateServer.scala:389)
        at org.apache.predictionio.workflow.MasterActor$$anonfun$receive$1.applyOrElse(CreateServer.scala:317)
        at akka.actor.Actor$class.aroundReceive(Actor.scala:517)
        at org.apache.predictionio.workflow.MasterActor.aroundReceive(CreateServer.scala:259)
        at akka.actor.ActorCell.receiveMessage(ActorCell.scala:588)
        at akka.actor.ActorCell.invoke(ActorCell.scala:557)
        at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
        at akka.dispatch.Mailbox.run(Mailbox.scala:225)
        at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
        at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
        at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
        at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
        at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)

My pio-env.sh is as follows:

SPARK_HOME=/usr/local/spark
ES_CONF_DIR=/usr/local/elasticsearch
HADOOP_CONF_DIR=/usr/local/hadoop/etc/hadoop
HBASE_CONF_DIR=/usr/local/hbase/conf

PIO_FS_BASEDIR=$HOME/.pio_store
PIO_FS_ENGINESDIR=$PIO_FS_BASEDIR/engines
PIO_FS_TMPDIR=$PIO_FS_BASEDIR/tmp

PIO_STORAGE_REPOSITORIES_METADATA_NAME=pio_meta
PIO_STORAGE_REPOSITORIES_METADATA_SOURCE=ELASTICSEARCH

PIO_STORAGE_REPOSITORIES_EVENTDATA_NAME=pio_event
PIO_STORAGE_REPOSITORIES_EVENTDATA_SOURCE=HBASE

PIO_STORAGE_REPOSITORIES_MODELDATA_NAME=pio_model
PIO_STORAGE_REPOSITORIES_MODELDATA_SOURCE=HDFS

PIO_STORAGE_SOURCES_ELASTICSEARCH_TYPE=elasticsearch
PIO_STORAGE_SOURCES_ELASTICSEARCH_CLUSTERNAME=my-cluster
PIO_STORAGE_SOURCES_ELASTICSEARCH_HOSTS=localhost
PIO_STORAGE_SOURCES_ELASTICSEARCH_PORTS=9200
PIO_STORAGE_SOURCES_ELASTICSEARCH_HOME=/usr/local/elasticsearch

PIO_STORAGE_SOURCES_HDFS_TYPE=hdfs
PIO_STORAGE_SOURCES_HDFS_PATH=/models

PIO_STORAGE_SOURCES_HBASE_TYPE=hbase
PIO_STORAGE_SOURCES_HBASE_HOME=/usr/local/hbase
PIO_STORAGE_SOURCES_HBASE_HOSTS=localhost

Any help would be appreciated.

Re: Wrong FS: file:/home/aml/ur/engine.json expected: hdfs://localhost:9000

Posted by Michael Zhou <zh...@gmail.com>.
Update: This seems to be a regression introduced by PIO 0.14.0; it worked
after I downgraded to PIO 0.13.0.
In particular, I suspect this diff caused the issue:
https://github.com/apache/predictionio/pull/494/files#diff-167f4e9c1445b1f87aad1dead8da208c
It would be great if a committer could confirm this.
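For context, the failure mode the error points at can be sketched as follows. This is a hedged illustration, not the actual PIO source; the method names `existsWrong`/`existsRight` are mine:

```scala
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.{FileSystem, Path}

// Failing pattern: obtain one FileSystem from fs.defaultFS (HDFS with the
// pio-env.sh above) and ask it about a file:/... path. DistributedFileSystem
// checkPath rejects the scheme mismatch with IllegalArgumentException
// "Wrong FS: file:/... expected: hdfs://localhost:9000".
def existsWrong(uri: String, conf: Configuration): Boolean = {
  val fs = FileSystem.get(conf)    // bound to hdfs://localhost:9000
  fs.exists(new Path(uri))         // throws for file:/home/aml/ur/engine.json
}

// Working pattern: resolve the FileSystem from the Path itself, so a
// file: URI gets LocalFileSystem and an hdfs: URI gets HDFS.
def existsRight(uri: String, conf: Configuration): Boolean = {
  val path = new Path(uri)
  path.getFileSystem(conf).exists(path)
}
```

If this is indeed the cause, resolving the filesystem via `path.getFileSystem(conf)` in `stringFromFile` would let `pio deploy` read a local engine.json even when `fs.defaultFS` points at HDFS.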

On Wed, Mar 20, 2019 at 10:57 AM Michael Zhou <zh...@gmail.com>
wrote:

> I'm trying to run the integration test for the Universal Recommender.
> However, I've been getting this error when doing "pio deploy":
>
> [...]

Re: Wrong FS: file:/home/aml/ur/engine.json expected: hdfs://localhost:9000

Posted by Michael Zhou <zh...@gmail.com>.
I just did ./make-distribution.sh from the pio source directory.

On Fri, Mar 29, 2019, 12:08 Pat Ferrel <pa...@occamsmachete.com> wrote:

> [...]
>
> I’d be interested in helping but let’s move back to PIO 0.14.0 first. When
> you build PIO what is the exact command line?
>
> [...]

Re: Wrong FS: file:/home/aml/ur/engine.json expected: hdfs://localhost:9000

Posted by Pat Ferrel <pa...@occamsmachete.com>.
Templates have their own build.sbt. This means that if you upgrade a version of PIO you need to upgrade the dependencies in ALL your templates. So what you are calling a regression may just be that the UR needs its dependencies upgraded.

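As an illustration of the kind of dependency bump meant here, a hypothetical excerpt of a template's build.sbt follows; artifact names and versions are illustrative, so check the template's own file:

```scala
// The PIO artifact version must match the installed PIO (0.14.0 here);
// "provided" because pio run/deploy supplies these jars at runtime.
libraryDependencies ++= Seq(
  "org.apache.predictionio" %% "apache-predictionio-core" % "0.14.0" % "provided",
  "org.apache.spark"        %% "spark-mllib"              % "2.4.0"  % "provided"
)
```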
I’d be interested in helping but let’s move back to PIO 0.14.0 first. When you build PIO what is the exact command line?

From: Michael Zhou <zh...@gmail.com>
Reply: user@predictionio.apache.org <us...@predictionio.apache.org>
Date: March 20, 2019 at 12:05:26 PM
To: user@predictionio.apache.org <us...@predictionio.apache.org>
Subject:  Re: Wrong FS: file:/home/aml/ur/engine.json expected: hdfs://localhost:9000  

Update: This seems like a regression introduced by pio 0.14.0. It worked after I downgraded to pio 0.13.0.

[...]