Posted to dev@predictionio.apache.org by "Takako Shimamoto (JIRA)" <ji...@apache.org> on 2017/11/13 10:27:00 UTC

[jira] [Commented] (PIO-137) Connection pool is not yet initialized.(name:'default)

    [ https://issues.apache.org/jira/browse/PIO-137?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16249353#comment-16249353 ] 

Takako Shimamoto commented on PIO-137:
--------------------------------------

The connection pool is initialized on the Spark driver, but JDBCPEvents then tries to use it on a Spark worker when deleting records in the RDD. The correct fix is to initialize the connection pool (or create a connection object) on the worker.
I’ll handle this issue.
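
As a rough illustration of the idea (not the actual patch), the pool can be initialized inside foreachPartition so it exists on the executor. The JDBC url/user/password values and the pio_event table name below are placeholders:

{code:scala}
// Sketch only: initialize the ScalikeJDBC connection pool on the worker side
// (inside foreachPartition) instead of relying on the pool created on the driver.
import scalikejdbc._
import org.apache.spark.rdd.RDD

def deleteEvents(eventIds: RDD[String], url: String, user: String, password: String): Unit = {
  eventIds.foreachPartition { ids =>
    // Runs on the executor: the driver's pool does not exist here, so
    // initialize it before touching the database.
    if (!ConnectionPool.isInitialized()) {
      ConnectionPool.singleton(url, user, password)
    }
    DB.localTx { implicit session =>
      ids.foreach { id =>
        // Placeholder table/column names for illustration only.
        sql"DELETE FROM pio_event WHERE id = $id".update.apply()
      }
    }
  }
}
{code}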

> Connection pool is not yet initialized.(name:'default)
> ------------------------------------------------------
>
>                 Key: PIO-137
>                 URL: https://issues.apache.org/jira/browse/PIO-137
>             Project: PredictionIO
>          Issue Type: Bug
>          Components: Core
>    Affects Versions: 0.11.0-incubating
>            Reporter: Mukesh Gupta
>
> While running the http://github.com/actionml/db-cleaner template, I am facing the following issue:
> {code:java}
> [WARN] [TaskSetManager] Lost task 0.1 in stage 23.0 (TID 75, ip-172-31-105-42.ap-southeast-1.compute.internal): java.lang.IllegalStateException: Connection pool is not yet initialized.(name:'default)
>         at scalikejdbc.ConnectionPool$$anonfun$get$1.apply(ConnectionPool.scala:76)
>         at scalikejdbc.ConnectionPool$$anonfun$get$1.apply(ConnectionPool.scala:74)
>         at scala.collection.MapLike$class.getOrElse(MapLike.scala:128)
>         at scala.collection.AbstractMap.getOrElse(Map.scala:58)
>         at scalikejdbc.ConnectionPool$.get(ConnectionPool.scala:74)
>         at scalikejdbc.ConnectionPool$.apply(ConnectionPool.scala:65)
>         at scalikejdbc.DB$.connectionPool(DB.scala:152)
>         at scalikejdbc.DB$.localTx(DB.scala:262)
>         at org.apache.predictionio.data.storage.jdbc.JDBCPEvents$$anonfun$delete$1$$anonfun$apply$11.apply(JDBCPEvents.scala:182)
>         at org.apache.predictionio.data.storage.jdbc.JDBCPEvents$$anonfun$delete$1$$anonfun$apply$11.apply(JDBCPEvents.scala:181)
>         at scala.collection.Iterator$class.foreach(Iterator.scala:727)
>         at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
>         at org.apache.predictionio.data.storage.jdbc.JDBCPEvents$$anonfun$delete$1.apply(JDBCPEvents.scala:181)
>         at org.apache.predictionio.data.storage.jdbc.JDBCPEvents$$anonfun$delete$1.apply(JDBCPEvents.scala:179)
>         at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$33.apply(RDD.scala:920)
>         at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$33.apply(RDD.scala:920)
>         at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1858)
>         at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1858)
>         at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
>         at org.apache.spark.scheduler.Task.run(Task.scala:89)
>         at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:227)
>         at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>         at java.lang.Thread.run(Thread.java:748)
> {code}
> The following are my PIO settings:
> {code:java}
> PIO_STORAGE_REPOSITORIES_METADATA_NAME=pio_meta
> PIO_STORAGE_SOURCES_ELASTICSEARCH_HOME=/elasticsearch
> PIO_STORAGE_REPOSITORIES_EVENTDATA_NAME=pio_event
> PIO_STORAGE_SOURCES_PGSQL_INDEX=enabled
> PIO_STORAGE_REPOSITORIES_EVENTDATA_SOURCE=PGSQL
> PIO_STORAGE_SOURCES_ELASTICSEARCH_TYPE=elasticsearch
> PIO_STORAGE_SOURCES_ELASTICSEARCH_PORTS=9300
> PIO_STORAGE_REPOSITORIES_METADATA_SOURCE=ELASTICSEARCH
> PIO_STORAGE_REPOSITORIES_MODELDATA_NAME=pio_model
> PIO_HOME=/PredictionIO
> PIO_STORAGE_SOURCES_PGSQL_USERNAME=dbuser
> PIO_STORAGE_REPOSITORIES_MODELDATA_SOURCE=PGSQL
> PIO_STORAGE_SOURCES_PGSQL_URL=jdbc:postgresql://hostname:5432/predictionio
> PIO_STORAGE_SOURCES_ELASTICSEARCH_HOSTS=es-hostname
> PIO_STORAGE_SOURCES_ELASTICSEARCH_CLUSTERNAME=es_cluster
> PIO_STORAGE_SOURCES_PGSQL_TYPE=jdbc
> PIO_STORAGE_SOURCES_PGSQL_PASSWORD=xxxxxxxxxxxxxx
> {code}
> I have tried upgrading the ScalikeJDBC version to 3.1.0 (as hinted by this commit: https://github.com/scalikejdbc/scalikejdbc/commit/b7713a8dbfb72b05f43743a5c281b5d7f7bac824), but the issue still remains.



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)