Posted to dev@griffin.apache.org by "Lionel Liu (JIRA)" <ji...@apache.org> on 2018/08/22 02:20:00 UTC

[jira] [Comment Edited] (GRIFFIN-188) Docker dev question

    [ https://issues.apache.org/jira/browse/GRIFFIN-188?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16588281#comment-16588281 ] 

Lionel Liu edited comment on GRIFFIN-188 at 8/22/18 2:19 AM:
-------------------------------------------------------------

Hi [~djkooks], it seems you're using the docker containers as Griffin's dependent environment, and running GriffinWebApplication locally or via your IDE, am I right?

1. In 'service/src/main/resources/application.properties' you set:

```
spring.datasource.url=jdbc:postgresql://192.168.99.100:5432/quartz?autoReconnect=true&useSSL=false
```

192.168.99.100 should be your docker host IP address. Since you cannot access the docker container directly, the docker-compose.yml file maps port 5432 of the docker container to port 35432 of the docker host. Thus you need to set it like this:

```
spring.datasource.url=jdbc:postgresql://192.168.99.100:35432/quartz?autoReconnect=true&useSSL=false
```
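
For reference, the relevant port mapping in docker-compose.yml looks roughly like this (a sketch only; the service name `postgres` is an assumption and may differ in the file you're using):

```
services:
  postgres:
    ports:
      - "35432:5432"   # host port 35432 -> container port 5432
```

The short `"HOST:CONTAINER"` syntax is what makes the container's 5432 reachable on the docker host at 35432.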

 

2. I've noticed that you're running the code of the master branch. Because we recently modified the JSON format of the measure module, the docker image `bhlx3lyx7/griffin_spark2:0.2.0` is out of date. We've updated the docker image in the last few days; you can pull the new image `bhlx3lyx7/griffin_spark2:0.2.1`, and update the version number in the docker-compose.yml you're using as well.
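
The update amounts to something like the following (illustrative commands; adjust to however you normally run compose):

```
docker pull bhlx3lyx7/griffin_spark2:0.2.1
# edit the image tag in your docker-compose.yml from 0.2.0 to 0.2.1, then recreate:
docker-compose up -d
```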

We'll also update the documentation later.

 

Hope this helps you, thanks.



> Docker dev question
> -------------------
>
>                 Key: GRIFFIN-188
>                 URL: https://issues.apache.org/jira/browse/GRIFFIN-188
>             Project: Griffin (Incubating)
>          Issue Type: Task
>            Reporter: Kwang-in (Dennis) JUNG
>            Assignee: Lionel Liu
>            Priority: Trivial
>
> Hello,
> I'm following the guide in `environment for dev`, and I have finished the docker containers setup (the API works via Postman).
> Now, I set up the properties values and ran GriffinWebApplication, but it failed:
> ```
> 2018-08-21 14:45:12.385 INFO 7667 --- [ main] o.a.g.c.c.EnvConfig : {
>  "spark" : {
>  "log.level" : "WARN",
>  "checkpoint.dir" : "hdfs:///griffin/checkpoint/${JOB_NAME}",
>  "init.clear" : true,
>  "batch.interval" : "1m",
>  "process.interval" : "5m",
>  "config" : {
>  "spark.default.parallelism" : 4,
>  "spark.task.maxFailures" : 5,
>  "spark.streaming.kafkaMaxRatePerPartition" : 1000,
>  "spark.streaming.concurrentJobs" : 4,
>  "spark.yarn.maxAppAttempts" : 5,
>  "spark.yarn.am.attemptFailuresValidityInterval" : "1h",
>  "spark.yarn.max.executor.failures" : 120,
>  "spark.yarn.executor.failuresValidityInterval" : "1h",
>  "spark.hadoop.fs.hdfs.impl.disable.cache" : true
>  }
>  },
>  "sinks" : [ {
>  "type" : "CONSOLE",
>  "config" : {
>  "max.log.lines" : 100
>  }
>  }, {
>  "type" : "HDFS",
>  "config" : {
>  "path" : "hdfs:///griffin/persist",
>  "max.persist.lines" : 10000,
>  "max.lines.per.file" : 10000
>  }
>  }, {
>  "type" : "ELASTICSEARCH",
>  "config" : {
>  "method" : "post",
>  "api" : "http://es:9200/griffin/accuracy"
>  }
>  } ],
>  "griffin.checkpoint" : [ {
>  "type" : "zk",
>  "config" : {
>  "hosts" : "zk:2181",
>  "namespace" : "griffin/infocache",
>  "lock.path" : "lock",
>  "mode" : "persist",
>  "init.clear" : false,
>  "close.clear" : false
>  }
>  } ]
> }
> 2018-08-21 14:45:12.387 INFO 7667 --- [ main] o.a.g.c.u.FileUtil : Location is empty. Read from default path.
> 2018-08-21 14:45:12.396 INFO 7667 --- [ main] o.a.g.c.u.FileUtil : Location is empty. Read from default path.
> 2018-08-21 14:45:12.397 INFO 7667 --- [ main] o.s.b.f.c.PropertiesFactoryBean : Loading properties file from class path resource [quartz.properties]
> 2018-08-21 14:45:12.400 INFO 7667 --- [ main] o.a.g.c.u.PropertiesUtil : Read properties successfully from /quartz.properties.
> 2018-08-21 14:45:12.516 INFO 7667 --- [ main] o.q.i.StdSchedulerFactory : Using default implementation for ThreadExecutor
> 2018-08-21 14:45:12.605 INFO 7667 --- [ main] o.q.c.SchedulerSignalerImpl : Initialized Scheduler Signaller of type: class org.quartz.core.SchedulerSignalerImpl
> 2018-08-21 14:45:12.605 INFO 7667 --- [ main] o.q.c.QuartzScheduler : Quartz Scheduler v.2.2.2 created.
> 2018-08-21 14:45:22.613 INFO 7667 --- [ main] o.s.s.q.LocalDataSourceJobStore : Could not detect database type. Assuming locks can be taken.
> 2018-08-21 14:45:22.613 INFO 7667 --- [ main] o.s.s.q.LocalDataSourceJobStore : Using db table-based data access locking (synchronization).
> Aug 21, 2018 2:45:22 PM org.apache.tomcat.jdbc.pool.ConnectionPool init
> SEVERE: Unable to create initial connections of pool.
> org.postgresql.util.PSQLException: The connection attempt failed.
>  at org.postgresql.core.v3.ConnectionFactoryImpl.openConnectionImpl(ConnectionFactoryImpl.java:272)
>  at org.postgresql.core.ConnectionFactory.openConnection(ConnectionFactory.java:51)
>  at org.postgresql.jdbc.PgConnection.<init>(PgConnection.java:215)
>  at org.postgresql.Driver.makeConnection(Driver.java:404)
>  at org.postgresql.Driver.connect(Driver.java:272)
>  at org.apache.tomcat.jdbc.pool.PooledConnection.connectUsingDriver(PooledConnection.java:310)
>  at org.apache.tomcat.jdbc.pool.PooledConnection.connect(PooledConnection.java:203)
>  at org.apache.tomcat.jdbc.pool.ConnectionPool.createConnection(ConnectionPool.java:732)
>  at org.apache.tomcat.jdbc.pool.ConnectionPool.borrowConnection(ConnectionPool.java:664)
>  at org.apache.tomcat.jdbc.pool.ConnectionPool.init(ConnectionPool.java:479)
>  at org.apache.tomcat.jdbc.pool.ConnectionPool.<init>(ConnectionPool.java:154)
>  at org.apache.tomcat.jdbc.pool.DataSourceProxy.pCreatePool(DataSourceProxy.java:118)
>  at org.apache.tomcat.jdbc.pool.DataSourceProxy.createPool(DataSourceProxy.java:107)
>  at org.apache.tomcat.jdbc.pool.DataSourceProxy.getConnection(DataSourceProxy.java:131)
>  at org.springframework.jdbc.datasource.DataSourceUtils.doGetConnection(DataSourceUtils.java:111)
>  at org.springframework.jdbc.datasource.DataSourceUtils.getConnection(DataSourceUtils.java:77)
>  at org.springframework.jdbc.support.JdbcUtils.extractDatabaseMetaData(JdbcUtils.java:326)
>  at org.springframework.jdbc.support.JdbcUtils.extractDatabaseMetaData(JdbcUtils.java:366)
>  at org.springframework.scheduling.quartz.LocalDataSourceJobStore.initialize(LocalDataSourceJobStore.java:150)
>  at org.quartz.impl.StdSchedulerFactory.instantiate(StdSchedulerFactory.java:1321)
>  at org.quartz.impl.StdSchedulerFactory.getScheduler(StdSchedulerFactory.java:1525)
>  at org.springframework.scheduling.quartz.SchedulerFactoryBean.createScheduler(SchedulerFactoryBean.java:597)
>  at org.springframework.scheduling.quartz.SchedulerFactoryBean.afterPropertiesSet(SchedulerFactoryBean.java:480)
>  at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeInitMethods(AbstractAutowireCapableBeanFactory.java:1687)
>  at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1624)
>  at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:555)
>  at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:483)
>  at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:306)
>  at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:230)
>  at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:302)
>  at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:202)
>  at org.springframework.beans.factory.config.DependencyDescriptor.resolveCandidate(DependencyDescriptor.java:208)
>  at org.springframework.beans.factory.support.DefaultListableBeanFactory.doResolveDependency(DefaultListableBeanFactory.java:1138)
>  at org.springframework.beans.factory.support.DefaultListableBeanFactory.resolveDependency(DefaultListableBeanFactory.java:1066)
>  at org.springframework.beans.factory.annotation.AutowiredAnnotationBeanPostProcessor$AutowiredFieldElement.inject(AutowiredAnnotationBeanPostProcessor.java:585)
>  at org.springframework.beans.factory.annotation.InjectionMetadata.inject(InjectionMetadata.java:88)
>  at org.springframework.beans.factory.annotation.AutowiredAnnotationBeanPostProcessor.postProcessPropertyValues(AutowiredAnnotationBeanPostProcessor.java:366)
>  at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.populateBean(AbstractAutowireCapableBeanFactory.java:1264)
>  at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:553)
>  at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:483)
>  at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:306)
>  at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:230)
>  at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:302)
>  at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:197)
>  at org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:761)
>  at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:866)
>  at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:542)
>  at org.springframework.boot.context.embedded.EmbeddedWebApplicationContext.refresh(EmbeddedWebApplicationContext.java:122)
>  at org.springframework.boot.SpringApplication.refresh(SpringApplication.java:737)
>  at org.springframework.boot.SpringApplication.refreshContext(SpringApplication.java:370)
>  at org.springframework.boot.SpringApplication.run(SpringApplication.java:314)
>  at org.springframework.boot.SpringApplication.run(SpringApplication.java:1162)
>  at org.springframework.boot.SpringApplication.run(SpringApplication.java:1151)
>  at org.apache.griffin.core.GriffinWebApplication.main(GriffinWebApplication.java:39)
> Caused by: java.net.SocketTimeoutException: connect timed out
>  at java.net.PlainSocketImpl.socketConnect(Native Method)
>  at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
>  at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
>  at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
>  at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
>  at java.net.Socket.connect(Socket.java:589)
>  at org.postgresql.core.PGStream.<init>(PGStream.java:61)
>  at org.postgresql.core.v3.ConnectionFactoryImpl.openConnectionImpl(ConnectionFactoryImpl.java:144)
>  ... 53 more
> 2018-08-21 14:45:22.618 INFO 7667 --- [ main] o.s.s.q.LocalDataSourceJobStore : JobStoreCMT initialized.
> 2018-08-21 14:45:22.619 INFO 7667 --- [ main] o.q.c.QuartzScheduler : Scheduler meta-data: Quartz Scheduler (v2.2.2) 'schedulerFactoryBean' with instanceId 'LM-MacBook-531534830312583'
>  Scheduler class: 'org.quartz.core.QuartzScheduler' - running locally.
>  NOT STARTED.
>  Currently in standby mode.
>  Number of jobs executed: 0
>  Using thread pool 'org.quartz.simpl.SimpleThreadPool' - with 5 threads.
>  Using job-store 'org.springframework.scheduling.quartz.LocalDataSourceJobStore' - which supports persistence. and is clustered.
> ```
> It looks like there is a problem accessing PostgreSQL. Do I have to do something more?
> I set up 'service/src/main/resources/application.properties' as:
> ```
> spring.datasource.url=jdbc:postgresql://192.168.99.100:5432/quartz?autoReconnect=true&useSSL=false
> spring.datasource.username=griffin
> spring.datasource.password=123456
> spring.jpa.generate-ddl=true
> spring.datasource.driver-class-name=org.postgresql.Driver
> ```
>  
> Thanks



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)