Posted to user-zh@flink.apache.org by chenxuying <cx...@163.com> on 2020/09/28 07:40:16 UTC

flink 1.11.2 deployed on k8s following the official docs works fine, but after adding a volume configuration it fails with Read-only file system

When I deployed on k8s I followed the official approach [1] and everything worked fine. I then added a volume configuration:

{
  ...
  "spec": {
    ...
    "template": {
      ...
      "spec": {
        "volumes": [
          ...
          {
            "name": "libs-volume",
            "hostPath": {
              "path": "/data/volumes/flink/jobmanager/cxylib",
              "type": ""
            }
          },
          ...
        ],
        "containers": [
          {
            ...
            "volumeMounts": [
              {
                "name": "flink-config-volume",
                "mountPath": "/opt/flink/conf"
              },
              ...
            ],
            ...
          }
        ],
        ...
      }
    },
    ...
  },
  ...
}

Then the jobmanager fails on startup with:

Starting Job Manager
sed: couldn't open temporary file /opt/flink/conf/sedz0NYKX: Read-only file system
sed: couldn't open temporary file /opt/flink/conf/sede6R0BY: Read-only file system
/docker-entrypoint.sh: 72: /docker-entrypoint.sh: cannot create /opt/flink/conf/flink-conf.yaml: Permission denied
/docker-entrypoint.sh: 91: /docker-entrypoint.sh: cannot create /opt/flink/conf/flink-conf.yaml.tmp: Read-only file system
Starting standalonesession as a console application on host flink-jobmanager-66fb98869d-w7plb.
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
Sep 28, 2020 7:11:14 AM org.apache.hadoop.util.NativeCodeLoader <clinit>
WARNING: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable




[1]:https://ci.apache.org/projects/flink/flink-docs-release-1.11/ops/deployment/kubernetes.html#session-cluster-resource-definitions

Re: flink 1.11.2 deployed on k8s following the official docs works fine, but after adding a volume configuration it fails with Read-only file system

Posted by Yang Wang <da...@gmail.com>.
sed: couldn't open temporary file /opt/flink/conf/sedz0NYKX: Read-only file system

The error above occurs because the mounted flink-conf volume is not writable inside the container, while docker-entrypoint.sh [1] tries to modify that file, hence the messages.
If all of your configuration is already defined in the flink-conf ConfigMap, this error can safely be ignored.

[1]. https://github.com/apache/flink-docker/blob/dev-master/docker-entrypoint.sh

Best,
Yang
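
If you do want the entrypoint's sed edits to succeed, one possible workaround is to mount the ConfigMap at a staging path and let an init container copy it into a writable emptyDir that the JobManager then mounts at /opt/flink/conf. This is only a sketch, not something discussed in this thread; the "flink-conf-writable" and "copy-flink-conf" names and the image tag are illustrative.

# Pod template spec sketch (names and image tag are illustrative):
# copy the read-only ConfigMap contents into a writable emptyDir so that
# docker-entrypoint.sh can edit flink-conf.yaml at startup.
spec:
  volumes:
    - name: flink-config-volume          # existing ConfigMap volume (items omitted for brevity)
      configMap:
        name: flink-config
    - name: flink-conf-writable          # made-up name for the writable copy
      emptyDir: {}
  initContainers:
    - name: copy-flink-conf
      image: flink:1.11.2                # use whatever Flink image the Deployment already uses
      command: ["sh", "-c", "cp /flink-conf-ro/* /opt/flink/conf/"]
      volumeMounts:
        - name: flink-config-volume
          mountPath: /flink-conf-ro
        - name: flink-conf-writable
          mountPath: /opt/flink/conf
  containers:
    - name: jobmanager
      # ... image, args, ports as in the official resource definitions ...
      volumeMounts:
        - name: flink-conf-writable      # writable copy instead of the ConfigMap mount
          mountPath: /opt/flink/conf

The warnings then disappear, at the cost of an extra init container; as noted above, simply ignoring them is also fine when everything is already set in the ConfigMap.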

cxydevelop@163.com <cx...@163.com> wrote on Sat, Oct 10, 2020 at 11:20 AM:

> The problem has been solved. The mounted lib directory still contained jars from the old 1.10 release; I had only swapped
> some of the Flink 1.10 jars for 1.11 versions and the replacement was incomplete, so the cluster would not start. After
> emptying the mounted directory, replacing its contents with the standard lib, and then adding the jars I need, it starts up normally.

Re: flink 1.11.2 deployed on k8s following the official docs works fine, but after adding a volume configuration it fails with Read-only file system

Posted by "cxydevelop@163.com" <cx...@163.com>.
The problem has been solved. The mounted lib directory still contained jars from the old 1.10 release; I had only swapped
some of the Flink 1.10 jars for 1.11 versions and the replacement was incomplete, so the cluster would not start. After
emptying the mounted directory, replacing its contents with the standard lib, and then adding the jars I need, it starts up normally.
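
To keep a mounted lib directory from drifting away from the image's Flink version, one alternative (a sketch only, not what was done here; the "flink-lib" / "seed-lib" names and the user-jars hostPath are illustrative) is to let an init container rebuild /opt/flink/lib from the image on every start and then overlay the user jars:

# Sketch: rebuild /opt/flink/lib from the image itself, then add user jars,
# so the jar set always matches the Flink version baked into the image.
# Volume/container names and the user-jars hostPath are illustrative.
spec:
  volumes:
    - name: flink-lib
      emptyDir: {}
    - name: user-jars
      hostPath:
        path: /data/volumes/flink/jobmanager/user-jars   # hypothetical path
        type: Directory
  initContainers:
    - name: seed-lib
      image: flink:1.11.2              # same image as the JobManager container
      command: ["sh", "-c", "cp -a /opt/flink/lib/. /flink-lib/ && cp -a /user-jars/. /flink-lib/"]
      volumeMounts:
        - name: flink-lib
          mountPath: /flink-lib
        - name: user-jars
          mountPath: /user-jars
  containers:
    - name: jobmanager
      volumeMounts:
        - name: flink-lib              # complete, version-matched lib
          mountPath: /opt/flink/lib

Upgrading the image then automatically refreshes the mounted lib, which avoids the stale-1.10-jar problem described above.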




Re: flink 1.11.2 deployed on k8s following the official docs works fine, but after adding a volume configuration it fails with Read-only file system

Posted by "cxydevelop@163.com" <cx...@163.com>.
Before any changes, the official example [1] started up fine.
The original configuration was as follows:
...
"volumes": [
  {
	"name": "flink-config-volume",
	"configMap": {
	  "name": "flink-config",
	  "items": [
		{
		  "key": "flink-conf.yaml",
		  "path": "flink-conf.yaml"
		},
		{
		  "key": "log4j-console.properties",
		  "path": "log4j-console.properties"
		}
	  ],
	  "defaultMode": 420
	}
  }
],
...
"volumeMounts": [
  {
	"name": "flink-config-volume",
	"mountPath": "/opt/flink/conf"
  }
],
...

Later I edited the yaml on k8s to add more volumes,
mounting uploadjar, completed-jobs and lib, as follows:
...
"volumes": [
  {
	"name": "flink-config-volume",
	"configMap": {
	  "name": "flink-config",
	  "items": [
		{
		  "key": "flink-conf.yaml",
		  "path": "flink-conf.yaml"
		},
		{
		  "key": "log4j-console.properties",
		  "path": "log4j-console.properties"
		}
	  ],
	  "defaultMode": 420
	}
  },
  {
	"name": "flink-uploadjar-volume",
	"hostPath": {
	  "path": "/data/volumes/flink/jobmanager/uploadjar",
	  "type": ""
	}
  },
  {
	"name": "flink-completejobs-volume",
	"hostPath": {
	  "path": "/data/volumes/flink/jobmanager/completed-jobs/",
	  "type": ""
	}
  },
  {
	"name": "libs-volume",
	"hostPath": {
	  "path": "/data/volumes/flink/jobmanager/lib",
	  "type": ""
	}
  }
],
...
"volumeMounts": [
  {
	"name": "flink-config-volume",
	"mountPath": "/opt/flink/conf"
  },
  {
	"name": "flink-uploadjar-volume",
	"mountPath": "/opt/flink/flink-uploadjar"
  },
  {
	"name": "flink-completejobs-volume",
	"mountPath": "/opt/flink/completed-jobs/"
  },
  {
	"name": "libs-volume",
	"mountPath": "/opt/flink/lib"
  }
],
...
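
As a side note unrelated to the Read-only error: all three hostPath volumes above use "type": "", which makes the kubelet skip any check on the host path. Declaring the type explicitly (a small sketch, nothing from this thread) makes a missing host directory fail fast when the pod is scheduled instead of surfacing later at runtime:

volumes:
  - name: libs-volume
    hostPath:
      path: /data/volumes/flink/jobmanager/lib
      # "Directory" makes the kubelet verify the path exists on the node
      # before starting the pod; "" performs no check at all.
      type: Directory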
After adding these volumes, the following error appeared:
2020-09-29T12:09:33.055804861Z Starting Job Manager
2020-09-29T12:09:33.061359237Z sed: couldn't open temporary file /opt/flink/conf/sedVef7YR: Read-only file system
2020-09-29T12:09:33.06561576Z sed: couldn't open temporary file /opt/flink/conf/sed4a7zGR: Read-only file system
2020-09-29T12:09:33.068683501Z /docker-entrypoint.sh: 72: /docker-entrypoint.sh: cannot create /opt/flink/conf/flink-conf.yaml: Permission denied
2020-09-29T12:09:33.068700999Z /docker-entrypoint.sh: 91: /docker-entrypoint.sh: cannot create /opt/flink/conf/flink-conf.yaml.tmp: Read-only file system
2020-09-29T12:09:33.919147511Z Starting standalonesession as a console application on host flink-jobmanager-6d5bc45dbb-jjs4f.
2020-09-29T12:09:34.154740531Z log4j:WARN No appenders could be found for logger (org.apache.flink.runtime.entrypoint.ClusterEntrypoint).
2020-09-29T12:09:34.154769537Z log4j:WARN Please initialize the log4j system properly.
2020-09-29T12:09:34.154776198Z log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.

After some digging I found that the error appears as soon as /opt/flink/lib is mounted; mounting the other directories, /opt/flink/flink-uploadjar and /opt/flink/completed-jobs/, works fine.

It is the following configuration that triggers the error:
"volumeMounts": [
  ...
  {
	"name": "libs-volume",
	"mountPath": "/opt/flink/lib"
  }
  ...
],
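
For comparison, a mount like the following (a sketch; "my-connector.jar" is a placeholder) adds a single jar into /opt/flink/lib without hiding the jars shipped in the image, because subPath mounts just that one file over the directory:

volumeMounts:
  - name: libs-volume
    # Only this one file from the hostPath volume is mounted; the image's
    # own /opt/flink/lib contents remain visible. Jar name is a placeholder.
    mountPath: /opt/flink/lib/my-connector.jar
    subPath: my-connector.jar

Mounting the whole directory, as above, replaces everything under /opt/flink/lib with the host directory's contents, which is why the complete 1.11 jar set has to be present there.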



[1]:https://ci.apache.org/projects/flink/flink-docs-release-1.11/ops/deployment/kubernetes.html#session-cluster-resource-definitions



