MLPS Assessment Commands: Flink

If you'd like similar assessment command checklists for network devices, operating systems, databases, or middleware, leave a comment!

Based on the Level 3 "Secure Computing Environment" clauses of GB/T 22239-2019 (Information security technology: Baseline for classified protection of cybersecurity), combined with the official Flink security guide and hands-on assessment practice.

Applicable versions: Flink 1.12.x / 1.13.x / 1.14.x / 1.15.x / 1.16.x / 1.17.x / 1.18.x


1. Identity Authentication

1.1 Web UI and REST API Authentication

| Control Item | Command / Configuration | Compliance Criterion |
| --- | --- | --- |
| Web UI authentication | flink-conf.yaml security.ssl.rest.enabled | HTTPS enabled |
| User authentication | flink-conf.yaml security.authentication.enabled | authentication enabled |
| Kerberos authentication | flink-conf.yaml security.kerberos.login.keytab | Kerberos enabled |
| Certificate authentication | flink-conf.yaml security.ssl.rest.keystore | certificate-based authentication |
| Session management | Web UI cookie settings | HttpOnly/Secure enabled |

Flink-specific configuration:

# List the Flink configuration directory
ls -la $FLINK_HOME/conf/

# View the core configuration file
cat $FLINK_HOME/conf/flink-conf.yaml

# Check the key security settings
grep -E "security|ssl|auth|kerberos|password" $FLINK_HOME/conf/flink-conf.yaml

# Key settings:
# security.ssl.rest.enabled: true
# security.ssl.internal.enabled: true
# security.authentication.enabled: true
# security.kerberos.login.keytab: /path/to/keytab
# security.kerberos.login.principal: flink-user@EXAMPLE.COM
# security.ssl.rest.keystore: /path/to/keystore.jks
# security.ssl.rest.truststore: /path/to/truststore.jks
# security.ssl.rest.keystore-password: ${keystore_password}
# security.ssl.rest.key-password: ${key_password}
# security.ssl.rest.truststore-password: ${truststore_password}

# View the History Server configuration
cat $FLINK_HOME/conf/flink-conf.yaml | grep -E "historyserver|history-server"

# View the Web UI binding configuration
cat $FLINK_HOME/conf/flink-conf.yaml | grep -E "rest.bind-address|rest.port|webui"

# Check whether the UI is bound to a specific IP (rather than 0.0.0.0)
cat $FLINK_HOME/conf/flink-conf.yaml | grep "rest.bind-address"
# Should be a specific IP, not 0.0.0.0

# View REST API access control
cat $FLINK_HOME/conf/flink-conf.yaml | grep -E "rest.address|rest.port|rest.ssl"

# View cookie security settings
cat $FLINK_HOME/conf/flink-conf.yaml | grep -E "session|cookie"
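The scattered greps above can be condensed into a small pass/fail helper; a minimal sketch (the check_cfg function and the /opt/flink default path are assumptions for illustration, not official Flink tooling):

```shell
# check_cfg: report PASS/FAIL for one "key: value" entry in flink-conf.yaml.
check_cfg() {
    file=$1; key=$2; want=$3
    # take the last occurrence of the key, as later entries win
    got=$(grep -E "^${key}:" "$file" 2>/dev/null | awk '{print $2}' | tail -1)
    if [ "$got" = "$want" ]; then
        echo "PASS ${key}=${got}"
    else
        echo "FAIL ${key}=${got:-unset} (expected ${want})"
    fi
}

# Example usage against a live installation:
CONF="${FLINK_HOME:-/opt/flink}/conf/flink-conf.yaml"
check_cfg "$CONF" security.ssl.rest.enabled true
check_cfg "$CONF" security.ssl.internal.enabled true
check_cfg "$CONF" security.authentication.enabled true
```

Each line prints PASS or FAIL, so the result can be pasted straight into an assessment record.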

1.2 Kerberos Authentication (recommended for production)

# View Kerberos configuration
cat $FLINK_HOME/conf/flink-conf.yaml | grep -A5 "kerberos"

# Key settings:
# security.kerberos.login.keytab: /etc/flink/conf/flink.keytab
# security.kerberos.login.principal: flink@EXAMPLE.COM
# security.kerberos.login.use-ticket-cache: false
# security.kerberos.login.contexts: Client,KafkaClient

# View Hadoop security integration
cat $FLINK_HOME/conf/flink-conf.yaml | grep -E "hadoop.security|dfs.namenode.kerberos"

# View ZooKeeper security configuration
cat $FLINK_HOME/conf/flink-conf.yaml | grep -E "zookeeper.sasl|zk.sasl"

# View Kafka security integration
cat $FLINK_HOME/conf/flink-conf.yaml | grep -E "kafka.security|kafka.sasl|kafka.ssl"

# View HBase security integration
cat $FLINK_HOME/conf/flink-conf.yaml | grep -E "hbase.security|hbase.kerberos"

# View Hive security integration
cat $FLINK_HOME/conf/flink-conf.yaml | grep -E "hive.security|hive.metastore.kerberos"
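The keytab file itself can also be inspected on the spot: it should exist and not be readable by other users (klist -kt will additionally list the principals it holds). A sketch, assuming GNU stat; check_keytab is an illustrative helper:

```shell
# check_keytab: the keytab should exist and be mode 600 (or 400).
check_keytab() {
    keytab=$1
    if [ ! -f "$keytab" ]; then
        echo "FAIL keytab not found: $keytab"
        return 1
    fi
    mode=$(stat -c %a "$keytab" 2>/dev/null)   # GNU stat; BSD uses stat -f %Lp
    case "$mode" in
        600|400) echo "PASS keytab mode $mode: $keytab" ;;
        *)       echo "FAIL keytab mode ${mode:-unknown} (expected 600/400): $keytab" ;;
    esac
}

# Pull the configured path out of flink-conf.yaml, then inspect it:
# keytab=$(awk '/^security.kerberos.login.keytab:/ {print $2}' $FLINK_HOME/conf/flink-conf.yaml)
# check_keytab "$keytab" && klist -kt "$keytab"
```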

2. Access Control

2.1 Network and Port Access Control

# View Flink port configuration
cat $FLINK_HOME/conf/flink-conf.yaml | grep -E "port|bind|address"

# Key ports:
# jobmanager.rpc.port: 6123 (JobManager RPC)
# taskmanager.rpc.port: 6122 (TaskManager RPC)
# rest.port: 8081 (Web UI / REST API)
# blob.server.port: 0 (random) or a fixed port
# query.server.port: 0 (random) or a fixed port

# View RPC bind addresses
cat $FLINK_HOME/conf/flink-conf.yaml | grep "jobmanager.rpc.address"
cat $FLINK_HOME/conf/flink-conf.yaml | grep "taskmanager.host"

# View blob server configuration (code distribution security)
cat $FLINK_HOME/conf/flink-conf.yaml | grep -E "blob.server|blob.storage"

# View query server configuration (Table/SQL queries)
cat $FLINK_HOME/conf/flink-conf.yaml | grep -E "query.server|sql-gateway"

# View firewall rules (Flink ports should be restricted)
iptables -L -n | grep -E "6122|6123|8081"
firewall-cmd --list-all | grep -E "6122|6123|8081"

# View the masters file (high-availability setup)
cat $FLINK_HOME/conf/masters

# View the workers file (TaskManager hosts)
cat $FLINK_HOME/conf/workers
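The 0.0.0.0 binding check can be made mechanical; a minimal sketch (check_bind is an illustrative helper, not part of Flink):

```shell
# check_bind: flag a REST endpoint listening on all interfaces.
check_bind() {
    file=$1
    addr=$(grep -E "^rest.bind-address:" "$file" 2>/dev/null | awk '{print $2}' | tail -1)
    case "$addr" in
        ""|0.0.0.0) echo "FAIL rest.bind-address=${addr:-unset} (bind a specific IP)" ;;
        *)          echo "PASS rest.bind-address=$addr" ;;
    esac
}

# check_bind $FLINK_HOME/conf/flink-conf.yaml
```

An unset value is also reported as FAIL, since the default binds broadly.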

2.2 File System and Storage Access Control

# View checkpoint/savepoint storage security settings
cat $FLINK_HOME/conf/flink-conf.yaml | grep -E "state.backend|checkpoint|savepoint"

# HDFS security settings
cat $FLINK_HOME/conf/flink-conf.yaml | grep -E "fs.hdfs.hadoopconf|hdfs.security"

# S3/OSS security settings
cat $FLINK_HOME/conf/flink-conf.yaml | grep -E "s3.access-key|s3.secret-key|oss.accessKeyId"
# Use IAM roles or encrypted storage; avoid plaintext access/secret keys
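A quick way to surface plaintext credentials is to list credential-looking keys whose value is not a ${...} placeholder; a sketch (the key-name pattern is illustrative and should be extended per site):

```shell
# scan_secrets: print any credential entry that is not a ${...} placeholder.
scan_secrets() {
    file=$1
    hits=$(grep -E "password|secret-key|access-key|accessKeyId" "$file" 2>/dev/null \
        | grep -v "^[[:space:]]*#" \
        | grep -v '\${' || true)
    if [ -n "$hits" ]; then
        echo "$hits"
        echo "FAIL plaintext credentials found above"
    else
        echo "PASS no plaintext credentials found"
    fi
}

# scan_secrets $FLINK_HOME/conf/flink-conf.yaml
```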

# View RocksDB state backend settings
cat $FLINK_HOME/conf/flink-conf.yaml | grep -E "state.backend.rocksdb|rocksdb.security"

# View incremental checkpoint settings
cat $FLINK_HOME/conf/flink-conf.yaml | grep -E "state.incremental|execution.checkpointing.incremental"

# View state TTL settings (data retention policy)
cat $FLINK_HOME/conf/flink-conf.yaml | grep -E "table.exec.state.ttl|state.ttl"

# View state encryption settings (Flink 1.16+)
cat $FLINK_HOME/conf/flink-conf.yaml | grep -E "state.backend.encryption|state.changelog.encryption"

2.3 Job Submission and Execution Control

# View job submission permission controls
cat $FLINK_HOME/conf/flink-conf.yaml | grep -E "web.submit.enable|web.cancel.enable"

# Recommended for production:
# web.submit.enable: false  # disable job submission via the Web UI
# web.cancel.enable: false  # disable job cancellation via the Web UI
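Both flags default to true when left unset, so "not configured" should also count as a finding; a sketch of an aggregated check (check_web_controls is an illustrative helper):

```shell
# check_web_controls: both flags must be explicitly false in production.
check_web_controls() {
    file=$1; bad=0
    for key in web.submit.enable web.cancel.enable; do
        v=$(grep -E "^${key}:" "$file" 2>/dev/null | awk '{print $2}' | tail -1)
        if [ "$v" != "false" ]; then
            echo "FAIL ${key}=${v:-unset (defaults to true)}"
            bad=1
        fi
    done
    [ "$bad" -eq 0 ] && echo "PASS web submission/cancellation disabled"
    return $bad
}

# check_web_controls $FLINK_HOME/conf/flink-conf.yaml
```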

# View job restart settings
cat $FLINK_HOME/conf/flink-conf.yaml | grep -E "execution.restart|restart-strategy"

# View maximum parallelism limits
cat $FLINK_HOME/conf/flink-conf.yaml | grep -E "pipeline.max-parallelism|slot.number"

# View resource quota settings
cat $FLINK_HOME/conf/flink-conf.yaml | grep -E "taskmanager.memory|jobmanager.memory|slot.number"

# View classloader isolation
cat $FLINK_HOME/conf/flink-conf.yaml | grep -E "classloader.resolve-order|classloader.parent-first-patterns"

# View SQL Client security settings
cat $FLINK_HOME/conf/sql-client-defaults.yaml | grep -E "security|auth|password"

3. Security Audit

3.1 Logging and Audit Configuration

| Control Item | Command / Configuration | Compliance Criterion |
| --- | --- | --- |
| Log level | log4j.properties rootLogger | INFO or WARN |
| Audit logs | flink-conf.yaml historyserver.archive.fs.dir | job history archiving enabled |
| Access logs | Web UI access logging | records client IP/user |
| Job logs | flink-conf.yaml env.log.dir | stored separately |
| Log retention | log4j.appender.file.strategy | >= 6 months |

Flink-specific configuration:

# View logging configuration
cat $FLINK_HOME/conf/log4j.properties
cat $FLINK_HOME/conf/log4j-cli.properties
cat $FLINK_HOME/conf/log4j-session.properties

# Key settings:
# rootLogger.level = INFO
# appender.file.type = RollingFile
# appender.file.strategy.type = DefaultRolloverStrategy
# appender.file.strategy.max = 30
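With daily rollover, appender.file.strategy.max roughly equals the number of days retained, so max = 30 falls well short of the six months MLPS expects. A sketch that checks this mechanically (the 180-file threshold assumes one rolled file per day):

```shell
# check_log_retention: compare the rollover "max" against a day-count target.
check_log_retention() {
    file=$1
    min=${2:-180}   # ~6 months, assuming one rolled file per day
    max=$(grep -E "appender\..*\.strategy\.max" "$file" 2>/dev/null \
        | sed 's/.*=[[:space:]]*//' | tail -1)
    if [ -n "$max" ] && [ "$max" -ge "$min" ] 2>/dev/null; then
        echo "PASS rollover keeps $max files"
    else
        echo "FAIL rollover max=${max:-unset} (need >= $min)"
    fi
}

# check_log_retention $FLINK_HOME/conf/log4j.properties
```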

# View log directories
ls -la $FLINK_HOME/log/
ls -la /var/log/flink/ 2>/dev/null

# View log file permissions
ls -la $FLINK_HOME/log/*.log | head -5

# View History Server configuration (key to auditing)
cat $FLINK_HOME/conf/flink-conf.yaml | grep -E "historyserver"

# Key settings:
# historyserver.web.address: 0.0.0.0
# historyserver.web.port: 8082
# historyserver.archive.fs.dir: hdfs:///completed-jobs/
# historyserver.archive.fs.refresh-interval: 10000

# View audit event logging (custom)
cat $FLINK_HOME/conf/log4j.properties | grep -i "audit"

# View metrics configuration (performance auditing)
cat $FLINK_HOME/conf/flink-conf.yaml | grep -E "metrics|prometheus|influxdb"

# View JMX configuration (monitoring/auditing)
cat $FLINK_HOME/conf/flink-conf.yaml | grep -E "jmx|metrics.jmx"

3.2 Job Execution Audit

# View completed jobs (via the History Server)
curl -k https://localhost:8082/jobs/overview 2>/dev/null | jq '.'

# View running jobs
curl -k https://localhost:8081/jobs 2>/dev/null | jq '.'

# View job exceptions (via the REST API)
curl -k https://localhost:8081/jobs/<job-id>/exceptions 2>/dev/null

# View checkpoints (data consistency audit)
curl -k https://localhost:8081/jobs/<job-id>/checkpoints 2>/dev/null

# View savepoints (manual backup audit)
hdfs dfs -ls /flink/savepoints/ 2>/dev/null

# View job configuration (parameter audit)
curl -k https://localhost:8081/jobs/<job-id>/config 2>/dev/null

# View TaskManager logs (execution audit)
ls -la $FLINK_HOME/log/flink-*-taskmanager-*.log

# View JobManager logs (scheduling audit)
ls -la $FLINK_HOME/log/flink-*-jobmanager-*.log
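When jq is unavailable on the audited host, the job IDs (32 hex characters) can still be pulled out of the /jobs response with grep alone; a minimal sketch:

```shell
# extract_job_ids: read a /jobs JSON response on stdin, print one job ID per line.
extract_job_ids() {
    grep -o '"id"[[:space:]]*:[[:space:]]*"[0-9a-f]\{32\}"' \
        | grep -o '[0-9a-f]\{32\}'
}

# Usage against a live cluster: dump the exception history of every job.
# curl -sk https://localhost:8081/jobs | extract_job_ids | while read -r jid; do
#     curl -sk "https://localhost:8081/jobs/$jid/exceptions"
# done
```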

4. Intrusion Prevention

4.1 System Hardening

# Check the Flink version (confirm latest patches)
cat $FLINK_HOME/RELEASE 2>/dev/null
ls $FLINK_HOME/lib/flink-dist-*.jar | grep -oP '\d+\.\d+\.\d+'

# Check for CVE vulnerabilities (requires an external scanner)
# Verify that no component versions with known vulnerabilities are in use

# Check dependency security
ls -la $FLINK_HOME/lib/ | grep -E "log4j|jackson|netty|zookeeper|kafka"

# Log4j vulnerability check (Log4Shell)
ls -la $FLINK_HOME/lib/log4j-core-*.jar
# Version 2.17.1 or later should be used
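The version comparison can be automated with sort -V (GNU coreutils); check_log4j below is an illustrative helper that parses the version out of the jar filename:

```shell
# check_log4j: compare the log4j-core jar version against the 2.17.1 floor.
check_log4j() {
    jar=$1
    min=2.17.1
    ver=$(basename "$jar" | sed -n 's/^log4j-core-\(.*\)\.jar$/\1/p')
    if [ -n "$ver" ] && [ "$(printf '%s\n%s\n' "$min" "$ver" | sort -V | head -1)" = "$min" ]; then
        echo "PASS log4j-core $ver"
    else
        echo "FAIL log4j-core ${ver:-unknown} (need >= $min)"
    fi
}

# for j in $FLINK_HOME/lib/log4j-core-*.jar; do check_log4j "$j"; done
```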

# View serialization security settings
cat $FLINK_HOME/conf/flink-conf.yaml | grep -E "akka.serialization|pipeline.serialization"

# View Akka settings (RPC framework security)
cat $FLINK_HOME/conf/flink-conf.yaml | grep -E "akka|rpc.akka"

# View JMX settings (restrict remote access)
cat $FLINK_HOME/conf/flink-conf.yaml | grep -E "jmx|com.sun.management"

# View debug settings (disable in production)
cat $FLINK_HOME/conf/flink-conf.yaml | grep -E "debug|trace|logging.level"

# Check environment variables (sensitive information)
env | grep -E "FLINK|KEYTAB|PASSWORD|SECRET|TOKEN" | grep -v "HIST"

# Check startup script hardening
cat $FLINK_HOME/bin/flink-daemon.sh | grep -E "umask|chmod|chown"

4.2 Resource Limits and Protection

# View memory settings (guard against OOM abuse)
cat $FLINK_HOME/conf/flink-conf.yaml | grep -E "memory|heap|off-heap"

# JobManager memory
# jobmanager.memory.process.size: 1600m
# jobmanager.memory.heap.size: 1024m
# jobmanager.memory.off-heap.size: 256m

# TaskManager memory
# taskmanager.memory.process.size: 1728m
# taskmanager.memory.flink.size: 1280m
# taskmanager.memory.managed.size: 512m

# View network buffer limits
cat $FLINK_HOME/conf/flink-conf.yaml | grep -E "network.memory|buffer-size"

# View slot sharing group limits
cat $FLINK_HOME/conf/flink-conf.yaml | grep -E "slot.sharing|slot.number"

# View maximum parallelism limits
cat $FLINK_HOME/conf/flink-conf.yaml | grep -E "max-parallelism|pipeline.max-parallelism"

# View file handle limits
cat $FLINK_HOME/conf/flink-conf.yaml | grep -E "fs.allowed-fallback-filesystems|fs.file-limit"

# View GC settings (guard against GC thrashing)
cat $FLINK_HOME/conf/flink-conf.yaml | grep -E "gc|garbage|jvm-opts"

# View timeout settings (guard against resource exhaustion)
cat $FLINK_HOME/conf/flink-conf.yaml | grep -E "timeout|heartbeat|interval"

# Key timeouts:
# akka.ask.timeout: 10s
# akka.lookup.timeout: 10s
# heartbeat.interval: 10000
# heartbeat.timeout: 50000

5. Data Security and Encryption

5.1 Transport Encryption (SSL/TLS)

# View internal transport SSL settings
cat $FLINK_HOME/conf/flink-conf.yaml | grep -E "security.ssl.internal"

# Key settings:
# security.ssl.internal.enabled: true
# security.ssl.internal.keystore: /path/to/internal.keystore
# security.ssl.internal.truststore: /path/to/internal.truststore
# security.ssl.internal.keystore-password: ${internal_keystore_password}
# security.ssl.internal.truststore-password: ${internal_truststore_password}

# View REST API SSL settings
cat $FLINK_HOME/conf/flink-conf.yaml | grep -E "security.ssl.rest"

# Key settings:
# security.ssl.rest.enabled: true
# security.ssl.rest.keystore: /path/to/rest.keystore
# security.ssl.rest.truststore: /path/to/rest.truststore
# security.ssl.rest.keystore-password: ${rest_keystore_password}

# View certificate details
keytool -list -v -keystore $FLINK_HOME/conf/flink.keystore 2>/dev/null | head -20

# View certificate validity period
keytool -list -v -keystore $FLINK_HOME/conf/flink.keystore 2>/dev/null | grep -E "Valid from|until"
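keytool only prints the validity dates; turning them into a remaining-days warning takes one more step. A sketch relying on GNU date -d (the 90-day warning threshold is an assumption, not a requirement from the standard):

```shell
# days_until: whole days from now until the given date (GNU date -d).
days_until() {
    echo $(( ( $(date -d "$1" +%s) - $(date +%s) ) / 86400 ))
}

# Feed it the "until" date reported by keytool, e.g.:
# d=$(days_until "2025-12-31")
# [ "$d" -lt 90 ] && echo "WARN certificate expires in $d days"
```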

# View protocol versions (TLSv1.0/1.1 should be disabled)
cat $FLINK_HOME/conf/flink-conf.yaml | grep -E "ssl.protocol|ssl.algorithms"

# View SM (Chinese national cipher) SSL settings (if using an SM-enabled JDK)
cat $FLINK_HOME/conf/flink-conf.yaml | grep -E "gmssl|sm2|sm3|sm4"

5.2 Data-at-Rest Encryption

# View checkpoint encryption (Flink 1.16+)
cat $FLINK_HOME/conf/flink-conf.yaml | grep -E "state.backend.encryption|state.changelog.encryption"

# View RocksDB encryption settings
cat $FLINK_HOME/conf/flink-conf.yaml | grep -E "rocksdb.encryption|state.backend.rocksdb.options"

# View HDFS transparent encryption integration
cat $FLINK_HOME/conf/flink-conf.yaml | grep -E "hdfs.encryption|dfs.encryption"

# View S3 encryption settings
cat $FLINK_HOME/conf/flink-conf.yaml | grep -E "s3.server-side-encryption|s3.sse"

# View Kafka encryption settings
cat $FLINK_HOME/conf/flink-conf.yaml | grep -E "kafka.ssl|kafka.sasl|kafka.security"

# View data masking settings (via SQL)
cat $FLINK_HOME/conf/sql-client-defaults.yaml | grep -E "mask|masking|desensit"

# Check for sensitive-data handling (custom UDFs)
ls -la $FLINK_HOME/udfs/ 2>/dev/null | grep -iE "mask|encrypt|desensit"

6. High Availability and Disaster Recovery

# View high-availability (HA) settings
cat $FLINK_HOME/conf/flink-conf.yaml | grep -E "high-availability|ha\."

# Key settings:
# high-availability: zookeeper
# high-availability.zookeeper.quorum: zk1:2181,zk2:2181,zk3:2181
# high-availability.zookeeper.path.root: /flink
# high-availability.cluster-id: /default_ns
# high-availability.storageDir: hdfs:///flink/ha/

# View ZooKeeper security settings
cat $FLINK_HOME/conf/flink-conf.yaml | grep -E "zookeeper.sasl|zookeeper.ssl"

# View Kubernetes integration (cloud-native HA)
cat $FLINK_HOME/conf/flink-conf.yaml | grep -E "kubernetes|k8s"

# View checkpoint settings (data recovery)
cat $FLINK_HOME/conf/flink-conf.yaml | grep -E "execution.checkpointing"

# Key settings:
# execution.checkpointing.interval: 60000
# execution.checkpointing.mode: EXACTLY_ONCE
# execution.checkpointing.externalized-checkpoint-retention: RETAIN_ON_CANCELLATION
# state.checkpoints.dir: hdfs:///flink/checkpoints

# View savepoint settings (manual backups)
cat $FLINK_HOME/conf/flink-conf.yaml | grep -E "state.savepoints.dir"

# View state backend settings
cat $FLINK_HOME/conf/flink-conf.yaml | grep -E "state.backend|state.checkpoints.num-retained"

# Verify backups (via the REST API)
curl -k https://localhost:8081/jobs/<job-id>/checkpoints/config 2>/dev/null

# Check recovery test records (operations documentation)
ls -la $FLINK_HOME/restore-tests/ 2>/dev/null
cat $FLINK_HOME/logs/restore-test.log 2>/dev/null

7. Flink Integration with the Big Data Ecosystem

7.1 Hadoop Ecosystem Security

# View Hadoop configuration integration
cat $FLINK_HOME/conf/flink-conf.yaml | grep -E "fs.hdfs.hadoopconf|HADOOP_CONF_DIR"

# View HDFS security settings
cat $FLINK_HOME/conf/flink-conf.yaml | grep -E "dfs.namenode.kerberos|hadoop.security.authentication"

# View YARN security settings
cat $FLINK_HOME/conf/flink-conf.yaml | grep -E "yarn|security.kerberos"

# View Hive security settings
cat $FLINK_HOME/conf/flink-conf.yaml | grep -E "hive.metastore.kerberos|hive.security"

# View HBase security settings
cat $FLINK_HOME/conf/flink-conf.yaml | grep -E "hbase.security|hbase.kerberos"

7.2 Message Queue Security

# View Kafka security settings
cat $FLINK_HOME/conf/flink-conf.yaml | grep -E "kafka.security.protocol|kafka.sasl"

# Key settings:
# kafka.security.protocol: SASL_SSL
# kafka.sasl.mechanism: GSSAPI
# kafka.sasl.kerberos.service.name: kafka

# View Pulsar security settings
cat $FLINK_HOME/conf/flink-conf.yaml | grep -E "pulsar.auth|pulsar.tls"

# View RabbitMQ security settings
cat $FLINK_HOME/conf/flink-conf.yaml | grep -E "rabbitmq.ssl|rabbitmq.username|rabbitmq.password"

# View RocketMQ security settings
cat $FLINK_HOME/conf/flink-conf.yaml | grep -E "rocketmq.access-key|rocketmq.secret-key"

7.3 Database Security

# View JDBC connection security
cat $FLINK_HOME/conf/flink-conf.yaml | grep -E "jdbc.username|jdbc.password"
# Use a connection pool or encrypted storage; avoid plaintext passwords

# View MySQL SSL settings
cat $FLINK_HOME/conf/flink-conf.yaml | grep -E "mysql.ssl|jdbc:mysql://.*ssl"

# View PostgreSQL SSL settings
cat $FLINK_HOME/conf/flink-conf.yaml | grep -E "postgresql.ssl|jdbc:postgresql://.*ssl"

# View Oracle SSL settings
cat $FLINK_HOME/conf/flink-conf.yaml | grep -E "oracle.ssl|jdbc:oracle:.*ssl"

# View Elasticsearch security
cat $FLINK_HOME/conf/flink-conf.yaml | grep -E "elasticsearch.security|es.security"

8. One-Click Inspection Script (Flink)

#!/bin/bash
# Apache Flink MLPS Level 3 one-click inspection script
# Applies to: Flink 1.12.x - 1.18.x

FLINK_HOME=${1:-/opt/flink}
echo "===== Apache Flink MLPS Level 3 Inspection ====="
echo "Time: $(date)"
echo "Hostname: $(hostname)"
echo "FLINK_HOME: $FLINK_HOME"
echo ""

if [ ! -d "$FLINK_HOME" ]; then
    echo "Error: Flink installation directory $FLINK_HOME not found"
    echo "Please pass the correct Flink installation path"
    exit 1
fi

echo "===== 1 Identity Authentication ====="
echo "--- Version info ---"
cat ${FLINK_HOME}/RELEASE 2>/dev/null || echo "RELEASE file not found"
ls ${FLINK_HOME}/lib/flink-dist-*.jar 2>/dev/null | head -1

echo "--- SSL/HTTPS configuration ---"
grep -E "security.ssl.rest.enabled|security.ssl.internal.enabled" ${FLINK_HOME}/conf/flink-conf.yaml 2>/dev/null

echo "--- Authentication configuration ---"
grep -E "security.authentication.enabled|security.kerberos" ${FLINK_HOME}/conf/flink-conf.yaml 2>/dev/null | head -5

echo "--- Kerberos configuration ---"
grep "security.kerberos.login" ${FLINK_HOME}/conf/flink-conf.yaml 2>/dev/null | head -5

echo "--- Web UI binding ---"
grep "rest.bind-address" ${FLINK_HOME}/conf/flink-conf.yaml 2>/dev/null
echo "[WARN] If this is 0.0.0.0, bind a specific IP instead"

echo ""
echo "===== 2 Access Control ====="
echo "--- RPC port configuration ---"
grep -E "jobmanager.rpc.port|taskmanager.rpc.port" ${FLINK_HOME}/conf/flink-conf.yaml 2>/dev/null

echo "--- REST port configuration ---"
grep -E "rest.port|rest.bind-port" ${FLINK_HOME}/conf/flink-conf.yaml 2>/dev/null

echo "--- Job submission controls ---"
grep -E "web.submit.enable|web.cancel.enable" ${FLINK_HOME}/conf/flink-conf.yaml 2>/dev/null
echo "[HINT] Should be set to false in production"

echo "--- High-availability configuration ---"
grep -E "high-availability|ha\." ${FLINK_HOME}/conf/flink-conf.yaml 2>/dev/null | head -5

echo "--- Checkpoint/savepoint directories ---"
grep -E "state.checkpoints.dir|state.savepoints.dir" ${FLINK_HOME}/conf/flink-conf.yaml 2>/dev/null

echo ""
echo "===== 3 Security Audit ====="
echo "--- Logging configuration ---"
grep -E "rootLogger.level|appender.file" ${FLINK_HOME}/conf/log4j.properties 2>/dev/null | head -5

echo "--- Log directory ---"
ls -la ${FLINK_HOME}/log/ 2>/dev/null | head -5

echo "--- History Server configuration ---"
grep "historyserver" ${FLINK_HOME}/conf/flink-conf.yaml 2>/dev/null | head -5

echo "--- Metrics configuration ---"
grep -E "metrics.reporter|prometheus|influxdb" ${FLINK_HOME}/conf/flink-conf.yaml 2>/dev/null | head -5

echo ""
echo "===== 4 Intrusion Prevention ====="
echo "--- Memory configuration ---"
grep -E "jobmanager.memory|taskmanager.memory" ${FLINK_HOME}/conf/flink-conf.yaml 2>/dev/null | head -5

echo "--- Serialization configuration ---"
grep -E "akka.serialization|pipeline.serialization" ${FLINK_HOME}/conf/flink-conf.yaml 2>/dev/null

echo "--- Timeout configuration ---"
grep -E "akka.ask.timeout|heartbeat" ${FLINK_HOME}/conf/flink-conf.yaml 2>/dev/null | head -5

echo "--- Dependency check ---"
ls ${FLINK_HOME}/lib/log4j-core-*.jar 2>/dev/null | head -1
echo "[CHECK] Confirm Log4j version >= 2.17.1 (Log4Shell)"

echo ""
echo "===== 5 Data Security ====="
echo "--- Internal transport SSL ---"
grep "security.ssl.internal" ${FLINK_HOME}/conf/flink-conf.yaml 2>/dev/null | head -5

echo "--- REST API SSL ---"
grep "security.ssl.rest" ${FLINK_HOME}/conf/flink-conf.yaml 2>/dev/null | head -5

echo "--- Certificates ---"
ls -la ${FLINK_HOME}/conf/*.jks ${FLINK_HOME}/conf/*.p12 2>/dev/null | head -3

echo "--- State encryption ---"
grep -E "state.backend.encryption|state.changelog.encryption" ${FLINK_HOME}/conf/flink-conf.yaml 2>/dev/null

echo "--- Kafka security ---"
grep -E "kafka.security|kafka.sasl|kafka.ssl" ${FLINK_HOME}/conf/flink-conf.yaml 2>/dev/null | head -5

echo ""
echo "===== 6 General Security Checks ====="
echo "--- Process owner ---"
ps -ef | grep -E "flink|jobmanager|taskmanager" | grep -v grep | grep java | awk '{print $1}' | sort | uniq -c

echo "--- Config file permissions ---"
ls -la ${FLINK_HOME}/conf/flink-conf.yaml ${FLINK_HOME}/conf/log4j.properties 2>/dev/null | awk '{print $1, $3, $4, $9}'

echo "--- Log file permissions ---"
ls -la ${FLINK_HOME}/log/*.log 2>/dev/null | head -3 | awk '{print $1, $3, $4, $9}'

echo "--- Listening ports ---"
ss -tulnp 2>/dev/null | grep -E "8081|6122|6123" | head -5

echo "--- Environment variables ---"
env | grep -E "FLINK|KEYTAB|PASSWORD|SECRET" | grep -v "HIST" | awk -F= '{print $1}' | head -5
echo "[WARN] Avoid storing sensitive values in environment variables"

echo ""
echo "===== Inspection complete ====="
echo "High-risk items to focus on:"
echo "1. Web UI without HTTPS (security.ssl.rest.enabled=false)"
echo "2. Kerberos authentication not enabled (recommended in production)"
echo "3. REST API bound to 0.0.0.0 (rest.bind-address)"
echo "4. Web job submission allowed (web.submit.enable=true)"
echo "5. Internal SSL encryption not configured (security.ssl.internal.enabled=false)"
echo "6. Vulnerable Log4j version in use"
echo "7. Plaintext passwords or access/secret keys"
echo "8. Checkpoints/savepoints not encrypted"
echo "9. No high availability configured (single point of failure)"
echo "10. Insufficient log retention"

9. High-Risk Item Checklist

| Check Item | Verification Command | Non-compliant If | Remediation |
| --- | --- | --- | --- |
| Web UI without HTTPS | grep security.ssl.rest.enabled | false or unset | enable SSL and configure certificates |
| REST API bound to 0.0.0.0 | grep rest.bind-address | 0.0.0.0 or commented out | bind a specific IP address |
| Web job submission allowed | grep web.submit.enable | true | set to false in production |
| Internal SSL not enabled | grep security.ssl.internal.enabled | false or unset | enable internal transport encryption |
| Kerberos not enabled | grep security.kerberos.login.keytab | unset | enable Kerberos authentication |
| Plaintext credentials | grep -E "password\|secret-key" flink-conf.yaml | plaintext present | use variables or a key management system |
| Outdated Log4j | ls lib/log4j-core-*.jar | < 2.17.1 | upgrade Log4j |
| No HA configured | grep high-availability | unset or none | configure ZooKeeper/K8s HA |
| Checkpoints not encrypted | grep state.backend.encryption | unset | enable state encryption |
| Insufficient log retention | grep max log4j.properties | < 30 or unset | configure rotation, retain >= 6 months |

10. Flink vs. Spark vs. Storm

| Aspect | Apache Flink | Apache Spark | Apache Storm |
| --- | --- | --- | --- |
| Processing model | streaming (native) | batch (micro-batch streaming) | streaming |
| Authentication | Kerberos/SSL | Kerberos/SSL | SASL/SSL |
| Fine-grained security | limited | fairly complete | limited |
| Data encryption | supported (1.16+) | supported | limited |
| Audit logging | basic | fairly complete | basic |
| Kubernetes integration | excellent | good | fair |
| MLPS compliance difficulty | medium | medium | high |
| SM (国密) crypto support | requires configuration | requires configuration | requires configuration |

11. Assessment Execution Notes

1. Security Differences by Deployment Mode

# Standalone mode (development/testing)
# High security risk; not recommended for production

# YARN mode (common in enterprises)
# Relies on YARN security; Kerberos must be configured

# Kubernetes mode (cloud-native)
# Relies on K8s RBAC and network policies

# Native Kubernetes mode (recommended)
# Supported since Flink 1.10; better security posture

2. Key Security Configuration Check

# Recommended production settings in flink-conf.yaml
security.ssl.rest.enabled: true
security.ssl.internal.enabled: true
security.authentication.enabled: true
security.kerberos.login.keytab: /etc/flink/conf/flink.keytab
security.kerberos.login.principal: flink@EXAMPLE.COM

rest.bind-address: 192.168.1.100  # a specific IP, not 0.0.0.0
web.submit.enable: false  # disable web submission
web.cancel.enable: false  # disable web cancellation

state.checkpoints.dir: hdfs:///flink/checkpoints
state.savepoints.dir: hdfs:///flink/savepoints
state.backend.incremental: true
state.backend.rocksdb.memory.managed: true

execution.checkpointing.interval: 60000
execution.checkpointing.mode: EXACTLY_ONCE
execution.checkpointing.externalized-checkpoint-retention: RETAIN_ON_CANCELLATION
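The recommended baseline above can be compared against a live file in one pass; audit_baseline is an illustrative helper, and the key=value list passed to it should mirror site policy:

```shell
# audit_baseline: check a flink-conf.yaml against expected key=value pairs.
audit_baseline() {
    conf=$1; shift
    bad=0
    for kv in "$@"; do
        key=${kv%%=*}
        want=${kv#*=}
        got=$(grep -E "^${key}:" "$conf" 2>/dev/null | awk '{print $2}' | tail -1)
        if [ "$got" = "$want" ]; then
            echo "PASS $key"
        else
            echo "FAIL $key=${got:-unset} (want $want)"
            bad=1
        fi
    done
    return $bad
}

# audit_baseline /opt/flink/conf/flink-conf.yaml \
#     security.ssl.rest.enabled=true security.ssl.internal.enabled=true \
#     web.submit.enable=false web.cancel.enable=false
```

A nonzero exit status signals at least one deviation, which makes the helper easy to wire into CI or a scheduled cron check.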

3. On-Site Interview Points

  • Is Kerberos authentication enabled (mandatory in production)?
  • Is internal SSL encryption configured (to prevent network sniffing)?
  • Is Web UI access restricted (specific IP binding plus HTTPS)?
  • Is web job submission disabled (jobs submitted via CLI/API only)?
  • Is checkpoint/savepoint encryption configured?
  • Are job configurations and state backed up regularly?
  • Are anomalous job submissions and executions monitored?

4. Version Differences

| Feature | Flink 1.12 | Flink 1.14 | Flink 1.16 | Flink 1.18 |
| --- | --- | --- | --- | --- |
| Native K8s | supported | enhanced | mature | recommended |
| State encryption | not supported | experimental | supported | enhanced |
| SQL Gateway | not supported | supported | enhanced | mature |
| Adaptive scheduler | not supported | experimental | supported | enhanced |
| Checkpoint improvements | basic | enhanced | incremental, enhanced | mature |

Reference standards: GB/T 22239-2019, GB/T 28448-2019, Apache Flink Security Documentation

Applicable versions: Flink 1.12.x / 1.13.x / 1.14.x / 1.15.x / 1.16.x / 1.17.x / 1.18.x

Verified environments: x86_64 / ARM64 / Chinese domestic chips (Phytium / Kunpeng / Loongson / Hygon / Zhaoxin / Sunway)

Disclaimer: from Wangwang Virtual Space (汪汪虚拟空间); represents the author's views only. Link: https://eyangzhen.com/7666.html
