Compare commits


3 Commits

Author SHA1 Message Date
孙超
6c62b019a6 build dependencies version lock 2023-02-23 14:55:58 +08:00
fengqiongfeng
d78512f6b7 HA: add high-availability related table schemas 2023-02-23 11:31:55 +08:00
zengqiao
e81c0f3040 v2.8.0_e initialization
1. Test code; open-source users should avoid using it.
2. Contains the Kafka-HA related features.
3. This branch was not cut from 2.6.0; the 2.8.0_e branch was cut from the master branch at commit-id: 462303fca0. The reason is that the v2.6.0 code is not the latest; the latest 2.x code is the code at commit 462303fca0.
2023-02-13 16:35:43 +08:00
179 changed files with 10000 additions and 1675 deletions


@@ -1,20 +1,21 @@
---
![KnowStreaming](https://user-images.githubusercontent.com/71620349/183546097-71451983-d00e-4ad4-afb0-43fb597c69a9.png)
**A one-stop `Apache Kafka` management and control platform**
![logikm_logo](https://user-images.githubusercontent.com/71620349/125024570-9e07a100-e0b3-11eb-8ebc-22e73e056771.png)
`Since it was open-sourced, LogiKM has received a great deal of attention. Considering that the project should align more closely with the future direction of Apache Kafka, after careful deliberation the project team has upgraded its brand to Know Streaming. A major new version is almost ready. Thank you for your continued support, and Kafka enthusiasts are welcome to help build the community together.`
**A one-stop `Apache Kafka` cluster metrics monitoring and operations management platform**
By reading this README you can learn about the user base and product positioning of Didi Know Streaming, and use the demo address to quickly experience the full workflow of Kafka cluster metrics monitoring and operations management.
`Since it was open-sourced, LogiKM has received a great deal of attention. Considering that the project should align more closely with the future direction of Apache Kafka, after careful deliberation the project team plans to upgrade its brand to Know Streaming in the second half of 2022, at which point the project name and logo will also be updated. Thank you for your continued support, and stay tuned.`
By reading this README you can learn about the user base and product positioning of Didi Logi-KafkaManager, and use the demo address to quickly experience the full workflow of Kafka cluster metrics monitoring and operations management.
## 1 Product overview
Didi Know Streaming grew out of years of internal Kafka operations experience at Didi. It is a shared, multi-tenant Kafka cloud platform built for Kafka users and operators, focused on core scenarios such as Kafka operations management, monitoring and alerting, and resource governance, and proven on large clusters with massive data volumes. Internal satisfaction reaches 90%, and commercial partnerships have been established with several well-known companies.
Didi Logi-KafkaManager grew out of years of internal Kafka operations experience at Didi. It is a shared, multi-tenant Kafka cloud platform built for Kafka users and operators, focused on core scenarios such as Kafka operations management, monitoring and alerting, and resource governance, and proven on large clusters with massive data volumes. Internal satisfaction reaches 90%, and commercial partnerships have been established with several well-known companies.
### 1.1 Quick demo address
- Demo address (a new demo address is coming soon): http://117.51.150.133:8080  Account/password: admin/admin
- Demo address: http://117.51.150.133:8080  Account/password: admin/admin
### 1.2 Experience map
Compared with similar products, which mostly offer a single (administrator) perspective, Didi Logi-KafkaManager builds an experience map based on multiple roles and scenario-specific views: the **user experience map, operations experience map, and business-operations experience map**.
@@ -44,7 +45,7 @@
- : monitors multiple core metrics, computes statistics at different quantiles, and provides a rich set of metric reports, helping users and operators locate problems quickly and efficiently
- 便 : uses Region as the unit for dividing cluster resources and groups logical clusters by assurance level, which eases resource isolation and improves scalability while enabling strong control over the server side
- : distills resource-governance methods from years of internal operations practice at Didi and builds a health-score system; for frequent problems such as topic partition hotspots and insufficient partitions, it makes resource governance expert-driven
- : integrates with Prometheus, Grafana, and the Didi Nightingale (夜莺) monitoring and alerting system, bringing together metric analysis, monitoring and alerting, cluster deployment, and cluster upgrades; this forms an operations ecosystem and distills expert services to make operations more efficient
- : integrates with the Didi Nightingale (夜莺) monitoring and alerting system, bringing together monitoring and alerting, cluster deployment, and cluster upgrades; this forms an operations ecosystem and distills expert services to make operations more efficient
### 1.4 Didi Logi-KafkaManager architecture diagram
@@ -54,29 +55,29 @@
## 2 Related documentation
### 2.1 Product documentation
- [Didi Know Streaming installation guide](docs/install_guide/install_guide_cn.md)
- [Didi Know Streaming: adding a cluster](docs/user_guide/add_cluster/add_cluster.md)
- [Didi Know Streaming user guide](docs/user_guide/user_guide_cn.md)
- [Didi Know Streaming FAQ](docs/user_guide/faq.md)
- [Didi LogiKM installation guide](docs/install_guide/install_guide_cn.md)
- [Didi LogiKM: adding a cluster](docs/user_guide/add_cluster/add_cluster.md)
- [Didi LogiKM user guide](docs/user_guide/user_guide_cn.md)
- [Didi LogiKM FAQ](docs/user_guide/faq.md)
### 2.2 Community articles
- [Product introduction on the Didi Cloud website](https://www.didiyun.com/production/logi-KafkaManager.html)
- [Seven years in the making: the Didi Logi log service suite](https://mp.weixin.qq.com/s/-KQp-Qo3WKEOc9wIR2iFnw)
- [Didi Know Streaming: a one-stop Kafka management platform](https://mp.weixin.qq.com/s/9qSZIkqCnU6u9nLMvOOjIQ)
- [Didi Know Streaming: the road to open source](https://xie.infoq.cn/article/0223091a99e697412073c0d64)
- [Didi Know Streaming video tutorial series](https://space.bilibili.com/442531657/channel/seriesdetail?sid=571649)
- [Didi LogiKM: a one-stop Kafka monitoring and management platform](https://mp.weixin.qq.com/s/9qSZIkqCnU6u9nLMvOOjIQ)
- [Didi LogiKM: the road to open source](https://xie.infoq.cn/article/0223091a99e697412073c0d64)
- [Didi LogiKM video tutorial series](https://space.bilibili.com/442531657/channel/seriesdetail?sid=571649)
- [The most complete Kafka knowledge map](https://www.szzdzhp.com/kafka/)
- [Didi Know Streaming getting-started article series for new users, by 石臻臻](https://www.szzdzhp.com/categories/LogIKM/)
- [Kafka in practice (15): a study of Didi's open-source Kafka management platform Know Streaming, by A叶子叶来](https://blog.csdn.net/yezonggang/article/details/113106244)
- [Installing Didi Know Streaming on the cloud-native application management platform Rainbond](https://www.rainbond.com/docs/opensource-app/logikm/?channel=logikm)
- [Didi LogiKM getting-started article series for new users, by 石臻臻](https://www.szzdzhp.com/categories/LogIKM/)
- [Kafka in practice (15): a study of Didi's open-source Kafka management platform LogiKM, by A叶子叶来](https://blog.csdn.net/yezonggang/article/details/113106244)
- [Installing Didi LogiKM on the cloud-native application management platform Rainbond](https://www.rainbond.com/docs/opensource-app/logikm/?channel=logikm)
## 3 Know Streaming open-source user group
## 3 Didi Logi open-source user group
![image](https://user-images.githubusercontent.com/5287750/111266722-e531d800-8665-11eb-9242-3484da5a3099.png)
To discuss Kafka, ES, and other middleware/big-data technologies with the community, please add us on WeChat to join the group.
To join via WeChat: add <font color=red>mike_zhangliang</font> or <font color=red>PenceXie</font> with the note "Know Streaming加群", or follow the official account 云原生可观测性 and reply "Know Streaming加群".
To join via WeChat: add <font color=red>mike_zhangliang</font> or <font color=red>danke-x</font> with the note "Logi加群", or follow the official account 云原生可观测性 and reply "Logi加群".
## 4 Knowledge Planet (知识星球)
@@ -113,9 +114,4 @@ PS: when asking a question, please describe the problem completely in one message and include your environment details
## 6 License
`Know Streaming` is distributed and used under the `Apache-2.0` license; see the [license file](./LICENSE) for more information.
## 7 Star History
[![Star History Chart](https://api.star-history.com/svg?repos=didi/KnowStreaming&type=Date)](https://star-history.com/#didi/KnowStreaming&Date)
`LogiKM` is distributed and used under the `Apache-2.0` license; see the [license file](./LICENSE) for more information.


@@ -592,3 +592,62 @@ CREATE TABLE `work_order` (
`gmt_modify` timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP COMMENT '修改时间',
PRIMARY KEY (`id`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8 COMMENT='工单表';
create table ha_active_standby_relation
(
id bigint unsigned auto_increment comment 'id'
primary key,
active_cluster_phy_id bigint default -1 not null comment '主集群ID',
active_res_name varchar(192) collate utf8_bin default '' not null comment '主资源名称',
standby_cluster_phy_id bigint default -1 not null comment '备集群ID',
standby_res_name varchar(192) collate utf8_bin default '' not null comment '备资源名称',
res_type int default -1 not null comment '资源类型',
status int default -1 not null comment '关系状态',
unique_field varchar(1024) default '' not null comment '唯一字段',
create_time timestamp default CURRENT_TIMESTAMP not null comment '创建时间',
modify_time timestamp default CURRENT_TIMESTAMP not null on update CURRENT_TIMESTAMP comment '修改时间',
kafka_status int default 0 null comment '高可用配置是否完全建立 1:Kafka上该主备关系正常0:Kafka上该主备关系异常',
constraint uniq_unique_field
unique (unique_field)
)
comment 'HA主备关系表' charset = utf8;
create index idx_type_active
on ha_active_standby_relation (res_type, active_cluster_phy_id);
create index idx_type_standby
on ha_active_standby_relation (res_type, standby_cluster_phy_id);
create table ha_active_standby_switch_job
(
id bigint unsigned auto_increment comment 'id'
primary key,
active_cluster_phy_id bigint default -1 not null comment '主集群ID',
standby_cluster_phy_id bigint default -1 not null comment '备集群ID',
job_status int default -1 not null comment '任务状态',
operator varchar(256) default '' not null comment '操作人',
create_time timestamp default CURRENT_TIMESTAMP not null comment '创建时间',
modify_time timestamp default CURRENT_TIMESTAMP not null on update CURRENT_TIMESTAMP comment '修改时间',
type int default 5 not null comment '1:topic 2:实例 3逻辑集群 4物理集群',
active_business_id varchar(100) default '-1' not null comment '主业务id(topicName,实例id,逻辑集群id,物理集群id)',
standby_business_id varchar(100) default '-1' not null comment '备业务id(topicName,实例id,逻辑集群id,物理集群id)'
)
comment 'HA主备关系-切换任务表' charset = utf8;
create table ha_active_standby_switch_sub_job
(
id bigint unsigned auto_increment comment 'id'
primary key,
job_id bigint default -1 not null comment '任务ID',
active_cluster_phy_id bigint default -1 not null comment '主集群ID',
active_res_name varchar(192) collate utf8_bin default '' not null comment '主资源名称',
standby_cluster_phy_id bigint default -1 not null comment '备集群ID',
standby_res_name varchar(192) collate utf8_bin default '' not null comment '备资源名称',
res_type int default -1 not null comment '资源类型',
job_status int default -1 not null comment '任务状态',
extend_data text null comment '扩展数据',
create_time timestamp default CURRENT_TIMESTAMP not null comment '创建时间',
modify_time timestamp default CURRENT_TIMESTAMP not null on update CURRENT_TIMESTAMP comment '修改时间'
)
comment 'HA主备关系切换-子任务表' charset = utf8;


@@ -0,0 +1,97 @@
---
![kafka-manager-logo](../assets/images/common/logo_name.png)
**A one-stop `Apache Kafka` cluster metrics monitoring and operations management platform**
---
# Overview of the Kafka active/standby switchover process
## 1. Client read/write flow
Before walking through the Kafka active/standby switchover process, let's first look at the general read/write flow of a client going through our in-house gateway.
![基于网关的生产消费流程](./assets/Kafka基于网关的生产消费流程.png)
As shown in the figure above, the client read/write flow is roughly:
1. The client requests topic metadata from the gateway.
2. The gateway sees that the KafkaUser used by the client belongs to cluster A, so it forwards the metadata request to cluster A.
3. Cluster A receives the metadata request from the gateway, processes it, and returns the result to the gateway.
4. The gateway returns cluster A's response to the client.
5. From the topic metadata the client learns that the topic actually lives on cluster A, so it connects to cluster A directly for producing and consuming.
**Note: the client is a native Kafka client with no customization.**
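To make this concrete, here is a minimal sketch of how such a native client can be pointed at the gateway instead of at a broker. Everything in it is an assumption for illustration: the gateway address `kafka-gateway.example.com:9092`, the topic `my-topic`, the group `my-group`, and the SASL/PLAIN credentials `appUser`/`appPass` are placeholders, and the SASL settings are only suggested by the constants added in this change (`security.protocol`, `sasl.mechanism`, `sasl.jaas.config`), not confirmed by it.
```bash
# Sketch: stock Kafka CLI clients talking to the gateway (all names are placeholders).
cat > client.properties <<'EOF'
security.protocol=SASL_PLAINTEXT
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="appUser" password="appPass";
EOF

# Produce: the metadata request goes to the gateway, which routes it to the KafkaUser's active cluster.
bin/kafka-console-producer.sh \
  --bootstrap-server kafka-gateway.example.com:9092 \
  --topic my-topic \
  --producer.config client.properties

# Consume through the same entry point.
bin/kafka-console-consumer.sh \
  --bootstrap-server kafka-gateway.example.com:9092 \
  --topic my-topic \
  --group my-group \
  --consumer.config client.properties
```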
---
## 2. Active/standby switchover process
With the gateway-based client read/write flow covered, let's look at how the active/standby HA version of Kafka performs a switchover.
### 2.1 Overall flow
![Kafka主备切换流程](./assets/Kafka主备切换流程.png)
There are quite a few diagrams, but they boil down to:
1. First, block client reads and writes;
2. Wait for active-to-standby data synchronization to complete;
3. Reverse the direction of data synchronization between the active and standby clusters;
4. Update the configuration to direct clients to read and write against the standby cluster.
### 2.2 Detailed operations
Having seen the overall flow, let's look at the actual commands.
```bash
# 1. Block the user's production and consumption
bin/kafka-configs.sh --zookeeper ${ZK address of active cluster A} --entity-type users --entity-name ${KafkaUser used by the client} --add-config didi.ha.active.cluster=None --alter

# 2. Wait for the FetcherLag and Offset to catch up
#    No command is needed here; just check that the offsets of the active and standby topics match.

# 3. Remove the config under which standby cluster B syncs data from active cluster A
bin/kafka-configs.sh --zookeeper ${ZK address of standby cluster B} --entity-type ha-topics --entity-name ${topic name} --delete-config didi.ha.remote.cluster --alter

# 4. Add the config under which active cluster A syncs data from standby cluster B
bin/kafka-configs.sh --zookeeper ${ZK address of active cluster A} --entity-type ha-topics --entity-name ${topic name} --add-config didi.ha.remote.cluster=${cluster ID of standby cluster B} --alter

# 5. In active cluster A, standby cluster B, and the gateway, change the cluster mapped to the KafkaUser, so that requests are routed to the standby cluster
bin/kafka-configs.sh --zookeeper ${ZK address of active cluster A} --entity-type users --entity-name ${KafkaUser used by the client} --add-config didi.ha.active.cluster=${cluster ID of standby cluster B} --alter
bin/kafka-configs.sh --zookeeper ${ZK address of standby cluster B} --entity-type users --entity-name ${KafkaUser used by the client} --add-config didi.ha.active.cluster=${cluster ID of standby cluster B} --alter
bin/kafka-configs.sh --zookeeper ${gateway ZK address} --entity-type users --entity-name ${KafkaUser used by the client} --add-config didi.ha.active.cluster=${cluster ID of standby cluster B} --alter
```
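For step 2, one way to check that the standby topic has caught up is to compare end offsets on both clusters. The sketch below uses the stock `GetOffsetShell` tool; `active-a:9092`, `standby-b:9092`, and `my-topic` are placeholder values, not addresses from this repository.
```bash
# Sketch: print the end offsets (--time -1) of the same topic on both clusters and compare them.
bin/kafka-run-class.sh kafka.tools.GetOffsetShell \
  --broker-list active-a:9092 --topic my-topic --time -1

bin/kafka-run-class.sh kafka.tools.GetOffsetShell \
  --broker-list standby-b:9092 --topic my-topic --time -1

# When the per-partition offsets printed by the two commands match,
# the standby topic has caught up and the switchover can proceed to step 3.
```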
---
## 3. FAQ
**Q1: Is there anything to watch out for when using this?**
1. Switchover is performed at the KafkaUser level, so it is recommended that **different services use different KafkaUsers**. This helps not only with switchovers but also with access control and similar concerns.
2. When establishing active/standby relationships, if the active topics carry a large volume of data, establish the relationships gradually. Creating too many active/standby topic pairs at once forces the active cluster to serve a large amount of replication traffic and puts it under pressure.
&nbsp;
**Q2: If a consuming client restarts, will it end up consuming from the earliest or the latest data?**
No. The active and standby clusters replicate the `__consumer_offsets` topic to each other, so the client's consumption progress on the active cluster is also synced to the standby cluster. When the client consumes on the standby cluster, it resumes from the position last committed on the active cluster.
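As a sanity check, the committed offsets of a consumer group can be inspected on the standby (soon-to-be active) cluster with the stock consumer-groups tool. This is only a sketch; `standby-b:9092` and `my-group` are placeholders.
```bash
# Sketch: confirm that the group's committed offsets were replicated to cluster B.
bin/kafka-consumer-groups.sh \
  --bootstrap-server standby-b:9092 \
  --describe --group my-group
# CURRENT-OFFSET should match the progress last committed on cluster A,
# so consumption resumes where it left off instead of from earliest/latest.
```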
&nbsp;
**Q3: For programs such as Flink jobs that maintain their own consumption progress, can data loss or duplicate consumption occur after a switchover?**
No, as long as Flink manages its consumption progress correctly. Data synchronization between the active and standby clusters works like replica synchronization within a single cluster: the standby cluster replicates the offsets and related information of the active cluster's topics, so neither loss nor duplication is introduced.
&nbsp;
**Q4: Can a switchover be done without restarting the client?**
Phase 2 of the HA Kafka, which is close to completion, will provide this capability. Stay tuned.
&nbsp;

Binary files not shown (two new image files added: 254 KiB and 53 KiB).

@@ -0,0 +1,367 @@
<mxfile host="65bd71144e">
<diagram id="bhaMuW99Q1BzDTtcfRXp" name="Page-1">
<mxGraphModel dx="1384" dy="785" grid="1" gridSize="10" guides="1" tooltips="1" connect="1" arrows="1" fold="1" page="1" pageScale="1" pageWidth="1169" pageHeight="827" math="0" shadow="0">
<root>
<mxCell id="0"/>
<mxCell id="1" parent="0"/>
<mxCell id="81" value="1、主集群拒绝客户端的写入" style="rounded=0;whiteSpace=wrap;html=1;absoluteArcSize=1;arcSize=14;strokeWidth=1;fillColor=#FFFFFF;strokeColor=#d79b00;labelPosition=center;verticalLabelPosition=top;align=center;verticalAlign=bottom;fontSize=16;" vertex="1" parent="1">
<mxGeometry x="630" y="70" width="490" height="380" as="geometry"/>
</mxCell>
<mxCell id="79" value="主备高可用集群稳定时的状态" style="rounded=0;whiteSpace=wrap;html=1;absoluteArcSize=1;arcSize=14;strokeWidth=1;fillColor=#FFFFFF;strokeColor=#d79b00;labelPosition=center;verticalLabelPosition=top;align=center;verticalAlign=bottom;fontSize=16;" vertex="1" parent="1">
<mxGeometry x="30" y="70" width="490" height="380" as="geometry"/>
</mxCell>
<mxCell id="27" value="Kafka——主集群A" style="rounded=0;whiteSpace=wrap;html=1;absoluteArcSize=1;arcSize=14;strokeWidth=1;fillColor=#cdeb8b;strokeColor=#36393d;labelPosition=center;verticalLabelPosition=top;align=center;verticalAlign=bottom;" parent="1" vertex="1">
<mxGeometry x="200" y="100" width="160" height="80" as="geometry"/>
</mxCell>
<mxCell id="32" value="Zookeeper" style="rounded=0;whiteSpace=wrap;html=1;absoluteArcSize=1;arcSize=14;strokeWidth=1;fillColor=#dae8fc;strokeColor=#6c8ebf;" parent="1" vertex="1">
<mxGeometry x="210" y="110" width="140" height="20" as="geometry"/>
</mxCell>
<mxCell id="33" value="Kafka-Brokers" style="rounded=0;whiteSpace=wrap;html=1;absoluteArcSize=1;arcSize=14;strokeWidth=1;fillColor=#ffe6cc;strokeColor=#d79b00;" parent="1" vertex="1">
<mxGeometry x="210" y="150" width="140" height="20" as="geometry"/>
</mxCell>
<mxCell id="36" value="Kafka网关" style="rounded=0;whiteSpace=wrap;html=1;absoluteArcSize=1;arcSize=14;strokeWidth=1;fillColor=#cdeb8b;strokeColor=#36393d;labelPosition=center;verticalLabelPosition=bottom;align=center;verticalAlign=top;" parent="1" vertex="1">
<mxGeometry x="200" y="220" width="160" height="80" as="geometry"/>
</mxCell>
<mxCell id="37" value="Zookeeper" style="rounded=0;whiteSpace=wrap;html=1;absoluteArcSize=1;arcSize=14;strokeWidth=1;fillColor=#dae8fc;strokeColor=#6c8ebf;" parent="1" vertex="1">
<mxGeometry x="210" y="230" width="140" height="20" as="geometry"/>
</mxCell>
<mxCell id="38" value="Kafka-Gateways" style="rounded=0;whiteSpace=wrap;html=1;absoluteArcSize=1;arcSize=14;strokeWidth=1;fillColor=#ffe6cc;strokeColor=#d79b00;" parent="1" vertex="1">
<mxGeometry x="210" y="270" width="140" height="20" as="geometry"/>
</mxCell>
<mxCell id="63" style="edgeStyle=orthogonalEdgeStyle;html=1;exitX=1;exitY=0.5;exitDx=0;exitDy=0;entryX=1;entryY=0.5;entryDx=0;entryDy=0;" edge="1" parent="1" source="39" target="27">
<mxGeometry relative="1" as="geometry">
<Array as="points">
<mxPoint x="440" y="380"/>
<mxPoint x="440" y="140"/>
</Array>
</mxGeometry>
</mxCell>
<mxCell id="64" value="备集群B 不断向 主集群A &lt;br&gt;发送Fetch请求&lt;br&gt;从而同步主集群A的数据" style="edgeLabel;html=1;align=center;verticalAlign=middle;resizable=0;points=[];" vertex="1" connectable="0" parent="63">
<mxGeometry x="-0.05" y="-4" relative="1" as="geometry">
<mxPoint x="6" y="-10" as="offset"/>
</mxGeometry>
</mxCell>
<mxCell id="39" value="Kafka——备集群B" style="rounded=0;whiteSpace=wrap;html=1;absoluteArcSize=1;arcSize=14;strokeWidth=1;fillColor=#cdeb8b;strokeColor=#36393d;labelPosition=center;verticalLabelPosition=bottom;align=center;verticalAlign=top;" parent="1" vertex="1">
<mxGeometry x="200" y="340" width="160" height="80" as="geometry"/>
</mxCell>
<mxCell id="40" value="Zookeeper" style="rounded=0;whiteSpace=wrap;html=1;absoluteArcSize=1;arcSize=14;strokeWidth=1;fillColor=#dae8fc;strokeColor=#6c8ebf;" parent="1" vertex="1">
<mxGeometry x="210" y="350" width="140" height="20" as="geometry"/>
</mxCell>
<mxCell id="41" value="Kafka-Brokers" style="rounded=0;whiteSpace=wrap;html=1;absoluteArcSize=1;arcSize=14;strokeWidth=1;fillColor=#ffe6cc;strokeColor=#d79b00;" parent="1" vertex="1">
<mxGeometry x="210" y="390" width="140" height="20" as="geometry"/>
</mxCell>
<mxCell id="57" style="html=1;entryX=0;entryY=0.5;entryDx=0;entryDy=0;strokeColor=default;startArrow=classic;startFill=1;" parent="1" source="42" target="27" edge="1">
<mxGeometry relative="1" as="geometry"/>
</mxCell>
<mxCell id="58" value="对主集群进行读写" style="edgeLabel;html=1;align=center;verticalAlign=middle;resizable=0;points=[];" parent="57" vertex="1" connectable="0">
<mxGeometry x="-0.0724" y="1" relative="1" as="geometry">
<mxPoint x="-6" as="offset"/>
</mxGeometry>
</mxCell>
<mxCell id="42" value="Kafka-Client" style="rounded=0;whiteSpace=wrap;html=1;absoluteArcSize=1;arcSize=14;strokeWidth=1;fillColor=#ffe6cc;strokeColor=#d79b00;" parent="1" vertex="1">
<mxGeometry x="40" y="240" width="120" height="40" as="geometry"/>
</mxCell>
<mxCell id="65" value="Kafka——主集群A" style="rounded=0;whiteSpace=wrap;html=1;absoluteArcSize=1;arcSize=14;strokeWidth=1;fillColor=#cdeb8b;strokeColor=#36393d;labelPosition=center;verticalLabelPosition=top;align=center;verticalAlign=bottom;" vertex="1" parent="1">
<mxGeometry x="800" y="100" width="160" height="80" as="geometry"/>
</mxCell>
<mxCell id="66" value="Zookeeper(修改ZK)" style="rounded=0;whiteSpace=wrap;html=1;absoluteArcSize=1;arcSize=14;strokeWidth=1;fillColor=#FF3333;strokeColor=#6c8ebf;" vertex="1" parent="1">
<mxGeometry x="810" y="110" width="140" height="20" as="geometry"/>
</mxCell>
<mxCell id="67" value="Kafka-Brokers" style="rounded=0;whiteSpace=wrap;html=1;absoluteArcSize=1;arcSize=14;strokeWidth=1;fillColor=#ffe6cc;strokeColor=#d79b00;" vertex="1" parent="1">
<mxGeometry x="810" y="150" width="140" height="20" as="geometry"/>
</mxCell>
<mxCell id="68" value="Kafka网关" style="rounded=0;whiteSpace=wrap;html=1;absoluteArcSize=1;arcSize=14;strokeWidth=1;fillColor=#cdeb8b;strokeColor=#36393d;labelPosition=center;verticalLabelPosition=bottom;align=center;verticalAlign=top;" vertex="1" parent="1">
<mxGeometry x="800" y="220" width="160" height="80" as="geometry"/>
</mxCell>
<mxCell id="69" value="Zookeeper" style="rounded=0;whiteSpace=wrap;html=1;absoluteArcSize=1;arcSize=14;strokeWidth=1;fillColor=#dae8fc;strokeColor=#6c8ebf;" vertex="1" parent="1">
<mxGeometry x="810" y="230" width="140" height="20" as="geometry"/>
</mxCell>
<mxCell id="70" value="Kafka-Gateways" style="rounded=0;whiteSpace=wrap;html=1;absoluteArcSize=1;arcSize=14;strokeWidth=1;fillColor=#ffe6cc;strokeColor=#d79b00;" vertex="1" parent="1">
<mxGeometry x="810" y="270" width="140" height="20" as="geometry"/>
</mxCell>
<mxCell id="71" style="edgeStyle=orthogonalEdgeStyle;html=1;exitX=1;exitY=0.5;exitDx=0;exitDy=0;entryX=1;entryY=0.5;entryDx=0;entryDy=0;" edge="1" parent="1" source="73" target="65">
<mxGeometry relative="1" as="geometry">
<Array as="points">
<mxPoint x="1040" y="380"/>
<mxPoint x="1040" y="140"/>
</Array>
</mxGeometry>
</mxCell>
<mxCell id="72" value="备集群B 不断向 主集群A&lt;br&gt;发送Fetch请求&lt;br&gt;从而同步主集群A的数据" style="edgeLabel;html=1;align=center;verticalAlign=middle;resizable=0;points=[];" vertex="1" connectable="0" parent="71">
<mxGeometry x="-0.05" y="-4" relative="1" as="geometry">
<mxPoint x="6" y="-10" as="offset"/>
</mxGeometry>
</mxCell>
<mxCell id="73" value="Kafka——备集群B" style="rounded=0;whiteSpace=wrap;html=1;absoluteArcSize=1;arcSize=14;strokeWidth=1;fillColor=#cdeb8b;strokeColor=#36393d;labelPosition=center;verticalLabelPosition=bottom;align=center;verticalAlign=top;" vertex="1" parent="1">
<mxGeometry x="800" y="340" width="160" height="80" as="geometry"/>
</mxCell>
<mxCell id="74" value="Zookeeper" style="rounded=0;whiteSpace=wrap;html=1;absoluteArcSize=1;arcSize=14;strokeWidth=1;fillColor=#dae8fc;strokeColor=#6c8ebf;" vertex="1" parent="1">
<mxGeometry x="810" y="350" width="140" height="20" as="geometry"/>
</mxCell>
<mxCell id="75" value="Kafka-Brokers" style="rounded=0;whiteSpace=wrap;html=1;absoluteArcSize=1;arcSize=14;strokeWidth=1;fillColor=#ffe6cc;strokeColor=#d79b00;" vertex="1" parent="1">
<mxGeometry x="810" y="390" width="140" height="20" as="geometry"/>
</mxCell>
<mxCell id="76" style="html=1;entryX=0;entryY=0.5;entryDx=0;entryDy=0;strokeColor=#FF3333;startArrow=none;startFill=0;strokeWidth=3;endArrow=none;endFill=0;dashed=1;" edge="1" parent="1" source="78" target="65">
<mxGeometry relative="1" as="geometry"/>
</mxCell>
<mxCell id="77" value="对主集群进行读写会出现失败" style="edgeLabel;html=1;align=center;verticalAlign=middle;resizable=0;points=[];fontColor=#FF3333;fontSize=13;" vertex="1" connectable="0" parent="76">
<mxGeometry x="-0.0724" y="1" relative="1" as="geometry">
<mxPoint x="-6" as="offset"/>
</mxGeometry>
</mxCell>
<mxCell id="78" value="Kafka-Client" style="rounded=0;whiteSpace=wrap;html=1;absoluteArcSize=1;arcSize=14;strokeWidth=1;fillColor=#ffe6cc;strokeColor=#d79b00;" vertex="1" parent="1">
<mxGeometry x="640" y="240" width="120" height="40" as="geometry"/>
</mxCell>
<mxCell id="82" value="2、等待主备同步完成避免丢数据" style="rounded=0;whiteSpace=wrap;html=1;absoluteArcSize=1;arcSize=14;strokeWidth=1;fillColor=#FFFFFF;strokeColor=#d79b00;labelPosition=center;verticalLabelPosition=top;align=center;verticalAlign=bottom;fontSize=16;" vertex="1" parent="1">
<mxGeometry x="630" y="590" width="490" height="380" as="geometry"/>
</mxCell>
<mxCell id="83" value="Kafka——主集群A" style="rounded=0;whiteSpace=wrap;html=1;absoluteArcSize=1;arcSize=14;strokeWidth=1;fillColor=#cdeb8b;strokeColor=#36393d;labelPosition=center;verticalLabelPosition=top;align=center;verticalAlign=bottom;" vertex="1" parent="1">
<mxGeometry x="800" y="620" width="160" height="80" as="geometry"/>
</mxCell>
<mxCell id="84" value="Zookeeper" style="rounded=0;whiteSpace=wrap;html=1;absoluteArcSize=1;arcSize=14;strokeWidth=1;fillColor=#dae8fc;strokeColor=#6c8ebf;" vertex="1" parent="1">
<mxGeometry x="810" y="630" width="140" height="20" as="geometry"/>
</mxCell>
<mxCell id="85" value="Kafka-Brokers" style="rounded=0;whiteSpace=wrap;html=1;absoluteArcSize=1;arcSize=14;strokeWidth=1;fillColor=#ffe6cc;strokeColor=#d79b00;" vertex="1" parent="1">
<mxGeometry x="810" y="670" width="140" height="20" as="geometry"/>
</mxCell>
<mxCell id="86" value="Kafka网关" style="rounded=0;whiteSpace=wrap;html=1;absoluteArcSize=1;arcSize=14;strokeWidth=1;fillColor=#cdeb8b;strokeColor=#36393d;labelPosition=center;verticalLabelPosition=bottom;align=center;verticalAlign=top;" vertex="1" parent="1">
<mxGeometry x="800" y="740" width="160" height="80" as="geometry"/>
</mxCell>
<mxCell id="87" value="Zookeeper" style="rounded=0;whiteSpace=wrap;html=1;absoluteArcSize=1;arcSize=14;strokeWidth=1;fillColor=#dae8fc;strokeColor=#6c8ebf;" vertex="1" parent="1">
<mxGeometry x="810" y="750" width="140" height="20" as="geometry"/>
</mxCell>
<mxCell id="88" value="Kafka-Gateways" style="rounded=0;whiteSpace=wrap;html=1;absoluteArcSize=1;arcSize=14;strokeWidth=1;fillColor=#ffe6cc;strokeColor=#d79b00;" vertex="1" parent="1">
<mxGeometry x="810" y="790" width="140" height="20" as="geometry"/>
</mxCell>
<mxCell id="89" style="edgeStyle=orthogonalEdgeStyle;html=1;exitX=1;exitY=0.5;exitDx=0;exitDy=0;entryX=1;entryY=0.5;entryDx=0;entryDy=0;" edge="1" parent="1" source="91" target="83">
<mxGeometry relative="1" as="geometry">
<Array as="points">
<mxPoint x="1040" y="900"/>
<mxPoint x="1040" y="660"/>
</Array>
</mxGeometry>
</mxCell>
<mxCell id="90" value="备集群B 不断向 主集群A&lt;br&gt;发送Fetch请求&lt;br&gt;从而同步主集群A的&lt;br&gt;指定Topic的数据" style="edgeLabel;html=1;align=center;verticalAlign=middle;resizable=0;points=[];" vertex="1" connectable="0" parent="89">
<mxGeometry x="-0.05" y="-4" relative="1" as="geometry">
<mxPoint x="6" y="-10" as="offset"/>
</mxGeometry>
</mxCell>
<mxCell id="91" value="Kafka——备集群B" style="rounded=0;whiteSpace=wrap;html=1;absoluteArcSize=1;arcSize=14;strokeWidth=1;fillColor=#cdeb8b;strokeColor=#36393d;labelPosition=center;verticalLabelPosition=bottom;align=center;verticalAlign=top;" vertex="1" parent="1">
<mxGeometry x="800" y="860" width="160" height="80" as="geometry"/>
</mxCell>
<mxCell id="92" value="Zookeeper" style="rounded=0;whiteSpace=wrap;html=1;absoluteArcSize=1;arcSize=14;strokeWidth=1;fillColor=#dae8fc;strokeColor=#6c8ebf;" vertex="1" parent="1">
<mxGeometry x="810" y="870" width="140" height="20" as="geometry"/>
</mxCell>
<mxCell id="93" value="Kafka-Brokers" style="rounded=0;whiteSpace=wrap;html=1;absoluteArcSize=1;arcSize=14;strokeWidth=1;fillColor=#ffe6cc;strokeColor=#d79b00;" vertex="1" parent="1">
<mxGeometry x="810" y="910" width="140" height="20" as="geometry"/>
</mxCell>
<mxCell id="94" style="html=1;entryX=0;entryY=0.5;entryDx=0;entryDy=0;strokeColor=#FF3333;startArrow=none;startFill=0;strokeWidth=3;endArrow=none;endFill=0;dashed=1;" edge="1" parent="1" source="96" target="83">
<mxGeometry relative="1" as="geometry"/>
</mxCell>
<mxCell id="95" value="对主集群进行读写会出现失败" style="edgeLabel;html=1;align=center;verticalAlign=middle;resizable=0;points=[];fontColor=#FF3333;fontSize=13;" vertex="1" connectable="0" parent="94">
<mxGeometry x="-0.0724" y="1" relative="1" as="geometry">
<mxPoint x="-6" as="offset"/>
</mxGeometry>
</mxCell>
<mxCell id="96" value="Kafka-Client" style="rounded=0;whiteSpace=wrap;html=1;absoluteArcSize=1;arcSize=14;strokeWidth=1;fillColor=#ffe6cc;strokeColor=#d79b00;" vertex="1" parent="1">
<mxGeometry x="640" y="760" width="120" height="40" as="geometry"/>
</mxCell>
<mxCell id="97" value="3、Topic粒度数据同步方向调整由主集群A向备集群B同步数据" style="rounded=0;whiteSpace=wrap;html=1;absoluteArcSize=1;arcSize=14;strokeWidth=1;fillColor=#FFFFFF;strokeColor=#d79b00;labelPosition=center;verticalLabelPosition=top;align=center;verticalAlign=bottom;fontSize=16;" vertex="1" parent="1">
<mxGeometry x="30" y="590" width="490" height="380" as="geometry"/>
</mxCell>
<mxCell id="98" value="Kafka——主集群A" style="rounded=0;whiteSpace=wrap;html=1;absoluteArcSize=1;arcSize=14;strokeWidth=1;fillColor=#cdeb8b;strokeColor=#36393d;labelPosition=center;verticalLabelPosition=top;align=center;verticalAlign=bottom;" vertex="1" parent="1">
<mxGeometry x="200" y="620" width="160" height="80" as="geometry"/>
</mxCell>
<mxCell id="99" value="Zookeeper" style="rounded=0;whiteSpace=wrap;html=1;absoluteArcSize=1;arcSize=14;strokeWidth=1;fillColor=#dae8fc;strokeColor=#6c8ebf;" vertex="1" parent="1">
<mxGeometry x="210" y="630" width="140" height="20" as="geometry"/>
</mxCell>
<mxCell id="100" value="Kafka-Brokers" style="rounded=0;whiteSpace=wrap;html=1;absoluteArcSize=1;arcSize=14;strokeWidth=1;fillColor=#ffe6cc;strokeColor=#d79b00;" vertex="1" parent="1">
<mxGeometry x="210" y="670" width="140" height="20" as="geometry"/>
</mxCell>
<mxCell id="101" value="Kafka网关" style="rounded=0;whiteSpace=wrap;html=1;absoluteArcSize=1;arcSize=14;strokeWidth=1;fillColor=#cdeb8b;strokeColor=#36393d;labelPosition=center;verticalLabelPosition=bottom;align=center;verticalAlign=top;" vertex="1" parent="1">
<mxGeometry x="200" y="740" width="160" height="80" as="geometry"/>
</mxCell>
<mxCell id="102" value="Zookeeper" style="rounded=0;whiteSpace=wrap;html=1;absoluteArcSize=1;arcSize=14;strokeWidth=1;fillColor=#dae8fc;strokeColor=#6c8ebf;" vertex="1" parent="1">
<mxGeometry x="210" y="750" width="140" height="20" as="geometry"/>
</mxCell>
<mxCell id="103" value="Kafka-Gateways" style="rounded=0;whiteSpace=wrap;html=1;absoluteArcSize=1;arcSize=14;strokeWidth=1;fillColor=#ffe6cc;strokeColor=#d79b00;" vertex="1" parent="1">
<mxGeometry x="210" y="790" width="140" height="20" as="geometry"/>
</mxCell>
<mxCell id="104" style="edgeStyle=orthogonalEdgeStyle;html=1;exitX=1;exitY=0.5;exitDx=0;exitDy=0;entryX=1;entryY=0.5;entryDx=0;entryDy=0;endArrow=none;endFill=0;strokeColor=#FF3333;strokeWidth=1;startArrow=classic;startFill=1;" edge="1" parent="1" source="106" target="98">
<mxGeometry relative="1" as="geometry">
<Array as="points">
<mxPoint x="440" y="900"/>
<mxPoint x="440" y="660"/>
</Array>
</mxGeometry>
</mxCell>
<mxCell id="105" value="&lt;span style=&quot;font-size: 11px;&quot;&gt;主集群A 不断向 备集群B&lt;/span&gt;&lt;br style=&quot;font-size: 11px;&quot;&gt;&lt;span style=&quot;font-size: 11px;&quot;&gt;发送Fetch请求&lt;/span&gt;&lt;br style=&quot;font-size: 11px;&quot;&gt;&lt;span style=&quot;font-size: 11px;&quot;&gt;从而同步备集群B的&lt;br&gt;指定Topic的数据&lt;/span&gt;" style="edgeLabel;html=1;align=center;verticalAlign=middle;resizable=0;points=[];fontColor=#FF3333;fontSize=13;" vertex="1" connectable="0" parent="104">
<mxGeometry x="-0.05" y="-4" relative="1" as="geometry">
<mxPoint x="-4" y="-10" as="offset"/>
</mxGeometry>
</mxCell>
<mxCell id="106" value="Kafka——备集群B" style="rounded=0;whiteSpace=wrap;html=1;absoluteArcSize=1;arcSize=14;strokeWidth=1;fillColor=#cdeb8b;strokeColor=#36393d;labelPosition=center;verticalLabelPosition=bottom;align=center;verticalAlign=top;" vertex="1" parent="1">
<mxGeometry x="200" y="860" width="160" height="80" as="geometry"/>
</mxCell>
<mxCell id="107" value="Zookeeper" style="rounded=0;whiteSpace=wrap;html=1;absoluteArcSize=1;arcSize=14;strokeWidth=1;fillColor=#dae8fc;strokeColor=#6c8ebf;" vertex="1" parent="1">
<mxGeometry x="210" y="870" width="140" height="20" as="geometry"/>
</mxCell>
<mxCell id="108" value="Kafka-Brokers" style="rounded=0;whiteSpace=wrap;html=1;absoluteArcSize=1;arcSize=14;strokeWidth=1;fillColor=#ffe6cc;strokeColor=#d79b00;" vertex="1" parent="1">
<mxGeometry x="210" y="910" width="140" height="20" as="geometry"/>
</mxCell>
<mxCell id="109" style="html=1;entryX=0;entryY=0.5;entryDx=0;entryDy=0;strokeColor=#FF3333;startArrow=none;startFill=0;strokeWidth=3;endArrow=none;endFill=0;dashed=1;" edge="1" parent="1" source="111" target="98">
<mxGeometry relative="1" as="geometry"/>
</mxCell>
<mxCell id="110" value="对主集群进行读写会出现失败" style="edgeLabel;html=1;align=center;verticalAlign=middle;resizable=0;points=[];fontColor=#FF3333;fontSize=13;" vertex="1" connectable="0" parent="109">
<mxGeometry x="-0.0724" y="1" relative="1" as="geometry">
<mxPoint x="-6" as="offset"/>
</mxGeometry>
</mxCell>
<mxCell id="111" value="Kafka-Client" style="rounded=0;whiteSpace=wrap;html=1;absoluteArcSize=1;arcSize=14;strokeWidth=1;fillColor=#ffe6cc;strokeColor=#d79b00;" vertex="1" parent="1">
<mxGeometry x="40" y="760" width="120" height="40" as="geometry"/>
</mxCell>
<mxCell id="127" value="4、修改ZK使得客户端使用的KafkaUser对应的集群为备集群B" style="rounded=0;whiteSpace=wrap;html=1;absoluteArcSize=1;arcSize=14;strokeWidth=1;fillColor=#FFFFFF;strokeColor=#d79b00;labelPosition=center;verticalLabelPosition=top;align=center;verticalAlign=bottom;fontSize=16;" vertex="1" parent="1">
<mxGeometry x="30" y="1110" width="490" height="380" as="geometry"/>
</mxCell>
<mxCell id="128" value="Kafka——主集群A" style="rounded=0;whiteSpace=wrap;html=1;absoluteArcSize=1;arcSize=14;strokeWidth=1;fillColor=#cdeb8b;strokeColor=#36393d;labelPosition=center;verticalLabelPosition=top;align=center;verticalAlign=bottom;" vertex="1" parent="1">
<mxGeometry x="200" y="1140" width="160" height="80" as="geometry"/>
</mxCell>
<mxCell id="130" value="Kafka-Brokers" style="rounded=0;whiteSpace=wrap;html=1;absoluteArcSize=1;arcSize=14;strokeWidth=1;fillColor=#ffe6cc;strokeColor=#d79b00;" vertex="1" parent="1">
<mxGeometry x="210" y="1190" width="140" height="20" as="geometry"/>
</mxCell>
<mxCell id="131" value="Kafka网关" style="rounded=0;whiteSpace=wrap;html=1;absoluteArcSize=1;arcSize=14;strokeWidth=1;fillColor=#cdeb8b;strokeColor=#36393d;labelPosition=center;verticalLabelPosition=bottom;align=center;verticalAlign=top;" vertex="1" parent="1">
<mxGeometry x="200" y="1260" width="160" height="80" as="geometry"/>
</mxCell>
<mxCell id="132" value="Zookeeper(修改ZK)" style="rounded=0;whiteSpace=wrap;html=1;absoluteArcSize=1;arcSize=14;strokeWidth=1;fillColor=#FF3333;strokeColor=#6c8ebf;" vertex="1" parent="1">
<mxGeometry x="210" y="1270" width="140" height="20" as="geometry"/>
</mxCell>
<mxCell id="133" value="Kafka-Gateways" style="rounded=0;whiteSpace=wrap;html=1;absoluteArcSize=1;arcSize=14;strokeWidth=1;fillColor=#ffe6cc;strokeColor=#d79b00;" vertex="1" parent="1">
<mxGeometry x="210" y="1310" width="140" height="20" as="geometry"/>
</mxCell>
<mxCell id="134" style="edgeStyle=orthogonalEdgeStyle;html=1;exitX=1;exitY=0.5;exitDx=0;exitDy=0;entryX=1;entryY=0.5;entryDx=0;entryDy=0;endArrow=none;endFill=0;strokeColor=#000000;strokeWidth=1;startArrow=classic;startFill=1;" edge="1" parent="1" source="136" target="128">
<mxGeometry relative="1" as="geometry">
<Array as="points">
<mxPoint x="440" y="1420"/>
<mxPoint x="440" y="1180"/>
</Array>
</mxGeometry>
</mxCell>
<mxCell id="135" value="&lt;span style=&quot;color: rgb(0 , 0 , 0) ; font-size: 11px&quot;&gt;主集群A 不断向 备集群B&lt;/span&gt;&lt;br style=&quot;color: rgb(0 , 0 , 0) ; font-size: 11px&quot;&gt;&lt;span style=&quot;color: rgb(0 , 0 , 0) ; font-size: 11px&quot;&gt;发送Fetch请求&lt;/span&gt;&lt;br style=&quot;color: rgb(0 , 0 , 0) ; font-size: 11px&quot;&gt;&lt;span style=&quot;color: rgb(0 , 0 , 0) ; font-size: 11px&quot;&gt;从而同步备集群B的&lt;br&gt;指定Topic的数据&lt;/span&gt;" style="edgeLabel;html=1;align=center;verticalAlign=middle;resizable=0;points=[];fontColor=#FF3333;fontSize=13;" vertex="1" connectable="0" parent="134">
<mxGeometry x="-0.05" y="-4" relative="1" as="geometry">
<mxPoint x="-4" y="-10" as="offset"/>
</mxGeometry>
</mxCell>
<mxCell id="136" value="Kafka——备集群B" style="rounded=0;whiteSpace=wrap;html=1;absoluteArcSize=1;arcSize=14;strokeWidth=1;fillColor=#cdeb8b;strokeColor=#36393d;labelPosition=center;verticalLabelPosition=bottom;align=center;verticalAlign=top;" vertex="1" parent="1">
<mxGeometry x="200" y="1380" width="160" height="80" as="geometry"/>
</mxCell>
<mxCell id="138" value="Kafka-Brokers" style="rounded=0;whiteSpace=wrap;html=1;absoluteArcSize=1;arcSize=14;strokeWidth=1;fillColor=#ffe6cc;strokeColor=#d79b00;" vertex="1" parent="1">
<mxGeometry x="210" y="1430" width="140" height="20" as="geometry"/>
</mxCell>
<mxCell id="139" style="html=1;entryX=0;entryY=0.5;entryDx=0;entryDy=0;strokeColor=#FF3333;startArrow=none;startFill=0;strokeWidth=3;endArrow=none;endFill=0;dashed=1;" edge="1" parent="1" source="141" target="128">
<mxGeometry relative="1" as="geometry"/>
</mxCell>
<mxCell id="140" value="对主集群进行读写会出现失败" style="edgeLabel;html=1;align=center;verticalAlign=middle;resizable=0;points=[];fontColor=#FF3333;fontSize=13;" vertex="1" connectable="0" parent="139">
<mxGeometry x="-0.0724" y="1" relative="1" as="geometry">
<mxPoint x="-6" as="offset"/>
</mxGeometry>
</mxCell>
<mxCell id="141" value="Kafka-Client" style="rounded=0;whiteSpace=wrap;html=1;absoluteArcSize=1;arcSize=14;strokeWidth=1;fillColor=#ffe6cc;strokeColor=#d79b00;" vertex="1" parent="1">
<mxGeometry x="40" y="1280" width="120" height="40" as="geometry"/>
</mxCell>
<mxCell id="142" value="5、重启客户端网关将请求转向集群B" style="rounded=0;whiteSpace=wrap;html=1;absoluteArcSize=1;arcSize=14;strokeWidth=1;fillColor=#FFFFFF;strokeColor=#d79b00;labelPosition=center;verticalLabelPosition=top;align=center;verticalAlign=bottom;fontSize=16;" vertex="1" parent="1">
<mxGeometry x="630" y="1110" width="490" height="380" as="geometry"/>
</mxCell>
<mxCell id="143" value="Kafka——主集群A" style="rounded=0;whiteSpace=wrap;html=1;absoluteArcSize=1;arcSize=14;strokeWidth=1;fillColor=#cdeb8b;strokeColor=#36393d;labelPosition=center;verticalLabelPosition=top;align=center;verticalAlign=bottom;" vertex="1" parent="1">
<mxGeometry x="800" y="1140" width="160" height="80" as="geometry"/>
</mxCell>
<mxCell id="144" value="Zookeeper" style="rounded=0;whiteSpace=wrap;html=1;absoluteArcSize=1;arcSize=14;strokeWidth=1;fillColor=#dae8fc;strokeColor=#6c8ebf;" vertex="1" parent="1">
<mxGeometry x="810" y="1150" width="140" height="20" as="geometry"/>
</mxCell>
<mxCell id="145" value="Kafka-Brokers" style="rounded=0;whiteSpace=wrap;html=1;absoluteArcSize=1;arcSize=14;strokeWidth=1;fillColor=#ffe6cc;strokeColor=#d79b00;" vertex="1" parent="1">
<mxGeometry x="810" y="1190" width="140" height="20" as="geometry"/>
</mxCell>
<mxCell id="146" value="Kafka网关" style="rounded=0;whiteSpace=wrap;html=1;absoluteArcSize=1;arcSize=14;strokeWidth=1;fillColor=#cdeb8b;strokeColor=#36393d;labelPosition=center;verticalLabelPosition=bottom;align=center;verticalAlign=top;" vertex="1" parent="1">
<mxGeometry x="800" y="1260" width="160" height="80" as="geometry"/>
</mxCell>
<mxCell id="148" value="Kafka-Gateways" style="rounded=0;whiteSpace=wrap;html=1;absoluteArcSize=1;arcSize=14;strokeWidth=1;fillColor=#ffe6cc;strokeColor=#d79b00;" vertex="1" parent="1">
<mxGeometry x="810" y="1310" width="140" height="20" as="geometry"/>
</mxCell>
<mxCell id="149" style="edgeStyle=orthogonalEdgeStyle;html=1;exitX=1;exitY=0.5;exitDx=0;exitDy=0;entryX=1;entryY=0.5;entryDx=0;entryDy=0;endArrow=none;endFill=0;strokeColor=#000000;strokeWidth=1;startArrow=classic;startFill=1;" edge="1" parent="1" source="151" target="143">
<mxGeometry relative="1" as="geometry">
<Array as="points">
<mxPoint x="1040" y="1420"/>
<mxPoint x="1040" y="1180"/>
</Array>
</mxGeometry>
</mxCell>
<mxCell id="150" value="&lt;span style=&quot;color: rgb(0 , 0 , 0) ; font-size: 11px&quot;&gt;主集群A 不断向 备集群B&lt;/span&gt;&lt;br style=&quot;color: rgb(0 , 0 , 0) ; font-size: 11px&quot;&gt;&lt;span style=&quot;color: rgb(0 , 0 , 0) ; font-size: 11px&quot;&gt;发送Fetch请求&lt;/span&gt;&lt;br style=&quot;color: rgb(0 , 0 , 0) ; font-size: 11px&quot;&gt;&lt;span style=&quot;color: rgb(0 , 0 , 0) ; font-size: 11px&quot;&gt;从而同步备集群B的&lt;br&gt;指定Topic的数据&lt;/span&gt;" style="edgeLabel;html=1;align=center;verticalAlign=middle;resizable=0;points=[];fontColor=#FF3333;fontSize=13;" vertex="1" connectable="0" parent="149">
<mxGeometry x="-0.05" y="-4" relative="1" as="geometry">
<mxPoint x="-4" y="-10" as="offset"/>
</mxGeometry>
</mxCell>
<mxCell id="151" value="Kafka——备集群B" style="rounded=0;whiteSpace=wrap;html=1;absoluteArcSize=1;arcSize=14;strokeWidth=1;fillColor=#cdeb8b;strokeColor=#36393d;labelPosition=center;verticalLabelPosition=bottom;align=center;verticalAlign=top;" vertex="1" parent="1">
<mxGeometry x="800" y="1380" width="160" height="80" as="geometry"/>
</mxCell>
<mxCell id="152" value="Zookeeper" style="rounded=0;whiteSpace=wrap;html=1;absoluteArcSize=1;arcSize=14;strokeWidth=1;fillColor=#dae8fc;strokeColor=#6c8ebf;" vertex="1" parent="1">
<mxGeometry x="810" y="1390" width="140" height="20" as="geometry"/>
</mxCell>
<mxCell id="153" value="Kafka-Brokers" style="rounded=0;whiteSpace=wrap;html=1;absoluteArcSize=1;arcSize=14;strokeWidth=1;fillColor=#ffe6cc;strokeColor=#d79b00;" vertex="1" parent="1">
<mxGeometry x="810" y="1430" width="140" height="20" as="geometry"/>
</mxCell>
<mxCell id="156" value="Kafka-Client" style="rounded=0;whiteSpace=wrap;html=1;absoluteArcSize=1;arcSize=14;strokeWidth=1;fillColor=#ffe6cc;strokeColor=#d79b00;" vertex="1" parent="1">
<mxGeometry x="640" y="1280" width="120" height="40" as="geometry"/>
</mxCell>
<mxCell id="157" style="html=1;entryX=0;entryY=0.5;entryDx=0;entryDy=0;strokeColor=default;startArrow=classic;startFill=1;exitX=0.5;exitY=1;exitDx=0;exitDy=0;" edge="1" parent="1" source="156" target="151">
<mxGeometry relative="1" as="geometry">
<mxPoint x="529.9966666666667" y="1400" as="sourcePoint"/>
<mxPoint x="613.3299999999999" y="1300" as="targetPoint"/>
</mxGeometry>
</mxCell>
<mxCell id="158" value="对B集群进行读写" style="edgeLabel;html=1;align=center;verticalAlign=middle;resizable=0;points=[];" vertex="1" connectable="0" parent="157">
<mxGeometry x="-0.0724" y="1" relative="1" as="geometry">
<mxPoint x="-6" as="offset"/>
</mxGeometry>
</mxCell>
<mxCell id="159" value="Zookeeper(修改ZK)" style="rounded=0;whiteSpace=wrap;html=1;absoluteArcSize=1;arcSize=14;strokeWidth=1;fillColor=#FF3333;strokeColor=#6c8ebf;" vertex="1" parent="1">
<mxGeometry x="210" y="1150" width="140" height="20" as="geometry"/>
</mxCell>
<mxCell id="160" value="Zookeeper(修改ZK)" style="rounded=0;whiteSpace=wrap;html=1;absoluteArcSize=1;arcSize=14;strokeWidth=1;fillColor=#FF3333;strokeColor=#6c8ebf;" vertex="1" parent="1">
<mxGeometry x="210" y="1390" width="140" height="20" as="geometry"/>
</mxCell>
<mxCell id="161" value="Zookeeper" style="rounded=0;whiteSpace=wrap;html=1;absoluteArcSize=1;arcSize=14;strokeWidth=1;fillColor=#dae8fc;strokeColor=#6c8ebf;" vertex="1" parent="1">
<mxGeometry x="810" y="1270" width="140" height="20" as="geometry"/>
</mxCell>
<mxCell id="162" value="" style="shape=flexArrow;endArrow=classic;html=1;fontSize=13;fontColor=#FF3333;strokeColor=#000000;strokeWidth=1;fillColor=#9999FF;" edge="1" parent="1">
<mxGeometry width="50" height="50" relative="1" as="geometry">
<mxPoint x="550" y="259.5" as="sourcePoint"/>
<mxPoint x="600" y="259.5" as="targetPoint"/>
</mxGeometry>
</mxCell>
<mxCell id="163" value="" style="shape=flexArrow;endArrow=classic;html=1;fontSize=13;fontColor=#FF3333;strokeColor=#000000;strokeWidth=1;fillColor=#9999FF;" edge="1" parent="1">
<mxGeometry width="50" height="50" relative="1" as="geometry">
<mxPoint x="879.5" y="490" as="sourcePoint"/>
<mxPoint x="879.5" y="540" as="targetPoint"/>
</mxGeometry>
</mxCell>
<mxCell id="164" value="" style="shape=flexArrow;endArrow=classic;html=1;fontSize=13;fontColor=#FF3333;strokeColor=#000000;strokeWidth=1;fillColor=#9999FF;" edge="1" parent="1">
<mxGeometry width="50" height="50" relative="1" as="geometry">
<mxPoint x="274.5" y="1010" as="sourcePoint"/>
<mxPoint x="274.5" y="1060" as="targetPoint"/>
</mxGeometry>
</mxCell>
<mxCell id="165" value="" style="shape=flexArrow;endArrow=classic;html=1;fontSize=13;fontColor=#FF3333;strokeColor=#000000;strokeWidth=1;fillColor=#9999FF;" edge="1" parent="1">
<mxGeometry width="50" height="50" relative="1" as="geometry">
<mxPoint x="550" y="1309" as="sourcePoint"/>
<mxPoint x="600" y="1309" as="targetPoint"/>
</mxGeometry>
</mxCell>
<mxCell id="167" value="" style="shape=flexArrow;endArrow=classic;html=1;fontSize=13;fontColor=#FF3333;strokeColor=#000000;strokeWidth=1;fillColor=#9999FF;" edge="1" parent="1">
<mxGeometry width="50" height="50" relative="1" as="geometry">
<mxPoint x="606" y="779.5" as="sourcePoint"/>
<mxPoint x="550" y="779.5" as="targetPoint"/>
</mxGeometry>
</mxCell>
</root>
</mxGraphModel>
</diagram>
</mxfile>


@@ -0,0 +1,95 @@
<mxfile host="65bd71144e">
<diagram id="bhaMuW99Q1BzDTtcfRXp" name="Page-1">
<mxGraphModel dx="1344" dy="785" grid="1" gridSize="10" guides="1" tooltips="1" connect="1" arrows="1" fold="1" page="1" pageScale="1" pageWidth="1169" pageHeight="827" math="0" shadow="0">
<root>
<mxCell id="0"/>
<mxCell id="1" parent="0"/>
<mxCell id="27" value="Kafka集群--A" style="rounded=0;whiteSpace=wrap;html=1;absoluteArcSize=1;arcSize=14;strokeWidth=1;fillColor=#cdeb8b;strokeColor=#36393d;labelPosition=center;verticalLabelPosition=top;align=center;verticalAlign=bottom;" vertex="1" parent="1">
<mxGeometry x="320" y="40" width="160" height="80" as="geometry"/>
</mxCell>
<mxCell id="32" value="Zookeeper" style="rounded=0;whiteSpace=wrap;html=1;absoluteArcSize=1;arcSize=14;strokeWidth=1;fillColor=#dae8fc;strokeColor=#6c8ebf;" vertex="1" parent="1">
<mxGeometry x="330" y="50" width="140" height="20" as="geometry"/>
</mxCell>
<mxCell id="33" value="Kafka-Brokers" style="rounded=0;whiteSpace=wrap;html=1;absoluteArcSize=1;arcSize=14;strokeWidth=1;fillColor=#ffe6cc;strokeColor=#d79b00;" vertex="1" parent="1">
<mxGeometry x="330" y="90" width="140" height="20" as="geometry"/>
</mxCell>
<mxCell id="47" style="edgeStyle=orthogonalEdgeStyle;html=1;entryX=1;entryY=0.25;entryDx=0;entryDy=0;exitX=1;exitY=0.75;exitDx=0;exitDy=0;" edge="1" parent="1" source="36" target="27">
<mxGeometry relative="1" as="geometry">
<Array as="points">
<mxPoint x="560" y="260"/>
<mxPoint x="560" y="60"/>
</Array>
</mxGeometry>
</mxCell>
<mxCell id="51" value="2、网关发现是A集群的KafkaUser&lt;br&gt;网关将请求转发到A集群" style="edgeLabel;html=1;align=center;verticalAlign=middle;resizable=0;points=[];" vertex="1" connectable="0" parent="47">
<mxGeometry x="-0.0444" y="-1" relative="1" as="geometry">
<mxPoint x="49" y="72" as="offset"/>
</mxGeometry>
</mxCell>
<mxCell id="55" style="edgeStyle=orthogonalEdgeStyle;html=1;exitX=0;exitY=0.5;exitDx=0;exitDy=0;entryX=1;entryY=0.5;entryDx=0;entryDy=0;" edge="1" parent="1" source="36" target="42">
<mxGeometry relative="1" as="geometry"/>
</mxCell>
<mxCell id="56" value="4、网关返回Topic元信息" style="edgeLabel;html=1;align=center;verticalAlign=middle;resizable=0;points=[];" vertex="1" connectable="0" parent="55">
<mxGeometry x="0.2125" relative="1" as="geometry">
<mxPoint x="17" y="-10" as="offset"/>
</mxGeometry>
</mxCell>
<mxCell id="36" value="Kafka网关" style="rounded=0;whiteSpace=wrap;html=1;absoluteArcSize=1;arcSize=14;strokeWidth=1;fillColor=#cdeb8b;strokeColor=#36393d;labelPosition=center;verticalLabelPosition=bottom;align=center;verticalAlign=top;" vertex="1" parent="1">
<mxGeometry x="320" y="200" width="160" height="80" as="geometry"/>
</mxCell>
<mxCell id="37" value="Zookeeper" style="rounded=0;whiteSpace=wrap;html=1;absoluteArcSize=1;arcSize=14;strokeWidth=1;fillColor=#dae8fc;strokeColor=#6c8ebf;" vertex="1" parent="1">
<mxGeometry x="330" y="210" width="140" height="20" as="geometry"/>
</mxCell>
<mxCell id="38" value="Kafka-Gateways" style="rounded=0;whiteSpace=wrap;html=1;absoluteArcSize=1;arcSize=14;strokeWidth=1;fillColor=#ffe6cc;strokeColor=#d79b00;" vertex="1" parent="1">
<mxGeometry x="330" y="250" width="140" height="20" as="geometry"/>
</mxCell>
<mxCell id="39" value="Kafka集群--B" style="rounded=0;whiteSpace=wrap;html=1;absoluteArcSize=1;arcSize=14;strokeWidth=1;fillColor=#cdeb8b;strokeColor=#36393d;labelPosition=center;verticalLabelPosition=bottom;align=center;verticalAlign=top;" vertex="1" parent="1">
<mxGeometry x="320" y="360" width="160" height="80" as="geometry"/>
</mxCell>
<mxCell id="40" value="Zookeeper" style="rounded=0;whiteSpace=wrap;html=1;absoluteArcSize=1;arcSize=14;strokeWidth=1;fillColor=#dae8fc;strokeColor=#6c8ebf;" vertex="1" parent="1">
<mxGeometry x="330" y="370" width="140" height="20" as="geometry"/>
</mxCell>
<mxCell id="41" value="Kafka-Brokers" style="rounded=0;whiteSpace=wrap;html=1;absoluteArcSize=1;arcSize=14;strokeWidth=1;fillColor=#ffe6cc;strokeColor=#d79b00;" vertex="1" parent="1">
<mxGeometry x="330" y="410" width="140" height="20" as="geometry"/>
</mxCell>
<mxCell id="57" style="html=1;entryX=0;entryY=0.5;entryDx=0;entryDy=0;strokeColor=default;startArrow=classic;startFill=1;" edge="1" parent="1" source="42" target="27">
<mxGeometry relative="1" as="geometry"/>
</mxCell>
<mxCell id="58" value="5、通过Topic元信息&lt;br&gt;客户端直接访问A集群进行生产消费" style="edgeLabel;html=1;align=center;verticalAlign=middle;resizable=0;points=[];" vertex="1" connectable="0" parent="57">
<mxGeometry x="-0.0724" y="1" relative="1" as="geometry">
<mxPoint x="-6" as="offset"/>
</mxGeometry>
</mxCell>
<mxCell id="42" value="Kafka-Client" style="rounded=0;whiteSpace=wrap;html=1;absoluteArcSize=1;arcSize=14;strokeWidth=1;fillColor=#ffe6cc;strokeColor=#d79b00;" vertex="1" parent="1">
<mxGeometry x="40" y="220" width="120" height="40" as="geometry"/>
</mxCell>
<mxCell id="48" style="html=1;entryX=0;entryY=0.75;entryDx=0;entryDy=0;exitX=0.5;exitY=1;exitDx=0;exitDy=0;edgeStyle=orthogonalEdgeStyle;" edge="1" parent="1" source="42" target="36">
<mxGeometry relative="1" as="geometry">
<mxPoint x="490" y="250" as="sourcePoint"/>
<mxPoint x="490" y="90" as="targetPoint"/>
</mxGeometry>
</mxCell>
<mxCell id="50" value="1、请求Topic元信息" style="edgeLabel;html=1;align=center;verticalAlign=middle;resizable=0;points=[];" vertex="1" connectable="0" parent="48">
<mxGeometry x="-0.3373" y="-1" relative="1" as="geometry">
<mxPoint x="17" y="7" as="offset"/>
</mxGeometry>
</mxCell>
<mxCell id="49" style="edgeStyle=orthogonalEdgeStyle;html=1;entryX=1;entryY=0.25;entryDx=0;entryDy=0;exitX=1;exitY=0.75;exitDx=0;exitDy=0;" edge="1" parent="1" source="27" target="36">
<mxGeometry relative="1" as="geometry">
<mxPoint x="640" y="60" as="sourcePoint"/>
<mxPoint x="490" y="70" as="targetPoint"/>
<Array as="points">
<mxPoint x="520" y="100"/>
<mxPoint x="520" y="220"/>
</Array>
</mxGeometry>
</mxCell>
<mxCell id="52" value="3、A集群返回&lt;br&gt;Topic元信息给网关" style="edgeLabel;html=1;align=center;verticalAlign=middle;resizable=0;points=[];" vertex="1" connectable="0" parent="49">
<mxGeometry x="-0.03" y="-1" relative="1" as="geometry">
<mxPoint x="-19" y="3" as="offset"/>
</mxGeometry>
</mxCell>
</root>
</mxGraphModel>
</diagram>
</mxfile>


@@ -112,5 +112,15 @@
<artifactId>lombok</artifactId>
<scope>compile</scope>
</dependency>
<dependency>
<groupId>com.baomidou</groupId>
<artifactId>mybatis-plus-boot-starter</artifactId>
</dependency>
<dependency>
<groupId>org.hibernate.validator</groupId>
<artifactId>hibernate-validator</artifactId>
</dependency>
</dependencies>
</project>


@@ -0,0 +1,21 @@
package com.xiaojukeji.kafka.manager.common.bizenum;
import lombok.Getter;
@Getter
public enum JobLogBizTypEnum {
HA_SWITCH_JOB_LOG(100, "HA-主备切换日志"),
UNKNOWN(-1, "unknown"),
;
JobLogBizTypEnum(int code, String msg) {
this.code = code;
this.msg = msg;
}
private final int code;
private final String msg;
}


@@ -1,11 +1,11 @@
package com.xiaojukeji.kafka.manager.kcm.common.bizenum;
package com.xiaojukeji.kafka.manager.common.bizenum;
/**
* 任务动作
* @author zengqiao
* @date 20/4/26
*/
public enum ClusterTaskActionEnum {
public enum TaskActionEnum {
UNKNOWN("unknown"),
START("start"),
@@ -17,13 +17,15 @@ public enum ClusterTaskActionEnum {
REDO("redo"),
KILL("kill"),
FORCE("force"),
ROLLBACK("rollback"),
;
private String action;
private final String action;
ClusterTaskActionEnum(String action) {
TaskActionEnum(String action) {
this.action = action;
}


@@ -1,10 +1,13 @@
package com.xiaojukeji.kafka.manager.common.bizenum;
import lombok.Getter;
/**
* 任务状态
* @author zengqiao
* @date 2017/6/29.
*/
@Getter
public enum TaskStatusEnum {
UNKNOWN( -1, "未知"),
@@ -15,6 +18,7 @@ public enum TaskStatusEnum {
RUNNING( 30, "运行中"),
KILLING( 31, "杀死中"),
RUNNING_IN_TIMEOUT( 32, "超时运行中"),
BLOCKED( 40, "暂停"),
@@ -30,31 +34,15 @@ public enum TaskStatusEnum {
;
private Integer code;
private final Integer code;
private String message;
private final String message;
TaskStatusEnum(Integer code, String message) {
this.code = code;
this.message = message;
}
public Integer getCode() {
return code;
}
public String getMessage() {
return message;
}
@Override
public String toString() {
return "TaskStatusEnum{" +
"code=" + code +
", message='" + message + '\'' +
'}';
}
public static Boolean isFinished(Integer code) {
return code >= FINISHED.getCode();
}


@@ -17,9 +17,9 @@ public enum TopicAuthorityEnum {
OWNER(4, "可管理"),
;
private Integer code;
private final Integer code;
private String message;
private final String message;
TopicAuthorityEnum(Integer code, String message) {
this.code = code;
@@ -34,6 +34,16 @@ public enum TopicAuthorityEnum {
return message;
}
public static String getMsgByCode(Integer code) {
for (TopicAuthorityEnum authorityEnum: TopicAuthorityEnum.values()) {
if (authorityEnum.getCode().equals(code)) {
return authorityEnum.message;
}
}
return DENY.message;
}
@Override
public String toString() {
return "TopicAuthorityEnum{" +


@@ -10,12 +10,11 @@ public enum GatewayConfigKeyEnum {
SD_APP_RATE("SD_APP_RATE", "SD_APP_RATE"),
SD_IP_RATE("SD_IP_RATE", "SD_IP_RATE"),
SD_SP_RATE("SD_SP_RATE", "SD_SP_RATE"),
;
private String configType;
private final String configType;
private String configName;
private final String configName;
GatewayConfigKeyEnum(String configType, String configName) {
this.configType = configType;


@@ -0,0 +1,27 @@
package com.xiaojukeji.kafka.manager.common.bizenum.ha;
import lombok.Getter;
/**
* @author zengqiao
* @date 20/7/28
*/
@Getter
public enum HaRelationTypeEnum {
UNKNOWN(-1, "非高可用"),
STANDBY(0, "备"),
ACTIVE(1, "主"),
MUTUAL_BACKUP(2 , "互备");
private final int code;
private final String msg;
HaRelationTypeEnum(int code, String msg) {
this.code = code;
this.msg = msg;
}
}


@@ -0,0 +1,25 @@
package com.xiaojukeji.kafka.manager.common.bizenum.ha;
import lombok.Getter;
/**
* @author zengqiao
* @date 20/7/28
*/
@Getter
public enum HaResTypeEnum {
CLUSTER(0, "Cluster"),
TOPIC(1, "Topic"),
KAFKA_USER(2, "KafkaUser"),
;
private final int code;
private final String msg;
HaResTypeEnum(int code, String msg) {
this.code = code;
this.msg = msg;
}
}


@@ -0,0 +1,75 @@
package com.xiaojukeji.kafka.manager.common.bizenum.ha;
/**
* @author zengqiao
* @date 20/7/28
*/
public enum HaStatusEnum {
UNKNOWN(-1, "未知状态"),
STABLE(HaStatusEnum.STABLE_CODE, "稳定状态"),
// SWITCHING(HaStatusEnum.SWITCHING_CODE, "切换中"),
SWITCHING_PREPARE(
HaStatusEnum.SWITCHING_PREPARE_CODE,
"主备切换--源集群[%s]--预处理(阻止当前主Topic写入)"),
SWITCHING_WAITING_IN_SYNC(
HaStatusEnum.SWITCHING_WAITING_IN_SYNC_CODE,
"主备切换--目标集群[%s]--等待主与备Topic数据同步完成"),
SWITCHING_CLOSE_OLD_STANDBY_TOPIC_FETCH(
HaStatusEnum.SWITCHING_CLOSE_OLD_STANDBY_TOPIC_FETCH_CODE,
"主备切换--目标集群[%s]--关闭旧的备Topic的副本同步"),
SWITCHING_OPEN_NEW_STANDBY_TOPIC_FETCH(
HaStatusEnum.SWITCHING_OPEN_NEW_STANDBY_TOPIC_FETCH_CODE,
"主备切换--源集群[%s]--开启新的备Topic的副本同步"),
SWITCHING_CLOSEOUT(
HaStatusEnum.SWITCHING_CLOSEOUT_CODE,
"主备切换--目标集群[%s]--收尾(允许新的主Topic写入)"),
;
public static final int UNKNOWN_CODE = -1;
public static final int STABLE_CODE = 0;
public static final int SWITCHING_CODE = 100;
public static final int SWITCHING_PREPARE_CODE = 101;
public static final int SWITCHING_WAITING_IN_SYNC_CODE = 102;
public static final int SWITCHING_CLOSE_OLD_STANDBY_TOPIC_FETCH_CODE = 103;
public static final int SWITCHING_OPEN_NEW_STANDBY_TOPIC_FETCH_CODE = 104;
public static final int SWITCHING_CLOSEOUT_CODE = 105;
private final int code;
private final String msg;
public int getCode() {
return code;
}
public String getMsg(String clusterName) {
if (this.code == UNKNOWN_CODE || this.code == STABLE_CODE) {
return this.msg;
}
return String.format(msg, clusterName);
}
HaStatusEnum(int code, String msg) {
this.code = code;
this.msg = msg;
}
public static Integer calProgress(Integer status) {
if (status == null || status == HaStatusEnum.STABLE_CODE || status == UNKNOWN_CODE) {
return 100;
}
// 最小进度为 1%
return Math.max(1, (status - 101) * 100 / 5);
}
}


@@ -0,0 +1,44 @@
package com.xiaojukeji.kafka.manager.common.bizenum.ha.job;
public enum HaJobActionEnum {
/**
*
*/
START(1,"start"),
STOP(2, "stop"),
CANCEL(3,"cancel"),
CONTINUE(4,"continue"),
UNKNOWN(-1, "unknown");
HaJobActionEnum(int status, String value) {
this.status = status;
this.value = value;
}
private final int status;
private final String value;
public int getStatus() {
return status;
}
public String getValue() {
return value;
}
public static HaJobActionEnum valueOfStatus(int status) {
for (HaJobActionEnum statusEnum : HaJobActionEnum.values()) {
if (status == statusEnum.getStatus()) {
return statusEnum;
}
}
return HaJobActionEnum.UNKNOWN;
}
}


@@ -0,0 +1,75 @@
package com.xiaojukeji.kafka.manager.common.bizenum.ha.job;
import com.xiaojukeji.kafka.manager.common.bizenum.TaskStatusEnum;
public enum HaJobStatusEnum {
/**执行中*/
RUNNING(TaskStatusEnum.RUNNING),
RUNNING_IN_TIMEOUT(TaskStatusEnum.RUNNING_IN_TIMEOUT),
SUCCESS(TaskStatusEnum.SUCCEED),
FAILED(TaskStatusEnum.FAILED),
UNKNOWN(TaskStatusEnum.UNKNOWN);
HaJobStatusEnum(TaskStatusEnum taskStatusEnum) {
this.status = taskStatusEnum.getCode();
this.value = taskStatusEnum.getMessage();
}
private final int status;
private final String value;
public int getStatus() {
return status;
}
public String getValue() {
return value;
}
public static HaJobStatusEnum valueOfStatus(int status) {
for (HaJobStatusEnum statusEnum : HaJobStatusEnum.values()) {
if (status == statusEnum.getStatus()) {
return statusEnum;
}
}
return HaJobStatusEnum.UNKNOWN;
}
public static HaJobStatusEnum getStatusBySubStatus(int totalJobNum,
int successJobNu,
int failedJobNu,
int runningJobNu,
int runningInTimeoutJobNu,
int unknownJobNu) {
if (unknownJobNu > 0) {
return UNKNOWN;
}
if((failedJobNu + runningJobNu + runningInTimeoutJobNu + unknownJobNu) == 0) {
return SUCCESS;
}
if((runningJobNu + runningInTimeoutJobNu + unknownJobNu) == 0 && failedJobNu > 0) {
return FAILED;
}
if (runningInTimeoutJobNu > 0) {
return RUNNING_IN_TIMEOUT;
}
return RUNNING;
}
public static boolean isRunning(Integer jobStatus) {
return jobStatus != null && (RUNNING.status == jobStatus || RUNNING_IN_TIMEOUT.status == jobStatus);
}
public static boolean isFinished(Integer jobStatus) {
return jobStatus != null && (SUCCESS.status == jobStatus || FAILED.status == jobStatus);
}
}


@@ -31,6 +31,8 @@ public class ConfigConstant {
public static final String KAFKA_CLUSTER_DO_CONFIG_KEY = "KAFKA_CLUSTER_DO_CONFIG";
public static final String HA_SWITCH_JOB_TIMEOUT_UNIT_SEC_CONFIG_PREFIX = "HA_SWITCH_JOB_TIMEOUT_UNIT_SEC_CONFIG_CLUSTER";
private ConfigConstant() {
}
}


@@ -21,6 +21,32 @@ public class KafkaConstant {
public static final String INTERNAL_KEY = "INTERNAL";
public static final String BOOTSTRAP_SERVERS = "bootstrap.servers";
/**
* HA
*/
public static final String DIDI_KAFKA_ENABLE = "didi.kafka.enable";
public static final String DIDI_HA_REMOTE_CLUSTER = "didi.ha.remote.cluster";
// TODO 平台来管理配置,不需要底层来管理,因此可以删除该配置
public static final String DIDI_HA_SYNC_TOPIC_CONFIGS_ENABLED = "didi.ha.sync.topic.configs.enabled";
public static final String DIDI_HA_ACTIVE_CLUSTER = "didi.ha.active.cluster";
public static final String DIDI_HA_REMOTE_TOPIC = "didi.ha.remote.topic";
public static final String SECURITY_PROTOCOL = "security.protocol";
public static final String SASL_MECHANISM = "sasl.mechanism";
public static final String SASL_JAAS_CONFIG = "sasl.jaas.config";
public static final String NONE = "None";
private KafkaConstant() {
}
}


@@ -0,0 +1,96 @@
package com.xiaojukeji.kafka.manager.common.constant;
/**
* 信息模版Constant
* @author zengqiao
* @date 22/03/03
*/
public class MsgConstant {
private MsgConstant() {
}
/**************************************************** Cluster ****************************************************/
public static String getClusterBizStr(Long clusterPhyId, String clusterName){
return String.format("集群ID:[%d] 集群名称:[%s]", clusterPhyId, clusterName);
}
public static String getClusterPhyNotExist(Long clusterPhyId) {
return String.format("集群ID:[%d] 不存在或者未加载", clusterPhyId);
}
/**************************************************** Broker ****************************************************/
public static String getBrokerNotExist(Long clusterPhyId, Integer brokerId) {
return String.format("集群ID:[%d] brokerId:[%d] 不存在或未存活", clusterPhyId, brokerId);
}
public static String getBrokerBizStr(Long clusterPhyId, Integer brokerId) {
return String.format("集群ID:[%d] brokerId:[%d]", clusterPhyId, brokerId);
}
/**************************************************** Topic ****************************************************/
public static String getTopicNotExist(Long clusterPhyId, String topicName) {
return String.format("集群ID:[%d] Topic名称:[%s] 不存在", clusterPhyId, topicName);
}
public static String getTopicBizStr(Long clusterPhyId, String topicName) {
return String.format("集群ID:[%d] Topic名称:[%s]", clusterPhyId, topicName);
}
public static String getTopicExtend(Long existPartitionNum, Long totalPartitionNum,String expandParam){
return String.format("新增分区, 从:[%d] 增加到:[%d], 详细参数信息:[%s]", existPartitionNum,totalPartitionNum,expandParam);
}
public static String getClusterTopicKey(Long clusterPhyId, String topicName) {
return String.format("%d@%s", clusterPhyId, topicName);
}
/**************************************************** Partition ****************************************************/
public static String getPartitionNotExist(Long clusterPhyId, String topicName) {
return String.format("集群ID:[%d] Topic名称:[%s] 存在非法的分区ID", clusterPhyId, topicName);
}
public static String getPartitionNotExist(Long clusterPhyId, String topicName, Integer partitionId) {
return String.format("集群ID:[%d] Topic名称:[%s] 分区Id:[%d] 不存在", clusterPhyId, topicName, partitionId);
}
/**************************************************** KafkaUser ****************************************************/
public static String getKafkaUserBizStr(Long clusterPhyId, String kafkaUser) {
return String.format("集群ID:[%d] kafkaUser:[%s]", clusterPhyId, kafkaUser);
}
public static String getKafkaUserNotExist(Long clusterPhyId, String kafkaUser) {
return String.format("集群ID:[%d] kafkaUser:[%s] 不存在", clusterPhyId, kafkaUser);
}
public static String getKafkaUserDuplicate(Long clusterPhyId, String kafkaUser) {
return String.format("集群ID:[%d] kafkaUser:[%s] 已存在", clusterPhyId, kafkaUser);
}
/**************************************************** ha-Cluster ****************************************************/
public static String getActiveClusterDuplicate(Long clusterPhyId, String clusterName) {
return String.format("集群ID:[%d] 主集群:[%s] 已存在", clusterPhyId, clusterName);
}
/**************************************************** reassign ****************************************************/
public static String getReassignJobBizStr(Long jobId, Long clusterPhyId) {
return String.format("任务Id:[%d] 集群ID:[%s]", jobId, clusterPhyId);
}
public static String getJobIdCanNotNull() {
return "jobId不允许为空";
}
public static String getJobNotExist(Long jobId) {
return String.format("jobId:[%d] 不存在", jobId);
}
}

View File

@@ -0,0 +1,28 @@
package com.xiaojukeji.kafka.manager.common.entity;
import com.xiaojukeji.kafka.manager.common.constant.Constant;
import io.swagger.annotations.ApiModelProperty;
import lombok.Data;
import lombok.ToString;
import java.io.Serializable;
@Data
@ToString
public class BaseResult implements Serializable {
private static final long serialVersionUID = -5771016784021901099L;
@ApiModelProperty(value = "信息", example = "成功")
protected String message;
@ApiModelProperty(value = "状态", example = "0")
protected int code;
public boolean successful() {
return !this.failed();
}
public boolean failed() {
return !Constant.SUCCESS.equals(code);
}
}

View File

@@ -1,21 +1,23 @@
package com.xiaojukeji.kafka.manager.common.entity;
import com.alibaba.fastjson.JSON;
import com.xiaojukeji.kafka.manager.common.constant.Constant;
import java.io.Serializable;
import io.swagger.annotations.ApiModel;
import io.swagger.annotations.ApiModelProperty;
import lombok.Data;
/**
* @author huangyiminghappy@163.com
* @date 2019-07-08
*/
public class Result<T> implements Serializable {
private static final long serialVersionUID = -2772975319944108658L;
@Data
@ApiModel(description = "调用结果")
public class Result<T> extends BaseResult {
@ApiModelProperty(value = "数据")
protected T data;
private T data;
private String message;
private String tips;
private int code;
public Result() {
this.code = ResultStatus.SUCCESS.getCode();
this.message = ResultStatus.SUCCESS.getMessage();
}
public Result(T data) {
this.data = data;
@@ -23,10 +25,6 @@ public class Result<T> implements Serializable {
this.message = ResultStatus.SUCCESS.getMessage();
}
public Result() {
this(null);
}
public Result(Integer code, String message) {
this.message = message;
this.code = code;
@@ -38,48 +36,31 @@ public class Result<T> implements Serializable {
this.code = code;
}
public T getData()
{
return (T)this.data;
public static <T> Result<T> build(boolean succ) {
if (succ) {
return buildSuc();
}
return buildFail();
}
public void setData(T data)
{
this.data = data;
public static <T> Result<T> buildFail() {
Result<T> result = new Result<>();
result.setCode(ResultStatus.FAIL.getCode());
result.setMessage(ResultStatus.FAIL.getMessage());
return result;
}
public String getMessage()
{
return this.message;
}
public void setMessage(String message)
{
this.message = message;
}
public String getTips() {
return tips;
}
public void setTips(String tips) {
this.tips = tips;
}
public int getCode()
{
return this.code;
}
public void setCode(int code)
{
this.code = code;
}
@Override
public String toString()
{
return JSON.toJSONString(this);
public static <T> Result<T> build(boolean succ, T data) {
Result<T> result = new Result<>();
if (succ) {
result.setCode(ResultStatus.SUCCESS.getCode());
result.setMessage(ResultStatus.SUCCESS.getMessage());
result.setData(data);
} else {
result.setCode(ResultStatus.FAIL.getCode());
result.setMessage(ResultStatus.FAIL.getMessage());
}
return result;
}
public static <T> Result<T> buildSuc() {
@@ -97,14 +78,6 @@ public class Result<T> implements Serializable {
return result;
}
public static <T> Result<T> buildGatewayFailure(String message) {
Result<T> result = new Result<>();
result.setCode(ResultStatus.GATEWAY_INVALID_REQUEST.getCode());
result.setMessage(message);
result.setData(null);
return result;
}
public static <T> Result<T> buildFailure(String message) {
Result<T> result = new Result<>();
result.setCode(ResultStatus.FAIL.getCode());
@@ -113,10 +86,34 @@ public class Result<T> implements Serializable {
return result;
}
public static <T> Result<T> buildFrom(ResultStatus resultStatus) {
public static <T> Result<T> buildFailure(String message, T data) {
Result<T> result = new Result<>();
result.setCode(resultStatus.getCode());
result.setMessage(resultStatus.getMessage());
result.setCode(ResultStatus.FAIL.getCode());
result.setMessage(message);
result.setData(data);
return result;
}
public static <T> Result<T> buildFailure(ResultStatus rs) {
Result<T> result = new Result<>();
result.setCode(rs.getCode());
result.setMessage(rs.getMessage());
result.setData(null);
return result;
}
public static <T> Result<T> buildGatewayFailure(String message) {
Result<T> result = new Result<>();
result.setCode(ResultStatus.GATEWAY_INVALID_REQUEST.getCode());
result.setMessage(message);
result.setData(null);
return result;
}
public static <T> Result<T> buildFrom(ResultStatus rs) {
Result<T> result = new Result<>();
result.setCode(rs.getCode());
result.setMessage(rs.getMessage());
return result;
}
@@ -128,8 +125,46 @@ public class Result<T> implements Serializable {
return result;
}
public boolean failed() {
return !Constant.SUCCESS.equals(code);
public static <T> Result<T> buildFromRSAndMsg(ResultStatus resultStatus, String message) {
Result<T> result = new Result<>();
result.setCode(resultStatus.getCode());
result.setMessage(message);
result.setData(null);
return result;
}
public static <T> Result<T> buildFromRSAndData(ResultStatus rs, T data) {
Result<T> result = new Result<>();
result.setCode(rs.getCode());
result.setMessage(rs.getMessage());
result.setData(data);
return result;
}
public static <T, U> Result<T> buildFromIgnoreData(Result<U> anotherResult) {
Result<T> result = new Result<>();
result.setCode(anotherResult.getCode());
result.setMessage(anotherResult.getMessage());
return result;
}
public static <T> Result<T> buildParamIllegal(String msg) {
Result<T> result = new Result<>();
result.setCode(ResultStatus.PARAM_ILLEGAL.getCode());
result.setMessage(ResultStatus.PARAM_ILLEGAL.getMessage() + ":" + msg + ",请检查后再提交!");
return result;
}
public boolean hasData(){
return !failed() && this.data != null;
}
@Override
public String toString() {
return "Result{" +
"message='" + message + '\'' +
", code=" + code +
", data=" + data +
'}';
}
}
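A minimal usage sketch (illustrative only; the error message is invented) of the static builders this refactor settles on:

Result<String> ok = Result.buildSuc();                             // code = SUCCESS
Result<String> bad = Result.buildFailure("zookeeper read failed"); // code = FAIL, custom message
if (bad.failed()) {
    // propagate the failure into another result type, keeping code and message but dropping data
    Result<Long> forwarded = Result.buildFromIgnoreData(bad);
}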

View File

@@ -23,6 +23,8 @@ public enum ResultStatus {
API_CALL_EXCEED_LIMIT(1403, "api call exceed limit"),
USER_WITHOUT_AUTHORITY(1404, "user without authority"),
CHANGE_ZOOKEEPER_FORBIDDEN(1405, "change zookeeper forbidden"),
HA_CLUSTER_DELETE_FORBIDDEN(1409, "先删除主topic才能删除该集群"),
HA_TOPIC_DELETE_FORBIDDEN(1410, "先解除高可用关系才能删除该topic"),
APP_OFFLINE_FORBIDDEN(1406, "先下线topic才能下线应用"),
@@ -76,6 +78,8 @@ public enum ResultStatus {
QUOTA_NOT_EXIST(7113, "quota not exist, please check clusterId, topicName and appId"),
CONSUMER_GROUP_NOT_EXIST(7114, "consumerGroup not exist"),
TOPIC_BIZ_DATA_NOT_EXIST(7115, "topic biz data not exist, please sync topic to db"),
SD_ZK_NOT_EXIST(7116, "SD_ZK未配置"),
// Resource already exists
RESOURCE_ALREADY_EXISTED(7200, "资源已经存在"),
@@ -88,6 +92,7 @@ public enum ResultStatus {
RESOURCE_ALREADY_USED(7400, "资源早已被使用"),
/**
* Errors during operations caused by problems in external systems, [8000, 9000)
* ------------------------------------------------------------------------------------------
@@ -98,6 +103,7 @@ public enum ResultStatus {
ZOOKEEPER_READ_FAILED(8021, "zookeeper read failed"),
ZOOKEEPER_WRITE_FAILED(8022, "zookeeper write failed"),
ZOOKEEPER_DELETE_FAILED(8023, "zookeeper delete failed"),
ZOOKEEPER_OPERATE_FAILED(8024, "zookeeper operate failed"),
// Failed to call the agent inside the cluster task
CALL_CLUSTER_TASK_AGENT_FAILED(8030, " call cluster task agent failed"),

View File

@@ -1,11 +1,14 @@
package com.xiaojukeji.kafka.manager.common.entity.ao;
import lombok.Data;
import java.util.Date;
/**
* @author zengqiao
* @date 20/4/23
*/
@Data
public class ClusterDetailDTO {
private Long clusterId;
@@ -41,141 +44,9 @@ public class ClusterDetailDTO {
private Integer regionNum;
public Long getClusterId() {
return clusterId;
}
private Integer haRelation;
public void setClusterId(Long clusterId) {
this.clusterId = clusterId;
}
public String getClusterName() {
return clusterName;
}
public void setClusterName(String clusterName) {
this.clusterName = clusterName;
}
public String getZookeeper() {
return zookeeper;
}
public void setZookeeper(String zookeeper) {
this.zookeeper = zookeeper;
}
public String getBootstrapServers() {
return bootstrapServers;
}
public void setBootstrapServers(String bootstrapServers) {
this.bootstrapServers = bootstrapServers;
}
public String getKafkaVersion() {
return kafkaVersion;
}
public void setKafkaVersion(String kafkaVersion) {
this.kafkaVersion = kafkaVersion;
}
public String getIdc() {
return idc;
}
public void setIdc(String idc) {
this.idc = idc;
}
public Integer getMode() {
return mode;
}
public void setMode(Integer mode) {
this.mode = mode;
}
public String getSecurityProperties() {
return securityProperties;
}
public void setSecurityProperties(String securityProperties) {
this.securityProperties = securityProperties;
}
public String getJmxProperties() {
return jmxProperties;
}
public void setJmxProperties(String jmxProperties) {
this.jmxProperties = jmxProperties;
}
public Integer getStatus() {
return status;
}
public void setStatus(Integer status) {
this.status = status;
}
public Date getGmtCreate() {
return gmtCreate;
}
public void setGmtCreate(Date gmtCreate) {
this.gmtCreate = gmtCreate;
}
public Date getGmtModify() {
return gmtModify;
}
public void setGmtModify(Date gmtModify) {
this.gmtModify = gmtModify;
}
public Integer getBrokerNum() {
return brokerNum;
}
public void setBrokerNum(Integer brokerNum) {
this.brokerNum = brokerNum;
}
public Integer getTopicNum() {
return topicNum;
}
public void setTopicNum(Integer topicNum) {
this.topicNum = topicNum;
}
public Integer getConsumerGroupNum() {
return consumerGroupNum;
}
public void setConsumerGroupNum(Integer consumerGroupNum) {
this.consumerGroupNum = consumerGroupNum;
}
public Integer getControllerId() {
return controllerId;
}
public void setControllerId(Integer controllerId) {
this.controllerId = controllerId;
}
public Integer getRegionNum() {
return regionNum;
}
public void setRegionNum(Integer regionNum) {
this.regionNum = regionNum;
}
private String mutualBackupClusterName;
@Override
public String toString() {
@@ -197,6 +68,8 @@ public class ClusterDetailDTO {
", consumerGroupNum=" + consumerGroupNum +
", controllerId=" + controllerId +
", regionNum=" + regionNum +
", haRelation=" + haRelation +
", mutualBackupClusterName='" + mutualBackupClusterName + '\'' +
'}';
}
}

View File

@@ -1,5 +1,7 @@
package com.xiaojukeji.kafka.manager.common.entity.ao;
import lombok.Data;
import java.util.List;
import java.util.Properties;
@@ -7,6 +9,7 @@ import java.util.Properties;
* @author zengqiao
* @date 20/6/10
*/
@Data
public class RdTopicBasic {
private Long clusterId;
@@ -26,77 +29,7 @@ public class RdTopicBasic {
private List<String> regionNameList;
public Long getClusterId() {
return clusterId;
}
public void setClusterId(Long clusterId) {
this.clusterId = clusterId;
}
public String getClusterName() {
return clusterName;
}
public void setClusterName(String clusterName) {
this.clusterName = clusterName;
}
public String getTopicName() {
return topicName;
}
public void setTopicName(String topicName) {
this.topicName = topicName;
}
public Long getRetentionTime() {
return retentionTime;
}
public void setRetentionTime(Long retentionTime) {
this.retentionTime = retentionTime;
}
public String getAppId() {
return appId;
}
public void setAppId(String appId) {
this.appId = appId;
}
public String getAppName() {
return appName;
}
public void setAppName(String appName) {
this.appName = appName;
}
public Properties getProperties() {
return properties;
}
public void setProperties(Properties properties) {
this.properties = properties;
}
public String getDescription() {
return description;
}
public void setDescription(String description) {
this.description = description;
}
public List<String> getRegionNameList() {
return regionNameList;
}
public void setRegionNameList(List<String> regionNameList) {
this.regionNameList = regionNameList;
}
private Integer haRelation;
@Override
public String toString() {
@@ -109,7 +42,8 @@ public class RdTopicBasic {
", appName='" + appName + '\'' +
", properties=" + properties +
", description='" + description + '\'' +
", regionNameList='" + regionNameList + '\'' +
", regionNameList=" + regionNameList +
", haRelation=" + haRelation +
'}';
}
}

View File

@@ -0,0 +1,54 @@
package com.xiaojukeji.kafka.manager.common.entity.ao.ha;
import com.xiaojukeji.kafka.manager.common.bizenum.ha.HaStatusEnum;
import lombok.Data;
import java.util.HashMap;
import java.util.Map;
@Data
public class HaSwitchTopic {
/**
* Whether the switch has finished
*/
private boolean finished;
/**
* Status of each Topic
*/
private Map<String, Integer> activeTopicSwitchStatusMap;
public HaSwitchTopic(boolean finished) {
this.finished = finished;
this.activeTopicSwitchStatusMap = new HashMap<>();
}
public void addHaSwitchTopic(HaSwitchTopic haSwitchTopic) {
this.finished &= haSwitchTopic.finished;
}
public boolean isFinished() {
return this.finished;
}
public void addActiveTopicStatus(String activeTopicName, Integer status) {
activeTopicSwitchStatusMap.put(activeTopicName, status);
}
public boolean isActiveTopicSwitchFinished(String activeTopicName) {
Integer status = activeTopicSwitchStatusMap.get(activeTopicName);
if (status == null) {
return false;
}
return status.equals(HaStatusEnum.STABLE.getCode());
}
@Override
public String toString() {
return "HaSwitchTopic{" +
"finished=" + finished +
", activeTopicSwitchStatusMap=" + activeTopicSwitchStatusMap +
'}';
}
}
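A brief sketch (illustrative only; the topic name is made up) of how per-topic switch progress is rolled up by the class above:

HaSwitchTopic overall = new HaSwitchTopic(true);
HaSwitchTopic partial = new HaSwitchTopic(false);                        // this slice is not done yet
partial.addActiveTopicStatus("order_topic", HaStatusEnum.STABLE.getCode());
overall.addHaSwitchTopic(partial);                                       // finished &= false
// overall.isFinished() == false, but partial.isActiveTopicSwitchFinished("order_topic") == true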

View File

@@ -0,0 +1,28 @@
package com.xiaojukeji.kafka.manager.common.entity.ao.ha.job;
import io.swagger.annotations.ApiModel;
import io.swagger.annotations.ApiModelProperty;
import lombok.AllArgsConstructor;
import lombok.Data;
import lombok.NoArgsConstructor;
@Data
@NoArgsConstructor
@AllArgsConstructor
@ApiModel(description = "Job详情")
public class HaJobDetail {
@ApiModelProperty(value = "Topic名称")
private String topicName;
@ApiModelProperty(value="主集群ID")
private Long activeClusterPhyId;
@ApiModelProperty(value="备集群ID")
private Long standbyClusterPhyId;
@ApiModelProperty(value="Lag和")
private Long sumLag;
@ApiModelProperty(value="状态")
private Integer status;
}

View File

@@ -0,0 +1,16 @@
package com.xiaojukeji.kafka.manager.common.entity.ao.ha.job;
import io.swagger.annotations.ApiModel;
import io.swagger.annotations.ApiModelProperty;
import lombok.AllArgsConstructor;
import lombok.Data;
import lombok.NoArgsConstructor;
@Data
@NoArgsConstructor
@AllArgsConstructor
@ApiModel(description = "Job日志")
public class HaJobLog {
@ApiModelProperty(value = "日志信息")
private String log;
}

View File

@@ -0,0 +1,70 @@
package com.xiaojukeji.kafka.manager.common.entity.ao.ha.job;
import com.xiaojukeji.kafka.manager.common.bizenum.ha.job.HaJobStatusEnum;
import lombok.Data;
import lombok.NoArgsConstructor;
import java.util.List;
@Data
@NoArgsConstructor
public class HaJobState {
/**
* @see com.xiaojukeji.kafka.manager.common.bizenum.ha.job.HaJobStatusEnum
*/
private int status;
private int total;
private int success;
private int failed;
private int doing;
private int doingInTimeout;
private int unknown;
private Integer progress;
/**
* Aggregate directly by status
*/
public HaJobState(List<Integer> jobStatusList, Integer progress) {
this.total = jobStatusList.size();
this.success = 0;
this.failed = 0;
this.doing = 0;
this.doingInTimeout = 0;
this.unknown = 0;
for (Integer jobStatus: jobStatusList) {
if (HaJobStatusEnum.SUCCESS.getStatus() == jobStatus) {
success += 1;
} else if (HaJobStatusEnum.FAILED.getStatus() == jobStatus) {
failed += 1;
} else if (HaJobStatusEnum.RUNNING.getStatus() == jobStatus) {
doing += 1;
} else if (HaJobStatusEnum.RUNNING_IN_TIMEOUT.getStatus() == jobStatus) {
doingInTimeout += 1;
} else {
unknown += 1;
}
}
this.status = HaJobStatusEnum.getStatusBySubStatus(this.total, this.success, this.failed, this.doing, this.doingInTimeout, this.unknown).getStatus();
this.progress = progress;
}
public HaJobState(Integer doingSize, Integer progress) {
this.total = doingSize;
this.success = 0;
this.failed = 0;
this.doing = doingSize;
this.doingInTimeout = 0;
this.unknown = 0;
this.progress = progress;
}
}
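For illustration (assuming the usual java.util imports; the statuses and progress value are invented), the list-based constructor above aggregates like this:

List<Integer> subJobStatusList = Arrays.asList(
        HaJobStatusEnum.SUCCESS.getStatus(),
        HaJobStatusEnum.RUNNING.getStatus());
HaJobState state = new HaJobState(subJobStatusList, 50);
// total=2, success=1, doing=1 -> state.getStatus() == HaJobStatusEnum.RUNNING.getStatus()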

View File

@@ -0,0 +1,12 @@
package com.xiaojukeji.kafka.manager.common.entity.ao.ha.job;
import lombok.AllArgsConstructor;
import lombok.Data;
import lombok.NoArgsConstructor;
@Data
@NoArgsConstructor
@AllArgsConstructor
public class HaSubJobExtendData {
private Long sumLag;
}

View File

@@ -1,11 +1,14 @@
package com.xiaojukeji.kafka.manager.common.entity.ao.topic;
import lombok.Data;
import java.util.List;
/**
* @author arthur
* @date 2018/09/03
*/
@Data
public class TopicBasicDTO {
private Long clusterId;
@@ -39,133 +42,7 @@ public class TopicBasicDTO {
private Long retentionBytes;
public Long getClusterId() {
return clusterId;
}
public void setClusterId(Long clusterId) {
this.clusterId = clusterId;
}
public String getAppId() {
return appId;
}
public void setAppId(String appId) {
this.appId = appId;
}
public String getAppName() {
return appName;
}
public void setAppName(String appName) {
this.appName = appName;
}
public String getPrincipals() {
return principals;
}
public void setPrincipals(String principals) {
this.principals = principals;
}
public String getTopicName() {
return topicName;
}
public void setTopicName(String topicName) {
this.topicName = topicName;
}
public String getDescription() {
return description;
}
public void setDescription(String description) {
this.description = description;
}
public List<String> getRegionNameList() {
return regionNameList;
}
public void setRegionNameList(List<String> regionNameList) {
this.regionNameList = regionNameList;
}
public Integer getScore() {
return score;
}
public void setScore(Integer score) {
this.score = score;
}
public String getTopicCodeC() {
return topicCodeC;
}
public void setTopicCodeC(String topicCodeC) {
this.topicCodeC = topicCodeC;
}
public Integer getPartitionNum() {
return partitionNum;
}
public void setPartitionNum(Integer partitionNum) {
this.partitionNum = partitionNum;
}
public Integer getReplicaNum() {
return replicaNum;
}
public void setReplicaNum(Integer replicaNum) {
this.replicaNum = replicaNum;
}
public Integer getBrokerNum() {
return brokerNum;
}
public void setBrokerNum(Integer brokerNum) {
this.brokerNum = brokerNum;
}
public Long getModifyTime() {
return modifyTime;
}
public void setModifyTime(Long modifyTime) {
this.modifyTime = modifyTime;
}
public Long getCreateTime() {
return createTime;
}
public void setCreateTime(Long createTime) {
this.createTime = createTime;
}
public Long getRetentionTime() {
return retentionTime;
}
public void setRetentionTime(Long retentionTime) {
this.retentionTime = retentionTime;
}
public Long getRetentionBytes() {
return retentionBytes;
}
public void setRetentionBytes(Long retentionBytes) {
this.retentionBytes = retentionBytes;
}
private Integer haRelation;
@Override
public String toString() {
@@ -186,6 +63,7 @@ public class TopicBasicDTO {
", createTime=" + createTime +
", retentionTime=" + retentionTime +
", retentionBytes=" + retentionBytes +
", haRelation=" + haRelation +
'}';
}
}

View File

@@ -1,10 +1,13 @@
package com.xiaojukeji.kafka.manager.common.entity.ao.topic;
import lombok.Data;
/**
* Topic overview information
* @author zengqiao
* @date 20/5/14
*/
@Data
public class TopicOverview {
private Long clusterId;
@@ -32,109 +35,7 @@ public class TopicOverview {
private Long logicalClusterId;
public Long getClusterId() {
return clusterId;
}
public void setClusterId(Long clusterId) {
this.clusterId = clusterId;
}
public String getTopicName() {
return topicName;
}
public void setTopicName(String topicName) {
this.topicName = topicName;
}
public Integer getReplicaNum() {
return replicaNum;
}
public void setReplicaNum(Integer replicaNum) {
this.replicaNum = replicaNum;
}
public Integer getPartitionNum() {
return partitionNum;
}
public void setPartitionNum(Integer partitionNum) {
this.partitionNum = partitionNum;
}
public Long getRetentionTime() {
return retentionTime;
}
public void setRetentionTime(Long retentionTime) {
this.retentionTime = retentionTime;
}
public Object getByteIn() {
return byteIn;
}
public void setByteIn(Object byteIn) {
this.byteIn = byteIn;
}
public Object getByteOut() {
return byteOut;
}
public void setByteOut(Object byteOut) {
this.byteOut = byteOut;
}
public Object getProduceRequest() {
return produceRequest;
}
public void setProduceRequest(Object produceRequest) {
this.produceRequest = produceRequest;
}
public String getAppName() {
return appName;
}
public void setAppName(String appName) {
this.appName = appName;
}
public String getAppId() {
return appId;
}
public void setAppId(String appId) {
this.appId = appId;
}
public String getDescription() {
return description;
}
public void setDescription(String description) {
this.description = description;
}
public Long getUpdateTime() {
return updateTime;
}
public void setUpdateTime(Long updateTime) {
this.updateTime = updateTime;
}
public Long getLogicalClusterId() {
return logicalClusterId;
}
public void setLogicalClusterId(Long logicalClusterId) {
this.logicalClusterId = logicalClusterId;
}
private Integer haRelation;
@Override
public String toString() {
@@ -152,6 +53,7 @@ public class TopicOverview {
", description='" + description + '\'' +
", updateTime=" + updateTime +
", logicalClusterId=" + logicalClusterId +
", haRelation=" + haRelation +
'}';
}
}

View File

@@ -0,0 +1,26 @@
package com.xiaojukeji.kafka.manager.common.entity.dto.ha;
import io.swagger.annotations.ApiModel;
import io.swagger.annotations.ApiModelProperty;
import lombok.Data;
import javax.validation.constraints.NotBlank;
@Data
@ApiModel(description="Topic信息")
public class ASSwitchJobActionDTO {
/**
* @see com.xiaojukeji.kafka.manager.common.bizenum.TaskActionEnum
*/
@NotBlank(message = "action不允许为空")
@ApiModelProperty(value = "动作, force")
private String action;
// @NotNull(message = "all不允许为NULL")
// @ApiModelProperty(value = "所有的Topic")
// private Boolean allJumpWaitInSync;
//
// @NotNull(message = "jumpWaitInSyncActiveTopicList不允许为NULL")
// @ApiModelProperty(value = "操作的Topic")
// private List<String> jumpWaitInSyncActiveTopicList;
}

View File

@@ -0,0 +1,31 @@
package com.xiaojukeji.kafka.manager.common.entity.dto.ha;
import io.swagger.annotations.ApiModel;
import io.swagger.annotations.ApiModelProperty;
import lombok.Data;
import javax.validation.constraints.NotNull;
import java.util.List;
@Data
@ApiModel(description="主备切换任务")
public class ASSwitchJobDTO {
@NotNull(message = "all不允许为NULL")
@ApiModelProperty(value = "所有Topic")
private Boolean all;
@NotNull(message = "mustContainAllKafkaUserTopics不允许为NULL")
@ApiModelProperty(value = "是否需要包含KafkaUser关联的所有Topic")
private Boolean mustContainAllKafkaUserTopics;
@NotNull(message = "activeClusterPhyId不允许为NULL")
@ApiModelProperty(value="主集群ID")
private Long activeClusterPhyId;
@NotNull(message = "standbyClusterPhyId不允许为NULL")
@ApiModelProperty(value="备集群ID")
private Long standbyClusterPhyId;
@NotNull(message = "topicNameList不允许为NULL")
private List<String> topicNameList;
}

View File

@@ -0,0 +1,51 @@
package com.xiaojukeji.kafka.manager.common.entity.dto.op.topic;
import com.fasterxml.jackson.annotation.JsonIgnoreProperties;
import com.xiaojukeji.kafka.manager.common.utils.ValidateUtils;
import io.swagger.annotations.ApiModel;
import io.swagger.annotations.ApiModelProperty;
import lombok.Data;
import javax.validation.constraints.NotNull;
import java.util.List;
/**
* @author huangyiminghappy@163.com, zengqiao
* @date 2022-06-29
*/
@Data
@JsonIgnoreProperties(ignoreUnknown = true)
@ApiModel(description = "Topic高可用关联|解绑")
public class HaTopicRelationDTO {
@NotNull(message = "主集群id不能为空")
@ApiModelProperty(value = "主集群id")
private Long activeClusterId;
@NotNull(message = "备集群id不能为空")
@ApiModelProperty(value = "备集群id")
private Long standbyClusterId;
@NotNull(message = "是否应用于所有topic")
@ApiModelProperty(value = "是否应用于所有topic")
private Boolean all;
@ApiModelProperty(value = "需要关联|解绑的topic名称列表")
private List<String> topicNames;
@Override
public String toString() {
return "HaTopicRelationDTO{" +
", activeClusterId=" + activeClusterId +
", standbyClusterId=" + standbyClusterId +
", all=" + all +
", topicNames=" + topicNames +
'}';
}
public boolean paramLegal() {
if(!all && ValidateUtils.isEmptyList(topicNames)) {
return false;
}
return true;
}
}

View File

@@ -0,0 +1,24 @@
package com.xiaojukeji.kafka.manager.common.entity.dto.rd;
import io.swagger.annotations.ApiModel;
import io.swagger.annotations.ApiModelProperty;
import lombok.Data;
import javax.validation.constraints.NotNull;
import java.util.List;
/**
* @author zengqiao
* @date 20/5/4
*/
@Data
@ApiModel(description="App关联Topic信息")
public class AppRelateTopicsDTO {
@NotNull(message = "clusterPhyId不允许为NULL")
@ApiModelProperty(value="物理集群ID")
private Long clusterPhyId;
@NotNull(message = "filterTopicNameList不允许为NULL")
@ApiModelProperty(value="过滤的Topic列表")
private List<String> filterTopicNameList;
}

View File

@@ -4,11 +4,13 @@ import com.fasterxml.jackson.annotation.JsonIgnoreProperties;
import com.xiaojukeji.kafka.manager.common.utils.ValidateUtils;
import io.swagger.annotations.ApiModel;
import io.swagger.annotations.ApiModelProperty;
import lombok.Data;
/**
* @author zengqiao
* @date 20/4/23
*/
@Data
@ApiModel(description = "集群接入&修改")
@JsonIgnoreProperties(ignoreUnknown = true)
public class ClusterDTO {
@@ -33,60 +35,21 @@ public class ClusterDTO {
@ApiModelProperty(value="Jmx配置")
private String jmxProperties;
public Long getClusterId() {
return clusterId;
}
@ApiModelProperty(value="主集群Id")
private Long activeClusterId;
public void setClusterId(Long clusterId) {
this.clusterId = clusterId;
}
@ApiModelProperty(value="是否高可用")
private boolean isHa;
public String getClusterName() {
return clusterName;
}
public void setClusterName(String clusterName) {
this.clusterName = clusterName;
}
public String getZookeeper() {
return zookeeper;
}
public void setZookeeper(String zookeeper) {
this.zookeeper = zookeeper;
}
public String getBootstrapServers() {
return bootstrapServers;
}
public void setBootstrapServers(String bootstrapServers) {
this.bootstrapServers = bootstrapServers;
}
public String getIdc() {
return idc;
}
public void setIdc(String idc) {
this.idc = idc;
}
public String getSecurityProperties() {
return securityProperties;
}
public void setSecurityProperties(String securityProperties) {
this.securityProperties = securityProperties;
}
public String getJmxProperties() {
return jmxProperties;
}
public void setJmxProperties(String jmxProperties) {
this.jmxProperties = jmxProperties;
public boolean legal() {
if (ValidateUtils.isNull(clusterName)
|| ValidateUtils.isNull(zookeeper)
|| ValidateUtils.isNull(idc)
|| ValidateUtils.isNull(bootstrapServers)
|| (isHa && ValidateUtils.isNull(activeClusterId))) {
return false;
}
return true;
}
@Override
@@ -99,16 +62,8 @@ public class ClusterDTO {
", idc='" + idc + '\'' +
", securityProperties='" + securityProperties + '\'' +
", jmxProperties='" + jmxProperties + '\'' +
", activeClusterId=" + activeClusterId +
", isHa=" + isHa +
'}';
}
public boolean legal() {
if (ValidateUtils.isNull(clusterName)
|| ValidateUtils.isNull(zookeeper)
|| ValidateUtils.isNull(idc)
|| ValidateUtils.isNull(bootstrapServers)) {
return false;
}
return true;
}
}

View File

@@ -0,0 +1,24 @@
package com.xiaojukeji.kafka.manager.common.entity.pagination;
import io.swagger.annotations.ApiModel;
import io.swagger.annotations.ApiModelProperty;
import lombok.Data;
@Data
@ApiModel(description = "分页信息")
public class Pagination {
@ApiModelProperty(value = "总记录数", example = "100")
private long total;
@ApiModelProperty(value = "当前页码", example = "0")
private long pageNo;
@ApiModelProperty(value = "单页大小", example = "10")
private long pageSize;
public Pagination(long total, long pageNo, long pageSize) {
this.total = total;
this.pageNo = pageNo;
this.pageSize = pageSize;
}
}

View File

@@ -0,0 +1,17 @@
package com.xiaojukeji.kafka.manager.common.entity.pagination;
import io.swagger.annotations.ApiModel;
import io.swagger.annotations.ApiModelProperty;
import lombok.Data;
import java.util.List;
@Data
@ApiModel(description = "分页数据")
public class PaginationData<T> {
@ApiModelProperty(value = "业务数据")
private List<T> bizData;
@ApiModelProperty(value = "分页信息")
private Pagination pagination;
}

View File

@@ -0,0 +1,30 @@
package com.xiaojukeji.kafka.manager.common.entity.pojo;
import lombok.Data;
import java.io.Serializable;
import java.util.Date;
/**
* @author zengqiao
* @date 21/07/19
*/
@Data
public class BaseDO implements Serializable {
private static final long serialVersionUID = 8782560709154468485L;
/**
* Primary key ID
*/
protected Long id;
/**
* Creation time
*/
protected Date createTime;
/**
* Last modified time
*/
protected Date modifyTime;
}

View File

@@ -1,11 +1,18 @@
package com.xiaojukeji.kafka.manager.common.entity.pojo;
import lombok.Data;
import lombok.NoArgsConstructor;
import lombok.ToString;
import java.util.Date;
/**
* @author zengqiao
* @date 20/6/29
*/
@Data
@ToString
@NoArgsConstructor
public class LogicalClusterDO {
private Long id;
@@ -27,99 +34,17 @@ public class LogicalClusterDO {
private Date gmtModify;
public Long getId() {
return id;
}
public void setId(Long id) {
this.id = id;
}
public String getName() {
return name;
}
public void setName(String name) {
public LogicalClusterDO(String name,
String identification,
Integer mode,
String appId,
Long clusterId,
String regionList) {
this.name = name;
}
public String getIdentification() {
return identification;
}
public void setIdentification(String identification) {
this.identification = identification;
}
public Integer getMode() {
return mode;
}
public void setMode(Integer mode) {
this.mode = mode;
}
public String getAppId() {
return appId;
}
public void setAppId(String appId) {
this.appId = appId;
}
public Long getClusterId() {
return clusterId;
}
public void setClusterId(Long clusterId) {
this.clusterId = clusterId;
}
public String getRegionList() {
return regionList;
}
public void setRegionList(String regionList) {
this.regionList = regionList;
}
public String getDescription() {
return description;
}
public void setDescription(String description) {
this.description = description;
}
public Date getGmtCreate() {
return gmtCreate;
}
public void setGmtCreate(Date gmtCreate) {
this.gmtCreate = gmtCreate;
}
public Date getGmtModify() {
return gmtModify;
}
public void setGmtModify(Date gmtModify) {
this.gmtModify = gmtModify;
}
@Override
public String toString() {
return "LogicalClusterDO{" +
"id=" + id +
", name='" + name + '\'' +
", identification='" + identification + '\'' +
", mode=" + mode +
", appId='" + appId + '\'' +
", clusterId=" + clusterId +
", regionList='" + regionList + '\'' +
", description='" + description + '\'' +
", gmtCreate=" + gmtCreate +
", gmtModify=" + gmtModify +
'}';
}
}

View File

@@ -1,7 +1,14 @@
package com.xiaojukeji.kafka.manager.common.entity.pojo;
import lombok.Data;
import lombok.NoArgsConstructor;
import lombok.ToString;
import java.util.Date;
@Data
@ToString
@NoArgsConstructor
public class RegionDO implements Comparable<RegionDO> {
private Long id;
@@ -25,111 +32,13 @@ public class RegionDO implements Comparable<RegionDO> {
private String description;
public Long getId() {
return id;
}
public void setId(Long id) {
this.id = id;
}
public Integer getStatus() {
return status;
}
public void setStatus(Integer status) {
public RegionDO(Integer status, String name, Long clusterId, String brokerList) {
this.status = status;
}
public Date getGmtCreate() {
return gmtCreate;
}
public void setGmtCreate(Date gmtCreate) {
this.gmtCreate = gmtCreate;
}
public Date getGmtModify() {
return gmtModify;
}
public void setGmtModify(Date gmtModify) {
this.gmtModify = gmtModify;
}
public String getName() {
return name;
}
public void setName(String name) {
this.name = name;
}
public Long getClusterId() {
return clusterId;
}
public void setClusterId(Long clusterId) {
this.clusterId = clusterId;
}
public String getBrokerList() {
return brokerList;
}
public void setBrokerList(String brokerList) {
this.brokerList = brokerList;
}
public Long getCapacity() {
return capacity;
}
public void setCapacity(Long capacity) {
this.capacity = capacity;
}
public Long getRealUsed() {
return realUsed;
}
public void setRealUsed(Long realUsed) {
this.realUsed = realUsed;
}
public Long getEstimateUsed() {
return estimateUsed;
}
public void setEstimateUsed(Long estimateUsed) {
this.estimateUsed = estimateUsed;
}
public String getDescription() {
return description;
}
public void setDescription(String description) {
this.description = description;
}
@Override
public String toString() {
return "RegionDO{" +
"id=" + id +
", status=" + status +
", gmtCreate=" + gmtCreate +
", gmtModify=" + gmtModify +
", name='" + name + '\'' +
", clusterId=" + clusterId +
", brokerList='" + brokerList + '\'' +
", capacity=" + capacity +
", realUsed=" + realUsed +
", estimateUsed=" + estimateUsed +
", description='" + description + '\'' +
'}';
}
@Override
public int compareTo(RegionDO regionDO) {
return this.id.compareTo(regionDO.id);

View File

@@ -2,6 +2,8 @@ package com.xiaojukeji.kafka.manager.common.entity.pojo;
import com.xiaojukeji.kafka.manager.common.entity.dto.op.topic.TopicCreationDTO;
import com.xiaojukeji.kafka.manager.common.utils.ValidateUtils;
import lombok.Data;
import lombok.NoArgsConstructor;
import java.util.Date;
@@ -9,6 +11,8 @@ import java.util.Date;
* @author zengqiao
* @date 20/4/24
*/
@Data
@NoArgsConstructor
public class TopicDO {
private Long id;
@@ -26,70 +30,14 @@ public class TopicDO {
private Long peakBytesIn;
public String getAppId() {
return appId;
}
public void setAppId(String appId) {
public TopicDO(String appId, Long clusterId, String topicName, String description, Long peakBytesIn) {
this.appId = appId;
}
public Long getClusterId() {
return clusterId;
}
public void setClusterId(Long clusterId) {
this.clusterId = clusterId;
}
public String getTopicName() {
return topicName;
}
public void setTopicName(String topicName) {
this.topicName = topicName;
}
public String getDescription() {
return description;
}
public void setDescription(String description) {
this.description = description;
}
public Long getPeakBytesIn() {
return peakBytesIn;
}
public void setPeakBytesIn(Long peakBytesIn) {
this.peakBytesIn = peakBytesIn;
}
public Long getId() {
return id;
}
public void setId(Long id) {
this.id = id;
}
public Date getGmtCreate() {
return gmtCreate;
}
public void setGmtCreate(Date gmtCreate) {
this.gmtCreate = gmtCreate;
}
public Date getGmtModify() {
return gmtModify;
}
public void setGmtModify(Date gmtModify) {
this.gmtModify = gmtModify;
}
public static TopicDO buildFrom(TopicCreationDTO dto) {
TopicDO topicDO = new TopicDO();
topicDO.setAppId(dto.getAppId());

View File

@@ -0,0 +1,69 @@
package com.xiaojukeji.kafka.manager.common.entity.pojo.ha;
import com.baomidou.mybatisplus.annotation.TableName;
import com.xiaojukeji.kafka.manager.common.entity.pojo.BaseDO;
import lombok.AllArgsConstructor;
import lombok.Data;
import lombok.NoArgsConstructor;
/**
* HA - active/standby relation table
*/
@Data
@NoArgsConstructor
@AllArgsConstructor
@TableName("ha_active_standby_relation")
public class HaASRelationDO extends BaseDO {
/**
* Active cluster ID
*/
private Long activeClusterPhyId;
/**
* Active cluster resource name
*/
private String activeResName;
/**
* Standby cluster ID
*/
private Long standbyClusterPhyId;
/**
* Standby cluster resource name
*/
private String standbyResName;
/**
* Resource type
*/
private Integer resType;
/**
* Active/standby status
*/
private Integer status;
/**
* Uniqueness field for the active/standby relation
*/
private String uniqueField;
public HaASRelationDO(Long id, Integer status) {
this.id = id;
this.status = status;
}
public HaASRelationDO(Long activeClusterPhyId, String activeResName, Long standbyClusterPhyId, String standbyResName, Integer resType, Integer status) {
this.activeClusterPhyId = activeClusterPhyId;
this.activeResName = activeResName;
this.standbyClusterPhyId = standbyClusterPhyId;
this.standbyResName = standbyResName;
this.resType = resType;
this.status = status;
// Unique between the active and standby resources; it does not guarantee that only an active/standby relation exists between them, an active-active relation is also possible, i.e. each side is the other's standby
this.uniqueField = String.format("%d_%s||%d_%s||%d", activeClusterPhyId, activeResName, standbyClusterPhyId, standbyResName, resType);
}
}
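For reference, a hypothetical example (cluster IDs, topic name and resType value are invented) of the uniqueField the constructor above produces:

// active cluster 1 / topic "order_topic" backed up to standby cluster 2
HaASRelationDO relation = new HaASRelationDO(1L, "order_topic", 2L, "order_topic", 0, HaStatusEnum.STABLE.getCode());
// relation.getUniqueField() -> "1_order_topic||2_order_topic||0"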

View File

@@ -0,0 +1,42 @@
package com.xiaojukeji.kafka.manager.common.entity.pojo.ha;
import com.baomidou.mybatisplus.annotation.TableName;
import com.xiaojukeji.kafka.manager.common.entity.pojo.BaseDO;
import lombok.Data;
import lombok.NoArgsConstructor;
/**
* HA - active/standby switch job table
*/
@Data
@NoArgsConstructor
@TableName("ha_active_standby_switch_job")
public class HaASSwitchJobDO extends BaseDO {
/**
* Active cluster ID
*/
private Long activeClusterPhyId;
/**
* Standby cluster ID
*/
private Long standbyClusterPhyId;
/**
* Job status
*/
private Integer jobStatus;
/**
* Operator
*/
private String operator;
public HaASSwitchJobDO(Long activeClusterPhyId, Long standbyClusterPhyId, Integer jobStatus, String operator) {
this.activeClusterPhyId = activeClusterPhyId;
this.standbyClusterPhyId = standbyClusterPhyId;
this.jobStatus = jobStatus;
this.operator = operator;
}
}

View File

@@ -0,0 +1,67 @@
package com.xiaojukeji.kafka.manager.common.entity.pojo.ha;
import com.baomidou.mybatisplus.annotation.TableName;
import com.xiaojukeji.kafka.manager.common.entity.pojo.BaseDO;
import lombok.Data;
import lombok.NoArgsConstructor;
/**
* HA - active/standby switch sub-job table
*/
@Data
@NoArgsConstructor
@TableName("ha_active_standby_switch_sub_job")
public class HaASSwitchSubJobDO extends BaseDO {
/**
* Job ID
*/
private Long jobId;
/**
* Active cluster ID
*/
private Long activeClusterPhyId;
/**
* Active cluster resource name
*/
private String activeResName;
/**
* Standby cluster ID
*/
private Long standbyClusterPhyId;
/**
* Standby cluster resource name
*/
private String standbyResName;
/**
* Resource type
*/
private Integer resType;
/**
* Job status
*/
private Integer jobStatus;
/**
* Extended data
* @see com.xiaojukeji.kafka.manager.common.entity.ao.ha.job.HaSubJobExtendData
*/
private String extendData;
public HaASSwitchSubJobDO(Long jobId, Long activeClusterPhyId, String activeResName, Long standbyClusterPhyId, String standbyResName, Integer resType, Integer jobStatus, String extendData) {
this.jobId = jobId;
this.activeClusterPhyId = activeClusterPhyId;
this.activeResName = activeResName;
this.standbyClusterPhyId = standbyClusterPhyId;
this.standbyResName = standbyResName;
this.resType = resType;
this.jobStatus = jobStatus;
this.extendData = extendData;
}
}

View File

@@ -0,0 +1,50 @@
package com.xiaojukeji.kafka.manager.common.entity.pojo.ha;
import com.baomidou.mybatisplus.annotation.TableName;
import com.xiaojukeji.kafka.manager.common.entity.pojo.BaseDO;
import lombok.Data;
import lombok.NoArgsConstructor;
import java.util.Date;
@Data
@NoArgsConstructor
@TableName("job_log")
public class JobLogDO extends BaseDO {
/**
* Business type
*/
private Integer bizType;
/**
* Business keyword
*/
private String bizKeyword;
/**
* Print time
*/
private Date printTime;
/**
* Content
*/
private String content;
public JobLogDO(Integer bizType, String bizKeyword) {
this.bizType = bizType;
this.bizKeyword = bizKeyword;
}
public JobLogDO(Integer bizType, String bizKeyword, Date printTime, String content) {
this.bizType = bizType;
this.bizKeyword = bizKeyword;
this.printTime = printTime;
this.content = content;
}
public JobLogDO setAndCopyNew(Date printTime, String content) {
return new JobLogDO(this.bizType, this.bizKeyword, printTime, content);
}
}

View File

@@ -2,12 +2,14 @@ package com.xiaojukeji.kafka.manager.common.entity.vo.common;
import io.swagger.annotations.ApiModel;
import io.swagger.annotations.ApiModelProperty;
import lombok.Data;
/**
* Topic information
* @author zengqiao
* @date 19/4/1
*/
@Data
@ApiModel(description = "Topic信息概览")
public class TopicOverviewVO {
@ApiModelProperty(value = "集群ID")
@@ -49,109 +51,8 @@ public class TopicOverviewVO {
@ApiModelProperty(value = "逻辑集群id")
private Long logicalClusterId;
public Long getClusterId() {
return clusterId;
}
public void setClusterId(Long clusterId) {
this.clusterId = clusterId;
}
public String getTopicName() {
return topicName;
}
public void setTopicName(String topicName) {
this.topicName = topicName;
}
public Integer getReplicaNum() {
return replicaNum;
}
public void setReplicaNum(Integer replicaNum) {
this.replicaNum = replicaNum;
}
public Integer getPartitionNum() {
return partitionNum;
}
public void setPartitionNum(Integer partitionNum) {
this.partitionNum = partitionNum;
}
public Long getRetentionTime() {
return retentionTime;
}
public void setRetentionTime(Long retentionTime) {
this.retentionTime = retentionTime;
}
public Object getByteIn() {
return byteIn;
}
public void setByteIn(Object byteIn) {
this.byteIn = byteIn;
}
public Object getByteOut() {
return byteOut;
}
public void setByteOut(Object byteOut) {
this.byteOut = byteOut;
}
public Object getProduceRequest() {
return produceRequest;
}
public void setProduceRequest(Object produceRequest) {
this.produceRequest = produceRequest;
}
public String getAppName() {
return appName;
}
public void setAppName(String appName) {
this.appName = appName;
}
public String getAppId() {
return appId;
}
public void setAppId(String appId) {
this.appId = appId;
}
public String getDescription() {
return description;
}
public void setDescription(String description) {
this.description = description;
}
public Long getUpdateTime() {
return updateTime;
}
public void setUpdateTime(Long updateTime) {
this.updateTime = updateTime;
}
public Long getLogicalClusterId() {
return logicalClusterId;
}
public void setLogicalClusterId(Long logicalClusterId) {
this.logicalClusterId = logicalClusterId;
}
@ApiModelProperty(value = "高可用关系1:主topic, 0:备topic , 其他:非高可用topic")
private Integer haRelation;
@Override
public String toString() {
@@ -169,6 +70,7 @@ public class TopicOverviewVO {
", description='" + description + '\'' +
", updateTime=" + updateTime +
", logicalClusterId=" + logicalClusterId +
", haRelation=" + haRelation +
'}';
}
}

View File

@@ -0,0 +1,34 @@
package com.xiaojukeji.kafka.manager.common.entity.vo.ha;
import io.swagger.annotations.ApiModel;
import io.swagger.annotations.ApiModelProperty;
import lombok.Data;
/**
* @author zengqiao
* @date 20/4/29
*/
@Data
@ApiModel(description="HA集群-Topic信息")
public class HaClusterTopicVO {
@ApiModelProperty(value="当前查询的集群ID")
private Long clusterId;
@ApiModelProperty(value="Topic名称")
private String topicName;
@ApiModelProperty(value="生产Acl数量")
private Integer produceAclNum;
@ApiModelProperty(value="消费Acl数量")
private Integer consumeAclNum;
@ApiModelProperty(value="主集群ID")
private Long activeClusterId;
@ApiModelProperty(value="备集群ID")
private Long standbyClusterId;
@ApiModelProperty(value="主备状态")
private Integer status;
}

View File

@@ -0,0 +1,48 @@
package com.xiaojukeji.kafka.manager.common.entity.vo.ha;
import com.xiaojukeji.kafka.manager.common.entity.vo.rd.cluster.ClusterBaseVO;
import io.swagger.annotations.ApiModel;
import io.swagger.annotations.ApiModelProperty;
import lombok.Data;
/**
* @author zengqiao
* @date 20/4/29
*/
@Data
@ApiModel(description="HA集群-集群信息")
public class HaClusterVO extends ClusterBaseVO {
@ApiModelProperty(value="broker数量")
private Integer brokerNum;
@ApiModelProperty(value="topic数量")
private Integer topicNum;
@ApiModelProperty(value="消费组数")
private Integer consumerGroupNum;
@ApiModelProperty(value="region数")
private Integer regionNum;
@ApiModelProperty(value="ControllerID")
private Integer controllerId;
/**
* @see com.xiaojukeji.kafka.manager.common.bizenum.ha.HaStatusEnum
*/
@ApiModelProperty(value="主备状态")
private Integer haStatus;
@ApiModelProperty(value="主topic数")
private Long activeTopicCount;
@ApiModelProperty(value="备topic数")
private Long standbyTopicCount;
@ApiModelProperty(value="备集群信息")
private HaClusterVO haClusterVO;
@ApiModelProperty(value="切换任务id")
private Long haASSwitchJobId;
}

View File

@@ -0,0 +1,37 @@
package com.xiaojukeji.kafka.manager.common.entity.vo.ha.job;
import io.swagger.annotations.ApiModel;
import io.swagger.annotations.ApiModelProperty;
import lombok.AllArgsConstructor;
import lombok.Data;
import lombok.NoArgsConstructor;
@Data
@NoArgsConstructor
@AllArgsConstructor
@ApiModel(description = "Job详情")
public class HaJobDetailVO {
@ApiModelProperty(value = "Topic名称")
private String topicName;
@ApiModelProperty(value="主物理集群ID")
private Long activeClusterPhyId;
@ApiModelProperty(value="主物理集群名称")
private String activeClusterPhyName;
@ApiModelProperty(value="备物理集群ID")
private Long standbyClusterPhyId;
@ApiModelProperty(value="备物理集群名称")
private String standbyClusterPhyName;
@ApiModelProperty(value="Lag和")
private Long sumLag;
@ApiModelProperty(value="状态")
private Integer status;
@ApiModelProperty(value="超时时间配置")
private Long timeoutUnitSecConfig;
}

View File

@@ -0,0 +1,46 @@
package com.xiaojukeji.kafka.manager.common.entity.vo.ha.job;
import com.xiaojukeji.kafka.manager.common.entity.ao.ha.job.HaJobState;
import io.swagger.annotations.ApiModel;
import io.swagger.annotations.ApiModelProperty;
import lombok.AllArgsConstructor;
import lombok.Data;
import lombok.NoArgsConstructor;
@Data
@NoArgsConstructor
@AllArgsConstructor
@ApiModel(description = "Job状态")
public class HaJobStateVO {
@ApiModelProperty(value = "任务总数")
private Integer jobNu;
@ApiModelProperty(value = "运行中的任务数")
private Integer runningNu;
@ApiModelProperty(value = "超时运行中的任务数")
private Integer runningInTimeoutNu;
@ApiModelProperty(value = "准备好待运行的任务数")
private Integer waitingNu;
@ApiModelProperty(value = "运行成功的任务数")
private Integer successNu;
@ApiModelProperty(value = "运行失败的任务数")
private Integer failedNu;
@ApiModelProperty(value = "进度,[0 - 100]")
private Integer progress;
public HaJobStateVO(HaJobState jobState) {
this.jobNu = jobState.getTotal();
this.runningNu = jobState.getDoing();
this.runningInTimeoutNu = jobState.getDoingInTimeout();
this.waitingNu = 0;
this.successNu = jobState.getSuccess();
this.failedNu = jobState.getFailed();
this.progress = jobState.getProgress();
}
}

View File

@@ -0,0 +1,26 @@
package com.xiaojukeji.kafka.manager.common.entity.vo.normal.topic;
import io.swagger.annotations.ApiModel;
import io.swagger.annotations.ApiModelProperty;
import lombok.Data;
/**
* @author zengqiao
* @date 20/4/8
*/
@Data
@ApiModel(value = "集群的topic高可用状态")
public class HaClusterTopicHaStatusVO {
@ApiModelProperty(value = "物理集群ID")
private Long clusterId;
@ApiModelProperty(value = "物理集群名称")
private String clusterName;
@ApiModelProperty(value = "Topic名称")
private String topicName;
@ApiModelProperty(value = "高可用关系1:主topic, 0:备topic , 其他:非高可用topic")
private Integer haRelation;
}

View File

@@ -2,6 +2,7 @@ package com.xiaojukeji.kafka.manager.common.entity.vo.normal.topic;
import io.swagger.annotations.ApiModel;
import io.swagger.annotations.ApiModelProperty;
import lombok.Data;
import java.util.List;
@@ -10,6 +11,7 @@ import java.util.List;
* @author zengqiao
* @date 19/4/1
*/
@Data
@ApiModel(description = "Topic基本信息")
public class TopicBasicVO {
@ApiModelProperty(value = "集群id")
@@ -57,125 +59,8 @@ public class TopicBasicVO {
@ApiModelProperty(value = "所属region")
private List<String> regionNameList;
public Long getClusterId() {
return clusterId;
}
public void setClusterId(Long clusterId) {
this.clusterId = clusterId;
}
public String getAppId() {
return appId;
}
public void setAppId(String appId) {
this.appId = appId;
}
public String getAppName() {
return appName;
}
public void setAppName(String appName) {
this.appName = appName;
}
public Integer getPartitionNum() {
return partitionNum;
}
public void setPartitionNum(Integer partitionNum) {
this.partitionNum = partitionNum;
}
public Integer getReplicaNum() {
return replicaNum;
}
public void setReplicaNum(Integer replicaNum) {
this.replicaNum = replicaNum;
}
public String getPrincipals() {
return principals;
}
public void setPrincipals(String principals) {
this.principals = principals;
}
public Long getRetentionTime() {
return retentionTime;
}
public void setRetentionTime(Long retentionTime) {
this.retentionTime = retentionTime;
}
public Long getRetentionBytes() {
return retentionBytes;
}
public void setRetentionBytes(Long retentionBytes) {
this.retentionBytes = retentionBytes;
}
public Long getCreateTime() {
return createTime;
}
public void setCreateTime(Long createTime) {
this.createTime = createTime;
}
public Long getModifyTime() {
return modifyTime;
}
public void setModifyTime(Long modifyTime) {
this.modifyTime = modifyTime;
}
public Integer getScore() {
return score;
}
public void setScore(Integer score) {
this.score = score;
}
public String getTopicCodeC() {
return topicCodeC;
}
public void setTopicCodeC(String topicCodeC) {
this.topicCodeC = topicCodeC;
}
public String getDescription() {
return description;
}
public void setDescription(String description) {
this.description = description;
}
public String getBootstrapServers() {
return bootstrapServers;
}
public void setBootstrapServers(String bootstrapServers) {
this.bootstrapServers = bootstrapServers;
}
public List<String> getRegionNameList() {
return regionNameList;
}
public void setRegionNameList(List<String> regionNameList) {
this.regionNameList = regionNameList;
}
@ApiModelProperty(value = "高可用关系1:主topic, 0:备topic , 其他:非主备topic")
private Integer haRelation;
@Override
public String toString() {
@@ -195,6 +80,7 @@ public class TopicBasicVO {
", description='" + description + '\'' +
", bootstrapServers='" + bootstrapServers + '\'' +
", regionNameList=" + regionNameList +
", haRelation=" + haRelation +
'}';
}
}

View File

@@ -0,0 +1,26 @@
package com.xiaojukeji.kafka.manager.common.entity.vo.normal.topic;
import io.swagger.annotations.ApiModel;
import io.swagger.annotations.ApiModelProperty;
import lombok.Data;
/**
* @author zengqiao
* @date 20/4/8
*/
@Data
@ApiModel(value = "Topic信息")
public class TopicHaVO {
@ApiModelProperty(value = "物理集群ID")
private Long clusterId;
@ApiModelProperty(value = "物理集群名称")
private String clusterName;
@ApiModelProperty(value = "Topic名称")
private String topicName;
@ApiModelProperty(value = "高可用关系1:主topic, 0:备topic , 其他:非高可用topic")
private Integer haRelation;
}

View File

@@ -2,6 +2,7 @@ package com.xiaojukeji.kafka.manager.common.entity.vo.rd;
import io.swagger.annotations.ApiModel;
import io.swagger.annotations.ApiModelProperty;
import lombok.Data;
import java.util.List;
import java.util.Properties;
@@ -10,6 +11,7 @@ import java.util.Properties;
* @author zengqiao
* @date 20/6/10
*/
@Data
@ApiModel(description = "Topic基本信息(RD视角)")
public class RdTopicBasicVO {
@ApiModelProperty(value = "集群ID")
@@ -39,77 +41,8 @@ public class RdTopicBasicVO {
@ApiModelProperty(value = "所属region")
private List<String> regionNameList;
public Long getClusterId() {
return clusterId;
}
public void setClusterId(Long clusterId) {
this.clusterId = clusterId;
}
public String getClusterName() {
return clusterName;
}
public void setClusterName(String clusterName) {
this.clusterName = clusterName;
}
public String getTopicName() {
return topicName;
}
public void setTopicName(String topicName) {
this.topicName = topicName;
}
public Long getRetentionTime() {
return retentionTime;
}
public void setRetentionTime(Long retentionTime) {
this.retentionTime = retentionTime;
}
public String getAppId() {
return appId;
}
public void setAppId(String appId) {
this.appId = appId;
}
public String getAppName() {
return appName;
}
public void setAppName(String appName) {
this.appName = appName;
}
public Properties getProperties() {
return properties;
}
public void setProperties(Properties properties) {
this.properties = properties;
}
public String getDescription() {
return description;
}
public void setDescription(String description) {
this.description = description;
}
public List<String> getRegionNameList() {
return regionNameList;
}
public void setRegionNameList(List<String> regionNameList) {
this.regionNameList = regionNameList;
}
@ApiModelProperty(value = "高可用关系1:主topic, 0:备topic , 其他:非主备topic")
private Integer haRelation;
@Override
public String toString() {
@@ -122,7 +55,8 @@ public class RdTopicBasicVO {
", appName='" + appName + '\'' +
", properties=" + properties +
", description='" + description + '\'' +
", regionNameList='" + regionNameList + '\'' +
", regionNameList=" + regionNameList +
", haRelation=" + haRelation +
'}';
}
}

View File

@@ -0,0 +1,30 @@
package com.xiaojukeji.kafka.manager.common.entity.vo.rd.app;
import io.swagger.annotations.ApiModel;
import io.swagger.annotations.ApiModelProperty;
import lombok.Data;
import java.util.List;
/**
* @author zengqiao
* @date 20/5/4
*/
@Data
@ApiModel(description="App关联Topic信息")
public class AppRelateTopicsVO {
@ApiModelProperty(value="物理集群ID")
private Long clusterPhyId;
@ApiModelProperty(value="kafkaUser")
private String kafkaUser;
@ApiModelProperty(value="选中的Topic列表")
private List<String> selectedTopicNameList;
@ApiModelProperty(value="未选中的Topic列表")
private List<String> notSelectTopicNameList;
@ApiModelProperty(value="未建立HA的Topic列表")
private List<String> notHaTopicNameList;
}

View File

@@ -2,11 +2,13 @@ package com.xiaojukeji.kafka.manager.common.entity.vo.rd.cluster;
import io.swagger.annotations.ApiModel;
import io.swagger.annotations.ApiModelProperty;
import lombok.Data;
/**
* @author zengqiao
* @date 20/4/23
*/
@Data
@ApiModel(description="集群信息")
public class ClusterDetailVO extends ClusterBaseVO {
@ApiModelProperty(value="Broker数")
@@ -24,45 +26,11 @@ public class ClusterDetailVO extends ClusterBaseVO {
@ApiModelProperty(value="Region数")
private Integer regionNum;
public Integer getBrokerNum() {
return brokerNum;
}
@ApiModelProperty(value = "高可用关系1:主, 0:备 , 其他:非高可用")
private Integer haRelation;
public void setBrokerNum(Integer brokerNum) {
this.brokerNum = brokerNum;
}
public Integer getTopicNum() {
return topicNum;
}
public void setTopicNum(Integer topicNum) {
this.topicNum = topicNum;
}
public Integer getConsumerGroupNum() {
return consumerGroupNum;
}
public void setConsumerGroupNum(Integer consumerGroupNum) {
this.consumerGroupNum = consumerGroupNum;
}
public Integer getControllerId() {
return controllerId;
}
public void setControllerId(Integer controllerId) {
this.controllerId = controllerId;
}
public Integer getRegionNum() {
return regionNum;
}
public void setRegionNum(Integer regionNum) {
this.regionNum = regionNum;
}
@ApiModelProperty(value = "互备集群名称")
private String mutualBackupClusterName;
@Override
public String toString() {
@@ -72,6 +40,8 @@ public class ClusterDetailVO extends ClusterBaseVO {
", consumerGroupNum=" + consumerGroupNum +
", controllerId=" + controllerId +
", regionNum=" + regionNum +
"} " + super.toString();
", haRelation=" + haRelation +
", mutualBackupClusterName='" + mutualBackupClusterName + '\'' +
'}';
}
}

View File

@@ -0,0 +1,30 @@
package com.xiaojukeji.kafka.manager.common.entity.vo.rd.job;
import io.swagger.annotations.ApiModel;
import io.swagger.annotations.ApiModelProperty;
import lombok.AllArgsConstructor;
import lombok.Data;
import lombok.NoArgsConstructor;
import java.util.Date;
@Data
@NoArgsConstructor
@AllArgsConstructor
@ApiModel(description = "Job日志")
public class JobLogVO {
@ApiModelProperty(value = "日志ID")
protected Long id;
@ApiModelProperty(value = "业务类型")
private Integer bizType;
@ApiModelProperty(value = "业务关键字")
private String bizKeyword;
@ApiModelProperty(value = "打印时间")
private Date printTime;
@ApiModelProperty(value = "内容")
private String content;
}


@@ -0,0 +1,31 @@
package com.xiaojukeji.kafka.manager.common.entity.vo.rd.job;
import io.swagger.annotations.ApiModel;
import io.swagger.annotations.ApiModelProperty;
import lombok.AllArgsConstructor;
import lombok.Data;
import lombok.NoArgsConstructor;
import java.util.ArrayList;
import java.util.List;
@Data
@NoArgsConstructor
@AllArgsConstructor
@ApiModel(description = "Job日志")
public class JobMulLogVO {
@ApiModelProperty(value = "末尾日志ID")
private Long endLogId;
@ApiModelProperty(value = "日志信息")
private List<JobLogVO> logList;
public JobMulLogVO(List<JobLogVO> logList, Long startLogId) {
this.logList = logList == null? new ArrayList<>(): logList;
if (!this.logList.isEmpty()) {
this.endLogId = this.logList.stream().map(elem -> elem.id).reduce(Long::max).get() + 1;
} else {
this.endLogId = startLogId;
}
}
}
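
The `JobMulLogVO` constructor derives `endLogId` as max(log id) + 1 so the next poll can resume exactly where this page ended; an empty page simply echoes the caller's `startLogId`. A short hedged check of that behaviour:

```java
import java.util.Arrays;
import java.util.Collections;
import java.util.Date;

import com.xiaojukeji.kafka.manager.common.entity.vo.rd.job.JobLogVO;
import com.xiaojukeji.kafka.manager.common.entity.vo.rd.job.JobMulLogVO;

// Illustrative only: shows how endLogId is derived.
public class JobMulLogVOExample {
    public static void main(String[] args) {
        JobLogVO a = new JobLogVO(5L, 1, "job-1", new Date(), "start");
        JobLogVO b = new JobLogVO(9L, 1, "job-1", new Date(), "done");

        // non-empty page: endLogId = max(id) + 1 = 10, i.e. the next poll's startLogId
        JobMulLogVO page = new JobMulLogVO(Arrays.asList(a, b), 0L);
        System.out.println(page.getEndLogId()); // 10

        // empty page: endLogId falls back to the startLogId that was passed in
        JobMulLogVO empty = new JobMulLogVO(Collections.emptyList(), 7L);
        System.out.println(empty.getEndLogId()); // 7
    }
}
```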


@@ -0,0 +1,404 @@
package com.xiaojukeji.kafka.manager.common.utils;
import com.alibaba.fastjson.JSON;
import com.alibaba.fastjson.JSONObject;
import com.alibaba.fastjson.TypeReference;
import com.alibaba.fastjson.serializer.SerializerFeature;
import com.google.common.collect.*;
import org.apache.commons.collections.CollectionUtils;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.BeanUtils;
import java.lang.reflect.Field;
import java.lang.reflect.Modifier;
import java.lang.reflect.Type;
import java.util.*;
import java.util.Map.Entry;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Consumer;
import java.util.function.Function;
public class ConvertUtil {
private static final Logger LOGGER = LoggerFactory.getLogger(ConvertUtil.class);
private ConvertUtil(){}
public static <T> T toObj(String json, Type resultType) {
if (resultType instanceof Class) {
Class<T> clazz = (Class<T>) resultType;
return str2ObjByJson(json, clazz);
}
return JSON.parseObject(json, resultType);
}
public static <T> T str2ObjByJson(String srcStr, Class<T> tgtClass) {
return JSON.parseObject(srcStr, tgtClass);
}
public static <T> T str2ObjByJson(String srcStr, TypeReference<T> tt) {
return JSON.parseObject(srcStr, tt);
}
public static String obj2Json(Object srcObj) {
if (srcObj == null) {
return null;
}
if (srcObj instanceof String) {
return (String) srcObj;
} else {
return JSON.toJSONString(srcObj);
}
}
public static String obj2JsonWithIgnoreCircularReferenceDetect(Object srcObj) {
return JSON.toJSONString(srcObj, SerializerFeature.DisableCircularReferenceDetect);
}
public static <T> List<T> str2ObjArrayByJson(String srcStr, Class<T> tgtClass) {
return JSON.parseArray(srcStr, tgtClass);
}
public static <T> T obj2ObjByJSON(Object srcObj, Class<T> tgtClass) {
return JSON.parseObject( JSON.toJSONString(srcObj), tgtClass);
}
public static String list2String(List<?> list, String separator) {
if (list == null || list.isEmpty()) {
return "";
}
StringBuilder sb = new StringBuilder();
for (Object item : list) {
sb.append(item).append(separator);
}
return sb.deleteCharAt(sb.length() - 1).toString();
}
public static <K, V> Map<K, V> list2Map(List<V> list, Function<? super V, ? extends K> mapper) {
Map<K, V> map = Maps.newHashMap();
if (CollectionUtils.isNotEmpty(list)) {
for (V v : list) {
map.put(mapper.apply(v), v);
}
}
return map;
}
public static <K, V> Map<K, V> list2MapParallel(List<V> list, Function<? super V, ? extends K> mapper) {
Map<K, V> map = new ConcurrentHashMap<>();
if (CollectionUtils.isNotEmpty(list)) {
list.parallelStream().forEach(v -> map.put(mapper.apply(v), v));
}
return map;
}
public static <K, V, O> Map<K, V> list2Map(List<O> list, Function<? super O, ? extends K> keyMapper,
Function<? super O, ? extends V> valueMapper) {
Map<K, V> map = Maps.newHashMap();
if (CollectionUtils.isNotEmpty(list)) {
for (O o : list) {
map.put(keyMapper.apply(o), valueMapper.apply(o));
}
}
return map;
}
public static <K, V> Multimap<K, V> list2MulMap(List<V> list, Function<? super V, ? extends K> mapper) {
Multimap<K, V> multimap = ArrayListMultimap.create();
if (CollectionUtils.isNotEmpty(list)) {
for (V v : list) {
multimap.put(mapper.apply(v), v);
}
}
return multimap;
}
public static <K, V, O> Multimap<K, V> list2MulMap(List<O> list, Function<? super O, ? extends K> keyMapper,
Function<? super O, ? extends V> valueMapper) {
Multimap<K, V> multimap = ArrayListMultimap.create();
if (CollectionUtils.isNotEmpty(list)) {
for (O o : list) {
multimap.put(keyMapper.apply(o), valueMapper.apply(o));
}
}
return multimap;
}
public static <K, V, O> Map<K, List<V>> list2MapOfList(List<O> list, Function<? super O, ? extends K> keyMapper,
Function<? super O, ? extends V> valueMapper) {
ArrayListMultimap<K, V> multimap = ArrayListMultimap.create();
if (CollectionUtils.isNotEmpty(list)) {
for (O o : list) {
multimap.put(keyMapper.apply(o), valueMapper.apply(o));
}
}
return Multimaps.asMap(multimap);
}
public static <K, V> Set<K> list2Set(List<V> list, Function<? super V, ? extends K> mapper) {
Set<K> set = Sets.newHashSet();
if (CollectionUtils.isNotEmpty(list)) {
for (V v : list) {
set.add(mapper.apply(v));
}
}
return set;
}
public static <T> Set<T> set2Set(Set<? extends Object> set, Class<T> tClass) {
if (CollectionUtils.isEmpty(set)) {
return new HashSet<>();
}
Set<T> result = new HashSet<>();
for (Object o : set) {
T t = obj2Obj(o, tClass);
if (t != null) {
result.add(t);
}
}
return result;
}
public static <T> List<T> list2List(List<? extends Object> list, Class<T> tClass) {
return list2List(list, tClass, (t) -> {
});
}
public static <T> List<T> list2List(List<? extends Object> list, Class<T> tClass, Consumer<T> consumer) {
if (CollectionUtils.isEmpty(list)) {
return Lists.newArrayList();
}
List<T> result = Lists.newArrayList();
for (Object object : list) {
T t = obj2Obj(object, tClass, consumer);
if (t != null) {
result.add(t);
}
}
return result;
}
/**
* 对象转换工具
* @param srcObj 元对象
* @param tgtClass 目标对象类
* @param <T> 泛型
* @return 目标对象
*/
public static <T> T obj2Obj(final Object srcObj, Class<T> tgtClass) {
return obj2Obj(srcObj, tgtClass, (t) -> {
});
}
public static <T> T obj2Obj(final Object srcObj, Class<T> tgtClass, Consumer<T> consumer) {
if (srcObj == null) {
return null;
}
T tgt = null;
try {
tgt = tgtClass.newInstance();
BeanUtils.copyProperties(srcObj, tgt);
consumer.accept(tgt);
} catch (Exception e) {
LOGGER.warn("class=ConvertUtil||method=obj2Obj||msg={}", e.getMessage());
}
return tgt;
}
public static <K, V> Map<K, V> mergeMapList(List<Map<K, V>> mapList) {
Map<K, V> result = Maps.newHashMap();
for (Map<K, V> map : mapList) {
result.putAll(map);
}
return result;
}
public static Map<String, Object> Obj2Map(Object obj) {
if (null == obj) {
return null;
}
Map<String, Object> map = new HashMap<>();
Field[] fields = obj.getClass().getDeclaredFields();
for (Field field : fields) {
field.setAccessible(true);
try {
map.put(field.getName(), field.get(obj));
} catch (IllegalAccessException e) {
LOGGER.warn("class=ConvertUtil||method=Obj2Map||msg={}", e.getMessage(), e);
}
}
return map;
}
public static Object map2Obj(Map<String, Object> map, Class<?> clz) {
Object obj = null;
try {
obj = clz.newInstance();
Field[] declaredFields = obj.getClass().getDeclaredFields();
for (Field field : declaredFields) {
int mod = field.getModifiers();
if (Modifier.isStatic(mod) || Modifier.isFinal(mod)) {
continue;
}
field.setAccessible(true);
field.set(obj, map.get(field.getName()));
}
} catch (Exception e) {
LOGGER.warn("class=ConvertUtil||method=map2Obj||msg={}", e.getMessage(), e);
}
return obj;
}
public static Map<String, Double> sortMapByValue(Map<String, Double> map) {
List<Entry<String, Double>> data = new ArrayList<>(map.entrySet());
data.sort((o1, o2) -> {
if ((o2.getValue() - o1.getValue()) > 0) {
return 1;
} else if ((o2.getValue() - o1.getValue()) == 0) {
return 0;
} else {
return -1;
}
});
Map<String, Double> result = Maps.newLinkedHashMap();
for (Entry<String, Double> next : data) {
result.put(next.getKey(), next.getValue());
}
return result;
}
public static Map<String, Object> directFlatObject(JSONObject obj) {
Map<String, Object> ret = new HashMap<>();
if(obj==null) {
return ret;
}
for (Entry<String, Object> entry : obj.entrySet()) {
String key = entry.getKey();
Object o = entry.getValue();
if (o instanceof JSONObject) {
Map<String, Object> m = directFlatObject((JSONObject) o);
for (Entry<String, Object> e : m.entrySet()) {
ret.put(key + "." + e.getKey(), e.getValue());
}
} else {
ret.put(key, o);
}
}
return ret;
}
public static Long string2Long(String s) {
if (ValidateUtils.isNull(s)) {
return null;
}
try {
return Long.parseLong(s);
} catch (Exception e) {
// ignore exception
}
return null;
}
public static Float string2Float(String s) {
if (ValidateUtils.isNull(s)) {
return null;
}
try {
return Float.parseFloat(s);
} catch (Exception e) {
// ignore exception
}
return null;
}
public static String float2String(Float f) {
if (ValidateUtils.isNull(f)) {
return null;
}
try {
return String.valueOf(f);
} catch (Exception e) {
// ignore exception
}
return null;
}
public static Integer string2Integer(String s) {
if (null == s) {
return null;
}
try {
return Integer.parseInt(s);
} catch (Exception e) {
// ignore exception
}
return null;
}
public static Double string2Double(String s) {
if (null == s) {
return null;
}
try {
return Double.parseDouble(s);
} catch (Exception e) {
// ignore exception
}
return null;
}
public static Long double2Long(Double d) {
if (null == d) {
return null;
}
try {
return d.longValue();
} catch (Exception e) {
// ignore exception
}
return null;
}
public static Integer double2Int(Double d) {
if (null == d) {
return null;
}
try {
return d.intValue();
} catch (Exception e) {
// ignore exception
}
return null;
}
public static Long Float2Long(Float f) {
if (null == f) {
return null;
}
try {
return f.longValue();
} catch (Exception e) {
// ignore exception
}
return null;
}
}
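
`ConvertUtil` centralises the fastjson/Guava-based helpers the HA code relies on: JSON ↔ object, list → map, lenient bean copy, and null-safe string parsing. A minimal usage sketch, assuming a trivial bean defined just for the example:

```java
import java.util.Arrays;
import java.util.Map;

import com.xiaojukeji.kafka.manager.common.utils.ConvertUtil;

// Illustrative only; TopicDTO is a throwaway bean created for this sketch.
public class ConvertUtilExample {
    public static class TopicDTO {
        private String topicName;
        private Integer partitionNum;
        public String getTopicName() { return topicName; }
        public void setTopicName(String topicName) { this.topicName = topicName; }
        public Integer getPartitionNum() { return partitionNum; }
        public void setPartitionNum(Integer partitionNum) { this.partitionNum = partitionNum; }
    }

    public static void main(String[] args) {
        // JSON string -> object, and back again
        TopicDTO dto = ConvertUtil.str2ObjByJson("{\"topicName\":\"order\",\"partitionNum\":3}", TopicDTO.class);
        String json = ConvertUtil.obj2Json(dto);

        // list -> map keyed by a field (later duplicates overwrite earlier ones)
        Map<String, TopicDTO> byName = ConvertUtil.list2Map(Arrays.asList(dto), TopicDTO::getTopicName);

        // property-by-name bean copy; copy failures are logged and swallowed
        TopicDTO copy = ConvertUtil.obj2Obj(dto, TopicDTO.class);

        // lenient parsing: returns null instead of throwing on bad input
        Long ok  = ConvertUtil.string2Long("42");    // 42
        Long bad = ConvertUtil.string2Long("oops");  // null

        System.out.println(json + " / " + byName.keySet() + " / " + copy.getTopicName() + " / " + ok + " / " + bad);
    }
}
```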


@@ -15,6 +15,7 @@ import java.util.concurrent.ConcurrentHashMap;
* @author huangyiminghappy@163.com
* @date 2019/3/15
*/
@Deprecated
public class CopyUtils {
@SuppressWarnings({"unchecked", "rawtypes"})


@@ -40,6 +40,14 @@ public class FutureUtil<T> {
return futureUtil;
}
public Future<T> directSubmitTask(Callable<T> callable) {
return executor.submit(callable);
}
public Future<T> directSubmitTask(Runnable runnable) {
return (Future<T>) executor.submit(runnable);
}
/**
* 必须配合 waitExecute使用 否则容易会撑爆内存
*/


@@ -8,6 +8,8 @@ package com.xiaojukeji.kafka.manager.common.zookeeper;
public class ZkPathUtil {
private static final String ZOOKEEPER_SEPARATOR = "/";
public static final String CLUSTER_ID_NODE = ZOOKEEPER_SEPARATOR + "cluster/id";
public static final String BROKER_ROOT_NODE = ZOOKEEPER_SEPARATOR + "brokers";
public static final String CONTROLLER_ROOT_NODE = ZOOKEEPER_SEPARATOR + "controller";
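
The two constants added here point at the standard Kafka ZooKeeper roots: broker registrations live under `/brokers` (for example `/brokers/ids/<brokerId>`), and the active controller is published at `/controller`. A tiny illustrative sketch of resolving them to concrete paths:

```java
import com.xiaojukeji.kafka.manager.common.zookeeper.ZkPathUtil;

// Illustrative only -- prints the ZooKeeper paths the new constants refer to.
public class ZkPathExample {
    public static void main(String[] args) {
        System.out.println(ZkPathUtil.BROKER_ROOT_NODE);            // "/brokers"
        System.out.println(ZkPathUtil.CONTROLLER_ROOT_NODE);        // "/controller"
        System.out.println(ZkPathUtil.BROKER_ROOT_NODE + "/ids/1"); // a broker's registration node
    }
}
```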


@@ -1,6 +1,6 @@
{
"name": "logi-kafka",
"version": "2.6.1",
"version": "2.8.0",
"description": "",
"scripts": {
"prestart": "npm install --save-dev webpack-dev-server",
@@ -16,8 +16,10 @@
"@hot-loader/react-dom": "^16.8.6",
"@types/events": "^3.0.0",
"@types/lodash.debounce": "^4.0.6",
"@types/node": "18.7.13",
"@types/react": "^16.8.8",
"@types/react-dom": "^16.8.2",
"@types/react-router": "4.4.5",
"@types/react-router-dom": "^4.3.1",
"@types/spark-md5": "^3.0.2",
"@webpack-cli/serve": "^1.6.0",


@@ -8,7 +8,7 @@ export class XFormWrapper extends React.Component<IXFormWrapper> {
public state = {
confirmLoading: false,
formMap: this.props.formMap || [] as any,
formData: this.props.formData || {}
formData: this.props.formData || {},
};
private $formRef: any;
@@ -121,7 +121,8 @@ export class XFormWrapper extends React.Component<IXFormWrapper> {
this.closeModalWrapper();
}).catch((err: any) => {
const { formMap, formData } = wrapper.xFormWrapper;
onSubmitFaild(err, this.$formRef, formData, formMap);
// tslint:disable-next-line:no-unused-expression
onSubmitFaild && onSubmitFaild(err, this.$formRef, formData, formMap);
}).finally(() => {
this.setState({
confirmLoading: false,


@@ -1,4 +1,5 @@
.ant-input-number, .ant-form-item-children .ant-select {
.ant-input-number,
.ant-form-item-children .ant-select {
width: 314px
}
@@ -9,3 +10,35 @@
margin-right: 16px;
}
}
.x-form {
.ant-form-item-label {
line-height: 32px;
}
.ant-form-item-control {
line-height: 32px;
}
}
.prompt-info {
color: #ccc;
font-size: 12px;
line-height: 20px;
display: block;
&.inline {
margin-left: 16px;
display: inline-block;
font-family: PingFangSC-Regular;
font-size: 12px;
color: #042866;
letter-spacing: 0;
text-align: justify;
.anticon {
margin-right: 6px;
}
}
}


@@ -85,6 +85,10 @@ class XForm extends React.Component<IXFormProps> {
initialValue = false;
}
if (formItem.type === FormItemType.select) {
initialValue = initialValue || undefined;
}
// if (formItem.type === FormItemType.select && formItem.attrs
// && ['tags'].includes(formItem.attrs.mode)) {
// initialValue = formItem.defaultValue ? [formItem.defaultValue] : [];
@@ -105,7 +109,7 @@ class XForm extends React.Component<IXFormProps> {
const { form, formData, formMap, formLayout, layout } = this.props;
const { getFieldDecorator } = form;
return (
<Form layout={layout || 'horizontal'} onSubmit={() => ({})}>
<Form className="x-form" layout={layout || 'horizontal'} onSubmit={() => ({})}>
{formMap.map(formItem => {
const { initialValue, valuePropName } = this.handleFormItem(formItem, formData);
const getFieldValue = {
@@ -131,7 +135,13 @@ class XForm extends React.Component<IXFormProps> {
)}
{formItem.renderExtraElement ? formItem.renderExtraElement() : null}
{/* 添加保存时间提示文案 */}
{formItem.attrs?.prompttype ? <span style={{ color: "#cccccc", fontSize: '12px', lineHeight: '20px', display: 'block' }}>{formItem.attrs.prompttype}</span> : null}
{formItem.attrs?.prompttype ?
<span className={`prompt-info ${formItem.attrs?.promptclass || ''}`}>
{formItem.attrs?.prompticon ?
<Icon type="info-circle" theme="twoTone" twoToneColor="#0A70F5" className={formItem.attrs?.prompticomclass} /> : null}
{formItem.attrs.prompttype}
</span>
: null}
</Form.Item>
);
})}


@@ -30,7 +30,7 @@ export class ClusterOverview extends React.Component<IOverview> {
const content = this.props.basicInfo as IMetaData;
const gmtCreate = moment(content.gmtCreate).format(timeFormat);
const clusterContent = [{
value: content.clusterName,
value: `${content.clusterName}${content.haRelation === 0 ? '(备)' : content.haRelation === 1 ? '(主)' : content.haRelation === 2 ? '(主&备)' : ''}`,
label: '集群名称',
},
// {
@@ -50,6 +50,9 @@ export class ClusterOverview extends React.Component<IOverview> {
}, {
value: content.zookeeper,
label: 'Zookeeper',
}, {
value: `${content.mutualBackupClusterName || '-'}${content.haRelation === 0 ? '(主)' : content.haRelation === 1 ? '(备)' : content.haRelation === 2 ? '(主&备)' : ''}`,
label: '互备集群',
}];
return (
<>
@@ -64,18 +67,18 @@ export class ClusterOverview extends React.Component<IOverview> {
</Descriptions.Item>
))}
{clusterInfo.map((item: ILabelValue, index: number) => (
<Descriptions.Item key={index} label={item.label}>
<Tooltip placement="bottomLeft" title={item.value}>
<span className="overview-bootstrap">
<Icon
onClick={() => copyString(item.value)}
type="copy"
className="didi-theme overview-theme"
/>
<i className="overview-boot">{item.value}</i>
</span>
</Tooltip>
</Descriptions.Item>
<Descriptions.Item key={index} label={item.label}>
<Tooltip placement="bottomLeft" title={item.value}>
<span className="overview-bootstrap">
<Icon
onClick={() => copyString(item.value)}
type="copy"
className="didi-theme overview-theme"
/>
<i className="overview-boot">{item.value}</i>
</span>
</Tooltip>
</Descriptions.Item>
))}
</Descriptions>
</PageHeader>


@@ -118,10 +118,10 @@ export class ClusterTopic extends SearchAndFilterContainer {
public renderClusterTopicList() {
const clusterColumns = [
{
title: 'Topic名称',
title: `Topic名称`,
dataIndex: 'topicName',
key: 'topicName',
width: '120px',
width: '140px',
sorter: (a: IClusterTopics, b: IClusterTopics) => a.topicName.charCodeAt(0) - b.topicName.charCodeAt(0),
render: (text: string, record: IClusterTopics) => {
return (
@@ -130,7 +130,7 @@ export class ClusterTopic extends SearchAndFilterContainer {
// tslint:disable-next-line:max-line-length
href={`${urlPrefix}/topic/topic-detail?clusterId=${record.clusterId || ''}&topic=${record.topicName || ''}&isPhysicalClusterId=true&region=${region.currentRegion}`}
>
{text}
{text}{record.haRelation === 0 ? '(备)' : record.haRelation === 1 ? '(主)' : record.haRelation === 2 ? '(主&备)' : ''}
</a>
</Tooltip>);
},
@@ -208,23 +208,27 @@ export class ClusterTopic extends SearchAndFilterContainer {
{
title: '操作',
width: '120px',
render: (value: string, item: IClusterTopics) => (
<>
<a onClick={() => this.getBaseInfo(item)} className="action-button"></a>
<a onClick={() => this.expandPartition(item)} className="action-button"></a>
{/* <a onClick={() => this.expandPartition(item)} className="action-button">删除</a> */}
<Popconfirm
title="确定删除?"
// 运维管控集群列表Topic列表修改删除业务逻辑
onConfirm={() => this.confirmDetailTopic(item)}
// onConfirm={() => this.deleteTopic(item)}
cancelText="取消"
okText="确认"
>
<a></a>
</Popconfirm>
</>
),
render: (value: string, item: IClusterTopics) => {
if (item.haRelation === 0) return '-';
return (
<>
<a onClick={() => this.getBaseInfo(item)} className="action-button"></a>
<a onClick={() => this.expandPartition(item)} className="action-button"></a>
{/* <a onClick={() => this.expandPartition(item)} className="action-button">删除</a> */}
<Popconfirm
title="确定删除?"
// 运维管控集群列表Topic列表修改删除业务逻辑
onConfirm={() => this.confirmDetailTopic(item)}
// onConfirm={() => this.deleteTopic(item)}
cancelText="取消"
okText="确认"
>
<a></a>
</Popconfirm>
</>
);
},
},
];
if (users.currentUser.role !== 2) {


@@ -73,6 +73,7 @@ export class LogicalCluster extends SearchAndFilterContainer {
key: 'mode',
render: (value: number) => {
let val = '';
// tslint:disable-next-line:no-unused-expression
cluster.clusterModes && cluster.clusterModes.forEach((ele: any) => {
if (value === ele.code) {
val = ele.message;
@@ -206,6 +207,7 @@ export class LogicalCluster extends SearchAndFilterContainer {
}
public render() {
const clusterModes = cluster.clusterModes;
return (
<div className="k-row">
<ul className="k-tab">


@@ -0,0 +1,381 @@
.switch-style {
&.ant-switch {
min-width: 32px;
height: 20px;
line-height: 18px;
::after {
height: 16px;
width: 16px;
}
}
&.ant-switch-loading-icon,
&.ant-switch::after {
height: 16px;
width: 16px;
}
}
.expanded-table {
width: auto ! important;
.ant-table-thead {
// visibility: hidden;
display: none;
}
.ant-table-tbody>tr>td {
background-color: #FAFAFA;
border-bottom: none;
}
}
tr.ant-table-expanded-row td>.expanded-table {
padding: 10px;
// margin: -13px 0px -14px ! important;
border: none;
}
.cluster-tag {
background: #27D687;
border-radius: 2px;
font-family: PingFangSC-Medium;
color: #FFFFFF;
letter-spacing: 0;
text-align: justify;
-webkit-transform: scale(0.5);
margin-right: 0px;
}
.no-padding {
.ant-modal-body {
padding: 0;
.attribute-content {
.tag-gray {
font-family: PingFangSC-Regular;
font-size: 12px;
color: #575757;
text-align: center;
line-height: 18px;
padding: 0 4px;
margin: 3px;
height: 20px;
background: #EEEEEE;
border-radius: 5px;
}
.icon {
zoom: 0.8;
}
.tag-num {
font-family: PingFangSC-Medium;
text-align: right;
line-height: 13px;
margin-left: 6px;
transform: scale(0.8333);
}
}
.attribute-tag {
.ant-popover-inner-content {
padding: 12px;
max-width: 480px;
}
.ant-popover-arrow {
display: none;
}
.ant-popover-placement-bottom,
.ant-popover-placement-bottomLeft,
.ant-popover-placement-bottomRight {
top: 23px !important;
border-radius: 2px;
}
.tag-gray {
font-family: PingFangSC-Regular;
font-size: 12px;
color: #575757;
text-align: center;
line-height: 12px;
padding: 0 4px;
margin: 3px;
height: 20px;
background: #EEEEEE;
border-radius: 5px;
}
}
.col-status {
font-family: PingFangSC-Regular;
font-size: 12px;
letter-spacing: 0;
text-align: justify;
&.green {
.ant-badge-status-text {
color: #2FC25B;
}
}
&.black {
.ant-badge-status-text {
color: #575757;
}
}
&.red {
.ant-badge-status-text {
color: #F5202E;
}
}
}
.ant-alert-message {
font-family: PingFangSC-Regular;
font-size: 12px;
letter-spacing: 0;
text-align: justify;
}
.ant-alert-warning {
border: none;
color: #592D00;
padding: 7px 15px 7px 41px;
background: #FFFAE0;
.ant-alert-message {
color: #592D00
}
}
.ant-alert-info {
border: none;
padding: 7px 15px 7px 41px;
color: #042866;
background: #EFF8FF;
.ant-alert-message {
color: #042866;
}
}
.ant-alert-icon {
left: 24px;
top: 10px;
}
.switch-warning {
.btn {
position: absolute;
top: 60px;
right: 24px;
height: 22px;
width: 64px;
padding: 0px;
&.disabled {
top: 77px;
}
button {
height: 22px;
width: 64px;
padding: 0px;
}
&.loading {
width: 80px;
button {
height: 22px;
width: 88px;
padding: 0px 0px 0px 12px;
}
}
}
}
.modal-table-content {
padding: 0px 24px 16px;
.ant-table-small {
border: none;
border-top: 1px solid #e8e8e8;
.ant-table-thead {
background: #FAFAFA;
}
}
}
.modal-table-download {
height: 40px;
line-height: 40px;
text-align: center;
border-top: 1px solid #e8e8e8;
}
.ant-form {
padding: 18px 24px 0px;
.ant-col-3 {
width: 9.5%;
}
.ant-form-item-label {
text-align: left;
}
.no-label {
.ant-col-21 {
width: 100%;
}
.transfe-list {
.ant-transfer-list {
height: 359px;
}
}
.ant-transfer-list {
width: 249px;
border: 1px solid #E8E8E8;
border-radius: 8px;
.ant-transfer-list-header-title {
font-family: PingFangSC-Regular;
font-size: 12px;
color: #252525;
letter-spacing: 0;
text-align: right;
}
.ant-transfer-list-body-search-wrapper {
padding: 19px 16px 6px;
input {
height: 27px;
background: #FAFAFA;
border-radius: 8px;
border: none;
}
.ant-transfer-list-search-action {
line-height: 27px;
height: 27px;
top: 19px;
}
}
}
.ant-transfer-list-header {
border-radius: 8px 8px 0px 0px;
padding: 16px;
}
}
.ant-transfer-customize-list .ant-transfer-list-body-customize-wrapper {
padding: 0px;
margin: 0px 16px;
background: #FAFAFA;
border-radius: 8px;
.ant-table-header-column {
font-family: PingFangSC-Regular;
font-size: 12px;
color: #575757;
letter-spacing: 0;
text-align: justify;
}
.ant-table-thead>tr {
border: none;
background: #FAFAFA;
}
.ant-table-tbody>tr>td {
border: none;
background: #FAFAFA;
}
.ant-table-body {
background: #FAFAFA;
}
}
.ant-table-selection-column {
.ant-table-header-column {
opacity: 0;
}
}
}
.log-process {
height: 56px;
background: #FAFAFA;
padding: 6px 8px;
margin-bottom: 15px;
.name {
display: flex;
color: #575757;
justify-content: space-between;
}
}
.log-panel {
padding: 24px;
font-family: PingFangSC-Regular;
font-size: 12px;
.title {
color: #252525;
letter-spacing: 0;
text-align: justify;
margin-bottom: 15px;
.divider {
display: inline-block;
border-left: 2px solid #F38031;
height: 9px;
margin-right: 6px;
}
}
.log-info {
color: #575757;
letter-spacing: 0;
text-align: justify;
margin-bottom: 10px;
.text-num {
font-size: 14px;
}
.warning-num {
color: #F38031;
font-size: 14px;
}
}
.log-table {
margin-bottom: 24px;
.ant-table-small {
border: none;
border-top: 1px solid #e8e8e8;
.ant-table-thead {
background: #FAFAFA;
}
}
}
}
}
}


@@ -1,8 +1,8 @@
import * as React from 'react';
import { Modal, Table, Button, notification, message, Tooltip, Icon, Popconfirm, Alert, Popover } from 'component/antd';
import { Modal, Table, Button, notification, message, Tooltip, Icon, Popconfirm, Alert, Dropdown } from 'component/antd';
import { wrapper } from 'store';
import { observer } from 'mobx-react';
import { IXFormWrapper, IMetaData, IRegister } from 'types/base-type';
import { IXFormWrapper, IMetaData, IRegister, ILabelValue } from 'types/base-type';
import { admin } from 'store/admin';
import { users } from 'store/users';
import { registerCluster, createCluster, pauseMonitoring } from 'lib/api';
@@ -10,11 +10,14 @@ import { SearchAndFilterContainer } from 'container/search-filter';
import { cluster } from 'store/cluster';
import { customPagination } from 'constants/table';
import { urlPrefix } from 'constants/left-menu';
import { indexUrl } from 'constants/strategy'
import { indexUrl } from 'constants/strategy';
import { region } from 'store';
import './index.less';
import Monacoeditor from 'component/editor/monacoEditor';
import { getAdminClusterColumns } from '../config';
import { FormItemType } from 'component/x-form';
import { TopicHaRelationWrapper } from 'container/modal/admin/TopicHaRelation';
import { TopicSwitchWrapper } from 'container/modal/admin/TopicHaSwitch';
import { TopicSwitchLog } from 'container/modal/admin/SwitchTaskLog';
const { confirm } = Modal;
@@ -22,6 +25,10 @@ const { confirm } = Modal;
export class ClusterList extends SearchAndFilterContainer {
public state = {
searchKey: '',
haVisible: false,
switchVisible: false,
logVisible: false,
currentCluster: {} as IMetaData,
};
private xFormModal: IXFormWrapper;
@@ -36,7 +43,26 @@ export class ClusterList extends SearchAndFilterContainer {
);
}
public updateFormModal(value: boolean, metaList: ILabelValue[]) {
const formMap = wrapper.xFormWrapper.formMap;
formMap[1].attrs.prompttype = !value ? '' : metaList.length ? '已设置为高可用集群,请选择所关联的主集群' : '当前暂无可用集群进行关联高可用关系,请先添加集群';
formMap[1].attrs.prompticon = 'true';
formMap[2].invisible = !value;
formMap[2].attrs.disabled = !metaList.length;
formMap[6].rules[0].required = value;
// tslint:disable-next-line:no-unused-expression
wrapper.ref && wrapper.ref.updateFormMap$(formMap, wrapper.xFormWrapper.formData);
}
public createOrRegisterCluster(item: IMetaData) {
const self = this;
const metaList = Array.from(admin.metaList).filter(item => item.haRelation === null).map(item => ({
label: item.clusterName,
value: item.clusterId,
}));
this.xFormModal = {
formMap: [
{
@@ -51,6 +77,38 @@ export class ClusterList extends SearchAndFilterContainer {
disabled: item ? true : false,
},
},
{
key: 'ha',
label: '高可用',
type: FormItemType._switch,
invisible: item ? true : false,
rules: [{
required: false,
}],
attrs: {
className: 'switch-style',
prompttype: '',
prompticon: '',
prompticomclass: '',
promptclass: 'inline',
onChange(value: boolean) {
self.updateFormModal(value, metaList);
},
},
},
{
key: 'activeClusterId',
label: '主集群',
type: FormItemType.select,
options: metaList,
invisible: true,
rules: [{
required: false,
}],
attrs: {
placeholder: '请选择主集群',
},
},
{
key: 'zookeeper',
label: 'zookeeper地址',
@@ -162,17 +220,18 @@ export class ClusterList extends SearchAndFilterContainer {
visible: true,
width: 590,
title: item ? '编辑' : '接入集群',
isWaitting: true,
onSubmit: (value: IRegister) => {
value.idc = region.currentRegion;
if (item) {
value.clusterId = item.clusterId;
registerCluster(value).then(data => {
admin.getMetaData(true);
return registerCluster(value).then(data => {
admin.getHaMetaData();
notification.success({ message: '编辑集群成功' });
});
} else {
createCluster(value).then(data => {
admin.getMetaData(true);
return createCluster(value).then(data => {
admin.getHaMetaData();
notification.success({ message: '接入集群成功' });
});
}
@@ -186,7 +245,7 @@ export class ClusterList extends SearchAndFilterContainer {
const info = item.status === 1 ? '暂停监控' : '开始监控';
const status = item.status === 1 ? 0 : 1;
pauseMonitoring(item.clusterId, status).then(data => {
admin.getMetaData(true);
admin.getHaMetaData();
notification.success({ message: `${info}成功` });
});
}
@@ -198,7 +257,7 @@ export class ClusterList extends SearchAndFilterContainer {
title: <>
<span className="offline_span">
&nbsp;
<a>
<a>
<Tooltip placement="right" title={'若当前集群存在逻辑集群,则无法删除'} >
<Icon type="question-circle" />
</Tooltip>
@@ -216,12 +275,34 @@ export class ClusterList extends SearchAndFilterContainer {
}
admin.deleteCluster(record.clusterId).then(data => {
notification.success({ message: '删除成功' });
admin.getHaMetaData();
});
},
});
});
}
public showDelStandModal = (record: IMetaData) => {
confirm({
// tslint:disable-next-line:jsx-wrap-multiline
title: '删除集群',
// icon: 'none',
content: <>{record.activeTopicCount ? `当前集群含有主topic无法删除` : record.haStatus !== 0 ? `当前集群正在进行主备切换,无法删除!` : `确认删除集群${record.clusterName}吗?`}</>,
width: 500,
okText: '确认',
cancelText: '取消',
onOk() {
if (record.activeTopicCount || record.haStatus !== 0) {
return;
}
admin.deleteCluster(record.clusterId).then(data => {
notification.success({ message: '删除成功' });
admin.getHaMetaData();
});
},
});
}
public deleteMonitorModal = (source: any) => {
const cellStyle = {
overflow: 'hidden',
@@ -275,11 +356,105 @@ export class ClusterList extends SearchAndFilterContainer {
return data;
}
public expandedRowRender = (record: IMetaData) => {
const dataSource: any = record.haClusterVO ? [record.haClusterVO] : [];
const cols = getAdminClusterColumns(false);
const role = users.currentUser.role;
if (!record.haClusterVO) return null;
const haRecord = record.haClusterVO;
const btnsMenu = (
<>
<ul className="dropdown-menu">
<li>
<a onClick={this.createOrRegisterCluster.bind(this, haRecord)} className="action-button">
</a>
</li>
<li>
<Popconfirm
title={`确定${haRecord.status === 1 ? '暂停' : '开始'}${haRecord.clusterName}监控?`}
onConfirm={() => this.pauseMonitor(haRecord)}
cancelText="取消"
okText="确认"
>
<Tooltip placement="left" title="暂停监控将无法正常监控指标信息,建议开启监控">
<a
className="action-button"
>
{haRecord.status === 1 ? '暂停监控' : '开始监控'}
</a>
</Tooltip>
</Popconfirm>
</li>
<li>
<a onClick={this.showDelStandModal.bind(this, haRecord)}>
</a>
</li>
</ul>
</>);
const noAuthMenu = (
<ul className="dropdown-menu">
<Tooltip placement="left" title="该功能只对运维人员开放">
<li><a style={{ color: '#a0a0a0' }} className="action-button"></a></li>
<li><a className="action-button" style={{ color: '#a0a0a0' }}>{record.status === 1 ? '暂停监控' : '开始监控'}</a></li>
<li><a style={{ color: '#a0a0a0' }}></a></li>
</Tooltip>
</ul>
);
const col = {
title: '操作',
width: 270,
render: (value: string, item: IMetaData) => (
<>
<a
onClick={this.openModal.bind(this, 'haVisible', record)}
className="action-button"
>
Topic高可用关联
</a>
{item.haStatus !== 0 ? null : <a onClick={this.openModal.bind(this, 'switchVisible', record)} className="action-button">
Topic主备切换
</a>}
{item.haASSwitchJobId ? <a className="action-button" onClick={this.openModal.bind(this, 'logVisible', record)}>
</a> : null}
<Dropdown
overlay={role === 2 ? btnsMenu : noAuthMenu}
trigger={['click', 'hover']}
placement="bottomLeft"
>
<span className="didi-theme ml-10">
···
</span>
</Dropdown>
</>
),
};
cols.push(col as any);
return (
<Table
className="expanded-table"
rowKey="clusterId"
style={{ width: '500px' }}
columns={cols}
dataSource={dataSource}
pagination={false}
/>
);
}
public getColumns = () => {
const cols = getAdminClusterColumns();
const role = users.currentUser.role;
const col = {
title: '操作',
width: 270,
render: (value: string, item: IMetaData) => (
<>
{
@@ -307,10 +482,10 @@ export class ClusterList extends SearchAndFilterContainer {
</a>
</> : <Tooltip placement="left" title="该功能只对运维人员开放">
<a style={{ color: '#a0a0a0' }} className="action-button"></a>
<a className="action-button" style={{ color: '#a0a0a0' }}>{item.status === 1 ? '暂停监控' : '开始监控'}</a>
<a style={{ color: '#a0a0a0' }}></a>
</Tooltip>
<a style={{ color: '#a0a0a0' }} className="action-button"></a>
<a className="action-button" style={{ color: '#a0a0a0' }}>{item.status === 1 ? '暂停监控' : '开始监控'}</a>
<a style={{ color: '#a0a0a0' }}></a>
</Tooltip>
}
</>
),
@@ -319,6 +494,20 @@ export class ClusterList extends SearchAndFilterContainer {
return cols;
}
public openModal(type: string, record: IMetaData) {
this.setState({
currentCluster: record,
}, () => {
this.handleVisible(type, true);
});
}
public handleVisible(type: string, visible: boolean) {
this.setState({
[type]: visible,
});
}
public renderClusterList() {
const role = users.currentUser.role;
return (
@@ -333,8 +522,8 @@ export class ClusterList extends SearchAndFilterContainer {
role && role === 2 ?
<Button type="primary" onClick={this.createOrRegisterCluster.bind(this, null)}></Button>
:
<Tooltip placement="left" title="该功能只对运维人员开放" trigger='hover'>
<Button disabled type="primary"></Button>
<Tooltip placement="left" title="该功能只对运维人员开放" trigger="hover">
<Button disabled={true} type="primary"></Button>
</Tooltip>
}
</li>
@@ -343,26 +532,63 @@ export class ClusterList extends SearchAndFilterContainer {
<div className="table-wrapper">
<Table
rowKey="key"
expandIcon={({ expanded, onExpand, record }) => (
record.haClusterVO ?
<Icon style={{ fontSize: 10 }} type={expanded ? 'down' : 'right'} onClick={e => onExpand(record, e)} />
: null
)}
loading={admin.loading}
dataSource={this.getData(admin.metaList)}
expandedRowRender={this.expandedRowRender}
dataSource={this.getData(admin.haMetaList)}
columns={this.getColumns()}
pagination={customPagination}
/>
</div>
</div>
{this.state.haVisible && <TopicHaRelationWrapper
handleVisible={(val: boolean) => this.handleVisible('haVisible', val)}
visible={this.state.haVisible}
currentCluster={this.state.currentCluster}
reload={() => admin.getHaMetaData()}
formData={{}}
/>}
{this.state.switchVisible &&
<TopicSwitchWrapper
reload={(jobId: number) => {
admin.getHaMetaData().then((res) => {
const currentRecord = res.find(item => item.clusterId === this.state.currentCluster.clusterId);
currentRecord.haClusterVO.haASSwitchJobId = jobId;
this.openModal('logVisible', currentRecord);
});
}}
handleVisible={(val: boolean) => this.handleVisible('switchVisible', val)}
visible={this.state.switchVisible}
currentCluster={this.state.currentCluster}
formData={{}}
/>
}
{this.state.logVisible &&
<TopicSwitchLog
reload={() => admin.getHaMetaData()}
handleVisible={(val: boolean) => this.handleVisible('logVisible', val)}
visible={this.state.logVisible}
currentCluster={this.state.currentCluster}
/>
}
</>
);
}
public componentDidMount() {
admin.getMetaData(true);
admin.getHaMetaData();
cluster.getClusterModes();
admin.getDataCenter();
}
public render() {
return (
admin.metaList ? <> {this.renderClusterList()} </> : null
admin.haMetaList ? <> {this.renderClusterList()} </> : null
);
}
}
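
`showDelStandModal` refuses to delete a standby cluster that still hosts active topics or that has a switch in flight (`haStatus !== 0`), and only then asks for confirmation. A hedged, server-side-style restatement of the same guard; the class and method names below are assumptions, not part of this change:

```java
// Illustrative only -- restates the guard in showDelStandModal.
public class HaClusterDeleteGuard {
    public static String checkDeletable(long activeTopicCount, int haStatus, String clusterName) {
        if (activeTopicCount > 0) {
            return "当前集群含有主topic无法删除";           // still hosts active topics
        }
        if (haStatus != 0) {
            return "当前集群正在进行主备切换,无法删除!";    // a switch job is in flight
        }
        return "确认删除集群" + clusterName + "吗?";        // safe to ask for confirmation
    }
}
```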


@@ -3,12 +3,13 @@ import { IUser, IUploadFile, IConfigure, IConfigGateway, IMetaData } from 'types
import { users } from 'store/users';
import { version } from 'store/version';
import { showApplyModal, showApplyModalModifyPassword, showModifyModal, showConfigureModal, showConfigGatewayModal } from 'container/modal/admin';
import { Popconfirm, Tooltip } from 'component/antd';
import { Icon, Popconfirm, Tooltip } from 'component/antd';
import { admin } from 'store/admin';
import { cellStyle } from 'constants/table';
import { timeFormat } from 'constants/strategy';
import { urlPrefix } from 'constants/left-menu';
import moment = require('moment');
import { Tag } from 'antd';
export const getUserColumns = () => {
const columns = [
@@ -28,15 +29,15 @@ export const getUserColumns = () => {
<span className="table-operation">
<a onClick={() => showApplyModal(record)}></a>
<a onClick={() => showApplyModalModifyPassword(record)}></a>
{record.username == users.currentUser.username ? "" :
<Popconfirm
title="确定删除?"
onConfirm={() => users.deleteUser(record.username)}
cancelText="取消"
okText="确认"
>
<a></a>
</Popconfirm>
{record.username === users.currentUser.username ? '' :
<Popconfirm
title="确定删除?"
onConfirm={() => users.deleteUser(record.username)}
cancelText="取消"
okText="确认"
>
<a></a>
</Popconfirm>
}
</span>);
},
@@ -271,33 +272,82 @@ export const getConfigColumns = () => {
const renderClusterHref = (value: number | string, item: IMetaData, key: number) => {
return ( // 0 暂停监控--不可点击 1 监控中---可正常点击
<>
{item.status === 1 ? <a href={`${urlPrefix}/admin/cluster-detail?clusterId=${item.clusterId}#${key}`}>{value}</a>
: <a style={{ cursor: 'not-allowed', color: '#999' }}>{value}</a>}
{item.status === 1 ? <a href={`${urlPrefix}/admin/cluster-detail?clusterId=${item.clusterId}#${key}`}>{value}</a> :
<a style={{ cursor: 'not-allowed', color: '#999' }}>{value}</a>}
</>
);
};
export const getAdminClusterColumns = () => {
const renderTopicNum = (value: number | string, item: IMetaData, key: number, active?: boolean) => {
const show = item.haClusterVO || (!item.haClusterVO && !active);
if (!show) {
return ( // 0 暂停监控--不可点击 1 监控中---可正常点击
<>
{item.status === 1 ? <a href={`${urlPrefix}/admin/cluster-detail?clusterId=${item.clusterId}#${key}`}>
{value}
</a> :
<a style={{ cursor: 'not-allowed', color: '#999' }}>
{value}
</a>}
</>
);
}
return ( // 0 暂停监控--不可点击 1 监控中---可正常点击
<>
{item.status === 1 ? <a href={`${urlPrefix}/admin/cluster-detail?clusterId=${item.clusterId}#${key}`}>
{value}
<>{item.activeTopicCount ?? '-'}/{item.standbyTopicCount ?? '-'}</>
</a> :
<a style={{ cursor: 'not-allowed', color: '#999' }}>
{value}
<>{item.activeTopicCount ?? '-'}/{item.standbyTopicCount ?? '-'}</>
</a>}
</>
);
};
const renderClusterName = (value: number | string, item: IMetaData, key: number, active: boolean) => {
const show = item.haClusterVO || (!item.haClusterVO && !active);
return ( // 0 暂停监控--不可点击 1 监控中---可正常点击
<>
{item.status === 1 ?
<a href={`${urlPrefix}/admin/cluster-detail?clusterId=${item.clusterId}`}>{value}</a> :
<a style={{ cursor: 'not-allowed', color: '#999' }}>{value}</a>}
{active ? <>
{item.haClusterVO ? <Tooltip title="高可用集群"><Tag className="cluster-tag">HA</Tag></Tooltip> : null}
{item.haClusterVO && item.haStatus !== 0 ? <Tooltip title="Topic主备切换中"><Icon type="swap" style={{ color: '#27D687' }} /></Tooltip>
: null}
</> : null}
</>
);
};
export const getAdminClusterColumns = (active = true) => {
return [
{
title: '物理集群ID',
dataIndex: 'clusterId',
key: 'clusterId',
sorter: (a: IMetaData, b: IMetaData) => b.clusterId - a.clusterId,
sorter: (a: IMetaData, b: IMetaData) => a.clusterId - b.clusterId,
width: active ? 115 : 111,
render: (text: number) => active ? text : `${text ?? 0}`,
},
{
title: '物理集群名称',
dataIndex: 'clusterName',
key: 'clusterName',
sorter: (a: IMetaData, b: IMetaData) => a.clusterName.charCodeAt(0) - b.clusterName.charCodeAt(0),
render: (text: string, item: IMetaData) => renderClusterHref(text, item, 1),
render: (text: string, item: IMetaData) => renderClusterName(text, item, 1, active),
width: 235,
},
{
title: 'Topic数',
dataIndex: 'topicNum',
key: 'topicNum',
sorter: (a: any, b: IMetaData) => b.topicNum - a.topicNum,
render: (text: number, item: IMetaData) => renderClusterHref(text, item, 2),
render: (text: number, item: IMetaData) => renderTopicNum(text, item, 2, active),
width: 140,
},
{
title: 'Broker数',
@@ -305,6 +355,7 @@ export const getAdminClusterColumns = () => {
key: 'brokerNum',
sorter: (a: IMetaData, b: IMetaData) => b.brokerNum - a.brokerNum,
render: (text: number, item: IMetaData) => renderClusterHref(text, item, 3),
width: 140,
},
{
title: 'Consumer数',
@@ -312,6 +363,8 @@ export const getAdminClusterColumns = () => {
key: 'consumerGroupNum',
sorter: (a: IMetaData, b: IMetaData) => b.consumerGroupNum - a.consumerGroupNum,
render: (text: number, item: IMetaData) => renderClusterHref(text, item, 4),
width: 150,
},
{
title: 'Region数',
@@ -319,6 +372,8 @@ export const getAdminClusterColumns = () => {
key: 'regionNum',
sorter: (a: IMetaData, b: IMetaData) => b.regionNum - a.regionNum,
render: (text: number, item: IMetaData) => renderClusterHref(text, item, 5),
width: 140,
},
{
title: 'ControllerId',
@@ -326,12 +381,15 @@ export const getAdminClusterColumns = () => {
key: 'controllerId',
sorter: (a: IMetaData, b: IMetaData) => b.controllerId - a.controllerId,
render: (text: number, item: IMetaData) => renderClusterHref(text, item, 7),
width: 150,
},
{
title: '监控中',
dataIndex: 'status',
key: 'status',
sorter: (a: IMetaData, b: IMetaData) => b.key - a.key,
width: 140,
render: (value: number) => value === 1 ?
<span className="success"></span > : <span className="fail"></span>,
},


@@ -44,7 +44,7 @@ export class MyCluster extends SearchAndFilterContainer {
label: '所属应用',
rules: [{ required: true, message: '请选择所属应用' }],
type: 'select',
options: app.data.map((item) => {
options: app.clusterAppData.map((item) => {
return {
label: item.name,
value: item.appId,
@@ -135,8 +135,8 @@ export class MyCluster extends SearchAndFilterContainer {
if (!cluster.clusterModes.length) {
cluster.getClusterModes();
}
if (!app.data.length) {
app.getAppList();
if (!app.clusterAppData.length) {
app.getAppListByClusterId(-1);
}
}


@@ -145,7 +145,7 @@ export const Header = observer((props: IHeader) => {
<div className="left-content">
<img className="kafka-header-icon" src={logoUrl} alt="" />
<span className="kafka-header-text">LogiKM</span>
<a className='kafka-header-version' href="https://github.com/didi/Logi-KafkaManager/releases" target='_blank'>v2.6.1</a>
<a className='kafka-header-version' href="https://github.com/didi/Logi-KafkaManager/releases" target='_blank'>v2.8.0</a>
{/* 添加版本超链接 */}
</div>
<div className="mid-content">


@@ -0,0 +1,300 @@
import * as React from 'react';
import { Modal, Progress, Tooltip } from 'antd';
import { IMetaData } from 'types/base-type';
import { Alert, Badge, Button, Input, message, notification, Table } from 'component/antd';
import { getJobDetail, getJobState, getJobLog, switchAsJobs } from 'lib/api';
import moment from 'moment';
import { timeFormat } from 'constants/strategy';
interface IProps {
reload: any;
visible?: boolean;
handleVisible?: any;
currentCluster?: IMetaData;
}
interface IJobState {
failedNu: number;
jobNu: number;
runningNu: number;
successNu: number;
waitingNu: number;
runningInTimeoutNu: number;
progress: number;
}
interface IJobDetail {
standbyClusterPhyId: number;
status: number;
sumLag: number;
timeoutUnitSecConfig: number;
topicName: string;
activeClusterPhyName: string;
standbyClusterPhyName: string;
}
interface ILog {
bizKeyword: string;
bizType: number;
content: string;
id: number;
printTime: number;
}
interface IJobLog {
logList: ILog[];
endLogId: number;
}
const STATUS_MAP = {
'-1': '未知',
'30': '运行中',
'32': '超时运行中',
'101': '成功',
'102': '失败',
} as any;
const STATUS_COLORS = {
'-1': '#575757',
'30': '#575757',
'32': '#F5202E',
'101': '#2FC25B',
'102': '#F5202E',
} as any;
const STATUS_COLOR_MAP = {
'-1': 'black',
'30': 'black',
'32': 'red',
'101': 'green',
'102': 'red',
} as any;
const getFilters = () => {
const keys = Object.keys(STATUS_MAP);
const filters = [];
for (const key of keys) {
filters.push({
text: STATUS_MAP[key],
value: key,
});
}
return filters;
};
const columns = [
{
dataIndex: 'key',
title: '编号',
width: 60,
},
{
dataIndex: 'topicName',
title: 'Topic名称',
width: 120,
ellipsis: true,
},
{
dataIndex: 'sumLag',
title: '延迟',
width: 100,
render: (value: number) => value ?? '-',
},
{
dataIndex: 'status',
title: '状态',
width: 100,
filters: getFilters(),
onFilter: (value: string, record: IJobDetail) => record.status === Number(value),
render: (t: number) => (
<span className={'col-status ' + STATUS_COLOR_MAP[t]}>
<Badge color={STATUS_COLORS[t]} text={STATUS_MAP[t]} />
</span>
),
},
];
export class TopicSwitchLog extends React.Component<IProps> {
public state = {
radioCheck: 'all',
jobDetail: [] as IJobDetail[],
jobState: {} as IJobState,
jobLog: {} as IJobLog,
textStr: '',
primaryTargetKeys: [] as string[],
loading: false,
};
public timer = null as number;
public jobId = this.props.currentCluster?.haClusterVO?.haASSwitchJobId as number;
public handleOk = () => {
this.props.handleVisible(false);
this.props.reload();
}
public handleCancel = () => {
this.props.handleVisible(false);
this.props.reload();
}
public iTimer = () => {
this.timer = window.setInterval(() => {
const { jobLog } = this.state;
this.getContentJobLog(jobLog.endLogId);
this.getContentJobState();
this.getContentJobDetail();
}, 10 * 1 * 1000);
}
public getTextAreaStr = (logList: ILog[]) => {
const strs = [];
for (const item of logList) {
strs.push(`${moment(item.printTime).format(timeFormat)} ${item.content}`);
}
return strs.join(`\n`);
}
public getContentJobLog = (startId?: number) => {
getJobLog(this.jobId, startId).then((res: IJobLog) => {
const { jobLog } = this.state;
const logList = (jobLog.logList || []);
logList.push(...(res?.logList || []));
const newJobLog = {
endLogId: res?.endLogId,
logList,
};
this.setState({
textStr: this.getTextAreaStr(logList),
jobLog: newJobLog,
});
});
}
public getContentJobState = () => {
getJobState(this.jobId).then((res: IJobState) => {
// 成功后清除调用
if (res?.jobNu === res.successNu) {
clearInterval(this.timer);
}
this.setState({
jobState: res || {},
});
});
}
public getContentJobDetail = () => {
getJobDetail(this.jobId).then((res: IJobDetail[]) => {
this.setState({
jobDetail: (res || []).map((row, index) => ({
...row,
key: index,
})),
});
});
}
public switchJobs = () => {
const { jobState } = this.state;
Modal.confirm({
title: '强制切换',
content: `当前有${jobState.runningNu}个Topic切换中${jobState.runningInTimeoutNu}个Topic切换超时强制切换会使这些Topic有数据丢失的风险确定强制切换吗`,
onOk: () => {
this.setState({
loading: true,
});
switchAsJobs(this.jobId, {
action: 'force',
allJumpWaitInSync: true,
jumpWaitInSyncActiveTopicList: [],
}).then(res => {
message.success('强制切换成功');
}).finally(() => {
this.setState({
loading: false,
});
});
},
});
}
public componentWillUnmount() {
clearInterval(this.timer);
}
public componentDidMount() {
this.getContentJobDetail();
this.getContentJobState();
this.getContentJobLog();
setTimeout(this.iTimer, 0);
}
public render() {
const { visible, currentCluster } = this.props;
const { jobState, jobDetail, textStr, loading } = this.state;
const runtimeJob = jobDetail.filter(item => item.status === 32);
const percent = jobState?.progress;
return (
<Modal
title="主备切换日志"
wrapClassName="no-padding"
visible={visible}
onOk={this.handleOk}
onCancel={this.handleCancel}
maskClosable={false}
width={590}
okText="确认"
cancelText="取消"
>
{runtimeJob.length ?
<Alert
message={`${runtimeJob[0].topicName}消息同步已经超时${runtimeJob[0].timeoutUnitSecConfig}s建议立即强制切换`}
type="warning"
showIcon={true}
/>
: null}
<div className="switch-warning">
<Tooltip title="不用等待所有topic数据完成同步立即进行切换但是有数据丢失的风险。">
<Button loading={loading} type="primary" disabled={!runtimeJob.length} onClick={this.switchJobs} className={loading ? 'btn loading' : runtimeJob.length ? 'btn' : 'btn disabled'}></Button>
</Tooltip>
</div>
<div className="log-panel">
<div className="title">
<div className="divider" />
<span>Topic切换详情:</span>
</div>
<div className="log-process">
<div className="name">
<span> {jobDetail?.[0]?.standbyClusterPhyName || ''}</span>
<span> {jobDetail?.[0]?.activeClusterPhyName || ''}</span>
</div>
<Progress percent={percent} strokeColor="#F38031" status={percent === 100 ? 'normal' : 'active'} />
</div>
<div className="log-info">
Topic总数 <span className="text-num">{jobState.jobNu ?? '-'}</span>
<span className="text-num">{jobState.successNu ?? '-'}</span>
<span className="warning-num">{jobState.failedNu ?? '-'}</span>
<span className="warning-num">{jobState.waitingNu ?? '-'}</span>
</div>
<Table
className="log-table"
columns={columns}
dataSource={jobDetail}
size="small"
rowKey="topicName"
pagination={false}
bordered={false}
scroll={{ y: 138 }}
/>
<div className="title">
<div className="divider" />
<span>:</span>
</div>
<div>
<Input.TextArea value={textStr} rows={7} />
</div>
</div>
</Modal >
);
}
}
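
`TopicSwitchLog` polls every 10 seconds: it refreshes the job state and detail, tails the log incrementally by feeding the previous `endLogId` back in as the next `startLogId`, and stops once every topic has switched (`successNu === jobNu`). A hedged sketch of that tail loop; the `JobLogClient` interface is hypothetical and merely stands in for the REST calls the page makes:

```java
import java.util.List;

import com.xiaojukeji.kafka.manager.common.entity.vo.rd.job.JobLogVO;
import com.xiaojukeji.kafka.manager.common.entity.vo.rd.job.JobMulLogVO;

// Illustrative only. JobLogClient is a hypothetical stand-in for the
// getJobLog / getJobState endpoints the dialog calls.
public class SwitchLogTailer {
    interface JobLogClient {
        JobMulLogVO getJobLog(long jobId, Long startLogId); // startLogId may be null on the first call
        boolean isFinished(long jobId);                      // true once successNu == jobNu
    }

    static void tail(JobLogClient client, long jobId) throws InterruptedException {
        Long cursor = null;
        while (!client.isFinished(jobId)) {
            JobMulLogVO page = client.getJobLog(jobId, cursor);
            List<JobLogVO> logs = page.getLogList();
            logs.forEach(log -> System.out.println(log.getPrintTime() + " " + log.getContent()));
            cursor = page.getEndLogId();   // resume from here on the next poll
            Thread.sleep(10_000L);         // same 10-second cadence as the dialog
        }
    }
}
```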


@@ -0,0 +1,351 @@
import * as React from 'react';
import { admin } from 'store/admin';
import { Modal, Form, Radio } from 'antd';
import { IBrokersMetadata, IBrokersRegions, IMetaData } from 'types/base-type';
import { Alert, message, notification, Table, Tooltip, Transfer } from 'component/antd';
import { getClusterHaTopicsStatus, setHaTopics, unbindHaTopics } from 'lib/api';
import { cellStyle } from 'constants/table';
const layout = {
labelCol: { span: 3 },
wrapperCol: { span: 21 },
};
interface IXFormProps {
form: any;
reload: any;
formData?: any;
visible?: boolean;
handleVisible?: any;
currentCluster?: IMetaData;
}
interface IHaTopic {
clusterId: number;
clusterName: string;
haRelation: number;
topicName: string;
key: string;
disabled?: boolean;
}
const resColumns = [
{
title: 'TopicName',
dataIndex: 'topicName',
key: 'topicName',
width: 120,
},
{
title: '状态',
dataIndex: 'code',
key: 'code',
width: 60,
render: (t: number) => {
return (
<span className={t === 0 ? 'success' : 'fail'}>
{t === 0 ? '成功' : '失败'}
</span>
);
},
},
{
title: '原因',
dataIndex: 'message',
key: 'message',
width: 125,
onCell: () => ({
style: {
maxWidth: 120,
...cellStyle,
},
}),
render: (text: string) => {
return (
<Tooltip placement="bottomLeft" title={text} >
{text}
</Tooltip>);
},
},
];
class TopicHaRelation extends React.Component<IXFormProps> {
public state = {
radioCheck: 'spec',
haTopics: [] as IHaTopic[],
targetKeys: [] as string[],
confirmLoading: false,
firstMove: true,
primaryActiveKeys: [] as string[],
primaryStandbyKeys: [] as string[],
};
public handleOk = () => {
this.props.form.validateFields((err: any, values: any) => {
const unbindTopics = [];
const bindTopics = [];
if (values.rule === 'all') {
setHaTopics({
all: true,
activeClusterId: this.props.currentCluster.clusterId,
standbyClusterId: this.props.currentCluster.haClusterVO.clusterId,
topicNames: [],
}).then(res => {
handleMsg(res, '关联成功');
this.setState({
confirmLoading: false,
});
this.handleCancel();
});
return;
}
for (const item of this.state.primaryStandbyKeys) {
if (!this.state.targetKeys.includes(item)) {
unbindTopics.push(item);
}
}
for (const item of this.state.targetKeys) {
if (!this.state.primaryStandbyKeys.includes(item)) {
bindTopics.push(item);
}
}
if (!unbindTopics.length && !bindTopics.length) {
return message.info('请选择您要操作的Topic');
}
const handleMsg = (res: any[], successTip: string) => {
const errorRes = res.filter(item => item.code !== 0);
if (errorRes.length) {
Modal.confirm({
title: '执行结果',
width: 520,
icon: null,
content: (
<Table
columns={resColumns}
rowKey="id"
dataSource={res}
scroll={{ y: 260 }}
pagination={false}
/>
),
});
} else {
notification.success({ message: successTip });
}
this.props.reload();
};
if (bindTopics.length) {
this.setState({
confirmLoading: true,
});
setHaTopics({
all: false,
activeClusterId: this.props.currentCluster.clusterId,
standbyClusterId: this.props.currentCluster.haClusterVO.clusterId,
topicNames: bindTopics,
}).then(res => {
this.setState({
confirmLoading: false,
});
this.handleCancel();
handleMsg(res, '关联成功');
});
}
if (unbindTopics.length) {
this.setState({
confirmLoading: true,
});
unbindHaTopics({
all: false,
activeClusterId: this.props.currentCluster.clusterId,
standbyClusterId: this.props.currentCluster.haClusterVO.clusterId,
topicNames: unbindTopics,
}).then(res => {
this.setState({
confirmLoading: false,
});
this.handleCancel();
handleMsg(res, '解绑成功');
});
}
});
}
public handleCancel = () => {
this.props.handleVisible(false);
this.props.form.resetFields();
}
public handleRadioChange = (e: any) => {
this.setState({
radioCheck: e.target.value,
});
}
public isPrimaryStatus = (targetKeys: string[]) => {
const { primaryStandbyKeys } = this.state;
let isReset = false;
// 判断当前移动是否还原为最初的状态
if (primaryStandbyKeys.length === targetKeys.length) {
targetKeys.sort((a, b) => +a - (+b));
primaryStandbyKeys.sort((a, b) => +a - (+b));
let i = 0;
while (i < targetKeys.length) {
if (targetKeys[i] === primaryStandbyKeys[i]) {
i++;
} else {
break;
}
}
isReset = i === targetKeys.length;
}
return isReset;
}
public setTopicsStatus = (targetKeys: string[], disabled: boolean, isAll = false) => {
const { haTopics } = this.state;
const newTopics = Array.from(haTopics);
if (isAll) {
for (let i = 0; i < haTopics.length; i++) {
newTopics[i].disabled = disabled;
}
} else {
for (const key of targetKeys) {
const index = haTopics.findIndex(item => item.key === key);
if (index > -1) {
newTopics[index].disabled = disabled;
}
}
}
this.setState(({
haTopics: newTopics,
}));
}
public onTransferChange = (targetKeys: string[], direction: string, moveKeys: string[]) => {
const { primaryStandbyKeys, firstMove, primaryActiveKeys } = this.state;
// 判断当前移动是否还原为最初的状态
const isReset = this.isPrimaryStatus(targetKeys);
if (firstMove) {
const primaryKeys = direction === 'right' ? primaryStandbyKeys : primaryActiveKeys;
this.setTopicsStatus(primaryKeys, true, false);
this.setState(({
firstMove: false,
targetKeys,
}));
return;
}
// 如果是还原为初始状态则还原禁用状态
if (isReset) {
this.setTopicsStatus([], false, true);
this.setState(({
firstMove: true,
targetKeys,
}));
return;
}
this.setState({
targetKeys,
});
}
public componentDidMount() {
Promise.all([
getClusterHaTopicsStatus(this.props.currentCluster.clusterId, true),
getClusterHaTopicsStatus(this.props.currentCluster.clusterId, false),
]).then(([activeRes, standbyRes]: IHaTopic[][]) => {
activeRes = (activeRes || []).map(row => ({
...row,
key: row.topicName,
})).filter(item => item.haRelation === null);
standbyRes = (standbyRes || []).map(row => ({
...row,
key: row.topicName,
})).filter(item => item.haRelation === 1 || item.haRelation === 0);
this.setState({
haTopics: [].concat([...activeRes, ...standbyRes]).sort((a, b) => a.topicName.localeCompare(b.topicName)),
primaryActiveKeys: activeRes.map(row => row.topicName),
primaryStandbyKeys: standbyRes.map(row => row.topicName),
targetKeys: standbyRes.map(row => row.topicName),
});
});
}
public render() {
const { formData = {} as any, visible, currentCluster } = this.props;
const { getFieldDecorator } = this.props.form;
let metadata = [] as IBrokersMetadata[];
metadata = admin.brokersMetadata ? admin.brokersMetadata : metadata;
let regions = [] as IBrokersRegions[];
regions = admin.brokersRegions ? admin.brokersRegions : regions;
return (
<>
<Modal
title="Topic高可用关联"
wrapClassName="no-padding"
visible={visible}
onOk={this.handleOk}
onCancel={this.handleCancel}
maskClosable={false}
confirmLoading={this.state.confirmLoading}
width={590}
okText="确认"
cancelText="取消"
>
<Alert
message={`将【集群${currentCluster.clusterName}】和【集群${currentCluster.haClusterVO?.clusterName}】的Topic关联高可用关系`}
type="info"
showIcon={true}
/>
<Form {...layout} name="basic" className="x-form">
{/* <Form.Item label="规则">
{getFieldDecorator('rule', {
initialValue: 'spec',
rules: [{
required: true,
message: '请选择规则',
}],
})(<Radio.Group onChange={this.handleRadioChange} >
<Radio value="all">应用于所有Topic</Radio>
<Radio value="spec">应用于特定Topic</Radio>
</Radio.Group>)}
</Form.Item> */}
{this.state.radioCheck === 'spec' ? <Form.Item className="no-label" label="" >
{getFieldDecorator('topicNames', {
initialValue: this.state.targetKeys,
rules: [{
required: false,
message: '请选择Topic',
}],
})(
<Transfer
className="transfe-list"
dataSource={this.state.haTopics}
targetKeys={this.state.targetKeys}
showSearch={true}
onChange={this.onTransferChange}
render={item => item.topicName}
titles={['未关联', '已关联']}
locale={{
itemUnit: '',
itemsUnit: '',
}}
/>,
)}
</Form.Item> : ''}
</Form>
</Modal>
</>
);
}
}
export const TopicHaRelationWrapper = Form.create<IXFormProps>()(TopicHaRelation);
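
`handleOk` works out what to bind and unbind by diffing the transfer's current target keys against the keys that sat on the standby side when the dialog opened: keys that disappeared are unbound, keys that appeared are bound, and an unchanged selection short-circuits with a prompt. A small sketch of that set difference, with illustrative values:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Illustrative only -- the same diff handleOk computes before calling
// setHaTopics (bind) and unbindHaTopics (unbind).
public class HaTopicDiff {
    public static void main(String[] args) {
        List<String> primaryStandby = Arrays.asList("order", "pay"); // bound when the dialog opened
        List<String> target = Arrays.asList("pay", "user");          // bound when the dialog was confirmed

        List<String> unbind = new ArrayList<>();
        for (String t : primaryStandby) {
            if (!target.contains(t)) { unbind.add(t); }              // dropped from the right side -> unbind
        }

        List<String> bind = new ArrayList<>();
        for (String t : target) {
            if (!primaryStandby.contains(t)) { bind.add(t); }        // newly moved to the right side -> bind
        }

        System.out.println("unbind=" + unbind + ", bind=" + bind);   // unbind=[order], bind=[user]
    }
}
```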


@@ -0,0 +1,718 @@
import * as React from 'react';
import { admin } from 'store/admin';
import { Modal, Form, Radio, Tag, Popover, Button } from 'antd';
import { IBrokersMetadata, IBrokersRegions, IMetaData } from 'types/base-type';
import { Alert, Icon, message, Table, Transfer } from 'component/antd';
import { getClusterHaTopics, getAppRelatedTopics, createSwitchTask } from 'lib/api';
import { TooltipPlacement } from 'antd/es/tooltip';
import * as XLSX from 'xlsx';
import moment from 'moment';
import { timeMinute } from 'constants/strategy';
const layout = {
labelCol: { span: 3 },
wrapperCol: { span: 21 },
};
interface IXFormProps {
form: any;
reload: any;
formData?: any;
visible?: boolean;
handleVisible?: any;
currentCluster?: IMetaData;
}
interface IHaTopic {
clusterId: number;
topicName: string;
key: string;
activeClusterId: number;
consumeAclNum: number;
produceAclNum: number;
standbyClusterId: number;
status: number;
disabled?: boolean;
}
interface IKafkaUser {
clusterPhyId: number;
kafkaUser: string;
notHaTopicNameList: string[];
notSelectTopicNameList: string[];
selectedTopicNameList: string[];
show: boolean;
}
const columns = [
{
dataIndex: 'topicName',
title: '名称',
width: 100,
ellipsis: true,
},
{
dataIndex: 'produceAclNum',
title: '生产者数量',
width: 80,
},
{
dataIndex: 'consumeAclNum',
title: '消费者数量',
width: 80,
},
];
const kafkaUserColumn = [
{
dataIndex: 'kafkaUser',
title: 'kafkaUser',
width: 100,
ellipsis: true,
},
{
dataIndex: 'selectedTopicNameList',
title: '已选中Topic',
width: 120,
render: (text: string[]) => {
return text?.length ? renderAttributes({ data: text, limit: 3 }) : '-';
},
},
{
dataIndex: 'notSelectTopicNameList',
title: '选中关联Topic',
width: 120,
render: (text: string[]) => {
return text?.length ? renderAttributes({ data: text, limit: 3 }) : '-';
},
},
{
dataIndex: 'notHaTopicNameList',
title: '未建立HA Topic',
width: 120,
render: (text: string[]) => {
return text?.length ? renderAttributes({ data: text, limit: 3 }) : '-';
},
},
];
export const renderAttributes = (params: {
data: any;
type?: string;
limit?: number;
splitType?: string;
placement?: TooltipPlacement;
}) => {
const { data, type = ',', limit = 2, splitType = '', placement } = params;
let attrArray = data;
if (!Array.isArray(data) && data) {
attrArray = data.split(type);
}
const showItems = attrArray.slice(0, limit) || [];
const hideItems = attrArray.slice(limit, attrArray.length) || [];
const content = hideItems.map((item: string, index: number) => (
<Tag key={index} className="tag-gray">
{item}
</Tag>
));
const showItemsContent = showItems.map((item: string, index: number) => (
<Tag key={index} className="tag-gray">
{item}
</Tag>
));
return (
<div className="attribute-content">
{showItems.length > 0 ? showItemsContent : '-'}
{hideItems.length > 0 && (
<Popover placement={placement || 'bottomRight'} content={content} overlayClassName="attribute-tag">
{attrArray.length}<Icon className="icon" type="down" />
</Popover>
)}
</div>
);
};
class TopicHaSwitch extends React.Component<IXFormProps> {
public state = {
radioCheck: 'spec',
targetKeys: [] as string[],
selectedKeys: [] as string[],
topics: [] as IHaTopic[],
kafkaUsers: [] as IKafkaUser[],
primaryActiveKeys: [] as string[],
primaryStandbyKeys: [] as string[],
firstMove: true,
};
public isPrimaryStatus = (targetKeys: string[]) => {
const { primaryStandbyKeys } = this.state;
let isReset = false;
// check whether the current move restores the original (initial) state
if (primaryStandbyKeys.length === targetKeys.length) {
targetKeys.sort((a, b) => +a - (+b));
primaryStandbyKeys.sort((a, b) => +a - (+b));
let i = 0;
while (i < targetKeys.length) {
if (targetKeys[i] === primaryStandbyKeys[i]) {
i++;
} else {
break;
}
}
isReset = i === targetKeys.length;
}
return isReset;
}
public getTargetTopics = (currentKeys: string[], primaryKeys: string[]) => {
const targetTopics = [];
for (const key of currentKeys) {
if (!primaryKeys.includes(key)) {
const topic = this.state.topics.find(item => item.key === key)?.topicName;
targetTopics.push(topic);
}
}
return targetTopics;
}
public handleOk = () => {
const { primaryStandbyKeys, primaryActiveKeys, topics } = this.state;
const standbyClusterId = this.props.currentCluster.haClusterVO.clusterId;
const activeClusterId = this.props.currentCluster.clusterId;
this.props.form.validateFields((err: any, values: any) => {
if (values.rule === 'all') {
createSwitchTask({
activeClusterPhyId: activeClusterId,
all: true,
mustContainAllKafkaUserTopics: true,
standbyClusterPhyId: standbyClusterId,
topicNameList: [],
}).then(res => {
message.success('任务创建成功');
this.handleCancel();
this.props.reload(res);
});
return;
}
// check whether the current move restores the original state
const isPrimary = this.isPrimaryStatus(values.targetKeys || []);
if (isPrimary) {
return message.info('请选择您要切换的Topic');
}
// keys currently in the right-hand list
const currentStandbyKeys = values.targetKeys || [];
// keys currently in the left-hand list
const currentActiveKeys = [];
for (const item of topics) {
if (!currentStandbyKeys.includes(item.key)) {
currentActiveKeys.push(item.key);
}
}
const currentKeys = currentStandbyKeys.length > primaryStandbyKeys.length ? currentStandbyKeys : currentActiveKeys;
const primaryKeys = currentStandbyKeys.length > primaryStandbyKeys.length ? primaryStandbyKeys : primaryActiveKeys;
const activeClusterPhyId = currentStandbyKeys.length > primaryStandbyKeys.length ? standbyClusterId : activeClusterId;
const standbyClusterPhyId = currentStandbyKeys.length > primaryStandbyKeys.length ? activeClusterId : standbyClusterId;
const targetTopics = this.getTargetTopics(currentKeys, primaryKeys);
createSwitchTask({
activeClusterPhyId,
all: false,
mustContainAllKafkaUserTopics: true,
standbyClusterPhyId,
topicNameList: targetTopics,
}).then(res => {
message.success('任务创建成功');
this.handleCancel();
this.props.reload(res);
});
});
}
public handleCancel = () => {
this.props.handleVisible(false);
this.props.form.resetFields();
}
public handleRadioChange = (e: any) => {
this.setState({
radioCheck: e.target.value,
});
}
public getNewSelectKeys = (removeKeys: string[], selectedKeys: string[]) => {
const { topics, kafkaUsers } = this.state;
// for each removed key, find the keys related to it and remove them as well
let relatedTopics: string[] = [];
const relatedKeys: string[] = [];
const newSelectKeys = [];
for (const key of removeKeys) {
const topicName = topics.find(row => row.key === key)?.topicName;
for (const item of kafkaUsers) {
if (item.selectedTopicNameList.includes(topicName)) {
relatedTopics = relatedTopics.concat(item.selectedTopicNameList);
relatedTopics = relatedTopics.concat(item.notSelectTopicNameList);
}
}
for (const item of relatedTopics) {
const key = topics.find(row => row.topicName === item)?.key;
if (key) {
relatedKeys.push(key);
}
}
for (const key of selectedKeys) {
if (!relatedKeys.includes(key)) {
newSelectKeys.push(key);
}
}
}
return newSelectKeys;
}
public setTopicsStatus = (targetKeys: string[], disabled: boolean, isAll = false) => {
const { topics } = this.state;
const newTopics = Array.from(topics);
if (isAll) {
for (let i = 0; i < topics.length; i++) {
newTopics[i].disabled = disabled;
}
} else {
for (const key of targetKeys) {
const index = topics.findIndex(item => item.key === key);
if (index > -1) {
newTopics[index].disabled = disabled;
}
}
}
this.setState(({
topics: newTopics,
}));
}
public getFilterTopics = (selectKeys: string[]) => {
// look up topicName by key
const filterTopics: string[] = [];
const targetKeys = selectKeys;
for (const key of targetKeys) {
const topicName = this.state.topics.find(item => item.key === key)?.topicName;
if (topicName) {
filterTopics.push(topicName);
}
}
return filterTopics;
}
public getNewKafkaUser = (targetKeys: string[]) => {
const { primaryStandbyKeys, topics } = this.state;
const removeKeys = [];
const addKeys = [];
for (const key of primaryStandbyKeys) {
if (targetKeys.indexOf(key) < 0) {
// removed compared with the initial state
removeKeys.push(key);
}
}
for (const key of targetKeys) {
if (primaryStandbyKeys.indexOf(key) < 0) {
// newly added compared with the initial state
addKeys.push(key);
}
}
const keepKeys = [...removeKeys, ...addKeys];
const newKafkaUsers = this.state.kafkaUsers;
const moveTopics = this.getFilterTopics(keepKeys);
for (const topic of moveTopics) {
for (const item of newKafkaUsers) {
if (item.selectedTopicNameList.includes(topic)) {
item.show = true;
}
}
}
const showKafaUsers = newKafkaUsers.filter(item => item.show === true);
for (const item of showKafaUsers) {
let i = 0;
while (i < moveTopics.length) {
if (!item.selectedTopicNameList.includes(moveTopics[i])) {
i++;
} else {
break;
}
}
// means this kafkaUser should no longer be shown
if (i === moveTopics.length) {
item.show = false;
}
}
return showKafaUsers;
}
public getAppRelatedTopicList = (selectedKeys: string[]) => {
const { topics, targetKeys, primaryStandbyKeys, kafkaUsers } = this.state;
const filterTopicNameList = this.getFilterTopics(selectedKeys);
const isReset = this.isPrimaryStatus(targetKeys);
if (!filterTopicNameList.length && isReset) {
// targetKeys
this.setState({
kafkaUsers: kafkaUsers.map(item => ({
...item,
show: false,
})),
});
return;
} else {
// keep the selected items together with the moved items
this.setState({
kafkaUsers: this.getNewKafkaUser(targetKeys),
});
}
// selection is single-direction, so take the activeClusterId of the first selected topic
const clusterPhyId = topics.find(item => item.topicName === filterTopicNameList[0]).activeClusterId;
getAppRelatedTopics({
clusterPhyId,
filterTopicNameList,
}).then((res: IKafkaUser[]) => {
let notSelectTopicNames: string[] = [];
const notSelectTopicKeys: string[] = [];
for (const item of (res || [])) {
notSelectTopicNames = notSelectTopicNames.concat(item.notSelectTopicNameList || []);
}
for (const item of notSelectTopicNames) {
const key = topics.find(row => row.topicName === item)?.key;
if (key) {
notSelectTopicKeys.push(key);
}
}
const newSelectedKeys = selectedKeys.concat(notSelectTopicKeys);
const newKafkaUsers = (res || []).map(item => ({
...item,
show: true,
}));
const { kafkaUsers } = this.state;
for (const item of kafkaUsers) {
const resItem = res.find(row => row.kafkaUser === item.kafkaUser);
if (!resItem) {
newKafkaUsers.push(item);
}
}
this.setState({
kafkaUsers: newKafkaUsers,
selectedKeys: newSelectedKeys,
});
if (notSelectTopicKeys.length) {
this.getAppRelatedTopicList(newSelectedKeys);
}
});
}
public getRelatedKeys = (currentKeys: string[]) => {
// keys that are no longer selected
const removeKeys = [];
// compare with the previously recorded selection to find the keys deselected this time
const { selectedKeys } = this.state;
for (const preKey of selectedKeys) {
if (!currentKeys.includes(preKey)) {
removeKeys.push(preKey);
}
}
return removeKeys?.length ? this.getNewSelectKeys(removeKeys, currentKeys) : currentKeys;
}
public handleTopicChange = (sourceSelectedKeys: string[], targetSelectedKeys: string[]) => {
const { topics, targetKeys } = this.state;
// constraint: only one side may be selected at a time (single-direction operation)
const keys = [...sourceSelectedKeys, ...targetSelectedKeys];
// determine which cluster the current selection belongs to
if (keys.length) {
const activeClusterId = topics.find(item => item.key === keys[0]).activeClusterId;
const needDisabledKeys = topics.filter(item => item.activeClusterId !== activeClusterId).map(row => row.key);
this.setTopicsStatus(needDisabledKeys, true);
}
const selectedKeys = this.state.selectedKeys.length ? this.getRelatedKeys(keys) : keys;
const isReset = this.isPrimaryStatus(targetKeys);
if (!selectedKeys.length && isReset) {
this.setTopicsStatus([], false, true);
}
this.setState({
selectedKeys,
});
this.getAppRelatedTopicList(selectedKeys);
}
public onDirectChange = (targetKeys: string[], direction: string, moveKeys: string[]) => {
const { primaryStandbyKeys, firstMove, primaryActiveKeys, topics } = this.state;
const getKafkaUser = () => {
const newKafkaUsers = this.state.kafkaUsers;
const moveTopics = this.getFilterTopics(moveKeys);
for (const topic of moveTopics) {
for (const item of newKafkaUsers) {
if (item.selectedTopicNameList.includes(topic)) {
item.show = true;
}
}
}
return newKafkaUsers;
};
// check whether the current move restores the original state
const isReset = this.isPrimaryStatus(targetKeys);
if (firstMove) {
const primaryKeys = direction === 'right' ? primaryStandbyKeys : primaryActiveKeys;
this.setTopicsStatus(primaryKeys, true, false);
this.setState(({
firstMove: false,
kafkaUsers: getKafkaUser(),
targetKeys,
}));
return;
}
// if restored to the initial state, clear the disabled flags
if (isReset) {
this.setTopicsStatus([], false, true);
this.setState(({
firstMove: true,
targetKeys,
kafkaUsers: [],
}));
return;
}
// after the move, recompute which kafkaUsers should be shown
this.setState(({
targetKeys,
kafkaUsers: this.getNewKafkaUser(targetKeys),
}));
}
public downloadData = () => {
const { kafkaUsers } = this.state;
const tableData = kafkaUsers.map(item => {
return {
// tslint:disable
'kafkaUser': item.kafkaUser,
'已选中Topic': item.selectedTopicNameList?.join('、'),
'选中关联Topic': item.notSelectTopicNameList?.join('、'),
'未建立HA Topic': item.notHaTopicNameList?.join('、'),
};
});
const data = [].concat(tableData);
const wb = XLSX.utils.book_new();
// convert the JSON rows into a worksheet
const ws = XLSX.utils.json_to_sheet(data, {
header: ['kafkaUser', '已选中Topic', '选中关联Topic', '未建立HA Topic'],
});
// XLSX.utils.
XLSX.utils.book_append_sheet(wb, ws, 'kafkaUser');
// write the workbook out as an .xlsx file
XLSX.writeFile(wb, 'kafkaUser-' + moment((new Date()).getTime()).format(timeMinute) + '.xlsx');
}
public judgeSubmitStatus = () => {
const { kafkaUsers } = this.state;
const newKafkaUsers = kafkaUsers.filter(item => item.show);
for (const item of newKafkaUsers) {
if (item.notHaTopicNameList.length) {
return true;
}
}
return false;
}
public componentDidMount() {
const standbyClusterId = this.props.currentCluster.haClusterVO.clusterId;
const activeClusterId = this.props.currentCluster.clusterId;
getClusterHaTopics(this.props.currentCluster.clusterId, standbyClusterId).then((res: IHaTopic[]) => {
res = res.map((item, index) => ({
key: index.toString(),
...item,
}));
const targetKeys = (res || []).filter((item) => item.activeClusterId === standbyClusterId).map(row => row.key);
const primaryActiveKeys = (res || []).filter((item) => item.activeClusterId === activeClusterId).map(row => row.key);
this.setState({
topics: res || [],
primaryStandbyKeys: targetKeys,
primaryActiveKeys,
targetKeys,
});
});
}
public render() {
const { visible, currentCluster } = this.props;
const { getFieldDecorator } = this.props.form;
let metadata = [] as IBrokersMetadata[];
metadata = admin.brokersMetadata ? admin.brokersMetadata : metadata;
let regions = [] as IBrokersRegions[];
regions = admin.brokersRegions ? admin.brokersRegions : regions;
const tableData = this.state.kafkaUsers.filter(row => row.show);
return (
<Modal
title="Topic主备切换"
wrapClassName="no-padding"
visible={visible}
onCancel={this.handleCancel}
maskClosable={false}
width={800}
footer={<>
<Button onClick={this.handleCancel}></Button>
<Button disabled={this.judgeSubmitStatus()} style={{ marginLeft: 8 }} type="primary" onClick={() => this.handleOk()}></Button>
</>
}
>
<Alert
message={`注意必须把同一个kafkauser关联的所有Topic都建立高可用关系并且都选中才能执行任务`}
type="info"
showIcon={true}
/>
<Form {...layout} name="basic" className="x-form">
{/* <Form.Item label="规则" >
{getFieldDecorator('rule', {
initialValue: 'spec',
rules: [{
required: true,
message: '请选择规则',
}],
})(<Radio.Group onChange={this.handleRadioChange} >
<Radio value="all">应用于所有Topic</Radio>
<Radio value="spec">应用于特定Topic</Radio>
</Radio.Group>)}
</Form.Item> */}
{this.state.radioCheck === 'spec' ? <Form.Item className="no-label" label="" >
{getFieldDecorator('targetKeys', {
initialValue: this.state.targetKeys,
rules: [{
required: false,
message: '请选择Topic',
}],
})(
<TransferTable
selectedKeys={this.state.selectedKeys}
topicChange={this.handleTopicChange}
onDirectChange={this.onDirectChange}
dataSource={this.state.topics}
currentCluster={currentCluster}
/>,
)}
</Form.Item> : ''}
</Form>
{this.state.radioCheck === 'spec' ?
<>
<Table
className="modal-table-content"
columns={kafkaUserColumn}
dataSource={tableData}
size="small"
rowKey="kafkaUser"
pagination={false}
scroll={{ y: 300 }}
/>
{this.state.kafkaUsers.length ? <div onClick={this.downloadData} className="modal-table-download"><a></a></div> : null}
</>
: null}
</Modal>
);
}
}
export const TopicSwitchWrapper = Form.create<IXFormProps>()(TopicHaSwitch);
const TableTransfer = ({ leftColumns, ...restProps }: any) => (
<Transfer {...restProps} showSelectAll={true}>
{({
filteredItems,
direction,
onItemSelect,
selectedKeys: listSelectedKeys,
}) => {
const columns = leftColumns;
const rowSelection = {
columnWidth: 40,
getCheckboxProps: (item: any) => ({
disabled: item.disabled,
}),
onSelect({ key }: any, selected: any) {
onItemSelect(key, selected);
},
selectedRowKeys: listSelectedKeys,
};
return (
<Table
rowSelection={rowSelection}
columns={columns}
dataSource={filteredItems}
size="small"
pagination={false}
scroll={{ y: 320 }}
style={{ marginBottom: 14 }}
bordered={false}
onRow={({ key, disabled }) => ({
onClick: () => {
if (disabled) return;
onItemSelect(key, !listSelectedKeys.includes(key));
},
})}
/>
);
}}
</Transfer>
);
interface IProps {
value?: any;
onChange?: any;
onDirectChange?: any;
currentCluster: any;
topicChange: any;
dataSource: any[];
selectedKeys: string[];
}
export class TransferTable extends React.Component<IProps> {
public onChange = (nextTargetKeys: any, direction: string, moveKeys: string[]) => {
this.props.onDirectChange(nextTargetKeys, direction, moveKeys);
// tslint:disable-next-line:no-unused-expression
this.props.onChange && this.props.onChange(nextTargetKeys);
}
public render() {
const { currentCluster, dataSource, value, topicChange, selectedKeys } = this.props;
return (
<div>
<TableTransfer
dataSource={dataSource}
targetKeys={value || []}
selectedKeys={selectedKeys}
showSearch={true}
onChange={this.onChange}
onSelectChange={topicChange}
leftColumns={columns}
titles={[`集群${currentCluster.clusterName}`, `集群${currentCluster.haClusterVO.clusterName}`]}
locale={{
itemUnit: '',
itemsUnit: '',
}}
/>
</div>
);
}
}
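For orientation, here is a minimal, hedged sketch of how the `Form.create`-wrapped modal exported above might be mounted by a parent view. Only the props contract (`visible`, `handleVisible`, `currentCluster`, `reload`) comes from `IXFormProps` above; the parent component, the import path, and the refresh callback are assumptions.

```tsx
// Hedged usage sketch, not part of this diff; names other than the props are hypothetical.
import * as React from 'react';
import { IMetaData } from 'types/base-type';
import { TopicSwitchWrapper } from './topic-ha-switch'; // assumed module path

interface ISwitchEntryProps {
  currentCluster: IMetaData;   // must carry haClusterVO, since the modal reads haClusterVO.clusterId
  refreshJobs: () => void;     // hypothetical callback that re-pulls the switch-job list
}

export class SwitchEntry extends React.Component<ISwitchEntryProps, { switchVisible: boolean }> {
  public state = { switchVisible: false };

  public render() {
    return (
      <>
        <a onClick={() => this.setState({ switchVisible: true })}>Topic主备切换</a>
        {this.state.switchVisible ? (
          <TopicSwitchWrapper
            visible={this.state.switchVisible}
            currentCluster={this.props.currentCluster}
            handleVisible={(visible: boolean) => this.setState({ switchVisible: visible })}
            reload={this.props.refreshJobs}
          />
        ) : null}
      </>
    );
  }
}
```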

View File

@@ -16,6 +16,17 @@ import { modal } from 'store/modal';
import { TopicAppSelect } from '../topic/topic-app-select';
import Url from 'lib/url-parser';
import { expandRemarks, quotaRemarks } from 'constants/strategy';
import { getAppListByClusterId } from 'lib/api';
const updateApplyTopicFormModal = (clusterId: number) => {
const formMap = wrapper.xFormWrapper.formMap;
const formData = wrapper.xFormWrapper.formData;
getAppListByClusterId(clusterId).then(res => {
formMap[2].customFormItem = <AppSelect selectData={res} />;
// tslint:disable-next-line:no-unused-expression
wrapper.ref && wrapper.ref.updateFormMap$(formMap, formData);
});
};
export const applyTopic = () => {
const xFormModal = {
@@ -28,6 +39,9 @@ export const applyTopic = () => {
rules: [{ required: true, message: '请选择' }],
attrs: {
placeholder: '请选择',
onChange(value: number) {
updateApplyTopicFormModal(value);
},
},
}, {
key: 'topicName',
@@ -49,7 +63,7 @@ export const applyTopic = () => {
type: 'custom',
defaultValue: '',
rules: [{ required: true, message: '请选择' }],
customFormItem: <AppSelect selectData={app.data} />,
customFormItem: <AppSelect selectData={[]} />,
}, {
key: 'peakBytesIn',
label: '峰值流量',
@@ -88,7 +102,7 @@ export const applyTopic = () => {
],
formData: {},
visible: true,
title: <div><span>Topic</span><a className='applicationDocument' href="https://github.com/didi/Logi-KafkaManager/blob/master/docs/user_guide/resource_apply.md" target='_blank'></a></div>,
title: <div><span>Topic</span><a className="applicationDocument" href="https://github.com/didi/Logi-KafkaManager/blob/master/docs/user_guide/resource_apply.md" target="_blank"></a></div>,
okText: '确认',
// customRenderElement: <span className="tips">集群资源充足时预计1分钟自动审批通过</span>,
isWaitting: true,
@@ -106,7 +120,7 @@ export const applyTopic = () => {
};
return topic.applyTopic(quotaParams).then(data => {
window.location.href = `${urlPrefix}/user/order-detail/?orderId=${data.id}&region=${region.currentRegion}`;
})
});
},
onSubmitFaild: (err: any, ref: any, formData: any, formMap: any) => {
if (err.message === 'topic already existed') {
@@ -115,10 +129,10 @@ export const applyTopic = () => {
topicName: {
value: topic,
errors: [new Error('该topic名称已存在')],
}
})
},
});
}
}
},
};
wrapper.open(xFormModal);
};
@@ -380,12 +394,19 @@ export const showTopicApplyQuatoModal = (item: ITopic) => {
consumeQuota: transMBToB(value.consumeQuota),
produceQuota: transMBToB(value.produceQuota),
});
if (item.isPhysicalClusterId) {
Object.assign(quota, {
isPhysicalClusterId: true,
});
}
const quotaParams = {
type: 2,
applicant: users.currentUser.username,
description: value.description,
extensions: JSON.stringify(quota),
};
topic.applyQuota(quotaParams).then((data) => {
notification.success({ message: '申请配额成功' });
window.location.href = `${urlPrefix}/user/order-detail/?orderId=${data.id}&region=${region.currentRegion}`;
@@ -454,23 +475,24 @@ const judgeAccessStatus = (access: number) => {
export const showAllPermissionModal = (item: ITopic) => {
let appId: string = null;
app.getAppListByClusterId(item.clusterId).then(res => {
if (!app.clusterAppData || !app.clusterAppData.length) {
return notification.info({
message: (
<>
<span>
<a href={`${urlPrefix}/topic/app-list?application=1`}></a>
</span>
</>),
});
}
const index = app.clusterAppData.findIndex(row => row.appId === item.appId);
if (!app.data || !app.data.length) {
return notification.info({
message: (
<>
<span>
<a href={`${urlPrefix}/topic/app-list?application=1`}></a>
</span>
</>),
appId = index > -1 ? item.appId : app.clusterAppData[0].appId;
topic.getAuthorities(appId, item.clusterId, item.topicName).then((data) => {
showAllPermission(appId, item, data.access);
});
}
const index = app.data.findIndex(row => row.appId === item.appId);
appId = index > -1 ? item.appId : app.data[0].appId;
topic.getAuthorities(appId, item.clusterId, item.topicName).then((data) => {
showAllPermission(appId, item, data.access);
});
};
@@ -494,7 +516,7 @@ const showAllPermission = (appId: string, item: ITopic, access: number) => {
defaultValue: appId,
rules: [{ required: true, message: '请选择应用' }],
type: 'custom',
customFormItem: <TopicAppSelect selectData={app.data} parameter={item} />,
customFormItem: <TopicAppSelect selectData={app.clusterAppData} parameter={item} />,
},
{
key: 'access',

View File

@@ -18,7 +18,7 @@ interface IFilterParams {
}
interface ISearchAndFilterState {
[filter: string]: boolean | string | number | any[];
[filter: string]: boolean | string | number | any;
}
export class SearchAndFilterContainer extends React.Component<any, ISearchAndFilterState> {

View File

@@ -331,11 +331,13 @@ export class TopicDetail extends React.Component<any> {
public render() {
const role = users.currentUser.role;
const baseInfo = topic.baseInfo as ITopicBaseInfo;
const showEditBtn = (role == 1 || role == 2) || (topic.topicBusiness && topic.topicBusiness.principals.includes(users.currentUser.username));
const showEditBtn = (role == 1 || role == 2) ||
(topic.topicBusiness && topic.topicBusiness.principals.includes(users.currentUser.username));
const topicRecord = {
clusterId: this.clusterId,
topicName: this.topicName,
clusterName: this.clusterName
clusterName: this.clusterName,
isPhysicalClusterId: !!this.isPhysicalTrue,
} as ITopic;
return (
@@ -349,9 +351,12 @@ export class TopicDetail extends React.Component<any> {
title={this.topicName || ''}
extra={
<>
{this.needAuth == "true" && <Button key="0" type="primary" onClick={() => showAllPermissionModal(topicRecord)} ></Button>}
<Button key="1" type="primary" onClick={() => applyTopicQuotaQuery(topicRecord)} ></Button>
<Button key="2" type="primary" onClick={() => applyExpandModal(topicRecord)} ></Button>
{this.needAuth == 'true' &&
<Button key="0" type="primary" onClick={() => showAllPermissionModal(topicRecord)} ></Button>}
{baseInfo.haRelation === 0 ? null :
<Button key="1" type="primary" onClick={() => applyTopicQuotaQuery(topicRecord)} ></Button>}
{baseInfo.haRelation === 0 ? null :
<Button key="2" type="primary" onClick={() => applyExpandModal(topicRecord)} ></Button>}
<Button key="3" type="primary" onClick={() => this.props.history.push(`/alarm/add`)} ></Button>
<Button key="4" type="primary" onClick={this.showDrawer.bind(this)} ></Button>
{/* {showEditBtn && <Button key="5" onClick={() => this.compileDetails()} type="primary">编辑</Button>} */}

View File

@@ -248,6 +248,10 @@ export const getAppTopicList = (appId: string, mine: boolean) => {
return fetch(`/normal/apps/${appId}/topics?mine=${mine}`);
};
export const getAppListByClusterId = (clusterId: number) => {
return fetch(`/normal/apps/${clusterId}`);
};
/**
* 专家服务
*/
@@ -418,8 +422,69 @@ export const getMetaData = (needDetail: boolean = true) => {
return fetch(`/rd/clusters/basic-info?need-detail=${needDetail}`);
};
export const getHaMetaData = () => {
return fetch(`/rd/clusters/ha/basic-info`);
};
export const getClusterHaTopics = (firstClusterId: number, secondClusterId?: number) => {
return fetch(`/rd/clusters/${firstClusterId}/ha-topics?secondClusterId=${secondClusterId || ''}`);
};
export const getClusterHaTopicsStatus = (firstClusterId: number, checkMetadata: boolean) => {
return fetch(`/rd/clusters/${firstClusterId}/ha-topics/status?checkMetadata=${checkMetadata}`);
};
export const setHaTopics = (params: any) => {
return fetch(`/op/ha-topics`, {
method: 'POST',
body: JSON.stringify(params),
});
};
export const getAppRelatedTopics = (params: any) => {
return fetch(`/rd/apps/relate-topics`, {
method: 'POST',
body: JSON.stringify(params),
});
};
// unbind a Topic's HA relation
export const unbindHaTopics = (params: any) => {
return fetch(`/op/ha-topics`, {
method: 'DELETE',
body: JSON.stringify(params),
});
};
// create a Topic active/standby switch job
export const createSwitchTask = (params: any) => {
return fetch(`/op/as-switch-jobs`, {
method: 'POST',
body: JSON.stringify(params),
});
};
export const getJobDetail = (jobId: number) => {
return fetch(`/op/as-switch-jobs/${jobId}/job-detail`);
};
export const getJobLog = (jobId: number, startLogId?: number) => {
return fetch(`/op/as-switch-jobs/${jobId}/job-logs?startLogId=${startLogId || ''}`);
};
export const getJobState = (jobId: number) => {
return fetch(`/op/as-switch-jobs/${jobId}/job-state`);
};
export const switchAsJobs = (jobId: number, params: any) => {
return fetch(`/op/as-switch-jobs/${jobId}/action`, {
method: 'PUT',
body: JSON.stringify(params),
});
};
export const getOperationRecordData = (params: any) => {
return fetch(`/rd/operate-record`,{
return fetch(`/rd/operate-record`, {
method: 'POST',
body: JSON.stringify(params),
});
@@ -569,15 +634,15 @@ export const getCandidateController = (clusterId: number) => {
return fetch(`/rd/clusters/${clusterId}/controller-preferred-candidates`);
};
export const addCandidateController = (params:any) => {
return fetch(`/op/cluster-controller/preferred-candidates`, {
export const addCandidateController = (params: any) => {
return fetch(`/op/cluster-controller/preferred-candidates`, {
method: 'POST',
body: JSON.stringify(params),
});
};
export const deleteCandidateCancel = (params:any)=>{
return fetch(`/op/cluster-controller/preferred-candidates`, {
export const deleteCandidateCancel = (params: any) => {
return fetch(`/op/cluster-controller/preferred-candidates`, {
method: 'DELETE',
body: JSON.stringify(params),
});
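Taken together, the new HA endpoints above support a simple switch-then-poll flow. The sketch below is a hedged example, not code from this change: it assumes the fetch wrapper resolves to the parsed payload (as callers elsewhere in lib/api.ts assume), that the created job exposes an id, and that the job-state response carries some terminal flag; the 5-second interval is likewise an assumption.

```ts
// Hedged sketch, not part of this diff.
import { createSwitchTask, getJobState } from 'lib/api';

export const switchTopicsAndWatch = async (
  activeClusterPhyId: number,
  standbyClusterPhyId: number,
  topicNameList: string[],
) => {
  // create the active/standby switch job for the selected topics
  const job: any = await createSwitchTask({
    activeClusterPhyId,
    all: false,
    mustContainAllKafkaUserTopics: true,
    standbyClusterPhyId,
    topicNameList,
  });

  // poll the job state until the backend reports a terminal state
  // (job.id and state.finished are assumed field names)
  const timer = setInterval(async () => {
    const state: any = await getJobState(job.id);
    if (state && state.finished) {
      clearInterval(timer);
    }
  }, 5000);
};
```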

View File

@@ -33,7 +33,6 @@ const checkStatus = (res: Response) => {
};
const filter = (init: IInit) => (res: IRes) => {
if (res.code !== 0 && res.code !== 200) {
if (!init.errorNoTips) {
notification.error({
@@ -117,7 +116,7 @@ export default function fetch(url: string, init?: IInit) {
export function formFetch(url: string, init?: IInit) {
url = url.indexOf('?') > 0 ?
`${url}&dataCenter=${region.currentRegion}` : `${url}?dataCenter=${region.currentRegion}`;
`${url}&dataCenter=${region.currentRegion}` : `${url}?dataCenter=${region.currentRegion}`;
let realUrl = url;
if (!/^http(s)?:\/\//.test(url)) {
@@ -127,8 +126,8 @@ export function formFetch(url: string, init?: IInit) {
init = addCustomHeader(init);
return window
.fetch(realUrl, init)
.then(res => checkStatus(res))
.then((res) => res.json())
.then(filter(init));
.fetch(realUrl, init)
.then(res => checkStatus(res))
.then((res) => res.json())
.then(filter(init));
}

View File

@@ -1,4 +1,3 @@
* {
padding: 0;
margin: 0;
@@ -13,7 +12,9 @@ li {
list-style-type: none;
}
html, body, .router-nav {
html,
body,
.router-nav {
width: 100%;
height: 100%;
font-family: PingFangSC-Regular;
@@ -52,11 +53,12 @@ html, body, .router-nav {
color: @primary-color;
}
.ant-table-thead > tr > th, .ant-table-tbody > tr > td {
.ant-table-thead>tr>th,
.ant-table-tbody>tr>td {
padding: 13px;
}
.ant-table-tbody > tr > td {
.ant-table-tbody>tr>td {
background: #fff;
}
@@ -72,15 +74,11 @@ html, body, .router-nav {
overflow: auto;
}
.ant-form-item {
margin-bottom: 16px;
}
.mb-24 {
margin-bottom: 24px;
}
.ant-table-thead > tr > th .ant-table-filter-icon {
.ant-table-thead>tr>th .ant-table-filter-icon {
right: initial;
}
@@ -100,7 +98,7 @@ html, body, .router-nav {
margin-left: 10px;
}
.config-info{
.config-info {
white-space: pre-line;
height: 100%;
overflow-y: scroll;
@@ -113,4 +111,3 @@ html, body, .router-nav {
cursor: pointer;
font-size: 12px;
}

View File

@@ -1,6 +1,7 @@
import { BrowserRouter as Router, Route } from 'react-router-dom';
import { hot } from 'react-hot-loader/root';
import * as React from 'react';
import zhCN from 'antd/lib/locale/zh_CN';
import Home from './page/topic';
import Admin from './page/admin';
@@ -12,58 +13,62 @@ import { urlPrefix } from 'constants/left-menu';
import ErrorPage from './page/error';
import Login from './page/login';
import InfoPage from './page/info';
import { ConfigProvider } from 'antd';
class RouterDom extends React.Component {
public render() {
return (
<Router basename={urlPrefix}>
<Route path="/" exact={true} component={Home} />
<Route path={`/topic`} exact={true} component={Home} />
<Route
path={`/topic/:page`}
exact={true}
component={Home}
/>
<ConfigProvider locale={zhCN}>
<Route path={`/admin`} exact={true} component={Admin} />
<Route
path={`/admin/:page`}
exact={true}
component={Admin}
/>
<Router basename={urlPrefix}>
<Route path="/" exact={true} component={Home} />
<Route path={`/topic`} exact={true} component={Home} />
<Route
path={`/topic/:page`}
exact={true}
component={Home}
/>
<Route path={`/user`} exact={true} component={User} />
<Route path={`/user/:page`} exact={true} component={User} />
<Route path={`/admin`} exact={true} component={Admin} />
<Route
path={`/admin/:page`}
exact={true}
component={Admin}
/>
<Route path={`/cluster`} exact={true} component={Cluster} />
<Route
path={`/cluster/:page`}
exact={true}
component={Cluster}
/>
<Route path={`/user`} exact={true} component={User} />
<Route path={`/user/:page`} exact={true} component={User} />
<Route path={`/expert`} exact={true} component={Expert} />
<Route
path={`/expert/:page`}
exact={true}
component={Expert}
/>
<Route path={`/cluster`} exact={true} component={Cluster} />
<Route
path={`/cluster/:page`}
exact={true}
component={Cluster}
/>
<Route path={`/alarm`} exact={true} component={Alarm} />
<Route
path={`/alarm/:page`}
exact={true}
component={Alarm}
/>
<Route path={`/expert`} exact={true} component={Expert} />
<Route
path={`/expert/:page`}
exact={true}
component={Expert}
/>
<Route
path={`/login`}
exact={true}
component={Login}
/>
<Route path={`/error`} exact={true} component={ErrorPage} />
<Route path={`/info`} exact={true} component={InfoPage} />
</Router>
<Route path={`/alarm`} exact={true} component={Alarm} />
<Route
path={`/alarm/:page`}
exact={true}
component={Alarm}
/>
<Route
path={`/login`}
exact={true}
component={Login}
/>
<Route path={`/error`} exact={true} component={ErrorPage} />
<Route path={`/info`} exact={true} component={InfoPage} />
</Router>
</ConfigProvider>
);
}
}

View File

@@ -57,8 +57,9 @@ import {
getBillStaffDetail,
getCandidateController,
addCandidateController,
deleteCandidateCancel
} from 'lib/api';
deleteCandidateCancel,
getHaMetaData,
} from 'lib/api';
import { getControlMetricOption, getClusterMetricOption } from 'lib/line-charts-config';
import { copyValueMap } from 'constants/status-map';
@@ -104,12 +105,15 @@ class Admin {
@observable
public metaList: IMetaData[] = [];
@observable
public haMetaList: IMetaData[] = [];
@observable
public oRList: any[] = [];
@observable
public oRparams:any={
moduleId:0
public oRparams: any = {
moduleId: 0
};
@observable
@@ -169,7 +173,7 @@ class Admin {
@observable
public controllerCandidate: IController[] = [];
@observable
@observable
public filtercontrollerCandidate: string = '';
@observable
@@ -329,9 +333,20 @@ class Admin {
}
@action.bound
public setOperationRecordList(data:any){
public setHaMetaList(data: IMetaData[]) {
this.setLoading(false);
this.oRList = data ? data.map((item:any, index: any) => {
this.haMetaList = data ? data.map((item, index) => {
item.key = index;
return item;
}) : [];
this.haMetaList = this.haMetaList.sort((a, b) => a.clusterId - b.clusterId);
return this.haMetaList;
}
@action.bound
public setOperationRecordList(data: any) {
this.setLoading(false);
this.oRList = data ? data.map((item: any, index: any) => {
item.key = index;
return item;
}) : [];
@@ -394,9 +409,9 @@ class Admin {
item.key = index;
return item;
}) : [];
this.filtercontrollerCandidate = data?data.map((item,index)=>{
this.filtercontrollerCandidate = data ? data.map((item, index) => {
return item.brokerId
}).join(','):''
}).join(',') : ''
}
@action.bound
@@ -479,8 +494,8 @@ class Admin {
}
@action.bound
public setBrokersMetadata(data: IBrokersMetadata[]|any) {
this.brokersMetadata = data ? data.map((item:any, index:any) => {
public setBrokersMetadata(data: IBrokersMetadata[] | any) {
this.brokersMetadata = data ? data.map((item: any, index: any) => {
item.key = index;
return {
...item,
@@ -675,6 +690,11 @@ class Admin {
getMetaData(needDetail).then(this.setMetaList);
}
public getHaMetaData() {
this.setLoading(true);
return getHaMetaData().then(this.setHaMetaList);
}
public getOperationRecordData(params: any) {
this.setLoading(true);
this.oRparams = params
@@ -738,17 +758,17 @@ class Admin {
}
public getCandidateController(clusterId: number) {
return getCandidateController(clusterId).then(data=>{
return getCandidateController(clusterId).then(data => {
return this.setCandidateController(data)
});
}
public addCandidateController(clusterId: number, brokerIdList: any) {
return addCandidateController({clusterId, brokerIdList}).then(()=>this.getCandidateController(clusterId));
return addCandidateController({ clusterId, brokerIdList }).then(() => this.getCandidateController(clusterId));
}
public deleteCandidateCancel(clusterId: number, brokerIdList: any){
return deleteCandidateCancel({clusterId, brokerIdList}).then(()=>this.getCandidateController(clusterId));
public deleteCandidateCancel(clusterId: number, brokerIdList: any) {
return deleteCandidateCancel({ clusterId, brokerIdList }).then(() => this.getCandidateController(clusterId));
}
public getBrokersBasicInfo(clusterId: number, brokerId: number) {

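The store additions above (getHaMetaData / setHaMetaList) expose the HA cluster pairs to views. Below is a hedged sketch of a consumer: admin.getHaMetaData(), admin.haMetaList and the IMetaData fields it reads come from the code above, while the component itself, its rendering, and the mobx-react observer wiring are assumptions.

```tsx
// Hedged sketch, not part of this diff.
import * as React from 'react';
import { observer } from 'mobx-react'; // assumed to be available, as in other mobx-backed views
import { admin } from 'store/admin';

@observer
export class HaClusterList extends React.Component {
  public componentDidMount() {
    // fills admin.haMetaList (keyed and sorted by clusterId) via /rd/clusters/ha/basic-info
    admin.getHaMetaData();
  }

  public render() {
    return (
      <ul>
        {admin.haMetaList.map(item => (
          <li key={item.clusterId}>
            {item.clusterName}
            {item.haClusterVO ? ` <-> ${item.haClusterVO.clusterName}` : ''}
          </li>
        ))}
      </ul>
    );
  }
}
```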
View File

@@ -1,5 +1,5 @@
import { observable, action } from 'mobx';
import { getAppList, getAppDetail, getAppTopicList, applyOrder, modfiyApplication, modfiyAdminApp, getAdminAppList, getAppsConnections, getTopicAppQuota } from 'lib/api';
import { getAppList, getAppDetail, getAppTopicList, applyOrder, modfiyApplication, modfiyAdminApp, getAdminAppList, getAppsConnections, getTopicAppQuota, getAppListByClusterId } from 'lib/api';
import { IAppItem, IAppQuota, ITopic, IOrderParams, IConnectionInfo } from 'types/base-type';
class App {
@@ -12,6 +12,9 @@ class App {
@observable
public data: IAppItem[] = [];
@observable
public clusterAppData: IAppItem[] = [];
@observable
public adminAppData: IAppItem[] = [];
@@ -19,7 +22,7 @@ class App {
public selectData: IAppItem[] = [{
appId: '-1',
name: '所有关联应用',
} as IAppItem,
} as IAppItem,
];
@observable
@@ -51,12 +54,12 @@ class App {
@action.bound
public setTopicAppQuota(data: IAppQuota[]) {
return this.appQuota = data.map((item, index) => {
return {
...item,
label: item.appName,
value: item.appId,
key: index,
};
return {
...item,
label: item.appName,
value: item.appId,
key: index,
};
});
}
@@ -87,6 +90,16 @@ class App {
this.setLoading(false);
}
@action.bound
public setClusterAppData(data: IAppItem[] = []) {
this.clusterAppData = data.map((item, index) => ({
...item,
key: index,
principalList: item.principals ? item.principals.split(',') : [],
}));
return this.clusterAppData;
}
@action.bound
public setAdminData(data: IAppItem[] = []) {
this.adminAppData = data.map((item, index) => ({
@@ -133,6 +146,10 @@ class App {
getAppList().then(this.setData);
}
public getAppListByClusterId(clusterId: number) {
return getAppListByClusterId(clusterId).then(this.setClusterAppData);
}
public getTopicAppQuota(clusterId: number, topicName: string) {
return getTopicAppQuota(clusterId, topicName).then(this.setTopicAppQuota);
}

View File

@@ -37,6 +37,7 @@ export interface ITopicBaseInfo {
physicalClusterId: number;
percentile: string;
regionNameList: any;
haRelation: number;
}
export interface IRealTimeTraffic {

View File

@@ -474,7 +474,14 @@ export interface IMetaData {
status: number;
topicNum: number;
zookeeper: string;
haRelation?: number;
haASSwitchJobId?: number;
haStatus?: number;
haClusterVO?: IMetaData;
activeTopicCount?: number;
standbyTopicCount?: number;
key?: number;
mutualBackupClusterName?: string;
}
export interface IConfigure {
@@ -641,6 +648,7 @@ export interface IClusterTopics {
properties: any;
clusterName: string;
logicalClusterId: number;
haRelation?: number;
key?: number;
}

View File

@@ -130,9 +130,7 @@ module.exports = {
historyApiFallback: true,
proxy: {
'/api/v1/': {
// target: 'http://127.0.0.1:8080',
target: 'http://10.179.37.199:8008',
// target: 'http://99.11.45.164:8888',
target: 'http://127.0.0.1:8080/',
changeOrigin: true,
}
},

View File

@@ -0,0 +1,32 @@
package com.xiaojukeji.kafka.manager.service.biz.ha;
import com.xiaojukeji.kafka.manager.common.entity.Result;
import com.xiaojukeji.kafka.manager.common.entity.pojo.ha.HaASRelationDO;
import com.xiaojukeji.kafka.manager.common.entity.vo.ha.HaClusterTopicVO;
import com.xiaojukeji.kafka.manager.common.entity.vo.normal.topic.HaClusterTopicHaStatusVO;
import java.util.List;
public interface HaASRelationManager {
/**
* Get the active/standby (HA) Topic relations between two clusters
*/
List<HaClusterTopicVO> getHATopics(Long firstClusterPhyId, Long secondClusterPhyId, boolean filterSystemTopics);
/**
* Get the HA (active/standby) status of the Topics in a cluster
*/
Result<List<HaClusterTopicHaStatusVO>> listHaStatusTopics(Long clusterPhyId, Boolean checkMetadata);
/**
* Get the HA relation of a cluster Topic: 0 = standby topic, 1 = active topic, -1 = not HA
*/
Integer getRelation(Long clusterId, String topicName);
/**
* Get the HA relation record of a cluster Topic
*/
HaASRelationDO getASRelation(Long clusterId, String topicName);
}

View File

@@ -0,0 +1,16 @@
package com.xiaojukeji.kafka.manager.service.biz.ha;
import com.xiaojukeji.kafka.manager.common.entity.Result;
import com.xiaojukeji.kafka.manager.common.entity.vo.rd.app.AppRelateTopicsVO;
import java.util.List;
/**
* HA App management
*/
public interface HaAppManager {
Result<List<AppRelateTopicsVO>> appRelateTopics(Long clusterPhyId, List<String> filterTopicNameList);
boolean isContainAllRelateAppTopics(Long clusterPhyId, List<String> filterTopicNameList);
}

View File

@@ -0,0 +1,19 @@
package com.xiaojukeji.kafka.manager.service.biz.ha;
import com.xiaojukeji.kafka.manager.common.entity.Result;
import com.xiaojukeji.kafka.manager.common.entity.ao.ClusterDetailDTO;
import com.xiaojukeji.kafka.manager.common.entity.pojo.ClusterDO;
import java.util.List;
/**
* HA Cluster management
*/
public interface HaClusterManager {
List<ClusterDetailDTO> getClusterDetailDTOList(Boolean needDetail);
Result<Void> addNew(ClusterDO clusterDO, Long activeClusterId, String operator);
Result<Void> deleteById(Long clusterId, String operator);
}

View File

@@ -0,0 +1,44 @@
package com.xiaojukeji.kafka.manager.service.biz.ha;
import com.xiaojukeji.kafka.manager.common.entity.Result;
import com.xiaojukeji.kafka.manager.common.entity.TopicOperationResult;
import com.xiaojukeji.kafka.manager.common.entity.ao.ha.HaSwitchTopic;
import com.xiaojukeji.kafka.manager.common.entity.dto.op.topic.HaTopicRelationDTO;
import com.xiaojukeji.kafka.manager.common.entity.pojo.ha.JobLogDO;
import java.util.List;
/**
* HA Topic management
*/
public interface HaTopicManager {
/**
* Batch create active/standby (HA) relations
*/
Result<List<TopicOperationResult>> batchCreateHaTopic(HaTopicRelationDTO dto, String operator);
/**
* Batch remove active/standby (HA) relations
*/
Result<List<TopicOperationResult>> batchRemoveHaTopic(HaTopicRelationDTO dto, String operator);
/**
* Execute an active/standby switch, with retry supported
* @param newActiveClusterPhyId new active cluster
* @param newStandbyClusterPhyId new standby cluster
* @param switchTopicNameList Topics to switch
* @param focus force the switch
* @param firstTriggerExecute whether this is the first trigger of the job
* @param switchLogTemplate switch log template
* @param operator operator
* @return operation result
*/
Result<HaSwitchTopic> switchHaWithCanRetry(Long newActiveClusterPhyId,
Long newStandbyClusterPhyId,
List<String> switchTopicNameList,
boolean focus,
boolean firstTriggerExecute,
JobLogDO switchLogTemplate,
String operator);
}

View File

@@ -0,0 +1,140 @@
package com.xiaojukeji.kafka.manager.service.biz.ha.impl;
import com.xiaojukeji.kafka.manager.common.bizenum.TopicAuthorityEnum;
import com.xiaojukeji.kafka.manager.common.bizenum.ha.HaRelationTypeEnum;
import com.xiaojukeji.kafka.manager.common.bizenum.ha.HaResTypeEnum;
import com.xiaojukeji.kafka.manager.common.constant.KafkaConstant;
import com.xiaojukeji.kafka.manager.common.entity.Result;
import com.xiaojukeji.kafka.manager.common.entity.ResultStatus;
import com.xiaojukeji.kafka.manager.common.entity.pojo.ClusterDO;
import com.xiaojukeji.kafka.manager.common.entity.pojo.TopicDO;
import com.xiaojukeji.kafka.manager.common.entity.pojo.gateway.AuthorityDO;
import com.xiaojukeji.kafka.manager.common.entity.pojo.ha.HaASRelationDO;
import com.xiaojukeji.kafka.manager.common.entity.vo.ha.HaClusterTopicVO;
import com.xiaojukeji.kafka.manager.common.entity.vo.normal.topic.HaClusterTopicHaStatusVO;
import com.xiaojukeji.kafka.manager.common.utils.ValidateUtils;
import com.xiaojukeji.kafka.manager.service.biz.ha.HaASRelationManager;
import com.xiaojukeji.kafka.manager.service.cache.PhysicalClusterMetadataManager;
import com.xiaojukeji.kafka.manager.service.service.TopicManagerService;
import com.xiaojukeji.kafka.manager.service.service.gateway.AuthorityService;
import com.xiaojukeji.kafka.manager.service.service.ha.HaASRelationService;
import com.xiaojukeji.kafka.manager.service.service.ha.HaTopicService;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
@Service
public class HaASRelationManagerImpl implements HaASRelationManager {
@Autowired
private HaASRelationService haASRelationService;
@Autowired
private TopicManagerService topicManagerService;
@Autowired
private HaTopicService haTopicService;
@Autowired
private AuthorityService authorityService;
@Override
public List<HaClusterTopicVO> getHATopics(Long firstClusterPhyId, Long secondClusterPhyId, boolean filterSystemTopics) {
List<HaASRelationDO> doList = haASRelationService.listAllHAFromDB(firstClusterPhyId, secondClusterPhyId, HaResTypeEnum.TOPIC);
if (ValidateUtils.isEmptyList(doList)) {
return new ArrayList<>();
}
List<HaClusterTopicVO> voList = new ArrayList<>();
for (HaASRelationDO relationDO: doList) {
if (filterSystemTopics
&& (relationDO.getActiveResName().startsWith("__") || relationDO.getStandbyResName().startsWith("__"))) {
// skip relations that involve system Topics
continue;
}
HaClusterTopicVO vo = new HaClusterTopicVO();
vo.setClusterId(firstClusterPhyId);
if (firstClusterPhyId.equals(relationDO.getActiveClusterPhyId())) {
vo.setTopicName(relationDO.getActiveResName());
} else {
vo.setTopicName(relationDO.getStandbyResName());
}
vo.setProduceAclNum(0);
vo.setConsumeAclNum(0);
vo.setActiveClusterId(relationDO.getActiveClusterPhyId());
vo.setStandbyClusterId(relationDO.getStandbyClusterPhyId());
vo.setStatus(relationDO.getStatus());
// fill in ACL counts
List<AuthorityDO> authorityDOList = authorityService.getAuthorityByTopicFromCache(relationDO.getActiveClusterPhyId(), relationDO.getActiveResName());
authorityDOList.forEach(elem -> {
if ((elem.getAccess() & TopicAuthorityEnum.WRITE.getCode()) > 0) {
vo.setProduceAclNum(vo.getProduceAclNum() + 1);
}
if ((elem.getAccess() & TopicAuthorityEnum.READ.getCode()) > 0) {
vo.setConsumeAclNum(vo.getConsumeAclNum() + 1);
}
});
voList.add(vo);
}
return voList;
}
@Override
public Result<List<HaClusterTopicHaStatusVO>> listHaStatusTopics(Long clusterPhyId, Boolean checkMetadata) {
ClusterDO clusterDO = PhysicalClusterMetadataManager.getClusterFromCache(clusterPhyId);
if (clusterDO == null){
return Result.buildFrom(ResultStatus.CLUSTER_NOT_EXIST);
}
List<TopicDO> topicDOS = topicManagerService.getByClusterId(clusterPhyId);
if (ValidateUtils.isEmptyList(topicDOS)) {
return Result.buildSuc(new ArrayList<>());
}
Map<String, Integer> haRelationMap = haTopicService.getRelation(clusterPhyId);
List<HaClusterTopicHaStatusVO> statusVOS = new ArrayList<>();
topicDOS.stream().filter(topicDO -> !topicDO.getTopicName().startsWith("__")) // filter out Kafka built-in topics
.forEach(topicDO -> {
if(checkMetadata && !PhysicalClusterMetadataManager.isTopicExist(clusterPhyId, topicDO.getTopicName())){
return;
}
HaClusterTopicHaStatusVO statusVO = new HaClusterTopicHaStatusVO();
statusVO.setClusterId(clusterPhyId);
statusVO.setClusterName(clusterDO.getClusterName());
statusVO.setTopicName(topicDO.getTopicName());
statusVO.setHaRelation(haRelationMap.get(topicDO.getTopicName()));
statusVOS.add(statusVO);
});
return Result.buildSuc(statusVOS);
}
@Override
public Integer getRelation(Long clusterId, String topicName) {
HaASRelationDO relationDO = haASRelationService.getHAFromDB(clusterId, topicName, HaResTypeEnum.TOPIC);
if (relationDO == null){
return HaRelationTypeEnum.UNKNOWN.getCode();
}
if (topicName.equals(KafkaConstant.COORDINATOR_TOPIC_NAME)){
return HaRelationTypeEnum.MUTUAL_BACKUP.getCode();
}
if (clusterId.equals(relationDO.getActiveClusterPhyId())){
return HaRelationTypeEnum.ACTIVE.getCode();
}
if (clusterId.equals(relationDO.getStandbyClusterPhyId())){
return HaRelationTypeEnum.STANDBY.getCode();
}
return HaRelationTypeEnum.UNKNOWN.getCode();
}
@Override
public HaASRelationDO getASRelation(Long clusterId, String topicName) {
return haASRelationService.getHAFromDB(clusterId, topicName, HaResTypeEnum.TOPIC);
}
}

View File

@@ -0,0 +1,94 @@
package com.xiaojukeji.kafka.manager.service.biz.ha.impl;
import com.xiaojukeji.kafka.manager.common.bizenum.ha.HaResTypeEnum;
import com.xiaojukeji.kafka.manager.common.entity.Result;
import com.xiaojukeji.kafka.manager.common.entity.vo.rd.app.AppRelateTopicsVO;
import com.xiaojukeji.kafka.manager.service.biz.ha.HaAppManager;
import com.xiaojukeji.kafka.manager.service.service.gateway.AuthorityService;
import com.xiaojukeji.kafka.manager.service.service.ha.HaASRelationService;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;
import java.util.*;
import java.util.stream.Collectors;
@Service
public class HaAppManagerImpl implements HaAppManager {
@Autowired
private AuthorityService authorityService;
@Autowired
private HaASRelationService haASRelationService;
@Override
public Result<List<AppRelateTopicsVO>> appRelateTopics(Long clusterPhyId, List<String> filterTopicNameList) {
// get the Topics related to each kafkaUser
Map<String, Set<String>> userTopicMap = this.appRelateTopicsMap(clusterPhyId, filterTopicNameList);
// get the Topics in the cluster that already have HA relations
Set<String> haTopicNameSet = haASRelationService.listAllHAFromDB(clusterPhyId, HaResTypeEnum.TOPIC)
.stream()
.map(elem -> elem.getActiveResName())
.collect(Collectors.toSet());
Set<String> filterTopicNameSet = new HashSet<>(filterTopicNameList);
List<AppRelateTopicsVO> voList = new ArrayList<>();
for (Map.Entry<String, Set<String>> entry: userTopicMap.entrySet()) {
AppRelateTopicsVO vo = new AppRelateTopicsVO();
vo.setClusterPhyId(clusterPhyId);
vo.setKafkaUser(entry.getKey());
vo.setSelectedTopicNameList(new ArrayList<>());
vo.setNotSelectTopicNameList(new ArrayList<>());
vo.setNotHaTopicNameList(new ArrayList<>());
entry.getValue().forEach(elem -> {
if (elem.startsWith("__")) {
// ignore
return;
}
if (!haTopicNameSet.contains(elem)) {
vo.getNotHaTopicNameList().add(elem);
} else if (filterTopicNameSet.contains(elem)) {
vo.getSelectedTopicNameList().add(elem);
} else {
vo.getNotSelectTopicNameList().add(elem);
}
});
voList.add(vo);
}
return Result.buildSuc(voList);
}
@Override
public boolean isContainAllRelateAppTopics(Long clusterPhyId, List<String> filterTopicNameList) {
Map<String, Set<String>> userTopicMap = this.appRelateTopicsMap(clusterPhyId, filterTopicNameList);
Set<String> relateTopicSet = new HashSet<>();
userTopicMap.values().forEach(elem -> relateTopicSet.addAll(elem));
return filterTopicNameList.containsAll(relateTopicSet);
}
private Map<String, Set<String>> appRelateTopicsMap(Long clusterPhyId, List<String> filterTopicNameList) {
Map<String, Set<String>> userTopicMap = new HashMap<>();
for (String topicName: filterTopicNameList) {
authorityService.getAuthorityByTopicFromCache(clusterPhyId, topicName)
.stream()
.map(elem -> elem.getAppId())
.filter(item -> !userTopicMap.containsKey(item))
.forEach(kafkaUser ->
userTopicMap.put(
kafkaUser,
authorityService.getAuthority(kafkaUser).stream().map(authorityDO -> authorityDO.getTopicName()).collect(Collectors.toSet())
)
);
}
return userTopicMap;
}
}

View File

@@ -0,0 +1,169 @@
package com.xiaojukeji.kafka.manager.service.biz.ha.impl;
import com.xiaojukeji.kafka.manager.common.bizenum.ClusterModeEnum;
import com.xiaojukeji.kafka.manager.common.bizenum.DBStatusEnum;
import com.xiaojukeji.kafka.manager.common.bizenum.ha.HaResTypeEnum;
import com.xiaojukeji.kafka.manager.common.constant.MsgConstant;
import com.xiaojukeji.kafka.manager.common.entity.Result;
import com.xiaojukeji.kafka.manager.common.entity.ResultStatus;
import com.xiaojukeji.kafka.manager.common.entity.ao.ClusterDetailDTO;
import com.xiaojukeji.kafka.manager.common.entity.pojo.ClusterDO;
import com.xiaojukeji.kafka.manager.common.entity.pojo.LogicalClusterDO;
import com.xiaojukeji.kafka.manager.common.entity.pojo.RegionDO;
import com.xiaojukeji.kafka.manager.common.entity.pojo.ha.HaASRelationDO;
import com.xiaojukeji.kafka.manager.common.utils.ListUtils;
import com.xiaojukeji.kafka.manager.service.biz.ha.HaClusterManager;
import com.xiaojukeji.kafka.manager.service.service.ClusterService;
import com.xiaojukeji.kafka.manager.service.service.LogicalClusterService;
import com.xiaojukeji.kafka.manager.service.service.RegionService;
import com.xiaojukeji.kafka.manager.service.service.ZookeeperService;
import com.xiaojukeji.kafka.manager.service.service.ha.HaASRelationService;
import com.xiaojukeji.kafka.manager.service.service.ha.HaClusterService;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Component;
import org.springframework.transaction.annotation.Transactional;
import org.springframework.transaction.interceptor.TransactionAspectSupport;
import java.util.List;
@Component
public class HaClusterManagerImpl implements HaClusterManager {
private static final Logger LOGGER = LoggerFactory.getLogger(HaClusterManagerImpl.class);
@Autowired
private ClusterService clusterService;
@Autowired
private HaClusterService haClusterService;
@Autowired
private ZookeeperService zookeeperService;
@Autowired
private LogicalClusterService logicalClusterService;
@Autowired
private RegionService regionService;
@Autowired
private HaASRelationService haASRelationService;
@Override
public List<ClusterDetailDTO> getClusterDetailDTOList(Boolean needDetail) {
return clusterService.getClusterDetailDTOList(needDetail);
}
@Override
@Transactional
public Result<Void> addNew(ClusterDO clusterDO, Long activeClusterId, String operator) {
if (activeClusterId == null) {
// ordinary (non-HA) cluster: write it to the DB directly
Long clusterPhyId = zookeeperService.getClusterIdAndNullIfFailed(clusterDO.getZookeeper());
if (clusterPhyId != null && clusterService.getById(clusterPhyId) == null) {
// if this cluster id does not exist yet, use it; otherwise ignore it
clusterDO.setId(clusterPhyId);
}
return Result.buildFrom(clusterService.addNew(clusterDO, operator));
}
// HA cluster
ClusterDO activeClusterDO = clusterService.getById(activeClusterId);
if (activeClusterDO == null) {
// the active cluster does not exist
return Result.buildFromRSAndMsg(ResultStatus.RESOURCE_NOT_EXIST, MsgConstant.getClusterPhyNotExist(activeClusterId));
}
HaASRelationDO oldRelationDO = haClusterService.getHA(activeClusterId);
if (oldRelationDO != null){
return Result.buildFromRSAndMsg(ResultStatus.RESOURCE_ALREADY_USED,
MsgConstant.getActiveClusterDuplicate(activeClusterDO.getId(), activeClusterDO.getClusterName()));
}
Long standbyClusterPhyId = zookeeperService.getClusterIdAndNullIfFailed(clusterDO.getZookeeper());
if (standbyClusterPhyId != null && clusterService.getById(standbyClusterPhyId) == null) {
// if this cluster id does not exist yet, use it; otherwise ignore it
clusterDO.setId(standbyClusterPhyId);
}
ResultStatus rs = clusterService.addNew(clusterDO, operator);
if (!ResultStatus.SUCCESS.equals(rs)) {
return Result.buildFrom(rs);
}
Result<List<Integer>> rli = zookeeperService.getBrokerIds(clusterDO.getZookeeper());
if (!rli.hasData()){
return Result.buildFrom(ResultStatus.BROKER_NOT_EXIST);
}
// create a region for the standby cluster
RegionDO regionDO = new RegionDO(DBStatusEnum.ALIVE.getStatus(), clusterDO.getClusterName(), clusterDO.getId(), ListUtils.intList2String(rli.getData()));
rs = regionService.createRegion(regionDO);
if (!ResultStatus.SUCCESS.equals(rs)){
TransactionAspectSupport.currentTransactionStatus().setRollbackOnly();
return Result.buildFrom(rs);
}
// create a logical cluster for the standby cluster
List<LogicalClusterDO> logicalClusterDOS = logicalClusterService.getByPhysicalClusterId(activeClusterId);
if (!logicalClusterDOS.isEmpty()) {
// if the active cluster has logical clusters, create a corresponding one for the standby cluster
Integer mode = logicalClusterDOS.get(0).getMode();
LogicalClusterDO logicalClusterDO = new LogicalClusterDO(
clusterDO.getClusterName(),
clusterDO.getClusterName(),
ClusterModeEnum.INDEPENDENT_MODE.getCode().equals(mode)?mode:ClusterModeEnum.SHARED_MODE.getCode(),
ClusterModeEnum.INDEPENDENT_MODE.getCode().equals(mode)?logicalClusterDOS.get(0).getAppId(): "",
clusterDO.getId(),
regionDO.getId().toString()
);
ResultStatus clcRS = logicalClusterService.createLogicalCluster(logicalClusterDO);
if (clcRS.getCode() != ResultStatus.SUCCESS.getCode()){
TransactionAspectSupport.currentTransactionStatus().setRollbackOnly();
return Result.buildFrom(clcRS);
}
}
return haClusterService.createHA(activeClusterId, clusterDO.getId(), operator);
}
@Override
@Transactional
public Result<Void> deleteById(Long clusterId, String operator) {
HaASRelationDO haRelationDO = haClusterService.getHA(clusterId);
if (haRelationDO == null){
return clusterService.deleteById(clusterId, operator);
}
Result rv = checkForDelete(haRelationDO, clusterId);
if (rv.failed()){
return rv;
}
// remove the HA relation
Result result = haClusterService.deleteHA(haRelationDO.getActiveClusterPhyId(), haRelationDO.getStandbyClusterPhyId());
if (result.failed()){
return result;
}
// delete the cluster
result = clusterService.deleteById(clusterId, operator);
if (result.failed()){
return result;
}
return Result.buildSuc();
}
private Result<Void> checkForDelete(HaASRelationDO haRelationDO, Long clusterId){
List<HaASRelationDO> relationDOS = haASRelationService.listAllHAFromDB(haRelationDO.getActiveClusterPhyId(),
haRelationDO.getStandbyClusterPhyId(),
HaResTypeEnum.TOPIC);
if (relationDOS.stream().filter(relationDO -> !relationDO.getActiveResName().startsWith("__")).count() > 0){
return Result.buildFromRSAndMsg(ResultStatus.OPERATION_FORBIDDEN, "集群还存在高可用topic");
}
return Result.buildSuc();
}
}

View File

@@ -0,0 +1,559 @@
package com.xiaojukeji.kafka.manager.service.biz.ha.impl;
import com.xiaojukeji.kafka.manager.common.bizenum.ha.HaResTypeEnum;
import com.xiaojukeji.kafka.manager.common.bizenum.ha.HaStatusEnum;
import com.xiaojukeji.kafka.manager.common.constant.MsgConstant;
import com.xiaojukeji.kafka.manager.common.entity.Result;
import com.xiaojukeji.kafka.manager.common.entity.ResultStatus;
import com.xiaojukeji.kafka.manager.common.entity.TopicOperationResult;
import com.xiaojukeji.kafka.manager.common.entity.ao.ha.HaSwitchTopic;
import com.xiaojukeji.kafka.manager.common.entity.dto.op.topic.HaTopicRelationDTO;
import com.xiaojukeji.kafka.manager.common.entity.pojo.ClusterDO;
import com.xiaojukeji.kafka.manager.common.entity.pojo.TopicDO;
import com.xiaojukeji.kafka.manager.common.entity.pojo.ha.HaASRelationDO;
import com.xiaojukeji.kafka.manager.common.entity.pojo.ha.JobLogDO;
import com.xiaojukeji.kafka.manager.common.utils.BackoffUtils;
import com.xiaojukeji.kafka.manager.common.utils.ConvertUtil;
import com.xiaojukeji.kafka.manager.common.utils.ValidateUtils;
import com.xiaojukeji.kafka.manager.service.biz.ha.HaTopicManager;
import com.xiaojukeji.kafka.manager.service.cache.PhysicalClusterMetadataManager;
import com.xiaojukeji.kafka.manager.service.service.ClusterService;
import com.xiaojukeji.kafka.manager.service.service.JobLogService;
import com.xiaojukeji.kafka.manager.service.service.TopicManagerService;
import com.xiaojukeji.kafka.manager.service.service.gateway.AuthorityService;
import com.xiaojukeji.kafka.manager.service.service.ha.HaASRelationService;
import com.xiaojukeji.kafka.manager.service.service.ha.HaKafkaUserService;
import com.xiaojukeji.kafka.manager.service.service.ha.HaTopicService;
import com.xiaojukeji.kafka.manager.service.utils.ConfigUtils;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Component;
import java.util.*;
import java.util.stream.Collectors;
@Component
public class HaTopicManagerImpl implements HaTopicManager {
private static final Logger LOGGER = LoggerFactory.getLogger(HaTopicManagerImpl.class);
@Autowired
private ClusterService clusterService;
@Autowired
private AuthorityService authorityService;
@Autowired
private HaTopicService haTopicService;
@Autowired
private HaKafkaUserService haKafkaUserService;
@Autowired
private HaASRelationService haASRelationService;
@Autowired
private TopicManagerService topicManagerService;
@Autowired
private ConfigUtils configUtils;
@Autowired
private JobLogService jobLogService;
@Override
public Result<HaSwitchTopic> switchHaWithCanRetry(Long newActiveClusterPhyId,
Long newStandbyClusterPhyId,
List<String> switchTopicNameList,
boolean focus,
boolean firstTriggerExecute,
JobLogDO switchLogTemplate,
String operator) {
LOGGER.info(
"method=switchHaWithCanRetry||newActiveClusterPhyId={}||newStandbyClusterPhyId={}||switchTopicNameList={}||focus={}||operator={}",
newActiveClusterPhyId, newStandbyClusterPhyId, ConvertUtil.obj2Json(switchTopicNameList), focus, operator
);
// 1. fetch the two clusters
ClusterDO newActiveClusterPhyDO = clusterService.getById(newActiveClusterPhyId);
if (ValidateUtils.isNull(newActiveClusterPhyDO)) {
return Result.buildFromRSAndMsg(ResultStatus.RESOURCE_NOT_EXIST, MsgConstant.getClusterPhyNotExist(newActiveClusterPhyId));
}
ClusterDO newStandbyClusterPhyDO = clusterService.getById(newStandbyClusterPhyId);
if (ValidateUtils.isNull(newStandbyClusterPhyDO)) {
return Result.buildFromRSAndMsg(ResultStatus.RESOURCE_NOT_EXIST, MsgConstant.getClusterPhyNotExist(newStandbyClusterPhyId));
}
        // 2. Validate parameters and load the topic-level active/standby relations
Result<List<HaASRelationDO>> doListResult = this.checkParamAndGetASRelation(newActiveClusterPhyId, newStandbyClusterPhyId, switchTopicNameList);
if (doListResult.failed()) {
LOGGER.error(
"method=switchHaWithCanRetry||newActiveClusterPhyId={}||newStandbyClusterPhyId={}||switchTopicNameList={}||paramErrResult={}||operator={}",
newActiveClusterPhyId, newStandbyClusterPhyId, ConvertUtil.obj2Json(switchTopicNameList), doListResult, operator
);
return Result.buildFromIgnoreData(doListResult);
}
List<HaASRelationDO> doList = doListResult.getData();
        // 3. On the first trigger, move relations still in the stable state into the switching-prepare state
for (HaASRelationDO relationDO: doList) {
if (firstTriggerExecute && relationDO.getStatus().equals(HaStatusEnum.STABLE_CODE)) {
relationDO.setStatus(HaStatusEnum.SWITCHING_PREPARE_CODE);
haASRelationService.updateRelationStatus(relationDO.getId(), HaStatusEnum.SWITCHING_PREPARE_CODE);
}
}
        // 4. Pre-switch handling
HaSwitchTopic switchTopic = this.prepareSwitching(newStandbyClusterPhyDO, doList, focus, switchLogTemplate);
        // 5. Wait a flat 10 seconds so the in-flight data has a chance to finish syncing
BackoffUtils.backoff(10000);
        // 6. Check whether the active and standby data are in sync
for (HaASRelationDO relationDO: doList) {
switchTopic.addHaSwitchTopic(this.checkTopicInSync(newActiveClusterPhyDO, newStandbyClusterPhyDO, relationDO, focus, switchLogTemplate));
}
        // 7. Remove the fetch (sync) config of the old standby topics
for (HaASRelationDO relationDO: doList) {
switchTopic.addHaSwitchTopic(this.oldStandbyTopicDelFetchConfig(newActiveClusterPhyDO, newStandbyClusterPhyDO, relationDO, focus, switchLogTemplate, operator));
}
        // 8. Add the fetch (sync) config for the new standby topics
switchTopic.addHaSwitchTopic(this.newStandbyTopicAddFetchConfig(newActiveClusterPhyDO, newStandbyClusterPhyDO, doList, focus, switchLogTemplate, operator));
        // 9. Close out the switch
switchTopic.addHaSwitchTopic(this.closeoutSwitching(newActiveClusterPhyDO, newStandbyClusterPhyDO, configUtils.getDKafkaGatewayZK(), doList, focus, switchLogTemplate));
        // 10. Collect the per-topic status results
doList.forEach(elem -> switchTopic.addActiveTopicStatus(elem.getActiveResName(), elem.getStatus()));
        // 11. Log and return
LOGGER.info(
"method=switchHaWithCanRetry||newActiveClusterPhyId={}||newStandbyClusterPhyId={}||switchTopicNameList={}||switchResult={}||operator={}",
newActiveClusterPhyId, newStandbyClusterPhyId, ConvertUtil.obj2Json(switchTopicNameList), switchTopic, operator
);
return Result.buildSuc(switchTopic);
}
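    /**
     * Build the HA relation for each requested topic; when dto.getAll() is set, every user topic on the
     * active cluster that is not yet HA-enabled is included. Returns one TopicOperationResult per topic.
     */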
@Override
public Result<List<TopicOperationResult>> batchCreateHaTopic(HaTopicRelationDTO dto, String operator) {
List<HaASRelationDO> relationDOS = haASRelationService.listAllHAFromDB(dto.getActiveClusterId(), dto.getStandbyClusterId(), HaResTypeEnum.CLUSTER);
if (relationDOS.isEmpty()){
return Result.buildFromRSAndMsg(ResultStatus.RESOURCE_NOT_EXIST, "集群高可用关系未建立");
}
        // topics on the active cluster that are already HA-enabled (only the key set of topic names is used below)
Map<String, Integer> haRelationMap = haTopicService.getRelation(dto.getActiveClusterId());
List<String> topicNames = dto.getTopicNames();
if (dto.getAll()){
topicNames = topicManagerService.getByClusterId(dto.getActiveClusterId())
.stream()
                    .filter(topicDO -> !topicDO.getTopicName().startsWith("__")) // drop Kafka internal topics
                    .filter(topicDO -> !haRelationMap.keySet().contains(topicDO.getTopicName())) // drop topics that are already HA-enabled
.filter(topicDO -> PhysicalClusterMetadataManager.isTopicExist(dto.getActiveClusterId(), topicDO.getTopicName()))
.map(TopicDO::getTopicName)
.collect(Collectors.toList());
}
List<TopicOperationResult> operationResultList = new ArrayList<>();
topicNames.forEach(topicName->{
Result<Void> rv = haTopicService.createHA(dto.getActiveClusterId(), dto.getStandbyClusterId(),topicName, operator);
operationResultList.add(TopicOperationResult.buildFrom(dto.getActiveClusterId(), topicName, rv));
});
return Result.buildSuc(operationResultList);
}
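    /**
     * Dissolve the HA relation for each requested topic; returns early if any topic has no
     * active/standby relation on record.
     */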
@Override
public Result<List<TopicOperationResult>> batchRemoveHaTopic(HaTopicRelationDTO dto, String operator) {
List<HaASRelationDO> relationDOS = haASRelationService.listAllHAFromDB(dto.getActiveClusterId(), dto.getStandbyClusterId(), HaResTypeEnum.CLUSTER);
if (relationDOS.isEmpty()){
return Result.buildFromRSAndMsg(ResultStatus.RESOURCE_NOT_EXIST, "集群高可用关系未建立");
}
List<TopicOperationResult> operationResultList = new ArrayList<>();
for(String topicName : dto.getTopicNames()){
HaASRelationDO relationDO = haASRelationService.getHAFromDB(
dto.getActiveClusterId(),
topicName,
HaResTypeEnum.TOPIC
);
if (relationDO == null) {
return Result.buildFromRSAndMsg(ResultStatus.RESOURCE_NOT_EXIST, "主备关系不存在");
}
Result<Void> rv = haTopicService.deleteHA(relationDO.getActiveClusterPhyId(), relationDO.getStandbyClusterPhyId(), topicName, operator);
operationResultList.add(TopicOperationResult.buildFrom(dto.getActiveClusterId(), topicName, rv));
}
return Result.buildSuc(operationResultList);
}
/**************************************************** private method ****************************************************/
private void saveLogs(JobLogDO switchLogTemplate, String content) {
jobLogService.addLogAndIgnoreException(switchLogTemplate.setAndCopyNew(new Date(), content));
}
    /**
     * Pre-switch handling:
     * 1. On the (old) active cluster, set the active cluster of every KafkaUser related to the topics to None
     */
private HaSwitchTopic prepareSwitching(ClusterDO oldActiveClusterPhyDO, List<HaASRelationDO> doList, boolean focus, JobLogDO switchLogTemplate) {
        // KafkaUsers whose HA has already been paused (a user shared by several topics only needs one ZK write)
Set<String> stoppedHaKafkaUserSet = new HashSet<>();
HaSwitchTopic haSwitchTopic = new HaSwitchTopic(true);
        boolean allSuccess = true; // all operations succeeded
        boolean needLog = false; // whether a log entry needs to be written
for (HaASRelationDO relationDO: doList) {
if (!relationDO.getStatus().equals(HaStatusEnum.SWITCHING_PREPARE_CODE)) {
                // not currently in the prepare state, skip this relation
haSwitchTopic.setFinished(true);
continue;
}
needLog = true;
            // collect the KafkaUsers related to this topic
Set<String> relatedKafkaUserSet = authorityService.getAuthorityByTopic(relationDO.getActiveClusterPhyId(), relationDO.getActiveResName())
.stream()
.map(elem -> elem.getAppId())
.filter(kafkaUser -> !stoppedHaKafkaUserSet.contains(kafkaUser))
.collect(Collectors.toSet());
            // pause HA for each related KafkaUser
for (String kafkaUser: relatedKafkaUserSet) {
Result<Void> rv = haKafkaUserService.setNoneHAInKafka(oldActiveClusterPhyDO.getZookeeper(), kafkaUser);
if (rv.failed() && !focus) {
haSwitchTopic.setFinished(false);
this.saveLogs(switchLogTemplate, String.format("%s:\t失败1分钟后再进行重试", HaStatusEnum.SWITCHING_PREPARE.getMsg(oldActiveClusterPhyDO.getClusterName())));
return haSwitchTopic;
} else if (rv.failed() && focus) {
allSuccess = false;
}
}
            // remember the users that have already been handled
stoppedHaKafkaUserSet.addAll(relatedKafkaUserSet);
            // advance the topic's active/standby status
relationDO.setStatus(HaStatusEnum.SWITCHING_WAITING_IN_SYNC_CODE);
haASRelationService.updateRelationStatus(relationDO.getId(), HaStatusEnum.SWITCHING_WAITING_IN_SYNC_CODE);
}
if (needLog) {
this.saveLogs(switchLogTemplate, String.format("%s:\t%s", HaStatusEnum.SWITCHING_PREPARE.getMsg(oldActiveClusterPhyDO.getClusterName()), allSuccess? "成功": "存在失败,但进行强制执行,跳过该操作"));
}
haSwitchTopic.setFinished(true);
return haSwitchTopic;
}
    /**
     * Wait for the standby topic to catch up with the active topic (checked via the fetch lag)
     */
private HaSwitchTopic checkTopicInSync(ClusterDO newActiveClusterPhyDO, ClusterDO newStandbyClusterPhyDO, HaASRelationDO relationDO, boolean focus, JobLogDO switchLogTemplate) {
HaSwitchTopic haSwitchTopic = new HaSwitchTopic(true);
if (!relationDO.getStatus().equals(HaStatusEnum.SWITCHING_WAITING_IN_SYNC_CODE)) {
            // wrong state, skip
haSwitchTopic.setFinished(true);
return haSwitchTopic;
}
if (focus) {
            // forced switch: no need to wait for in-sync
            // advance the topic's active/standby status
relationDO.setStatus(HaStatusEnum.SWITCHING_CLOSE_OLD_STANDBY_TOPIC_FETCH_CODE);
haASRelationService.updateRelationStatus(relationDO.getId(), HaStatusEnum.SWITCHING_CLOSE_OLD_STANDBY_TOPIC_FETCH_CODE);
haSwitchTopic.setFinished(true);
this.saveLogs(switchLogTemplate, String.format(
"%s:\tTopic:[%s] 强制切换,跳过等待主备同步完成,直接进入下一步",
HaStatusEnum.SWITCHING_WAITING_IN_SYNC.getMsg(newActiveClusterPhyDO.getClusterName()),
relationDO.getActiveResName()
));
return haSwitchTopic;
}
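        // non-forced switch: block further progress until the standby topic's fetch lag drops to zero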
Result<Long> lagResult = haTopicService.getStandbyTopicFetchLag(newStandbyClusterPhyDO.getId(), relationDO.getStandbyResName());
if (lagResult.failed()) {
            // failed to fetch the lag information
this.saveLogs(switchLogTemplate, String.format(
"%s:\tTopic:[%s] 获取同步的Lag信息失败1分钟后再检查是否主备同步完成",
HaStatusEnum.SWITCHING_WAITING_IN_SYNC.getMsg(newActiveClusterPhyDO.getClusterName()),
relationDO.getActiveResName()
));
haSwitchTopic.setFinished(false);
return haSwitchTopic;
}
if (lagResult.getData().longValue() > 0) {
this.saveLogs(switchLogTemplate, String.format(
"%s:\tTopic:[%s] 还存在 %d 条数据未同步完成1分钟后再检查是否主备同步完成",
HaStatusEnum.SWITCHING_WAITING_IN_SYNC.getMsg(newActiveClusterPhyDO.getClusterName()),
relationDO.getActiveResName(),
lagResult.getData()
));
haSwitchTopic.setFinished(false);
return haSwitchTopic;
}
        // advance the topic's active/standby status
relationDO.setStatus(HaStatusEnum.SWITCHING_CLOSE_OLD_STANDBY_TOPIC_FETCH_CODE);
haASRelationService.updateRelationStatus(relationDO.getId(), HaStatusEnum.SWITCHING_CLOSE_OLD_STANDBY_TOPIC_FETCH_CODE);
haSwitchTopic.setFinished(true);
this.saveLogs(switchLogTemplate, String.format(
"%s:\tTopic:[%s] 主备同步完成",
HaStatusEnum.SWITCHING_WAITING_IN_SYNC.getMsg(newActiveClusterPhyDO.getClusterName()),
relationDO.getActiveResName()
));
return haSwitchTopic;
}
    /**
     * Remove the old standby topic's config for fetching data from the active topic
     */
private HaSwitchTopic oldStandbyTopicDelFetchConfig(ClusterDO newActiveClusterPhyDO, ClusterDO newStandbyClusterPhyDO, HaASRelationDO relationDO, boolean focus, JobLogDO switchLogTemplate, String operator) {
HaSwitchTopic haSwitchTopic = new HaSwitchTopic(true);
if (!relationDO.getStatus().equals(HaStatusEnum.SWITCHING_CLOSE_OLD_STANDBY_TOPIC_FETCH_CODE)) {
            // wrong state, skip
haSwitchTopic.setFinished(true);
return haSwitchTopic;
}
Result<Void> rv = haTopicService.stopHAInKafka(
                newActiveClusterPhyDO, relationDO.getStandbyResName(), // the old standby topic
operator
);
if (rv.failed() && !focus) {
this.saveLogs(switchLogTemplate, String.format("%s:\tTopic:[%s] 失败1分钟后再进行重试", HaStatusEnum.SWITCHING_CLOSE_OLD_STANDBY_TOPIC_FETCH.getMsg(newActiveClusterPhyDO.getClusterName()), relationDO.getActiveResName()));
haSwitchTopic.setFinished(false);
return haSwitchTopic;
} else if (rv.failed() && focus) {
this.saveLogs(switchLogTemplate, String.format("%s:\tTopic:[%s] 失败,但进行强制执行,跳过该操作", HaStatusEnum.SWITCHING_CLOSE_OLD_STANDBY_TOPIC_FETCH.getMsg(newActiveClusterPhyDO.getClusterName()), relationDO.getActiveResName()));
} else {
this.saveLogs(switchLogTemplate, String.format("%s:\tTopic:[%s] 成功", HaStatusEnum.SWITCHING_CLOSE_OLD_STANDBY_TOPIC_FETCH.getMsg(newActiveClusterPhyDO.getClusterName()), relationDO.getActiveResName()));
}
        // advance the topic's active/standby status
relationDO.setStatus(HaStatusEnum.SWITCHING_OPEN_NEW_STANDBY_TOPIC_FETCH_CODE);
haASRelationService.updateRelationStatus(relationDO.getId(), HaStatusEnum.SWITCHING_OPEN_NEW_STANDBY_TOPIC_FETCH_CODE);
haSwitchTopic.setFinished(true);
return haSwitchTopic;
}
    /**
     * Add the new standby topic's config for fetching data from the new active topic
     */
private HaSwitchTopic newStandbyTopicAddFetchConfig(ClusterDO newActiveClusterPhyDO,
ClusterDO newStandbyClusterPhyDO,
List<HaASRelationDO> doList,
boolean focus,
JobLogDO switchLogTemplate,
String operator) {
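        // forceAndFailed: set once a forced switch hits a failure; the remaining topics then skip the ZK write and only log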
boolean forceAndFailed = false;
for (HaASRelationDO relationDO: doList) {
if (!relationDO.getStatus().equals(HaStatusEnum.SWITCHING_OPEN_NEW_STANDBY_TOPIC_FETCH_CODE)) {
                // wrong state, skip
continue;
}
Result<Void> rv = null;
if (!forceAndFailed) {
                // no forced-switch failure so far, so still attempt the operation
rv = haTopicService.activeHAInKafka(
newActiveClusterPhyDO, relationDO.getStandbyResName(),
newStandbyClusterPhyDO, relationDO.getStandbyResName(),
operator
);
}
if (forceAndFailed) {
                // a forced switch has already failed once: just record the log and skip this topic
this.saveLogs(switchLogTemplate, String.format("%s:\tTopic:[%s] 失败,但因为是强制执行且强制执行时依旧出现操作失败,因此直接跳过该操作", HaStatusEnum.SWITCHING_OPEN_NEW_STANDBY_TOPIC_FETCH.getMsg(newStandbyClusterPhyDO.getClusterName()), relationDO.getActiveResName()));
} else if (rv.failed() && !focus) {
                // failed and not a forced switch: return immediately
this.saveLogs(switchLogTemplate, String.format("%s:\tTopic:[%s] 失败1分钟后再进行重试", HaStatusEnum.SWITCHING_OPEN_NEW_STANDBY_TOPIC_FETCH.getMsg(newStandbyClusterPhyDO.getClusterName()), relationDO.getActiveResName()));
return new HaSwitchTopic(false);
} else if (rv.failed() && focus) {
                // failed, but this is a forced switch: log it and continue
this.saveLogs(switchLogTemplate, String.format("%s:\tTopic:[%s] 失败,但因为是强制执行,因此跳过该操作", HaStatusEnum.SWITCHING_OPEN_NEW_STANDBY_TOPIC_FETCH.getMsg(newStandbyClusterPhyDO.getClusterName()), relationDO.getActiveResName()));
forceAndFailed = true;
} else {
                // record the success log
this.saveLogs(switchLogTemplate, String.format("%s:\tTopic:[%s] 成功", HaStatusEnum.SWITCHING_OPEN_NEW_STANDBY_TOPIC_FETCH.getMsg(newStandbyClusterPhyDO.getClusterName()), relationDO.getActiveResName()));
}
            // advance the topic's active/standby status
relationDO.setStatus(HaStatusEnum.SWITCHING_CLOSEOUT_CODE);
haASRelationService.updateRelationStatus(relationDO.getId(), HaStatusEnum.SWITCHING_CLOSEOUT_CODE);
}
return new HaSwitchTopic(true);
}
    /**
     * Close out the switch:
     * 1. On the former active cluster, point each user's active cluster to the new active cluster
     * 2. On the former standby cluster, point each user's active cluster to the new active cluster
     * 3. On the gateway, point each user's active cluster to the new active cluster
     */
private HaSwitchTopic closeoutSwitching(ClusterDO newActiveClusterPhyDO, ClusterDO newStandbyClusterPhyDO, String gatewayZK, List<HaASRelationDO> doList, boolean focus, JobLogDO switchLogTemplate) {
        // KafkaUsers that have already been re-pointed to the new active cluster
Set<String> activeHaKafkaUserSet = new HashSet<>();
boolean allSuccess = true;
boolean needLog = false;
        boolean forceAndNewStandbyFailed = false; // forced switch, but the operation on the new standby cluster still failed
HaSwitchTopic haSwitchTopic = new HaSwitchTopic(true);
for (HaASRelationDO relationDO: doList) {
if (!relationDO.getStatus().equals(HaStatusEnum.SWITCHING_CLOSEOUT_CODE)) {
                // not currently in the closeout state, skip this relation
haSwitchTopic.setFinished(false);
continue;
}
needLog = true;
            // collect the KafkaUsers related to this topic
Set<String> relatedKafkaUserSet = authorityService.getAuthorityByTopic(relationDO.getActiveClusterPhyId(), relationDO.getActiveResName())
.stream()
.map(elem -> elem.getAppId())
.filter(kafkaUser -> !activeHaKafkaUserSet.contains(kafkaUser))
.collect(Collectors.toSet());
for (String kafkaUser: relatedKafkaUserSet) {
                // update the new active cluster
Result<Void> rv = haKafkaUserService.activeHAInKafka(newActiveClusterPhyDO.getZookeeper(), newActiveClusterPhyDO.getId(), kafkaUser);
if (rv.failed() && !focus) {
haSwitchTopic.setFinished(false);
this.saveLogs(switchLogTemplate, String.format("%s:\t失败1分钟后再进行重试", HaStatusEnum.SWITCHING_CLOSEOUT.getMsg(newActiveClusterPhyDO.getClusterName())));
return haSwitchTopic;
} else if (rv.failed() && focus) {
allSuccess = false;
}
                // update the new standby cluster; once an error occurs, stop writing to its ZK. The new standby topics are less critical, so skipping is acceptable here
rv = null;
if (!forceAndNewStandbyFailed) {
                    // skip once a failure has occurred while updating the standby cluster
rv = haKafkaUserService.activeHAInKafka(newStandbyClusterPhyDO.getZookeeper(), newActiveClusterPhyDO.getId(), kafkaUser);
}
if (rv != null && rv.failed() && !focus) {
haSwitchTopic.setFinished(false);
this.saveLogs(switchLogTemplate, String.format("%s:\t失败1分钟后再进行重试", HaStatusEnum.SWITCHING_CLOSEOUT.getMsg(newActiveClusterPhyDO.getClusterName())));
return haSwitchTopic;
} else if (rv != null && rv.failed() && focus) {
allSuccess = false;
forceAndNewStandbyFailed = true;
}
                // update the gateway
rv = haKafkaUserService.activeHAInKafka(gatewayZK, newActiveClusterPhyDO.getId(), kafkaUser);
if (rv.failed() && !focus) {
haSwitchTopic.setFinished(false);
this.saveLogs(switchLogTemplate, String.format("%s:\t失败1分钟后再进行重试", HaStatusEnum.SWITCHING_CLOSEOUT.getMsg(newActiveClusterPhyDO.getClusterName())));
return haSwitchTopic;
} else if (rv.failed() && focus) {
allSuccess = false;
}
}
            // remember the users that have already been re-activated
activeHaKafkaUserSet.addAll(relatedKafkaUserSet);
            // update the topic's active/standby relation (roles swapped, status back to stable)
HaASRelationDO newHaASRelationDO = new HaASRelationDO(
newActiveClusterPhyDO.getId(), relationDO.getActiveResName(),
newStandbyClusterPhyDO.getId(), relationDO.getStandbyResName(),
HaResTypeEnum.TOPIC.getCode(),
HaStatusEnum.STABLE_CODE
);
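            // reuse the existing row id so the swapped relation overwrites the old record in place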
newHaASRelationDO.setId(relationDO.getId());
haASRelationService.updateById(newHaASRelationDO);
}
if (!needLog) {
return haSwitchTopic;
}
this.saveLogs(switchLogTemplate, String.format("%s:\t%s", HaStatusEnum.SWITCHING_CLOSEOUT.getMsg(newActiveClusterPhyDO.getClusterName()), allSuccess? "成功": "存在失败,但进行强制执行,跳过该操作"));
return haSwitchTopic;
}
    /**
     * Validate parameters and load the active/standby relation for every topic in the list
     */
private Result<List<HaASRelationDO>> checkParamAndGetASRelation(Long activeClusterPhyId, Long standbyClusterPhyId, List<String> switchTopicNameList) {
List<HaASRelationDO> doList = new ArrayList<>();
for (String topicName: switchTopicNameList) {
Result<HaASRelationDO> doResult = this.checkParamAndGetASRelation(activeClusterPhyId, standbyClusterPhyId, topicName);
if (doResult.failed()) {
return Result.buildFromIgnoreData(doResult);
}
doList.add(doResult.getData());
}
return Result.buildSuc(doList);
}
    /**
     * Validate parameters and load the active/standby relation for a single topic
     */
private Result<HaASRelationDO> checkParamAndGetASRelation(Long activeClusterPhyId, Long standbyClusterPhyId, String topicName) {
        // the new active topic must exist; the new standby topic may be absent
if (!PhysicalClusterMetadataManager.isTopicExist(activeClusterPhyId, topicName)) {
return Result.buildFromRSAndMsg(
ResultStatus.RESOURCE_NOT_EXIST,
String.format("新的主集群ID:[%d]-Topic:[%s] 不存在", activeClusterPhyId, topicName)
);
}
        // check whether the active/standby relation exists
HaASRelationDO relationDO = haASRelationService.getSpecifiedHAFromDB(
standbyClusterPhyId,
topicName,
activeClusterPhyId,
topicName,
HaResTypeEnum.TOPIC
);
if (relationDO == null) {
            // check whether the post-switch relation already exists; if so, it will simply be re-established later
relationDO = haASRelationService.getSpecifiedHAFromDB(
activeClusterPhyId,
topicName,
standbyClusterPhyId,
topicName,
HaResTypeEnum.TOPIC
);
}
if (relationDO == null) {
            // the active/standby relation does not exist
return Result.buildFromRSAndMsg(
ResultStatus.RESOURCE_NOT_EXIST,
String.format("主集群ID:[%d]-Topic:[%s], 备集群ID:[%d] Topic:[%s] 的主备关系不存在,因此无法切换", activeClusterPhyId, topicName, standbyClusterPhyId, topicName)
);
}
return Result.buildSuc(relationDO);
}
}

Some files were not shown because too many files have changed in this diff.