Compare commits


10 Commits

Author SHA1 Message Date
Ve Lee
0388612a54 Merge the main branch (#1241) 2024-10-12 20:26:14 +08:00
Ve Lee
4c10b4ce9c Fix node download url (#1240) 2024-10-12 19:59:58 +08:00
liwei
e5836dc29f Fix node download url 2024-10-12 19:44:18 +08:00
Ve Lee
99e086c1c5 [Bugfix] fix Topic level metric query (#1239)
1. For Topic-level queries, the aggregation type for BytesIn and BytesOut should be sum, not avg.
2. The getAggListMetrics DSL needs an additional brokerAgg = 1 condition.
2024-10-12 14:37:56 +08:00
ruanliang-hualun
a4085adf10 [Bugfix] fix Topic level metric query 2024-10-07 20:51:22 +08:00
Peng
bfc6999c93 Update README.md 2024-08-23 15:38:04 +08:00
chang-wd
260cbb92d2 [Feature] Consume just filter key or value, not both: consumed messages can be filtered by key only or by value only. (#1157)
close #1155

Consume just filter key or value, not both. Message consumption now supports filtering by the key alone or the value alone.

---------

Co-authored-by: weidong_chang <weidong_chang@intsig.net>
2024-06-30 22:56:36 +08:00
Peng
232f06e5c2 Update README.md 2024-06-25 17:19:25 +08:00
jiangminbing
fcf0a08e0a [Bugfix] Fix BrokerConfigServiceImpl.getBrokerConfigByZKClient always returning an empty result (#1198)
Fixes the empty list returned when fetching broker configs from ZooKeeper.

Co-authored-by: jiangmb <jiangmb@televehicle.com>
2024-01-06 16:40:11 +08:00
fang
68839a6725 [DOC] Add a document on storing and using the MySQL password in encrypted form (#1135) 2023-12-10 01:15:46 +08:00
12 changed files with 182 additions and 15 deletions

View File

@@ -136,7 +136,7 @@
👍 We are building the largest and most authoritative **[Kafka中文社区](https://z.didi.cn/5gSF9)** (Kafka Chinese community) in China.
Here you can meet Kafka experts from major internet companies and 4000+ Kafka enthusiasts, share knowledge, and stay on top of the latest industry news. We look forward 👏 &nbsp; to having you join us~ https://z.didi.cn/5gSF9
Here you can meet Kafka experts from major internet companies and 6200+ Kafka enthusiasts, share knowledge, and stay on top of the latest industry news. We look forward 👏 &nbsp; to having you join us~ https://z.didi.cn/5gSF9
Every question gets an answer~ and there are small gifts for joining in~
@@ -146,7 +146,7 @@ PS: When asking a question, please describe it fully in one message and include your environment details
**`2. WeChat group`**
To join via WeChat: add the WeChat account `PenceXie` or `szzdzhp001` and include the note "KnowStreaming" to be added to the group.
To join via WeChat: add the WeChat account `PenceXie` and include the note "KnowStreaming" to be added to the group.
<br/>
Before joining the group, please take a moment to give the project a star; a small star motivates the KnowStreaming authors to keep building the community.

View File

@@ -0,0 +1,115 @@
## Guide: Storing the MySQL Password in the YML File in Encrypted Form
### 1. Encryption for local deployment
**Step 1: Generate the ciphertext**
Find jasypt-1.9.3.jar in the local Maven repository (by default under org/jasypt/jasypt/1.9.3) and use `java -cp` to generate the ciphertext.
```bash
java -cp jasypt-1.9.3.jar org.jasypt.intf.cli.JasyptPBEStringEncryptionCLI input=<mysql-password> password=<encryption-salt> algorithm=PBEWithMD5AndDES
```
```bash
## Resulting ciphertext
DYbVDLg5D0WRcJSCUGWjiw==
```
**Step 2: Configure jasypt**
Configure jasypt in the YML file, for example:
```yaml
jasypt:
encryptor:
algorithm: PBEWithMD5AndDES
iv-generator-classname: org.jasypt.iv.NoIvGenerator
```
**Step 3: Configure the ciphertext**
Replace the plaintext password in the YML file with ENC(ciphertext), for example the MySQL password in [application.yml](https://github.com/didi/KnowStreaming/blob/master/km-rest/src/main/resources/application.yml).
```yaml
know-streaming:
username: root
password: ENC(DYbVDLg5D0WRcJSCUGWjiw==)
```
**Step 4: Configure the encryption salt (choose one of the following)**
- Put it in the YML file (not recommended):
```yaml
jasypt:
encryptor:
password: salt
```
- Pass it as a command-line argument when starting the application:
```bash
java -jar xxx.jar --jasypt.encryptor.password=salt
```
- Provide it via an environment variable when starting the application:
```bash
export JASYPT_PASSWORD=salt
java -jar xxx.jar --jasypt.encryptor.password=${JASYPT_PASSWORD}
```
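To double-check that a ciphertext decrypts back to the original password before starting the application, the same jasypt jar also ships a decryption CLI. A minimal sketch, assuming the jar, salt, and algorithm from step 1 (the ciphertext is the sample value generated above):
```bash
# Decrypt the sample ciphertext; the output should be the original MySQL password
java -cp jasypt-1.9.3.jar org.jasypt.intf.cli.JasyptPBEStringDecryptionCLI \
  input=DYbVDLg5D0WRcJSCUGWjiw== password=<encryption-salt> algorithm=PBEWithMD5AndDES
```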
### 2. Encryption for container deployment
Use the secret mechanism provided by Docker Swarm to store the password in encrypted form and let Docker Swarm manage it.
#### 2.1 Encrypted storage with a Swarm secret
**Step 1: Initialize Docker Swarm**
```bash
docker swarm init
```
**Step 2: Create the secret**
```bash
echo "admin2022_" | docker secret create mysql_password -
# The secret ID is printed
f964wi4gg946hu78quxsh2ge9
```
**Step 3: Use the secret**
```yaml
# MySQL user's password
SERVER_MYSQL_USER: root
SERVER_MYSQL_PASSWORD: mysql_password
knowstreaming-mysql:
# root user's password
MYSQL_ROOT_PASSWORD: mysql_password
secrets:
mysql_password:
external: true
```
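The `external: true` entry above means the secret was created beforehand with `docker secret create`, which only takes effect when the services run as a Swarm stack. A minimal deployment sketch (the compose file name and stack name here are assumptions, not taken from this repository):
```bash
# Deploy the compose file as a Swarm stack so the external secret is mounted into the services
docker stack deploy --compose-file docker-compose.yml knowstreaming
```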
#### 2.2 Encryption with a secret file
**Step 1: Create the secret file**
```bash
echo "admin2022_" > password
```
**Step 2: Use the secret file**
```yaml
# MySQL user's password
SERVER_MYSQL_USER: root
SERVER_MYSQL_PASSWORD: mysql_password
secrets:
mysql_password:
file: ./password
```
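In both variants, Docker mounts the secret as a read-only file at `/run/secrets/<name>` inside every container that references it, which gives a quick way to verify the setup. A hedged sketch (the container name is hypothetical):
```bash
# The secret should be readable as a plain file inside a container that declares it
docker exec <knowstreaming-container> cat /run/secrets/mysql_password
```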

View File

@@ -144,6 +144,8 @@ const ConsumeClientTest = () => {
...configInfo,
needFilterKeyValue: changeValue === 1 || changeValue === 2,
needFilterSize: changeValue === 3 || changeValue === 4 || changeValue === 5,
needFilterKey: changeValue === 6,
needFilterValue: changeValue === 7,
});
break;
}

View File

@@ -16,19 +16,19 @@ export const cardList = [
export const filterList = [
{
label: 'none',
label: 'None',
value: 0,
},
{
label: 'contains',
label: 'Contains',
value: 1,
},
{
label: 'does not contains',
label: 'Does Not Contains',
value: 2,
},
{
label: 'equals',
label: 'Equals',
value: 3,
},
{
@@ -39,6 +39,14 @@ export const filterList = [
label: 'Under Size',
value: 5,
},
{
label: 'Key Contains',
value: 6,
},
{
label: 'Value Contains',
value: 7,
}
];
export const untilList = [
@@ -324,10 +332,10 @@ export const getFormConfig = (topicMetaData: any, info = {} as any, partitionLis
key: 'filterKey',
label: 'Key',
type: FormItemType.input,
invisible: !info?.needFilterKeyValue,
invisible: !info?.needFilterKeyValue && !info?.needFilterKey,
rules: [
{
required: info?.needFilterKeyValue,
required: info?.needFilterKeyValue || info?.needFilterKey,
message: '请输入Key',
},
],
@@ -336,10 +344,10 @@ export const getFormConfig = (topicMetaData: any, info = {} as any, partitionLis
key: 'filterValue',
label: 'Value',
type: FormItemType.input,
invisible: !info?.needFilterKeyValue,
invisible: !info?.needFilterKeyValue && !info?.needFilterValue,
rules: [
{
required: info?.needFilterKeyValue,
required: info?.needFilterKeyValue || info?.needFilterValue,
message: '请输入Value',
},
],

View File

@@ -44,7 +44,7 @@ const ExpandPartition = (props: { record: any; onConfirm: () => void }) => {
setLoading(true);
const metricParams = {
aggType: 'avg',
aggType: 'sum',
endTime: Math.round(endStamp),
metricsNames: ['BytesIn', 'BytesOut'],
startTime: Math.round(startStamp),

View File

@@ -32,8 +32,6 @@
<configuration>
<nodeVersion>v12.22.12</nodeVersion>
<npmVersion>6.14.16</npmVersion>
<nodeDownloadRoot>https://npm.taobao.org/mirrors/node/</nodeDownloadRoot>
<npmDownloadRoot>https://registry.npm.taobao.org/npm/-/</npmDownloadRoot>
</configuration>
</execution>
<execution>

View File

@@ -37,6 +37,7 @@ import scala.jdk.javaapi.CollectionConverters;
import javax.annotation.PostConstruct;
import java.util.*;
import java.util.stream.Collectors;
import static com.xiaojukeji.know.streaming.km.common.enums.version.VersionEnum.*;
@@ -154,9 +155,11 @@ public class BrokerConfigServiceImpl extends BaseKafkaVersionControlService impl
if (propertiesResult.failed()) {
return Result.buildFromIgnoreData(propertiesResult);
}
List<String> configKeyList = propertiesResult.getData().keySet().stream().map(Object::toString).collect(Collectors.toList());
return Result.buildSuc(KafkaConfigConverter.convert2KafkaBrokerConfigDetailList(
new ArrayList<>(),
configKeyList,
propertiesResult.getData()
));
}

View File

@@ -451,6 +451,18 @@ public class KafkaClientTestManagerImpl implements KafkaClientTestManager {
return Result.buildFromRSAndMsg(ResultStatus.PARAM_ILLEGAL, "包含的方式过滤必须有过滤的key或value");
}
// key-contains filter
if (KafkaConsumerFilterEnum.KEY_CONTAINS.getCode().equals(filter.getFilterType())
&& ValidateUtils.isBlank(filter.getFilterCompareKey())) {
return Result.buildFromRSAndMsg(ResultStatus.PARAM_ILLEGAL, "key包含的方式过滤必须有过滤的key");
}
// value-contains filter
if (KafkaConsumerFilterEnum.VALUE_CONTAINS.getCode().equals(filter.getFilterType())
&& ValidateUtils.isBlank(filter.getFilterCompareValue())) {
return Result.buildFromRSAndMsg(ResultStatus.PARAM_ILLEGAL, "value包含的方式过滤必须有过滤的value");
}
// not-contains filter
if (KafkaConsumerFilterEnum.NOT_CONTAINS.getCode().equals(filter.getFilterType())
&& ValidateUtils.isBlank(filter.getFilterCompareKey()) && ValidateUtils.isBlank(filter.getFilterCompareValue())) {
@@ -550,6 +562,18 @@ public class KafkaClientTestManagerImpl implements KafkaClientTestManager {
return true;
}
// key-contains filter
if (KafkaConsumerFilterEnum.KEY_CONTAINS.getCode().equals(filter.getFilterType())
&& (!ValidateUtils.isBlank(filter.getFilterCompareKey()) && consumerRecord.key() != null && consumerRecord.key().toString().contains(filter.getFilterCompareKey()))) {
return true;
}
// value-contains filter
if (KafkaConsumerFilterEnum.VALUE_CONTAINS.getCode().equals(filter.getFilterType())
&& (!ValidateUtils.isBlank(filter.getFilterCompareValue()) && consumerRecord.value() != null && consumerRecord.value().toString().contains(filter.getFilterCompareValue()))) {
return true;
}
// not-contains filter
if (KafkaConsumerFilterEnum.NOT_CONTAINS.getCode().equals(filter.getFilterType())
&& (!ValidateUtils.isBlank(filter.getFilterCompareKey()) && (consumerRecord.key() == null || !consumerRecord.key().toString().contains(filter.getFilterCompareKey())))

View File

@@ -19,7 +19,7 @@ public class KafkaConsumerFilterDTO extends BaseDTO {
/**
* @see KafkaConsumerFilterEnum
*/
@Range(min = 0, max = 5, message = "filterType最大和最小值必须在[0, 5]之间")
@Range(min = 0, max = 7, message = "filterType最大和最小值必须在[0, 7]之间")
@ApiModelProperty(value = "开始消费位置的类型", example = "2")
private Integer filterType;

View File

@@ -22,6 +22,10 @@ public enum KafkaConsumerFilterEnum {
UNDER_SIZE(5, "size小于"),
KEY_CONTAINS(6, "key包含"),
VALUE_CONTAINS(7, "value包含"),
;
private final Integer code;

View File

@@ -17,6 +17,13 @@
}
}
},
{
"term": {
"brokerAgg" : {
"value": "1"
}
}
},
{
"range": {
"timestamp": {

View File

@@ -143,6 +143,12 @@
<version>${springboot.version}</version>
</dependency>
<dependency>
<groupId>com.github.ulisesbocchio</groupId>
<artifactId>jasypt-spring-boot-starter</artifactId>
<version>3.0.5</version>
</dependency>
<!--testcontainers-->
<dependency>
<groupId>org.testcontainers</groupId>