Commit 17e9610

Go for Secured Consumer and dev-broker
1 parent 1ff4053 commit 17e9610

File tree:

- README.md
- SECURITY.md
- kafka/consumer.secured.yaml
- kafka/dev-brocker.secured.yaml

4 files changed: 110 additions & 10 deletions

README.md

Lines changed: 2 additions & 2 deletions
````diff
@@ -78,7 +78,7 @@ So, we need to add a DIGEST authentication layer to Zookeeper (doesn’t support
 
 Follow the [security section documentation](SECURITY.md)
 
-## 5- Sourcing
+## 6- Sourcing
 * Zookeeper Docker image : we use the [kubernetes-zookeeper @kow3ns](https://github.com/kow3ns/kubernetes-zookeeper) as base image with 2 modifications:
   * Add JVM flags to be injected in Java environment file [@see start-zookeeper.sh](zookeeper/docker/scripts/start-zookeeper)
   ```bash
@@ -104,7 +104,7 @@ Follow the [security section documentation](SECURITY.md)
   fi
   ```
 
-## 6- Tips
+## 7- Tips
 * For debugging, you can bypass the Kafka broker for topics management (kafka and ZooKeeper helpers script) :
 ```bash
 kubectl exec -ti kafka-0 -- kafka-topics.sh --create --topic k8s --zookeeper zk-cs.kafka.svc.cluster.local:2181
````
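
The tip above creates the topic through ZooKeeper with the console tool; the same bookkeeping can also be scripted with kafka-python's admin client. The sketch below is not part of this commit, it goes through the broker's plaintext listener instead of ZooKeeper, and the bootstrap address and topic settings are assumptions to adjust for your cluster.

```python
# Hypothetical helper (not in this commit): create the demo topic via the broker
# with kafka-python, instead of kafka-topics.sh --zookeeper.
from kafka.admin import KafkaAdminClient, NewTopic
from kafka.errors import TopicAlreadyExistsError

# Assumed plaintext bootstrap address; adjust to your cluster.
admin = KafkaAdminClient(bootstrap_servers="kafka-0.kafka-broker.kafka.svc.cluster.local:9092")
try:
    # One partition, replication factor 1 is enough for this debugging scenario.
    admin.create_topics([NewTopic(name="k8s", num_partitions=1, replication_factor=1)])
    print("topic 'k8s' created")
except TopicAlreadyExistsError:
    print("topic 'k8s' already exists")
finally:
    admin.close()
```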

SECURITY.md

Lines changed: 16 additions & 8 deletions
````diff
@@ -125,15 +125,23 @@ kubectl exec -ti kafka-1 -- kafka-console-producer.sh --bootstrap-server kafka-0
 >> Hello with secured connection
 ```
 ```bash
-kubectl exec -ti kafka-1 -- kafka-console-consumer.sh --bootstrap-server kafka-0.kafka-broker.kafka.svc.cluster.local:9093 --topic k8s --consumer.config /opt/kafka/config/client.properties --from-beginning
+kubectl exec -ti kafka-1 -- kafka-console-consumer.sh --bootstrap-server kafka-0.kafka-broker.kafka.svc.cluster.local:9093 --topic k8s -consumer.config /opt/kafka/config/client.properties --from-beginning
 << Hello with secured connection
 ```
 
+Prepare Secret for client:
+```bash
+kubectl create secret generic client-ssl --from-file=ca-certs.pem=ssl/ca-cert --from-file=cert.pem=ssl/cert-signed-client
+kubectl apply -f consumer.secured.yaml
+kubectl logs consumer-secured
+```
+
+
 6. Sources & Links:
-- https://access.redhat.com/documentation/en-us/red_hat_amq/7.2/html/using_amq_streams_on_red_hat_enterprise_linux_rhel/configuring_kafka
-- https://cwiki.apache.org/confluence/display/ZOOKEEPER/Server-Server+mutual+authentication
-- https://kafka.apache.org/documentation/#security_overview
-- https://github.com/bitnami/charts/issues/1279
-- https://stackoverflow.com/questions/54903381/kafka-failed-authentication-due-to-ssl-handshake-failed
-- https://www.vertica.com/docs/9.2.x/HTML/Content/Authoring/KafkaIntegrationGuide/TLS-SSL/KafkaTLS-SSLExamplePart3ConfigureKafka.htm?tocpath=Integrating%20with%20Apache%20Kafka%7CUsing%20TLS%2FSSL%20Encryption%20with%20Kafka%7C_____7
-- https://gist.github.com/anoopl/85d869f7a85a70c6c60542922fc314a8
+- [Redhat-Kafka](https://access.redhat.com/documentation/en-us/red_hat_amq/7.2/html/using_amq_streams_on_red_hat_enterprise_linux_rhel/configuring_kafka)
+- [Confluence-Zookeeper](https://cwiki.apache.org/confluence/display/ZOOKEEPER/Server-Server+mutual+authentication)
+- [Apache-Kafka](https://kafka.apache.org/documentation/#security_overview)
+- [Bitnami-Kafka](https://github.com/bitnami/charts/issues/1279)
+- [Kafka-SSL](https://stackoverflow.com/questions/54903381/kafka-failed-authentication-due-to-ssl-handshake-failed)
+- [Vertica-Kafka-SSL](https://www.vertica.com/docs/9.2.x/HTML/Content/Authoring/KafkaIntegrationGuide/TLS-SSL/KafkaTLS-SSLExamplePart3ConfigureKafka.htm?tocpath=Integrating%20with%20Apache%20Kafka%7CUsing%20TLS%2FSSL%20Encryption%20with%20Kafka%7C_____7)
+- [Kafka-SSL](https://gist.github.com/anoopl/85d869f7a85a70c6c60542922fc314a8)
````
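
The new commands above exercise the secured consumer path from inside the brokers; for an end-to-end check something also has to produce on the `k8s` topic over SSL. The sketch below is a kafka-python producer counterpart, not part of this commit: the certificate paths reuse the files fed into the `client-ssl` Secret, and the `ssl_keyfile` path is an assumption, since the commit only references a CA and a signed client certificate.

```python
# Hypothetical SSL producer counterpart to the secured console consumer above.
# Certificate/key paths are assumptions, not taken from this commit.
from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers=["kafka-0.kafka-broker.kafka.svc.cluster.local:9093"],
    security_protocol="SSL",
    ssl_cafile="ssl/ca-cert",               # CA certificate used for the brokers
    ssl_certfile="ssl/cert-signed-client",  # signed client certificate (same file as in the Secret)
    ssl_keyfile="ssl/client-key",           # assumed path to the client private key
    ssl_check_hostname=False,
)

producer.send("k8s", b"Hello with secured connection")
producer.flush()
producer.close()
```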

kafka/consumer.secured.yaml

Lines changed: 51 additions & 0 deletions
```yaml
apiVersion: v1
kind: ConfigMap
metadata:
  name: consumer-secured
data:
  client: |
    from kafka import KafkaConsumer
    import logging
    import sys

    logger = logging.getLogger('kafka')
    logger.addHandler(logging.StreamHandler(sys.stdout))
    logger.setLevel(logging.INFO)

    print('Init consumer...')

    consumer = KafkaConsumer(
        'k8s',
        security_protocol='SSL',
        ssl_cafile='/var/ssl/ca-certs.pem',
        ssl_certfile='/var/ssl/cert.pem',
        ssl_check_hostname=False,
        ssl_password='passcode',
        bootstrap_servers=['kafka-service.kafka.svc.cluster.local:9093'])

    print('Waiting for message')
    for message in consumer:
        print("%s:%d:%d: key=%s value=%s" % (message.topic, message.partition, message.offset, message.key, message.value))
---
apiVersion: v1
kind: Pod
metadata:
  name: consumer-secured
spec:
  containers:
    - name: client
      image: python:3.8.5
      command: ['sh', '-c', 'pip install kafka-python && python /var/static/client']
      volumeMounts:
        - name: static-volume
          mountPath: /var/static
        - name: secret-volume
          mountPath: /var/ssl
  volumes:
    - name: static-volume
      configMap:
        name: consumer-secured
    - name: secret-volume
      secret:
        secretName: client-ssl
```
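
The client embedded in the ConfigMap prints raw key/value bytes and joins no consumer group. If you want decoded strings and committed offsets while keeping the same SSL settings, a variant along these lines works with the same kafka-python API; the group id, offset reset, and deserializer are illustrative additions, not part of the committed manifest.

```python
# Sketch of a variant of the committed client: same SSL settings, plus an
# assumed consumer group and a UTF-8 value deserializer.
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    'k8s',
    security_protocol='SSL',
    ssl_cafile='/var/ssl/ca-certs.pem',
    ssl_certfile='/var/ssl/cert.pem',
    ssl_check_hostname=False,
    ssl_password='passcode',
    bootstrap_servers=['kafka-service.kafka.svc.cluster.local:9093'],
    group_id='k8s-demo',                      # assumed group name
    auto_offset_reset='earliest',             # start from the beginning on first run
    value_deserializer=lambda v: v.decode('utf-8'),
)

for message in consumer:
    print(f"{message.topic}:{message.partition}:{message.offset} value={message.value}")
```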

kafka/dev-brocker.secured.yaml

Lines changed: 41 additions & 0 deletions
```yaml
apiVersion: v1
kind: Pod
metadata:
  name: dev-secured-brocker
spec:
  affinity:
    nodeAffinity:
      requiredDuringSchedulingIgnoredDuringExecution:
        nodeSelectorTerms:
          - matchExpressions:
              - key: kubernetes.io/arch
                operator: In
                values:
                  - amd64
                  - arm64
  containers:
    - name: single-secured-broker
      image: medinvention/kafka:2.13-2.7.0
      env:
        - name: HOSTNAME_VALUE
          value: "127.0.0.1"
        - name: KAFKA_ZOOKEEPER_CONNECT
          value: "zk-cs.kafka.svc.cluster.local:2181"
        - name: KAFKA_LOG4J_OPTS
          value: "-Dlog4j.configuration=file:/opt/kafka/config/log4j.properties -Djava.security.auth.login.config=/var/lib/kafka/jass/jaas.conf"
        - name: KAFAK_AUTO_CREATE_TOPICS_ENABLE
          value: "false"
      ports:
        - containerPort: 9092
      volumeMounts:
        - name: data
          mountPath: /var/lib/kafka/data
        - name: jaas
          mountPath: /var/lib/kafka/jass
          readOnly: true
  volumes:
    - name: data
      emptyDir: {}
    - name: jaas
      secret:
        secretName: kafka-jaas
```
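
HOSTNAME_VALUE is set to 127.0.0.1, which suggests this dev broker is meant to be reached from a local machine. Assuming that is the advertised address and that you forward the port first (for example `kubectl port-forward pod/dev-secured-brocker 9092:9092`), a quick smoke test from a workstation could look like the sketch below; it is not part of this commit and only lists the topics the broker reports.

```python
# Smoke-test sketch for the dev broker, assuming 9092 is port-forwarded to
# localhost and that the broker advertises 127.0.0.1 (see HOSTNAME_VALUE).
from kafka import KafkaConsumer

consumer = KafkaConsumer(bootstrap_servers=["127.0.0.1:9092"])
print("topics visible on the dev broker:", consumer.topics())
consumer.close()
```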
