CDK 2.0 and higher Powered By Apache Kafka supports Kerberos authentication, but only with the new Kafka Producer and Consumer APIs. If you already have a Kerberos server, you can add Kafka to your current configuration. If you do not have a Kerberos server, install one before proceeding. See Enabling Kerberos Authentication Using the Wizard.

If you have already configured the mapping from Kerberos principals to short names using the hadoop.security.auth_to_local HDFS configuration property, configure the same rules for Kafka by adding the sasl.kerberos.principal.to.local.rules property to the Kafka Broker Advanced Configuration Snippet in Cloudera Manager. Specify the rules as a comma-separated list.
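For example, a rule set that maps any principal in a hypothetical EXAMPLE.COM realm to its first component (the realm name here is an illustration, not a value from your cluster) might look like the following; each rule uses the same syntax as hadoop.security.auth_to_local:

    sasl.kerberos.principal.to.local.rules=RULE:[1:$1@$0](.*@EXAMPLE.COM)s/@.*//,RULE:[2:$1@$0](.*@EXAMPLE.COM)s/@.*//,DEFAULT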

To enable Kerberos authentication for Kafka:

  1. In Cloudera Manager, navigate to Kafka > Configuration. Set SSL Client Authentication to none, and set Inter Broker Protocol to SASL_PLAINTEXT.
  2. Click Save Changes.
  3. Restart the Kafka service.
  4. Make sure that listeners = SASL_PLAINTEXT is present in the Kafka broker log, /var/log/kafka/server.log.
  5. Create a jaas.conf file with the following contents to use cached Kerberos credentials. (You can modify this to use keytab files instead of cached credentials; to generate keytabs, see Step 6: Get or Create a Kerberos Principal for Each User Account.)

    If you use kinit first, use this configuration:

    KafkaClient {
      com.sun.security.auth.module.Krb5LoginModule required
      useTicketCache=true;
    };
    If you use a keytab, use this configuration:

    KafkaClient {
      com.sun.security.auth.module.Krb5LoginModule required
      useKeyTab=true
      keyTab="/etc/security/keytabs/kafka_server.keytab"
      principal="kafka/kafka1.hostname.com@EXAMPLE.COM";
    };
  6. Create a client.properties file containing the following properties. (The Java sketch after this procedure uses the same settings programmatically.)
    security.protocol=SASL_PLAINTEXT
    sasl.kerberos.service.name=kafka
  7. Test with the Kafka console producer and consumer. First, obtain a Kerberos ticket-granting ticket (TGT):
    $ kinit <user>
  8. Verify that your topic exists. (This does not use security features, but it is a best practice.)
    $ kafka-topics --list --zookeeper <zkhost>:2181
  9. Make the console tools use the jaas.conf file by setting the KAFKA_OPTS environment variable:
    $ export KAFKA_OPTS="-Djava.security.auth.login.config=/home/user/jaas.conf"
  10. Run a Kafka console producer.
    $ kafka-console-producer --broker-list <anybroker>:9092 --topic test1 \
      --producer.config client.properties
  11. Run a Kafka console consumer.
    $ kafka-console-consumer --new-consumer --topic test1 --from-beginning \
      --bootstrap-server <anybroker>:9092 --consumer.config client.properties
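
The console producer and consumer exercise the same client configuration that an application using the new Producer API sets programmatically. The following is a minimal sketch of a Kerberos-authenticated producer, assuming the jaas.conf path, the <anybroker> placeholder, and the test1 topic from the steps above; the class name and record contents are illustrative only.

    import java.util.Properties;

    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.Producer;
    import org.apache.kafka.clients.producer.ProducerRecord;

    public class KerberosProducerExample {
        public static void main(String[] args) {
            // Point the JVM at the same JAAS file passed via KAFKA_OPTS in step 9.
            System.setProperty("java.security.auth.login.config", "/home/user/jaas.conf");

            Properties props = new Properties();
            props.put("bootstrap.servers", "<anybroker>:9092");
            // The same two settings as client.properties from step 6.
            props.put("security.protocol", "SASL_PLAINTEXT");
            props.put("sasl.kerberos.service.name", "kafka");
            props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
            props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

            // try-with-resources closes the producer and flushes pending records.
            try (Producer<String, String> producer = new KafkaProducer<>(props)) {
                producer.send(new ProducerRecord<>("test1", "key1", "hello from a Kerberos-authenticated client"));
            }
        }
    }

Setting java.security.auth.login.config as a JVM system property serves the same purpose as exporting KAFKA_OPTS in step 9; the credentials still come from kinit (or from the keytab entry in jaas.conf).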
