1. Project overview

elk-eureka-server acts as the service registry for the other three projects.

elk-kafka-client calls elk-kafka-server, and elk-kafka-server in turn calls elk-kafka-server1.

2. Create the Spring Boot project elk-eureka-server

The pom.xml is as follows:

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>com.carry.elk</groupId>
    <artifactId>elk-eureka-server</artifactId>
    <version>0.0.1-SNAPSHOT</version>
    <packaging>jar</packaging>

    <name>elk-eureka-server</name>
    <description>Demo project for Spring Boot</description>

    <parent>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-parent</artifactId>
        <version>2.0.3.RELEASE</version>
        <relativePath/> <!-- lookup parent from repository -->
    </parent>

    <properties>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
        <project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding>
        <java.version>1.8</java.version>
        <spring-cloud.version>Finchley.RELEASE</spring-cloud.version>
    </properties>

    <dependencies>
        <dependency>
            <groupId>org.springframework.cloud</groupId>
            <artifactId>spring-cloud-starter-netflix-eureka-server</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-test</artifactId>
            <scope>test</scope>
        </dependency>
    </dependencies>

    <dependencyManagement>
        <dependencies>
            <dependency>
                <groupId>org.springframework.cloud</groupId>
                <artifactId>spring-cloud-dependencies</artifactId>
                <version>${spring-cloud.version}</version>
                <type>pom</type>
                <scope>import</scope>
            </dependency>
        </dependencies>
    </dependencyManagement>

    <build>
        <plugins>
            <plugin>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-maven-plugin</artifactId>
            </plugin>
        </plugins>
    </build>
</project>

The application.yml is as follows:

server:
  port: 8761
eureka:
  instance:
    hostname: localhost
  client:
    registerWithEureka: false
    fetchRegistry: false
    serviceUrl:
      defaultZone: http://${eureka.instance.hostname}:${server.port}/eureka/

Add the @EnableEurekaServer annotation to the startup class ElkEurekaServerApplication.
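For reference, the startup class then looks roughly like this (a minimal sketch, assuming the standard class generated by Spring Initializr):

package com.carry.elk;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.netflix.eureka.server.EnableEurekaServer;

// Registers this application as a Eureka server
@EnableEurekaServer
@SpringBootApplication
public class ElkEurekaServerApplication {

    public static void main(String[] args) {
        SpringApplication.run(ElkEurekaServerApplication.class, args);
    }
}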

3. Create the Spring Boot project elk-kafka-client

The pom.xml is as follows:

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>com.carry.elk</groupId>
    <artifactId>elk-kafka-client</artifactId>
    <version>0.0.1-SNAPSHOT</version>
    <packaging>jar</packaging>

    <name>elk-kafka-client</name>
    <description>Demo project for Spring Boot</description>

    <parent>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-parent</artifactId>
        <version>2.0.3.RELEASE</version>
        <relativePath /> <!-- lookup parent from repository -->
    </parent>

    <properties>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
        <project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding>
        <java.version>1.8</java.version>
        <spring-cloud.version>Finchley.RELEASE</spring-cloud.version>
    </properties>

    <dependencies>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-web</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework.cloud</groupId>
            <artifactId>spring-cloud-starter-netflix-eureka-client</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework.cloud</groupId>
            <artifactId>spring-cloud-starter-openfeign</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework.cloud</groupId>
            <artifactId>spring-cloud-starter-sleuth</artifactId>
        </dependency>
        <dependency>
            <groupId>net.logstash.logback</groupId>
            <artifactId>logstash-logback-encoder</artifactId>
            <version>4.11</version>
        </dependency>
        <dependency>
            <groupId>com.github.danielwegener</groupId>
            <artifactId>logback-kafka-appender</artifactId>
            <version>0.1.0</version>
            <scope>runtime</scope>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-devtools</artifactId>
            <scope>runtime</scope>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-test</artifactId>
            <scope>test</scope>
        </dependency>
    </dependencies>

    <dependencyManagement>
        <dependencies>
            <dependency>
                <groupId>org.springframework.cloud</groupId>
                <artifactId>spring-cloud-dependencies</artifactId>
                <version>${spring-cloud.version}</version>
                <type>pom</type>
                <scope>import</scope>
            </dependency>
        </dependencies>
    </dependencyManagement>

    <build>
        <plugins>
            <plugin>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-maven-plugin</artifactId>
            </plugin>
        </plugins>
    </build>
</project>

The application.yml is as follows:

server:
  port: 8080
spring:
  application:
    name: ELK-KAFKA-CLIENT
  devtools:
    restart:
      enabled: true
eureka:
  client:
    serviceUrl:
      defaultZone: http://localhost:8761/eureka/

Add a logback.xml file under the resources directory with the following content:

<?xml version="1.0" encoding="UTF-8"?>
<configuration debug="false" scan="true" scanPeriod="1 seconds">
    <include resource="org/springframework/boot/logging/logback/base.xml" />
    <!-- <jmxConfigurator/> -->
    <contextName>logback</contextName>

    <property name="log.path" value="\logs\logback.log" />
    <property name="log.pattern"
        value="%d{yyyy-MM-dd HH:mm:ss.SSS} -%5p ${PID} --- traceId:[%X{mdc_trace_id}] [%15.15t] %-40.40logger{39} : %m%n" />

    <appender name="file" class="ch.qos.logback.core.rolling.RollingFileAppender">
        <file>${log.path}</file>
        <encoder>
            <pattern>${log.pattern}</pattern>
        </encoder>
        <rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
            <fileNamePattern>info-%d{yyyy-MM-dd}-%i.log</fileNamePattern>
            <timeBasedFileNamingAndTriggeringPolicy class="ch.qos.logback.core.rolling.SizeAndTimeBasedFNATP">
                <maxFileSize>10MB</maxFileSize>
            </timeBasedFileNamingAndTriggeringPolicy>
            <maxHistory>10</maxHistory>
        </rollingPolicy>
    </appender>

    <appender name="KafkaAppender" class="com.github.danielwegener.logback.kafka.KafkaAppender">
        <encoder class="com.github.danielwegener.logback.kafka.encoding.LayoutKafkaMessageEncoder">
            <layout class="net.logstash.logback.layout.LogstashLayout">
                <includeContext>true</includeContext>
                <includeCallerData>true</includeCallerData>
                <customFields>{"system":"kafka"}</customFields>
                <fieldNames class="net.logstash.logback.fieldnames.ShortenedFieldNames" />
            </layout>
            <charset>UTF-8</charset>
        </encoder>
        <!-- the Kafka topic must match the topic in the (Logstash) configuration file, otherwise Kafka will silently ignore you -->
        <topic>applog</topic>
        <keyingStrategy class="com.github.danielwegener.logback.kafka.keying.HostNameKeyingStrategy" />
        <deliveryStrategy class="com.github.danielwegener.logback.kafka.delivery.AsynchronousDeliveryStrategy" />
        <producerConfig>bootstrap.servers=192.168.68.110:9092,192.168.68.111:9092,192.168.68.112:9092</producerConfig>
    </appender>

    <!-- <logger name="Application_ERROR">
        <appender-ref ref="KafkaAppender" />
    </logger> -->

    <root level="info">
        <appender-ref ref="KafkaAppender" />
    </root>
</configuration>

Add the Feign interface CilentFeignApi:

package com.carry.elk;

import org.springframework.cloud.openfeign.FeignClient;
import org.springframework.http.MediaType;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestMethod;

@FeignClient("ELK-KAFKA-SERVER")
public interface CilentFeignApi {

    @RequestMapping(value = "/server", method = RequestMethod.GET, consumes = MediaType.APPLICATION_JSON_VALUE, produces = MediaType.APPLICATION_JSON_VALUE)
    public String getString();
}

Add ClientController:

package com.carry.elk;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class ClientController {

    // private final static Logger logger = LoggerFactory.getLogger("Application_ERROR");
    private final Logger logger = LoggerFactory.getLogger(this.getClass());

    @Autowired
    CilentFeignApi api;

    @GetMapping("/client")
    public String getString() {
        logger.info("Start calling ELK-KAFKA-SERVER");
        return api.getString();
    }
}

Add the @EnableEurekaClient and @EnableFeignClients annotations to the startup class ElkKafkaClientApplication.
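For reference, the client's startup class then looks roughly like this (again a sketch, assuming the standard generated class):

package com.carry.elk;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.netflix.eureka.EnableEurekaClient;
import org.springframework.cloud.openfeign.EnableFeignClients;

// Registers with Eureka and enables scanning of @FeignClient interfaces
@EnableEurekaClient
@EnableFeignClients
@SpringBootApplication
public class ElkKafkaClientApplication {

    public static void main(String[] args) {
        SpringApplication.run(ElkKafkaClientApplication.class, args);
    }
}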

4. Create the Spring Boot project elk-kafka-server

This project is similar to elk-kafka-client; only the following changes are needed.

In application.yml, change the port to 8081 and spring.application.name to ELK-KAFKA-SERVER.
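The resulting application.yml would then look roughly like this (a sketch that keeps the same structure as the client's, with only the port and application name changed):

server:
  port: 8081
spring:
  application:
    name: ELK-KAFKA-SERVER
  devtools:
    restart:
      enabled: true
eureka:
  client:
    serviceUrl:
      defaultZone: http://localhost:8761/eureka/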

Feign接口中注解修改为@FeignClient("ELK-KAFKA-SERVER1")

Change the controller to:

package com.carry.elk;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class ServerController {

    // private final static Logger logger = LoggerFactory.getLogger("Application_ERROR");
    private final Logger logger = LoggerFactory.getLogger(this.getClass());

    @Autowired
    ServerFeignApi api;

    @GetMapping("/server")
    public String getString() {
        logger.info("Received the client call, now calling ELK-KAFKA-SERVER1");
        return api.getString();
    }
}

5. Create the Spring Boot project elk-kafka-server1

This project does not call any other project, so Feign support is removed.
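The startup class therefore keeps only @EnableEurekaClient; a sketch follows, with the class name ElkKafkaServer1Application assumed here by analogy with the other projects:

package com.carry.elk;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.netflix.eureka.EnableEurekaClient;

// Registers with Eureka; no @EnableFeignClients since this service calls nothing downstream
@EnableEurekaClient
@SpringBootApplication
public class ElkKafkaServer1Application {

    public static void main(String[] args) {
        SpringApplication.run(ElkKafkaServer1Application.class, args);
    }
}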

In application.yml, change the port to 8082 and spring.application.name to ELK-KAFKA-SERVER1.
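The resulting application.yml would then look roughly like this (same sketch as before, with the new port and name):

server:
  port: 8082
spring:
  application:
    name: ELK-KAFKA-SERVER1
  devtools:
    restart:
      enabled: true
eureka:
  client:
    serviceUrl:
      defaultZone: http://localhost:8761/eureka/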

Change the controller to:

package com.carry.elk;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class Server1Controller {

    // private final static Logger logger = LoggerFactory.getLogger("Application_ERROR");
    private final Logger logger = LoggerFactory.getLogger(this.getClass());

    @GetMapping("/server")
    public String getString() {
        logger.info("Received the call from ELK-KAFKA-SERVER");
        return "I am server1.";
    }
}

The projects are all written; let's take a look at the results.

First start elk-eureka-server, then start elk-kafka-server1, elk-kafka-server, and elk-kafka-client in that order. Open localhost:8761/ in a browser and check that the three projects have successfully registered with Eureka.

Visit localhost:8080/client and check the application logs of the three projects.

Now we can query the logs in Kibana: open http://192.168.68.112:5601/, go to Management and create the index pattern kafka-logs-*, then return to the Discover page.
