Deploying the ELK Stack 7.3.x for Log Analysis

Install the Public Signing Key

sudo rpm --import https://packages.elastic.co/GPG-KEY-elasticsearch 

Create a new elastic.repo file under /etc/yum.repos.d/ with the following content:

vim /etc/yum.repos.d/elastic.repo

[elastic-7.x]
name=Elastic repository for 7.x packages
baseurl=https://artifacts.elastic.co/packages/7.x/yum
gpgcheck=1
gpgkey=https://artifacts.elastic.co/GPG-KEY-elasticsearch
enabled=1
autorefresh=1
type=rpm-md
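If you prefer a non-interactive setup, the same file can be written with a heredoc instead of vim. This is a sketch: it writes to /tmp by default so it can run unprivileged; on a real host set REPO_FILE to /etc/yum.repos.d/elastic.repo and run as root.

```shell
# Write the repo definition without opening an editor.
# REPO_FILE defaults to /tmp for illustration; use /etc/yum.repos.d/elastic.repo on a real host.
REPO_FILE="${REPO_FILE:-/tmp/elastic.repo}"
cat > "$REPO_FILE" <<'EOF'
[elastic-7.x]
name=Elastic repository for 7.x packages
baseurl=https://artifacts.elastic.co/packages/7.x/yum
gpgcheck=1
gpgkey=https://artifacts.elastic.co/GPG-KEY-elasticsearch
enabled=1
autorefresh=1
type=rpm-md
EOF
# Sanity-check the file before running yum
grep -q '^baseurl=' "$REPO_FILE" && echo "repo file written"
```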

Install Elasticsearch

Installation

yum install -y elasticsearch

Configuration

# Configuration files live under /etc/elasticsearch/
vim /etc/elasticsearch/elasticsearch.yml

# Cluster name
cluster.name: my-application
# Node name
node.name: es
# Data and log directories; create them if they do not already exist
path.data: /data/elasticsearch/data
path.logs: /data/elasticsearch/logs
# Network settings
network.host: 0.0.0.0
http.port: 9200
# Cluster settings
cluster.initial_master_nodes: ["es"]


# Change the owner and group of the data directories, or Elasticsearch will fail to start
chown -R elasticsearch:elasticsearch /data/elasticsearch/

Start

# Reload the systemd unit files
systemctl daemon-reload
# Start the service
systemctl start elasticsearch.service
# Enable start at boot
systemctl enable elasticsearch.service
# Check the status
systemctl status elasticsearch.service
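Once the service is up, a quick sanity check against the HTTP API confirms the node is healthy (this assumes the defaults configured above, i.e. port 9200 reachable on localhost; Elasticsearch can take ~30 seconds to come up after start):

```shell
# Query the cluster health endpoint; prints JSON when the node is up.
curl -s 'http://localhost:9200/_cluster/health?pretty' \
  || echo "Elasticsearch not reachable yet"
```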

Install Kibana

Installation

yum install -y kibana

Configuration

# Configuration
vim /etc/kibana/kibana.yml

server.host: "0.0.0.0"
# Do not use 127.0.0.1 here, or you may get "Kibana server is not ready yet"
elasticsearch.hosts: ["http://localhost:9200"]
i18n.locale: "zh-CN"

Start

# Reload the systemd unit files
systemctl daemon-reload
# Enable start at boot
systemctl enable kibana.service
# Start the service
systemctl start kibana.service
# Check the status
systemctl status kibana.service
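Kibana can take a minute or more to initialize after the process starts. A quick check against its status endpoint (assuming the default port 5601) shows whether it is ready:

```shell
# Probe Kibana's status API; prints JSON once the server is ready.
curl -s 'http://localhost:5601/api/status' \
  || echo "Kibana not reachable yet"
```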

Install Logstash

Installation

yum install -y logstash

Start

# Reload the systemd unit files
systemctl daemon-reload
# Enable start at boot
systemctl enable logstash.service
# Start the service
systemctl start logstash.service
# On CentOS 6, start it with:
initctl start logstash
# Check the status
systemctl status logstash.service

If Logstash Fails to Start

Problem 1: /var/log/messages shows: logstash: could not find java; set JAVA_HOME or ensure java is in PATH

Fix: create a symlink to the java binary
ln -s /usr/local/jdk1.8.0_77/bin/java /usr/bin/java

Problem 2: when resolving IPs with geoip, there is no geo_point field

Fix: use an index name that starts with "logstash" (the default logstash-* index template maps geoip.location as geo_point; other index names fall back to plain float fields)
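To see why the index name matters, you can inspect the built-in template and confirm it maps geoip.location as geo_point (a sketch, assuming Elasticsearch is reachable on localhost:9200):

```shell
# Fetch the default "logstash" index template and look for the geo_point mapping.
curl -s 'http://localhost:9200/_template/logstash?pretty' \
  | grep -B1 -A1 '"geo_point"' \
  || echo "Elasticsearch not reachable (or template missing)"
```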

Configuration Template

geoip official documentation: https://www.elastic.co/cn/blog/geoip-in-the-elastic-stack

The original log lines look like this:

2019-09-30 10:33:12,432 [HttpTaskSync-1-thread-114],1569810792432,/device,11,11,2,6.4.5.1,175.167.138.111,0,0
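As a rough illustration of what the grok pattern below extracts (this is plain awk field splitting, not the actual grok engine), the comma-separated part of the sample line maps onto the named captures like so:

```shell
# Split the sample log line on commas and label each field with the
# corresponding grok capture name from the pipeline below.
LINE='2019-09-30 10:33:12,432 [HttpTaskSync-1-thread-114],1569810792432,/device,11,11,2,6.4.5.1,175.167.138.111,0,0'
echo "$LINE" | awk -F',' '{
    # Fields 1-2 hold the timestamp and the thread tag (dropped by the filter)
    print "create_time   =", $3
    print "url           =", $4
    print "user_time     =", $5
    print "total_time    =", $6
    print "terminal_type =", $7
    print "app_version   =", $8
    print "remote_ip     =", $9
    print "user_id       =", $10
    print "status        =", $11
}'
```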
# Logstash pipeline: file input -> grok/geoip filters -> Elasticsearch output
input {
  file {
    path => "/data/apilogs/req/requestLog.log"
  }
}

filter {
  grok {
    match => {
      "message" => ["(?<date>[0-9\-]+\s[0-9\:]+)\W+(?<info>[0-9]+)\s+(?<info2>\[.*\]+)\W+(?<create_time>[0-9]+)\W+(?<url>[\/A-Za-z0-9\.]+)\W+(?<user_time>[0-9]+)\W+(?<total_time>[0-9]+)\W+(?<terminal_type>[0-9]+)\W+(?<app_version>[0-9\.]+)\W+(?<remote_ip>[0-9\.]+)\W+(?<user_id>[\d]+)\W++(?<status>[0-9]+)"]
    }
    # Drop the raw message and the two throwaway captures
    remove_field => ["message", "info", "info2"]
  }
  if [url] == "getStoreSalesBarPicId" {
    drop { }
  }
  geoip {
    source => "remote_ip"
    target => "geoip"
    database => "/etc/logstash/GeoLite2-City_20190924/GeoLite2-City.mmdb"
  }
}

output {
  elasticsearch {
    hosts => "http://10.81.54.16:9200"
    # Index name must start with "logstash" so geoip.location is mapped as geo_point
    index => "logstash-helper-requestlog"
    #user => "elastic"
    #password => "changeme"
  }
}
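Before restarting the service, the pipeline file can be validated with Logstash's config check. The config path below is an assumption; adjust it to wherever you saved the file (the RPM package installs the binary at /usr/share/logstash/bin/logstash):

```shell
# Validate the pipeline config without starting the pipeline.
BIN=/usr/share/logstash/bin/logstash
CONF=/etc/logstash/conf.d/requestlog.conf   # hypothetical path; adjust to your setup
if [ -x "$BIN" ]; then
  "$BIN" --path.settings /etc/logstash -f "$CONF" --config.test_and_exit
else
  echo "logstash binary not found at $BIN"
fi
```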