Some notes:
- Used Jeff Geerling's ansible-role-logstash (published on Ansible Galaxy as geerlingguy.logstash) for the main setup of my ELK server; a minimal playbook sketch follows
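Wiring the role into a playbook is a one-liner; the elk host group below is a placeholder for however the ELK box is inventoried:

```yaml
# Install the role first with: ansible-galaxy install geerlingguy.logstash
# 'elk' is a hypothetical inventory group name for the ELK server
- hosts: elk
  become: yes
  roles:
    - geerlingguy.logstash
```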
- Used logstash-forwarder (used to be called lumberjack) on all servers that need to send their logs to the ELK server
- Wrapped the installation and configuration of logstash-forwarder into a simple ansible role that installs the .deb file for the package and copies over a templatized logstash-forwarder.conf file; sketches of the role's tasks and of the template follow
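The tasks boil down to two steps; in this sketch the .deb location and the config destination are assumptions, so adjust them to match your package:

```yaml
# roles/logstash-forwarder/tasks/main.yml (sketch)
- name: install the logstash-forwarder .deb
  apt:
    deb: /tmp/logstash-forwarder.deb   # placeholder path to the downloaded package

- name: copy over the templatized logstash-forwarder.conf
  template:
    src: logstash-forwarder.conf.j2
    dest: /etc/logstash-forwarder.conf
```

The template itself is JSON (logstash-forwarder tolerates # comments); elk_server is a made-up inventory variable, and the certificate and log paths are placeholders:

```
{
  "network": {
    # must point at the ELK server, on the port the lumberjack input listens on
    "servers": [ "{{ elk_server }}:5000" ],
    "ssl ca": "/etc/pki/tls/certs/logstash-forwarder.crt",
    "timeout": 15
  },
  "files": [
    {
      "paths": [ "/var/log/myapp/*.log" ],
      # stamping the events with a type lets the ELK-side filters target them
      "fields": { "type": "myapp" }
    }
  ]
}
```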
- Customized the lumberjack input config file on the ELK server (the input is still called lumberjack, but it is what the logstash-forwarder agents on each log-shipping box talk to); my /etc/logstash/conf.d/01-lumberjack-input.conf file looks essentially like the sketch below
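This is essentially the stock lumberjack input from the standard ELK setup guides; the certificate and key paths are placeholders that should match whatever you generated for logstash-forwarder:

```
# /etc/logstash/conf.d/01-lumberjack-input.conf (sketch)
input {
  lumberjack {
    # must match the port in the forwarder's "servers" entry
    port => 5000
    type => "logs"
    ssl_certificate => "/etc/pki/tls/certs/logstash-forwarder.crt"
    ssl_key => "/etc/pki/tls/private/logstash-forwarder.key"
  }
}
```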
- Added my app-specific config file on the ELK server, /etc/logstash/conf.d/20-app.conf, with a few things to note (a sketch of the file follows these notes)
- the grok stanza applies the 'valid' tag only to the lines that match the APPLOGLINE pattern (see below for more on this pattern)
- the 'payload' field of any line that matches the APPLOGLINE pattern is parsed as JSON; this is nice because I can change the names of the fields in the JSON object in the log file and all these fields will be individually shown in ELK
- all lines that are not tagged as 'valid' will be dropped
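Putting those three notes together, the file looks roughly like this sketch (the [type] == "myapp" conditional assumes the forwarder stamps its events with type myapp, as in the template above):

```
# /etc/logstash/conf.d/20-app.conf (sketch)
filter {
  if [type] == "myapp" {
    # tag the event 'valid' only if the line matches APPLOGLINE
    grok {
      patterns_dir => "/opt/logstash/patterns"
      match => { "message" => "%{APPLOGLINE}" }
      add_tag => [ "valid" ]
    }
    # drop everything that did not match
    if "valid" not in [tags] {
      drop { }
    }
    # parse the captured payload as JSON so each key becomes its own field
    json {
      source => "payload"
    }
  }
}
```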
- Created a file called myapp in the /opt/logstash/patterns directory on the ELK server; this file contains all my app-specific patterns referenced in the 20-app.conf file above, in this example just 1 pattern:
- APPLOGLINE \[myapp\] %{TIMESTAMP_ISO8601:timestamp}Z\+00:00 \[%{WORD:severity}\] \[myresponse\] \[%{NUMBER:response}\] %{GREEDYDATA:payload}
- this pattern uses predefined logstash patterns such as TIMESTAMP_ISO8601, WORD, NUMBER and GREEDYDATA
- note the last field called payload; this is the JSON payload that gets parsed by logstash (a sample matching line is shown below)
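For illustration, here is a made-up line that the pattern would match (all field values are hypothetical):

```
[myapp] 2015-10-08T19:29:44Z+00:00 [INFO] [myresponse] [200] {"user": "alice", "latency_ms": 42}
```

Grok captures the trailing JSON object into payload, and the json filter in 20-app.conf then surfaces user and latency_ms as first-class fields in Kibana.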