Log Aggregators — Splunk, ELK, Graylog, and Loki
Modern log aggregation platforms provide centralized storage, search, correlation, and visualization of logs from multiple sources. MideyeServer integrates with these platforms through file-based log collection — a lightweight agent tails the log files and forwards entries to the aggregation backend.
Supported platforms:
- Splunk — Enterprise log management and SIEM
- Elasticsearch (ELK Stack) — Open-source search and analytics
- Graylog — Open-source log management
- Grafana Loki — Cloud-native log aggregation
Log file locations
Before configuring collection agents, identify your log file paths. See Overview → Log file locations for the full platform reference.
| Platform | Log Directory |
|---|---|
| Debian / Ubuntu | /opt/mideyeserver6/log/ |
| RHEL / Rocky | /opt/mideyeserver6/log/ |
| Windows | C:\Program Files (x86)\Mideye Server 6\log\ |
| Docker | /home/mideye/log/ |
Both mideyeserver.log and mideyeserver.error are available in these directories.
Splunk Universal Forwarder
The Splunk Universal Forwarder is a lightweight agent for forwarding logs to Splunk Enterprise or Splunk Cloud.
Installation
Download and install from Splunk Downloads.

Linux:

```shell
wget -O splunkforwarder.tgz '<download-url>'
tar xvzf splunkforwarder.tgz -C /opt
/opt/splunkforwarder/bin/splunk start --accept-license
```

Windows:

```shell
# Run installer and follow prompts
msiexec /i splunkforwarder-9.x.x-xxx.msi AGREETOLICENSE=Yes
```
Configuration

1. Add MideyeServer log files as inputs:

   ```shell
   /opt/splunkforwarder/bin/splunk add monitor /opt/mideyeserver6/log/mideyeserver.log \
     -sourcetype mideyeserver:log \
     -index main

   /opt/splunkforwarder/bin/splunk add monitor /opt/mideyeserver6/log/mideyeserver.error \
     -sourcetype mideyeserver:error \
     -index main
   ```

   Or edit inputs.conf directly. On Linux (`/opt/splunkforwarder/etc/system/local/inputs.conf`):

   ```ini
   [monitor:///opt/mideyeserver6/log/mideyeserver.log]
   disabled = false
   sourcetype = mideyeserver:log
   index = main

   [monitor:///opt/mideyeserver6/log/mideyeserver.error]
   disabled = false
   sourcetype = mideyeserver:error
   index = main
   ```

   On Windows (`C:\Program Files\SplunkUniversalForwarder\etc\system\local\inputs.conf`):

   ```ini
   [monitor://C:\Program Files (x86)\Mideye Server 6\log\mideyeserver.log]
   disabled = false
   sourcetype = mideyeserver:log
   index = main
   ```
2. Configure the forwarding destination:

   ```shell
   /opt/splunkforwarder/bin/splunk add forward-server splunk.example.com:9997
   ```

   Or edit `/opt/splunkforwarder/etc/system/local/outputs.conf`:

   ```ini
   [tcpout]
   defaultGroup = default-autolb-group

   [tcpout:default-autolb-group]
   server = splunk.example.com:9997
   ```
3. Restart the Universal Forwarder:

   ```shell
   /opt/splunkforwarder/bin/splunk restart
   ```
Log parsing in Splunk
Create a custom field extraction to parse MideyeServer’s log format (in `props.conf`):

```ini
[mideyeserver:log]
SHOULD_LINEMERGE = false
TIME_PREFIX = ^
TIME_FORMAT = %Y-%m-%d %H:%M:%S.%3N%z
MAX_TIMESTAMP_LOOKAHEAD = 32
EXTRACT-level = ^\d{4}-\d{2}-\d{2}\s\d{2}:\d{2}:\d{2}\.\d{3}\+\d{2}:\d{2}\s(?<level>\w+)
EXTRACT-thread = \[(?<thread>[^\]]+)\]
EXTRACT-logger = \]\s(?<logger>\w+):
```
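You can sanity-check the level extraction locally before deploying it. The sketch below applies an equivalent POSIX-class regex (`grep -E` has no `\d` or named groups) to a sample line in MideyeServer's default log format:

```shell
# Sample line in MideyeServer's default log format
line="2026-02-25 14:32:15.847+00:00 INFO [main] MideyeServerApp: Application 'MideyeServer' is running!"

# POSIX-class equivalent of the EXTRACT-level regex above:
# grep -Eo prints the matched timestamp+level prefix, awk picks out the level field
echo "$line" \
  | grep -Eo '^[0-9]{4}-[0-9]{2}-[0-9]{2} [0-9]{2}:[0-9]{2}:[0-9]{2}\.[0-9]{3}\+[0-9]{2}:[0-9]{2} +[A-Z]+' \
  | awk '{print $3}'   # prints: INFO
```

If the pattern prints nothing, your log format (for example the timezone offset) differs from the regex and the Splunk extraction will need the same adjustment.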
Elasticsearch (ELK Stack) with Filebeat

Filebeat is a lightweight shipper for forwarding logs to Elasticsearch or Logstash.
Installation
Debian/Ubuntu:

```shell
curl -L -O https://artifacts.elastic.co/downloads/beats/filebeat/filebeat-8.x.x-amd64.deb
sudo dpkg -i filebeat-8.x.x-amd64.deb
```

RHEL/Rocky:

```shell
curl -L -O https://artifacts.elastic.co/downloads/beats/filebeat/filebeat-8.x.x-x86_64.rpm
sudo rpm -vi filebeat-8.x.x-x86_64.rpm
```

Windows (PowerShell):

```powershell
# Download and extract
Expand-Archive filebeat-8.x.x-windows-x86_64.zip
cd filebeat-8.x.x-windows-x86_64
.\install-service-filebeat.ps1
```
Configuration

Edit `filebeat.yml`:
```yaml
filebeat.inputs:
  # MideyeServer main log
  - type: log
    enabled: true
    paths:
      - /opt/mideyeserver6/log/mideyeserver.log
    fields:
      application: mideyeserver
      log_type: application
    fields_under_root: true

    # Multiline configuration for stack traces
    multiline.type: pattern
    multiline.pattern: '^\d{4}-\d{2}-\d{2}'
    multiline.negate: true
    multiline.match: after

  # MideyeServer error log
  - type: log
    enabled: true
    paths:
      - /opt/mideyeserver6/log/mideyeserver.error
    fields:
      application: mideyeserver
      log_type: error
    fields_under_root: true

# Output to Elasticsearch
output.elasticsearch:
  hosts: ["elasticsearch.example.com:9200"]
  index: "mideyeserver-%{+yyyy.MM.dd}"

  # Optional: authentication
  username: "filebeat"
  password: "changeme"

  # Optional: TLS
  ssl.enabled: true
  ssl.certificate_authorities: ["/etc/filebeat/ca.crt"]

# Template settings for the custom index (optional)
setup.ilm.enabled: false
setup.template.name: "mideyeserver"
setup.template.pattern: "mideyeserver-*"
```
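The multiline settings above treat any line that does not start with a timestamp as a continuation of the previous event, so stack traces stay attached to the log line that produced them. A quick local check of the pattern (the sample stack trace here is invented for illustration):

```shell
# Three lines: one event start plus two stack-trace continuation lines
printf '%s\n' \
  "2026-02-25 14:32:15.847+00:00 ERROR [main] MideyeServerApp: Unhandled exception" \
  "java.lang.IllegalStateException: example" \
  "    at com.example.Demo.run(Demo.java:42)" \
  | grep -Ec '^[0-9]{4}-[0-9]{2}-[0-9]{2}'   # prints: 1 (one event start)
```

Only the first line matches the event-start pattern, so Filebeat would ship all three lines as a single event.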
Logstash pipeline (optional)

If using Logstash for parsing, configure Filebeat to output to Logstash:
```yaml
output.logstash:
  hosts: ["logstash.example.com:5044"]
```

And create a Logstash pipeline:
```
input {
  beats {
    port => 5044
  }
}

filter {
  if [application] == "mideyeserver" {
    grok {
      match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:level} \[%{DATA:thread}\] %{DATA:logger}: %{GREEDYDATA:log_message}" }
    }
    date {
      match => ["timestamp", "yyyy-MM-dd HH:mm:ss.SSSZ"]
      target => "@timestamp"
    }
  }
}

output {
  elasticsearch {
    hosts => ["elasticsearch.example.com:9200"]
    index => "mideyeserver-%{+YYYY.MM.dd}"
  }
}
```
Start Filebeat

```shell
sudo systemctl enable filebeat
sudo systemctl start filebeat
sudo systemctl status filebeat
```
Graylog with Filebeat

Graylog can receive logs from Filebeat via the Beats input plugin.
Graylog configuration
1. Create a Beats input:
   - Navigate to System → Inputs
   - Select Beats and click Launch new input
   - Configure:
     - Title: MideyeServer Logs
     - Bind address: 0.0.0.0
     - Port: 5044
   - Click Save
2. Create extractors (optional) to parse the log format:
   - Navigate to System → Inputs → MideyeServer Logs → Manage extractors
   - Click Get started
   - Use this Grok pattern:

     ```
     %{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:level} \[%{DATA:thread}\] %{DATA:logger}: %{GREEDYDATA:message}
     ```
Filebeat configuration
```yaml
filebeat.inputs:
  - type: log
    enabled: true
    paths:
      - /opt/mideyeserver6/log/mideyeserver.log
    fields:
      application: mideyeserver

output.logstash:
  hosts: ["graylog.example.com:5044"]
```
Grafana Loki with Promtail

Promtail is the agent for shipping logs to Grafana Loki.
Installation
Docker:

```shell
docker run -d --name promtail \
  -v /var/log:/var/log \
  -v /opt/mideyeserver6/log:/mideyeserver/log \
  -v /etc/promtail:/etc/promtail \
  grafana/promtail:latest \
  -config.file=/etc/promtail/config.yml
```

Binary:

```shell
curl -O -L "https://github.com/grafana/loki/releases/latest/download/promtail-linux-amd64.zip"
unzip promtail-linux-amd64.zip
chmod +x promtail-linux-amd64
sudo mv promtail-linux-amd64 /usr/local/bin/promtail
```
Configuration

```yaml
server:
  http_listen_port: 9080
  grpc_listen_port: 0

positions:
  filename: /tmp/positions.yaml

clients:
  - url: http://loki.example.com:3100/loki/api/v1/push

scrape_configs:
  - job_name: mideyeserver
    static_configs:
      - targets:
          - localhost
        labels:
          application: mideyeserver
          host: mideyeserver01
          __path__: /opt/mideyeserver6/log/mideyeserver.log

  - job_name: mideyeserver-error
    static_configs:
      - targets:
          - localhost
        labels:
          application: mideyeserver
          log_type: error
          host: mideyeserver01
          __path__: /opt/mideyeserver6/log/mideyeserver.error
```
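If you installed the Promtail binary manually, nothing starts it at boot. A minimal systemd unit sketch, assuming the binary at `/usr/local/bin/promtail` and the config saved as `/etc/promtail/config.yml` (both paths follow the installation steps above; run it as a user with read access to the log directory):

```ini
# /etc/systemd/system/promtail.service (illustrative; adjust paths and user)
[Unit]
Description=Promtail log shipper for Grafana Loki
After=network-online.target

[Service]
ExecStart=/usr/local/bin/promtail -config.file=/etc/promtail/config.yml
Restart=on-failure

[Install]
WantedBy=multi-user.target
```

Enable it with `sudo systemctl enable --now promtail`.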
LogQL queries in Grafana

Query MideyeServer logs in Grafana:
```logql
# All MideyeServer logs
{application="mideyeserver"}

# Error logs only
{application="mideyeserver", log_type="error"}

# Filter by log level
{application="mideyeserver"} |= "ERROR"

# Count errors per minute
count_over_time({application="mideyeserver"} |= "ERROR" [1m])
```
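For dashboards or alert rules, the per-minute count above can be generalized to a per-second rate aggregated across streams (standard LogQL; the label names match the Promtail scrape config in this section):

```logql
# Error rate per second over the last 5 minutes, summed across all streams
sum(rate({application="mideyeserver"} |= "ERROR" [5m]))
```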
Structured JSON logging

For advanced use cases, you can configure MideyeServer to output JSON-formatted logs using logstash-logback-encoder.
Prerequisites
Add the encoder dependency to MideyeServer’s classpath (contact Mideye Support for custom builds).
Configuration
Replace the pattern encoder with the JSON encoder in `logback.xml`:

```xml
<appender name="FILE" class="ch.qos.logback.core.rolling.RollingFileAppender">
  <!-- Replace <encoder> with JSON encoder -->
  <encoder class="net.logstash.logback.encoder.LogstashEncoder">
    <includeContext>true</includeContext>
    <includeMdc>true</includeMdc>
    <includeStructuredArguments>true</includeStructuredArguments>
    <fieldNames>
      <timestamp>@timestamp</timestamp>
      <message>message</message>
      <logger>logger_name</logger>
      <thread>thread_name</thread>
      <level>level</level>
      <levelValue>[ignore]</levelValue>
    </fieldNames>
  </encoder>
  <!-- ... rest of configuration ... -->
</appender>
```

This outputs logs in JSON format:
```json
{
  "@timestamp": "2026-02-25T14:32:15.847Z",
  "level": "INFO",
  "thread_name": "main",
  "logger_name": "com.mideye.mideyeserver.MideyeServerApp",
  "message": "Application 'MideyeServer' is running!",
  "host": "mideyeserver01"
}
```
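One advantage of the JSON output is that fields can be pulled without a Grok pattern. A quick spot check of a field from the shell (crude string matching for illustration only; use a JSON-aware processor such as jq in practice):

```shell
# Sample event in the JSON format shown above (abbreviated)
json='{"@timestamp": "2026-02-25T14:32:15.847Z", "level": "INFO", "thread_name": "main"}'

# Grab the "level" field by pattern-matching the key/value pair
echo "$json" | grep -Eo '"level": *"[A-Z]+"' | cut -d'"' -f4   # prints: INFO
```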
Grok pattern for log parsing

Use this Grok pattern to parse MideyeServer’s default log format:
```
%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:level}\s+\[%{DATA:thread}\] %{DATA:logger}: %{GREEDYDATA:message}
```

Example match:
```
2026-02-25 14:32:15.847+00:00 INFO [main] MideyeServerApp: Application 'MideyeServer' is running!
```
Extracted fields:

- timestamp: `2026-02-25 14:32:15.847+00:00`
- level: `INFO`
- thread: `main`
- logger: `MideyeServerApp`
- message: `Application 'MideyeServer' is running!`
Troubleshooting
Agent not tailing log files
Check file permissions:
```shell
ls -l /opt/mideyeserver6/log/
# Agent user needs read access
```

Check agent logs:
```shell
# Filebeat
sudo journalctl -u filebeat -f

# Promtail
sudo journalctl -u promtail -f
```

Verify the file path:
```shell
# Test with tail
tail -f /opt/mideyeserver6/log/mideyeserver.log
```
No data in aggregation platform

Check network connectivity:
```shell
# Elasticsearch
curl -X GET "elasticsearch.example.com:9200"

# Graylog Beats input
telnet graylog.example.com 5044

# Loki
curl http://loki.example.com:3100/ready
```

Check agent status:
```shell
# Filebeat: test output
sudo filebeat test output

# Filebeat: test config
sudo filebeat test config
```
High CPU/memory usage

- Reduce log verbosity in MideyeServer (see Log Levels).
- Limit multiline processing in Filebeat:

  ```yaml
  multiline.max_lines: 500
  multiline.timeout: 5s
  ```

- Increase the harvest interval:
  ```yaml
  close_inactive: 5m
  scan_frequency: 30s
  ```
Related documentation

- Overview: Log file locations and paths per platform
- Syslog: Alternative to log aggregators for syslog-based collection
- Log Levels: Configure verbosity before forwarding
- Log Rotation: Configure rotation to prevent disk space issues