Logstash – Receiving, Processing, and Forwarding Logs to ELK
Logstash
Logstash is a powerful tool designed for collecting, processing, and forwarding logs or event data. It acts as a data pipeline between different sources (e.g., system logs, applications, Beats) and destinations (e.g., Elasticsearch, databases, cloud storage).
In this setup, we are simulating a real-world logging pipeline where multiple Windows machines, each running Winlogbeat, collect logs and forward them to a central Logstash server. In enterprise environments, organizations often have dozens or even hundreds of endpoints generating logs. Instead of sending logs directly to Elasticsearch, Logstash acts as an intermediary, allowing for log enrichment, filtering, and transformation before forwarding the processed data to ELK for storage and analysis.
Due to resource limitations, we will be using only one Windows machine in this lab. However, in a real-world deployment, Winlogbeat would be installed on many Windows endpoints, and rather than installing it manually on each machine, organizations typically use Active Directory Group Policy Objects (GPOs) to deploy and configure it across all endpoints automatically. This ensures efficient management, consistency, and scalability of log collection in large infrastructures.
Logstash Installation and Configuration
We can install Logstash with the following commands:
wget https://artifacts.elastic.co/downloads/logstash/logstash-8.17.2-amd64.deb
sudo dpkg -i logstash-8.17.2-amd64.deb
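Optionally, you can confirm the installation by checking the installed version (the binary path below is the default location used by the Debian package):
/usr/share/logstash/bin/logstash --version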
Next, we will create a new configuration file to define the Logstash pipeline
sudo nano /etc/logstash/conf.d/test.conf
input {
  beats {
    port => 5044
  }
}
filter {
  mutate {
    add_field => { "log_source" => "winlogbeat" }
  }
}
output {
  elasticsearch {
    hosts => ["https://192.168.174.128:9200"]
    index => "winlogbeat-%{+YYYY.MM.dd}"
    user => "elastic"
    password => "*jh5ifuDT8I_U07tyH6S"
    ssl_certificate_verification => false
  }
}
Logstash pipeline explanation:
Input: Listens for logs from Beats on port 5044.
Filter: Adds a custom field to mark the logs as coming from Winlogbeat.
Output: Sends logs to Elasticsearch, using an index format based on the date.
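For example, an event received on 1 March 2025 would be written to an index named winlogbeat-2025.03.01, because Logstash expands the %{+YYYY.MM.dd} reference from the event's @timestamp.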
Check if the configuration is correct
sudo /usr/share/logstash/bin/logstash --path.settings /etc/logstash -t
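If the pipeline is valid, the command should finish with a message along the lines of Config Validation Result: OK and exit without starting the pipeline.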
Now we will start and enable the Logstash service
sudo systemctl start logstash.service
sudo systemctl enable logstash.service
sudo systemctl status logstash.service
Next, we will run Logstash using the configuration file we made
sudo /usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/test.conf
Now Logstash will be listening on port 5044 for incoming data, apply the filter we defined, and then forward the processed events to Elasticsearch as configured.
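To double-check that Logstash is listening, you can inspect the port from the Logstash host (a quick check; 5044 matches the beats input defined above):
sudo ss -tlnp | grep 5044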
Configuring Winlogbeat on Windows
Open the Winlogbeat configuration file (winlogbeat.yml) and update the output.logstash section with your Logstash machine's IP address.
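A minimal sketch of the relevant section is shown below. The IP address is an assumption based on this lab (substitute your own Logstash server's IP), and the default output.elasticsearch block must be commented out, since Beats allows only one output to be enabled at a time.
# winlogbeat.yml output section
# NOTE: 192.168.174.128 is assumed to be the Logstash server's IP in this lab; replace it with yours
output.logstash:
  hosts: ["192.168.174.128:5044"]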

To validate the configuration and test connectivity to the configured output, we will run:
.\winlogbeat.exe test config -c .\winlogbeat.yml
.\winlogbeat.exe test output
Now, start the Winlogbeat service and confirm its status
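Assuming the service was registered with the install-service-winlogbeat.ps1 script bundled with Winlogbeat, this can be done from an elevated PowerShell prompt:
# Start the Winlogbeat service and verify that it is running
Start-Service winlogbeat
Get-Service winlogbeat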

Next, run Winlogbeat using the winlogbeat.yml we updated previously
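For example, from the Winlogbeat installation directory (the -e flag writes logs to stderr so you can watch the output in the console):
.\winlogbeat.exe -c .\winlogbeat.yml -e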

Verifying Logs in Kibana
From Index Management
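In Kibana's Index Management page (Stack Management → Index Management), you should see a daily index matching the pattern defined in the Logstash output, e.g. winlogbeat-2025.03.01.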

From Discover

You can see the field we added in the filter of the Logstash pipeline, or you can search for it by field name in the search bar at the top.
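For example, typing a KQL query such as log_source : "winlogbeat" in the Discover search bar returns only the events that passed through this Logstash pipeline.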