Logstash – Receiving, Processing, and Forwarding Logs to ELK

Logstash

Logstash is a powerful tool designed for collecting, processing, and forwarding logs or event data. It acts as a data pipeline between different sources (e.g., system logs, applications, Beats) and destinations (e.g., Elasticsearch, databases, cloud storage).
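
To make the pipeline idea concrete, here is the smallest possible Logstash configuration (a generic illustration, not the one used in this lab): it reads lines typed on stdin and prints them as structured events on stdout.

input {
  stdin { }                         # source: lines typed into the terminal
}
output {
  stdout { codec => rubydebug }     # destination: pretty-printed events on the console
}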

  • In this setup, we are simulating a real-world logging pipeline where multiple Windows machines, each running Winlogbeat, collect logs and forward them to a central Logstash server. In enterprise environments, organizations often have dozens or even hundreds of endpoints generating logs. Instead of sending logs directly to Elasticsearch, Logstash acts as an intermediary, allowing for log enrichment, filtering, and transformation before forwarding the processed data to ELK for storage and analysis.

  • Due to resource limitations, we will be using only one Windows machine in this lab. However, in a real-world deployment, Winlogbeat would be installed on multiple Windows endpoints, and rather than installing it manually on each machine, organizations typically use Active Directory Group Policy (GPO) to deploy and configure it across all endpoints automatically, as sketched below. This ensures efficient management, consistency, and scalability of log collection in large infrastructures.
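
As a rough illustration of such a deployment, a GPO computer-startup script could copy Winlogbeat from a network share and register it with the install script that ships in the official Winlogbeat package. This is only a hypothetical sketch: the share path and destination folder are assumptions for illustration, not part of this lab.

# Hypothetical GPO startup script (PowerShell); paths below are assumptions for illustration
$source      = "\\fileserver\deploy\winlogbeat"       # assumed network share holding the Winlogbeat package
$destination = "C:\Program Files\winlogbeat"

if (-not (Test-Path "$destination\winlogbeat.exe")) {
    Copy-Item -Path $source -Destination $destination -Recurse
    & "$destination\install-service-winlogbeat.ps1"   # installer script bundled with Winlogbeat
    Start-Service winlogbeat
}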

Environment Setup
  1. Windows machine: Runs Winlogbeat to collect Windows event logs

  2. Ubuntu machine: Runs Logstash, which processes logs from its input (Winlogbeat) and forwards them to its output (Elasticsearch)

  3. Another Ubuntu machine: Runs Elasticsearch and Kibana


Logstash Installation and Configuration

  1. We can install Logstash with the following commands:

wget https://artifacts.elastic.co/downloads/logstash/logstash-8.17.2-amd64.deb
sudo dpkg -i logstash-8.17.2-amd64.deb
  2. Next, we will create a new configuration file to define the Logstash pipeline:

sudo nano /etc/logstash/conf.d/test.conf
input {
  beats {
    port => 5044                                      # listen for events from Beats agents (Winlogbeat)
  }
}

filter {
  mutate {
    add_field => { "log_source" => "winlogbeat" }     # mark where these logs came from
  }
}

output {
  elasticsearch {
    hosts => ["https://192.168.174.128:9200"]
    index => "winlogbeat-%{+YYYY.MM.dd}"              # daily index named after the event date
    user => "elastic"
    password => "*jh5ifuDT8I_U07tyH6S"
    ssl_certificate_verification => false             # lab only: skip TLS certificate checks
  }
}

Logstash pipeline explanation:

  • Input: Listens for logs from Beats on port 5044.

  • Filter: Adds a custom field to mark the logs as coming from Winlogbeat.

  • Output: Sends logs to Elasticsearch, using an index format based on the date.
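
The filter stage is where enrichment like this happens. As a hypothetical extension (not part of this lab's configuration), a conditional can act on fields that Beats add to every event, such as agent.type:

filter {
  mutate {
    add_field => { "log_source" => "winlogbeat" }
  }
  # Hypothetical extension: tag Winlogbeat events so they are easy to filter in Kibana
  if [agent][type] == "winlogbeat" {
    mutate { add_tag => ["windows_endpoint"] }
  }
}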

  3. Check that the configuration is valid (a successful test ends with a Configuration OK message):

sudo /usr/share/logstash/bin/logstash --path.settings /etc/logstash -t
  4. Now we will start and enable the Logstash service:

sudo systemctl start logstash.service
sudo systemctl enable logstash.service
sudo systemctl status logstash.service
  5. Next, we will run Logstash in the foreground using the configuration file we made (if the service started above is already running this pipeline, stop it first so the two instances do not compete for port 5044 and the data directory):

sudo /usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/test.conf

Now Logstash will be listening on port 5044 for incoming data; it will apply the filter we defined and then forward the processed events to Elasticsearch as configured.

Configuring Winlogbeat on Windows

  1. Open the Winlogbeat configuration file (winlogbeat.yml) and update the output.logstash section with your Logstash machine's IP, as sketched below
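
A minimal sketch of the relevant section; the IP address is an assumption, so substitute the address of your own Logstash machine (make sure the output.elasticsearch section is commented out, since Beats allow only one active output):

output.logstash:
  hosts: ["192.168.174.129:5044"]   # assumed Logstash IP; the port matches the beats input above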

  2. To confirm the configuration and the connection to Logstash, we will run:

.\winlogbeat.exe test config -c .\winlogbeat.yml
.\winlogbeat.exe test output
  3. Now, start the Winlogbeat service and confirm its status, as shown below
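
From an elevated PowerShell prompt (assuming the service was installed under its default name, winlogbeat):

Start-Service winlogbeat
Get-Service winlogbeat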

  4. Next, run Winlogbeat in the foreground using the winlogbeat.yml we updated previously, as shown below
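
The -e flag sends Winlogbeat's own logs to stderr, which is useful while testing:

.\winlogbeat.exe -c .\winlogbeat.yml -e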

Verifying Logs in Kibana

  1. From Index Management, confirm that a new daily winlogbeat index has been created

  2. From Discover, confirm that events are arriving

You can see the log_source field we added in the filter of the Logstash pipeline, or you can search for it by field name in the search bar at the top.
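
For example, a KQL query like the following in the Discover search bar isolates the events that passed through our pipeline:

log_source : "winlogbeat"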
