Elasticsearch API and Ingestion Pipeline

Data Processing without Logstash

Elasticsearch API

We will use the Elasticsearch API to:

  1. Create a new index containing three documents.

  2. Delete one document using the DELETE method and another using a query-based deletion.

  3. Reindex the remaining documents and perform an index flush.

Let's start by opening Dev Tools in Kibana.

  1. Now, let's create a new index and add three documents to it
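A minimal sketch of this step in Dev Tools, assuming the index is named `test_index` and using sample documents with `user` and `ip` fields (the same fields the ingestion pipeline transforms later):

```
PUT test_index

POST test_index/_bulk
{ "index": { "_id": "1" } }
{ "user": "alice", "ip": "10.0.0.1" }
{ "index": { "_id": "2" } }
{ "user": "bob", "ip": "10.0.0.2" }
{ "index": { "_id": "3" } }
{ "user": "carol", "ip": "10.0.0.3" }
```

The `_bulk` API expects one action line followed by one document line, each as a single line of JSON.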

  1. Delete a document using the DELETE method
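For example, to delete the document with `_id` 1 (index and ID values assumed from the sketch above):

```
DELETE test_index/_doc/1
```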

  1. Delete a document using a query-based deletion
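A query-based deletion uses the `_delete_by_query` API; here it removes every document whose `user` field matches an assumed value:

```
POST test_index/_delete_by_query
{
  "query": {
    "match": { "user": "bob" }
  }
}
```

Unlike `DELETE`, this can remove many documents in one request, and the response reports how many were deleted.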

  1. Reindex the remaining documents to another index
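A sketch of the reindex request; the destination name `test_index2` is assumed (it matches the index used with the pipeline later):

```
POST _reindex
{
  "source": { "index": "test_index" },
  "dest": { "index": "test_index2" }
}
```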

  1. Flush the index; flushing an index forces the data held in memory to be written to disk and starts a new transaction log
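Flushing is a single request against the index (index name assumed):

```
POST test_index2/_flush
```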

Ingestion Pipeline

Elasticsearch ingest pipelines allow pre-processing of documents before they are indexed. This method is lightweight, reducing the need for Logstash for simple transformations.

  1. Create a new pipeline that will append a new field "country", convert the type of the ip field, and uppercase the user field
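A sketch of such a pipeline; the pipeline name and the country value are assumptions. It uses the `set` processor to add the `country` field (the `append` processor also exists, but is meant for appending values to array fields), the `convert` processor to change the `ip` field's type, and the `uppercase` processor for `user`:

```
PUT _ingest/pipeline/add_country_pipeline
{
  "description": "Add country, convert ip field type, uppercase user",
  "processors": [
    { "set": { "field": "country", "value": "US" } },
    { "convert": { "field": "ip", "type": "ip" } },
    { "uppercase": { "field": "user" } }
  ]
}
```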

  1. To try the pipeline, we will apply it to test_index2 during the reindex request
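The pipeline can be set on the destination of the reindex request, so every document is processed as it is written; the index and pipeline names here are assumptions:

```
POST _reindex
{
  "source": { "index": "test_index" },
  "dest": {
    "index": "test_index2",
    "pipeline": "add_country_pipeline"
  }
}
```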

  1. View the index to see the updates
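A simple search returns the documents so you can confirm the new `country` field, the converted `ip` value, and the uppercased `user` (index name assumed):

```
GET test_index2/_search
```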
