Elasticsearch API and Ingestion Pipeline
Data Processing without Logstash
Elasticsearch API
We will use the Elasticsearch API to:
Create a new index containing three documents.
Delete one document using the DELETE method and another using a query-based deletion.
Reindex the remaining documents and perform an index flush.
Let's start by opening Dev Tools in Kibana

Now, let's create a new index and add three documents to it
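A minimal sketch of these requests in Dev Tools; the index name test_index and the example fields (user, ip, message) are placeholders chosen for illustration:

```
# Create the index (mappings and settings could be added in the request body)
PUT /test_index

# Index three documents with explicit IDs
PUT /test_index/_doc/1
{
  "user": "alice",
  "ip": "10.0.0.1",
  "message": "first document"
}

PUT /test_index/_doc/2
{
  "user": "bob",
  "ip": "10.0.0.2",
  "message": "second document"
}

PUT /test_index/_doc/3
{
  "user": "carol",
  "ip": "10.0.0.3",
  "message": "third document"
}
```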

Delete a document using the DELETE method
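For example, assuming the index and document ID from the sketch above:

```
# Remove the document with ID 1 by its ID
DELETE /test_index/_doc/1
```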

Delete a document using a query-based deletion
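A query-based deletion uses the _delete_by_query API; the match condition on the user field is only an example:

```
# Remove every document that matches the query
POST /test_index/_delete_by_query
{
  "query": {
    "match": {
      "user": "bob"
    }
  }
}
```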

Reindex the remaining documents to another index
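Something like the following, assuming the remaining documents are copied from test_index into test_index2 (the destination name is an assumption that matches the index used later with the pipeline):

```
# Copy all remaining documents into a new index
POST /_reindex
{
  "source": { "index": "test_index" },
  "dest":   { "index": "test_index2" }
}
```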

Flush the index; flushing an index means forcing the data in memory to be written to disk and creating a new transaction log
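For example, flushing the destination index from the previous step:

```
# Force in-memory operations to be committed to disk and start a new transaction log
POST /test_index2/_flush
```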

Ingestion Pipeline
Elasticsearch ingestion pipelines allow pre-processing of logs before indexing. This method is lightweight and reduces the need for Logstash for simple transformations.
Create a new pipeline that appends a new field "country", converts the type of the ip field, and uppercases the user field
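A sketch of such a pipeline; the pipeline name my_pipeline, the hard-coded country value, and the target type ip for the convert processor are assumptions:

```
PUT /_ingest/pipeline/my_pipeline
{
  "description": "Append country, convert the ip field type, uppercase the user field",
  "processors": [
    { "set":       { "field": "country", "value": "unknown" } },
    { "convert":   { "field": "ip", "type": "ip" } },
    { "uppercase": { "field": "user" } }
  ]
}
```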

To try the pipeline, we will apply it to test_index2 during the reindex request
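The pipeline can be referenced in the dest section of the reindex request, for example:

```
# Reindex through the pipeline so documents are transformed as they are written
POST /_reindex
{
  "source": { "index": "test_index" },
  "dest": {
    "index": "test_index2",
    "pipeline": "my_pipeline"
  }
}
```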

View the index to see the updates
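For example, a simple search over the destination index:

```
# Inspect the documents after the pipeline has been applied
GET /test_index2/_search
```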
