
Configure Serilog with Logstash and ElasticSearch

Today I was setting up Serilog logging that communicates with Logstash. Serilog can write to Elasticsearch directly with the Elasticsearch sink (https://github.com/serilog/serilog-sinks-elasticsearch), but that does not work with the Logstash http input. So to use the Logstash http input, we need to install the Serilog HTTP sink and configure it properly.
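
Both the HTTP sink and the formatter used later in this post ship as NuGet packages, so installing them boils down to two commands:

    dotnet add package Serilog.Sinks.Http
    dotnet add package Serilog.Formatting.Elasticsearch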

I prefer to use the DurableHttpUsingFileSizeRolledBuffers writer, because it stores the log events temporarily on disk if the Logstash connection is down. It can also be configured with the ArrayBatchFormatter to send logs in batches. If you use the default batch formatter, the payload ends up in the wrong shape and Elasticsearch stores each batch as one array instead of individual log events.
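
On the Logstash side a plain http input is enough. If I read the plugin docs right, the http input maps the application/json content type that the HTTP sink sends to its json codec, which unwraps the ArrayBatchFormatter array into separate events. A minimal pipeline sketch (the port and the Elasticsearch host below are placeholders, adjust them to your setup):

    input {
      http {
        port => 8080    # must match the port in the Serilog requestUri
      }
    }
    output {
      elasticsearch {
        hosts => ["localhost:9200"]
      }
    }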

Also, the default textFormatter does not provide a suitable format for Kibana: log events are searchable only by timestamp, message and id. Luckily the Serilog team has released their formatters as independent packages, so we can change the formatter to ElasticsearchJsonFormatter. The formatter can be found in the Serilog.Formatting.Elasticsearch NuGet package.
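
The practical difference shows up with structured properties: with ElasticsearchJsonFormatter, each property of a log event becomes its own queryable field in Kibana. A small illustration (the event and its property names are made up, and the JSON is roughly the shape the formatter emits, not a verbatim sample):

    // Hypothetical log event with two structured properties
    logger.Information("Order {OrderId} processed in {Elapsed} ms", 42, 127);

    // Each property becomes a separate searchable field, roughly:
    // { "@timestamp": "...", "level": "Information",
    //   "messageTemplate": "Order {OrderId} processed in {Elapsed} ms",
    //   "fields": { "OrderId": 42, "Elapsed": 127 } }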

Here is a complete sample that configures the Serilog HTTP sink to write to Logstash, from where the events can be forwarded to Elasticsearch and queried easily with Kibana.

var logger = new LoggerConfiguration()
    .WriteTo.DurableHttpUsingFileSizeRolledBuffers(
        // elasticHost/elasticPort point at the Logstash http input
        requestUri: $"http://{elasticHost}:{elasticPort}",
        batchFormatter: new ArrayBatchFormatter(),
        textFormatter: new ElasticsearchJsonFormatter())
    .CreateLogger();
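
To actually ship events, assign the logger and remember to flush on shutdown so the durable buffer files get drained (a minimal sketch; the message and its property are just examples):

    using Serilog;

    Log.Logger = logger;
    Log.Information("Application started at {StartedAt}", DateTime.UtcNow);

    // Flushes pending batches and closes the buffer files on shutdown
    Log.CloseAndFlush();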

Related reading: Writing logs to Elasticsearch with Fluentd using Serilog in ASP.NET Core (https://andrewlock.net/writing-logs-to-elasticsearch-with-fluentd-using-serilog-in-asp-net-core/)