
Send your CloudWatch logs to EraSearch

Centralize your logs in EraSearch, including your cloud logs. This article walks you through the AWS and Vector configurations.


Summary:

Adding logs to EraSearch centralizes them across clouds, giving you a holistic view of operations at any point in time. EraSearch is a modern log management platform built for high-volume workloads. DevOps, IT operations, and infrastructure teams use CloudWatch to monitor the performance of their cloud applications and infrastructure. Sending those logs to EraSearch for analysis lets teams take advantage of highly cost-optimized object storage and the fast query performance that is critical at high data volumes.

This article shows how to set up streaming of Amazon CloudWatch logs through Amazon Kinesis Data Firehose (Kinesis Firehose) to EraSearch.

Step 1: Configure AWS products to log to CloudWatch

This guide assumes that logs are already flowing into CloudWatch and are ready to be sent to Amazon Kinesis.
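
If you want to confirm that a log group is actually receiving events first, a quick check with the AWS CLI (v2) might look like the sketch below; the log group name /aws/lambda/my-function is only a placeholder.

# Confirm the log group exists
aws logs describe-log-groups --log-group-name-prefix "/aws/lambda/my-function"

# Tail recent events from that group (requires AWS CLI v2)
aws logs tail "/aws/lambda/my-function" --since 15m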

Step 2: Configure Kinesis Firehose

Using the links below, set up a Kinesis subscription, then stream your CloudWatch logs to the subscription you created. A CLI sketch follows the links.

https://docs.aws.amazon.com/AmazonCloudWatch/latest/logs/CreateDestination.html

https://aws.amazon.com/premiumsupport/knowledge-center/streaming-cloudwatch-logs/

https://docs.aws.amazon.com/AmazonCloudWatch/latest/logs/SubscriptionFilters.html#FirehoseExample
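
For reference, creating the subscription filter from the CLI looks roughly like this; the log group, filter name, delivery stream ARN, and role ARN are placeholders to replace with your own values:

aws logs put-subscription-filter \
  --log-group-name "/aws/lambda/my-function" \
  --filter-name "to-firehose" \
  --filter-pattern "" \
  --destination-arn "arn:aws:firehose:us-east-1:123456789012:deliverystream/cloudwatch-to-erasearch" \
  --role-arn "arn:aws:iam::123456789012:role/CWLtoFirehoseRole"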

Step 3: Send data to a destination

When configuring the Kinesis Firehose delivery stream, choose HTTP Endpoint as the destination and point it at the address where Vector will listen (configured in Step 4). The access key you set on the HTTP endpoint is the value Vector will check as ${AKF_PWD}.
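
If you prefer the CLI, creating a delivery stream with an HTTP endpoint destination looks roughly like the sketch below. The stream name, endpoint URL, bucket, and role ARNs are placeholders, and Firehose requires the S3 configuration for failed-delivery backup; the console walkthrough in the links above may be simpler.

aws firehose create-delivery-stream \
  --delivery-stream-name "cloudwatch-to-erasearch" \
  --delivery-stream-type DirectPut \
  --http-endpoint-destination-configuration \
    'EndpointConfiguration={Url=https://vector.example.com:8443,Name=vector,AccessKey=REPLACE_WITH_AKF_PWD},RoleARN=arn:aws:iam::123456789012:role/firehose-delivery-role,S3Configuration={RoleARN=arn:aws:iam::123456789012:role/firehose-delivery-role,BucketARN=arn:aws:s3:::my-firehose-backup}'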

Step 4: Configure Vector

Create a Vector configuration with an AWS Kinesis Firehose source. You will need the access key from Step 2: Configure Kinesis Firehose. The access key ${AKF_PWD} should be passed in from a secret, and ${VECPORT} is passed in on the docker command line (a docker run sketch follows the full configuration below).

data_dir = "/var/lib/vector"

[sources.akf]
type = "aws_kinesis_firehose"
address = "0.0.0.0:${VECPORT}"
access_key = "${AKF_PWD}"
record_compression = "auto"
tls.enabled = true
tls.crt_file = "/var/lib/vector/cert/local.eradb.com.crt"
tls.key_file = "/var/lib/vector/cert/local.eradb.com.key"

Next, parse each log line into JSON and add timestamp fields:

# Parse the incoming records as JSON and add timestamp fields
# See the Vector Remap Language reference for more info: https://vrl.dev
[transforms.parse_logs_dev]
type = "remap"
inputs = ["akf"]
source = '''
. = parse_json!(.message)
._lid = to_unix_timestamp(now(), unit: "nanoseconds")
._ts = to_unix_timestamp(now(), unit: "milliseconds")
'''

Lastly, we send the events to EraSearch using the Elasticsearch sink. Again, the password ${ERAPWD} should be stored in a secret and passed in on startup.

[sinks.EraSearch]
  type = "elasticsearch"
  inputs = ["parse_logs_dev"]
  endpoint = "https://${ERAHOST}:443"
  # Set this to the index you want to use; you will need it again in Step 5
  index = "logs-akf"
  auth.user = "${ERAUSR}"
  auth.password = "${ERAPWD}"
  auth.strategy = "basic"
  healthcheck.enabled = false

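# Optional: expose Vector's own internal metrics for Prometheus to scrape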
[sources.internal]
type = "internal_metrics"

[sinks.prometheus]
type = "prometheus_exporter"
inputs = ["internal"]
address = "0.0.0.0:9598"
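
With the configuration saved (for example as vector.toml), Vector can be started in Docker, passing ${VECPORT} and the secrets as environment variables; Vector interpolates ${...} references in the config at startup. This is a minimal sketch assuming the official timberio/vector image and that the TLS certificate files are in a local certs directory; the image tag, paths, and variable values are placeholders, and in production the secrets should come from your secrets manager rather than the shell.

docker run -d --name vector \
  -p ${VECPORT}:${VECPORT} \
  -p 9598:9598 \
  -e VECPORT="${VECPORT}" \
  -e AKF_PWD="${AKF_PWD}" \
  -e ERAHOST="${ERAHOST}" \
  -e ERAUSR="${ERAUSR}" \
  -e ERAPWD="${ERAPWD}" \
  -v "$(pwd)/vector.toml:/etc/vector/vector.toml:ro" \
  -v "$(pwd)/certs:/var/lib/vector/cert:ro" \
  timberio/vector:latest-alpine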

Step 5: View data in EraSearch

Now that your logs are being sent to EraSearch, it's easy to create a Grafana data source.
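
Before wiring up Grafana, you can optionally confirm that documents are arriving. A minimal sketch, assuming EraSearch's Elasticsearch-compatible search API is reachable from your workstation and that you used the logs-akf index from Step 4:

curl -s -u "${ERAUSR}:${ERAPWD}" \
  "https://${ERAHOST}:443/logs-akf/_search?size=1&pretty"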

This walkthrough uses Grafana 8.1.1 and assumes you have logged into Grafana and are on the Data sources page under Configuration.


Data sources may or may not already exist; in either case, click Add data source.


In the Filter by name or type field, type Elas and click the Elasticsearch entry that appears.


Once you click Select, the following screen is displayed. Enter the Name, the ${ERAHOST} URL, ${ERAUSR}, and ${ERAPWD}. Also, don't forget to toggle the Basic auth and With Credentials sliders so they show blue.


After those are entered, scroll to the bottom and enter the index (logs-akf, for instance), matching the value you set under sinks.EraSearch in Step 4. For the Time field name, remove @timestamp and replace it with _ts, set the Version to 7.10+, and click Save & test.


At this point you should have a valid datasource and can explore your data, build a dashboard, and create alerts.

To try CloudWatch logs with EraSearch and many more capabilities, please contact us to sign up for our free trial.