Forward Logs to a Central Logging Server

Overview

Aviatrix supports forwarding logs from the gateway(s) to a central logging server for processing. This is particularly useful for connecting to services such as Datadog, or for processing the logs before they are sent to another log service such as Splunk.

Step-by-step Deployment Guide

Solution Overview

In addition to direct integrations with services like Splunk and Sumologic, the Aviatrix Controller supports forwarding logs to another syslog server.

If your logging provider requires the data to be processed before it is ingested, you can forward the logs to a Linux server that handles the processing and forwarding. This solution guide walks you through that configuration.

Steps to complete:

[Diagram: solution overview]

  1. Create an EC2 instance that will receive logs from all gateways
  2. Configure rsyslogd on the instance
  3. Configure the Aviatrix Controller to forward logs to your new instance
  4. Update the security group for the instance to allow traffic from each Aviatrix Gateway
  5. Implement processing logic (or use an example provided by Aviatrix) to process and forward the logs

Create an EC2 instance

  1. Create a new EC2 instance to receive logs from the gateways
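
    If you prefer to launch the instance from the AWS CLI rather than the console, the sketch below shows one way to do it; the AMI, key pair, subnet, and security group IDs are placeholders to replace with your own values.

      # launch a small instance to act as the log relay (all IDs below are placeholders)
      aws ec2 run-instances \
          --image-id ami-0123456789abcdef0 \
          --instance-type t3.small \
          --key-name my-keypair \
          --subnet-id subnet-0123456789abcdef0 \
          --security-group-ids sg-0123456789abcdef0 \
          --tag-specifications 'ResourceType=instance,Tags=[{Key=Name,Value=aviatrix-log-relay}]'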

rsyslogd

  1. Install rsyslogd

    sudo add-apt-repository ppa:adiscon/v8-stable
    sudo apt-get update
    sudo apt-get install rsyslog

    Note

    (see http://www.rsyslog.com/ubuntu-repository/ for more details)

  2. Configure rsyslogd to accept messages on your desired port
    1. SSH and login to the EC2 instance
    2. Verify that /etc/rsyslog.conf is configured to listen on a UDP or TCP port. Optionally, update the port.

      # provides UDP syslog reception
      $ModLoad imudp
      $UDPServerRun 10514

      Note

      Use any port you wish here.
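
    Newer rsyslog releases (v8 and later) also accept an equivalent RainerScript form for the same settings; a minimal sketch, assuming port 10514:

      # RainerScript equivalent of the legacy directives above
      module(load="imudp")
      input(type="imudp" port="10514")

      # or, for TCP reception:
      module(load="imtcp")
      input(type="imtcp" port="10514")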

Enable Log Forwarding in Aviatrix Controller

  1. Login to your Aviatrix Controller
  2. Expand the Settings navigation group
  3. Click on Logging
  4. Scroll down to the Remote Syslog section. Click on the Disabled button to enable remote syslog.
  5. Enter the data

    Field                      Value
    Server                     Enter the IP address of the EC2 instance created in the earlier step
    Port                       Enter the port of the listening service
    Cert                       Upload the certificate (optional)
    Protocol                   Select TCP or UDP
    Optional Custom Template   Enter an rsyslog template (optional). See below for more details.
  6. Click Enable

Update Security Group of Receiving Instance

Allow inbound traffic on the selected UDP/TCP port to the EC2 instance you created earlier.
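
If you manage the security group with the AWS CLI rather than the console, a rule like the following sketch opens the port to a single gateway; the group ID, port, and gateway IP are placeholders for your own values. Repeat the command for each gateway.

    # allow one Aviatrix gateway to reach the log relay on UDP 10514 (placeholder values)
    aws ec2 authorize-security-group-ingress \
        --group-id sg-0123456789abcdef0 \
        --protocol udp \
        --port 10514 \
        --cidr 203.0.113.10/32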

Implement Processing Logic

Implement the logic to process the incoming logs and forward to the log service. A few examples are provided below.

Write Logs to S3

  1. Install AWS CLI

    Note

    You may need to install the package python-pip first

  2. Create a directory on the local file system (e.g., /var/log/aviatrix)

    sudo mkdir /var/log/aviatrix
  3. Change the ownership of this directory so that the rsyslog user (syslog) can write files to it

    sudo chown syslog:adm /var/log/aviatrix
    sudo chmod 750 /var/log/aviatrix
  4. Create a new rsyslogd configuration file /etc/rsyslog.d/22-aviatrix.conf with the following configuration:

    :msg, contains, "Aviatrix" /var/log/aviatrix/gateways.log
    
    # stop processing Aviatrix messages here so they are not also written to the
    # default log files; comment out the following line to let them continue.
    & stop
  5. Reload the rsyslogd configuration so the new file takes effect

    sudo /etc/init.d/rsyslog force-reload
  6. Create a script to move the log files to S3. There is a template below:

    #!/bin/sh
    
    DIR=/var/log/aviatrix
    if [ ! -d ${DIR} ]; then exit 1; fi
    DESTDIR=s3://mybucket
    
    current_time=$(date +%Y-%m-%dT%H-%M-%S)
    new_filename=gateways.${current_time}.log
    
    # rename the file
    if [ -f ${DIR}/gateways.log ]; then
        sudo mv ${DIR}/gateways.log ${DIR}/${new_filename}
        if [ $? -ne 0 ]; then exit 2; fi
    
        # HUP rsyslogd to start logging to new file
        sudo killall -HUP rsyslogd
        if [ $? -ne 0 ]; then exit 3; fi
    fi
    
    # copy any outstanding file(s) to s3 bucket
    cd ${DIR}
    for f in $(ls); do
      if [ "$f" != "gateways.log" ]; then
          aws s3 cp ${DIR}/$f ${DESTDIR}/$f
          if [ $? -eq 0 ]; then
              sudo rm -f ${DIR}/$f
          fi
      fi
    done
  7. Create a crontab entry to run this script as often as desired
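
    For example, assuming the script above is saved as /usr/local/bin/aviatrix-logs-to-s3.sh (the path is arbitrary) and marked executable, an entry in root's crontab (sudo crontab -e) that uploads every five minutes could look like:

      # upload accumulated Aviatrix gateway logs to S3 every 5 minutes
      */5 * * * * /usr/local/bin/aviatrix-logs-to-s3.sh >> /var/log/aviatrix-s3-upload.log 2>&1

    Running the script from root's crontab keeps the sudo calls inside it from prompting for a password.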

Datadog

For Datadog integration, please see this GitHub repository.

Use the following `Optional Custom Template`:

constant(value="tenant-identifier")
constant(value="   ")
property(name="timereported")
constant(value="   ")
property(name="msg")
constant(value="

")