Easily pipe `docker logs` output from an AWS ECS cluster into the AWS Elasticsearch Service for later visualization with Kibana, using Logstash (aka the ELK stack).
This repository may become deprecated when support for the AWS CloudWatch Logs logging driver is added to the ECS agent. @samuelkarp wrote the logging driver and happens to work for AWS on ECS, so this seems inevitable.
Prerequisites: Docker >= 1.8. If you use Docker Compose, make sure it is version >= 1.5.
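A quick way to confirm both versions from the command line:

```
docker --version           # should report 1.8.0 or newer
docker-compose --version   # only if you use Docker Compose; should report 1.5.0 or newer
```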
1. Spin up an Elasticsearch server. The easiest way to do this is via the AWS Elasticsearch Service:
   - Click "Create a new domain"
   - Set a domain name
   - Use the default options
   - Set an Access Policy. I suggest granting both IAM access (so your AWS account can write to Elasticsearch) and IP-specific access (so you can view logging output in Kibana). Here's a sample policy (be sure to change the region and replace xxxxxxxxxxxx with your 12-digit AWS account ID):

```
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::xxxxxxxxxxxx:root"
      },
      "Action": "es:*",
      "Resource": "arn:aws:es:us-west-2:xxxxxxxxxxxx:domain/my-elasticsearch-domain/*"
    },
    {
      "Sid": "",
      "Effect": "Allow",
      "Principal": {
        "AWS": "*"
      },
      "Action": "es:*",
      "Resource": "arn:aws:es:us-west-2:xxxxxxxxxxxx:domain/my-elasticsearch-domain/*",
      "Condition": {
        "IpAddress": {
          "aws:SourceIp": [
            "192.168.1.0",
            "192.168.1.1"
          ]
        }
      }
    }
  ]
}
```
2. Ensure your AWS credentials are available in `$HOME/.aws`. You can
configure this via the [AWS cli](https://aws.amazon.com/cli/)
command `aws configure`.
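For reference, `aws configure` writes files shaped roughly like this (the keys below are placeholders):

```
# ~/.aws/credentials
[default]
aws_access_key_id = AKIAXXXXXXXXXXXXXXXX
aws_secret_access_key = xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx

# ~/.aws/config
[default]
region = us-west-2
```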
3. Start the docker-logging container:

```
docker run -it -p 12201:12201/udp -v ~/.aws/:/root/.aws \
  -e ELASTICSEARCH_HOST=search-pschmitt-es-test-3pm4igbk4q3nr5racsahpugud4.us-west-2.es.amazonaws.com \
  pedros007/docker-logging /start_logstash.sh
```
If you do not have access to the `pedros007/docker-logging` Docker repo, build it yourself first:

```
docker build -t pedros007/docker-logging .
```
4. Start a Docker container which you want logged, using the Docker logging flags. Here's a simple example:

```
docker run --log-driver=gelf --log-opt gelf-address=udp://localhost:12201 \
  busybox /bin/sh -c 'while true; do echo "Hello $(date)"; sleep 1; done'
```
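To confirm events are arriving, ask Elasticsearch for its indices; Logstash typically writes to date-stamped `logstash-*` indices (again, the endpoint is a placeholder for your domain's):

```
curl "https://search-my-elasticsearch-domain-xxxxxxxxxxxx.us-west-2.es.amazonaws.com/_cat/indices?v"
```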
# Deploying to an ECS Cluster
1. Create an Elasticsearch Cluster with
[AWS Elasticsearch Service](https://aws.amazon.com/elasticsearch-service/)
(see setup, above) and make a note of the Elasticsearch URL.
2. Configure the ECS cluster. Here's how to do it with CloudFormation:
1. The
[AWS ECS-optimized AMI](https://aws.amazon.com/marketplace/pp/B00U6QTYI2)
(2015.09.b) is running docker-1.7.1 as of this
writing. [A post in the AWS forums](https://forums.aws.amazon.com/thread.jspa?messageID=683482)
states "[AWS is] testing 1.9 RC and plan to deliver it this
month." It's not ready yet, so we must manually upgrade Docker.
We also fetch the Logstash configuration & pass in the
Elasticsearch URL and start the Logstash container. Add this to
the `commands` section of your
[AWS::CloudFormation::Init](http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-init.html).
```
"03_upgrade_docker_for_log_driver_support": {
"command": {
"Fn::Join": [
"",
[
"#!/bin/bash -xe\n",
"service docker stop\n",
"cp /usr/bin/docker /usr/bin/docker.old\n",
"curl -o /usr/bin/docker https://get.docker.com/builds/Linux/x86_64/docker-1.9.0\n",
"service docker start\n"
]
]
}
},
"04_configure_docker_logstash": {
"command": {
"Fn::Join": [
"",
[
"#!/bin/bash -xe\n",
"docker run -d --restart=always -p 12201:12201/udp",
" -e ELASTICSEARCH_HOST=",
{
"Ref": "ElasticsearchAddress"
},
" -e AWS_REGION=",
{
"Ref": "AWS::Region"
},
" pedros007/docker-logging /start_logstash.sh\n"
]
]
}
}
```
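If the stack comes up but no logs appear, a quick sanity check over SSH on the instance is to confirm both commands took effect:

```
docker version   # client and daemon should both report 1.9.0
docker ps        # the pedros007/docker-logging container should be running
```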
2. Add `ElasticsearchAddress` to your `Parameters` section:

```
"ElasticsearchAddress": {
  "Type": "String",
  "Description": "Host of Elasticsearch server for logging. Do not add http:// or the port. Ensure the access policy permits access."
}
```
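When creating the stack, the parameter can be passed on the command line (the stack and template names here are placeholders):

```
aws cloudformation create-stack \
  --stack-name my-ecs-cluster \
  --template-body file://ecs-cluster.template \
  --capabilities CAPABILITY_IAM \
  --parameters ParameterKey=ElasticsearchAddress,ParameterValue=search-my-elasticsearch-domain-xxxxxxxxxxxx.us-west-2.es.amazonaws.com
```

`CAPABILITY_IAM` is needed because the template creates the IAM resources described in the next step.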
3. Make sure your EC2 Instance or Auto Scaling Group has an Instance Profile and Role which grant write access to your Elasticsearch service. Here are example CloudFormation resources which enable this. Make sure your `AWS::EC2::Instance` or `AWS::AutoScaling::LaunchConfiguration` has `IamInstanceProfile` set to the Instance Profile resource created by CloudFormation (in the example below, the setting would be `"IamInstanceProfile": { "Ref": "EC2InstanceProfile" }`):

```
"EC2Role": {
  "Type": "AWS::IAM::Role",
  "Metadata": {
    "Comment": "Defines all permissions which an EC2 Instance attached to ECS Cluster should have"
  },
  "Properties": {
    "AssumeRolePolicyDocument": {
      "Statement": [
        {
          "Effect": "Allow",
          "Principal": {
            "Service": [ "ec2.amazonaws.com" ]
          },
          "Action": [ "sts:AssumeRole" ]
        }
      ]
    },
    "Path": "/",
    "ManagedPolicyArns": [
      "arn:aws:iam::aws:policy/service-role/AmazonEC2ContainerServiceforEC2Role",
      "arn:aws:iam::aws:policy/AmazonESFullAccess",
      "arn:aws:iam::aws:policy/AmazonS3ReadOnlyAccess",
      "arn:aws:iam::aws:policy/AmazonSQSFullAccess"
    ]
  }
},
"EC2InstanceProfile": {
  "Type": "AWS::IAM::InstanceProfile",
  "Properties": {
    "Path": "/",
    "Roles": [ { "Ref": "EC2Role" } ]
  }
}
```
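Note that `AmazonESFullAccess` is broad. If you prefer a tighter grant, an inline policy scoped to the logging domain might look roughly like this (a sketch; adjust the region, account ID, and domain name, and verify the action list against your Logstash output plugin):

```
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [ "es:ESHttpGet", "es:ESHttpPost", "es:ESHttpPut" ],
      "Resource": "arn:aws:es:us-west-2:xxxxxxxxxxxx:domain/my-elasticsearch-domain/*"
    }
  ]
}
```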
4. Submit an ECS task definition which uses the gelf logging driver. The ContainerDefinition should include a section like this:

```
"logConfiguration": {
  "logDriver": "gelf",
  "options": {
    "gelf-address": "udp://localhost:12201",
    "tag": "nginx"
  }
}
```
Note the log option `tag` requires Docker >= 1.9. For Docker 1.8, use `gelf-tag` instead. Otherwise, ECS may report `Failed to initialize logging driver: unknown log opt 'tag' for gelf log driver`.
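One way to register such a task definition is with the AWS CLI (the JSON file name is a placeholder):

```
aws ecs register-task-definition --cli-input-json file://my-task-definition.json
```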
As of this writing, CloudFormation's `AWS::ECS::TaskDefinition` does not support the `logConfiguration` settings of an ECS task definition. Watch the CloudFormation release history to be notified when this is supported.
- The Graylog Extended Log Format (GELF) driver communicates via UDP, which can silently drop logging events. TCP/syslog can provide a more robust solution (see the sketch after this list). See this StackOverflow post for more details.
- There are many ways to pipe `docker logs` into Elasticsearch. This docker-logstash repo demonstrates a couple of options (gelf, lumberjack, syslog & tcp).
- Find user-configurable variables on lines starting with `ENV` in the Dockerfile.
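If UDP drops are a concern, Docker's syslog driver can ship over TCP instead. A minimal sketch, assuming something is listening on TCP port 5000 (for example, a Logstash syslog/tcp input, which this repo's gelf-based quickstart does not start by default):

```
docker run --log-driver=syslog \
  --log-opt syslog-address=tcp://localhost:5000 \
  busybox /bin/sh -c 'while true; do echo "Hello $(date)"; sleep 1; done'
```

To list the user-configurable variables mentioned above:

```
grep '^ENV' Dockerfile
```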