Amazon S3 plugin with Reduced Redundancy Storage for Fluentd
This is a fork of the original fluent-plugin-s3 plugin with a focus on large file handling:

- Plugin uses `@type s3_sr` instead of `@type s3` in configurations
- Added memory-efficient gzip handling for large files (70MB+)
- Version restarted at 1.0.0
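A minimal output configuration might look like the following. This is a sketch: only `@type` differs from upstream, and parameter names such as `aws_key_id`, `aws_sec_key`, `s3_bucket`, and `s3_region` are assumed to match the upstream fluent-plugin-s3 plugin.

```
<match pattern>
  @type s3_sr

  aws_key_id YOUR_AWS_KEY_ID
  aws_sec_key YOUR_AWS_SECRET_KEY
  s3_bucket YOUR_S3_BUCKET_NAME
  s3_region us-east-1
  path logs/
</match>
```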
The s3-sr output plugin buffers event logs in local files and uploads them to S3 periodically. This fork adds optimizations for handling large gzip files.
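The memory-efficient gzip handling can be sketched as chunked streaming: rather than decompressing a whole 70MB+ payload into memory at once, read it in fixed-size chunks. This is an illustrative sketch, not the fork's actual implementation; the helper name and chunk size are assumptions.

```ruby
require "zlib"
require "stringio"
require "tempfile"

# Hypothetical chunk size; the fork's real value may differ.
CHUNK_SIZE = 16 * 1024 * 1024 # 16 MiB

# Stream-decompress a gzipped IO to a file so that at most one
# chunk of decompressed data is resident in memory at a time.
def decompress_to_file(gzipped_io, dest_path)
  Zlib::GzipReader.wrap(gzipped_io) do |gz|
    File.open(dest_path, "wb") do |out|
      # gz.read(n) returns up to n bytes, or nil at end of stream.
      while (chunk = gz.read(CHUNK_SIZE))
        out.write(chunk)
      end
    end
  end
end
```

The same chunked loop works for any IO source, so it applies equally to objects downloaded from S3.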
This plugin splits files strictly by the time of the event logs (not the time when the logs are received). For example, if a log '2011-01-02 message A' arrives and then another log '2011-01-03 message B' arrives, the former is stored in the "20110102.gz" file and the latter in the "20110103.gz" file.
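Daily event-time splitting like the example above could be configured roughly as follows. This is a sketch assuming the fork keeps the upstream fluent-plugin-s3 buffer and key-format parameters.

```
<match pattern>
  @type s3_sr

  s3_bucket YOUR_S3_BUCKET_NAME
  path logs/
  <buffer time>
    timekey 1d          # chunk by event date, e.g. 20110102.gz vs 20110103.gz
    timekey_use_utc true
  </buffer>
</match>
```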
The s3-sr input plugin reads data from S3 periodically. It uses an SQS queue in the same region as the S3 bucket, and this version includes memory optimizations for handling large gzip files (70MB+) efficiently. You must set up an SQS queue and an S3 event notification before using this plugin.
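A minimal input configuration might look like this sketch; the `<sqs>` section and `queue_name` parameter are assumed to follow the upstream fluent-plugin-s3 input plugin.

```
<source>
  @type s3_sr

  s3_bucket YOUR_S3_BUCKET_NAME
  s3_region us-east-1
  <sqs>
    queue_name YOUR_SQS_QUEUE_NAME
  </sqs>
</source>
```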
| fluent-plugin-s3 | fluentd | ruby |
|---|---|---|
| >= 1.0.0 | >= v0.14.0 | >= 2.1 |
| < 1.0.0 | >= v0.12.0 | >= 1.9 |
Simply use RubyGems:

    # install latest version
    $ gem install fluent-plugin-s3-sr --no-document # for fluentd v1.0 or later

    # If you need to install a specific version, use the -v option
    $ gem install fluent-plugin-s3-sr -v 1.0.0 --no-document
Both the S3 input and output plugins provide several credential methods for authentication/authorization.
See Configuration: credentials for details.
See Configuration: Output for details.
See Configuration: Input for details.
See the Migration guide from v0.12 for details.
| Web site | http://fluentd.org/ |
|---|---|
| Documents | http://docs.fluentd.org/ |
| Source repository | http://github.com/fluent/fluent-plugin-s3 |
| Discussion | http://groups.google.com/group/fluentd |
| Author | Sadayuki Furuhashi |
| Copyright | (c) 2011 FURUHASHI Sadayuki |
| License | Apache License, Version 2.0 |