Fluentd
Note
You can create a trigger that will launch a function in Cloud Functions or a container in Serverless Containers when data is sent to the stream. Read more about triggers for Data Streams.
- Download and install Fluentd.

- Install the Fluentd plugin to support the AWS Kinesis Data Streams protocol. This protocol will be used for streaming data:

  sudo td-agent-gem install fluent-plugin-kinesis
- In the management console, select the folder with the stream.

- Select Data Streams.

- Select the data stream.

- Click Connect and go to the Fluentd tab.
- Copy the sample configuration file and paste it into the /etc/td-agent/td-agent.conf file. Replace <key_id> with the key ID and <secret> with the secret key.

  Sample configuration file:

  <system>
    log_level debug
  </system>
  <source>
    @type http
    @id input_http
    port 8888
  </source>
  <match kinesis>
    @type copy
    <store>
      @type stdout
    </store>
    <store>
      @type kinesis_streams
      aws_key_id <key_id>
      aws_sec_key <secret>
      # kinesis stream name
      stream_name /ru-central1/aoegtvhtp8ob9rqq8sto/cc8004q4lbo6bv9iivr0/test
      # region
      region ru-central-1
      endpoint https://yds.serverless.yandexcloud.net
      <buffer>
        flush_interval 5s
      </buffer>
    </store>
  </match>
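For the new configuration to take effect, the agent needs to be restarted. A minimal sketch, assuming td-agent was installed from the standard packages and runs as a systemd service:

```shell
# Restart the agent so it picks up the edited /etc/td-agent/td-agent.conf:
sudo systemctl restart td-agent

# Check the last lines of the operation log to confirm the agent
# started without configuration errors:
sudo tail -n 20 /var/log/td-agent/td-agent.log
```

If td-agent is managed differently on your system (e.g., via an init script), use the matching restart command instead.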
- Send the test data to Fluentd:

  curl -X POST -d 'json={"user_id":"user1", "score": 100}' http://localhost:8888/kinesis
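To see buffering and batching in action, you can send several records in a row; this sketch assumes the Fluentd HTTP input from the configuration above is still listening on localhost:8888 (the `|| true` keeps the loop going if the agent is not reachable yet):

```shell
# Send three test records with different user IDs and scores.
for i in 1 2 3; do
  # Fluentd's HTTP input expects a form-encoded body with a `json` field;
  # the /kinesis path becomes the event tag matched by <match kinesis>.
  body="json={\"user_id\":\"user$i\", \"score\": $((i * 10))}"
  curl -s -X POST -d "$body" http://localhost:8888/kinesis || true
done
```

With the 5-second flush_interval from the sample configuration, the records are buffered and written to the stream in a single put_records batch.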
If the setup is successful, the Fluentd operation log (/var/log/td-agent/td-agent.log) will include a message about receiving the data and sending it to Yandex Data Streams over the AWS Kinesis Data Streams protocol:

kinesis: {"json":"message"}
DEBUG -- : [Aws::Kinesis::Client 200 0.628973 0 retries] put_records(stream_name:"/ru-central1/aoeu1kuk2dhtaupdb1es/cc8029jgtuabequtgtbv/fluentd_stream",records:[{data:"{\"message\":\"Write chunk 5c0cf5c556654e99cac84b6e231347ba / 2 records / 0 KB\"}\n",partition_key:"6ec03a4e3ba832c85e80290161c1df8e"},{data:"{\"message\":\"Finish writing chunk\"}\n",partition_key:"8ada32f7373e1ab4c48fb96da43d59cf"},{data:"{\"json\":\"message\"}\n",partition_key:"70f21f2decfc90b6f19752cd6e66e611"}])