Log Streaming to Datadog

This guide follows the general pattern for streaming logs from applications running on CloudFlow and provides the specifics for Datadog.

Obtain the following information from your Datadog account:

  • DATADOG_API_KEY: this is shown during the Datadog setup wizard, or you can generate one in the API Keys section of Organization Settings.
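
If you prefer to substitute the key from the shell rather than editing the manifest by hand, you can export it now. This is an optional sketch; the value shown is a placeholder, not a real key.

# Keep the key in an environment variable for the substitution step shown later
export DATADOG_API_KEY="<your-datadog-api-key>"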

Deployment

The following deployment runs a Fluentd log forwarder in your CloudFlow project, gathering logs from the other pods in that same project. Replace DATADOG_API_KEY with the key you obtained above. The section.io/logstream-destination: "true" label on the Service marks it as the log destination, and the section.io/logstream-collect: "false" label on the forwarder pod keeps its own output out of the collection loop.

datadog-logs-deployment.yaml
apiVersion: v1
kind: Service
metadata:
  labels:
    app: fluentd
    section.io/logstream-destination: "true"
  name: fluentd
  namespace: default
spec:
  ports:
  - name: fluentdudp
    port: 5160
    protocol: UDP
    targetPort: 5160
  selector:
    app: fluentd
---
apiVersion: apps/v1
kind: Deployment
metadata:
  name: fluentd
  namespace: default
  labels:
    app: fluentd
spec:
  replicas: 1
  selector:
    matchLabels:
      app: fluentd
  template:
    metadata:
      labels:
        app: fluentd
        section.io/logstream-collect: "false"
    spec:
      containers:
      - name: fluentd
        image: ghcr.io/section/fluentd-datadog:master
        imagePullPolicy: Always
        resources:
          requests:
            memory: ".5Gi"
            cpu: "500m"
          limits:
            memory: ".5Gi"
            cpu: "500m"
        volumeMounts:
        - name: config
          mountPath: /fluentd/etc/fluent.conf
          readOnly: true
          subPath: fluent.conf
      volumes:
      - name: config
        configMap:
          name: fluent-conf
---
apiVersion: v1
kind: ConfigMap
metadata:
  name: fluent-conf
  namespace: default
data:
  fluent.conf: |-
    <source>
      @type udp
      tag all_cloudflow_logs
      <parse>
        @type json
      </parse>
      port 5160
      message_length_limit 1MB
    </source>

    <match all_cloudflow_logs>
      @type datadog
      @id awesome_agent
      api_key DATADOG_API_KEY

      <buffer>
        @type memory
        flush_thread_count 4
        flush_interval 3s
        chunk_limit_size 5m
        chunk_limit_records 500
      </buffer>
    </match>

Apply these resources with kubectl apply -f datadog-logs-deployment.yaml in the same project where the pods to be logged are running.
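
For example, assuming the key was exported as DATADOG_API_KEY in your shell, the following sketch substitutes the placeholder, applies the manifest, and then checks that the forwarder pod (selected by its app: fluentd label) is running:

# Substitute the placeholder and apply; assumes DATADOG_API_KEY is set in your shell
sed "s|DATADOG_API_KEY|${DATADOG_API_KEY}|" datadog-logs-deployment.yaml | kubectl apply -f -

# Confirm the forwarder pod is running
kubectl get pods -l app=fluentd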

View Logs in Datadog

Log in to your Datadog account and open the Log Explorer to see your logs.
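
If no logs have arrived yet, you can generate a test entry by sending a JSON line to the forwarder's UDP port from a temporary pod in the same project. This is a minimal sketch: the bash image, pod name, and message text are illustrative, and it assumes the fluentd Service name resolves from that pod.

# Send one JSON log line to the fluentd Service over UDP (uses bash's /dev/udp redirection)
kubectl run logtest --rm -it --restart=Never --image=bash -- \
  bash -c 'echo "{\"message\":\"test log from CloudFlow\"}" > /dev/udp/fluentd/5160'

The entry should appear in Datadog within a few seconds, subject to the 3s flush_interval configured in the buffer section above.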