Tailing Kubernetes JSON Logs? jq can help
A bit of JSON parsing goes a long way
The Problem
Do you have logs that look like this? Maybe you’re just tailing logs from a Kubernetes pod, such as kubectl logs -f my-pod-12345, and you see something like the following:
{"timestamp": "21031", "level":"warn", "message": "something go brr", "user_context":"1234"}
{"timestamp": "21031", "level": "info", "message": "something else go brr", "user_context": "1234"}
Or maybe your logs have so many fields that each line wraps, and you can barely tell what’s happening.
What can we do to handle these annoying log streams? Why didn’t we just use logfmt?!
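One quick, partial fix (assuming jq is installed locally) is to pipe each line through jq’s identity filter, which pretty-prints the JSON. The catch: it errors out on any line that isn’t valid JSON, which is exactly the gap the filter in the next section covers.

```shell
# Pretty-print every JSON log line (pod name reused from the example above).
# This fails loudly on non-JSON lines, so it's only a first aid.
kubectl logs -f my-pod-12345 | jq '.'
```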
Fear not! I have a simple solution: jq
The Solution
This jq filter is slightly involved, since it handles the case where a line isn’t valid JSON, but behold!
kubectl logs -f my-pod-12345 |
jq -Rr '. as $line | try (fromjson | [.timestamp, .level, .message] | @tsv ) catch $line'
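You can try the filter without a live cluster by feeding it a couple of sample lines with printf, one valid JSON and one not (the field names mirror the log examples above):

```shell
# Valid JSON becomes tab-separated timestamp/level/message;
# anything else falls through the `catch` and prints unchanged.
printf '%s\n' \
  '{"timestamp": "21031", "level": "warn", "message": "something go brr"}' \
  'not json at all' |
jq -Rr '. as $line | try (fromjson | [.timestamp, .level, .message] | @tsv) catch $line'
# 21031	warn	something go brr
# not json at all
```

The -R flag makes jq read each input line as a raw string instead of parsing it, and -r prints the result without JSON quoting, so the output is plain tab-separated text.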
To break this down specifically:
'. as $line | try (fromjson | [.timestamp, .level, .message] | @tsv ) catch $line'
- the . as $line sets the input text to a variable, $line, which is what’s being read from the…