Quoted from the jq official homepage:
jq is like sed for JSON data - you can use it to slice and filter and map and transform structured data with the same ease that sed, awk, grep and friends let you play with text.
It is indeed a powerful tool for inspecting JSON data. You can select specific data, transform objects based on conditions, and even define functions for advanced scripting.
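For example, a filter can define its own helper function inline. The snippet below is a minimal sketch: the input is a made-up JSON object passed via echo instead of a file, and kb is just a hypothetical helper name.

echo '{"bytes": 10240}' | jq 'def kb: . / 1024; .bytes | kb'   # => 10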
These are some sample jq queries:
# all entries -> each data entry -> "fieldname$" -> index 0 -> childname -> slice [5,10)
jq '.[].data[]["fieldname$"][0]."childname"[5:10]' filename.json
jq '.[]?.data[]?["fieldname$"][0]."childname"?[5:10]' filename.json # with error handling
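# a worked sample of the path above on a made-up inline document
echo '[{"data": [{"fieldname$": [{"childname": [0,1,2,3,4,5,6,7,8,9,10]}]}]}]' | jq '.[].data[]["fieldname$"][0]."childname"[5:10]'   # => [5,6,7,8,9]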
jq '.error // .msg' filename.json # the // alternative operator: emit .error if it is neither false nor null, otherwise emit .msg
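# e.g. with made-up inline input: .error is null here, so .msg is emitted instead
echo '{"error": null, "msg": "ok"}' | jq '.error // .msg'   # => "ok"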
# pipes combine filters: each output of the left filter becomes the input of the right filter
jq '.[] | .data[] | .["fieldname$"] | .[0]' filename.json
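# e.g. with made-up inline input: each element produced by .[] is piped into length
echo '[[1,2],[3]]' | jq '.[] | length'   # => 2, then 1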
# array and object construction
jq '{(.key): [.msg, .data[]], time: .time}' filename.json
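# a worked sample with made-up inline input: the parenthesized key is computed from the data
echo '{"key": "job1", "msg": "done", "data": [1,2], "time": 99}' | jq '{(.key): [.msg, .data[]], time: .time}'
# => {"job1": ["done", 1, 2], "time": 99}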
# keep or remove data at the given paths (pick requires jq 1.7+)
jq 'pick(.msg, .time, .error)' filename.json
jq 'del(.error)' filename.json
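# e.g. with made-up inline input: pick keeps only the listed paths, del drops one
echo '{"msg": "hi", "time": 1, "error": "boom"}' | jq 'pick(.msg, .time)'   # => {"msg": "hi", "time": 1}
echo '{"msg": "hi", "error": "boom"}' | jq 'del(.error)'   # => {"msg": "hi"}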
# select(f) passes the input through only when the condition f is truthy
jq 'select(has("error") or (.level == "debug" and (has("msg") | not)))' filename.json
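# e.g. on two made-up records piped in as a stream: only the one matching the condition passes through
echo '{"level": "debug"} {"level": "info", "msg": "up"}' | jq 'select(has("error") or (.level == "debug" and (has("msg") | not)))'
# => {"level": "debug"}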
# map(f) is equivalent to '[.[] | f]'
# map_values(f) is equivalent to .[] |= f (update assignment)
# to_entries converts an object into an array of {key, value} entries (from_entries reverses it)
# with_entries(f) is a shorthand for 'to_entries | map(f) | from_entries'
jq 'to_entries | map(select(.value) | .key)' filename.json
jq 'with_entries(select(.value) | (.key |= tostring))' filename.json
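# e.g. with made-up inline input: list the keys with truthy values, or keep only those pairs
# (the second command is a simplified variant of the with_entries filter above)
echo '{"a": 1, "b": null, "c": "x"}' | jq 'to_entries | map(select(.value) | .key)'   # => ["a", "c"]
echo '{"a": 1, "b": null, "c": "x"}' | jq 'with_entries(select(.value))'   # => {"a": 1, "c": "x"}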
# indices(s) lists every index in the input where s occurs
jq 'indices(true)' filename.json
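# e.g. on a made-up array: the value true occurs at indexes 0 and 2
echo '[true, false, true, null]' | jq 'indices(true)'   # => [0, 2]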
# conditional logic with if-then-else
jq 'map(.data | if . == null then null else length end)' filename.json
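# e.g. with made-up inline input: report each entry's data length, or null when data is missing
echo '[{"data": [1,2,3]}, {"data": null}]' | jq 'map(.data | if . == null then null else length end)'   # => [3, null]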