How to convert a txt file to a `json` object using shell?
I have a text file, which I want to convert to a json object:
MAX_PDQPRIORITY: 80
DS_MAX_QUERIES: 50
DS_MAX_SCANS: 1048576
DS_NONPDQ_QUERY_MEM: 100000 KB
DS_TOTAL_MEMORY: 1000000 KB
My script's output is wrong, and I have to edit it into JSON by hand.
How do I make this change in shell?
Desired output:
[
{
"MAX_PDQPRIORITY":"80",
"DS_MAX_QUERIES":"50",
"DS_MAX_SCANS":"1048576",
"DS_NONPDQ_QUERY_MEM":"100000",
"DS_TOTAL_MEMORY":"1000000"
}
]
Script:
#!/bin/bash
# date:2019-02-02
# Informix: show MGM data.
LANG=EN
pathfile='/home/ampmon/agents/zabbix-agent/script/informix/text'
#expect mgm.#expect |grep -Ev 'Password:|spawn|Invalid' >$pathfile/mgm1.txt
cat $pathfile/mgm1.txt|grep MGM -A 8|grep -Ev 'MGM|-|^$' >$pathfile/mgm.txt
check=`cat $pathfile/mgm.txt|wc -l`
if [ $check -eq 0 ];then
echo "No query results"
exit 1
fi
MAX_PDQPRIORITY=($(cat $pathfile/mgm.txt|grep MAX_PDQPRIORITY |awk -F[:] '{print $2}'|awk '{print $1*1.00}'))
DS_MAX_QUERIES=($(cat $pathfile/mgm.txt|grep DS_MAX_QUERIES |awk -F[:] '{print $2}'|awk '{print $1}'))
DS_MAX_SCANS=($(cat $pathfile/mgm.txt|grep DS_MAX_SCANS |awk -F[:] '{print $2}'|awk '{print $1}'))
DS_NONPDQ_QUERY_MEM=($(cat $pathfile/mgm.txt|grep DS_NONPDQ_QUERY_MEM |awk -F[:] '{print $2}'|awk '{print $1}'))
DS_TOTAL_MEMORY=($(cat $pathfile/mgm.txt|grep DS_TOTAL_MEMORY |awk -F[:] '{print $2}'|awk '{print $1}'))
printf '\t[\n'
printf '\t\t{\n'
printf "\t\t\t \"MAX_PDQPRIORITY\":\"${MAX_PDQPRIORITY}\",\"DS_MAX_QUERIES\":\"${DS_MAX_QUERIES}\",\"DS_MAX_SCANS\":\"${DS_MAX_SCANS}\",\"DS_NONPDQ_QUERY_MEM\":\"${DS_NONPDQ_QUERY_MEM}\",\"DS_TOTAL_MEMORY\":\"${DS_TOTAL_MEMORY}\"}\n"
printf "\t]\n"
My current output:
[
{
","DS_NONPDQ_QUERY_MEM":"100000","DS_TOTAL_MEMORY":"1000000"}ES":"50
]
Can someone help me?
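A note on the mangled line above: output where later text wraps back over the start of the line is the usual symptom of carriage returns (`\r`, DOS-style line endings) in the captured file. A minimal sketch of stripping them with `tr` before parsing:

```shell
# Simulate a line captured with DOS line endings: the trailing \r survives
# into the variable and later makes printed output wrap back over itself.
raw=$(printf 'DS_MAX_QUERIES: 50\r\n')
# Deleting carriage returns with tr before any parsing avoids the mangling.
clean=$(printf '%s' "$raw" | tr -d '\r')
printf '%s\n' "$clean"
```

If this is the cause, adding `tr -d '\r'` to the pipeline that produces `mgm.txt` should fix the overlapping output.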
Solution 1:[1]
If jq is available, please try:
jq -s -R '[[ split("\n")[] | select(length > 0) | split(": +";"") | {(.[0]): .[1]}] | add]' input.txt
Output:
[
{
"MAX_PDQPRIORITY": "80",
"DS_MAX_QUERIES": "50",
"DS_MAX_SCANS": "1048576",
"DS_NONPDQ_QUERY_MEM": "100000 KB",
"DS_TOTAL_MEMORY": "1000000 KB"
}
]
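The jq output above keeps the " KB" units, while the desired output drops them. One way to match it exactly (an extra normalization step, not part of the original answer) is to strip the units before feeding the same jq filter:

```shell
# Drop a trailing " KB" from each line before JSON conversion;
# the result can then be piped into the jq filter shown above unchanged.
printf 'DS_NONPDQ_QUERY_MEM: 100000 KB\nDS_TOTAL_MEMORY: 1000000 KB\n' |
sed 's/ KB$//'
# prints the same two lines without the trailing " KB"
```

On the real file: `sed 's/ KB$//' input.txt | jq -s -R '...'` with the filter unchanged.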
As an alternative, if Python is an option, the following will work as well:
#!/bin/bash
python -c '
import re
import json
import collections as cl

result = []
with open("input.txt") as f:
    od = cl.OrderedDict()
    for line in f:
        key, val = re.split(r":\s*", line.rstrip("\r\n"))
        od[key] = val
    result.append(od)
print(json.dumps(result, indent=4))
'
Hope this helps.
Solution 2:[2]
For a simple translation, try using awk; it only reads the file once:
BEGIN {
    print "{"
    sep = ""
}
{
    name = substr($1, 1, length($1) - 1)   # strip the trailing colon
    value = $2                             # field 2 only, so the units are dropped
    printf "%s\t\"%s\":\"%s\"", sep, name, value
    sep = ",\n"
}
END {
    print "\n}"
}
This strips the trailing colon from field 1, then prints the values surrounded by double-quotes. It also silently drops the units (KB), as your sample output indicates.
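A self-contained run of the awk approach on sample input (the program is inlined here; a separator variable is tracked so the last entry carries no trailing comma, keeping the JSON valid):

```shell
# Convert "NAME: value [units]" lines to a JSON-ish object with awk.
printf 'MAX_PDQPRIORITY: 80\nDS_TOTAL_MEMORY: 1000000 KB\n' | awk '
BEGIN { print "{"; sep = "" }
{
    name = substr($1, 1, length($1) - 1)        # strip the trailing colon
    printf "%s\t\"%s\":\"%s\"", sep, name, $2   # $2 only, so " KB" is dropped
    sep = ",\n"
}
END { print "\n}" }'
```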
Solution 3:[3]
Or Perl, with the JSON module:
perl -MJSON -lne ' @F = split(/:?\s+/); $data{$F[0]} = $F[1] } END { print encode_json [\%data] ' file
Without the module:
perl -lne ' @F = split(/:?\s+/); push @data, sprintf(q{"%s":"%s"}, map {s/"/\\"/g; $_} @F[0,1]); } END { print "[{", join(",", @data), "}]"; ' file
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
| Solution | Source |
|---|---|
| Solution 1 | tshiono |
| Solution 2 | Jeff Schaller |
| Solution 3 | glenn jackman |
