Does the Go standard library have a func to read a csv file into []map[string]string?
I'd like to read a csv file from disk into a []map[string]string datatype, where the slice index is the line number and the map key is the corresponding header (line 1) of the csv file.
I could not find anything in the standard library to accomplish this.
Solution 1:[1]
Based on the replies, it sounds like there is nothing in the standard library, such as ioutil, to read a csv file into a map.
The following function, given a path to a csv file, converts it into a slice of map[string]string.
Update: based on a comment, I decided to also provide my MapToCSVFile() func, which writes the map back to a csv file.
package main
import (
	"bytes"
	"encoding/csv"
	"fmt"
	"io/ioutil"
	"os"
	"path/filepath"
	"sort"
	"strings"
)
// CSVFileToMap reads a csv file into a slice of maps.
// The slice index is the data-row number (header row excluded)
// and each map key is a column name from the header row.
func CSVFileToMap(filePath string) (returnMap []map[string]string, err error) {
	// read csv file
	csvfile, err := os.Open(filePath)
	if err != nil {
		return nil, fmt.Errorf("opening csv file: %w", err)
	}
	defer csvfile.Close()

	reader := csv.NewReader(csvfile)
	rawCSVdata, err := reader.ReadAll()
	if err != nil {
		return nil, fmt.Errorf("reading csv file: %w", err)
	}

	header := []string{} // holds first row (header)
	for lineNum, record := range rawCSVdata {
		if lineNum == 0 {
			// for the first row, build the header slice
			for i := 0; i < len(record); i++ {
				header = append(header, strings.TrimSpace(record[i]))
			}
		} else {
			// for each cell, map[string]string k=header v=value
			line := map[string]string{}
			for i := 0; i < len(record); i++ {
				line[header[i]] = record[i]
			}
			returnMap = append(returnMap, line)
		}
	}
	return returnMap, nil
}
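The core header-to-map technique above can also be shown in a self-contained form that reads from an in-memory string instead of a file (the data here is made up for illustration):

```go
package main

import (
	"encoding/csv"
	"fmt"
	"strings"
)

// csvToMaps converts csv text into a slice of maps keyed by the header row,
// the same technique CSVFileToMap applies to a file on disk.
func csvToMaps(in string) ([]map[string]string, error) {
	rows, err := csv.NewReader(strings.NewReader(in)).ReadAll()
	if err != nil {
		return nil, err
	}
	header := rows[0]
	var records []map[string]string
	for _, row := range rows[1:] {
		m := map[string]string{}
		for i, cell := range row {
			m[header[i]] = cell
		}
		records = append(records, m)
	}
	return records, nil
}

func main() {
	records, err := csvToMaps("name,age\nalice,30\nbob,25\n")
	if err != nil {
		panic(err)
	}
	fmt.Println(records[0]["name"], records[1]["age"]) // alice 25
}
```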
// MapToCSVFile writes a slice of maps into a csv file.
// filterFields restricts output to only those fields and controls
// their column order; when empty, all fields are written alphabetically.
func MapToCSVFile(inputSliceMap []map[string]string, filePath string, filterFields []string) (err error) {
	var headers []string  // slice of each header field
	var line []string     // slice of each line field
	var csvLine string    // one line converted to csv
	var csvContent string // final output containing header and lines

	// iterate over the slice to collect every key (csv header) in the maps,
	// using an empty map[string]struct{} to get UNIQUE keys; no value needed
	headerMap := make(map[string]struct{})
	for _, record := range inputSliceMap {
		for k := range record {
			headerMap[k] = struct{}{}
		}
	}
	// convert the unique header set to a slice
	for headerValue := range headerMap {
		headers = append(headers, headerValue)
	}

	// filter to filterFields and maintain their order
	var filteredHeaders []string
	if len(filterFields) > 0 {
		for _, filterField := range filterFields {
			for _, headerValue := range headers {
				if filterField == headerValue {
					filteredHeaders = append(filteredHeaders, headerValue)
				}
			}
		}
	} else {
		filteredHeaders = append(filteredHeaders, headers...)
		sort.Strings(filteredHeaders) // map iteration order is random; sort for stable output
	}

	// write headers as the first line
	csvLine, err = WriteAsCSV(filteredHeaders)
	if err != nil {
		return err
	}
	csvContent += csvLine + "\n"

	// iterate over inputSliceMap, emitting each map's values
	// in the order given by the header slice
	for _, record := range inputSliceMap {
		line = []string{}
		for _, h := range filteredHeaders {
			line = append(line, record[h])
		}
		csvLine, err = WriteAsCSV(line)
		if err != nil {
			return err
		}
		csvContent += csvLine + "\n"
	}

	// make the dir in case it's not there
	if err = os.MkdirAll(filepath.Dir(filePath), os.ModePerm); err != nil {
		return err
	}
	// write out the csv contents to the file
	if err = ioutil.WriteFile(filePath, []byte(csvContent), os.FileMode(0644)); err != nil {
		return err
	}
	return nil
}
// WriteAsCSV encodes one record as a single csv line, without
// the trailing newline that csv.Writer appends.
func WriteAsCSV(vals []string) (string, error) {
	b := &bytes.Buffer{}
	w := csv.NewWriter(b)
	if err := w.Write(vals); err != nil {
		return "", err
	}
	w.Flush()
	return strings.TrimSuffix(b.String(), "\n"), nil
}
Finally, here is a test case to show its usage:
func TestMapToCSVFile(t *testing.T) {
	// note: this test requires that the file ExistingCSVFile exist on disk
	// with a few rows of csv data
	SomeKey := "some_column"
	ValueForKey := "some_value"
	OutputCSVFile := `.\someFile.csv`
	ExistingCSVFile := `.\someExistingFile.csv`

	// read csv file
	InputCSVSliceMap, err := CSVFileToMap(ExistingCSVFile)
	if err != nil {
		t.Fatalf("CSVFileToMap() failed %v", err)
	}
	// add a new column "some_column" with a value of "some_value"
	// to the third data row
	InputCSVSliceMap[2][SomeKey] = ValueForKey
	err = MapToCSVFile(InputCSVSliceMap, OutputCSVFile, nil)
	if err != nil {
		t.Fatalf("MapToCSVFile() failed writing output file %v", err)
	}

	// VALIDATION: check that the added key is present in the MapToCSVFile output
	OutputCSVSliceMap, err := CSVFileToMap(OutputCSVFile)
	if err != nil {
		t.Fatalf("CSVFileToMap() failed reading output file %v", err)
	}
	// check that the added key round-tripped with its value
	if OutputCSVSliceMap[2][SomeKey] != ValueForKey {
		t.Fatalf("MapToCSVFile() expected row to contain key value: %v", ValueForKey)
	}
}
Solution 2:[2]
It is disappointing that the Go csv package does not contain anything as useful as the csv.DictReader we have in Python (https://docs.python.org/3/library/csv.html#csv.DictReader).
However, I have found the csvutil package (https://github.com/jszwec/csvutil) very useful. It doesn't seem to support unmarshalling records to map[string]string, but you can unmarshal directly to a struct type, which IMO is more useful.
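For comparison, here is a hand-rolled sketch of the struct-decoding pattern that csvutil automates via struct tags, using only the standard library. The `User` type, its fields, and the `decodeUsers` helper are made up for illustration:

```go
package main

import (
	"encoding/csv"
	"fmt"
	"strconv"
	"strings"
)

// User is a made-up record type for illustration.
type User struct {
	Name string
	Age  int
}

// decodeUsers hand-rolls the header-name-to-struct-field mapping
// (and string-to-int conversion) that csvutil derives from tags.
func decodeUsers(in string) ([]User, error) {
	rows, err := csv.NewReader(strings.NewReader(in)).ReadAll()
	if err != nil {
		return nil, err
	}
	// map column names to their indexes so column order doesn't matter
	idx := map[string]int{}
	for i, h := range rows[0] {
		idx[h] = i
	}
	var users []User
	for _, row := range rows[1:] {
		age, err := strconv.Atoi(row[idx["age"]])
		if err != nil {
			return nil, err
		}
		users = append(users, User{Name: row[idx["name"]], Age: age})
	}
	return users, nil
}

func main() {
	users, err := decodeUsers("name,age\nalice,30\nbob,25\n")
	if err != nil {
		panic(err)
	}
	fmt.Printf("%+v\n", users) // [{Name:alice Age:30} {Name:bob Age:25}]
}
```

The typed result (string vs int fields, fixed field names) is what makes the struct approach more useful than map[string]string for most real data.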
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
| Solution | Source |
|---|---|
| Solution 1 | |
| Solution 2 | Ryan Collingham |
