September 16th, 2013, 08:05 PM
-
A Ruby script to convert all CSV files in a folder to JSON
I am trying to build a script to convert all .csv files in a folder to .json. I have a script where I use FasterCSV to convert an individual CSV to JSON, like this:
Code:
#!/usr/local/bin/ruby
require "rubygems"
require "fastercsv"
require "json"
jsonFile = File.open("some.json", 'w')
jsonData = {}
FasterCSV.foreach("some.csv",
                  :headers => true, :header_converters => :symbol) do |row|
  # puts "Row is #{row}"
  # row[1] is assumed to hold the key column (e.g. "NAME1");
  # adjust the index to match your CSV layout
  jsonData[row[1]] = {
    :name   => row[2],
    :length => row[3],
    :lat    => row[4],
    :long   => row[5],
  }
  # puts "data is #{jsonData[jsonData.length-1]}"
end
jsonFile.write(JSON.pretty_generate(jsonData))
jsonFile.close
But how can I change this to find all CSV files in a folder and generate a corresponding JSON file for each?
2. One other thing I am stuck on is how to generate nested JSON. Currently my JSON looks like this:
Code:
{
  "NAME1": {
    "LAT": "37.847048",
    "Name": "SDFSDG",
    "length": "0.03",
    "LON": "-123.3433"
  }
}{
  "NAME1": {
    "LAT": "37.847048",
    "Name": "SDFSDFSDAFDG",
    "length": "0.03",
    "LON": "-123.32334433"
  }
}
One issue with the above is that NAME1 is repeated for every row. I want to lift it to the top level and nest the other fields ("lat", "lon", "length") under it.
I would really appreciate the help.
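For the nested shape, one approach is to group rows under their shared key so each name appears only once, with the remaining fields nested beneath it. Below is a minimal sketch using Ruby's standard CSV library (FasterCSV was merged into Ruby 1.9 as the stdlib `csv`); the column names (`id`, `name`, `length`, `lat`, `long`) and the sample data are assumptions based on the output shown above:

```ruby
require "csv"
require "json"

# Hypothetical sample data mirroring the output above
csv_text = <<~DATA
  id,name,length,lat,long
  NAME1,SDFSDG,0.03,37.847048,-123.3433
  NAME1,SDFSDFSDAFDG,0.03,37.847048,-123.32334433
DATA

# Group rows by the key column so "NAME1" appears only once,
# holding an array of per-row hashes
grouped = {}
CSV.parse(csv_text, headers: true, header_converters: :symbol).each do |row|
  grouped[row[:id]] ||= []
  grouped[row[:id]] << { name: row[:name], length: row[:length],
                         lat: row[:lat], long: row[:long] }
end

puts JSON.pretty_generate(grouped)
```

This prints a single object keyed by "NAME1" whose value is an array of two entries, one per CSV row, instead of two concatenated objects with a duplicated key.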
November 16th, 2013, 01:37 PM
-
Hi,
I think the following code should work for you. I have modified it a little to read the JSON field names from the CSV headers.
Code:
#!/usr/local/bin/ruby
require "rubygems"
require "fastercsv"
require "json"
require "pathname"
# Convert every CSV file under the current directory (recursively)
Dir["./**/*.csv"].each do |csv_file_path|
  puts csv_file_path
  file_name = Pathname.new(csv_file_path).basename(".csv").to_s
  File.open("#{file_name}.json", 'w') do |json_file|
    # Read the whole CSV, taking the JSON field names from the header row
    jsonData = FasterCSV.read(csv_file_path,
                              :headers => true,
                              :header_converters => :symbol).map { |csv_row| csv_row.to_hash }
    json_file.write(JSON.pretty_generate(jsonData))
  end
end
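For reference, the `read(...).map { |csv_row| csv_row.to_hash }` step turns each CSV into an array with one hash per row, so each generated JSON file holds a top-level array. A quick sketch of that shape with the stdlib CSV library (same API as FasterCSV from Ruby 1.9 on); the sample data is hypothetical:

```ruby
require "csv"
require "json"

csv_text = <<~DATA
  name,length,lat,long
  SDFSDG,0.03,37.847048,-123.3433
DATA

# Equivalent of the FasterCSV.read(...).map { |csv_row| csv_row.to_hash } step:
# each row becomes a hash keyed by the (symbolized) header names
rows = CSV.parse(csv_text, headers: true, header_converters: :symbol)
          .map { |csv_row| csv_row.to_h }

puts JSON.pretty_generate(rows)
```

Note that this writes each output file into the directory the script runs from, so two CSVs with the same basename in different subfolders would overwrite each other's JSON.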