Parsing a CSV string

I used the gmail CSV file parsing code as a base and am stuck at one point.

Using an HTTP GET, I am pulling the file information, which contains the CSV string in the http_response.

OK, great. So, using the gmail trigger code as a base, I am doing this…

Now, if I output the CSV string (and comment out the else if), I get the http_response, so I know it is reading it. I know I want to read through that and parse it out. In your example you were reading root/msg/attachment, so you listed those in the else if; that part is clear.

So that is where I am stuck: if I make the main doc the http_response, then I have no node to select in the else if.

Any thoughts on how to direct it down the right path here?

I would just split it on the comma, but I need to accommodate the possibility of commas within double quotes.

The whole point of using the CSV library is that you don’t have to parse the CSV or worry about commas.
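
To see why a plain split on the comma goes wrong, and what handling quoted commas actually involves, here is a minimal quote-aware splitter. This is for illustration only, not the library's code, and it skips edge cases the library handles for you, like escaped quotes ("") and newlines inside fields:

import java.util.ArrayList;
import java.util.List;

public class QuoteAwareSplit {
    // Minimal quote-aware CSV line splitter, for illustration only.
    static List<String> splitCsvLine(String line) {
        List<String> fields = new ArrayList<>();
        StringBuilder field = new StringBuilder();
        boolean inQuotes = false;
        for (char c : line.toCharArray()) {
            if (c == '"') {
                inQuotes = !inQuotes;             // toggle quoted state
            } else if (c == ',' && !inQuotes) {
                fields.add(field.toString());     // comma outside quotes ends the field
                field.setLength(0);
            } else {
                field.append(c);                  // commas inside quotes land here
            }
        }
        fields.add(field.toString());
        return fields;
    }

    public static void main(String[] args) {
        String line = "\"Smith, John\",42,NY";
        System.out.println(line.split(",").length);      // 4: naive split breaks the quoted field
        System.out.println(splitCsvLine(line).size());   // 3: quote-aware split keeps it intact
    }
}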

Line 25 gets an array of values for each line in the CSV data. The first line is usually the column names, so the if() block on line 26 stores the first line of values into an array of column names. The column names are later used on lines 47 and 49 to generate XML node names for each data row.

The else if() on line 35 is a sanity check to ensure the number of values matches what is expected.

Line 37 is your problem: by reusing the gmail code, it expects a certain format of input XML in DATASTREAM1. Also, lines 39 through 43 are not needed, because the original gmail code was processing multiple messages.
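
Schematically, the loop described above has this shape (a sketch only; csvReader, readRecord(), getValues(), and columnNames are hypothetical stand-ins for the names the gmail code actually uses):

String[] columnNames = null;                            // hypothetical names throughout
while (csvReader.readRecord()) {                        // one iteration per CSV line
    String[] values = csvReader.getValues();            // line 25: values for this line
    if (columnNames == null) {
        columnNames = values;                           // line 26: first line holds the column names
    } else if (values.length == columnNames.length) {   // line 35: sanity check on value count
        // line 37 onward: build one XML row node per data line,
        // naming each child node after columnNames[i] (lines 47 and 49)
    }
}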

The easiest solution is to add a new output XML document at line 11:

VTDDocument doc_out = new VTDDocument("<root/>");

Then change line 37 to be:

VTDElement elNewOutputRow = doc_out.getRootElement().addElement("row");

Change line 49 to use elNewOutputRow instead of elNewMsg.

Change line 55 to use doc_out instead of doc.

So instead of trying to embed the output into the input DATASTREAM1 XML, this creates a new standalone XML document containing just the converted CSV data.
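
Putting the three changes together, the row-building section would end up looking roughly like this. A sketch, not the exact code: only the VTDDocument constructor, getRootElement(), and addElement() appear above, so setText() and the loop variables are assumptions:

VTDDocument doc_out = new VTDDocument("<root/>");   // added at line 11
// ...
// line 37: a fresh <row> under the new standalone output document
VTDElement elNewOutputRow = doc_out.getRootElement().addElement("row");
for (int i = 0; i < values.length; i++) {
    // line 49: node name from the header row, text from the current data row (setText() assumed)
    elNewOutputRow.addElement(columnNames[i]).setText(values[i]);
}
// ...
// line 55: output doc_out instead of doc

For a CSV with a header of Name,Age, the result would have the shape <root><row><Name>…</Name><Age>…</Age></row>…</root>, with one <row> per data line.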

Gotcha. I should have read deeper.

Thanks. Will give that a try.