#1
  1. No Profile Picture
    Registered User
    Devshed Newbie (0 - 499 posts)

    Join Date
    May 2004
    Posts
    3
    Rep Power
    0

    Post need help creating CSV file from data ( from oracle db), using java


hi,

    I'm trying to create a CSV file with Java/JSP from data in an Oracle database. I've been pointed to the Ostermiller utilities. I have a very large number of records.

    I basically get each row, put it in a string array, and use the Ostermiller utility (CSVPrinter) to format the data and write it to a text file. However, with a very large number of records, I get a java.lang.OutOfMemoryError.

    Has anybody else encountered this problem? I want to know if my approach is wrong or if I'm not implementing this correctly. Please let me know if this description isn't enough (I'll post the code segment otherwise). I'm new to all this, so please bear with me.

    Thanks,
  2. #2
  3. Feelin' Groovy
    Devshed Supreme Being (6500+ posts)

    Join Date
    Aug 2001
    Location
    WDSMIA
    Posts
    10,135
    Rep Power
    5054
    My guess is that you're not implementing your solution correctly. I've had no problems with Ostermiller's utilities.

    Please post a short, concise, executable example of what you're trying to do. This does not have to be the actual code you are using. Write a small example that demonstrates your intent, and only that. Wrap the code in a class and give it a main method that runs it - if we can just copy and paste the code into a text file, compile it and run it without any changes, then we can be sure that we haven't made incorrect assumptions about how you are using it.
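A minimal self-contained skeleton along those lines might look like the following. It uses only plain JDK I/O (no Ostermiller jar) so it compiles and runs as-is; the class name, file name, and hard-coded rows are illustrative, not taken from the original code.

```java
import java.io.FileWriter;
import java.io.IOException;
import java.io.PrintWriter;

// Illustrative SSCCE skeleton: writes a few hard-coded rows as CSV,
// quoting each field and doubling embedded quotes.
public class CsvExample {
    static String toCsvRow(String[] fields) {
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < fields.length; i++) {
            if (i > 0) sb.append(',');
            // quote the field and escape any embedded double quotes
            sb.append('"').append(fields[i].replace("\"", "\"\"")).append('"');
        }
        return sb.toString();
    }

    public static void main(String[] args) throws IOException {
        String[][] rows = { {"id", "name"}, {"1", "Alice"}, {"2", "Bo\"b"} };
        PrintWriter out = new PrintWriter(new FileWriter("test.csv"));
        try {
            for (String[] row : rows) {
                out.println(toCsvRow(row));
            }
        } finally {
            out.close();
        }
    }
}
```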

    Comments on this post

    • stdunbar agrees : What he said - err on the side of too much information, not too little
    Yawmark
    class Sig{public static void main(String...args){\u0066or(int
    \u0020$:"v\"ʲ\"vΤ\"".to\u0043h\u0061rArray()
    )System./*goto/*$/%\u0126//^\u002A\u002Fout.print((char)(($>>
    +(~'"'&'#'))+('<'>>('\\'/'.')/\u002Array.const(~1)\*\u002F)));}}
  4. #3
  5. No Profile Picture
    Registered User
    Devshed Newbie (0 - 499 posts)

    Join Date
    May 2004
    Posts
    3
    Rep Power
    0
    Hi Yawmark,

The out-of-memory error only occurs for a large number of rows (~400,000 rows returned from querying the db). For smaller numbers, my implementation produces the CSV file just fine.

    Here's the code snippet for you (I couldn't produce an executable example like you asked...)

    Code:
//array to hold each row of data before writing it to the CSV
String[] strArr = new String[6];

myBean.makeConnection();

//CSV file pathname
String csvPath = application.getRealPath("/CSV") + java.io.File.separator + "test.txt";

File csvFile = new File(csvPath);
FileOutputStream csvFileOut = new FileOutputStream(csvFile);
CSVPrinter csvp = new CSVPrinter(csvFileOut);

// executes the select query
myBean.executeQuery(request, true);

do {
    int count = 0;

    //query returns a resultset with 6 columns (column names are
    //in the array ColumnNames[] -- starting at index 1)
    for (int i = 1; i < 7; i++) {
        String data = myBean.getParameter(ColumnNames[i]);
        strArr[count] = data;
        count++;
    }

    csvp.writeln(strArr);

} while (myBean.moveNext());

csvp.close();
csvFileOut.close();
I hope the code shows what I'm trying to implement. It throws a java.lang.OutOfMemoryError. I had a little more than 400,000 rows of data. For a smaller number of rows, the CSV file is created at the given location without any errors.

    Thank you for the quick reply earlier, and sorry for replying late.
  6. #4
  7. Feelin' Groovy
    Devshed Supreme Being (6500+ posts)

    Join Date
    Aug 2001
    Location
    WDSMIA
    Posts
    10,135
    Rep Power
    5054
    Potential solutions:

1) Add more memory.
    2) For queries that return >= 400,000 rows, break up the query/results into smaller blocks.

Without knowing what goes on in myBean there's not much else to offer. I don't see anything there that looks particularly problematic. Are you getting the exception during the query?
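One common way to break the results into blocks against Oracle is ROWNUM-based pagination: wrap the base query so each execution fetches only one block, write that block to the CSV, then move on. A sketch of the query construction (the table, columns, and block size here are made up for illustration):

```java
// Sketch of ROWNUM-based pagination for Oracle: each call builds a query
// that returns only one block of rows from the base query.
public class PagedQueryBuilder {
    // builds a query returning rows startRow..endRow (1-based, inclusive)
    static String pageQuery(String baseQuery, int startRow, int endRow) {
        return "SELECT * FROM (SELECT q.*, ROWNUM rn FROM (" + baseQuery
                + ") q WHERE ROWNUM <= " + endRow + ") WHERE rn > " + (startRow - 1);
    }

    public static void main(String[] args) {
        String base = "SELECT col1, col2 FROM my_table ORDER BY col1";
        // fetch ~400,000 rows in blocks of 50,000
        for (int start = 1; start <= 400000; start += 50000) {
            String sql = pageQuery(base, start, start + 50000 - 1);
            System.out.println(sql);
            // execute sql here, append the block to the CSV, then continue
        }
    }
}
```

Each block's result set stays small, so the heap never has to hold all 400,000 rows at once.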
  8. #5
  9. AYBABTU
    Devshed Beginner (1000 - 1499 posts)

    Join Date
    Jul 2004
    Location
    Here or There
    Posts
    1,256
    Rep Power
    376
    Originally Posted by Yawmark
2) For queries that return >= 400,000 rows, break up the query/results into smaller blocks.
I would go for this one. You cannot add memory indefinitely, so eventually, as your results get larger and larger, your system will crash again. We've had problems with big result sets over here, and breaking the results up into chunks worked very well for us.
  10. #6
  11. Contributing User
    Devshed Regular (2000 - 2499 posts)

    Join Date
    May 2004
    Location
    Superior, CO, USA
    Posts
    2,418
    Rep Power
    1713
    Originally Posted by Yawmark
    1) Add more memory.
    You can do this with the -Xmx command line option in the Sun JDK. The default value for the 1.4 and 1.5 JDK is 64m. Try something like -Xmx256m or -Xmx512m. Of course, this depends on you actually having the memory available.
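For a standalone program the flag goes on the launch command; the class and jar names below are illustrative. For a JSP/servlet container the flag would go in the container's startup options instead (e.g. JAVA_OPTS for Tomcat).

```shell
# raise the maximum heap from the default 64m to 512m
java -Xmx512m -cp .:ostermillerutils.jar CsvExport
```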

Also, as Yawmark indicated, it would help to see the code for myBean. It seems like it is reading the entire result set into memory, which is certainly not what you want. You should scroll through the result set rather than maintain the whole thing in memory.
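The streaming idea can be sketched without a database: step through the rows one at a time and write each one out immediately, so memory use stays constant no matter how many rows there are. In this sketch an Iterator stands in for the result set; with real JDBC you'd use a plain forward-only ResultSet (not TYPE_SCROLL_INSENSITIVE) and hint the driver with Statement.setFetchSize(...). All names here are illustrative.

```java
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;

// Streams rows out one at a time instead of holding them all in memory.
// An Iterator stands in for a forward-only JDBC ResultSet in this sketch.
public class StreamingCsvWriter {
    static String toCsv(String[] row) {
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < row.length; i++) {
            if (i > 0) sb.append(',');
            sb.append(row[i]);
        }
        return sb.toString();
    }

    // writes each row as soon as it is read; only one row is in memory at a time
    static int writeAll(Iterator<String[]> rows, StringBuilder out) {
        int written = 0;
        while (rows.hasNext()) {
            out.append(toCsv(rows.next())).append('\n');
            written++;
        }
        return written;
    }

    public static void main(String[] args) {
        List<String[]> rows = new ArrayList<String[]>();
        rows.add(new String[]{"1", "a"});
        rows.add(new String[]{"2", "b"});
        StringBuilder out = new StringBuilder();
        int n = writeAll(rows.iterator(), out);
        System.out.println(n + " rows written");
    }
}
```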
  12. #7
  13. No Profile Picture
    Registered User
    Devshed Newbie (0 - 499 posts)

    Join Date
    May 2004
    Posts
    3
    Rep Power
    0
Hi all,
    I've included a code snippet for the myBean.executeQuery() method.

    Also, could anyone please elaborate on how to:

    1) Add more memory?
    I'm not familiar with the -Xmx command line option (how and when to execute it), so I need to do more research on this. I think it might solve my problem. The query execution and myBean implementation should be all right, I think, but please have a quick look anyhow.

    2) For queries that are >= 400,000 rows, break up the query/results into smaller blocks?

    Code:
//The myBean.executeQuery method basically executes a select query.
//stSQL is the select statement (the query is built in another method,
//makeQuery(), in the same bean); stSQL type: String
public boolean executeQuery(javax.servlet.http.HttpServletRequest request) throws Exception {

    makeQuery(request);

    boolean retVal = false; // returns true/false depending on the success of the query execution

    stmt = myConn.createStatement(ResultSet.TYPE_SCROLL_INSENSITIVE, ResultSet.CONCUR_READ_ONLY);
    myRS = stmt.executeQuery(stSQL);
    myRS.absolute(currentRec); //currentRec points to the beginning of the resultset
    retVal = true;

    return retVal;
}
Please let me know if this implementation is causing the problem for large record sets, and how I can modify it.

    Much thanks to all of you for your help and suggestions.
  14. #8
  15. No Profile Picture
    Registered User
    Devshed Newbie (0 - 499 posts)

    Join Date
    Jun 2013
    Posts
    1
    Rep Power
    0

Program to export database table data to a CSV file


    The following project may help you:


    sourceforge.net/projects/databasetocsv/

    Source:

    github.com/overtomanu/Database7
