#1
  1. No Profile Picture
    Registered User
    Devshed Newbie (0 - 499 posts)

    Join Date
    Jan 2013
    Posts
    6
    Rep Power
    0

Parallel processing of Perl scripts


    Hi all,
We have a Perl script that can be invoked by many shell scripts. The script accepts a file as input, removes leading and trailing spaces from each field, and writes the result to an output file. In our case, two different shell scripts invoked this Perl script at the same time for different input files, and both processes failed while writing to their respective output files.
    Error message: Error while writing into out file

    Kindly help to resolve this. Our requirement is that the script should not fail even if it is invoked n number of times at the same time.

    I have underlined the line where it fails.
    Script:

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Record the start time and report the files being processed
    my $starttimestamp = time();
    my $timestamp = localtime();
    print "Start text editsinglepipe.pl $timestamp\n";
    print " File to edit $ARGV[0]\n";
    print " File to write $ARGV[1]\n";

    $timestamp = localtime();
    print " open read file $timestamp\n";
    open(my $rfile, '<', $ARGV[0])
        or die("Unable to open read file '$ARGV[0]': $!");

    $timestamp = localtime();
    print " open write file $timestamp\n";
    open(my $wfile, '>', $ARGV[1])
        or die("Unable to open write file '$ARGV[1]': $!");

    $timestamp = localtime();
    print " loop through file $timestamp\n";

    my $filecount = 0;
    # Loop through each record of the file
    while (my $line = <$rfile>) {
        my $newline = '';

        # Split the record into pipe-delimited columns
        my @record = split(/\|/, $line);

        # Clean up each column
        foreach my $column (@record) {
            $column =~ s/^\s+//;           # remove leading whitespace
            $column =~ s/\s+$//;           # remove trailing whitespace
            $column =~ tr/\x80-\xFF//d;    # remove non-ASCII characters
            $newline = "$newline$column|"; # put the line back together
        }

        # Write to file, reporting the OS error if the write fails
        print $wfile "$newline\n"
            or die("Error while writing into out file: $!");

        # Count records
        $filecount++;
    }

    $timestamp = localtime();
    print " close file $timestamp\n";

    # Close files; a failed close can also signal a failed write
    close($rfile);
    close($wfile) or die("Error while closing out file: $!");

    my $endtimestamp = time();
    my $totaltime = $endtimestamp - $starttimestamp;
    $timestamp = localtime();
    print " Edited $filecount Record(s) in $totaltime Seconds\n";
    print "End text editsinglepipe.pl $timestamp\n";
  2. #2
  3. !~ /m$/
    Devshed Specialist (4000 - 4499 posts)

    Join Date
    May 2004
    Location
    Reno, NV
    Posts
    4,261
    Rep Power
    1810
    Sounds like the problem is with file access, not with running multiple Perl instances.

    Although you could explore file locking, depending on your operating system, the easier solution would be to store your records in a database, since databases are already designed to handle multiple concurrent users.
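    For reference, exclusive file locking in Perl is done with flock and the LOCK_EX constant from the core Fcntl module. A minimal sketch, assuming a shared output file (the name shared.out is only an example):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Fcntl qw(:flock);

# Append one record to a shared file while holding an exclusive lock,
# so concurrent writers cannot interleave their output.
# The filename 'shared.out' is only an example.
open(my $fh, '>>', 'shared.out') or die "Cannot open shared.out: $!";
flock($fh, LOCK_EX)              or die "Cannot lock shared.out: $!";
print $fh "record from process $$\n";
close($fh) or die "Error closing shared.out: $!";  # closing releases the lock
```

    Note that flock is advisory: it only protects the file if every process that writes to it takes the same lock first.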
  4. #3
  5. No Profile Picture
    Registered User
    Devshed Newbie (0 - 499 posts)
    [QUOTE=keath]Sounds like the problem is with file access, not running multiple perl instances.[/QUOTE]


    Thanks for the reply, Keath.
    I don't think it's a file locking issue.
    Example: I am running two instances of the same Perl script, each creating a different file. Even though they create different files, both fail at the line underlined above.
  6. #4
  7. No Profile Picture
    Registered User
    Devshed Newbie (0 - 499 posts)

    Parallel::ForkManager


    Can Parallel::ForkManager be used for the above issue?
    How does it work?
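    For reference, Parallel::ForkManager (a CPAN module, not part of core Perl) limits how many forked child processes run at once; it manages parallel invocation rather than concurrent writes to a single file. A minimal sketch, where each child writes its own marker file (the item list and file names are placeholders):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Parallel::ForkManager;   # CPAN module, not part of core Perl

# Run up to 4 child processes at once; each child handles one item.
# Here each child just writes a marker file -- in practice the child
# would do the cleanup work on its own input file.
my @items = (1 .. 5);
my $pm = Parallel::ForkManager->new(4);

foreach my $item (@items) {
    $pm->start and next;                 # parent: fork, then continue the loop
    open(my $fh, '>', "job$item.done")   # child: do its piece of work
        or die "Cannot write job$item.done: $!";
    print $fh "done\n";
    close($fh);
    $pm->finish;                         # child exits here
}
$pm->wait_all_children;                  # parent waits for every child
```

    Each child still opens its own output file, so this addresses parallel scheduling, not the write failure itself.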
