#1
    Registered User
    Devshed Newbie (0 - 499 posts)
    Join Date: Jan 2013
    Posts: 16
    Rep Power: 0

    Keeping count of the same action by multiple programs


    Hi

    I have an application that sends images through a single network interface (a wireless modem). The images are of several types, sent asynchronously by different programs.

    Some images will be sent by cron jobs and others will be sent as a result of external triggers.

    The type of file is identified in the filename prefix.

    I want to be able to keep count of how many files of each type are sent. Basically I need something like a simple print spooler: instead of printing documents, it counts the files it sends via the modem. It would also be helpful to record the size of each file. What is the best way to do this?

    I thought of having a file for each type that holds a single counter. So I could then:
    Code:
    open my $fh, '+<', 'counter_type_x' or die $!;   # open the counter file
    my $counter_value = <$fh>;                       # read the current count
    $counter_value++;                                # increment it
    seek $fh, 0, 0;                                  # rewind to the start
    print {$fh} $counter_value;                      # write it back
    close $fh;
    If a second program tried to update the file while another program was updating it, one of the updates could fail or be lost.



    Dazz
#2
    Still Learning
    Devshed Newbie (0 - 499 posts)
    Join Date: Dec 2012
    Location: Montreal, Canada
    Posts: 55
    Rep Power: 39
    I will first address the issue you mentioned of one program failing when it opens the file while another process has it open. A file lock can be detected: each program can check whether the file is locked, wait a bit, and try again (a sketch follows below).

    This is a resource synchronization problem, and there are techniques to do this properly.
    It takes care to do it right.
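
    A minimal sketch of that lock-and-retry idea in Perl, using flock (advisory locking); the counter file name is taken from your pseudocode:
    Code:
    use strict;
    use warnings;
    use Fcntl qw(:flock);

    open my $fh, '+<', 'counter_type_x' or die "open: $!";

    # Ask for an exclusive lock without blocking; if another
    # process holds it, wait a bit and try again.
    until (flock $fh, LOCK_EX | LOCK_NB) {
        sleep 1;
    }

    my $counter_value = <$fh> // 0;   # an empty file counts as zero
    $counter_value++;
    seek $fh, 0, 0;
    truncate $fh, 0;
    print {$fh} $counter_value;
    close $fh;                        # closing releases the lock
    A plain blocking flock (without LOCK_NB) also works; the kernel then queues the waiters for you.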

    Your print spooler model could work well in this case. A print spooler is a serialization application: it allows many programs to submit information through one channel, and it allows for querying information about what was submitted.
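
    For illustration, a minimal sketch of the serialization idea using a named pipe (the FIFO path and the tab-separated line format are my assumptions): each sender writes one short line per file, and a single reader keeps the counts. Writes below PIPE_BUF bytes (at least 512 per POSIX) to a FIFO are atomic, so the lines cannot interleave.
    Code:
    use strict;
    use warnings;
    use POSIX qw(mkfifo);

    my $fifo = '/tmp/sendlog.fifo';            # hypothetical path
    -p $fifo or mkfifo($fifo, 0666) or die "mkfifo: $!";

    # Reading blocks until a writer opens the FIFO; the loop ends
    # when every writer has closed, so a real daemon would reopen
    # the FIFO in an outer loop.
    open my $in, '<', $fifo or die "open $fifo: $!";
    my (%count, %bytes);
    while (my $line = <$in>) {                 # e.g. "snapshot\t12345"
        chomp $line;
        my ($type, $size) = split /\t/, $line;
        $count{$type}++;
        $bytes{$type} += $size;
    }

    # In each sending program, one line per file sent:
    # open my $out, '>', '/tmp/sendlog.fifo' or die $!;
    # print {$out} "snapshot\t12345\n";
    # close $out;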

    I don't suggest you invent your own serialization technique; get code that works and extend it to suit your needs. You will also need a good understanding of inter-process communication.

    Search for these topics. Some results I got follow:

    http://stackoverflow.com/questions/10335851/python-perl-print-server-writing-print-job-to-file-solved
    http://perl.active-venture.com/pod/perlipc-openipc.html
#3
    Contributing User
    Devshed Intermediate (1500 - 1999 posts)
    Join Date: Apr 2009
    Posts: 1,940
    Rep Power: 1225
    Use a database instead of multiple text files acting as counters and, presto, the problems you mentioned are solved.
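
    For example, with DBI and DBD::SQLite (the send_log table name and its columns are just an illustration): every sender inserts one row per file, the database serializes the writers, and the per-type counts and sizes come from a query instead of counter files.
    Code:
    use strict;
    use warnings;
    use DBI;

    my $dbh = DBI->connect('dbi:SQLite:dbname=send_log.db', '', '',
                           { RaiseError => 1, AutoCommit => 1 });

    $dbh->do(q{
        CREATE TABLE IF NOT EXISTS send_log (
            sent_at  TEXT    NOT NULL,
            type     TEXT    NOT NULL,
            filename TEXT    NOT NULL,
            bytes    INTEGER NOT NULL
        )
    });

    # One row per file sent; SQLite's own locking handles concurrency.
    $dbh->do(q{INSERT INTO send_log (sent_at, type, filename, bytes)
               VALUES (datetime('now'), ?, ?, ?)},
             undef, 'snapshot', 'cam1_0001.jpg', 12345);

    # Count and total size per type:
    my $totals = $dbh->selectall_arrayref(
        q{SELECT type, COUNT(*), SUM(bytes) FROM send_log GROUP BY type});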

    Comments on this post

    • admiraln agrees
#4
    Registered User
    Devshed Newbie (0 - 499 posts)
    Join Date: Jan 2013
    Posts: 16
    Rep Power: 0
    Hi

    Can a single standard Perl pipe accept input from multiple pipes or is a pipe required for each individual input?

    I only have 4 inputs.

    Dazz
#5
    Contributing User
    Devshed Intermediate (1500 - 1999 posts)
    Join Date: Apr 2009
    Posts: 1,940
    Rep Power: 1225
    It depends. A daemon listening on a given port can accept multiple connections, but a basic script that simply has data piped into it can't.
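
    A minimal sketch of the daemon approach with IO::Socket::INET (the port and the line format are my own choices): each of your four senders connects and writes one line per file, and the daemon keeps the counts. Connections are handled one at a time here, which is enough when each sender connects only long enough to report a file.
    Code:
    use strict;
    use warnings;
    use IO::Socket::INET;

    my $server = IO::Socket::INET->new(
        LocalAddr => '127.0.0.1',
        LocalPort => 5000,                     # hypothetical port
        Listen    => 5,
        Reuse     => 1,
    ) or die "listen: $!";

    my (%count, %bytes);
    while (my $client = $server->accept) {
        while (my $line = <$client>) {         # e.g. "snapshot\t12345"
            chomp $line;
            my ($type, $size) = split /\t/, $line;
            $count{$type}++;
            $bytes{$type} += $size;
        }
        close $client;
    }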

    Your question is too vague. You need to provide your code and more details on what you are needing to accomplish and how your code is failing to meet your needs.
#6
    Contributing User
    Devshed Intermediate (1500 - 1999 posts)
    Join Date: Apr 2009
    Posts: 1,940
    Rep Power: 1225
    BTW, if you decide to go with the individual files for maintaining the counters, you should use the File::CounterFile module.
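
    Usage is about as simple as it gets, and the module does the file locking for you (the counter path is illustrative):
    Code:
    use strict;
    use warnings;
    use File::CounterFile;

    # One counter file per image type; created at 0 if it doesn't exist.
    my $counter = File::CounterFile->new('/var/run/counter_snapshot', 0);
    my $sent    = $counter->inc;   # locks, increments, unlocks
    print "snapshots sent so far: $sent\n";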
#7
    Registered User
    Devshed Newbie (0 - 499 posts)
    Join Date: Jan 2013
    Posts: 16
    Rep Power: 0
    Hi

    I don't have any code yet to show.

    I am building a remote webcam system based on Motion.
    It has two cameras that save periodic snapshots and motion-triggered images. Motion creates a thread for each camera.

    The images are saved to a directory for further processing (add a logo, filter out and delete dark images). Selected images are then uploaded to a website, as shown here on the Mk1 single-camera version:
    www.kartsportwellington.co.nz/TrackCam.aspx

    A lot more images are saved than sent. I need to actively manage traffic flow to stay under a 750 MB monthly data cap on the mobile modem connection. I want to increase the rate of images when movement is detected.

    Long periods are expected where there is no motion and traffic will be low. Movement, including rain, spider webs across the lens, and people, has the potential to trigger high update rates for long periods of time. That would blow out my monthly data budget.

    I plan to include an algorithm that will shape the traffic flow. I need to monitor the performance and effectiveness of that algorithm. That is the subject of this topic.

    File::CounterFile looks like a useful module.


    I have written a bash script that produces a number of reports from utilities including 'vnstat'. vnstat will tell me how much traffic has flowed, but it won't tell me what has triggered the flow.

    I am now thinking of having two log files (one for each camera thread). Each log file will be appended with the date/time and size of each image file uploaded (a sketch follows below). I could then send these log files out with the bash reporting script and import them into Excel for viewing/graphing.
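
    A short sketch of that logger, assuming one comma-separated line per upload (file names are illustrative). Taking an exclusive flock on the append keeps the two camera threads' lines from interleaving, and the CSV imports straight into Excel:
    Code:
    use strict;
    use warnings;
    use Fcntl qw(:flock);
    use POSIX qw(strftime);

    sub log_upload {
        my ($logfile, $image) = @_;
        my $size = -s $image;                  # file size in bytes
        open my $log, '>>', $logfile or die "open $logfile: $!";
        flock $log, LOCK_EX;
        print {$log} strftime('%Y-%m-%d %H:%M:%S', localtime)
                     . ",$image,$size\n";
        close $log;                            # releases the lock
    }

    log_upload('camera1_uploads.csv', 'cam1_0001.jpg');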

    That, I think, would avoid any complex coding. I could set up cron jobs to send and delete the files once a day (no log rotation or archiving).

    Does anyone have a better idea?


    Dazz
