January 14th, 2013, 07:54 PM
Keeping count of the same action by multiple programs
I have an application that sends images through a single network interface (a wireless modem). The images are of several types, sent asynchronously by different programs.
Some images will be sent by Crontab jobs and others will be sent as a result of external triggers.
The type of file is identified in the filename prefix.
I want to keep count of how many files of each type are sent. Basically I need something like a simple print spooler: instead of printing documents, it counts the files it sends via the modem. It would also be helpful to record the size of each file. What is the best way to do this?
I thought of having a file for each type that holds a single counter. For each file sent I could then:
Open the counter_type_x file
Read the $counter_value
Increment the $counter_value
Write the $counter_value back to the file
The problem is that if a second program tried to open the file while it was being updated by another program, it would fail.
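The steps above can be made safe with an exclusive advisory lock, so a second writer waits instead of failing. A minimal sketch in Python (Perl's flock builtin works the same way); the filename counter_type_x is taken from the post, and this uses Unix-only fcntl locking:

```python
import fcntl
import os

def increment_counter(path):
    """Atomically read, increment, and write back a counter file.
    flock with LOCK_EX blocks until the exclusive lock is granted,
    so concurrent writers serialize instead of failing."""
    # Open for read/write, creating the file if it does not exist yet.
    fd = os.open(path, os.O_RDWR | os.O_CREAT)
    with os.fdopen(fd, "r+") as f:
        fcntl.flock(f, fcntl.LOCK_EX)   # other processes wait here
        raw = f.read().strip()
        value = int(raw) if raw else 0  # empty file counts as zero
        f.seek(0)
        f.truncate()
        f.write(str(value + 1))
        # the lock is released automatically when the file is closed
    return value + 1

# increment_counter("counter_type_x")  # one file per image type
```

Because the lock is advisory, every program touching the counter must use the same locking call; a writer that skips flock can still corrupt the count.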
January 16th, 2013, 05:16 AM
First, the issue you mentioned: the file being locked when you try to open it while another process has it open. A file lock can be detected, so each user of the file can check whether it is locked, wait a bit, and try again.
This is a resource-synchronization problem, and there are techniques for handling it properly. It takes care to do it right.
Your print spooler model could work well in this case. A print spooler is a serialization application: it allows many programs to submit information through one channel, and allows querying information about what was submitted.
I don't suggest you invent your own serialization technique; instead, take code that already works and extend it to suit your needs. You will also need a good understanding of interprocess communication.
Search for these topics.
January 16th, 2013, 09:11 AM
Use a database instead of multiple text files acting as counters and, presto, the problems you mentioned are solved.
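For example, a single SQLite database can hold one row per file type and let the database engine serialize concurrent writers. A sketch in Python (Perl would use DBI with DBD::SQLite); the table layout here is my own invention, not anything from the thread:

```python
import sqlite3

def record_sent(db_path, file_type, size_bytes):
    """Record one sent file: bump the per-type count and add the
    file size to the running byte total. SQLite handles locking,
    so many programs can call this without coordinating."""
    conn = sqlite3.connect(db_path)
    try:
        conn.execute(
            """CREATE TABLE IF NOT EXISTS sent
               (type TEXT PRIMARY KEY, count INTEGER, bytes INTEGER)""")
        # Insert a new row, or update the existing one for this type.
        conn.execute(
            """INSERT INTO sent VALUES (?, 1, ?)
               ON CONFLICT(type) DO UPDATE SET
                 count = count + 1,
                 bytes = bytes + excluded.bytes""",
            (file_type, size_bytes))
        conn.commit()
    finally:
        conn.close()

def totals(db_path):
    """Return {type: (count, bytes)} for every type seen so far."""
    conn = sqlite3.connect(db_path)
    rows = {t: (c, b) for t, c, b in
            conn.execute("SELECT type, count, bytes FROM sent")}
    conn.close()
    return rows
```

The ON CONFLICT upsert needs SQLite 3.24 or newer; on older versions the same effect takes a SELECT followed by an INSERT or UPDATE inside one transaction.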
January 19th, 2013, 05:27 AM
Can a single standard Perl pipe accept input from multiple pipes, or is a pipe required for each individual input?
I only have 4 inputs.
January 19th, 2013, 09:14 AM
It depends. A daemon listening on a given port can accept multiple connections, but a basic script where data is being piped to it can't.
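To illustrate the daemon approach: a minimal sketch in Python of a server that accepts multiple concurrent connections on one port and keeps shared counts behind a lock. The one-line-per-client protocol is purely illustrative, not anything from the thread:

```python
import socket
import threading

def serve_counts(host="127.0.0.1", port=0):
    """Start a tiny counting daemon in background threads.
    Each client sends one line (e.g. a file type); the server
    bumps that type's count and replies with the new total.
    port=0 lets the OS pick a free port."""
    counts = {}
    lock = threading.Lock()
    srv = socket.create_server((host, port))  # already listening

    def handle(conn):
        with conn:
            key = conn.recv(1024).decode().strip()
            with lock:                       # serialize updates
                counts[key] = counts.get(key, 0) + 1
                n = counts[key]
            conn.sendall(str(n).encode())

    def accept_loop():
        while True:
            conn, _addr = srv.accept()
            threading.Thread(target=handle, args=(conn,),
                             daemon=True).start()

    threading.Thread(target=accept_loop, daemon=True).start()
    return srv.getsockname()[1], counts
```

This is the shape a "spooler" daemon takes: many writers connect whenever they like, and one process owns the counters, so no file locking is needed at all.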
Your question is too vague. You need to provide your code and more details on what you are needing to accomplish and how your code is failing to meet your needs.
January 19th, 2013, 09:17 AM
BTW, if you decide to go with the individual files for maintaining the counters, you should use the File::CounterFile module.
January 23rd, 2013, 06:35 PM
I don't have any code yet to show.
I am building a remote webcam system based on Motion.
It has two cameras that save periodic snapshots and motion triggered images. Motion creates a thread for each camera.
The images are saved to a directory for further processing (add logo, filter out and delete dark images). Selected images are then uploaded to a website as shown here on the Mk1 single camera version:
www.kartsportwellington.co.nz/TrackCam.aspx
A lot more images are saved than sent. I need to actively manage traffic flow to stay under a 750MB monthly data cap on the mobile modem connection. I want to increase the rate of images when movement is detected.
Long periods are expected where there is no motion and traffic will be low. But movement, including rain, spider webs across the lens, and people, could trigger high update rates for long stretches, which would blow out my monthly data budget.
I plan to include an algorithm that will shape the traffic flow. I need to monitor the performance and effectiveness of that algorithm. That is the subject of this topic.
File::CounterFile looks like a useful module.
I have written a bash script that produces a number of reports from utilities including 'vnstat'. vnstat will tell me how much traffic has flowed, but it won't tell me what has triggered the flow.
I am now thinking of having two log files (one for each camera thread). Each uploaded image would append a line with its date/time and size to its camera's log. I could then send these log files out with the bash reporting script and import them into Excel for viewing and graphing.
That, I think, would avoid any complex coding. I could set up Crontab jobs to send and delete the log files once a day (no log rotation or archiving).
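The per-camera log needs only a few lines per upload. A sketch in Python, assuming a CSV layout (timestamp, filename, size in bytes) that Excel imports directly; the function name and paths are placeholders of my own:

```python
import csv
import os
from datetime import datetime

def log_upload(log_path, image_path):
    """Append one CSV row for an uploaded image:
    timestamp, base filename, size in bytes.
    One log per camera thread means the writers never contend."""
    row = [
        datetime.now().isoformat(sep=" ", timespec="seconds"),
        os.path.basename(image_path),
        os.path.getsize(image_path),
    ]
    with open(log_path, "a", newline="") as f:
        csv.writer(f).writerow(row)

# log_upload("/var/log/cam1_uploads.csv", "/tmp/cam1_20130123_1830.jpg")
```

Summing the size column per day (in Excel or with a one-line awk) then shows how close each camera is running to the 750MB cap, complementing what vnstat reports.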
Does anyone have a better idea?