March 6th, 2002, 02:54 PM
less complex (as in big-O) date algorithm?
I wrote a script in bash that takes the arguments YEAR, MONTH, DAY, "+" or "-", and NUM_DAYS, adds or subtracts NUM_DAYS to/from the date provided, and returns the date that lies NUM_DAYS before or after the initial date.
My question: is there an algorithm that can determine this in one operation? Right now it cycles through the dates linearly based on NUM_DAYS, so I guess it is O(n). I am looking for one that is less, maybe O(1) or O(n log n).
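For reference, a minimal sketch of what such a linear loop might look like (the function and helper names here are invented, only the "+" direction is shown, and a proleptic Gregorian calendar is assumed):

```shell
#!/usr/bin/env bash
# Naive O(n) day stepping: advance one day at a time, rolling the
# month and year over whenever the current month runs out of days.

is_leap() { (( ($1 % 4 == 0 && $1 % 100 != 0) || $1 % 400 == 0 )); }

days_in_month() {  # $1=year  $2=month
    case $2 in
        1|3|5|7|8|10|12) echo 31 ;;
        4|6|9|11)        echo 30 ;;
        2) if is_leap "$1"; then echo 29; else echo 28; fi ;;
    esac
}

add_days_naive() {  # $1=YEAR  $2=MONTH  $3=DAY  $4=NUM_DAYS
    local y=$1 m=$2 d=$3 i
    for (( i = 0; i < $4; i++ )); do
        d=$(( d + 1 ))
        if (( d > $(days_in_month "$y" "$m") )); then
            d=1
            m=$(( m + 1 ))
            (( m > 12 )) && { m=1; y=$(( y + 1 )); }
        fi
    done
    echo "$y $m $d"
}

add_days_naive 2002 3 6 10    # prints "2002 3 16"
```

The loop body runs once per day added, which is exactly the O(n) behaviour the question describes.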
March 8th, 2002, 02:22 AM
If you first determine whether the number of days will overflow your current month, then see whether the rest will overflow the current year, then check how many whole years fit in, and then (back down again), if the rest is greater than a year, see whether the rest is enough for a month, etc., you can make it run in O(1) operations, at the cost of a somewhat more complex implementation.
You have to check how long each year/month is, which depends on the starting date.
Checks that (in the worst case) have to be done:
one for the current year
one for the n years in between
one for the last year
one for the current month
one for the months to the next year
one for the months in the last year
Actually the complexity rises as O(log(days)) up to the region of 400 years and then stays constant afterwards (if you don't include calendar changes in history).
Hope this helps to keep you busy implementing for some time
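The steps above can be sketched in bash roughly like this (a sketch under the assumption of a proleptic Gregorian calendar; all names are invented, and only the forward direction is shown). The whole-year step is a plain loop, but it runs at most about 400 times once whole 400-year cycles of 146097 days are skipped first, which is the bounded "constant" cost the post describes:

```shell
#!/usr/bin/env bash
# Overflow-based day addition: consume the current month, then the months
# to the end of the year, then whole 400-year cycles and whole years,
# then (back down) whole months, and finally the leftover days.

is_leap() { (( ($1 % 4 == 0 && $1 % 100 != 0) || $1 % 400 == 0 )); }

days_in_month() {  # $1=year  $2=month
    case $2 in
        1|3|5|7|8|10|12) echo 31 ;;
        4|6|9|11)        echo 30 ;;
        2) if is_leap "$1"; then echo 29; else echo 28; fi ;;
    esac
}

days_in_year() { if is_leap "$1"; then echo 366; else echo 365; fi; }

add_days() {  # $1=YEAR  $2=MONTH  $3=DAY  $4=NUM_DAYS
    local y=$1 m=$2 d=$3 n=$4 len cycles
    # 1. Does n overflow the current month? If not, we are done.
    len=$(days_in_month "$y" "$m")
    if (( d + n <= len )); then
        echo "$y $m $(( d + n ))"; return
    fi
    n=$(( n - (len - d + 1) )); d=1
    m=$(( m + 1 )); (( m > 12 )) && { m=1; y=$(( y + 1 )); }
    # 2. Consume the months left in the current year (at most 11 steps).
    while (( m != 1 )); do
        len=$(days_in_month "$y" "$m")
        if (( n < len )); then echo "$y $m $(( 1 + n ))"; return; fi
        n=$(( n - len ))
        m=$(( m + 1 )); (( m > 12 )) && { m=1; y=$(( y + 1 )); }
    done
    # 3. Now at January 1st: skip whole 400-year cycles (146097 days each),
    #    then whole years (at most ~400 steps -- the bounded part).
    cycles=$(( n / 146097 ))
    y=$(( y + 400 * cycles )); n=$(( n - 146097 * cycles ))
    while (( n >= $(days_in_year "$y") )); do
        n=$(( n - $(days_in_year "$y") )); y=$(( y + 1 ))
    done
    # 4. Back down: whole months (at most 11 steps), the rest is days.
    while (( n >= $(days_in_month "$y" "$m") )); do
        n=$(( n - $(days_in_month "$y" "$m") )); m=$(( m + 1 ))
    done
    echo "$y $m $(( 1 + n ))"
}
```

Every loop here has a constant upper bound on its iteration count (11, ~400, 11), so the total work is bounded regardless of NUM_DAYS.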
March 13th, 2002, 08:42 PM
Just to be pedantic
O(n log n) is worse than O(n)
March 20th, 2002, 06:07 AM
do it this way
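The code that originally followed this post has not been preserved. One standard O(1) technique that fits the thread (not necessarily what was posted) is to convert the date to a Julian Day Number with the integer formulas of Fliegel and Van Flandern, add or subtract the day count, and convert back; the function names below are invented:

```shell
#!/usr/bin/env bash
# O(1) date shifting via Julian Day Numbers (Fliegel & Van Flandern, 1968).
# Relies on bash arithmetic truncating toward zero, like C integer division.

# Gregorian calendar date -> Julian Day Number.
ymd_to_jdn() {
    local y=$1 m=$2 d=$3 a jdn
    a=$(( (m - 14) / 12 ))                        # -1 for Jan/Feb, else 0
    jdn=$(( (1461 * (y + 4800 + a)) / 4 ))
    jdn=$(( jdn + (367 * (m - 2 - 12 * a)) / 12 ))
    jdn=$(( jdn - (3 * ((y + 4900 + a) / 100)) / 4 ))
    echo $(( jdn + d - 32075 ))
}

# Julian Day Number -> Gregorian date, printed as "YEAR MONTH DAY".
jdn_to_ymd() {
    local l n i j d m y
    l=$(( $1 + 68569 ))
    n=$(( (4 * l) / 146097 ))
    l=$(( l - (146097 * n + 3) / 4 ))
    i=$(( (4000 * (l + 1)) / 1461001 ))
    l=$(( l - (1461 * i) / 4 + 31 ))
    j=$(( (80 * l) / 2447 ))
    d=$(( l - (2447 * j) / 80 ))
    l=$(( j / 11 ))
    m=$(( j + 2 - 12 * l ))
    y=$(( 100 * (n - 49) + i + l ))
    echo "$y $m $d"
}

# Usage: date_shift YEAR MONTH DAY +|- NUM_DAYS
date_shift() {
    local jdn
    jdn=$(ymd_to_jdn "$1" "$2" "$3")
    jdn=$(( jdn $4 $5 ))       # "+" or "-" is spliced into the arithmetic
    jdn_to_ymd "$jdn"
}

date_shift 2000 1 1 + 60     # prints "2000 3 1" (2000 is a leap year)
```

This handles both "+" and "-" uniformly and does a fixed number of integer operations no matter how large NUM_DAYS is.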
Last edited by martinus; March 20th, 2002 at 06:10 AM.