I'm in the thick of a Mayan calendar conversion but having a mental block with my leap year function. The function is supposed to take a number of days as input and return the year (counting from 2000).
From 2000, all years divisible by 4 are 'leap years', so when passed the value 366 or 365 it should still return the year 2000; however, 364 seems to be the cut-off point.
// function to find year
int year_func(int a)
{
    int y = 2000;
    int day_count = 0;
    int x;
    int year = 0;

    while (day_count <= a)
    {
        year++;
        if (y % 4 == 0)
            x = 366;
        else
            x = 365;

        if (year == x)
        {
            y++;
            year = 0;
        }
        day_count++;
    }
    return y;
}
"From 2000 all years divisible by 4 are 'leap years' and so when passed the value of 366 or 365 it should still return the year 2000"
I think your problem lies in this statement...
"a" is the "number of days elapsed after the start of year 2000", right? But then a=0 is valid and means "no time elapsed; we are right at the start time of the year". Then a=366 means "366 days have *fully* elapsed and we are at the start of day 367 now which means, one leap-year is completely over and it should indeed return 2001! (If you disagree, you would want to return 1999 when passing a=0. And you probably don't want this either, right?)
Think of this: imagine one normal year would have 2 days and a leap year 3 days.
a=0 -> 1st day of 2000
a=1 -> 2nd day of 2000
a=2 -> 3rd day of 2000 (2000 is a leap year, so it has 3 days)
a=3 -> 1st day of 2001
...
As you can see, it returns 2001 if you pass the number of days in a leap year (here: 3).
BTW: The simplest implementation I can think of is:
int year_func(int a)
{
    return (int)(a / 365.25) + 2000;
}
I leave it to you as an exercise to figure out why it works :-P