If we consider optimal attention span, a microcentury is much too long.
Two half microcenturies with a pause in between would be better.
I like this analysis a lot! But there's one more twist to the story: are
we even using the right definition of a year? There are actually several
ways to define a year:
For each of these we can define the microcentury as a hundred
microyears; all of them come out to 52 minutes plus a bit. The sidereal
microcentury is 52 min + 35.81 seconds, while the tropical microcentury
is 52 min + 35.69 seconds. The NIST microcentury is 52 min + 33.6
seconds, and the Julian microcentury is 52 min + 35.76 seconds.
Personally, I'd define a calendar microcentury as a four-millionth of
400 years, to match the Gregorian leap year cycle. That'd give us
52 min + 35.695 seconds.
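For anyone who wants to check the arithmetic, here's a quick sketch (the year lengths in days are the standard values for each definition; the dictionary and variable names are just mine):

```python
# Year lengths in SI seconds for each definition of "year".
YEARS = {
    "sidereal": 365.25636 * 86400,              # ~31,558,150 s
    "tropical": 365.24219 * 86400,              # ~31,556,925 s
    "NIST (365 days)": 365 * 86400,             # 31,536,000 s
    "Julian": 365.25 * 86400,                   # 31,557,600 s
    "calendar (400-yr avg)": 365.2425 * 86400,  # 31,556,952 s
}

for name, year_s in YEARS.items():
    # A microcentury is a hundred microyears: year * 100 / 1,000,000.
    microcentury_s = year_s * 100 / 1_000_000
    minutes, seconds = divmod(microcentury_s, 60)
    print(f"{name}: {int(minutes)} min + {seconds:.3f} s")
```

All five land at 52 minutes and change, which is why only the trailing seconds differ between definitions.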
We had a running joke in engineering school that π seconds is a
nanocentury, off by around 6 nanomonths. We obviously didn't go that far
in the calculations, so I wonder if it still holds.
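It does still hold, at least under one reasonable reading: here's a sketch using the Julian century and a mean Gregorian month (both my choices; a different "month" shifts the answer a little):

```python
import math

JULIAN_CENTURY_S = 36525 * 86400       # 100 Julian years, in seconds
MEAN_MONTH_S = 365.2425 / 12 * 86400   # mean Gregorian month, in seconds

nanocentury_s = JULIAN_CENTURY_S * 1e-9        # ~3.156 s
error_s = nanocentury_s - math.pi              # how far off pi seconds is
error_nanomonths = error_s / (MEAN_MONTH_S * 1e-9)

print(f"nanocentury = {nanocentury_s:.5f} s vs pi = {math.pi:.5f} s")
print(f"pi seconds undershoots by about {error_nanomonths:.2f} nanomonths")
```

The gap comes out between 5 and 6 nanomonths, so "around 6" survives the more careful calculation.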