I remember learning the rule for rounding significant figures when the digit to be dropped is a 5 (round to even, not odd), but my teachers never explained why this was done. While trying to verify whether odds or evens were preferred, I learned the real reason this seemingly weird rounding rule exists.
Fantasies about fives
Students are sometimes told to increment the least significant digit by 1 if it is odd, and to leave it unchanged if it is even. One wonders if this reflects some idea that even numbers are somehow “better” than odd ones! (The ancient superstition is just the opposite, that only the odd numbers are “lucky”.)
In fact, you could do it just as well the other way around, incrementing only the even digits. If you are rounding a single number, it doesn’t really matter which convention you use. But when you are rounding a series of numbers that will be used in a calculation, treating every first non-significant 5 the same way (always rounding up, say) would systematically over- or underestimate the rounded values, accumulating round-off error. Since even and odd digits occur equally often, incrementing only one kind keeps this error from building up.
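Here is a quick sketch of that accumulation in Python, using the decimal module so the two tie-breaking modes can be compared side by side (Python’s built-in round() already rounds ties to even, so it can’t show the contrast on its own):

```python
# A minimal sketch of why round-half-to-even avoids accumulated bias.
from decimal import Decimal, ROUND_HALF_UP, ROUND_HALF_EVEN

# Ten values whose first non-significant digit is exactly 5: 0.5, 1.5, ..., 9.5
values = [Decimal(f"{n}.5") for n in range(10)]

half_up = [v.quantize(Decimal("1"), rounding=ROUND_HALF_UP) for v in values]
half_even = [v.quantize(Decimal("1"), rounding=ROUND_HALF_EVEN) for v in values]

true_sum = sum(values)                # 50.0
print(sum(half_up) - true_sum)        # 5.0  -> always rounding up inflates the sum
print(sum(half_even) - true_sum)      # 0.0  -> ties-to-even cancels out exactly
```

Every one of the ten ties gets bumped upward under always-round-up, so the error grows with the length of the series; under ties-to-even, half the bumps go up and half go down.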
You could do just as well, of course, by flipping a coin!
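For fun, here is what the coin-flip version might look like. The helper name round_half_random is made up for this sketch (it is not a standard-library function), and it rounds to whole numbers only for simplicity:

```python
# A sketch of the coin-flip alternative: on an exact .5 tie, round up or
# down at random. The bias is zero only on average, not on every run.
import random
from decimal import Decimal, ROUND_FLOOR, ROUND_HALF_EVEN

def round_half_random(value: Decimal) -> Decimal:
    """Round to the nearest integer, breaking exact .5 ties by coin flip."""
    floor = value.quantize(Decimal("1"), rounding=ROUND_FLOOR)
    if value - floor == Decimal("0.5"):          # exact tie: flip a coin
        return floor + random.choice([Decimal(0), Decimal(1)])
    return value.quantize(Decimal("1"), rounding=ROUND_HALF_EVEN)  # no tie

print([round_half_random(Decimal(f"{n}.5")) for n in range(4)])
# e.g. [Decimal('1'), Decimal('2'), Decimal('2'), Decimal('4')] -- varies run to run
```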
Over 10 years later the rule finally makes sense to me, but I might try the coin-flipping approach instead the next time this comes up.