CS majors know this topic as Numerical Analysis, and it has turned more than one
aspiring computer scientist toward another field of study. I recall my professor
freely borrowing letters from no fewer than four alphabets--Latin, Greek,
Hebrew, and Cyrillic--to fully expound on the topic. Taking notes was a royal
PITA; deciphering them was even worse.
The second time we took the course (two out of three dropped or failed it), the
instructor (a different one) stuck to Latin and Greek letters, but the subject
matter was still a bitch. Ah, memories...
From: Discussion of advanced .NET topics.
[mailto:ADVANCED-DOTNET@DISC...] On Behalf Of John Warner
Sent: Monday, August 25, 2008 12:04 PM
To: ADVANCED-DOTNET@DISC...
Subject: Re: [ADVANCED-DOTNET] Converting doubles into integers without rounding
<wink>Make your head spin is putting it mildly; a couple of those links
will make you consider giving up coding and seeking a new career in digging
ditches.</wink> But you are right, extremely informative.