Hey guys, I've run into a really strange problem and I can't seem to figure out what I'm doing wrong.
I'm trying to compute a standard deviation in my script, but when it gets to some pretty simple division it returns a number that is just plain wrong. I thought it might be a data-type issue, so I tried changing my variables to Singles, Doubles, Integers, and Longs, but nothing gets me what I need.
Here's my code:
The problem line is:
SumSq = (SumSq / ArraySize - 1)
which returns 53.37044 instead of 56.1828. Any ideas why this might be happening?
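For reference, here's a rough sketch of what I'm trying to compute, using the usual sample formula with an n - 1 denominator (Data and StdDevSample are just placeholder names for this example, not my real code):

    ' Minimal sketch of a sample standard deviation:
    ' Sqr(sum of squared deviations / (n - 1))
    Function StdDevSample(Data() As Double) As Double
        Dim i As Long, n As Long
        Dim Mean As Double, SumSq As Double
        n = UBound(Data) - LBound(Data) + 1
        ' First pass: mean of the data
        For i = LBound(Data) To UBound(Data)
            Mean = Mean + Data(i)
        Next i
        Mean = Mean / n
        ' Second pass: sum of squared deviations from the mean
        For i = LBound(Data) To UBound(Data)
            SumSq = SumSq + (Data(i) - Mean) ^ 2
        Next i
        ' Sample variance divides by (n - 1); std dev is its square root
        StdDevSample = Sqr(SumSq / (n - 1))
    End Function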