!@#%#$ Math

Oct. 3rd, 2002 12:27 am
bcholmes: (Default)
[personal profile] bcholmes

Okay, I can't figure out the math on this problem. Can anyone help me?

I have a bunch'a numbers. Response times from a web server. I round off these response times into nice numbers. Closest 100 milliseconds or sumpthin. For every 100 millisecond interval, I count the number of occurrences, giving me a pretty graph, like this.
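A minimal sketch of that binning step (the sample times here are made-up, not the actual server data):

```python
from collections import Counter

# hypothetical response times in milliseconds
times_ms = [70, 120, 130, 210, 240, 260, 310, 480, 950]

# round each time to its nearest 100 ms interval and count occurrences per bucket
buckets = Counter(100 * round(t / 100) for t in times_ms)

for bucket in sorted(buckets):
    print(f"{bucket} ms: {buckets[bucket]}")
```

Sorting the buckets and printing the counts gives the raw data behind the bar chart.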

Now I want to do standard deviation stuff. I calculate the standard deviation of the response times. Turns out, in my case, to be about 300 milliseconds. Big spread. Clustered in the [0s-0.5s] range.

Now, standard deviations are ideal for drawing the dreaded bell curves:

And I know that for each sigma, we're covering a larger percentage of the occurrences. In the following picture, for example:

the red area is supposed to cover something like 68% of the data, if the arrow marks out one sigma.
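That 68% figure can be checked directly: the fraction of a normal distribution lying within k standard deviations of the mean is erf(k / √2). A quick sketch:

```python
import math

def within_sigma(k):
    # fraction of a normal distribution within k standard deviations of the mean
    return math.erf(k / math.sqrt(2))

print(within_sigma(1))  # ~0.6827, the "68%" in the picture
print(within_sigma(2))  # ~0.9545
```

So one sigma on each side of the mean covers about 68.27% of the data, two sigmas about 95.45%.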

What I ultimately want to do is to render a graph with the standard deviation curve superimposed over the bar chart:

My question: how the hell do I calculate meaningful Y values for the standard deviation curve? I figure that the mid-point should be 50% of my total number of occurrences. How do I get the other points?
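One common way to get those Y values (a sketch, not gospel): evaluate the normal density at each x and scale it so the curve's total area matches the histogram's total area, i.e. multiply the density by (total occurrences × bin width). Note this means the peak isn't 50% of the total count; it comes out to N·w / (σ√(2π)). The numbers below (σ = 0.3 s, 100 ms bins, 1000 samples) are illustrative, not from the actual data:

```python
import math

def curve_height(x, mu, sigma, n_total, bin_width):
    # normal probability density at x
    pdf = math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))
    # scale so the curve's area matches the histogram's area (n_total * bin_width)
    return n_total * bin_width * pdf

# peak of the curve, assuming mean 0.25 s, sigma 0.3 s, 1000 samples, 0.1 s bins
print(curve_height(0.25, mu=0.25, sigma=0.3, n_total=1000, bin_width=0.1))  # ~133
```

Evaluating curve_height at each bucket's midpoint gives a set of points that should sit sensibly over the bars.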

(Did I mention that I didn't do all that well in statistics? Or Fourier Analysis, but that's a whole 'nuther story).


Powered by Dreamwidth Studios