I have binary datasets of 0s and 1s; the data consist of results from classification tasks, coded '1' for classified and '0' for not classified. How do I calculate the SD or SE for such datasets?
I have got the formula for SD:
The standard deviation of the 1s and 0s is the square root of the mean of the squared deviations of the 1s and 0s from the mean of the 1s and 0s. Thus, where x is 1 or 0 and M is the mean of x,
the standard deviation of X = SQRT(SUM((x - M)^2) / N)
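The formula above can be checked with a short Python sketch (illustrative only; the function names are my own). Note that for 0/1 data the mean M is just the proportion p of 1s, so the formula simplifies to SD = sqrt(p * (1 - p)):

```python
import math

def binary_sd(data):
    """Population SD of a list of 0s and 1s, per the formula above."""
    n = len(data)
    m = sum(data) / n                        # mean M = proportion of 1s
    return math.sqrt(sum((x - m) ** 2 for x in data) / n)

def binary_se(data):
    """Standard error of the mean: SD / sqrt(N)."""
    return binary_sd(data) / math.sqrt(len(data))

data = [1, 0, 1, 1, 0, 1, 0, 1]              # example binary results
p = sum(data) / len(data)
# For Bernoulli data, SD should equal sqrt(p * (1 - p))
assert math.isclose(binary_sd(data), math.sqrt(p * (1 - p)))
print(binary_sd(data), binary_se(data))
```

In Excel, the equivalent population SD is `=STDEV.P(range)`, which should match this formula exactly on a 0/1 range.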
Could you please suggest the Excel formula to calculate the SD as above (I have tried it, as attached, but there is an error), or suggest another way to compute the SD or SE for binary datasets?
Thanks!