My understanding is that the standard error is essentially a measure of how much the means you would get from different samples of a population vary. According to statistical theory, if you have a population and you take a sample from it, you can calculate the sample's standard deviation by comparing each value to the sample mean. Then, when you take that number and simply divide it by the square root of your sample size, voila, you magically know how spread out the mean of every single sample you could ever take from that population is.
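For concreteness, here is a quick simulation sketch of what I understand the claim to be (the population parameters, sample size, and variable names below are just made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical population: normal with mean 50 and standard deviation 10.
population_mean, population_sd = 50, 10
n = 25             # sample size
n_samples = 10_000

# Draw many independent samples and record each sample's mean.
sample_means = [
    rng.normal(population_mean, population_sd, n).mean()
    for _ in range(n_samples)
]

# How spread out the sample means actually are, observed directly.
print("SD of the sample means:    ", np.std(sample_means))

# What the standard error formula predicts from a single sample: s / sqrt(n).
one_sample = rng.normal(population_mean, population_sd, n)
print("SE estimated from 1 sample:", one_sample.std(ddof=1) / np.sqrt(n))

# The theoretical value: sigma / sqrt(n) = 10 / 5 = 2.
print("Theoretical SE:            ", population_sd / np.sqrt(n))
```

If I run something like this, the spread of the simulated sample means should land near the theoretical value of 2, and the single-sample estimate should be in the same ballpark. That is exactly the part I find hard to believe.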
To me, that seems like a HUGE stretch. It is already a bit of a stretch to think that your sample is a decent representation of the actual population mean, and sure, I get that these formulas give estimates rather than exact answers. But I never would have guessed that the standard deviation of one sample, divided by a simple function of the sample size, could tell you how much any sample mean could ever vary, ever.
Am I way off in assuming this? Am I missing something that should help me think more clearly about all of this?