Originally Posted by moses
Let's define Variance. To me it's a series of my 17s losing to 18s etc. Getting 20 in a negative deck only to see the dealer hit 4 cards to a 21. Get a blackjack only to see the dealer have a ten with an Ace in the hole. I could go on.
Variance is the square of the standard deviation of the data, or in the case of BJ, of your results. You don't win your EV every hand; when in the short run you land above or below your EV for the number of hands played, that scatter is what variance measures. Variance increases linearly as the number of hands played increases. What you are describing is more just a part of the game: 17s are overall losing hands, and in a negative count the dealer isn't as likely to bust and will draw out more often.

In AP BJ the standard deviation per 100 rounds is often 100 times or more your EV per 100 rounds. So if your EV is $100 per hundred rounds, a rough one-standard-deviation range of results per 100 rounds is $100 plus or minus $10,000, i.e. -$9,900 to $10,100. Use sim results for more accurate numbers. The amount you finish away from your $100 EV is the deviation you experienced. It is expressed in dollars, not in terms of which hands or which cards came out.
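To make that arithmetic concrete, here's a minimal Python sketch using the post's round numbers ($100 EV and $10,000 SD per 100 rounds; a sim will give you better figures for your actual game) of how the one-SD result range scales with rounds played:

```python
import math

def one_sd_range(n_rounds, ev_per_100=100.0, sd_per_100=10_000.0):
    """Rough 1-standard-deviation range of results after n_rounds.

    EV grows linearly with rounds; SD grows with the square root of
    rounds (equivalently, variance grows linearly).
    """
    ev = ev_per_100 * (n_rounds / 100)
    sd = sd_per_100 * math.sqrt(n_rounds / 100)
    return ev - sd, ev + sd

# 100 rounds: the -$9,900 to $10,100 range from the example above.
print(one_sd_range(100))      # (-9900.0, 10100.0)
# 10,000 rounds: EV is 100x larger, but SD is only 10x larger.
print(one_sd_range(10_000))   # (-90000.0, 110000.0)
```

Note how at 10,000 rounds the EV ($10,000) is no longer dwarfed by the SD ($100,000) to the same degree as at 100 rounds.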
Now if you try to use variance as a verb, you are just referring to the normal tendency for your results to be well off your EV. This is normal, and acting like there is some magic bullet to make it not happen is ridiculous.

The more you play, the closer to expectation you can expect to be in relative terms, but variance doesn't get smaller relative to EV as you play more hands; standard deviation, the square root of variance, does. SD shrinks relative to EV because SD grows proportional to the square root of the number of hands played, while EV grows linearly with hands played. Since variance is the square of SD, variance and EV both grow linearly as more hands are played.

So basically the likelihood of being close to expectation goes up with more hands played, but variance does not get less relative to EV. In other words, the possible range of results relative to EV doesn't change, but the likelihood of landing at the extreme edge of that range gets smaller. You could win (or lose) every coin toss in a million trials, roughly a 1 in 2^1,000,000 chance for either outcome, but that is far less likely than winning (or losing) both tosses in 2 trials, a 1 in 2^2 chance.
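The scaling argument above can be checked analytically for a hypothetical flat ±1-unit bet (an illustration, not a BJ sim): per-hand EV is 2p-1, per-hand variance is 1-(2p-1)^2, and over independent hands both EV and variance add up linearly while SD only grows with the square root:

```python
import math

def ev_var_sd(n_hands, p_win=0.51):
    """EV, variance, and SD of the total result for n_hands flat +/-1 bets.

    Per-hand EV is 2p-1; per-hand variance of a +/-1 outcome is 1-(2p-1)^2.
    For independent hands, EV and variance both add (grow linearly with
    n_hands), so SD = sqrt(variance) grows only with sqrt(n_hands).
    """
    ev1 = 2 * p_win - 1
    var1 = 1 - ev1 ** 2
    ev, var = n_hands * ev1, n_hands * var1
    return ev, var, math.sqrt(var)

for n in (100, 10_000, 1_000_000):
    ev, var, sd = ev_var_sd(n)
    # var/EV stays constant; SD/EV shrinks like 1/sqrt(n).
    print(f"{n:>9} hands: var/EV = {var/ev:.2f}, SD/EV = {sd/ev:.4f}")
```

The var/EV column is the same at every row, while SD/EV drops by a factor of 10 for every 100x more hands, which is exactly the point about SD (not variance) shrinking relative to EV.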
Using the terms correctly would go a long way toward getting people to agree with you.