I have developed a stock trading system that has consistently beaten the market. I think I could improve the system if I had the answer to a math question. I am posting it here because the math is very similar to that of blackjack betting strategy.

I apply several formulas to about 100 stocks. The formulas work better for some stocks than for others: trades in some companies have a good average return but high variance, while trades in other companies have lower return and lower variance. Some days my system indicates that I should invest in several stocks, but I have to choose only the best bets among them. Should I choose the high-return, high-variance bets or the low-return, low-variance bets? I have calculated the standard deviation of all the trades for each of the different stocks. Here are the results for some of the stocks:

Stock   Trades   Avg. Gain   Std. Dev.
  A       28       6.23%       7.46%
  B       17       5.29%       3.64%
  C       11       8.00%       5.01%
  D       17       4.79%       4.25%
  E       28       3.57%       5.26%
  F        3       5.52%       1.03%
  G       18       3.87%       4.07%
  H        5       6.32%       2.40%
  I       32       4.31%       9.97%
  J       17       4.51%       5.83%

The table shows the number of trades made for each stock, the average gain per trade, and the standard deviation of those gains. The gains are only a few percent, but the stocks are traded quite frequently, so this can add up to a very good annual return.
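Purely for illustration (the trade frequency here is a made-up assumption, and I am ignoring transaction costs): at 25 trades a year averaging a 4% gain per trade, compounding gives

$$1.04^{25} \approx 2.67,$$

i.e., roughly a 167% annual return.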

One more factor that compounds the problem is that I have made only a limited number of trades in each stock, and that number varies from stock to stock, so the averages and deviations are not all equally reliable (I trust the 3-trade numbers for stock F far less than the 32-trade numbers for stock I). I need a way to combine the average return, the standard deviation, and the number of trades into a single score that ranks the stocks, so that I know which are the better bets when my system indicates several at once. A sketch of the kind of score I have in mind is below.
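To make the question concrete, here is a minimal Python sketch of the kind of ranking I am imagining. The score (a plain t-statistic: the mean gain divided by its standard error) is just one naive guess on my part, not something I claim is statistically sound; the data are the figures from the table above.

```python
import math

# (trades, avg gain, std dev) per stock, from the table above
stats = {
    "A": (28, 0.0623, 0.0746),
    "B": (17, 0.0529, 0.0364),
    "C": (11, 0.0800, 0.0501),
    "D": (17, 0.0479, 0.0425),
    "E": (28, 0.0357, 0.0526),
    "F": (3,  0.0552, 0.0103),
    "G": (18, 0.0387, 0.0407),
    "H": (5,  0.0632, 0.0240),
    "I": (32, 0.0431, 0.0997),
    "J": (17, 0.0451, 0.0583),
}

def naive_score(n, mean, sd):
    # One naive idea: a t-statistic, i.e. the mean gain divided by
    # its standard error. It rewards high return, penalizes high
    # variance, and inflates the standard error when there are few
    # trades -- but I don't know whether this is the right way to
    # rank bets, which is exactly my question.
    return mean / (sd / math.sqrt(n))

# Print the stocks from best to worst under this naive score.
for stock, (n, mean, sd) in sorted(
        stats.items(), key=lambda kv: -naive_score(*kv[1])):
    print(f"{stock}: score = {naive_score(n, mean, sd):.2f}")
```

Running it ranks the stocks from highest to lowest score, but I am not at all sure this score weighs return, variance, and sample size the right way.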

One more problem, which may be related, is how to apply the Kelly criterion to these bets. Any ideas?
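For what it's worth, the only version of Kelly I have seen for continuous outcomes (as opposed to the fixed win/lose payoffs of blackjack) sizes each bet at roughly the mean return divided by the variance:

$$f^{*} \approx \frac{\mu}{\sigma^{2}},$$

where $\mu$ is the average gain per trade and $\sigma$ its standard deviation. Plugging in stock A gives $0.0623 / 0.0746^{2} \approx 11.2$, i.e., more than eleven times my bankroll on a single trade, which makes me suspect I am misapplying the formula here.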