On average, one expects to see an Ace about once every 13 cards.

1) Assuming 6-deck Strip rules, if one knows with 100% certainty on a specific 'target' hand that (at least) one of her 2 dealt cards will be an Ace, her player advantage on that hand is about 51%, correct?

But let's now say that, rather than knowing with 100% certainty, she knows with 'only' 38% certainty, on average, that at least one of her 2 dealt cards will be an Ace on roughly 2 specific 'target' rounds per shoe (out of about 35 rounds of heads-up play within one 6-deck shoe, Strip rules). Compare that with the roughly 1-in-13 per-card rate, about 7.69%, at which a non-Ace-sequencer expects to see an Ace.
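To make sure I have the no-information baseline right, here's a quick sketch for a full 6-deck shoe (the only assumption is the 312-card, 24-Ace composition); the per-card number is the 7.69% above, and I've included the 2-card figure since the 'target' event is defined on a 2-card hand:

```python
# Baseline Ace frequencies off the top of a full 6-deck shoe (312 cards, 24 Aces).
cards, aces = 6 * 52, 6 * 4

# Per-card rate: the ~1-in-13 (about 7.69%) figure quoted above.
per_card = aces / cards

# Chance that at least one of the 2 dealt cards is an Ace, with no prediction at all.
p_no_ace = (cards - aces) / cards * (cards - aces - 1) / (cards - 1)
at_least_one = 1 - p_no_ace

print(f"per-card Ace rate:           {per_card:.2%}")      # ~7.69%
print(f"P(>=1 Ace in 2 dealt cards): {at_least_one:.2%}")  # ~14.8%
```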

2) So, IF WONGING IN FOR ONLY THOSE 2 SPECIFIC TARGET ROUNDS PER SHOE, will the player advantage FOR JUST THOSE 2 ROUNDS per shoe be 0.51 × 0.38 = 19.38%?
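Here's how I'm running that arithmetic, shown two ways (the 0.51 and 0.38 come from above, and the -0.55% baseline edge is the one I use in question 4 below; whether the simple product or the weighted blend is the right model is part of what I'm asking):

```python
# Edge on one 'target' round, given a 38%-reliable Ace prediction.
p_hit     = 0.38     # chance the predicted Ace actually lands in the 2-card hand
edge_ace  = 0.51     # edge when an Ace is known to be among the 2 dealt cards
edge_base = -0.0055  # ordinary off-the-top edge (the -0.55% from question 4)

naive   = p_hit * edge_ace                            # 0.38 * 0.51
blended = p_hit * edge_ace + (1 - p_hit) * edge_base  # misses still get the normal edge

print(f"naive product:  {naive:.2%}")    # 19.38%
print(f"weighted blend: {blended:.2%}")  # ~19.04%
```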

3) If so, how does one now calculate the proper bet size with a 19.38% advantage on 2 rounds per shoe? Should one bet table max? If one's bankroll is only $200, what would the proper bet be? Do we use Kelly to calculate this bet size as a function of bankroll, especially because we only have, on average, 2 opportunities per shoe to bet big? As such, variance is very large, correct?
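Here is the rough Kelly math I have in mind, assuming the 19.38% edge from question 2, full Kelly of edge/variance, and a ballpark per-hand blackjack variance of about 1.3 squared units (that last number is just a stand-in, and it ignores that an 'Ace-likely' hand has a different variance than a normal one):

```python
# Rough Kelly sizing for one high-edge blackjack round.
bankroll = 200.0
edge     = 0.1938  # per-unit advantage on the target round (question 2's figure)
variance = 1.3     # ballpark variance of one blackjack hand, in squared bet units

kelly_fraction = edge / variance            # full Kelly: ~14.9% of bankroll
full_kelly_bet = bankroll * kelly_fraction  # ~$29.82 on a $200 bankroll
half_kelly_bet = full_kelly_bet / 2         # common hedge against the huge variance

print(f"full-Kelly bet: ${full_kelly_bet:.2f}")
print(f"half-Kelly bet: ${half_kelly_bet:.2f}")
```

I show half Kelly because, with only about 2 of these opportunities per shoe, results will swing wildly around that 19% figure, which is exactly why I'm asking about variance.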

4) So, for those 2 specific 'target' rounds per shoe, if she thinks she can predict with, on average, 38% certainty when at least 1 of her 2 dealt cards will be an Ace, is her OVERALL player advantage (for ALL rounds, i.e., playing every hand with no Wonging) something like 2 rounds per 6-deck shoe of betting big at about a 19.38% advantage, offset by about 33 rounds of flat betting at about a -0.55% advantage?
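And here is the per-shoe blend I mean, with the bet spread left as an explicit (made-up) parameter, since the overall edge obviously depends on how big the big bet is relative to the flat bet:

```python
# Overall per-shoe expectation: 2 'target' rounds bet big, 33 ordinary rounds flat.
big_bet, flat_bet = 30.0, 10.0   # example spread only -- substitute real numbers

target_rounds, target_edge = 2, 0.1938
other_rounds,  other_edge  = 33, -0.0055

ev_per_shoe   = (target_rounds * big_bet * target_edge
                 + other_rounds * flat_bet * other_edge)
total_wagered = target_rounds * big_bet + other_rounds * flat_bet

print(f"EV per shoe:  ${ev_per_shoe:.2f}")                 # ~$9.81 with this spread
print(f"overall edge: {ev_per_shoe / total_wagered:.2%}")  # ~2.52% of money bet
```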

Thank you in advance!