Theory terminology
This is a theory forum. Although we accommodate posters with a wide variety of backgrounds, some of the discussions can become quite technical. These discussions often draw on concepts, facts, and vocabulary from the field of mathematics known as game theory.
Unfortunately, many words commonly used in game theory have very specific, exact meanings which may or may not line up with their usage in everyday speech. So that we can communicate effectively, it is important to understand the technical meanings of a few terms. Far too many threads have degenerated into arguments between posters who mean different things when they say "optimal".
With that in mind, here's a quick intro to some common terminology, some of which comes from game theory.
range
A range is basically just a group of hands, possibly with some frequency information attached. Ranges are particularly useful for specifying strategies. For example, you might want to say that you take a particular action with all of some particular hands, half of others, etc.
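As a sketch, here's one way a range with frequencies might be represented in code. The hands and frequencies here are made up, just for illustration:

```python
# A hypothetical raising range: each hand maps to the fraction of the
# time we raise with it (1.0 = always, 0.5 = half the time, etc.).
raise_range = {
    "AA": 1.0,    # always raise
    "AKs": 1.0,
    "76s": 0.5,   # raise half the time, do something else otherwise
    "22": 0.25,
}

# Total weight of the range (ignoring combos and card removal): the
# frequency-weighted "size" of the group of hands.
total_weight = sum(raise_range.values())
print(total_weight)  # 2.75
```

The frequency information is what lets a single range describe a mixed way of playing a hand, rather than an all-or-nothing one.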
equity
The equity of a hand or a range is the percentage of the pot it can expect to win if all betting stopped and every player checked down to showdown. This percentage is averaged over all the cards that can come and all the ranges involved.
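As a toy illustration, equity is just an average of our pot share over all possible outcomes. The list of outcomes below is invented (pretend each entry is our share of the pot on one possible runout: 1 = win, 0.5 = chop, 0 = lose):

```python
# Hypothetical: 10 possible runouts, and our share of the pot on each.
outcomes = [1, 1, 1, 0.5, 0, 0, 0, 0, 0, 0]

# Equity = average pot share over all the cards that can come.
equity = sum(outcomes) / len(outcomes)
print(equity)  # 0.35, i.e. we expect 35% of the pot on average
```

A real equity calculation averages over every runout and every hand combo in the ranges involved, but the principle is the same.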
game
There are a few different ways to define a game, depending on the situation you want to study. Basically, it's just a sequence of decisions which lead to some payoff depending on what decisions you make and what decisions the other players make. That is, your payoff depends on your decisions as well as other people's decisions.
Sometimes you have a situation, and you get to make some decisions, and the results depend only on your own decisions (and maybe some randomness from Nature). These might be interesting problems, but they're not questions for game theory. It's the fact that your payoff depends on other people's decisions too that puts us in the field of game theory.
For example, rock paper scissors (RPS) is a game with only one decision point, and three payoffs are possible: win, lose, or tie, depending on your choice and your opponent's choice.
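The RPS payoff structure is small enough to write down completely. Here's a sketch (scoring win/lose/tie as +1/-1/0 is just a convention for illustration):

```python
# Rock-paper-scissors payoffs from our point of view:
# +1 = win, 0 = tie, -1 = lose.
BEATS = {"rock": "scissors", "paper": "rock", "scissors": "paper"}

def rps_payoff(ours, theirs):
    if ours == theirs:
        return 0
    return 1 if BEATS[ours] == theirs else -1

print(rps_payoff("rock", "scissors"))  # 1  (rock beats scissors)
print(rps_payoff("rock", "paper"))     # -1 (paper beats rock)
```

Note that the payoff takes both players' choices as inputs, which is exactly what makes this a game in the game-theory sense.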
strategy
A strategy specifies how you choose your move at every decision point in a game, that is, at every situation you could possibly be faced with in the game.
pure strategy
A pure strategy is a strategy that specifies exactly how you will play in every situation you could face. In RPS, there are three pure strategies: throw rock, throw paper, or throw scissors.
mixed strategy
A mixed strategy is one which associates some probability with each of the possible pure strategies. For example, in RPS, you could play a mixed strategy by using "throw rock" 60% of the time and using "throw paper" the other 40%. It is a good idea to play a mixed strategy in RPS. It's not as clear that this is necessary to be successful in poker.
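As a sketch, here's how the 60/40 rock/paper mix from the example might be sampled in code (purely illustrative):

```python
import random

# A mixed strategy for RPS: a probability for each pure strategy.
# The 60/40 rock/paper mix from the text (scissors never thrown).
mixed = {"rock": 0.6, "paper": 0.4, "scissors": 0.0}

def sample_throw(strategy, rng=random):
    # Pick a pure strategy according to its probability.
    return rng.choices(list(strategy), weights=list(strategy.values()))[0]

print(sample_throw(mixed))  # "rock" about 60% of the time, else "paper"
```

A pure strategy is just the special case where one entry has probability 1 and the rest have probability 0.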
dominated strategy
A strategy is (strictly) dominated if there is another strategy possible which is more profitable regardless of your opponents' strategies. This actually does not come up too often in poker, since almost anything can be good if your particular opponent happens to play particularly poorly against it.
One example from hold'em is as follows: any strategy that involves folding the nuts on the river is dominated (at least in a cash game context, ignoring rake). In particular, it is dominated by a strategy which is the same except that the nuts are played somehow other than by folding. This second strategy will be more profitable regardless of how your opponents play.
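A dominance check is mechanical once you tabulate payoffs: one strategy strictly dominates another if it does better against every possible opponent strategy. The numbers below are made up just to mirror the fold-the-nuts example:

```python
# Toy payoff table. For each of our candidate strategies, the payoff
# against each of the opponent's possible strategies (invented numbers).
payoffs = {
    "fold_nuts": [-2.0, -1.0, -3.0],  # folds the nuts on the river
    "call_nuts": [ 5.0,  4.0,  2.0],  # identical except never folds the nuts
}

def strictly_dominates(a, b, payoffs):
    # a strictly dominates b if a does better vs EVERY opponent strategy.
    return all(x > y for x, y in zip(payoffs[a], payoffs[b]))

print(strictly_dominates("call_nuts", "fold_nuts", payoffs))  # True
```

The key point is the universal quantifier: dominance must hold regardless of what the opponent does, which is why genuinely dominated strategies are rare in poker.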
+EV
+EV is short for "positive expected value" or "higher expected value". This is vague for a couple of reasons. First, expected value of what? Second, higher than what?
Expected value just means the average value of some random quantity. In poker, the quantity we're interested in (and which we're interested in maximizing) is the size of our chip stack. However, when doing calculations, we don't always calculate the average or expected size of our chip stack. Sometimes we calculate the expected change in our stack size over the course of the hand, starting from the beginning, and try to make choices that maximize that. Or, sometimes we calculate the expected change in stack size starting from some specific point in the hand.
Basically, it doesn't matter which way you do it, as long as you are consistent in your choice and clear about what you are doing when you write a post.
Second, higher than what? Sometimes people say "higher than 0", but that doesn't necessarily mean anything, depending on which expectation we're considering. Other times, people are implying "higher than the EV of folding, computed with the same convention", but that may or may not be relevant in any particular situation (say, if we are really deciding between calling and raising).
From a strategic point of view, it's important to choose the option with an EV higher than all your other possible choices. So if you are arguing that something is "+EV", please say what it is that your favored move has a higher EV than, and if you write down EV equations, make it clear what you're finding the expectation of.
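Here's a small worked example using the "expected change in stack from the decision point" convention. All the numbers are assumed for illustration:

```python
# Hypothetical river spot: pot is 100, we face a bet of 50, and we
# assume our hand wins at showdown 30% of the time if we call.
pot, bet, win_prob = 100, 50, 0.30

# EV convention: expected CHANGE in our stack from this decision point.
ev_fold = 0.0  # folding changes nothing from here
ev_call = win_prob * (pot + bet) - (1 - win_prob) * bet

print(ev_call)  # roughly 10: 0.3*150 - 0.7*50
```

With this convention, "calling is +EV" here specifically means ev_call > ev_fold; the break-even point is win_prob = bet / (pot + 2*bet) = 25%. Whichever convention you use, the comparison between options comes out the same, as long as you're consistent.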
best response strategy or maximally-exploitative strategy or nemesis strategy
Suppose all your opponents' strategies are fixed and you know them. Then, you can compute the most profitable possible way to play against them. This best strategy is known as a best response or a maximally-exploitative strategy. An (imaginary) player who automatically knows your strategy and always plays maximally exploitatively in response is sometimes referred to as the nemesis.
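For a concrete sketch, here's a best-response computation in RPS against a known, fixed strategy (the 60/40 rock/paper player from earlier):

```python
# Best response to a fixed, known RPS strategy.
BEATS = {"rock": "scissors", "paper": "rock", "scissors": "paper"}

def payoff(ours, theirs):
    if ours == theirs:
        return 0
    return 1 if BEATS[ours] == theirs else -1

# Villain's fixed strategy: rock 60%, paper 40%, never scissors.
villain = {"rock": 0.6, "paper": 0.4, "scissors": 0.0}

def expected_payoff(ours, their_strategy):
    return sum(p * payoff(ours, t) for t, p in their_strategy.items())

# The best response is the throw with the highest EV against villain.
best = max(BEATS, key=lambda ours: expected_payoff(ours, villain))
print(best)  # "paper" (EV +0.6, since it beats the most-frequent throw)
```

Note that this best response (always paper) is itself a pure, highly exploitable strategy; it's only "best" because villain's strategy is fixed and known.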
nash equilibrium or (game theory) optimal strategies or unexploitable strategies
A Nash equilibrium is a set of strategies (one for each player in the game) with a couple of properties. These properties are equivalent; they're just different ways of looking at the same thing:
- No player in the game can unilaterally change his strategy to improve his expectation.
- Each player's strategy maximally exploits those of his opponent(s), all at the same time.
Notice that nothing about these definitions implies that the players will break even on average. However, it turns out that in poker, if all players are playing their equilibrium strategies, they will break even in the long-term average sense when we average over all positions in the game.
The existence of a set of strategies like this has some special consequences (at least in heads-up play!). Whenever players are not playing their equilibrium strategies, and one player is making more money than he would at equilibrium, and thus the other is making less, the guy who is making less has an incentive to switch to his equilibrium strategy. However, when both players are playing the equilibrium, neither has any incentive to change. Thus, if both players are rational and smart enough to compute the equilibrium strategies, then those are the strategies they will end up playing. It's only then that neither has any incentive to deviate.
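The "no incentive to deviate" property can be checked directly in a small game. For RPS, the equilibrium is to throw each option 1/3 of the time, and the sketch below verifies that no unilateral change of throw improves on it:

```python
from fractions import Fraction

BEATS = {"rock": "scissors", "paper": "rock", "scissors": "paper"}

def payoff(ours, theirs):
    if ours == theirs:
        return 0
    return 1 if BEATS[ours] == theirs else -1

# The RPS equilibrium: each throw with probability exactly 1/3.
equilibrium = {t: Fraction(1, 3) for t in BEATS}

def expected_payoff(ours, their_strategy):
    return sum(p * payoff(ours, t) for t, p in their_strategy.items())

# Every pure strategy earns exactly 0 against the equilibrium mix, so
# no deviation (pure or mixed) can beat the equilibrium's EV of 0.
print([expected_payoff(t, equilibrium) for t in BEATS])  # each entry is 0
```

This also illustrates the break-even point above: at equilibrium in a symmetric game like RPS, everyone's EV is exactly zero.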
Now it is not immediately obvious that such a set of strategies exists, but John Nash proved it for a class of games that includes poker. Of course, knowing that it exists is different from knowing exactly what it is, and in fact the equilibrium strategies are unknown for all "real" poker games. But they definitely exist, and equilibria for greatly-simplified versions of some games are known. For example, if the SB is restricted to playing shove-or-fold preflop in heads-up no limit hold 'em, the game becomes much simpler, and we can find the well-known shove/fold equilibrium, which can be useful for short-stacked play.
Any non-equilibrium strategy may also be referred to as exploitable. Notice that if you were playing unexploitably, but then changed your strategy to take advantage of some mistakes of your opponent, then you yourself are now playing exploitably, but that's OK if your opponent isn't taking advantage of it.
Nash equilibrium strategies are also known as optimal or game theory optimal or GTO or unexploitable. The fact that "optimal" does not simply mean "maximally exploitative" really seems to trip people up and is thus unfortunate, but that's the language the mathematicians chose, so we're stuck with it.
The usage of "optimal" or "game theory optimal" to refer to the Nash equilibrium appears to be somewhat unique to poker. Its genesis may be the book "The Mathematics of Poker", which uses the term this way. This usage does not appear to be common among game theorists and definitely causes confusion at times.