A Markov chain is a sequence of states in which the next state depends only on the current state, not on the history of how you got there. When you add decision-making to it — actions that influence transitions, plus rewards for taking them — you get a Markov Decision Process (MDP).
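A minimal sketch of both ideas, using a made-up two-state weather model (all states, actions, and probabilities here are illustrative, not from any real dataset):

```python
import random

# Markov chain: transition probabilities depend only on the current state.
CHAIN = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def next_state(state, rng):
    """Sample the next state from P(next | current) alone (the Markov property)."""
    states = list(CHAIN[state])
    weights = [CHAIN[state][s] for s in states]
    return rng.choices(states, weights=weights, k=1)[0]

# MDP: transitions now depend on (state, action), and each choice yields a reward.
# Rewards here are arbitrary numbers chosen for the example.
MDP = {
    ("sunny", "go_out"):  ({"sunny": 0.8, "rainy": 0.2}, +1.0),
    ("sunny", "stay_in"): ({"sunny": 0.8, "rainy": 0.2}, +0.2),
    ("rainy", "go_out"):  ({"sunny": 0.4, "rainy": 0.6}, -1.0),
    ("rainy", "stay_in"): ({"sunny": 0.4, "rainy": 0.6}, +0.5),
}

def mdp_step(state, action, rng):
    """One MDP transition: pick an action, sample the next state, collect a reward."""
    dist, reward = MDP[(state, action)]
    states = list(dist)
    nxt = rng.choices(states, weights=[dist[s] for s in states], k=1)[0]
    return nxt, reward
```

The only structural difference between the two dictionaries is the key: the chain is indexed by state alone, while the MDP is indexed by a (state, action) pair — that is exactly where decision-making enters.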