This paper studied a class of discrete-time dynamical games known as dynamic graphical games. Novel coupled Bellman equations and Hamiltonian functions were developed to solve the graphical game, and the optimal control solutions were given in terms of the solutions to a set of coupled discrete-time Hamilton-Jacobi-Bellman (DTHJB) equations. The stability and Nash-equilibrium properties of these solutions were proved. An online model-free policy iteration algorithm was developed to solve the dynamic graphical game in real time; the algorithm does not require knowledge of any agent's dynamics, and its convergence for the dynamic graphical game was proved. Finally, a gradient descent technique with critic network structures was used to implement the online policy iteration algorithm.
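To make the model-free policy-iteration idea concrete, the following is a minimal sketch reduced to a single agent's scalar neighborhood-error system e_{k+1} = a*e_k + b*u_k with a quadratic critic. The dynamics (a, b), the cost weights, learning rates, and all variable names are illustrative assumptions, not the paper's exact formulation; in the full graphical game each agent would run a coupled version of this loop using its neighbors' states and policies. The learner never reads (a, b); they are used only to generate measured data, as in the model-free setting.

```python
import numpy as np

rng = np.random.default_rng(0)
a, b = 0.9, 0.5              # scalar error dynamics, unknown to the learner
Qc, Rc = 1.0, 1.0            # quadratic stage cost r = Qc*e**2 + Rc*u**2

def phi(e, u):
    # quadratic critic basis: the action-dependent value is approximated as w @ phi
    return np.array([e * e, e * u, u * u])

w = np.array([1.0, 0.0, 1.0])    # critic weights (w[2] > 0 keeps the policy defined)
K = 0.0                          # current feedback policy u = -K*e

for _ in range(20):              # outer policy-iteration loop
    # policy evaluation: gradient descent on the temporal-difference residual
    for _ in range(1500):
        e = rng.uniform(-1.0, 1.0)
        u = -K * e + 0.2 * rng.standard_normal()   # exploration noise
        e_next = a * e + b * u                      # "measured" next error
        r = Qc * e * e + Rc * u * u
        target = r + w @ phi(e_next, -K * e_next)   # bootstrapped Bellman target
        delta = w @ phi(e, u) - target              # temporal-difference error
        w -= 0.1 * delta * phi(e, u)                # critic gradient step
    # policy improvement: minimize the learned quadratic critic over u
    K = w[1] / (2.0 * max(w[2], 1e-6))
```

Because the stage cost is quadratic and the dynamics linear, the true action-dependent value lies exactly in the span of `phi`, so the gradient-descent evaluation step has the correct fixed point; the improved gain should stabilize the (unknown) error dynamics.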