In this paper, we address finite-time stability in probability of discrete-time stochastic dynamical systems. Specifically, a stochastic comparison lemma is constructed along with a scalar system involving a generalized deadzone function to establish almost sure convergence and finite-time stability in probability. This result is used to provide Lyapunov theorems for finite-time stability in probability for Itô-type stationary nonlinear stochastic difference equations involving Lyapunov difference conditions on the minimum of the Lyapunov function itself and a fractional power of the Lyapunov function. In addition, we establish sufficient conditions for almost sure lower semicontinuity of the stochastic settling time, capturing the average settling-time behavior of the discrete-time nonlinear stochastic dynamical system. Furthermore, a stochastic finite-time optimal control framework is developed by exploiting connections between Lyapunov theory for finite-time stability in probability and stochastic Bellman theory. In particular, we show that finite-time stability in probability of the closed-loop nonlinear system is guaranteed by means of a Lyapunov function that is a solution to the steady-state form of the stochastic Bellman equation, thereby guaranteeing both stochastic finite-time stability and optimality.
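To fix ideas, a Lyapunov difference condition of the kind described above can be sketched as follows; this is an illustrative form only, with placeholder symbols ($V$ for the Lyapunov function, $c$ and $\alpha$ for hypothetical constants) rather than the paper's exact notation:

```latex
% Illustrative (hypothetical) conditional Lyapunov difference condition:
% the expected decrement of V along trajectories is bounded by the minimum
% of V itself and a fractional power of V.
\mathbb{E}\bigl[\,V(x(k+1)) \mid x(k)\,\bigr] - V(x(k))
  \;\le\; -\,c\,\min\bigl\{\,V(x(k)),\; V(x(k))^{\alpha}\,\bigr\},
  \qquad c > 0,\ \ \alpha \in (0,1).
```

Heuristically, for large values of $V$ the linear term governs the decay, whereas near the origin the fractional power $V^{\alpha}$ dominates and drives $V$ to zero in finite time, which is the mechanism underlying finite-time (rather than merely asymptotic) convergence.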