A novel definition of the conditional smooth Rényi entropy, different from that of Renner and Wolf, is introduced. It is shown that our definition of the conditional smooth Rényi entropy is appropriate for providing lower and upper bounds on the optimal guessing moment in a guessing problem where the guesser is allowed to stop guessing and declare an error. Furthermore, a general formula for the optimal guessing exponent is presented; in particular, a single-letter formula for mixtures of i.i.d. sources is obtained. It is also shown that our definition is appropriate for characterizing the optimal exponential moment of the codeword length in the problem of source coding with common side information available at the encoder and decoder, under a constraint on the probability of a decoding error.
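As a toy illustration of the guessing setting described above (not the paper's method), the sketch below computes the ρ-th guessing moment of the classical optimal guesser, which queries candidate values in decreasing order of probability; the `eps` parameter is a hypothetical model of the error option, letting the guesser give up on the least likely values once the covered probability mass reaches 1 − eps.

```python
def optimal_guessing_moment(probs, rho, eps=0.0):
    """rho-th moment of the number of guesses for the optimal guesser,
    which tries values in decreasing order of probability.

    With eps > 0 the guesser may stop early: it only guesses the most
    likely values until their total mass reaches 1 - eps, and declares
    an error on the rest (a simplified model of the error option;
    the paper's precise formulation may differ).
    """
    p = sorted(probs, reverse=True)
    moment, mass = 0.0, 0.0
    for i, pi in enumerate(p, start=1):
        if mass >= 1.0 - eps:  # remaining values are given up as error
            break
        moment += pi * (i ** rho)  # contribution of the i-th guess
        mass += pi
    return moment
```

For a uniform source over four values with ρ = 1 and eps = 0, this returns the familiar expected number of guesses (1 + 2 + 3 + 4)/4 = 2.5; allowing an error probability of 0.25 drops the fourth guess and reduces the moment to 1.5.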