In recent years, Fisher linear discriminant analysis (FLDA) based classification has been one of the most successful approaches, demonstrating effective performance in a variety of classification tasks. However, when the training data (source domain) follow a different distribution from the test data (target domain), FLDA-based models may no longer be optimal and their performance can degrade dramatically. To address this problem, in this paper we propose a domain adaptation via Bregman divergence minimization (DAB) approach, in which discriminative features of the source and target domains are learned simultaneously through a domain-invariant representation. DAB is designed on the constraints of FLDA, with the objective of jointly adapting the marginal and conditional distribution mismatches via Bregman divergence minimization. The resulting representation thus retains the desirable properties of FLDA while offering stronger discriminative ability. Moreover, the proposed approach can be easily kernelized to handle nonlinear tasks. Extensive experiments on various benchmark datasets show that DAB effectively handles cross-domain divergence and outperforms several state-of-the-art domain adaptation approaches on cross-distribution domains.
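For background, the two ingredients named above have standard definitions (the notation here is the generic one, not necessarily the paper's). For a strictly convex, differentiable generator function $\varphi$, the Bregman divergence between points $\mathbf{x}$ and $\mathbf{y}$ is
$$
D_{\varphi}(\mathbf{x}, \mathbf{y}) = \varphi(\mathbf{x}) - \varphi(\mathbf{y}) - \langle \nabla \varphi(\mathbf{y}),\, \mathbf{x} - \mathbf{y} \rangle ,
$$
and the FLDA criterion, whose constraints DAB builds on, seeks a projection $W$ that maximizes the ratio of between-class to within-class scatter,
$$
\max_{W} \; \frac{\operatorname{tr}\!\left(W^{\top} S_b W\right)}{\operatorname{tr}\!\left(W^{\top} S_w W\right)} ,
$$
where $S_b$ and $S_w$ denote the between-class and within-class scatter matrices.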