Domain adaptation has proven successful in handling the case where training and test samples are drawn from different distributions. Recently, second-order statistics alignment has gained significant attention in domain adaptation owing to its simplicity and effectiveness. However, optimization remains a major difficulty because an explicit expression for the gradient is hard to obtain. Moreover, the transformation employed there does not perform dimensionality reduction. Accordingly, in this article, we prove that there exists a scaled LogDet metric that is more effective for second-order statistics alignment than the Frobenius norm, and we therefore adopt it for this purpose. First, we introduce two homologous transformations, which help to reduce dimensionality and extract transferable knowledge from the relevant domain. Second, we derive an explicit gradient expression, an important ingredient for optimization. We further extend the LogDet model from the single-source to the multisource domain setting by applying the weighted Karcher mean to the LogDet metric. Experiments on both synthetic and real-world domain adaptation tasks demonstrate that the proposed approaches are effective when compared with state-of-the-art ones.
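To make the contrast concrete, the sketch below compares the two covariance-alignment objectives mentioned above: the squared Frobenius distance between source and target covariances (as in CORAL-style alignment) and the standard LogDet (Burg) matrix divergence. This is an illustrative NumPy sketch, not the paper's scaled LogDet metric or its homologous transformations; the function names and the small ridge regularizer are our own assumptions.

```python
import numpy as np

def covariance(X):
    # Sample covariance with a small ridge term (assumed here) for numerical stability.
    Xc = X - X.mean(axis=0)
    d = X.shape[1]
    return Xc.T @ Xc / (X.shape[0] - 1) + 1e-6 * np.eye(d)

def frobenius_loss(Cs, Ct):
    # CORAL-style objective: squared Frobenius distance between covariances.
    return np.linalg.norm(Cs - Ct, "fro") ** 2

def logdet_divergence(Cs, Ct):
    # Standard LogDet (Burg) divergence:
    #   D(Cs, Ct) = tr(Cs Ct^{-1}) - log det(Cs Ct^{-1}) - d
    # Nonnegative, and zero iff Cs == Ct.
    d = Cs.shape[0]
    M = Cs @ np.linalg.inv(Ct)
    _, logdet = np.linalg.slogdet(M)
    return np.trace(M) - logdet - d

rng = np.random.default_rng(0)
Xs = rng.normal(size=(200, 5))           # hypothetical "source" samples
Xt = 2.0 * rng.normal(size=(200, 5))     # hypothetical "target" samples, scaled covariance
Cs, Ct = covariance(Xs), covariance(Xt)
print("Frobenius:", frobenius_loss(Cs, Ct))
print("LogDet:   ", logdet_divergence(Cs, Ct))
```

Unlike the Frobenius distance, the LogDet divergence is invariant to joint congruence transformations of the two covariances, which is one motivation for preferring it on the manifold of symmetric positive-definite matrices.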