Abstract

The effect of electron correlations on the impurity conductance of the shallow-donor impurity band in a semiconductor quantum wire, connected to two ideal leads, is studied using the Hubbard model in an alloy-analogy approximation. The hopping integral and the intrasite Coulomb interaction energy are estimated numerically from variational wave functions for random impurity configurations. For one electron per impurity, it is shown that there is a considerable reduction in the impurity conductance due to electron correlations. For a given impurity concentration, the disordered wire turns into an insulator at a much shorter sample length than that estimated previously by neglecting correlations.
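For context, the abstract's "hopping integral" and "intrasite Coulomb interaction energy" refer to the parameters t and U of the single-band Hubbard Hamiltonian, which in its standard form reads (with site-dependent hoppings t_{ij} in the disordered case, as the impurity positions here are random):

```latex
H \;=\; -\sum_{\langle i,j\rangle,\,\sigma} t_{ij}\,
        \bigl(c^{\dagger}_{i\sigma} c_{j\sigma} + \mathrm{h.c.}\bigr)
   \;+\; U \sum_{i} n_{i\uparrow}\, n_{i\downarrow},
```

where c†_{iσ} creates an electron of spin σ on impurity site i and n_{iσ} = c†_{iσ} c_{iσ}. In the work summarized above, t_{ij} and U are not free parameters but are estimated from variational donor wave functions for each random impurity configuration.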
