Abstract

By simply splitting the drain of a conventional MOS transistor into two, we can convert the transistor into a magnetic sensor. The ease of integrating split-drain magnetic field-effect transistors (MAGFETs) into conventional CMOS technology and their potential for sensing small magnetic fields have fascinated many researchers. Yet many parameters, such as the maximum sensitivity and the biasing dependence of the device, are not yet known. In this paper, we describe a model for the split-drain MAGFET. The model shows that the sensitivity of the sensor is primarily a function of the roll-off of the induced Hall potential in the channel, and that the contribution from channel inversion-charge redistribution is minor. The model also shows that the magnetic-field signal, expressed as ΔId/Id, is insensitive to geometry, linear in magnetic-field strength, and affected by the gap between the two drains. Furthermore, we show that the sensitivity is typically limited by the Hall mobility, is insensitive to the operating region, and is attributed to saturation-voltage shifts in the saturation region. A maximum sensitivity of less than 5.8% T−1 is predicted. The development of the model is assisted by computer simulations and verified by experimental results.
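The stated linear relation between the relative drain-current imbalance ΔId/Id and the magnetic-field strength can be sketched numerically. The function below is a hypothetical illustration, not the paper's model: it simply applies a constant sensitivity, with the abstract's predicted upper bound of 5.8% T−1 used as the default value.

```python
MAX_SENSITIVITY_PER_TESLA = 0.058  # upper bound quoted in the abstract (5.8% T^-1)

def relative_drain_imbalance(b_field_tesla: float,
                             sensitivity_per_tesla: float = MAX_SENSITIVITY_PER_TESLA) -> float:
    """Relative drain-current imbalance dId/Id of a split-drain MAGFET.

    Assumes the linear small-field relation dId/Id = S * B described in the
    abstract, with S the device sensitivity in 1/T. Illustrative only; the
    actual sensitivity depends on Hall mobility, drain gap, and bias.
    """
    return sensitivity_per_tesla * b_field_tesla

# Example: a 10 mT field at the maximum predicted sensitivity gives
# a relative imbalance of 0.058 * 0.01 = 5.8e-4, i.e. about 0.06%.
```

For fields of a few millitesla, the resulting imbalance is well below one percent, which illustrates why the small sensitivity bound matters for practical sensing.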
