Fall detection based on radar technology has recently attracted increasing attention. Human motions are often multidirectional and follow complex trajectories, and numerous time–frequency representation methods have been used to extract micro-Doppler fall features. However, peak-to-peak Doppler frequency measurements are limited by the relatively weak energy in the high-Doppler-frequency region, and fall features are not obvious with a single millimeter-wave (mmWave) radar when the human motion is nearly perpendicular to the radar's line of sight. This article proposes a fall feature enhancement and fusion method that applies the Stockwell transform to dual mmWave radars. Two mmWave radars synchronously observe human motion from different directions, and the micro-Doppler features of the motion are obtained with the Stockwell transform. To enhance the features used for recognition, the ridgelines extracted from the two radars are fused in the time–frequency domain. A convolutional neural network (CNN) model is then trained for fall recognition on 960 fall motions and 720 nonfall motions collected from 60 volunteers. The experimental results show that the proposed method achieves a recognition accuracy of up to 94.14%, which is beneficial for the realization of an indoor fall detection system.
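To make the time–frequency step concrete, the sketch below implements the standard FFT-based discrete Stockwell transform, the representation the abstract names for extracting micro-Doppler features. The function names (`stockwell_transform`, `ridgeline`) and the simple per-column ridgeline rule are illustrative assumptions, not the paper's exact extraction or fusion procedure:

```python
import numpy as np

def stockwell_transform(x):
    """Discrete Stockwell transform (S-transform) of a 1-D signal x.

    Returns an (N//2 + 1, N) complex array whose rows are frequency bins
    and columns are time samples, using the FFT-based formulation
    S[n, :] = IFFT( X[(n + m) mod N] * exp(-2*pi^2*m^2 / n^2) ),
    where X = FFT(x) and the Gaussian window widens with frequency n.
    """
    x = np.asarray(x, dtype=float)
    N = x.size
    X = np.fft.fft(x)
    Xc = np.concatenate([X, X])            # enables circular shifts X[(n + m) mod N]
    m = np.arange(N)
    m[m > N // 2] -= N                     # symmetric frequency offsets -N/2 .. N/2
    S = np.zeros((N // 2 + 1, N), dtype=complex)
    S[0, :] = x.mean()                     # zero-frequency row is the signal mean
    for n in range(1, N // 2 + 1):
        gauss = np.exp(-2.0 * (np.pi * m / n) ** 2)  # frequency-scaled Gaussian
        S[n, :] = np.fft.ifft(Xc[n:n + N] * gauss)
    return S

def ridgeline(S):
    """One simple reading of a ridgeline: the strongest frequency bin
    at each time instant of the time-frequency magnitude map."""
    return np.abs(S).argmax(axis=0)        # frequency-bin index per time sample
```

For a pure tone, the ridgeline sits on the tone's frequency bin; for radar returns of a fall, it would trace the dominant micro-Doppler trajectory over time, which is what the two radars' ridgelines would then fuse.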