Synaptic transistors have been proposed as hardware implementations of the neuron activation functions of neural networks (NNs). While they promise compact, fast, inexpensive, and energy-efficient dedicated NN circuits, they also have limitations relative to digital NNs (implemented as software on digital processors), including the restricted activation-function shapes attainable with particular transistor implementations and instabilities due to noise and other factors inherent in analog circuits. We present a computational study of the effects of these factors on NN performance and find that, while accuracy competitive with conventional digital NNs can be achieved for many applications, performance is highly sensitive to instability in the shape of the activation function. This suggests that, where highly accurate NNs are required, circuitry with precision beyond what has been reported for synaptic transistors to date will be needed.
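The sensitivity study described above can be sketched in miniature as follows. This is a minimal illustration of the general idea, not the actual simulation from the study: the activation-curve instability is modeled here by a hypothetical per-neuron gain and shift perturbation of a sigmoid, and the network weights are random stand-ins for a trained model.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def noisy_sigmoid(x, sigma, rng):
    # Hypothetical model of analog instability: each evaluation of the
    # activation curve is rescaled and shifted by random device noise.
    gain = 1.0 + sigma * rng.standard_normal(x.shape)
    shift = sigma * rng.standard_normal(x.shape)
    return gain * sigmoid(x + shift)

def forward(x, W1, W2, act):
    # Two-layer fully connected network with the given activation.
    return act(act(x @ W1) @ W2)

# Fixed random weights standing in for a trained network.
W1 = rng.standard_normal((4, 8))
W2 = rng.standard_normal((8, 2))
x = rng.standard_normal((100, 4))

ideal = forward(x, W1, W2, sigmoid)  # noise-free reference outputs

devs = {}
for sigma in (0.01, 0.1):
    noisy = forward(x, W1, W2, lambda z: noisy_sigmoid(z, sigma, rng))
    devs[sigma] = float(np.abs(noisy - ideal).mean())
    print(f"sigma={sigma}: mean output deviation {devs[sigma]:.4f}")
```

Even this toy setup shows the qualitative effect: output deviations from the ideal network grow with the magnitude of the activation-shape noise, which is the mechanism behind the accuracy sensitivity reported in the study.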