The integration of distributed energy resources (DER) in low-voltage (LV) networks is expected to increase significantly in the coming years. It is well known that large-scale integration of DER in conventional LV networks poses many challenges, such as voltage rise. To mitigate voltage rise caused by DER, utilities are adopting simple voltage regulation strategies, such as disconnecting DER when phase voltages rise above a defined threshold. In this paper, we model LV networks as three-phase unbalanced circuits and show that such local regulation strategies may cause cascaded disconnections of DER along a feeder. We then propose an optimal control approach that avoids cascading and maximizes DER availability. Local control approaches are discussed in the context of global DER availability, and their limitations are illustrated.