Advanced modulation schemes, together with coherent detection and digital signal processing, have enabled the next generation of high-bandwidth optical communication systems. One of the key advantages of coherent detection is its superior receiver sensitivity compared to direct detection, owing to the gain provided by the local oscillator (LO). In unamplified applications, such as metro and edge networks, the ultimate receiver sensitivity is dictated by shot noise, thermal noise, and the residual beat noise arising from the relative intensity noise (RIN) of the LO (LO-RIN). We show that the best sensitivity is achieved when the thermal noise is balanced against the residual LO-RIN beat noise, which results in an optimum LO power. The impacts of thermal noise from the transimpedance amplifier (TIA), RIN from the LO, and the common-mode rejection ratio (CMRR) of the balanced photodiode are analyzed individually via analytical models and compared to numerical simulations. The analytical results agree well with the numerical simulations, providing a simplified method to quantify the impact of receiver design trade-offs. For a practical 100 Gb/s integrated coherent receiver with 7% FEC overhead, we show that an optimum receiver sensitivity of -33 dBm can be achieved at the GFEC cliff (BER of 8.55 x 10^-5) when the LO power is optimized at 11 dBm. We also discuss a potential method to monitor the imperfections of a balanced, integrated coherent receiver.
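
As an illustrative sketch of the balance condition behind the optimum LO power (using textbook per-quadrature noise expressions rather than the paper's own model; the symbols $R$, $B$, $i_n$, and $\epsilon$ are assumptions introduced here), one may write

% Hedged sketch: conventional noise variances at the balanced photodiode output.
% R: responsivity, B: noise bandwidth, i_n: TIA input-referred noise current density,
% \epsilon: residual imbalance set by the finite CMRR. Not taken from the paper itself.
\begin{align}
  \sigma^2_{\mathrm{shot}} &\approx 2 q R P_{\mathrm{LO}} B, \\
  \sigma^2_{\mathrm{th}} &= i_n^2 B, \\
  \sigma^2_{\mathrm{LO\text{-}RIN}} &\approx \left(\epsilon R P_{\mathrm{LO}}\right)^2 \mathrm{RIN}\, B .
\end{align}

% Equating thermal and residual LO-RIN beat noise gives the optimum LO power:
\begin{equation}
  \sigma^2_{\mathrm{th}} = \sigma^2_{\mathrm{LO\text{-}RIN}}
  \quad\Longrightarrow\quad
  P_{\mathrm{LO,opt}} \approx \frac{i_n}{\epsilon R \sqrt{\mathrm{RIN}}} .
\end{equation}

Below this LO power the thermal noise dominates and sensitivity improves with increasing $P_{\mathrm{LO}}$; above it the residual LO-RIN beat noise, which grows quadratically with $P_{\mathrm{LO}}$, degrades the sensitivity again, hence the optimum.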