Ultrashort pulses play a crucial role in the evolution of several areas of science, such as ultrafast imaging, femtochemistry, and high-harmonic spectroscopy; diagnosing and reconstructing such pulses is therefore important. The interplay of dispersion and nonlinear effects gives rise to a wide variety of pulse dynamics. At the same time, dispersion, together with linear and nonlinear effects, has proven to be a fundamental bottleneck for high-speed communications. Conventionally, time-consuming and computationally inefficient algorithms are used to solve these problems. Since machine learning (ML) has proven more powerful than purely analytical methods for such problems, we present a comprehensive comparison of different neural network (NN) architectures for learning the mapping defined by the nonlinear Schrödinger equation (NLSE). We use an ML-based approach to construct the distorted output pulse resulting from nonlinear propagation through the fiber. Additionally, the trained network can predict the dispersion and nonlinear parameters. We also reconstruct the temporal and spectral profiles of the transmitted pulse from the pulse distorted by propagation through a highly nonlinear fiber (HNLF). These techniques work without knowledge of the fiber parameters. A detailed comparison of six NN-based techniques is presented: fully connected NN (FCNN), cascade NN (CaNN), convolutional NN (CNN), long short-term memory networks (LSTM), bidirectional LSTM (BiLSTM), and gated recurrent units (GRU). Results show that the FCNN regressor outperforms all other architectures.
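For context, a commonly used form of the NLSE for fiber propagation (a sketch, not necessarily the exact variant used in this work) describes the slowly varying pulse envelope $A(z,T)$ in terms of the attenuation $\alpha$, group-velocity dispersion $\beta_2$, and nonlinear coefficient $\gamma$:

$$\frac{\partial A}{\partial z} = -\frac{\alpha}{2}A - \frac{i\beta_2}{2}\frac{\partial^2 A}{\partial T^2} + i\gamma\lvert A\rvert^2 A$$

The first term models loss, the second dispersion, and the third Kerr nonlinearity; the dispersion and nonlinear parameters mentioned above correspond to $\beta_2$ and $\gamma$ in this equation.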