In this study, entropy generation in the natural convection flow of a Cu-water nanofluid under a uniform inclined magnetic field (MF) is modeled, for the first time, by a machine learning approach using data obtained from a numerical procedure. The two-dimensional, time-dependent dimensionless equations are solved numerically using the global radial basis function (RBF) method for the spatial derivatives and the second-order backward differentiation formula (BDF2) for the time derivatives. Numerical simulations are performed over a set of combined dimensionless problem parameters. In the resulting data set, the inputs are the combined dimensionless problem parameters and the outputs are entropy indicators. The global and local entropy indicators are then modeled by a neural network (NN). In terms of the mean squared error metric, an NN with a small number of layers or small layer sizes is sufficient for the smaller data sets, whereas the trilayer NN (TNN) is the better model for the larger data sets.
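To illustrate the model-selection step described above, the following Python sketch compares NN regressors of increasing depth, including a three-hidden-layer ("trilayer") network, by their test mean squared error. It is a minimal sketch, not the code used in the study: the input names (Rayleigh number, Hartmann number, MF inclination angle, solid volume fraction), their ranges, the single entropy-indicator target, and the use of scikit-learn's MLPRegressor are all assumptions for illustration, and synthetic data stands in for the simulation results.

```python
# Illustrative sketch only: compare shallow vs. "trilayer" NN regressors by
# mean squared error. Inputs and the synthetic target are placeholders, not
# the study's simulation data.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
n = 2000                                    # stand-in for the "larger data" case
X = np.column_stack([
    rng.uniform(1e3, 1e6, n),               # Rayleigh number (assumed range)
    rng.uniform(0, 100, n),                 # Hartmann number (assumed range)
    rng.uniform(0, np.pi / 2, n),           # MF inclination angle (assumed)
    rng.uniform(0.0, 0.05, n),              # solid volume fraction (assumed)
])
# Synthetic stand-in for a global entropy indicator.
y = np.log10(X[:, 0]) * (1 + X[:, 3]) / (1 + 0.01 * X[:, 1]) + 0.1 * np.sin(X[:, 2])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

architectures = {
    "single hidden layer": (10,),
    "bilayer": (10, 10),
    "trilayer (TNN)": (10, 10, 10),
}
for name, layers in architectures.items():
    # Standardize inputs before the NN, since the parameter scales differ widely.
    model = make_pipeline(
        StandardScaler(),
        MLPRegressor(hidden_layer_sizes=layers, max_iter=5000, random_state=0),
    )
    model.fit(X_tr, y_tr)
    mse = mean_squared_error(y_te, model.predict(X_te))
    print(f"{name:22s} test MSE = {mse:.4f}")
```

Under these assumptions, the printed test MSE values play the role of the metric used above to decide whether a shallower NN suffices or the trilayer architecture is preferable for a given data-set size.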