This work studies the use of machine learning (ML) to predict short-circuit power consumption at the floorplan stage of integrated-circuit physical design. The study covers four cell groups: registers, combinational cells, sequential cells, and clock-network cells. An Arithmetic Logic Unit was selected and implemented to create the database needed for training and testing the ML models, with the design flow run for various combinations of parameters. The input features for model training are supply voltage, temperature, clock frequency, threshold-voltage-based library type, block utilization, and block aspect ratio. The output is the short-circuit power obtained from gate-level simulation with parasitic parameters taken into account. In total, five algorithms were tested: Polynomial Regression, Decision Tree Regression, Random Forest Regression, Support Vector Regression (SVR), and the Multi-layer Perceptron (MLP). Root Mean Squared Error (RMSE), R², and Mean Absolute Percentage Error (MAPE) were used to evaluate model performance. The MLP showed the best performance, but SVR with standardization of the output parameters achieved similar results with significantly shorter training time. The MAPE is ≈ 0.89 % for registers, ≈ 3.19 % for combinational cells, ≈ 2.42 % for sequential cells, and ≈ 9.55 % for clock-network cells. The proposed training method does not rely on any technology-dependent data, which makes it applicable across different technology nodes. Its disadvantage is that the full design flow must be executed for the selected design over the selected parameter range to collect the training data, which requires additional time and machine resources.
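As a minimal sketch (not the authors' actual code), the snippet below illustrates how the SVR setup described above, with standardized inputs and outputs, could be trained and scored with RMSE, R², and MAPE in scikit-learn. The synthetic data, feature ranges, encoding of the library type, and SVR hyperparameters are placeholder assumptions, not values from the study.

```python
# Hypothetical sketch of SVR with standardized outputs, as described in the
# abstract; all data and hyperparameters here are illustrative placeholders.
import numpy as np
from sklearn.compose import TransformedTargetRegressor
from sklearn.metrics import (mean_absolute_percentage_error,
                             mean_squared_error, r2_score)
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(0)
n = 500

# Placeholder feature matrix mirroring the six inputs named in the abstract.
X = np.column_stack([
    rng.uniform(0.7, 1.1, n),    # supply voltage, V (assumed range)
    rng.uniform(-40, 125, n),    # temperature, deg C (assumed range)
    rng.uniform(100, 1000, n),   # clock frequency, MHz (assumed range)
    rng.integers(0, 3, n),       # Vt-based library type, label-encoded
    rng.uniform(0.5, 0.8, n),    # block utilization (assumed range)
    rng.uniform(0.5, 2.0, n),    # block aspect ratio (assumed range)
])
# Synthetic stand-in for the short-circuit power target, W.
y = rng.uniform(1e-4, 1e-2, n)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Standardize inputs inside a pipeline; TransformedTargetRegressor
# standardizes the target before fitting and inverts it on prediction.
model = TransformedTargetRegressor(
    regressor=make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0)),
    transformer=StandardScaler(),
)
model.fit(X_train, y_train)
pred = model.predict(X_test)

print("RMSE:", np.sqrt(mean_squared_error(y_test, pred)))
print("R^2: ", r2_score(y_test, pred))
print("MAPE:", mean_absolute_percentage_error(y_test, pred))
```

In this arrangement the target scaling is learned on the training split only and undone automatically at prediction time, which matches the abstract's "standardization of output parameters" without leaking test-set statistics.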