The need to keep workflows running under conditions of enforced remote work has increased demand for software products. Yet the quality of many of these products has turned out to be lower than expected. One reason is that they are now used by people with differing quality criteria. Another is that the products themselves have become objectively and unacceptably poor, largely because of errors in project management. The scale and significance of a modern software project call for execution under one of the classical lifecycle models, which would ensure quality. However, in the race to get ahead of competitors, agile methodologies are used instead, shortening deadlines at the cost of quality. And the larger the project, the higher the risk of quality loss and the greater the cost of that loss. How can the benefits of both approaches be combined without inheriting their disadvantages?