The dynamics of an open N-state quantum system are often modeled with a Markovian master equation describing the evolution of the system density operator. By using the generators of the SU(N) group as a basis, the density operator can be transformed into a real-valued "coherence vector." The generator of the dissipative evolution, the so-called Lindbladian, can be expanded over the same basis and recast as a real matrix. Together, these expansions result in a nonhomogeneous system of N^{2}-1 real-valued linear ordinary differential equations. One can then, e.g., apply standard high-performance algorithms to integrate this system forward in time while being guaranteed exact preservation of the trace (norm) and Hermiticity of the density operator. However, when performed in a straightforward way, the expansion is an operation of time complexity O(N^{10}). The complexity can be reduced when the number of dissipative operators is independent of N, which is often the case for physically meaningful models. Here we present an algorithm to transform a quantum master equation into a system of real-valued differential equations and to propagate it forward in time. Using a specific scalable model, we evaluate the computational efficiency of the algorithm and demonstrate that it is possible to handle a model system with N=10^{3} states on a single node of a computer cluster.
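The coherence-vector expansion mentioned above can be illustrated with a minimal Python sketch (not taken from the paper; all function names are illustrative). It builds the generalized Gell-Mann matrices, the standard Hermitian traceless generators of SU(N), and shows that a density matrix maps to a real vector of N^{2}-1 components and back, with trace and Hermiticity preserved by construction:

```python
import numpy as np

def gell_mann(N):
    """Generalized Gell-Mann matrices: the N^2 - 1 traceless Hermitian
    generators of SU(N), normalized so that Tr(g_a g_b) = 2 delta_ab."""
    gs = []
    # Symmetric and antisymmetric off-diagonal generators.
    for j in range(N):
        for k in range(j + 1, N):
            s = np.zeros((N, N), dtype=complex)
            s[j, k] = s[k, j] = 1.0
            gs.append(s)
            a = np.zeros((N, N), dtype=complex)
            a[j, k] = -1j
            a[k, j] = 1j
            gs.append(a)
    # Diagonal generators.
    for l in range(1, N):
        d = np.zeros((N, N), dtype=complex)
        d[:l, :l] = np.eye(l)
        d[l, l] = -l
        gs.append(np.sqrt(2.0 / (l * (l + 1))) * d)
    return gs

def to_coherence_vector(rho, gs):
    """v_a = Tr(rho g_a): real, since rho and each g_a are Hermitian."""
    return np.array([np.trace(rho @ g).real for g in gs])

def from_coherence_vector(v, gs, N):
    """rho = I/N + (1/2) sum_a v_a g_a.
    Unit trace and Hermiticity hold automatically for any real v."""
    rho = np.eye(N, dtype=complex) / N
    for va, g in zip(v, gs):
        rho += 0.5 * va * g
    return rho
```

In this representation the master equation becomes a linear ODE system dv/dt = A v + b with a real matrix A and real vector b, so any real-valued high-performance integrator can be applied; the naive cost quoted in the abstract comes from computing the matrix elements of A by repeated trace operations over all pairs of generators.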