The rapid development and widespread application of information technology have provided unprecedented tools for investigating and regulating various types of complex systems, and have created new opportunities for the development of the systems and control sciences. The future development of information science cannot be achieved without existing basic research and insights. After a brief review of the development of control theory, this paper focuses on some basic scientific problems concerning the estimation, control, and games of dynamical systems with uncertainty. We review related theoretical progress achieved by the authors' research group, share some research experiences, and offer new insights and perspectives. We mainly consider the theoretical foundations of the following topics: proportional-integral-derivative (PID) control, adaptive estimation, adaptive filtering, adaptive control, the maximum capability of feedback, adaptive games, collective behaviors, and game-based control systems. Because dynamical systems contain various feedback loops, the statistical properties of the systems' observed data are usually governed by complex nonlinear dynamical equations; consequently, classical statistical assumptions such as independence and stationarity are far from satisfied, which is a prominent feature of theoretical investigation in this field.
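To make the feedback-loop point concrete, the sketch below shows a minimal discrete-time PID controller regulating a simple uncertain nonlinear plant. It is an illustration only, not the theoretical framework surveyed in the paper: the plant model, the gains, and all names here are hypothetical. Note how the closed loop makes the observed state trajectory a product of nonlinear recursive dynamics rather than an independent, stationary sequence.

```python
import math

def make_pid(kp, ki, kd, dt):
    """Return a stateful update function mapping tracking error -> control input."""
    integral = 0.0
    prev_err = 0.0

    def step(err):
        nonlocal integral, prev_err
        integral += err * dt                  # accumulate error (I term)
        deriv = (err - prev_err) / dt         # finite-difference error rate (D term)
        prev_err = err
        return kp * err + ki * integral + kd * deriv

    return step

# Uncertain nonlinear plant: x_{k+1} = x_k + dt * (a*x_k + sin(x_k) + u),
# where the parameter a and the sin(.) term are unknown to the controller.
a, dt = 1.0, 0.01
pid = make_pid(kp=10.0, ki=2.0, kd=0.02, dt=dt)

x, setpoint = 1.0, 0.0
for _ in range(2000):
    u = pid(setpoint - x)                     # feedback: control depends on observed state
    x += dt * (a * x + math.sin(x) + u)       # closed-loop data are generated recursively
# Despite the unmodeled nonlinearity, x is driven close to the setpoint.
```

The loop makes each sample of `x` a nonlinear function of all past samples, which is exactly why i.i.d.-style statistical assumptions fail for closed-loop data.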