Abstract

Polynomials are basic objects in mathematics. The behavior of any system usually depends on several constraints (the variables) and so is given by one or several functions of one or several variables. These functions, in turn, provided they are sufficiently continuous (which is usually the case), can be approximated by polynomials, in some range and within some accuracy. So the study of any system, no matter how complicated, starts with the study of polynomials. As an example, the position of a robot in a plane may be given by two polynomials (one for each coordinate), each of them depending on several variables: the directions of the wheels, the current in each motor, and so on. Polynomials also appear as technical tools. For instance, for a system depending linearly on the data (thus given by a matrix), the stability depends on the zeros of the characteristic polynomial of the matrix (cf. Marden [30]), and so locating zeros is a major problem in the theory of automatic control. Polynomials play a central role in mathematics, for instance in analysis (complex or real), number theory, approximation theory, and numerical analysis. Classical results about polynomials fall into two types. The first is purely qualitative: it says that something exists. An easy example is the Bezout identity: if P and Q are two polynomials with no root in common, there exist two other polynomials A and B such that AP + BQ = 1 (no information is provided about A and B; it is not even clear what such information should depend on). The second type depends on the degree. This is the case for Bernstein's inequality, a basic result in complex analysis: if P is a polynomial of degree n and P' its derivative, then the maximum of |P'| on the unit circle is at most n times the maximum of |P| there.
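
The Bezout identity mentioned above is in fact constructive: the extended Euclidean algorithm for polynomials produces the pair A, B explicitly. The following is a minimal sketch, not taken from the paper; the coefficient-list representation and the helper names (`bezout`, `divmod_poly`, etc.) are choices made here for illustration, assuming exact rational coefficients.

```python
# Sketch of the constructive Bezout identity: for polynomials P and Q with no
# common root, the extended Euclidean algorithm yields A, B with A*P + B*Q = 1.
# Polynomials are lists of Fraction coefficients, lowest degree first.
from fractions import Fraction

def trim(p):
    """Drop trailing zero coefficients, keeping at least one entry."""
    while len(p) > 1 and p[-1] == 0:
        p = p[:-1]
    return p

def add(p, q):
    n = max(len(p), len(q))
    return trim([(p[i] if i < len(p) else 0) + (q[i] if i < len(q) else 0)
                 for i in range(n)])

def scale(p, c):
    return trim([c * a for a in p])

def mul(p, q):
    r = [Fraction(0)] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            r[i + j] += a * b
    return trim(r)

def divmod_poly(p, q):
    """Long division: return (quotient, remainder) with p = quotient*q + remainder."""
    rem = list(p)
    quot = [Fraction(0)] * max(1, len(p) - len(q) + 1)
    while len(trim(rem)) >= len(q) and any(c != 0 for c in rem):
        rem = trim(rem)
        c = rem[-1] / q[-1]          # cancel the leading term of rem
        d = len(rem) - len(q)
        quot[d] += c
        for i, b in enumerate(q):    # rem -= c * x^d * q(x)
            rem[i + d] -= c * b
    return trim(quot), trim(rem)

def bezout(p, q):
    """Extended Euclid: return (a, b, g) with a*p + b*q = g, g the monic gcd."""
    r0, r1 = p, q
    a0, a1 = [Fraction(1)], [Fraction(0)]
    b0, b1 = [Fraction(0)], [Fraction(1)]
    while any(c != 0 for c in r1):
        quo, rem = divmod_poly(r0, r1)
        r0, r1 = r1, rem
        a0, a1 = a1, add(a0, scale(mul(quo, a1), Fraction(-1)))
        b0, b1 = b1, add(b0, scale(mul(quo, b1), Fraction(-1)))
    lead = r0[-1]                    # normalise: gcd becomes [1] for coprime inputs
    return scale(a0, 1 / lead), scale(b0, 1 / lead), scale(r0, 1 / lead)

if __name__ == "__main__":
    # P = x - 1 and Q = x + 1 share no root, so the identity yields 1 exactly:
    P = [Fraction(-1), Fraction(1)]
    Q = [Fraction(1), Fraction(1)]
    A, B, g = bezout(P, Q)
    print(A, B, g)   # [Fraction(-1, 2)] [Fraction(1, 2)] [Fraction(1, 1)]
    # i.e.  (-1/2)(x - 1) + (1/2)(x + 1) = 1
```

Exact Fraction arithmetic is used deliberately: with floating-point coefficients the remainders in the Euclidean algorithm rarely vanish exactly, which is one reason the size and conditioning of A and B are a genuine question rather than a routine computation.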
