A new experimental method using a hand-fabricated hot probe, 50 μm in diameter and made of Pt90/10Rh alloy, is proposed as a tool to measure the Seebeck coefficient of thin films deposited onto electrically insulating substrates. A thermal model of heat transfer within the probe and into the sample was developed to simulate the corresponding temperature distributions. Simulations showed that the ratio of the temperature rise at the contact point to the average temperature of the probe is essentially independent of the electrical current and of the film-substrate properties. Moreover, the contact radius and the thermal contact resistance need not be known individually; only the pairs of values that best fit the thermal resistance of the probe in contact with the film-substrate system are required. This is a considerable simplification with respect to certain AFM-based hot probe methods, which require these two parameters to be determined by calibration on two substrates of known thermal conductivity. These values can then be used to evaluate the Seebeck coefficient of the sample with a macroscopic probe to within 1 % precision. In contrast to classical measuring methods, which rely on a heater/heat-sink assembly instrumented with thermocouples, heated metallic rods with built-in thermocouples, or microfabricated structures, the proposed method is fast, reliable, and offers higher spatial resolution. The method was validated by measuring the Seebeck coefficient of CdS, Bi, and Cr thin-film reference samples.
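For orientation, the relation typically exploited in hot-probe Seebeck measurements can be sketched as follows (an illustrative expression, not taken from the abstract; the symbols $S_{\mathrm{film}}$, $S_{\mathrm{probe}}$, $\Delta V$, and $\Delta T_{c}$ are assumed notation, and the sign depends on the measurement convention):
\begin{equation*}
\Delta V \approx \left(S_{\mathrm{film}} - S_{\mathrm{probe}}\right)\,\Delta T_{c},
\end{equation*}
where $\Delta T_{c}$ is the temperature rise at the probe-film contact point. Since the thermal model indicates that the ratio of $\Delta T_{c}$ to the average probe temperature is nearly constant, $\Delta T_{c}$ can be estimated from the probe temperature, allowing $S_{\mathrm{film}}$ to be extracted from the measured thermovoltage $\Delta V$.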