In typical film cooling experiments, the adiabatic wall temperature may be determined from surface temperature measurements on a low thermal conductivity model in a low-temperature wind tunnel. In such experiments, it is generally accepted that the adiabatic wall temperature must be bounded between the coolant temperature and the freestream recovery temperature, as these represent the lowest and highest temperatures introduced into the experiment. Many studies have utilized foreign gas coolants to alter coolant properties such as density and specific heat to more appropriately simulate engine-representative flows. In this paper, we show that the often ignored Dufour effect can cause the thermal physics in such an experiment to differ from those of the engine environment we generally wish to simulate. The Dufour effect is an off-diagonal coupling of heat and mass transfer that can induce temperature gradients even in what would otherwise be isothermal experiments. These temperature gradients can result in significant errors in the calibration of various experimental techniques, and can lead to results that at first glance appear non-physical, such as adiabatic effectiveness values not bounded by zero and one. This work explores Dufour-effect-induced temperature separation in two common cooling flow schemes: a leading edge with compound injection through a cylindrical cooling hole, and a flat plate with axial injection through a 7–7–7-shaped cooling hole. Air, argon, carbon dioxide, helium, and nitrogen coolants were utilized due to their use in recent film cooling studies.
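For context, the adiabatic effectiveness referenced above is conventionally defined as follows (a standard definition assumed here, not taken from this paper, with $T_{\infty}$ the freestream recovery temperature, $T_{aw}$ the adiabatic wall temperature, and $T_{c}$ the coolant temperature):

```latex
\eta = \frac{T_{\infty} - T_{aw}}{T_{\infty} - T_{c}}
```

Under the usual assumption that $T_{aw}$ lies between $T_{c}$ and $T_{\infty}$, $\eta$ is bounded by zero and one; Dufour-induced temperature separation can drive $T_{aw}$ outside that range, producing the seemingly non-physical effectiveness values described above.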