Does anyone have an idea of how much the nozzle surface temperature differs from the actual melt zone?
I am trying to verify/calibrate the nozzle temperature readings in software against real life, since I've been suspecting they are way off.
It probably differs somewhat from hotend to hotend, but I'd be interested in any input you might have on the subject.
The hotends in question are a Hexagon and a Stepstruder MK7. Both all-metal: a small brass nozzle screwed into an aluminum heater block, no insulation used.
I am measuring the nozzle surface temperature with an external instrument that uses two thermocouples. Each thermocouple is covered with one layer of Kapton tape for electrical isolation and then fixed to the nozzle with a few more layers of Kapton wrapped around it.
Initial measurements are shockingly off.
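For reference, here's roughly how I'm tabulating the comparison: average the two thermocouple readings at each firmware setpoint and look at the offset. The numbers below are just illustrative placeholders, not my actual measurements:

```python
# Illustrative placeholder readings; substitute your own logged values.
# Each entry: (firmware setpoint in C, thermocouple A in C, thermocouple B in C)
samples = [
    (200, 172.5, 174.0),
    (220, 189.0, 191.5),
    (240, 205.5, 207.0),
]

for setpoint, tc_a, tc_b in samples:
    surface = (tc_a + tc_b) / 2   # average the two probes on the nozzle
    offset = setpoint - surface   # how far the surface lags the software reading
    print(f"set {setpoint} C -> surface {surface:.1f} C (offset {offset:+.1f} C)")
```

If the offset grows with the setpoint rather than staying constant, that would point at thermistor-table or sensor-placement error rather than a simple fixed bias.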