Design Consideration_Thermistors Module_optimisation
Thermistors don't directly measure temperature; instead, there is a correlation between temperature and the non-linear change in the thermistor's resistance. The Maverick brand temperature probes used in this project employ NTC (negative temperature coefficient) thermistors, whereby the resistance of the thermistor decreases with increasing temperature. Consequently, the usual measurement practice is to set up a voltage divider where the output voltage between the thermistor and a bias resistor is fed to the ESP8266's 10-bit ADC and converted to a digital value, which the microcontroller sketch (code) then converts to temperature.
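As a concrete illustration of the divider arithmetic, here is a minimal sketch of how a raw 10-bit ADC reading could be converted back into a thermistor resistance. It is only a sketch: the assumption that the bias resistor forms the lower (grounded) leg of the divider, the assumption that the ADC full scale corresponds to the divider supply voltage, and the function and constant names are illustrative rather than taken from the project firmware.

```cpp
// Minimal sketch: recover the thermistor resistance from a raw 10-bit ADC count.
// Assumptions (illustrative only): the thermistor is the upper leg of the divider,
// the bias resistor is the lower (grounded) leg, the ADC input sees the junction,
// and the ADC full-scale voltage equals the divider supply voltage.
#include <cstdio>

constexpr double ADC_MAX = 1023.0;   // 10-bit ADC full-scale count

// For a divider Vout = Vcc * Rbias / (Rbias + Rtherm), the thermistor resistance is
// Rtherm = Rbias * (Vcc/Vout - 1) = Rbias * (ADC_MAX/count - 1).
double thermistorResistance(int adcCount, double biasResistor) {
    if (adcCount <= 0) return -1.0;               // open circuit / probe unplugged
    return biasResistor * (ADC_MAX / adcCount - 1.0);
}

int main() {
    // Example: a mid-scale reading on a probe wired with a 75k ohm bias resistor.
    int raw = 512;                                // on hardware this would come from analogRead(A0)
    printf("Rtherm ~ %.0f ohms\n", thermistorResistance(raw, 75e3));
    return 0;
}
```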
A typical practice for low-temperature measurement with a thermistor voltage divider is to select a bias (reference) resistor with a resistance similar to that of the thermistor at room temperature, referred to as the R25 value. However, this is not optimal for higher temperatures such as BBQ temperature ranges. A challenge with thermistors is that they are non-linear over the full measurement range; the divider output only approaches linear behaviour over a subset of the temperature measurement range.
Given that this more linear behaviour occurs across a narrow ~80 deg C window, the temperature probe sensitivity must be optimised by shifting the near-linear section onto the desired measurement range through careful selection of the voltage divider bias resistor.

Image: Thermistor Voltage Divider - note presence and placement of the Bias Resistor
The following details this optimisation process.
In order to optimise the thermistor, one must first characterise the temperature probe thermistor resistance response over a wide temperature range or, even better, have the manufacturer's datasheet which typically provides the necessary information.
No datasheet was provided with the retail Maverick brand replacement ET-732 / 733 temperature probes that I bought through Amazon but I crossed my fingers and contacted Maverick Customer Service asking them if they would provide me a copy. Much to my surprise, they sent me a scan of the thermistor datasheet which I imported into a spreadsheet: the pdf copy of the datasheet can be found here. Well done on excellent customer support, Maverick! 👏 👍
From the datasheet, we learn that Maverick uses a Semitek Ishizuka Electronics Corporation model GT-2 thermistor in these probes. There are a couple of items of note in the datasheet:
- R25 = 1,000k ohms (i.e. 1M ohm or 1,000,000 ohms) - this is a much greater R25 resistance than that of the typical 10k ohm retail temperature probes out there
- a quite low dissipation factor of ~0.6 mW/deg C (in comparison with some 10k ohm NTC thermistors, where dissipation factors range from 2-6 mW/deg C). On its own, this low dissipation factor leads to an expectation of higher self-heating when power is dissipated as current flows through the thermistor.
Fortunately, however, what matters is the interplay between the R25 value and the dissipation factor. The thermistor's high resistance (e.g. the 1M ohm R25 value), in conjunction with the high selected bias resistor value, will limit current flow and mitigate self-heating errors. I'll come back to this after the discussion of bias resistor selection.
So let's have a look at the Semitek thermistor. Following is a plot of resistance vs temperature for the Maverick (Semitek) thermistors.

Image: Thermistor Resistance vs Temperature (per datasheet)
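For rough calculations further down this page, the datasheet curve can be approximated with a simple two-point beta model fitted to the R25 value and the ~43.9k ohm resistance at 100 deg C quoted in the self-heating section below. This is only a convenience approximation and drifts from the datasheet by several percent at the hot end, so the datasheet table remains the reference.

```cpp
// Rough beta-model approximation of the GT-2 resistance-temperature curve, fitted to
// two points quoted on this page: R25 ~ 1,000,000 ohms and R(100 degC) ~ 43,900 ohms.
// Approximation only; the datasheet table should be preferred for firmware use.
#include <cmath>
#include <cstdio>

constexpr double T25_K = 298.15;     // 25 deg C in kelvin
constexpr double R25   = 1.0e6;      // ohms (from datasheet)
constexpr double R100  = 43.9e3;     // ohms at 100 deg C (from datasheet)

// Two-point beta: R(T) = R25 * exp(BETA * (1/T - 1/T25)), with T in kelvin.
const double BETA = std::log(R25 / R100) / (1.0 / T25_K - 1.0 / 373.15);

double resistanceAt(double tempC) {
    double tK = tempC + 273.15;
    return R25 * std::exp(BETA * (1.0 / tK - 1.0 / T25_K));
}

int main() {
    for (double t = 25.0; t <= 200.0; t += 25.0)
        printf("%6.1f degC  ->  %10.0f ohms\n", t, resistanceAt(t));
    return 0;
}
```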
As detailed above, the temperature probe sensitivity can be optimised over the target temperature range through careful selection of the voltage divider resistor.
Building on Jason Sachs' fantastic article Thermistor signal conditioning: Dos and Don'ts, Tips and Tricks, I calculated the voltage divider output voltage for a range of resistor values and plotted this versus temperature.

Image: Voltage Divider Output vs Temperature for Range of Bias Resistors
As observed in the plot, the divider output exhibits an "s-shape": sensitivity is poor at the higher and lower temperatures where the curve flattens out, and greatest where the curve is steepest. Changing the voltage divider resistor, i.e. the bias resistor, shifts the region of greatest sensitivity.
As Sachs points out, the thermistor sensitivity can be calculated simply by plotting the first derivative of the curve (easily estimated using the finite difference method of calculating the instantaneous slope). If you remember your calculus, the first derivative gives the instantaneous rate of change of a function, or slope. More simply, the greater the instantaneous rate of change (slope), the greater the temperature sensitivity.
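To make both calculations concrete, the following sketch evaluates the divider output for a few candidate bias resistor values and estimates the sensitivity dVout/dT with a central finite difference. It reuses the rough beta approximation from the earlier sketch, so the candidate resistor list, supply voltage, and resulting numbers are illustrative only; the spreadsheet built from the full datasheet remains the authoritative calculation.

```cpp
// Divider output and finite-difference sensitivity for a few candidate bias resistors.
// Uses the rough two-point beta approximation, so values are illustrative only.
#include <cmath>
#include <cstdio>

constexpr double T25_K = 298.15, R25 = 1.0e6, R100 = 43.9e3, VCC = 5.0;
const double BETA = std::log(R25 / R100) / (1.0 / T25_K - 1.0 / 373.15);

double resistanceAt(double tempC) {
    double tK = tempC + 273.15;
    return R25 * std::exp(BETA * (1.0 / tK - 1.0 / T25_K));
}

// Divider output measured across the bias resistor (assumed lower leg).
double vout(double tempC, double rBias) {
    return VCC * rBias / (rBias + resistanceAt(tempC));
}

// Sensitivity dVout/dT estimated with a central finite difference.
double sensitivity(double tempC, double rBias, double dT = 0.5) {
    return (vout(tempC + dT, rBias) - vout(tempC - dT, rBias)) / (2.0 * dT);
}

int main() {
    const double biases[] = {9.09e3, 27e3, 75e3, 220e3};   // candidate values, ohms
    printf("T(degC)");
    for (double rb : biases) printf("  %8.0f", rb);
    printf("\n");
    for (double t = 40.0; t <= 200.0; t += 20.0) {
        printf("%7.1f", t);
        for (double rb : biases) printf("  %8.4f", sensitivity(t, rb));  // V per degC
        printf("\n");
    }
    return 0;
}
```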
The next plot shows the thermistor sensitivity for various resistor values. From this, it can be seen that the choice of bias resistor shifts the peak sensitivity along the temperature axis: smaller resistor values shift the sensitivity out to higher temperature ranges, but at the trade-off of reduced peak resolution (which is not a concern for this project, where greater resolution is not required).

Image: Unoptimised Thermistor Sensitivity vs Temp for Range of Bias Resistors
The CloudSmoker project uses two temperature probes, each with a different function:
- MeatProbe - skewer type probe designed to be inserted into meat and measure the internal meat temperature
- PitProbe - blunt-nosed probe to measure the internal smoker temperature at grill level
Each probe targets a different temperature range; that is, the MeatProbe targets a lower internal meat temperature range than the higher grill temperatures measured by the PitProbe. As such, I selected a different bias resistor value for each probe to ensure it was optimised over the target temperature range for its function. In the case of the MeatProbe, I wanted the greatest sensitivity over the range of final internal meat temperatures covering a variety of meat types. For the PitProbe, the grill temperature range needs to cover everything from "low and slow" (107-121 deg C) to more traditional, higher temperature indirect roasting methods (120-175 deg C).
The following plot shows the optimised sensitivity curves for each probe. Standard 1% resistor values that I had on hand were chosen:
- MeatProbe: 75.0k ohms / 1%
- PitProbe: 9.09k ohms / 1%

Image: Optimised Temperature Sensitivity following Bias Resistor Selection
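As a sanity check on this selection, a useful rule of thumb is that the divider sensitivity peaks near the temperature at which the thermistor resistance equals the bias resistor. The sketch below solves the rough beta model from the earlier sketches for that temperature for the two chosen values; since the beta fit is approximate, treat the resulting temperatures as indicative only.

```cpp
// Sanity check: with a simple beta model, divider sensitivity peaks roughly where the
// thermistor resistance equals the bias resistor. Solving R(T) = Rbias for T shows where
// each chosen resistor places that peak. BETA is the rough two-point fit used earlier,
// so the resulting temperatures are approximate.
#include <cmath>
#include <cstdio>

constexpr double T25_K = 298.15, R25 = 1.0e6, R100 = 43.9e3;
const double BETA = std::log(R25 / R100) / (1.0 / T25_K - 1.0 / 373.15);

// Invert R(T) = R25 * exp(BETA * (1/T - 1/T25)) for the temperature where R(T) = rBias.
double peakSensitivityTempC(double rBias) {
    double invT = 1.0 / T25_K + std::log(rBias / R25) / BETA;
    return 1.0 / invT - 273.15;
}

int main() {
    printf("MeatProbe 75.0k bias: peak sensitivity near %.0f degC\n", peakSensitivityTempC(75.0e3));
    printf("PitProbe  9.09k bias: peak sensitivity near %.0f degC\n", peakSensitivityTempC(9.09e3));
    return 0;
}
```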
So how much of an issue might self-heating be? Let's estimate the error. Note that this is only a rough estimate, as the dissipation factor, k, is not a dissipation constant (as it is sometimes erroneously called): the k-factor varies across the measurement temperature range and is typically measured in air (sometimes in an oil bath) at ambient temperature.
However, pretending that the dissipation factor is constant, the worst-case self-heating would occur at the high end of the temperatures measured by the meat and pit probes, as this is where the NTC resistance is lowest and the current flow, and hence power dissipation, is highest.
Error (deg C) = Power Dissipation P (mW) / k-factor dissipation constant (mW/deg C)
where
P = V^2 / R
V is the supply voltage
R is the total resistance of the resistor divider
Meat Probe
bias resistor: 75e3 ohms
NTC resistance at 100 deg C: 43,900 ohms
Total resistance = 75e3 + 43.9e3 = 118.9e3 ohms
P = V^2/R = 5^2 / 118.9e3 = 0.00021 W = 0.21 mW
Worst-case error = 0.21 mW / [0.6 mW/deg C] = 0.35 deg C
Pit Probe
bias resistor: 9.1e3 ohms
NTC resistance at 180 deg C: 4,255 ohms
Total resistance = 9.1e3 + 4.255e3 = 13,355 ohms
P = V^2/R = 5^2 / 13.355e3 = 0.00187 W = 1.87 mW
Worst-case error = 1.87 mW / [0.6 mW/deg C] = 3.1 deg C
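The arithmetic above can be reproduced with a short sketch using the same simplifying assumptions: a 5 V supply, the total divider resistance in the power term, and a constant 0.6 mW/deg C dissipation factor.

```cpp
// Reproduces the rough self-heating estimate above: total divider power P = V^2 / R
// divided by the datasheet dissipation factor (treated as constant, per the caveat above).
#include <cstdio>

constexpr double VCC = 5.0;        // supply voltage, V
constexpr double K_FACTOR = 0.6;   // dissipation factor, mW per degC (from datasheet)

double selfHeatingErrorC(double rBias, double rThermistor) {
    double totalR = rBias + rThermistor;              // ohms
    double power_mW = VCC * VCC / totalR * 1000.0;    // total divider power in mW
    return power_mW / K_FACTOR;                       // worst-case error in degC
}

int main() {
    // Thermistor resistances at the hot end of each probe's range, per the values above.
    printf("MeatProbe (75k bias, 43.9k ohms at 100 degC): %.2f degC\n",
           selfHeatingErrorC(75e3, 43.9e3));
    printf("PitProbe  (9.1k bias, 4.255k ohms at 180 degC): %.2f degC\n",
           selfHeatingErrorC(9.1e3, 4.255e3));
    return 0;
}
```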
As shown, worst-case self-heating errors are not significant and are within the design criteria.