View Full Version : TR4/4A tr4 temperature gauge/sensor reading

10-21-2006, 11:52 PM
After my recent rebuild on my '63 TR4, the temperature gauge now reads 100 deg when warmed up and 40 deg when cold. The temperature sender's resistance reads 880 ohms when warm and 88 ohms when cold. I'm trying to figure out whether the sender is bad, the gauge is bad, or the engine really is running that hot. It seems to run normally and doesn't overheat, so I suspect the gauge or sender. Any ideas?

10-22-2006, 12:00 AM
Check it with a digital thermometer at the sender...when my car reads 185, I know it's 177 at the sender.

10-22-2006, 02:39 AM
Hello Derickson,

The sender should decrease in resistance as it heats up, so it sounds as though yours is U/S (unserviceable). The standard quick check of both the temperature and fuel gauges is to short the sender terminal or cable to earth; the instrument should then read full scale.


Geo Hahn
10-22-2006, 03:49 PM
Are those degrees Celsius or Fahrenheit? I do not recall what the change point was between early & late TR4s.

10-22-2006, 05:56 PM
That's a good question, Geo. I was looking at the conversion last week. 40 C would be well over 100 F, and 100 C would be over 200 F. I think the conversion multiplies by about 1.8 and adds 32 degrees. I get confused whenever I look at the differences.

Geo Hahn
10-23-2006, 04:01 PM
I think Triumph was confused too.

The middle of the 'old' gauge is 185F.

The middle of the 'new' gauge is 70C.

The conversion formula is C = (F-32)*(5/9)

So 185F = 85C

But if you do the operations in the wrong order => C = F*(5/9) - 32, then you get 185F ≈ 70C
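Geo's two calculations are easy to check in a couple of lines (a quick sketch in Python):

```python
def f_to_c(f):
    # Correct order: subtract 32 first, then scale by 5/9
    return (f - 32) * 5 / 9

def f_to_c_wrong(f):
    # Wrong order: scale by 5/9 first, then subtract 32
    return f * 5 / 9 - 32

print(f_to_c(185))        # 85.0 -- the 'old' gauge centre, done right
print(f_to_c_wrong(185))  # about 70.8 -- suspiciously close to the 70C centre
```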

What I suspect: Some shiny-pants junior engineer got the calculation wrong and ordered gauges from Smiths that centered on 70C rather than 85C.

Triumph (and other mfgs) finally resolved the problem by dropping the numbers altogether.

10-23-2006, 06:21 PM
I put the sender from my TR4A through a lab test, and these are the results (temperatures in degrees F, resistance in ohms):
100 deg F = 502 ohms
120 deg F = 320 ohms
140 deg F = 206 ohms
160 deg F = 139 ohms
180 deg F = 99 ohms
200 deg F = 72 ohms
208 deg F = 62 ohms (at my altitude and the pressure of the day, this was boiling temperature)
The gauge needs less resistance, hence more current, to heat up the metal strip that moves the needle as the engine temperature goes up.
I have values for every 10 degrees but did not want to type them all in.
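The table above can be turned into a quick temperature estimate for any measured resistance. This is a sketch in Python; interpolating linearly in ln(R) between table points is my assumption (sender resistance falls roughly exponentially with temperature), not something from the original post:

```python
import math

# Adrio's measured sender values: (deg F, ohms)
TABLE = [(100, 502), (120, 320), (140, 206),
         (160, 139), (180, 99), (200, 72), (208, 62)]

def temp_from_ohms(r):
    """Estimate temperature (deg F) from sender resistance by
    interpolating linearly in ln(R) between adjacent table points."""
    for (t1, r1), (t2, r2) in zip(TABLE, TABLE[1:]):
        if r2 <= r <= r1:
            frac = (math.log(r1) - math.log(r)) / (math.log(r1) - math.log(r2))
            return t1 + frac * (t2 - t1)
    raise ValueError("resistance outside table range")

print(round(temp_from_ohms(88)))  # roughly 187 F
```

An 88-ohm reading lands between the 180 F and 200 F rows, at about 187 F.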

It would appear that your setup is off, as the resistance is varying inversely to what is expected, but your gauge is going in the correct direction. Did you type those values in correctly??

10-24-2006, 04:32 PM
I did get my measurements backward. It should be 88 ohms with the engine hot, 880 ohms cold. My gauge reads in Celsius (I assume, since the scale isn't labeled) and is marked at 30, 70, and 100. From the measurements that Adrio took, it looks like my sender is operating correctly, but the gauge is displaying a higher-than-expected reading (from his table, 88 ohms on my sender should be about 190 F, which is consistent with operating temperature for my 187 F thermostat).
I'll also try shorting the sensor lead to ground, but since it currently reads full scale when the engine is warmed up, it can't go any farther.
How do you measure the temp with a digital thermometer? Just contact measurement at thermostat housing or radiator upper hose?

10-24-2006, 05:00 PM
Since the Ohm readings on the temperature sender appear to be OK, perhaps the gauge voltage stabilizer is not putting out the correct voltage.

10-24-2006, 08:43 PM
That is probably your problem. If you have no voltage stabilizer, or a defective one, you will get a high reading. The reason I did the lab test on my sender was to figure out what voltage I needed to design a solid-state regulator for, so that my gauge would read correctly. I had no voltage stabilizer before that, and I always had full scale on the gauge.

So that is where I would start looking if I were you. Take an analog volt meter and measure the input voltage to your gauge (the side of the gauge that does not go to your sender unit). It should be between 10 and 12 volts with the car running at any speed. If you are getting readings in the 16-volt range, then your instrument voltage stabilizer is the problem. Another way to check is to disconnect your generator and drive the car on battery alone (maybe do this after the engine is warmed up). This should tell you whether your gauge reads closer to the true temperature. If it does, then for sure it is your instrument voltage stabilizer.
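The effect of a failed stabilizer is easy to see with a simple series model of the circuit: gauge current is supply voltage divided by gauge resistance plus sender resistance. This is only an illustrative sketch; the 60-ohm gauge resistance is a made-up value for the example, not a Smiths specification:

```python
# Simple series model: gauge current I = V / (R_gauge + R_sender).
# R_GAUGE = 60 ohms is an illustrative assumption, not a Smiths spec.
R_GAUGE = 60.0

def gauge_current(volts, r_sender):
    return volts / (R_GAUGE + r_sender)

# Working stabilizer (10 V), sender at 180 F (99 ohms per Adrio's table):
i_normal = gauge_current(10.0, 99)

# Failed stabilizer passing 16 V through the same 99-ohm sender:
i_overvolt = gauge_current(16.0, 99)

print(i_overvolt / i_normal)  # 1.6 -- 60% more heating current

# Sender resistance that would draw the same current at a proper 10 V:
r_equiv = 10.0 / i_overvolt - R_GAUGE
print(r_equiv)  # about 39 ohms -- below the 62-ohm boiling-point row,
                # i.e. the needle is pushed off the hot end of the scale
```

In this simplified model the overvoltage makes the hot-wire gauge behave as if the sender were far hotter than it really is, which matches the full-scale readings described above.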

10-25-2006, 12:14 AM
I'll check this, but I recently rebuilt the voltage stabilizer with modern solid-state electronics, and when I checked it a month ago, it was steady at 10V.
Is it possible to re-calibrate the temp gauge? I guess the other possibility would be to add a resistor in series so that when hot the total resistance would give a correct reading, but that seems cheesy. It should work the way it was designed.

10-25-2006, 01:14 PM
If you open the temp gauge you will see that it is little more than an electric heater wire wrapped around a metal strip. As the heater gets warmer (higher current) the metal expands more, and that deflects the needle. So a bit of calibration may be possible. I played with mine a touch and was able to get it to read correctly, but it was nowhere near as far off as yours was.