
TR6 Engine Temp survey

jerrybny

I'm wondering what engine temp other TR6 owners are seeing. I seem to be running at about 185 degrees. I checked today with both an infrared thermometer and the old-fashioned meat thermometer. Both registered about 185. I checked the coolant as it flowed past the top opening. I should mention that it is about 85-90 degrees out today.
Thanks
 
This is a very good temperature. Right where it should be.
 
Mine is running about the same as yours, never a problem with overheating, even on the real hot days!

But this weekend I'll try your temp method!
 
It's not. The gauge is at 3/4 when the engine temp is 180. I've replaced the gauge, sending unit, and the voltage stabilizer. Not sure what else could be causing the gauge to read high.
 
Head down to Radio Shack, or any other electronics supply store. Get yourself some resistors and play with the sizing, 5 ohms to 50 ohms. Most people seem to settle around 35 ohms, wired into the circuit from your sending unit. Seems the accuracy of British sending units took a turn a decade or two ago.
 
The problem with that approach is that it's very easy to wind up with a gauge that will NEVER read in the red, no matter how hot the engine gets. The gauge is very non-linear, so any added resistance has a much, much larger effect at full scale than at mid-scale.

Might be OK with some people, but not my style.
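To put a rough number on that non-linearity, here is a toy model of a hot-wire gauge circuit. All component values (stabilizer voltage, heater resistance, sender resistances) are illustrative assumptions, not measured Triumph figures; the point is only the shape of the effect.

```python
# Rough sketch: why a fixed series resistor skews a hot-wire gauge
# unevenly across its scale. Values below are illustrative assumptions.

V = 10.0         # volts from the voltage stabilizer (nominal)
R_HEATER = 60.0  # assumed gauge heater resistance, ohms

def heater_power(r_sender, r_added=0.0):
    """Power dissipated in the gauge's heater element (I^2 * R)."""
    i = V / (R_HEATER + r_sender + r_added)
    return i * i * R_HEATER

# Compare the effect of a 35-ohm series resistor at two points on the
# scale: high sender resistance (cool) vs low sender resistance (hot).
for label, r_sender in [("mid-scale (cool)", 150.0), ("full-scale (hot)", 20.0)]:
    p0 = heater_power(r_sender)
    p1 = heater_power(r_sender, r_added=35.0)
    drop = 100.0 * (p0 - p1) / p0
    print(f"{label}: adding 35 ohms cuts heater power by about {drop:.0f}%")
```

With these assumed values the same 35 ohms cuts the heater power by roughly a quarter at mid-scale but by about half at the hot end, which is why the top of the scale gets pulled down so much harder.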

Not sure about the TR6, but on the Stag, the problem is that most vendors are selling the wrong sender to suit the original gauge. You can either recalibrate the gauge to match your sender; or try buying a sender that works from
https://www.ldpart.co.uk

The gauge has two little round windows on the back that give access to the calibration adjustments. You may have to pry the cork seals out to access the slotted adjustments (but usually they are long gone). The slots don't "turn" as such, rather they are pried from side to side to effect the adjustment. I made a little tool to move mine. The adjustment behind the cold side affects the entire range, while the adjustment behind the hot side affects mostly the top end of the scale.

Two things I would check first; one is that the voltage stabilizer is actually working. Hook a test lamp or voltmeter between the lead to the sender and ground; you should be able to see the light flash or the voltmeter jump around.
The second check is that there is no difference in ground voltage between the engine and the voltage stabilizer. A bad ground to the dash can cause problems.
 
Next time I have one of the 6's gauges out I will look to see if they are the same. Would be nice if they were.

As far as burning? Don't think so, as this was a "factory fix" on Honda Accords in the 77-78 model years if I remember properly. And Honda lawyers would never have given the ok on something that could have gone wrong!

Now this is presuming the owner measures the actual temperature and calibrates the sending unit (which is what we are doing, only externally) to read on the gauge properly.
 
Hello to all.

My usual running temp is between 170 and 185 degrees - closer to 175 degrees. When the engine is warmed up and running my temp gauge reads either straight up center or just to the left of center. In the cooler months the gauge will almost always read left of center by about 1/8 inch.

One observation I made: after I got back from a drive, I left the car running while I checked the temp with a laser-type reader. Although the temp needle began to move to the right, the temp of the engine remained within 2 to 5 degrees of normal running temp. From that I guess the temp needle may rise when the car comes to a stop, but the actual temp may not be rising with it.

Interesting.

Dave
 
My gauge reads hot as well. Here is my temp gauge at 180 F (checked with a pot of water and a Fluke temp meter):
gauge_180.png


Here is an article I found on Randall's (btw, thanks for the link to transmitters) approach:

Optional: Calibrating the Gauge

Just about every Plus 4 owner I have ever talked to relates that the original temperature gauge reads hot.
This may be a result of Morgan not using a voltage stabilizer or a gauge that is simply out of adjustment.
The following apply to both the original gauges and your TR6 'Transplanted' Gauge.

1. Remove the sender unit from the radiator and suspend it in a pan of water. Clip a grounding lead from the sender to the car.

2. With a thermometer in the water, heat the water to 180 F.

3. Place a small screwdriver in the adjusting slot on the back of the gauge and turn to move the needle to the centre of the Normal block.

4. Raise the temperature; once the water starts boiling, the needle should be at the bottom of the hot block. My gauge reads as follows: bottom of normal = 170 F, top of normal = 190 F, bottom of hot = 212 F.

This article is the copyright of the Morgan Motorcar Club of Texas and may be reprinted without permission for non-profit purposes only.


https://www.gomog.com/allmorgan30.html

Feedback and comments are welcome.
Tim
 
FWIW, I was supplied the GTR108 (my transmitter was not operational when I bought the car). I measured the following when I tested the transmitter after the gauge showed 'hot':
degrees F    ohms
   82        558
  100        495
  135        233
  160        147.5
  180        105
  200         69.4

Tim
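For anyone wanting to estimate the GTR108's resistance between the measured points above, a simple interpolation over that table might look like this. Since the sender curve is roughly exponential, interpolating the log of the resistance tracks it a bit better than a straight line between points.

```python
# Estimate GTR108 sender resistance at intermediate temperatures by
# log-linear interpolation over the measured (degrees F, ohms) pairs.
import math

GTR108 = [(82, 558), (100, 495), (135, 233), (160, 147.5), (180, 105), (200, 69.4)]

def ohms_at(temp_f):
    """Estimate sender resistance at temp_f by log-linear interpolation."""
    if temp_f <= GTR108[0][0]:
        return GTR108[0][1]
    if temp_f >= GTR108[-1][0]:
        return GTR108[-1][1]
    for (t0, r0), (t1, r1) in zip(GTR108, GTR108[1:]):
        if t0 <= temp_f <= t1:
            frac = (temp_f - t0) / (t1 - t0)
            return math.exp(math.log(r0) + frac * (math.log(r1) - math.log(r0)))

print(f"Estimated resistance at 185 F: {ohms_at(185):.0f} ohms")
```

At 185 F this lands around 95 ohms, between the measured 105 at 180 F and 69.4 at 200 F.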
 
TR4s went with a Celsius gauge with 70C as the center point in the highlighted range. Thing is, 70C = 158F which is way on the cool side.

My theory is that they fat-fingered the conversion when they did the specs for the Celsius version. Anyway, it wasn't long after that they gave up on numbers altogether.
 
RonMacPherson said:
As far as burning? Don't think so, as this was a "factory fix" on Honda Accords in the 77-78 model years if I remember properly. And Honda lawyers would never have given the ok on something that could have gone wrong!
You are assuming that Honda Accord temperature gauges work the same as the Triumph gauges. I'd be surprised if that was the case but don't know as I've never owned an Accord.

I'm just telling you what will happen if you add resistance to a Triumph "hot wire" type gauge. The gauge's response to resistance is very non-linear, and the sender is similarly non-linear. Tim Buja did a curve fit to the original Stag sender, and got
5 + 4.4 * (e ^ (-.035 * ((temp degrees C) -172))).
https://snic-braaapp.org/stagtemp.xls
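As a quick sanity check, that curve fit can be evaluated at a few familiar Fahrenheit temperatures (converted to Celsius, since the fit takes degrees C). The exponential shape is also why resistance falls off so steeply toward the hot end.

```python
# Evaluate Tim Buja's curve fit for the original Stag sender:
#   R = 5 + 4.4 * e^(-0.035 * (temp_c - 172)), temp in degrees C, R in ohms
import math

def stag_sender_ohms(temp_c):
    """Sender resistance in ohms per the curve fit."""
    return 5 + 4.4 * math.exp(-0.035 * (temp_c - 172))

for f in (160, 185, 212):
    c = (f - 32) * 5 / 9
    print(f"{f} F ({c:.0f} C): {stag_sender_ohms(c):.0f} ohms")
```

With this fit, roughly 155 ohms at 160 F drops to about 97 ohms at 185 F and about 60 ohms at boiling, so a fixed 35-ohm addition is a far bigger fraction of the total near the hot end.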

If you do try adding resistance to get 185 F at center scale (IMO it should actually be at about 1/3 of the white, but I'm funny that way), be sure to test what happens at boiling.

BTW, although not quite identical to the Stag gauge, the TR6 gauge works the same, and does have the two windows for calibration.
 
jerrybny said:
So the general consensus is that the gauge being at mid point represents appx 180 degrees?
IMO, these things are so poorly made, plus have issues with wrong parts being supplied and so on, that you can't generalize without actually checking it.

I haven't tackled the temp gauges on the Stags yet, but they are substantially different. So are the voltmeters. When I tried to calibrate one of the original voltmeters, I found that the adjustment tab was so loose inside the case that just setting the gauge down on the bench would alter the calibration!
 
Okay, thanks for the update about the gauge calibration. I will have to look into that.

As far as the gauge reading with a resistor: that will have the same effect no matter how hot or cold the gauge gets.

We are talking about a basic series circuit with the resistor inserted just affecting the voltage path. If you want I can get out my military training books and cite you page and paragraph.

I am passing on information related to experiences that I have encountered. Nothing that I relay is intended to cause harm or detriment to anyone's vehicle. If there is information that I have forgotten, or am naive about, in this case the gauge calibration holes, OK, I will learn.

But there are some basic mechanical physics that I have learnt, and electrical gauges are one thing I had drummed into my head back in 1966 at Camp Pendleton, California, with a refresher at AHM schools in the 70's. So I am aware of the effects of a resistor installed in a voltage-to-ground measuring circuit. 5 ohms is what AHM had us put in 77/78 Accords because the gauge was reading hotter than normal. They figured that it was the quicker expedient, as they did not have sending units with the proper resistance in the pipeline and they did not want to keep customers' cars down for several weeks while new ones were made up.

In case Ohm's law has undergone a transformation that I am not aware of, I will gladly listen to any explanation you can provide proving to me that a resistor will change the reading scale in only part (hot temp, according to your email) of the linear voltage chain.
 
Ron,
I didn't say it would only change the reading at one end of the scale, just that it will have far more effect at one end than at the other.

The Triumph gauges do NOT measure voltage per se. The internal movement is a bimetal strip, heated by a resistance element. The reading is (very roughly) linear with temperature of the bimetal strip. And the temperature of the strip is more closely proportional to the power dissipated in the heater (less heat lost to ambient). Since the heater resistance is fairly constant, the power dissipated is the square of the current, times the resistance (I²R).

This isn't just theory; I (and others, e.g. the link I posted before to Tim Buja's information) have verified it on the bench, using actual Triumph components.

BTW, even the voltage is not linear with resistance. To see an example of this, all you have to do is look at an analog ohmmeter. Note how at the high end of the scale (near 1 ohm on the x1 scale), 5 ohms will make a huge difference in the reading. Then look at the low end of the scale, where 5 ohms will barely move the needle.
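The ohmmeter analogy is easy to put numbers on. The model below assumes a simple series-type meter where needle deflection is proportional to current; the internal-resistance value is an illustrative assumption, and the only point is the contrast between the two ends of the scale.

```python
# Toy analog ohmmeter: deflection is proportional to current,
# I = V / (R_int + R_x), so full scale sits at R_x = 0 and mid-scale
# at R_x = R_int. R_INT below is an illustrative assumption.

R_INT = 12.0   # assumed internal resistance on the x1 range, ohms

def deflection(r_x):
    """Needle deflection as a fraction of full scale (0..1)."""
    return R_INT / (R_INT + r_x)   # normalized so r_x = 0 reads 1.0

# How far does the needle move when 5 ohms is added, near each end?
for r in (1.0, 500.0):
    before = deflection(r)
    after = deflection(r + 5.0)
    print(f"{r:>5.0f} ohms: {before:.3f} -> {after:.3f} "
          f"(needle moves {before - after:.4f} of full scale)")
```

With these numbers, 5 ohms swings the needle by about a quarter of full scale near the low end of the resistance scale, but by a fraction of a percent near the high end.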
 
Did the investigation on the gauge "calibration": those two windows only control (have an effect on) the needle sweep (the extent of travel of the needle), one for the left side and one for the right side.

I'll look into the voltage/resistance thing on the sending unit when I get a chance. Near as I can tell, even if they aren't "linear" within the limited range of temperature readings, adding under 100 ohms is not going to cause abnormal problems at one end of the scale.
 
Just an aside: in a past life I was a nuclear reactor operator in the Navy. The design of the instrumentation display on the control console was such that most dials and gauges were positioned to be in the vertical position when giving a normal reading. This allowed the operator to glance at the console and quickly determine if some reading was out of the norm.

The engine gauges on our cars were never precision instruments. As with the design of the reactor control panel, it is the relative position of the needle from the norm that alerts us to a problem.

As to attempting to compensate for replacement temperature senders, may I suggest getting a variable resistor (rheostat, pot, or whatever) and using it to adjust the reading to where you want it. You can either leave the pot in the circuit, or take it out, measure its resistance with an ohmmeter, and purchase a fixed resistor of the same or similar value.
 