I recently had the same question asked by a friend here in Houston. Here is what I told him. He had been adding in-line resistance to get his gauge reading centered.
One big thing to note: the resistance measurement (with an ohmmeter) of the sending unit will change significantly as soon as the current through the gauge is removed. In other words, the resistance depends on the amount of current used to measure it. (A regular ohmmeter uses much less current than the temperature gauge, which is really just another ammeter.) So, to get a realistic measurement of the resistance at a given temperature, you need to measure the current through the gauge and/or the voltage drop across the sending unit.
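The "measure it under load" idea is just Ohm's law, R = V / I, evaluated at the gauge's operating current instead of the ohmmeter's tiny test current. A minimal sketch of the arithmetic (the voltage and current values below are made up, purely to show the calculation):

```python
# Effective resistance of the sending unit while the gauge is driving it,
# from a measured voltage drop and current (Ohm's law: R = V / I).
def sender_resistance(v_drop, current):
    """Resistance in ohms, given voltage drop in volts and current in amps."""
    return v_drop / current

# Hypothetical example: 5.3 V measured across the sender at 52 mA.
print(f"{sender_resistance(5.3, 0.052):.1f} ohms")
```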
Anyway here's my chart:
44.0 mA   160 ohms   shows 130 degrees
59.0 mA   102 ohms   shows 170 degrees
85.6 mA    52 ohms   shows 185 degrees
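As a sanity check, if the gauge is fed from the 9.8 V regulated supply mentioned below and the circuit is a simple series loop (supply, gauge coil, sending unit), each row of the chart should imply about the same gauge-side resistance. A quick sketch of that check (the series-circuit assumption is mine, not stated in the original post):

```python
# Implied gauge-side resistance for each chart row, assuming a simple
# series circuit: 9.8 V regulated supply -> gauge coil -> sending unit.
V_SUPPLY = 9.8  # volts, from the regulator mentioned in the post

chart = [          # (current in amps, sender resistance in ohms)
    (0.0440, 160),   # gauge reads 130 degrees
    (0.0590, 102),   # gauge reads 170 degrees
    (0.0856,  52),   # gauge reads 185 degrees
]

for current, r_sender in chart:
    r_total = V_SUPPLY / current    # total series resistance from Ohm's law
    r_gauge = r_total - r_sender    # whatever is left is on the gauge side
    print(f"{current * 1000:.1f} mA -> gauge side ~ {r_gauge:.0f} ohms")
```

All three rows land in the low 60s of ohms, so the chart is at least self-consistent with a series circuit on a 9.8 V supply.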
I ended up using a 51 ohm resistor in place of the sending unit/temperature probe, and calibrated my gauge to read 185.
I chose that value because I had two 102 ohm resistors to wire in parallel. It should also be noted that I am using an LM2910 voltage regulator to hold the gauge's input voltage at 9.8 V. With an ohmmeter, my sending unit measures 75 ohms at 185 degrees.
Your results may vary, but I hope this helps a little.