I also could not understand why the Volt gauge presented a reading so much lower than what my Fluke multimeter was telling me....

Having checked voltages with my Fluke multimeter, I concluded that the Volt gauge was basically grossly under-reading.

**It was on the basis of these incorrect gauge readings that I decided to upgrade the alternator in the first place!!!**

Having now studied the wiring diagram of the car, and made a few more measurements, I concluded that the gauge was measuring the voltage at a spot which was rather INAPPROPRIATE from an electronic perspective.

First though, a tiny lesson in electronics:

Volts = Current (amps) x Resistance (ohms)

So:

If you multiply tiny amps by tiny resistance, you get tiny volts.

E.g. 0.1 Amps x 0.1 Ohm = 0.01V
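The little lesson above can be sketched in a couple of lines of Python (same illustrative numbers as the example):

```python
# Ohm's law: V = I * R
current_a = 0.1       # tiny current (amps)
resistance_ohm = 0.1  # tiny resistance (ohms)
voltage_v = current_a * resistance_ohm
print(f"{voltage_v:.2f} V")  # tiny volts: 0.01 V
```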

End of electronics lesson. Back to the Volt gauge discrepancy:

1) The Volt meter measures the voltage on the 12V input terminals of the "voltage stabiliser". This is not a good place to measure "Battery" volts because;

2) If you follow the source of that 12V back to the terminal block next to the battery, there are 24 (twenty four!!) electrical contacts in the circuit from the terminal block (next to the battery, where the alternator output connects) to the volt meter. This is a problem because EACH one of those 24 contacts has a tiny amount of resistance (remember the lesson??);

3) The volt meter itself has a resistance of just 120 Ohm. This is bad, as the volt meter alone draws approx 0.1A from the power source. This 0.1A is no longer "tiny"; it falls more into the "smallish amount" category. This current passes through all 24 of those contacts (each with tiny resistance), causing a measurable voltage drop over each of those contacts, which adds up;

4) The Volt gauge is not the only thing drawing current, though! The Fuel tank gauge draws current, the Coolant sensor draws current and the oil pressure sensor draws current - all from the same terminal on the voltage stabilizer and through the same 24 contact points! If you combine these currents with that being drawn by the Volt gauge, the voltage drop over those 24 contacts combined becomes quite substantial. (I am still ignoring the voltage drop over the wires and over the fuse.)

So, if we go back to the electronics lesson:

Let's assume the other gauges also draw 0.1 Amp (collectively), so we have a total of 0.2 Amp being drawn through 0.1 Ohm contacts. Now we have 0.2 (amp) x 0.1 (ohm) = 0.02V PER CONTACT. So how is this a problem? Well, there are 24 of them, so that 0.02V must be multiplied by 24 to get the total voltage loss to the Volt gauge, which comes to 0.48V!! SO....... If the alternator is putting out 14V, the gauge would only measure 13.52V!!

(The actual numbers above are not exact; they are used only to demonstrate the principle.)
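The whole worked example can be put into a short Python sketch. The per-contact resistance and the "other gauges" current are the same assumed round figures as above, not measured values:

```python
# Illustrative voltage-drop estimate (assumed numbers, as in the text)
supply_v = 14.0        # alternator output
contact_r_ohm = 0.1    # assumed resistance of each contact
num_contacts = 24      # contacts between terminal block and Volt gauge
total_current_a = 0.2  # Volt gauge (~0.1A) plus other gauges (~0.1A)

drop_per_contact_v = total_current_a * contact_r_ohm  # 0.02 V
total_drop_v = drop_per_contact_v * num_contacts      # 0.48 V
gauge_reading_v = supply_v - total_drop_v             # 13.52 V

print(f"Drop per contact: {drop_per_contact_v:.2f} V")
print(f"Total drop over {num_contacts} contacts: {total_drop_v:.2f} V")
print(f"Gauge would read: {gauge_reading_v:.2f} V")
```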

So what is the solution?

1) Remove the terminal with the green wire from the Volt gauge and tape it up so it can't accidentally make contact with anything else;

2) Install a wire terminated with FULLY INSULATED 6mm FEMALE spade terminals at both ends from the Volt gauge to the RIGHT HAND SIDE of the fuse block as shown in the picture below. It's the thin green wire with the red terminal. This position is STILL not ideal, as there are still 10 contacts between the 12V source and the measuring point, but at least the number of contacts is now less than half, and likewise the effect. I still find a 0.15V difference between the value being measured and the actual voltage (as measured by the multimeter). But at least the "Battery" gauge is now reporting a situation consistent with what my Fluke is telling me.

I have also found the Volt gauge to be a very crude bi-metallic instrument. This is why the readings being displayed are not very accurate or consistent. Tapping the lens can cause the needle to move higher up the scale. This is what it looks like inside: