I'm not sure what type of gauge it is, but it draws about 90 mA when the input is connected directly to 0 V. It's the stock fuel gauge in (what was) a 1980 Renault 5, for what that's worth. It has a total of three terminals: +12 V, 0 V, and the input. It reads "full" with a 0.6 V input and "empty" with a 5.3 V input (that's when 12 V is actually 12 V; at a 13 V supply those full-scale inputs become 0.7 V and 6.0 V, respectively).
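To put rough numbers on what the driver has to handle, here's a back-of-envelope sketch treating the gauge input as a plain resistance to the +12 V rail. That resistor model is purely my assumption (real thermal gauges are nonlinear), but it's consistent with the 90 mA short-circuit figure above:

```python
# Rough model: gauge input behaves like a fixed resistor to the +12 V rail.
# The values are from my measurements; the resistor model is an assumption.

V_SUPPLY = 12.0   # volts
I_SHORT = 0.090   # amps, measured with the input tied directly to 0 V

# Apparent resistance between +12 V and the input terminal (~133 ohms)
r_gauge = V_SUPPLY / I_SHORT

def sink_current(v_in):
    """Current the driving circuit must sink to hold the input at v_in."""
    return (V_SUPPLY - v_in) / r_gauge

for v in (0.6, 5.3):   # "full" and "empty" input voltages
    print(f"input {v:.1f} V -> driver sinks ~{sink_current(v) * 1000:.1f} mA")
```

So even at the "empty" end the driver would still be sinking on the order of 50 mA, if the resistor model holds.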
Is the solution to my problem perhaps as simple as a resistor between the PCB output and 0 V, to give the current somewhere to go, so the op-amp chip can produce the desired voltage without having to sink all the current that such a voltage drop across the gauge necessitates?