If you had actually read the references I provided, you would have learned that an ammeter capable of reading several Amps is actually a mV or mA meter movement with a resistive shunt across it. You would have learned that a voltmeter capable of reading several tens of volts is actually a mV or mA meter movement with a series resistor. No opAmps required!
I apologize for not reading the references before replying; I was on a mobile device and intended only to steer the conversation, but I think my short reply was misconstrued.
Anydangway, yes, I am aware of how to construct a voltmeter/ammeter with series/parallel resistors. However, these are nowhere close to "ideal." The information I am looking for is the basic operation of an "amplified voltmeter," as it is referred to in your linked reference (page 259 of the PDF, page 249 as labeled on the actual text scan). They merely mention its existence, not how it works.
The point is, a resistor and a gauge will work, but with a 1 mA gauge and just a resistor, that 1 mA has to come from somewhere, and it's going to come from your circuit, effectively changing the load. Enter op-amps, stage right: they have a very high input impedance, and thus approach "ideal" status.
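To put rough numbers on the loading argument above, here is a quick back-of-the-envelope sketch. The resistances used (a 1 mA movement scaled for 10 V full scale, a 1 kΩ source resistance, and a ~1 TΩ FET-input buffer) are illustrative assumptions, not measurements:

```python
# Hypothetical numbers: how much a bare 1 mA meter movement loads the
# circuit under test, versus an op-amp buffered front end.

def loading_error(v_source, r_source, r_meter):
    """Fraction of the true voltage lost to the divider formed by the
    source resistance and the meter's input resistance."""
    v_read = v_source * r_meter / (r_source + r_meter)
    return (v_source - v_read) / v_source

# A 1 mA movement scaled for 10 V full scale needs a total series
# resistance of 10 V / 1 mA = 10 kOhm, so the meter looks like 10 kOhm.
r_bare = 10e3
# A typical FET-input op-amp buffer presents on the order of 1e12 Ohm
# (assumed ballpark figure).
r_buffered = 1e12

# Measuring a node behind a 1 kOhm source resistance:
print(loading_error(10.0, 1e3, r_bare))      # ~0.09, i.e. reads about 9% low
print(loading_error(10.0, 1e3, r_buffered))  # ~1e-9, negligible
```

The point the arithmetic makes: the error is just the voltage divider formed by the source resistance and the meter, so raising the meter's input resistance by orders of magnitude (the op-amp buffer) makes the loading vanish.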
Seeing as I want to design an adjustable power supply, having something on it that draws current other than the circuit I am powering is not a good idea. It would also need to be manually range-switched so I don't blow up the meter, which isn't very safe. What if I hooked something up wrong in the circuit I'm powering and accidentally dumped a ton of power into my volt/ammeter?
I can at least imagine how an amplified voltmeter works, given that op-amps have high input impedance, but what about an "amplified ammeter"? I suppose I could put something like a 0.01 or 0.001 Ω precision resistor in one of my output lines and stick a voltmeter across it, but is this how it's done in "real" ammeters?
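The shunt-plus-amplifier idea sketched above works out numerically like this. All values here (0.01 Ω shunt, 10 A and 10 V full scale) are illustrative assumptions for the arithmetic, not a tested design:

```python
# Sketch of the "amplified ammeter" idea: a small shunt resistor in the
# output line, with an amplifier reading the voltage drop across it.
# All component values are illustrative assumptions.

R_SHUNT = 0.01        # ohms, precision shunt in series with the load
I_FS = 10.0           # amps full scale we want to display
V_METER_FS = 10.0     # volts full scale expected by the meter circuit

v_shunt_fs = I_FS * R_SHUNT      # 0.1 V across the shunt at full load
gain = V_METER_FS / v_shunt_fs   # amplifier gain needed: 100

def displayed_current(v_amp_out, gain=gain, r_shunt=R_SHUNT):
    """Convert the amplified shunt voltage back to amps."""
    return v_amp_out / (gain * r_shunt)

print(v_shunt_fs)              # 0.1
print(gain)                    # 100.0
print(displayed_current(5.0))  # 5.0
```

In other words, a gain-of-100 stage turns the 0–100 mV shunt drop into a 0–10 V signal a cheap meter can read, without the shunt itself needing to be any larger.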
0.01 Ω would probably be acceptable: at a 10 A load it would drop 100 mV, which seems like a lot, but something drawing 10 A probably doesn't need to be metered to within 100 mV. I won't be running precision control systems off this power supply or anything. I would, however, like it to be robust and safe (safe as in not going to destroy itself or the circuits I hook up to it if I do something wrong; it's my circuits I care about, I couldn't care less about my health =P).

This is a combination design/learning experience for me as well as a way to end up with something fairly useful, so once I'm done with it I want it to be useful. I'm just trying to apply some of the theory learned in class in a practical manner; actual hands-on experience helps understanding a lot. A power supply has the potential for all sorts of applications: the actual power regulation, the metering of it, safety circuits, and I'm sure I can incorporate things like comparators to give me warning lights (just throwing out ideas). Eventually I'd like to add a signal generator, probably a Wien bridge oscillator core for sine wave production, then integrators and such to get square/triangle/sawtooth output, and of course amplification, which will be really fun. My own personal lab course that will teach me a lot about many different subjects I have studied.
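One more sanity check worth doing on the shunt numbers is the power dissipation, since the shunt sits in the main output line. This is just the arithmetic from the figures above; the 3 W rating mentioned in the comment is an assumed safety margin, not a spec:

```python
# Sanity check on the 0.01 ohm shunt at full load:
# voltage lost and heat dissipated.

I = 10.0    # amps through the shunt at full load
R = 0.01    # ohms

v_drop = I * R       # 0.1 V lost to the shunt, as noted above
p_shunt = I**2 * R   # 1.0 W dissipated as heat; the shunt needs a power
                     # rating comfortably above this, e.g. a 3 W part
                     # (assumed margin, pick to suit)

print(v_drop)   # 0.1
print(p_shunt)  # 1.0
```

A full watt in a precision resistor is also a self-heating concern: shunt resistance drifting with temperature shows up directly as a gain error in the current reading, which is one argument for a low-tempco shunt over-rated for the job.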
Sorry for the long-winded reply. I'm just looking for people who may have built their own supply with volt/ammeters that needed to be close to ideal, or who otherwise have experience with that sort of thing.