It's a bit two-faced: on a fundamental level the voltage itself isn't actually amplified, and neither is the current. But the end result is an output voltage that is some fixed multiple of the input under a given set of conditions.
"Amplify" suggests that a low value is raised to a higher value without loss, but in the real world large amounts of energy are spent to change the 'window' through which we look at the original signal. What has occurred is not, strictly speaking, amplification, but a carefully controlled transfer of the state of the original signal into some other form of signal that may be 'more' useful than the original in a practical sense. The current-gain ratio of a transistor, applied across static impedances, is practical voltage amplification. I say "following" because nothing is actually 'transferred': a small current in the base follows the input and sets how much current the E-C path will pass, but the actual output current comes from the voltage applied across the E-C junction itself, less physical losses.
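To make that "current gain across static impedances" idea concrete, here's a rough back-of-the-envelope sketch in Python. All the component values (beta, the base-side impedance, the collector resistor) are made up purely for illustration of a simple common-emitter-style stage, not pulled from any real circuit:

```
# Minimal sketch: how a transistor's current gain becomes practical
# voltage gain. Hypothetical values, chosen only to show the arithmetic.

beta = 100           # current gain (hFE): collector current per unit base current
r_base = 10_000      # ohms, impedance the input signal drives on the base side
r_collector = 4_700  # ohms, static load the collector current develops a voltage across

v_in = 0.01  # volts of small-signal swing at the input

# A small base current follows the input voltage...
i_base = v_in / r_base

# ...and the transistor lets beta times that current flow from the
# supply through the collector load. The energy comes from the supply
# rail, not from the input signal itself.
i_collector = beta * i_base

# That larger current across the static collector impedance appears
# as a larger voltage swing: "amplification" in the practical sense.
v_out = i_collector * r_collector

print(f"input swing : {v_in * 1000:.1f} mV")
print(f"output swing: {v_out * 1000:.1f} mV")
print(f"voltage gain: {v_out / v_in:.1f}x (= beta * Rc / Rb = {beta * r_collector / r_base:.1f})")
```

With these made-up numbers a 10 mV input swing becomes a 470 mV output swing, a gain of 47x, yet no individual volt or amp was ever "raised": the input merely steered a larger current that the supply was providing anyway.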
I didn't go to school for any of this, but I've heard on multiple occasions that the best way to teach electronics is to lie first and explain the details of the lie later, because they're too complex to introduce right off the bat. You'll find that a lot of the wording used for various devices is a far cry from the physics of the actual effect. The best example I can give is that the "positive" current of practical circuit convention is, on a physical level, a flow of negatively charged electrons (a surplus of electrons) moving the opposite way. Look up the physics of a basic diode, then look up the physics of a transistor. Great reading, but it won't teach you any practical circuits; it'll just give you a better idea of what's really going on.