We're talking about an RC series circuit here... that is, a resistor in series with a capacitor, and a voltage source across them both.
First, the time constant: if you multiply the capacitance of the capacitor (in farads) by the resistance of the resistor (in ohms), you get a number whose units are seconds. This is often written T = R x C.
Initially, the capacitor is uncharged, i.e. it has zero volts across it. When the supply is connected, the capacitor charges up and the voltage across it rises.
Now, it happens that, after one time constant, the voltage across the capacitor has risen to 0.63 of the supply voltage.
Here's a worked example:
A resistor of 1.5k is in series with a capacitor of 100uF. This is placed across a supply voltage of 12V DC.
Time constant = 1500 ohms x 0.0001 F = 0.15 seconds (remembering that 100uF = 0.0001 F)
Voltage across the capacitor after 0.15s = 0.63 x 12V = 7.56V
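If you'd like to check these numbers yourself, here's a quick Python sketch of the same arithmetic (the variable names are just for illustration, not from any particular library):

```python
# RC charging, worked example: 1.5k resistor, 100uF capacitor, 12V supply
R = 1500      # resistance in ohms
C = 100e-6    # capacitance in farads (100uF = 0.0001 F)
Vs = 12.0     # supply voltage in volts

tau = R * C         # time constant in seconds
v_1t = 0.63 * Vs    # voltage after one time constant (the 0.63 rule above)

print(f"Time constant: {tau:.2f} s")             # prints 0.15 s
print(f"V after 1 time constant: {v_1t:.2f} V")  # prints 7.56 V
```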
Now, it gets harder: What is the voltage across the capacitor after 2 time constants?
2 time constants = 2 x 0.15s = 0.30s
V(2T) = 0.63Vs + 0.63(Vs - 0.63Vs)
...which simplifies to...
V(2T) = 0.8631 x Vs = 0.8631 x 12V
V(2T) = 10.3572V
In that second equation, the first term (0.63Vs) is the voltage already on the capacitor after the first time constant. The second term, 0.63(Vs - 0.63Vs), is the extra voltage gained during the second time constant: the capacitor picks up 0.63 of the gap that was left.
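Here's that running-tally idea in Python: in each time constant the capacitor gains 0.63 of whatever voltage is still missing. This is just a sketch of the arithmetic above, using the rounded 0.63 figure:

```python
Vs = 12.0   # supply voltage in volts

v = 0.0     # capacitor starts uncharged
for n in (1, 2):
    v = v + 0.63 * (Vs - v)   # gain 0.63 of the remaining gap
    print(f"After {n} time constant(s): {v:.4f} V")
# Prints 7.5600 V, then 10.3572 V, matching the figures above.
```

(For what it's worth, 0.63 is a rounded value of 1 - e^-1, which is about 0.632. The exact exponential charging law, V(t) = Vs x (1 - e^(-t/RC)), gives about 10.38V after two time constants, so the rounding only costs a couple of hundredths of a volt here.)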
Theoretically, the capacitor never quite charges up to the full supply voltage. In practice, however, we consider it fully charged after 5 time constants, by which point it has reached over 99% of the supply voltage.
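To see why five is the usual rule of thumb, note that after each time constant the remaining gap shrinks to 0.37 (that is, 1 - 0.63) of what it was, so after n time constants the capacitor has reached a fraction 1 - 0.37^n of the supply voltage. A quick check in Python:

```python
# Fraction of the supply voltage reached after n time constants,
# using the 0.63-per-time-constant rule from above
for n in range(1, 6):
    fraction = 1 - (1 - 0.63) ** n
    print(f"After {n} time constant(s): {fraction:.1%} charged")
# After 5 time constants the capacitor is about 99.3% charged,
# which is close enough to call it fully charged in practice.
```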
Does this make things a bit clearer? Let us know if you need more help.