Hi,
Do you know how to solve this if the first transistor was not in the circuit?
That is, the input signal (+5v) would connect directly to the base of the second transistor and the first transistor is removed from the circuit.
Also, the specified voltages do not change, so they can be treated as constants. The base-emitter voltages are specified as 0.7 volts each, and the input voltage is specified as +5 volts, so neither of those changes. Since the two base-emitter diodes are in series, the emitter of the second transistor sits at the input voltage minus the two base-emitter drops. The voltage across the emitter resistor therefore follows from a simple subtraction.
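That subtraction can be sketched as a few lines of Python. The input and base-emitter voltages come from the problem statement; the emitter resistor value is just an assumed example figure, since it was not given here:

```python
# VIN and VBE are specified in the problem; R_E is an assumed example value.
VIN = 5.0        # input voltage, volts (specified)
VBE = 0.7        # base-emitter drop per transistor, volts (specified)
R_E = 1000.0     # emitter resistor, ohms (example figure, not from the problem)

# Two base-emitter diodes in series between the input and the second emitter,
# so subtract two drops to get the emitter voltage.
v_emitter = VIN - 2 * VBE          # 5.0 - 1.4 = 3.6 V across R_E
i_emitter = v_emitter / R_E        # Ohm's law gives the emitter current

print(f"Voltage across R_E: {v_emitter:.1f} V")
print(f"Emitter current: {i_emitter * 1000:.1f} mA")
```

For the single-transistor version you asked about, you would subtract only one 0.7 V drop instead of two.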
The single-transistor circuit may be easier for you to start with. The dual-transistor circuit is almost the same, except that the current gains combine (they effectively multiply) and the base-emitter drops add up.
The collector current is the base current times Beta, and the emitter current is the base current times (Beta + 1). So the emitter current is:
Ie=Ib*(Beta+1)
so the base current is not Ie/Beta, it is Ie/(Beta+1).
The reason is that the emitter current is the sum of the collector current (Ib*Beta) and the base current (Ib), which equals:
Ib*Beta+Ib
which factored algebraically is the same as:
Ib*(Beta+1)
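You can check the relation above numerically. The Beta and base-current values here are just illustrative figures, not from the problem:

```python
# Illustrative figures (assumed, not from the problem statement).
beta = 100.0
i_b = 36e-6                      # example base current, amps

i_c = i_b * beta                 # collector current = Ib * Beta
i_e = i_b * (beta + 1)           # emitter current = Ib * (Beta + 1)

# Sanity checks: Ie is the sum Ic + Ib, and the base current is
# recovered as Ie/(Beta+1), not Ie/Beta.
assert abs(i_e - (i_c + i_b)) < 1e-12
assert abs(i_e / (beta + 1) - i_b) < 1e-12

print(f"Ic = {i_c * 1000:.3f} mA, Ie = {i_e * 1000:.3f} mA")
```

Note that dividing Ie by Beta instead of (Beta + 1) would overestimate the base current slightly, which is exactly the distinction made above.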
Once you have calculated everything required, run back through the circuit and make sure the voltages and currents are consistent with what you actually calculated.