The efficiency of a simple buck depends on the ratio between input and output voltage. As the output voltage gets lower, the duty cycle is reduced, which means the peak current needs to be higher for the same output current. This means that the inductor's current rating needs to increase and the switching losses also increase. I would expect a 12V to 3.3V converter to be much less efficient than an 18V to 12V converter if they're both using the same buck topology.
In my opinion, 74% wouldn't be bad for a regulator converting 12V to 3.3V, but it's pretty poor for an 18V to 12V converter. To put this into perspective, a linear regulator converting 18V to 12V would give an efficiency of 66.67%, which is only about 7 percentage points worse than this design.
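If you want to sanity-check that comparison, here's a minimal Python sketch (the 74% figure is the one quoted above; everything else follows from Vout/Vin):

```python
# Minimal sketch, assuming the values quoted above.
V_IN, V_OUT = 18.0, 12.0
ETA_BUCK = 0.74  # efficiency of the buck design in question

# A linear regulator's efficiency is just Vout/Vin (ignoring quiescent current).
eta_linear = V_OUT / V_IN

print(f"linear regulator: {eta_linear:.1%}")            # 66.7%
print(f"buck advantage:   {ETA_BUCK - eta_linear:.1%}") # ~7.3 points
```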
At 50W output your regulator is wasting 17.76W as heat, which will require a reasonably sized heatsink. If that isn't a problem for you, then fine, but it shouldn't be hard to halve the losses if you improve the layout and use a larger inductor.
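A quick Python check of that heat figure (the exact measured efficiency isn't stated here, so I've shown both the rounded 74% and the ~73.8% implied by the 17.76W loss):

```python
# P_loss = P_out/eta - P_out, for a 50 W output.
P_OUT = 50.0

def heat_loss(eta):
    """Heat dissipated in the converter at efficiency eta."""
    return P_OUT / eta - P_OUT

print(f"{heat_loss(0.74):.2f} W")   # ~17.57 W at the rounded 74% figure
print(f"{heat_loss(0.738):.2f} W")  # ~17.75 W, close to the 17.76 W above
```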
50W at 12V is 4.167A.
12V is 66.67% of the input voltage, therefore the duty cycle will be about 66.67%, but in practice it'll be a bit higher than that (to make up for the switch and diode drops and other losses).
For an output current of 4.167A, the peak current will be 4.167 / 0.6667 = 6.25A, so make sure your inductor is rated for a much higher current: use an 8A inductor at minimum.
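Here's the same arithmetic as a small Python sketch (values taken from above; the peak figure uses my I_out/D estimate as a sizing margin, not an exact ripple calculation):

```python
# Back-of-envelope current sizing: 18 V in, 12 V out, 50 W load.
V_IN, V_OUT, P_OUT = 18.0, 12.0, 50.0

i_out = P_OUT / V_OUT   # average load current
duty = V_OUT / V_IN     # ideal duty cycle (a bit higher in practice)
i_peak = i_out / duty   # conservative peak estimate, I_out / D

print(f"I_out  = {i_out:.3f} A")   # 4.167 A
print(f"D      = {duty:.1%}")      # 66.7%
print(f"I_peak = {i_peak:.2f} A")  # 6.25 A -> use an 8 A inductor minimum
```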
Increase the width of all the power-carrying PCB traces to lower their inductance and their I²R losses. Keep all the traces as short as possible and use a ground plane, if you can.
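To get a feel for how much trace width matters, here's a rough Python estimate using standard 1oz copper; the trace dimensions are purely illustrative, not taken from your board:

```python
# Quick I^2*R check for a PCB trace, using standard copper constants.
RHO_CU = 1.68e-8   # copper resistivity, ohm*m
T_COPPER = 35e-6   # 1 oz/ft^2 copper thickness, ~35 um

def trace_loss(length_m, width_m, current_a):
    """Resistance and I^2*R loss of a straight trace of given size."""
    r = RHO_CU * length_m / (T_COPPER * width_m)
    return r, current_a**2 * r

# Illustrative example: 50 mm long, 2 mm wide trace carrying the 4.17 A load.
r, p = trace_loss(0.05, 2e-3, 4.17)
print(f"R = {r*1e3:.1f} mOhm, loss = {p*1e3:.0f} mW")  # ~12 mOhm, ~209 mW
```

Doubling the trace width halves both the resistance and the loss, which is why wide, short traces pay off on a board carrying several amps.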