Sounds like homework but let's see if I get it right...
500kbit/sec = 500*1024bit/sec = 512000bit/sec
(Note: in data communications "k" often means 1000, which would give
500000bit/sec; I'll use 1024 here.)
If you can transfer 512000 bits in one second, the transmission time of a
single bit is 1sec/512000 = 0.000001953sec = 1.953 microseconds (usec).
This is the "width" or duration of the pulse for one single bit.
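The bit-duration arithmetic above can be sketched in a few lines of Python (the 1 kbit = 1024 bit convention from above is an assumption, not a standard):

```python
# Transmission time of a single bit at 500 kbit/s,
# taking 1 kbit = 1024 bits as in the post.
rate = 500 * 1024           # 512000 bit/s
bit_time = 1 / rate         # seconds per bit
print(f"{bit_time * 1e6:.3f} us")  # → 1.953 us
```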
Since this also has to travel over the distance, we need to calculate
propagation time and simply add the two up.
c is the speed of light, about 300000km/sec (more precisely 299792km/sec).
300000km=300000000m
The signal travels at 0.8c, so
300000000m/sec*0.8 = 240000000m/sec
If in one second the signal can cross a distance of 240000000 meters,
then a 10 meter trip will take about:
10m/(240000000m/sec)=0.0000000417 sec, or 0.042usec.
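The propagation-delay step looks like this in Python, using the round figure c ≈ 3.0e8 m/s (the exact value is 299792458 m/s, which changes nothing at this precision):

```python
# Propagation delay over 10 m at 0.8c.
c = 3.0e8                   # speed of light, m/s (rounded)
v = 0.8 * c                 # signal speed in the medium: 2.4e8 m/s
prop_delay = 10 / v         # seconds to cover 10 m
print(f"{prop_delay * 1e6:.3f} us")  # → 0.042 us
```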
So the time to transfer one single bit is 1.953+0.042 = 1.995usec.
Now all you need to do is figure out how many bits you can transfer
during the specified period. It's all basic math...