first, dBu is dB referenced to 0.775Vrms (the voltage that delivers 1mW into 600 ohms), not 1 microvolt -- that's dBuV, a different unit. dBu is commonly used for quoting noise levels.
a more convenient reference for line level is dBV, or dB referenced to a volt.
usually (but not always) an amplifier has its gain set so that 1Vrms drives the amp to its rated output. this number varies from manufacturer to manufacturer, but roughly 90% of amplifier manufacturers set the gain so the amp hits full rated power with 1Vrms in. so for a 100W amp, where 100W into 8 ohms means 28.3Vrms at the output, that sets the voltage gain at about 28 (roughly 29dB). this of course does not include any level controls on the amp, which in high end audio and pro audio usually don't exist.
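a quick sketch of that arithmetic (the function names here are just for illustration):

```python
import math

def rated_output_voltage(power_w, load_ohms):
    """Vrms across the load at rated power, from P = V^2 / R."""
    return math.sqrt(power_w * load_ohms)

def voltage_gain(v_out, v_in=1.0):
    """Linear voltage gain and the same gain in dB (20*log10)."""
    g = v_out / v_in
    return g, 20 * math.log10(g)

v = rated_output_voltage(100, 8)   # ~28.3 Vrms for 100W into 8 ohms
g, g_db = voltage_gain(v)          # gain ~28, ~29 dB with 1Vrms in
```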
so this is a de facto industry standard, but not everybody adheres to it.
so the answer to the OP is that most of the industry considers 0dBV (1Vrms) to be the "standard" line level. at an average listening level you probably have more like 100mVrms (-20dBV) on the signal lines. some preamps can put out as much as 3 or 4 volts, but that's overkill.