If you happen to look at some "cookbook" type books out there, you'll be in over your head pretty quickly. Most of them are just big collections of circuits with slight variations and a couple sentences to explain them - the expectation is that you have enough theory under your belt so that you'll understand the circuit and remember it. The next time you have a similar task, you'll have a rough idea as to the easiest way to go. I'd guess that a good fraction of the books out there are of this type (probably because they're so easy to put together...)
In general, read up on Ohm's law, power, Kirchhoff's laws, RC exponential time constants (i.e. the 63%/37% thing), and transistors (BJT, MOSFET, forward voltage, beta, saturation, gate voltage), and you'll have the *essential bits*. That, and learning how to read (and draw) schematics. At the introductory level, the most complex math is plain algebra, along with a lot of "engineering approximation" (i.e. somewhere in the ballpark). Later on you'll learn when to break out the calculator/spreadsheets/Bode diagrams/etc., and when to just throw in a 1K resistor because it's "close 'nuff".
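Those 63%/37% numbers fall straight out of the exponential: after one time constant (tau = R times C), a charging capacitor is at about 63% of the supply voltage, and a discharging one has about 37% left. A quick sketch of the math (the 1K/100uF values are just picked for illustration):

```python
import math

def rc_charge_fraction(t, r, c):
    """Fraction of supply voltage across a charging capacitor at time t: 1 - e^(-t/RC)."""
    return 1 - math.exp(-t / (r * c))

def rc_discharge_fraction(t, r, c):
    """Fraction of initial voltage left on a discharging capacitor at time t: e^(-t/RC)."""
    return math.exp(-t / (r * c))

# Illustrative values: 1K resistor, 100 uF capacitor -> tau = 0.1 seconds
R, C = 1_000, 100e-6
tau = R * C

print(round(rc_charge_fraction(tau, R, C) * 100))     # ~63% charged after one tau
print(round(rc_discharge_fraction(tau, R, C) * 100))  # ~37% remaining after one tau
```

Handy rule of thumb from the same math: after five time constants the cap is within about 1% of fully charged (or discharged), which is usually "close 'nuff".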
Once you have the fundamentals down, figure out whether you want to continue along the analog electronics path, do digital electronics, or just dive into the software/firmware/microcontroller side. In terms of real-life "interesting" projects, microcontrollers are pretty much required, but you'll never get there if you don't know the basics...