If the LED (or whatever component) takes less voltage than the line you're going to run it from, you need to calculate how many ohms of resistance it takes to drop the excess voltage. Say you want to run a 2.5 V LED on a 12 V line. First, take 12 - 2.5 = 9.5 V that you need to drop. Next, use the LED's current. Say it's a 25 mA LED. You take 9.5 / 0.025 = 380 ohms. This is just Ohm's law, E / I = R, where E is volts, I is amps, and R is ohms.
So, in summary, you take the difference in voltage divided by the current needed in amps. If the current is given in mA, divide that number (25) by 1000 to get amps: 0.025.
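The arithmetic above is simple enough to put in a tiny function (the function name here is just for illustration):

```python
def series_resistor_ohms(supply_v, led_v, led_current_a):
    """Resistance needed to drop the excess voltage: R = E / I."""
    return (supply_v - led_v) / led_current_a

# 2.5 V LED drawing 25 mA on a 12 V line:
r = series_resistor_ohms(12, 2.5, 25 / 1000)
print(r)  # 380.0
```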
Another bit of info: you can use more than 380 ohms of resistance, but if you use less you run the danger of burning out the LED. Not a big deal, but a pain when it's the only one you have and you need to run out to RadioShack to get another. If you need 380 ohms, the LED won't dim much if you use, say, 500. I say this because sometimes it's just easier to throw on two 250-ohm resistors in series than to chase the exact value with something like three 100s, a 50, and three 10s. It's also good to have a little extra in case the voltage spikes a bit, or the resistors don't completely do their job: most have a 5% tolerance, so even if you had a nominal 380-ohm resistor, its actual value could be anywhere between 361 and 399 ohms. Nothing that will blow up the LED, but if you get into more sensitive electronics, this is something you may need to factor in.
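To make the tolerance and series-resistor points concrete, here's a quick sketch (again, the function name is just for illustration):

```python
def tolerance_band(nominal_ohms, tolerance=0.05):
    """Range of actual resistance for a resistor of the given tolerance."""
    return nominal_ohms * (1 - tolerance), nominal_ohms * (1 + tolerance)

low, high = tolerance_band(380)  # roughly 361 and 399 ohms for a 5% part
print(low, high)

# Resistors in series simply add, so two 250-ohm parts give:
total = 250 + 250  # 500 ohms, comfortably above the 380 needed
print(total)
```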