
Resistor help


RIPSTER

Member
Joined
May 1, 2004
Location
England
I just bought 4 resistors to go with the LEDs I salvaged from an LED fan lighting strip. The LEDs run at 3.2v and I'm trying to run them off the 5v line in my Xbox. My question is: what ohm and wattage resistor will I need?

An online calculator said I needed a 90ohm (100 rounded up), 1/8th watt resistor. That was for LEDs rated at 20mA; I wasn't sure what mine were, but the calculator said most standard LEDs are 20mA. The resistors I bought are 100ohm, 0.6 watt. Seeing as the wattage I bought isn't what was specified, will this have a bad effect on the LEDs?

Also, how can I tell which side of the LED is positive and which is negative? I know that usually one side is rounded and one flat, but on these LEDs both legs look identical. I have a multimeter; is there a test which could determine which leg is + and which is -?

thnx

RIPSTER
 

The way I tell which is + and which is -: hook it up!:D If it lights up, there you go. Since LEDs are diodes, they don't conduct in reverse....well, they can explode in reverse if you really overpower them...found that one out the fun way:D
 
LEDs have a round side and a flat side when you look down on the LED from above. The flat side of the rim is the negative leg (cathode) and the round side is the positive (anode). If the rim gives no clue, a multimeter with a diode-test mode will sort it out: the LED lights faintly when the red (+) probe is on the positive leg.

The wattage rating is how much heat/power the resistor can dissipate before it burns up. A higher rating than you need is always fine; in your case the resistor only dissipates a few hundredths of a watt, so a 0.6W part has plenty of margin.
 


The basic formulas you want to use in your application are:

voltage = current x resistance, V = IR

power (Watts) = voltage x current, P = VI


One thing to keep in mind: the resistor only has to drop the difference between the supply voltage and the LED's forward voltage. With your 5v source, a 3.2v LED, and an assumed 20mA:

V across resistor = 5v - 3.2v = 1.8v

R = 1.8v / .020A = 90ohm

That's exactly what your calculator told you, and 100ohm rounded up is fine (the LED just runs a touch dimmer).

The power dissipated in the resistor will be:

P(Watts) = 1.8V x .02A = .036W

1/8W is .125W, so even that rating has margin, and your 0.6W resistors have loads of headroom. Using higher-rated components has no adverse effects.
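If you want to redo the arithmetic for other LEDs, it's easy to script. Here's a minimal Python sketch using the values from this thread (the 20mA figure is the calculator's assumption, not a measured rating):

```python
def led_resistor(v_supply, v_forward, i_led_a):
    """Series resistor for a single LED.

    The resistor drops v_supply - v_forward; returns the
    resistance in ohms and the power it dissipates in watts.
    """
    v_drop = v_supply - v_forward   # voltage across the resistor
    r = v_drop / i_led_a            # Ohm's law: R = V / I
    p = v_drop * i_led_a            # power in the resistor: P = V * I
    return r, p

# Values from this thread: 5V Xbox line, 3.2V LEDs, assumed 20mA
r, p = led_resistor(5.0, 3.2, 0.020)
print(f"R = {r:.0f} ohm, resistor dissipates {p:.3f} W")

# With the 100 ohm resistors actually bought, current drops slightly:
i_actual = (5.0 - 3.2) / 100
print(f"I = {i_actual * 1000:.0f} mA, P = {(5.0 - 3.2) * i_actual:.3f} W")
```

This gives 90 ohm dissipating 0.036W for the ideal case, and about 18mA through the 100 ohm resistors you bought, so both the 1/8W spec and the 0.6W parts are comfortably within ratings.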
 