
applying 20v to a 12v device


Caeberos

Member
Joined
Jan 17, 2003
Location
New York
Like the title implies, I am currently using a 20v laptop transformer to power a device that has a little sign that says 12v on it. I tried it, and it works, but I am wondering what practical downsides this would present in the future, if any?

No, I have no access to the original 12v source this device requires.

It is a TV to PC converter if anyone is curious.

Thanks in advance
-Cae
 
can you post a link to this device? picture, description?
Most electronic devices have built-in voltage regulators, so they can handle some variation. Overall I wouldn't suggest doing it; you will probably shorten the life of the device.
 
Just watch it for heat. Devices have some tolerance for voltage, but that is a fairly large increase. As Borisw37 said, the device probably has voltage regulators in it, but if they are simple linear regulators they will be dissipating more power and heating up. I would recommend looking for a 12V adapter somewhere. Even if it doesn't have the correct plug, you can get one from RadioShack or an old adapter and splice it in.

If you are going to run it like that, just check it for heat. Is there a current rating for the device too?
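
To put numbers on the heat concern: with a simple linear regulator, everything above the regulator's output voltage is burned off as heat. A quick illustrative Python sketch (the 5V internal rail and 0.5A load are made-up example values, not specs of the actual converter box):

```python
# Power dissipated by a linear regulator: P = (V_in - V_out) * I_load.
# Everything above the regulator's output voltage is turned into heat.

def linear_reg_dissipation(v_in, v_out, i_load):
    """Heat (watts) a linear regulator sheds for a given input voltage."""
    return (v_in - v_out) * i_load

# Hypothetical device: internal 5 V rail, 0.5 A draw.
print(linear_reg_dissipation(12.0, 5.0, 0.5))  # 3.5 W with the intended 12 V supply
print(linear_reg_dissipation(20.0, 5.0, 0.5))  # 7.5 W with the 20 V laptop brick
```

Roughly double the heat inside the same little box, which is why checking it for heat is good advice.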
 
||Console|| said:
Couldn't he use a few resistors to drop that 20v down to 12, or even a transformer?

For a transformer, you would need AC, not DC. A changing magnetic field through a coil induces a voltage, and vice versa: a changing voltage across a coil produces a changing magnetic field. The magnitude of the induced voltage is proportional to the number of loops in the coil.

So basically you have a loop of iron with two sets of coils wrapped around it. The first set is connected to an AC source, and this changing voltage produces a changing magnetic flux through the iron. Since that changing flux also flows through the 2nd set of coils, it induces a voltage there, and that is your output voltage. The voltage ratio is the ratio of turns in the output coil to turns in the input coil; the voltage is the number of turns times the rate of change of the magnetic flux. So in the DC case the magnetic flux does not change, the rate of change is 0, and the voltage across the 2nd set of coils is 0.
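
The turns-ratio relationship described above, as a quick Python sanity check (ideal transformer, AC only; the turn counts are just an example):

```python
# Ideal transformer: V_out = V_in * (N_secondary / N_primary), AC input only.

def transformer_output_voltage(v_in_ac, n_primary, n_secondary):
    """Output voltage of an ideal transformer for an AC input."""
    return v_in_ac * (n_secondary / n_primary)

# 120 VAC stepped down with a 10:1 turns ratio:
print(transformer_output_voltage(120.0, 1000, 100))  # -> 12.0

# For DC the flux never changes, so the induced output is simply 0 V.
```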

The problem with using a resistor is that the current draw of the device changes with time. The voltage drop across the resistor is going to be current x resistance, and that will change with it. He would have to put in a voltage regulator to drop from 20V to 12V without the output voltage changing much. For that much work it would be easier to just buy a 12V wall adapter.
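
Here is the resistor problem in numbers, as a small illustrative Python sketch (the 0.5 A nominal draw is made up):

```python
# Series dropping resistor: the device sees V_supply - I_load * R,
# so the voltage wanders whenever the load current changes.

def device_voltage(v_supply, r_series, i_load):
    """Voltage left for the device after the series resistor drop."""
    return v_supply - i_load * r_series

# Resistor sized to drop 8 V at a nominal 0.5 A: R = 8 V / 0.5 A = 16 ohms.
R = 16.0
print(device_voltage(20.0, R, 0.5))   # 12.0 V at the nominal load
print(device_voltage(20.0, R, 0.25))  # 16.0 V when the load lightens
print(device_voltage(20.0, R, 1.0))   # 4.0 V when the load doubles
```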
 
||Console|| said:
Couldn't he use a few resistors to drop that 20v down to 12, or even a transformer?

Please, for the <3 of g0d, never ever use resistors to drop the voltage to a required level for a power supply. cyberey hit it on the head as to why.

Secondly, for the benefit of all electronics and electrical engineers everywhere: never plug a random DC wall power supply into something just because the connector fits. The connectors are standard; the voltage and current ratings are not. You will kill electronics like this.
 
I found this online, which explains what I was going to type: http://www.montek.com/tutorials/ac_adapters.html

Look around your house for an NES adapter, a Sega adapter, or an adapter from any guitar pedals, old video game systems, or musical instruments. Most of those use a 12V supply. Just make sure the polarity is correct as shown in the link above, and if there is a current rating on the device, make sure the adapter's current rating is greater than or equal to the device's.
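
That checklist boils down to three comparisons; here it is as a small illustrative Python function (the names are my own, not from any standard):

```python
# An adapter substitute is safe when the voltage matches, the polarity
# matches, and its current rating meets or exceeds the device's.

def adapter_ok(adapter_v, adapter_a, adapter_center_pos,
               device_v, device_a, device_center_pos):
    return (adapter_v == device_v
            and adapter_center_pos == device_center_pos
            and adapter_a >= device_a)

# 12 V / 2 A center-positive adapter, 12 V / 1.5 A center-positive device:
print(adapter_ok(12, 2.0, True, 12, 1.5, True))   # True
# Same adapter with reversed polarity:
print(adapter_ok(12, 2.0, False, 12, 1.5, True))  # False
```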
 
Thanks, that link was exactly what I was looking for. I have no idea what the amperage rating on the device is, I found it in a trash can a while back and had no idea it even worked until a few days ago. Time to go digging around for a 12v adaptor

-Cae
 
Caeberos said:
Thanks, that link was exactly what I was looking for. I have no idea what the amperage rating on the device is, I found it in a trash can a while back and had no idea it even worked until a few days ago. Time to go digging around for a 12v adaptor

-Cae

If it has a wattage requirement then you can just divide that by voltage to get amps.
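
That watts-to-amps arithmetic in one line of Python (the 18 W figure is just an example, not the converter's actual rating):

```python
# P = V * I, so I = P / V.
def amps_from_watts(watts, volts):
    return watts / volts

print(amps_from_watts(18.0, 12.0))  # 1.5, so a 12 V adapter rated >= 1.5 A would do
```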

Can't you just change the AC to DC then use a transformer and then have a DC to AC converter?
 
ShadowPho said:
If it has a wattage requirement then you can just divide that by voltage to get amps.

Can't you just change the AC to DC then use a transformer and then have a DC to AC converter?

The only really simple voltage conversions are from AC to AC using a transformer, or from AC to DC using a rectifier and capacitors. Transformers only convert AC to AC. The voltage and current are scaled so that voltage in times current in equals voltage out times current out. This is so the input power and output power are the same and energy is conserved.
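
That power bookkeeping as a quick Python sketch: stepping the voltage down steps the available current up by the same ratio (ideal, lossless transformer assumed):

```python
# Ideal transformer: V_in * I_in == V_out * I_out (energy is conserved).

def transformer_currents(v_in, i_in, turns_ratio):
    """turns_ratio = N_secondary / N_primary; returns (v_out, i_out)."""
    v_out = v_in * turns_ratio
    i_out = i_in / turns_ratio  # current scales the opposite way
    return v_out, i_out

# 120 VAC at 1 A through a 10:1 step-down: roughly 12 V at 10 A available out.
v_out, i_out = transformer_currents(120.0, 1.0, 0.1)
```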

To go from AC to DC you use a rectifier, which is basically diodes configured so that the output is a sine wave that never goes negative. So you have a voltage waveform going from 0 up to whatever the AC peak is. You then add capacitors that filter the output so it looks more like a constant DC voltage. The capacitors store charge and provide the voltage when the output from the rectifier drops.
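
A common back-of-the-envelope for sizing that filter capacitor, sketched in Python (full-wave rectifier assumed; the 4700 uF cap and 1 A load are made-up example values):

```python
# Approximate peak-to-peak ripple on the filter cap: dV = I_load / (2 * f * C).
# (Full-wave rectification doubles the ripple frequency, hence the 2.)

def ripple_voltage(i_load, mains_freq, capacitance):
    """Approximate ripple (volts) left after the filter capacitor."""
    return i_load / (2 * mains_freq * capacitance)

# 1 A load, 60 Hz mains, 4700 uF capacitor:
print(ripple_voltage(1.0, 60.0, 4700e-6))  # ~1.77 V of ripple
```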

To go from DC to DC you need to use a buck, boost, or buck-boost converter, which is more complicated; you need a small amount of circuit theory or physics and simple calculus to understand them.
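
For a taste of the buck-converter math: in the ideal continuous-conduction case the output is just the input scaled by the switching duty cycle. A minimal Python sketch:

```python
# Ideal buck converter (continuous conduction): V_out = D * V_in, 0 <= D <= 1.

def buck_output(v_in, duty_cycle):
    return v_in * duty_cycle

# Dropping 20 V to 12 V needs a duty cycle of 12/20 = 0.6:
print(buck_output(20.0, 0.6))  # ~12.0
```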

To go from DC to AC, you build a circuit that switches the DC signal between positive and negative very quickly, which is a more complicated circuit.

Wikipedia has good articles on DC-to-DC converters if you search for buck converter and so on.

There is a pretty good book on understanding circuits in a simple fashion called 'tabs guide to understand electronics'. There is also another book that is excellent for learning circuit theory starting from nothing; I don't know the exact name offhand and the book is in my office, but if you are interested I can give you the name of it. It is a popular book used in the first engineering class EEs and ECEs take.
 
cyberey66 said:
I found this online that explains what I was going to type, http://www.montek.com/tutorials/ac_adapters.html.

Look around your house for a NES adapter, sega adapter or an adapter from any guitar pedals, old video game systems or music instruments. Most of those use a 12V supply. Just make sure the polarity is correct as shown in the link above and if there is a current rating on the device, make sure the adapter's current rating is greater than or equal to the device.


You're going to want to make sure you are using the correct adapter type (DC or AC). The NES used a 12v AC wall adapter, whereas most modern electronics I've seen all use DC adapters. I'm pretty sure a device that needs a 12vDC input won't be too happy when it gets a 12vAC source.
 