why is heat bad for hardware
- bum
- Joined: Sat Nov 08, 2003 9:56 pm
why is heat bad for hardware
ok, i know this question may sound stupid, but i'm gonna ask anyway.
why exactly is heat bad for hardware? i mean, everyone knows it's bad, everyone knows that fans, heatsinks and liquid cooling solutions make things colder, but why exactly is heat so bad? what does it do to freeze or screw up components? and as the age-old question goes, if chicken can be cooked at 180C and still be good, then why can't my silicon?
- madmag9999
- Joined: Sun Aug 10, 2003 11:50 pm
- Status: Engaged
- Location: Pennsylvania
because heat increases electrical resistance (friction for the electrons, basically) and slows down the switching of the transistors. the colder you get those transistors, the faster they can switch.
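to put a very rough number on that: metal resistance rises roughly linearly with temperature. the python sketch below is just a back-of-the-envelope illustration using the textbook coefficient for copper and some made-up temperatures, not anything exact for a real chip:

# rough sketch: metal resistance vs temperature, linear approximation
# R(T) ~= R_ref * (1 + alpha * (T - T_ref)); alpha for copper is about 0.0039 per deg C

ALPHA_COPPER = 0.0039   # 1/degC, textbook value for copper
T_REF = 20.0            # degC, reference temperature

def relative_resistance(temp_c, alpha=ALPHA_COPPER, t_ref=T_REF):
    """Resistance relative to what it is at the reference temperature."""
    return 1.0 + alpha * (temp_c - t_ref)

for temp in (-30, 20, 50, 80):
    print(f"{temp:4d} C -> {relative_resistance(temp):.2f}x the 20 C resistance")

so going from 20 C up to 80 C bumps the wire resistance by roughly 20-25%, and chilling it to -30 C drops it by about as much - which is part (though only part) of why cold silicon can usually be clocked faster.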
Moonslayer's Guide to a-m-v.org | AD & ErMaC's Guides to Audio & Video
"I'm sorry but i don't trust anything that bleeds for 5 days and doesn't die."
"I'm sorry but i don't trust anything that bleeds for 5 days and doesn't die."
-
- Joined: Wed May 16, 2001 11:20 pm
(Calling dwchang!! Or mneideng if he ever comes here...)
Try this, bum. Heat causes almost all materials to expand in physical size. That includes the metal wires - if you can call them that, being as small as they are - and junctions in the chips. If they expand enough to cross the space between each other, electrical shorts can occur.
I'm not sure how true that is, but it's the first thing that comes to mind. Physical expansion at nano-scales can be deadly.
This article is much more detailed, but probably uses language that only a few of us can understand. (Not to mention that the English is imperfect.)
The two major problems he describes are:
(1) Electromigration...which, as I understand it, means the current flow slowly knocks metal atoms out of place, so the "wires" essentially start evaporating. Thinner wires mean higher resistance, which means lower performance and eventually outright breaks. (There's a rough-numbers sketch of this right after the list.)
(2) Oxidation...which happens when heat promotes "dirt" - for lack of a simpler word - to accumulate inside a transistor. A transistor's performance will degrade and possibly fail outright, and there are millions of them inside a typical CPU.
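To put rough numbers on the electromigration point: chip designers usually model interconnect lifetime with Black's equation, MTTF = A * J^(-n) * exp(Ea / (k*T)). The Python sketch below only compares two temperatures at the same current density, so the unknown constants cancel out; the 0.7 eV activation energy and the 60 C / 90 C temperatures are assumed "typical" values, not numbers taken from the article.

import math

# Black's equation: MTTF = A * J**(-n) * exp(Ea / (k * T)).
# Comparing two temperatures at the same current density, A and J cancel.

K_BOLTZMANN_EV = 8.617e-5   # Boltzmann constant, eV per kelvin
EA_EV = 0.7                 # assumed activation energy (typical textbook value)

def lifetime_ratio(cool_c, hot_c, ea=EA_EV):
    """How many times longer the wiring should last at cool_c than at hot_c."""
    t_cool = cool_c + 273.15
    t_hot = hot_c + 273.15
    return math.exp(ea / K_BOLTZMANN_EV * (1.0 / t_cool - 1.0 / t_hot))

print(f"60 C vs 90 C: roughly {lifetime_ratio(60, 90):.1f}x longer wire lifetime")

With those assumptions, running 30 degrees hotter costs most of an order of magnitude in electromigration lifetime - which is why sustained high temperatures matter even if the chip never actually crashes.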
- bum
- Joined: Sat Nov 08, 2003 9:56 pm
thanks for that, taran and madman. so based on what you told me, if a chip runs at dangerously high temps for a long period of time, then the cpu could actually start eroding away. hmmm, this is interesting.
but on the other hand, what if a cpu gets too cold? freon coolers can take cpus down to -30C (-22F) and lower. could a cpu get so cold that it slows down the movement of electrons, or possibly something worse?
- Corran
- Joined: Mon Oct 14, 2002 7:40 pm
When approaching Absolute Zero (-273 degrees C) it would become a superconductor.
http://www.ornl.gov/info/reports/m/ornlm3063r1/pt2.html
Heike Kamerlingh Onnes recognized the importance of his discovery to the scientific community as well as its commercial potential. An electrical conductor with no resistance could carry current any distance with no losses. In one of Onnes's experiments he started a current flowing through a loop of lead wire cooled to 4 K. A year later the current was still flowing without significant current loss. Onnes found that the superconductor exhibited what he called persistent currents, electric currents that continued to flow without an electric potential driving them. Onnes had discovered superconductivity, and was awarded the Nobel Prize in 1913.
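The "still flowing a year later" part is easier to see with a toy calculation. Current trapped in a shorted loop decays as I(t) = I0 * exp(-t * R / L); with any ordinary resistance it is gone almost instantly, and with R = 0 it never decays. The inductance and resistance in this Python sketch are made-up but plausible values, just to show the contrast:

import math

# Current decay in a shorted loop: I(t) = I0 * exp(-t * R / L)

INDUCTANCE_H = 1e-6      # henries (assumed value for a small wire loop)
RESISTANCE_OHM = 1e-3    # ohms for ordinary metal (assumed value)

def remaining_fraction(t_seconds, resistance=RESISTANCE_OHM, inductance=INDUCTANCE_H):
    """Fraction of the starting current left after t_seconds."""
    return math.exp(-t_seconds * resistance / inductance)

print(f"normal metal, after 10 ms:            {remaining_fraction(0.01):.1e} of the current left")
print(f"superconductor (R = 0), after a year: {remaining_fraction(3.15e7, resistance=0.0):.1f} of the current left")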
- bum
- Joined: Sat Nov 08, 2003 9:56 pm
WTMOTHERF? ok wait, so theoretically, if a cpu were cooled to -273C (presumed absolute zero), then it could have limitless speed, or to be more precise, it could run at the speed of light (300,000 km per second, or 186,420 miles per second, which is the presumed speed limit of the universe) and therefore be the most powerful single processor possible. i know this is all very theoretical and i could be missing a lot of info as i ramble on about this, but, disregarding the difficulty of the task and conflicts with other physical forces, would what i just suggested be possible?
-
- Joined: Wed May 16, 2001 11:20 pm
Maybe at absolute zero, nothing moves at all. On the other hand, if the processor is doing anything, it'll make some heat, so you can never really get to the lowest temp.
People are trying to develop optical computers, either all-optics or optical hybrids. Something like that would run at nearly c - which electronic computers can't do. A Google search on "optical computer" will bring up some links for you.
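One more number that helps with the speed-of-light angle: even a signal moving at c only gets so far in one clock tick, which is why clock speed and physical distance are tied together. The clock speeds in this little Python sketch are just example values:

# How far a signal gets in one clock cycle, even at the speed of light.

SPEED_OF_LIGHT_M_S = 3.0e8   # metres per second, rounded

def cm_per_cycle(clock_hz, signal_speed=SPEED_OF_LIGHT_M_S):
    """Distance covered during one clock period, in centimetres."""
    return signal_speed / clock_hz * 100.0

for ghz in (1, 3, 10):
    print(f"{ghz:2d} GHz clock -> {cm_per_cycle(ghz * 1e9):5.1f} cm per cycle")

At 3 GHz that's about 10 cm per cycle in a vacuum, and real signals in copper traces move slower than that. So a processor "running at the speed of light" still wouldn't be infinitely fast; it just wouldn't be throttled by electrons bumping through metal.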
- bum
- Joined: Sat Nov 08, 2003 9:56 pm
madmag laughed at me when he read that:
bum wrote: thanks for that, taran and madman. so based on what you told me, if a chip runs at dangerously high temps for a long period of time, then the cpu could actually start eroding away. hmmm, this is interesting.
but on the other hand, what if a cpu gets too cold? freon coolers can take cpus down to -30C (-22F) and lower. could a cpu get so cold that it slows down the movement of electrons, or possibly something worse?

- madmag9999
- Joined: Sun Aug 10, 2003 11:50 pm
- Status: Engaged
- Location: Pennsylvania
actually, i laughed at this
but if it could really happen, that would be very sweet.
bum wrote: WTMOTHERF? ok wait, so theoretically, if a cpu were cooled to -273C (presumed absolute zero), then it could have limitless speed, or to be more precise, it could run at the speed of light (300,000 km per second, or 186,420 miles per second, which is the presumed speed limit of the universe) and therefore be the most powerful single processor possible...
Moonslayer's Guide to a-m-v.org | AD & ErMaC's Guides to Audio & Video
"I'm sorry but i don't trust anything that bleeds for 5 days and doesn't die."
"I'm sorry but i don't trust anything that bleeds for 5 days and doesn't die."
-
- Joined: Wed May 16, 2001 11:20 pm