
Alternatives to AMD/Intel

Short answer: No.

Long answer: Ehhhhh, kinda? In the consumer world the x86 CPU, which runs just about all consumer electronics, is nearing the end of its life. My personal estimate is that around 2020-2030 we will see new types of architectures come into play. Currently, the main contenders are ARM, Quantum, and Neural computers. Since the consumer market is rather simple and does not require as much computing power as you might think, ARM and x86 will be pushed to their death, resurrected, and then slaughtered before something like Quantum takes complete control.

My personal theory is this: As Intel pushes x86 CPUs to their maximum profit, ARM will be introduced alongside new memory technology (look into Apache Pass or the Intel + Micron memory cooperation). Feature size will stop decreasing, but more features and better algorithms will be developed. These extensions will only help keep the market where it is while investigations are made into what will take over. On the server/high-computation side, the scene will be different. Advanced learning machines will start to take over. These types of computers are Quantum and Neural Network based. IBM has started to show this direction, and Intel has just announced they are going to start dumping money into Quantum computing. How this affects the OS landscape will depend on Microsoft and Google. Microsoft will, of course, dictate the consumer computing sector. Google will dictate consumer electronics and Linux-based machines.
 
Well, Quantum computing has been around for a while now, at least at the research level. Gotta love the On, Off, and On/Off bit.
As far as OS restrictions on Quantum or Neural computing go, the limits I can think of in two minutes are the limits of programming languages, and how far one is willing to push the technology. Of course, what the public wants also needs to be taken into consideration.

And there will be the fringe groups who spout Terminator II style "machines are gonna rise up" nonsense. *Feasibly, it could happen one day, if AI ever becomes sentient, given our penchant for looking down on what is not us.* But we are nowhere near there lol.

But going from binary on and off, to ternary on, off, on/off... I'm gonna need a new ThinkGeek shirt.
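For what it's worth, a qubit isn't really a third "on/off" digit: it holds a pair of amplitudes, and the squared magnitudes of those amplitudes give the odds of measuring 0 or 1. A toy sketch of that picture (all the numbers here are just for illustration):

```python
import math

# A qubit state is a pair of amplitudes (amp0, amp1); the squared
# magnitudes give the probability of measuring 0 or 1.
def measure_probabilities(amp0, amp1):
    p0, p1 = abs(amp0) ** 2, abs(amp1) ** 2
    total = p0 + p1  # should be 1 for a valid (normalized) state
    return p0 / total, p1 / total

# An equal superposition: both amplitudes are 1/sqrt(2),
# so measuring gives 0 or 1 with equal odds.
amp = 1 / math.sqrt(2)
print(measure_probabilities(amp, amp))  # both probabilities ~0.5
```

So the "on/off" state is less a third value and more a coin that hasn't landed yet, with tunable odds.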


I do think the RISC architecture needs to be looked at again.

Like in this article from last year:
http://www.extremetech.com/computin...ttle-arm-and-x86-by-being-totally-open-source
 
*saw that one coming and ducked*

- - - Updated - - -

I do have to say... if the Terminators all looked like the female one in Rise of the Machines... I welcome our female robot overlords
 
Yes, Quantum computing has been around for a long time, but it's not the software that is limiting it; well, not exactly. Languages are developed based on the hardware they live on. If you think about it, x86 could not have been created without the standard RISC-type computing chip. Quantum has had a lot of issues; primarily it has to do with what's defined as quantum. That took the longest. There have been a lot of Quantum-type computers, but a definition needed to be created on top of which standards could be designed. This was the same process with the x86 architecture.

There is also a lot of research into controlling and manipulating the state of the atom. It has come down to: how do we fit all of this into a single-package solution? Just like in the old days of vacuum tube computing, Quantum lives in attic-sized rooms. Intel and a bunch of other companies are trying to figure out how to bring everything down to a level that works for others. Once Quantum starts taking off, you can bet that Linux-type OSes will start to pop up on these "startup quantum servers". And just like with x86 history, a lot of ideas will be brought forward, and most will be rejected. When all this happens depends on the investment in Quantum. And the investment all depends on when the big players in the x86 world decide that x86 must go the way of Old Yeller.
 
...Advanced learning machines will start to take over...

You heard it straight from Mr. Dolk's beak. :)

Seriously though, it will be interesting to see where this can take us. Instead of fighting over benchmark scores, we enthusiasts will be competing with each other on what we taught our computers to do.

From the automotive perspective (my field of work), the computer could run crash analysis tests over and over, learning how to create the safest vehicles. Transfer those results to a 3D printer (aluminum and carbon fiber) and you have a completed vehicle with maximum safety... and I get to retire. :salute:
 
Blaylock has it down to a 'ock'

Has anyone seen the new IBM Watson commercials? That is a single 2U? server solution that analyzes your data and creates a solution. These types of machines are called Neural Network computers; they simulate the neural network of a human brain. Relying on an approach not unlike training your dog, the computer will figure out the best solution to a very complicated process. These types of tools are the Skynet-esque solution that could later be taught human language and the complexity of our lives.
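The "training your dog" analogy is basically reward-driven learning: try actions, reinforce the ones that pay off. A toy epsilon-greedy sketch of that idea (the actions and payout odds are invented for illustration, not anything Watson actually does):

```python
import random

# True payout odds for each action - unknown to the learner,
# which only sees the rewards it happens to collect.
REWARDS = {"sit": 0.8, "bark": 0.2}

def train(steps=5000, epsilon=0.1):
    estimates = {a: 0.0 for a in REWARDS}  # learned value of each action
    counts = {a: 0 for a in REWARDS}
    for _ in range(steps):
        # Mostly exploit the best-looking action, sometimes explore.
        if random.random() < epsilon:
            action = random.choice(list(REWARDS))
        else:
            action = max(estimates, key=estimates.get)
        reward = 1.0 if random.random() < REWARDS[action] else 0.0
        counts[action] += 1
        # Running average of the rewards observed for this action.
        estimates[action] += (reward - estimates[action]) / counts[action]
    return estimates

print(train())
```

After enough "treats", the learner's estimate for the rewarding action climbs toward its true payout rate, and it picks that action almost every time.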

But don't fret, Stephen Hawking and others are trying to push legislation that restricts computers from ever becoming fully sentient. So no Cylon rebellion... yet.
 
Until the thing figures out that the only solution to its problem is to become smarter. I realize it's not quite there, but in a way Watson does learn.
 
To those who would like to learn more about machine learning, here are some wiki articles:

Evolutionary algorithms: These are algorithms that learn from their past results and figure out how to rewrite themselves to create a better solution. This is what causes a machine to really learn to become a better machine.
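A minimal sketch of the idea: keep a population of candidate solutions, score them, keep the best, and mutate copies of the survivors. Everything here (the target, rates, sizes) is made up for illustration.

```python
import random

# Toy evolutionary algorithm: evolve a list of digits toward a target.
TARGET = [3, 1, 4, 1, 5]

def fitness(candidate):
    # Lower is better: total distance from the target.
    return sum(abs(c - t) for c, t in zip(candidate, TARGET))

def mutate(candidate, rate=0.3):
    # Randomly nudge some genes up or down by one.
    return [c + random.choice([-1, 0, 1]) if random.random() < rate else c
            for c in candidate]

def evolve(generations=200, pop_size=20):
    population = [[random.randint(0, 9) for _ in TARGET]
                  for _ in range(pop_size)]
    for _ in range(generations):
        # Keep the fittest half, refill with mutated copies of survivors.
        population.sort(key=fitness)
        survivors = population[:pop_size // 2]
        population = survivors + [mutate(random.choice(survivors))
                                  for _ in survivors]
    return min(population, key=fitness)

best = evolve()
print(best, fitness(best))
```

Real evolutionary algorithms add crossover between parents and far richer fitness functions, but the select-and-mutate loop is the core of it.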

Neural Network: Following on from what I mentioned before: neural networks, from what I have learned, are weight-based algorithms. A very complex math function relies on specific variables being updated based on outcomes, stimulation, and assumptions. A bunch of these weight functions exist in layers, updating and learning from one another. A neural node in the same network layer may not have the same value as another neural node; this helps a system identify the best and worst solutions at the same time.
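The "weight-based" part can be sketched in a few lines: each node takes a weighted sum of its inputs and squashes it through a function, and a layer is just several such nodes sharing the same inputs. All the weights and inputs below are invented for illustration (a trained network would have learned them).

```python
import math

def sigmoid(x):
    # Squashes any number into the range (0, 1).
    return 1.0 / (1.0 + math.exp(-x))

def node(inputs, weights, bias):
    # Each input is scaled by its weight; the bias shifts the threshold.
    return sigmoid(sum(i * w for i, w in zip(inputs, weights)) + bias)

def layer(inputs, weight_rows, biases):
    # A layer is several nodes over the same inputs, each with its own
    # weights - so two nodes in the same layer can hold different values,
    # as noted above.
    return [node(inputs, w, b) for w, b in zip(weight_rows, biases)]

hidden = layer([0.5, -1.0], [[0.8, 0.2], [-0.5, 1.0]], [0.1, 0.0])
output = node(hidden, [1.0, -1.0], 0.0)
print(hidden, output)
```

Training is then the process of nudging those weights, layer by layer, until the outputs match the outcomes you want.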

There was a really good GIF showing a neural network in action; it demonstrated how many iterations of learning it took to make an ostrich walk.
 
But don't fret, Stephen Hawking and others are trying to push legislation that restricts computers from ever becoming fully sentient. So no Cylon rebellion... yet.

But can we be sure it isn't Hawking's computer doing the thinking, and that it plans a back door into the legislation, and all his efforts are a clever cover-up?

*adjusts tin foil hat and undergarments*
 
Conspiracy theorists will tell you Hawking is the first true cyborg. He just needs a mech suit.
 
Turing = Awesome
Hawking = awesome but kinda freaky

And hey, Blaylock. In China they used a giant 3D printer to basically print off parts for a house.

So.......yeah........cars aren't far off. Can we get carbon-carbon cars? We need more carbon in the environment lol.
 
IBM used to advertise the POWER8 as superior to any Intel CPU (they didn't even bother to mention AMD). It's possible MS might have an OS that runs on it. The POWER8 is a gigantic CPU; it's as big as the palm of your hand.
 
How this affects the OS landscape will depend on Microsoft and Google. Microsoft will, of course, dictate the consumer computing sector. Google will dictate consumer electronics and Linux-based machines.

How does Google dictate anything in the Linux world? Google couldn't be more irrelevant to the HPC clusters I work on, as well as the Linux MySQL and web servers we run.
 