
games now going to be targeted at XBone and PS4: will CPU reqs stabilize?


magellan

Since most developers are going to be targeting/developing their games for XBone and/or PS4 hardware, will video game CPU requirements stabilize? Even AMD's latest CPUs should have more than twice the integer and floating-point performance of the XBone and PS4 CPUs, based on clock speed alone.
 
All the consoles have AMD-based APUs with more cores than the average desktop APU, but they are still slow compared to above-average desktops. Game developers are forced to design games for this slow platform, and Microsoft is forcing some of them to cap the frame rate at 30 FPS.
The consoles won't get different specifications for a while, and they're quite new products, so I wouldn't expect any hardware changes in the next ~2 years.

Game developers are focused on the console market because of higher profits (popularity in the US and some Asian countries, less so in the EU) and smaller losses from piracy. They have no choice but to optimize games for this weak hardware. On the other hand, PC games don't look so great compared to the theoretical performance available, and most of them are simply boring. It's hard to find new games that will keep people at the PC for more than 1-5 hours. There are exceptions, of course, but in general console games are more fun.
Consoles have been the priority for game designers for many years now. The PC games market is constantly shrinking, so why should they invest in PC games?
 
I thought most games were made for consoles and hacked to work on PC.

Depends on the game and developer. Fortunately, because both the Xbox One and the PS4 are based on x86, there's a lot less work required to make a viable "port" to PC (or vice versa), though of course there are differences in APIs and everything else between the two consoles. All games are coded on PCs anyway, but everything has to be made to run on the lowest common denominator (which, depending on the game, could be the Xbox One or the PS4; lately it's been the Xbox One), so resolution, textures, fps, etc. are all optimized for that platform and then improved upward.
 
Skylake seems to increase CPU performance by up to 10% compared to the last generation (at least for gaming purposes), which is not much, but the IGP seems to have a much bigger improvement, as usual. I lack the time to check it thoroughly, but I guess the few bits of info I got are more or less accurate. The Skylake IGP is at least 20% faster, so Intel's focus is pretty clear: keep pumping up the IGP seems to be their main philosophy. Skylake is clearly the best CPU so far, but Intel is holding back as usual. For most mainstream gamers, though, there is not much need for better CPUs (better than a 6-core Nehalem) unless someone wants to run everything at over 60 FPS all the time. I think up to the time of Nehalem (2008) the CPU was a true bottleneck for many gamers, but when the first 6-core Nehalem was released the situation changed, leaving the GPU as the constant bottleneck while the CPU was almost out of the picture (benchmarks and "100 FPS" gamers are, of course, a special case). My 990X will soon be 5 years past launch, and it is still able to run any game at 60 FPS as long as the GPU is sufficient.

So I think CPU demand will not change much over the next few years; the GPU will continue to be the hard-working mule, though not the "so praised" IGP; a dedicated GPU, of course. Be aware that a PS4, for example, works a bit differently. A gaming PC usually uses around 4-6 cores, and there are no cores dedicated to the OS (operating system). That means that in order for the system not to become unresponsive, the CPU load should not be maxed out; it should stay at 70-80% or so, not at 100%. The PS4, however, uses 6 cores for game software only (around 75 GFLOPS across all 6 cores), so those cores can run at 100% load and the system will still stay responsive, because there are 2 more cores dedicated to system-only tasks. So in reality, I think the effective advantage of a Skylake 6-core (150 GFLOPS minus 20-30% overhead) is maybe 70-80%, not 100%, because of that required 20-30% headroom.
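For what it's worth, that ~75 GFLOPS figure does drop out of the usual back-of-the-envelope peak-FLOPS formula (cores x clock x FLOPs per cycle per core). A minimal sketch in Python; the FLOPs-per-cycle value is my assumption about the Jaguar cores' SIMD width, not a measured number:

```python
# Back-of-the-envelope peak single-precision FLOPS:
#   peak = cores * clock_GHz * FLOPs_per_cycle_per_core
# Theoretical ceiling only; ignores memory bandwidth, turbo, and thermals.

def peak_gflops(cores: int, clock_ghz: float, flops_per_cycle: int) -> float:
    return cores * clock_ghz * flops_per_cycle

# PS4: 8 Jaguar cores at 1.6 GHz, of which ~6 are available to games.
# Assumption: 8 SP FLOPs/cycle/core (one 128-bit SIMD add + one multiply).
print(peak_gflops(cores=6, clock_ghz=1.6, flops_per_cycle=8))  # 76.8
```

Desktop figures computed the same way swing wildly depending on which SIMD width you assume, so treat any single GFLOPS number for a Skylake-class part with suspicion.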

Another issue: the PS4 usually runs at 30 FPS; on rare occasions it may go up to 60 FPS, but that means reducing graphics detail a lot (so it's usually only done in shooters). In that sense, the PS4 needs only half the CPU performance of a 60 FPS gaming machine.
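The 30-versus-60 FPS point is plain frame-budget arithmetic: halving the target frame rate doubles the time the CPU has to simulate and submit each frame. A trivial illustration:

```python
# Per-frame CPU time budget for a given frame-rate target.
def frame_budget_ms(target_fps: float) -> float:
    return 1000.0 / target_fps

print(frame_budget_ms(30))  # 33.3 ms per frame
print(frame_budget_ms(60))  # 16.7 ms: the same work in half the time
```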

The conclusion is not hard to understand: there is no need to worry. The PS4 CPU won't hold the PC back; it will always be a matter of the GPU, and luckily GPU performance has been increasing a lot (over 5 times stronger compared to the PS3). Even so, the console GPU is still the limiting factor, and while a console will only run at 1080p, many PCs will start to shift toward 2160p, which means the PC GPU needs at least 4 times the performance just to keep up with the resolution demand.
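The "at least 4 times" figure is just the pixel-count ratio between the two resolutions; real GPU cost rarely scales perfectly linearly with pixels, so read it as a rough lower bound:

```python
# 2160p ("4K") pushes exactly four times the pixels of 1080p.
pixels_1080p = 1920 * 1080   # 2,073,600
pixels_2160p = 3840 * 2160   # 8,294,400
print(pixels_2160p / pixels_1080p)  # 4.0
```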

Ultimate conclusion: there are not many drawbacks for the next 5 years at least. The main issues, GPU and RAM, have been solved: a lot more RAM and a lot more GPU. The CPU is a minor matter; I guess that should be clear. Most devs nowadays who claim they can't make a proper PC version because of console "drawbacks" are simply too lazy to put some love into it (I guess they think "let's make a cheap port... money is money"). There is not much excuse, to my mind. It's clearly possible to go all out on a PC version even if the game was mainly developed on a console. Of course, if they decide not to put a lot of additional work into it, there will always be things lacking; a PC is not the same system, and a game can't just be ported over 1:1 without big sacrifices. The main issue is simply profit margin (it's cheaper to make bad ports and produce more games instead), nothing else.

Indeed, it is fun to have well-performing hardware, but performance or graphics by itself won't make a good game. A technically proper game is not necessarily (actually rarely) fun to play; the main requirements for good gaming are gameplay, general creativity, and art. All the other factors add up and lift the original experience to even higher levels, but they won't make a good game out of a bad game. This is the reason I still have more fun playing on consoles, although I can now play many of the old console games with even better performance and maxed-out graphics on PC, so that experience is now maxed out. A PC can polish a dirty raw diamond to an even higher brilliance, but it can't make a diamond out of a game that was never a diamond. I wish more true philosophers were hired at game companies, because they are truly able to contribute great gameplay mechanics and innovations. Most devs seem to be pure technicians; they are good at coding but may lack the creativity of a true philosopher, so the game may end up technically flawless but still not much fun to play. Creating a sandbox system (the many gameplay fragments) is one thing, but creating the system of systems is the true challenge.

Late Edit:
But to tell my hard truth, from a technical point of view all of today's CPUs are inferior for gaming use. Those CPUs may be almost foolproof, but a game can barely utilize a typical CISC CPU, because most of its functions, including the huge cache, are close to useless in gaming terms. The only good thing is that a typical CISC CPU rarely experiences stability issues in gaming use, and programming for it can almost be done on the toilet with an aged laptop, so the clocks can in theory be insane. There are still too few floating-point/ALU units to truly support gaming needs. The industry has no interest; they want to make the mainstream happy, not "use max potential," so even the highest-clocked and most impressive PC is, in my eyes, not much more than a depleted fragment of missed potential. Take it or leave it; it is simply my odd reality.
 
No, and don't say anything when you have nothing to say. [1]

Skylake seems to increase CPU performance by up to 10% compared to the last generation (at least for gaming purposes), which is not much,

Skylake has not increased performance in any game, and actually loses performance in half the benchmarks.
http://www.anandtech.com/show/9483/intel-skylake-review-6700k-6600k-ddr4-ddr3-ipc-6th-generation/10
http://arstechnica.com/gadgets/2015/08/intel-skylake-core-i7-6700k-reviewed/


but the IGP seems to have a much bigger improvement, as usual. I lack the time to check it thoroughly, but I guess the few bits of info I got are more or less accurate. The Skylake IGP is at least 20% faster, so Intel's focus is pretty clear: keep pumping up the IGP seems to be their main philosophy. Skylake is clearly the best CPU so far, but Intel is holding back as usual. For most mainstream gamers, though, there is not much need for better CPUs (better than a 6-core Nehalem) unless someone wants to run everything at over 60 FPS all the time. I think up to the time of Nehalem (2008) the CPU was a true bottleneck for many gamers, but when the first 6-core Nehalem was released the situation changed, leaving the GPU as the constant bottleneck while the CPU was almost out of the picture (benchmarks and "100 FPS" gamers are, of course, a special case). My 990X will soon be 5 years past launch, and it is still able to run any game at 60 FPS as long as the GPU is sufficient.

So why say anything when you haven't done the research? You know what, after reading that part again... I can't even fathom where you conceptualized the very existence of this thought. I'm pretty sure you are hearing voices from another dimension.

Oh and here is the article to disprove you: http://www.anandtech.com/show/9483/intel-skylake-review-6700k-6600k-ddr4-ddr3-ipc-6th-generation/20

Pretty well-known fact: Intel's plan is to grow the CPU die size and add in more modules.


So I think CPU demand will not change much over the next few years; the GPU will continue to be the hard-working mule, though not the "so praised" IGP; a dedicated GPU, of course. Be aware that a PS4, for example, works a bit differently. A gaming PC usually uses around 4-6 cores, and there are no cores dedicated to the OS (operating system).

Let me introduce core 0 of every x86 CPU up to Windows 8.1. It was the 'dedicated' core for ALL single-threaded instructions, unless deferred by the CPU fetcher or the compiler.
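You can poke at core assignment yourself: most OSes expose a per-process CPU affinity mask. A minimal sketch using the third-party psutil package (assumed installed; affinity calls work on Windows and Linux, not macOS). Pinning to core 0 here only illustrates the mechanism, it does not reproduce the old scheduler behavior:

```python
# Query and set the CPU affinity of the current process with psutil
# (third-party package, assumed installed; not supported on macOS).
import os
import psutil

proc = psutil.Process(os.getpid())
print(proc.cpu_affinity())  # e.g. [0, 1, 2, 3] on a quad-core machine
proc.cpu_affinity([0])      # force every thread of this process onto core 0
print(proc.cpu_affinity())  # [0]
```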

That means that in order for the system not to become unresponsive, the CPU load should not be maxed out; it should stay at 70-80% or so, not at 100%. The PS4, however, uses 6 cores for game software only (around 75 GFLOPS across all 6 cores),

What? Where did you get this number? How does it benefit your argument? At this point, what are you arguing toward?

so those cores can run at 100% load and the system will still stay responsive, because there are 2 more cores dedicated to system-only tasks. So in reality, I think the effective advantage of a Skylake 6-core (150 GFLOPS minus 20-30% overhead) is maybe 70-80%, not 100%, because of that required 20-30% headroom.

Seriously, what? That's not even close to how you calculate performance and overhead. Did you factor in IPC, or the error rates for pipelines/memory/cache/thermal disturbance/PLL misalignment? What's the performance per branch for each Skylake core? For which instruction sets? Do you know what the overhead is at each stage of the pipeline? Do you know what architecture Skylake uses?


Another issue: the PS4 usually runs at 30 FPS; on rare occasions it may go up to 60 FPS, but that means reducing graphics detail a lot (so it's usually only done in shooters). In that sense, the PS4 needs only half the CPU performance of a 60 FPS gaming machine.

The conclusion is not hard to understand: there is no need to worry. The PS4 CPU won't hold the PC back; it will always be a matter of the GPU, and luckily GPU performance has been increasing a lot (over 5 times stronger compared to the PS3). Even so, the console GPU is still the limiting factor, and while a console will only run at 1080p, many PCs will start to shift toward 2160p, which means the PC GPU needs at least 4 times the performance just to keep up with the resolution demand.

Ahh, so that's what you were getting at? Then I'll give you this: http://www.ign.com/wikis/xbox-one/PS4_vs._Xbox_One_Native_Resolutions_and_Framerates

I was surprised to find out how many games ran at 60 fps @ 1080p; it really shows that consoles have been trying to match the PC. Good for them. But your statement holds no strength. Yes, the CPU is not the main bottleneck of the console, but using AMD cores hasn't been beneficial to the GPU either. Most AMD systems are still stuck on PCIe 2.0, which creates a bottleneck for new GPUs.

Strictly speaking, GPU performance is alright for consoles. Game designers have learned to incorporate great optimizations in their games across all platforms. There is a lot of bad history of studios not doing what customers demand, but things are changing in everyone's favor. I also want to point out a very big reason why consoles have been able to run new games on old hardware: studios like Activision have been using the same engine for many generations of their games. COD is a great example of this. Exploiting the optimization of the same engine allows the designer to tailor the game to the platform, changing small, sometimes unnoticed settings to make the game look the same across all platforms.


Ultimate conclusion: there are not many drawbacks for the next 5 years at least. The main issues, GPU and RAM, have been solved: a lot more RAM and a lot more GPU. The CPU is a minor matter; I guess that should be clear. Most devs nowadays who claim they can't make a proper PC version because of console "drawbacks" are simply too lazy to put some love into it (I guess they think "let's make a cheap port... money is money"). There is not much excuse, to my mind. It's clearly possible to go all out on a PC version even if the game was mainly developed on a console. Of course, if they decide not to put a lot of additional work into it, there will always be things lacking; a PC is not the same system, and a game can't just be ported over 1:1 without big sacrifices. The main issue is simply profit margin (it's cheaper to make bad ports and produce more games instead), nothing else.

Indeed, it is fun to have well-performing hardware, but performance or graphics by itself won't make a good game. A technically proper game is not necessarily (actually rarely) fun to play; the main requirements for good gaming are gameplay, general creativity, and art. All the other factors add up and lift the original experience to even higher levels, but they won't make a good game out of a bad game. This is the reason I still have more fun playing on consoles, although I can now play many of the old console games with even better performance and maxed-out graphics on PC, so that experience is now maxed out. A PC can polish a dirty raw diamond to an even higher brilliance, but it can't make a diamond out of a game that was never a diamond. I wish more true philosophers were hired at game companies, because they are truly able to contribute great gameplay mechanics and innovations. Most devs seem to be pure technicians; they are good at coding but may lack the creativity of a true philosopher, so the game may end up technically flawless but still not much fun to play. Creating a sandbox system (the many gameplay fragments) is one thing, but creating the system of systems is the true challenge.

Late Edit:
But to tell my hard truth, from a technical point of view all of today's CPUs are inferior for gaming use. Those CPUs may be almost foolproof, but a game can barely utilize a typical CISC CPU, because most of its functions, including the huge cache, are close to useless in gaming terms. The only good thing is that a typical CISC CPU rarely experiences stability issues in gaming use, and programming for it can almost be done on the toilet with an aged laptop, so the clocks can in theory be insane. There are still too few floating-point/ALU units to truly support gaming needs. The industry has no interest; they want to make the mainstream happy, not "use max potential," so even the highest-clocked and most impressive PC is, in my eyes, not much more than a depleted fragment of missed potential. Take it or leave it; it is simply my odd reality.


Ultimate Conclusion: You have no idea what you are talking about. Anyone in the development industry would immediately ignore you and tell you to **** off. In this entire rambling that you call a book, none of your statements actually hold truth or argument. Since I imagine that you are 10, I'm going to assume that you have just started to learn about computers and x86 architecture. Let me help you learn. You should read some of my posts, and some books and articles while you are at it.

AMD vs Intel, understanding the difference of x86 architectures
Introduction to Computing Systems: From Bits and Gates to C and Beyond
Computer Architecture, A Quantitative Approach (Seriously, a $300 textbook regularly used in uni, for FREE. A def good read)
Real World Tech: a website dedicated to x86 architecture. You can learn all about the new architectures and their performance there.
RISC vs CISC (PS. Intel and AMD do NOT use either CISC or RISC)

I'd add more, but I lack the time to teach a kid who is incapable of doing the simple research needed to better understand what he is trying to learn.
 
Thanks, I will read much of the stuff you posted as soon as I have some spare time. Since you took the time to share your view, it would only be kind to read it, and just to make it clear: I do appreciate your input, because you certainly have guts in the knowledge department.


However, about some things I am not certain, so I made some points:

1. I didn't call my "rambling" a book; that was you.
2. In my mind, an Intel/AMD CPU (nowadays) is a hybrid approach; it is neither RISC nor CISC. RISC can clock much higher. True CISC doesn't seem to exist anymore because of the hybrid approach.
3. Intel's plan of stepping up the core count (modules) is nothing new, but so far it seems to lack execution, especially for consumer-grade parts.
4. Skylake is mainly a feature set with a lot of impressive stuff; the exact performance is not clear to me, but it can clock ridiculously high, so my expectation was a net 10% increase. If you say it's 0%... OK, no problem for me.
5. Sure, I think your core 0 thing is valid, but only on paper. It seems to lack execution, because the system will lag.
6. Changing "unnoticed" settings looks like "bloatware" to me: adding technical stuff that does almost nothing but suck performance. A good reason not to let yourself be tricked by settings that aren't required.
7. They do not run at a STABLE 60 FPS unless running at 720p native resolution. I would say 1/3 are capped at 30 FPS; 1/3 have an unlocked frame rate, meaning it moves between 30-50 FPS in most cases and only rarely reaches the 60 FPS cap; and probably fewer than 1/3 can hit the 60 FPS cap regularly, and those come with a new tradeoff known as 720p upscaling, meaning the level of detail is much lower and clearly visible. It depends on whether a dev chooses the performance, mixed, or quality approach.
8. Skylake architecture: do you think your car has the same horsepower on a mountain as at sea level? I just made a rough estimate, nothing more. It is impossible to take all the variables into account; in that case I would even have to add the aging of the CPU, because it degrades a bit with every year as well. It is unrealistic to go into such detail, because it would fill a whole book.
9. Your guess regarding my age is more or less true: I am some decades, a single decade, and some years old. You can freely choose, because time means nothing to me, and just as I feel like it, I switch between those years back and forth. :D
10. All the numbers are estimates out of my own head; I want to show that there is no CPU drawback, because the PC is not truly ahead once you take the higher demands and the slow/minor CPU improvements into account.
11. I have had enough of the blue pill already, so I chose to take some additional red pills; it is better for my health.
 
Thanks, I will read much of the stuff you posted as soon as I have some spare time. Since you took the time to share your view, it would only be kind to read it, and just to make it clear: I do appreciate your input, because you certainly have guts in the knowledge department.


However, about some things I am not certain, so I made some points:

1. I didn't call my "rambling" a book; that was you.
2. In my mind, an Intel/AMD CPU (nowadays) is a hybrid approach; it is neither RISC nor CISC. RISC can clock much higher. True CISC doesn't seem to exist anymore because of the hybrid approach.
3. Intel's plan of stepping up the core count (modules) is nothing new, but so far it seems to lack execution, especially for consumer-grade parts.
4. Skylake is mainly a feature set with a lot of impressive stuff; the exact performance is not clear to me, but it can clock ridiculously high, so my expectation was a net 10% increase. If you say it's 0%... OK, no problem for me.
5. Sure, I think your core 0 thing is valid, but only on paper. It seems to lack execution, because the system will lag.
6. Changing "unnoticed" settings looks like "bloatware" to me: adding technical stuff that does almost nothing but suck performance. A good reason not to let yourself be tricked by settings that aren't required.
7. They do not run at a STABLE 60 FPS unless running at 720p native resolution. I would say 1/3 are capped at 30 FPS; 1/3 have an unlocked frame rate, meaning it moves between 30-50 FPS in most cases and only rarely reaches the 60 FPS cap; and probably fewer than 1/3 can hit the 60 FPS cap regularly, and those come with a new tradeoff known as 720p upscaling, meaning the level of detail is much lower and clearly visible. It depends on whether a dev chooses the performance, mixed, or quality approach.
8. Skylake architecture: do you think your car has the same horsepower on a mountain as at sea level? I just made a rough estimate, nothing more. It is impossible to take all the variables into account; in that case I would even have to add the aging of the CPU, because it degrades a bit with every year as well. It is unrealistic to go into such detail, because it would fill a whole book.
9. Your guess regarding my age is more or less true: I am some decades, a single decade, and some years old. You can freely choose, because time means nothing to me, and just as I feel like it, I switch between those years back and forth. :D
10. All the numbers are estimates out of my own head; I want to show that there is no CPU drawback, because the PC is not truly ahead once you take the higher demands and the slow/minor CPU improvements into account.
11. I have had enough of the blue pill already, so I chose to take some additional red pills; it is better for my health.

I do not mean to attack you, but your entire argument is based on your opinion. There is little to no empirical evidence to back any point you made. Generally speaking, provide us with data and we can have endless arguments with you on the validity of said data. I could state that the distance between the Earth and the Sun is 67 million miles, because it is widely accepted that the distance between these two objects is, in fact, quite large. However, under a more detailed analysis, backed by empirical data, I understand the distance to be much closer to 93 million miles. This is the academic approach; it generally promotes discourse and understanding of the topics being discussed. I'd much rather contribute to discourse than fan a fire of hearsay.
 
Thanks, I will read much of the stuff you posted as soon as I have some spare time. Since you took the time to share your view, it would only be kind to read it, and just to make it clear: I do appreciate your input, because you certainly have guts in the knowledge department.


However, about some things I am not certain, so I made some points:

1. I didn't call my "rambling" a book; that was you.
perceived thought

2. In my mind, an Intel/AMD CPU (nowadays) is a hybrid approach; it is neither RISC nor CISC. RISC can clock much higher. True CISC doesn't seem to exist anymore because of the hybrid approach.
You are on the right track, but I'm not going to share more now. I still have to digest all the gutsy knowledge after my one-on-one with Yale.

3. Intel's plan of stepping up the core count (modules) is nothing new, but so far it seems to lack execution, especially for consumer-grade parts.
I didn't say anything about core count. Cores don't matter unless you can utilize them. With Win10, the OS will distribute instructions across all cores. This has a limit, however, and depends on machine load and the instructions.

4. Skylake is mainly a feature set with a lot of impressive stuff; the exact performance is not clear to me, but it can clock ridiculously high, so my expectation was a net 10% increase. If you say it's 0%... OK, no problem for me.
Refer to [1]

5. Sure, I think your core 0 thing is valid, but only on paper. It seems to lack execution, because the system will lag.
It's not paper, it's common knowledge.


6. Changing "unnoticed" settings looks like "bloatware" to me... adding technical stuff that does almost nothing but to suck performance. Good reason not to get yourself tricked by non required settings.
?

7. They do not run at a STABLE 60 FPS unless running at 720p native resolution. I would say 1/3 are capped at 30 FPS; 1/3 have an unlocked frame rate, meaning it moves between 30-50 FPS in most cases and only rarely reaches the 60 FPS cap; and probably fewer than 1/3 can hit the 60 FPS cap regularly, and those come with a new tradeoff known as 720p upscaling, meaning the level of detail is much lower and clearly visible. It depends on whether a dev chooses the performance, mixed, or quality approach.
OK, well, I'm going to say that sounds right, but I'm going to look into it.

8. Skylake architecture: do you think your car has the same horsepower on a mountain as at sea level? I just made a rough estimate, nothing more. It is impossible to take all the variables into account; in that case I would even have to add the aging of the CPU, because it degrades a bit with every year as well. It is unrealistic to go into such detail, because it would fill a whole book.
This is by far the most off-point statement and counterargument. Please refer back to [1].

9. Your guess regarding my age is more or less true: I am some decades, a single decade, and some years old. You can freely choose, because time means nothing to me, and just as I feel like it, I switch between those years back and forth. :D
Please don't fall into the self-identity BS that your generation so easily finds comforting.

10. All the numbers are estimates out of my own head; I want to show that there is no CPU drawback, because the PC is not truly ahead once you take the higher demands and the slow/minor CPU improvements into account.
Refer to [1].

11. I have had enough of the blue pill already, so I chose to take some additional red pills; it is better for my health.
 
Will check the stuff another time, too many games to play. Regarding the Xbone: MS didn't even visit the Swiss Toy 2015 event anymore (proof of weakness; "ah, let's just ditch it, we can't handle it"), but Sony and Nintendo were there with a lot of stuff. Not the stuff I enjoy, but it doesn't matter (the stuff I enjoy is already in my home); at least they were showing presence and interest, while MS is apparently almost out of business. So I won't even bring it up anymore (I own every console except the Xbone, plus some handhelds). MS had the highest game power when the Xbox 360 was alive. I guess we can at least hype Win10, best OS on the planet? Who knows... I can play just as well with Win7; my fun doesn't show a difference. Ah yeah, sorry for the OT.
 
Will check the stuff another time, too many games to play

I think this is my fave quote ever.

As far as 10 being the "best OS ever": yeah, I'll stick with 7 for gaming and wait a while on 10 until more bugs are found and M$ backtracks on more stuff. I think they said Windows ME was the best OS ever... yeah, that didn't work.

Kinda looking forward to Nintendo's replacement for the Wii U, or Wii 2, or whatever it's called. Just kinda pissed that Bayonetta 2 was a Nintendo exclusive.

And hey, there is the phrase Oldie but a Goodie for a reason :p
 
I do not mean to attack you, but your entire argument is based on your opinion. There is little to no empirical evidence to back any point you made. Generally speaking, provide us with data and we can have endless arguments with you on the validity of said data. I could state that the distance between the Earth and the Sun is 67 million miles, because it is widely accepted that the distance between these two objects is, in fact, quite large. However, under a more detailed analysis, backed by empirical data, I understand the distance to be much closer to 93 million miles. This is the academic approach; it generally promotes discourse and understanding of the topics being discussed. I'd much rather contribute to discourse than fan a fire of hearsay.
1. Who is accepting it? Not me, but you say "widely accepted," so that means people in general are accepting it, the ones who actually have no clue. Cool stuff. So a few people of academic grade are telling the stuff, and the others usually don't ask whether it's right or wrong, because it can't be wrong; it was done by academic people, after all. Finally, the sun is moving, so there is no way the distance can be "fixed." In space there is generally way too much theory involved; it is not a hard-nailed and immovable science.

2. Who is providing the empirical evidence? Don't you know that just about every statistic can be faked? If someone wants to prove something, they usually find a way to hand out "faked data" that may look very real, because in many cases no one can actually prove the data is wrong. The people who truly have knowledge are usually biased; they work for a certain industry with certain interests, and they do not have a free mind. The current system does not work in a way that lets us all provide our "evidence" without real consequences, usually hitting our money bag in a good or bad way. So almost all real science is bound to innovations able to provide money; it is not set apart from economic and purely materialistic interests and their needs. Do you know what happened with nuclear technology? In theory a wonderful new science, but it caused endless destruction, because humans are not able to use technology wisely; they lack what I call "wisdom," they only have intelligence, not wisdom. I am almost certain the destruction won't be over; there will be more of it.

3. Don't you remember the Nvidia 3.5 GB incident? It took us many, many months to find out the truth, because everyone was either saying "we fully trust the proven and empirical evidence from Nvidia" or simply lacked the knowledge to check it. Even Nvidia themselves didn't check it precisely: someone makes a mistake and the others simply accept it without a single noise. So there is your so-called immovable "empirical evidence."

4. So I don't put much stock in your so-called "empirical evidence." If you have a specific question, it is no problem to answer it or check it out, but in general I don't give much for "empirical evidence"; way too much manipulation. Sure, someone can argue "oh, they simply made a mistake, they're only human after all," but those mistakes happen disproportionately often.


2. In my mind, an Intel/AMD CPU (nowadays) is a hybrid approach; it is neither RISC nor CISC. RISC can clock much higher. True CISC doesn't seem to exist anymore because of the hybrid approach.
You are on the right track, but I'm not going to share more now. I still have to digest all the gutsy knowledge after my one-on-one with Yale.
Too bad... but can I help you digest it? I am somewhat of an expert in digestion issues, though that is unfortunately not taught at most universities. Heck, I even know well-educated people (university-educated and the like) who eat microwave food all the time and think there is nothing wrong with it.

3. Intel's plan of stepping up the core count (modules) is nothing new, but so far it seems to lack execution, especially for consumer-grade parts.
I didn't say anything about core count. Cores don't matter unless you can utilize them. With Win10, the OS will distribute instructions across all cores. This has a limit, however, and depends on machine load and the instructions.
In my mind, I was trying to figure out clear evidence that makes sense to me. The general integration of many external functions (memory controller, voltage regulation in general, bus and bridge stuff) started long ago; even AMD is constantly doing it, so die size has surely increased at both manufacturers. But the rather new part is the plan to increase the core count; in my mind, this is a module like any other part that can be pinned down as a "dedicated" part of the CPU. We may soon see 6-8 cores instead of 4-6 cores in consumer-grade CPUs, but I don't know when it will happen. Core count can be useful, why not? But if you can't feed all the "mouths," it will be of little use. Luckily the mouth issue is not a big issue anymore on new OSes and new game software; it's all a matter of a capable OS, a capable engine, and hopefully better coding from Intel/AMD (so it can be done on the hardware side, not in software). But you should know this even better than I do; I don't know why you actually think core count doesn't matter.

9. Your guess regarding my age is more or less true: I am some decades, a single decade, and some years old. You can freely choose, because time means nothing to me, and just as I feel like it, I switch between those years back and forth. :D
Please don't fall into the self-identity BS that your generation so easily finds comforting.
Pointing at a certain generation looks like you consider yourself another (probably older) generation. But who caused most of the issues happening in the world? I am still far too young for that and will probably never grow old enough, so it was probably mainly your generation, but I guess they may be gone soon. Ultimately, I don't know why you think age matters to humankind. In the old cultures, a younger person was often considered "higher in the hierarchy" simply because of greater wisdom. It was generally accepted, and no one tried to point at generations. Generations aren't the issue; everything that happens affects all of us, and everyone will be responsible for it. Generation isn't the question; the question is how we implement the hierarchy and how we treat each other. Hierarchy usually goes from inner to outer, from lower to upper. Looking down on someone is a misdirection.


By the way, the blue pill matter is indeed of high importance, and I can tell you why:
Blue pill = color blue, light gray, frequency average: reality paired with ignorance toward non-reality. It can go this way and that way in reality; it always sits in the middle, but the color is non-critical for life.
Red pill = color red, light gray, frequency low: non-reality paired with ignorance toward reality. It usually heads toward the upper frequency of life, because it is a warm and essential color for life.
Purple pill = color red + blue, light gray, frequency high (basically adding the frequencies of the two colors): it is not ignorant, taking in both reality and non-reality, because it includes all the essentials (red, blue, and finally purple) at once, adding the second critical color known as purple (essential for D3 and more).

So, do I need the blue pill? Not me, that much is certain, but probably 99% of people actually do use this color. Indeed. No problem; I'm just telling you why not everyone is like this.
 
Sol may be moving, but Terra moves with Sol due to the gravitational pull exerted by the larger object. And since our solar system moves as one unit inside a larger system, the distances between objects do not fluctuate that wildly.

If games were able to use all 4/6/8 etc. cores, then core count would matter more, but last I checked the highest effective core use in games was 4 cores.

One major obstacle I see for gaming, as far as being able to program to take full advantage of cores, or full advantage of anything, is getting publishers to quit pushing games out the door before they are ready.

When you talk about nuclear tech: there have been no serious man-made nuclear incidents since the 80s, while there have been many disastrous man-made incidents in other forms of power generation *namely the storage of massive deposits of fly ash from coal-fired plants*.

The Japanese tsunami-triggered meltdown was due to location and a massive tidal wave from a massive earthquake.
Most nuclear incidents have come from running experiments in the core of a full-scale nuclear plant, bad design during the early years, etc.
Nuclear power was and still is the cleanest way of producing large amounts of steady power. The waste from it is an issue only because of the "not in my backyard" argument, and even that is nil, seeing as you can use old mines that are away from ground water. Nuclear power generates less waste per decade than its equivalent in toxic fly ash from coal.

Now onto these pill colors.

I have more than had my fill of Esoterics and Philosophy this semester, so I am not going there lol.


As far as faked data... can we say "man-made climate change"? THAT is the epitome of faked data; climate change is cyclical. It has happened before and will happen again. And the THEY who provide empirical data = whoever you cite to provide information backing your thesis. *You learn this quickly in college.*
 
There was more to it than just the Japanese incident; remember Chernobyl (Europe is still suffering from the nuclear waste) and many of the smaller incidents that have been more or less hidden. Both were largely man-made failures, though, because only a fool would put a power plant in probably the most dangerous location in the world, the east side of Japan (facing the plate boundary). Japan has had tsunami issues, even strong ones, for thousands of years already; they knew about the danger from the very beginning but just ignored it. It is the same as with most of the things we do; we tend to ignore anything that could hinder our mentality. The stuff many nations did before the 80s was nearly unspeakable; it isn't erased just by saying "but it was before the 80s."

As far as faked data... can we say "man-made climate change"? THAT is the epitome of faked data; climate change is cyclical. It has happened before and will happen again. And the THEY who provide empirical data = whoever you cite to provide information backing your thesis. *You learn this quickly in college.*
I don't care about climate change; it is a thing brought up by wise people in order to distract from the real issue. The real issue is the massive waste we have inflicted on the planet; the waste is exorbitant and will soon exceed the wildest nightmare. We have pumped so much dirt into the environment that it may take thousands of years for even a surface cleaning (never mind the hidden dirt). Most of our innovations were achieved by producing massive amounts of waste, every kind of waste, even your so-called "esoteric waste" known as electromagnetic waves, which have risen to astounding levels over the past 10 years, and it is a proven fact that this is unhealthy and creates many issues for all life forms, including humans. The climate change thing is just there to distract people, I think; there is no proven fact that the change was caused by humans, but there is a proven fact that all the massive dirt was produced and caused by humans, and we are already suffering the consequences; the particles can be found absolutely everywhere, inside every body too. But as long as our minds are loaded up with nonsense and we are already overloaded, we forget to target the real issue.
 
@Ivy

You have turned my brain into jelly....
I am totally lost.

Let's just say YAY, Halo 5 might come out on PC *well, it is rumored*
Square seems to think HD remakes are the best thing *please FFVII Remastered*
And Nintendo needs to die if they do not come out with a truly remarkable new addition to the console market *and they owe me for migraines from their Virtual Boy*
 
Nintendo? They make more cash with Amiibo and some old handheld refresh titles than you could ever imagine. Innovation is a matter of demand, and the demand is somewhere else, I guess. But in general, every company needs to act on profit, or at least in a self-sustaining way, and the profitable mainstream simply doesn't allow for great movements. Nintendo is fine in my eyes, but they are the prisoner of the major fraction of their own customers.

Square simply makes the best thing they can, like many other companies, and I think releasing an HD remake of FF10 was good stuff. About the FF7 remake I am a bit skeptical, because special things should remain untouched unless you can truly pay the required tribute to them. So it can become a bit problematic to handle the "holy grail"; they cannot be cheap on this matter.

Halo 5, I don't know; it is not my genre, so I won't speak to it. But in general I think there is a lot of hype, though it is basically the same with any franchise game. Lots of warmed-up remakes... and sadly too few true innovations.
 
Nintendo? They make more cash with Amiibo and some old handheld refresh titles than you could ever imagine.

First, let's keep assumptions about thinking capability out of this discussion.

Second: the Amiibo model is a standard hook-the-user model: constantly release new content for it, the same as World of Warcraft and its never-ending expansions.
The strategic move Nintendo made was to partner with Disney, which is like the crack cocaine of childhood. Kids literally seem to be programmed from birth to demand anything Disney. Nintendo saw that and capitalized on it. The fact you point out, that Nintendo basically lives off its current stock of customers, is one of my major peeves about the company. During the 80s, Nintendo appealed to the entire gaming audience: everything from family games to RPG/racing/fighting/arcade classics/etc.
Aside from Bayonetta 2, they have more or less focused only on family-oriented gaming.

The GameCube had Metroid Prime, which was awesome, and a Resident Evil version or two. Even though the GC is/was considered a failed console, it had a much more varied range of games available, and it did not try to live off gimmicks like a controller that is also a portable console... Remember the Dreamcast with its removable memory unit/game thing? In my opinion, Nintendo could do itself a MAJOR favor: get out of the full console market as Sega did, and focus on creating great games like it did in the 80s/90s. Let them keep making handheld units. But for the love of god, do not ship a modern version of the Game Boy without an AC adapter.

I love Square... when they actually assemble the right team to get the job done. I bought the special edition of FFXIV... before its initial launch, before the initial development team was terminated. I loved my Miqo'te archer, but it got to the point where all I was doing was fishing and crafting; there was no story. How many years did it take until FFXIV: A Realm Reborn launched? Heck, Square dropped the ball on half of the collector's edition extras I was supposed to get.

As for CPUs in gaming: how much of a game's performance is actually CPU-based, and how much depends on the power of the GPU in the system?
 