Does it make a rat's ass of a difference at 1440p?

~El~Jefe~
Friend of SPCR
Posts: 2887
Joined: Mon Feb 28, 2005 4:21 pm
Location: New York City zzzz
Contact:

Does it make a rat's ass of a difference at 1440p?

Post by ~El~Jefe~ » Fri Dec 25, 2015 2:28 pm

I have a very snazzy 1440p 27" monitor.

Does it make a rat's ass of a difference which CPU is used to get acceptable frame rates at high settings?

I have been reading for months and cannot find a non-high-end CPU benchmark that uses a GTX 970. Everyone pairs it with a 5775C or now Skylake, or tests various CPUs at 1080p gaming. I don't know who the hell builds a new system to game at 1080p. I had 1200p SEVEN years ago; I was above 1080p gaming back then (only by a little, but it still proves the point).

I can't see gaming at 1080p, and reviewers and people who keep talking about it give me rage.

Anyone have any decent reviews of the GTX 970 (a 980 or 960 would be OK too) tested with various CPUs? I see lots of forum posts all over, but not benchmarks.

I have an AMD 1090T and am figuring on just buying a 970 instead of upgrading the whole system. I will also be adding a 1TB Samsung 850 EVO.

CA_Steve
Moderator
Posts: 7651
Joined: Thu Oct 06, 2005 4:36 am
Location: St. Louis, MO

Re: Does it make a rat's ass of a difference at 1440p?

Post by CA_Steve » Fri Dec 25, 2015 2:57 pm

Look in a different direction - some games are CPU dependent and some are more GPU dependent. The easy way out for a review site is to use a high-end CPU in their graphics card reviews so the CPU doesn't have a noticeable effect on framerates. On the other hand, if you look at game benchmark reviews, some sites will show how much CPU horsepower is required before it becomes irrelevant. Techspot does a decent job. For a couple of examples, here are their reviews of Witcher 3 and Project Cars. You can get a feel for fps vs. platform and vs. clock speed.
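Roughly speaking, the fps you actually see is capped by whichever of the CPU or GPU is slower for that game. A minimal back-of-the-envelope sketch (the numbers below are made up for illustration, not taken from any benchmark):

def delivered_fps(cpu_fps, gpu_fps):
    # The frame rate you see is limited by the slower of the two stages.
    return min(cpu_fps, gpu_fps)

# Hypothetical 1440p/high numbers for a GTX 970 class card (not benchmark data):
print(delivered_fps(cpu_fps=70, gpu_fps=55))  # 55 -> GPU-bound, a faster CPU barely helps
print(delivered_fps(cpu_fps=40, gpu_fps=55))  # 40 -> CPU-bound, a CPU upgrade shows up

If the GPU side is the lower number at your resolution and settings, the CPU mostly stops mattering, which is what the 1440p results tend to show.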

~El~Jefe~
Friend of SPCR
Posts: 2887
Joined: Mon Feb 28, 2005 4:21 pm
Location: New York City zzzz
Contact:

Re: Does it make a rat's ass of a difference at 1440p?

Post by ~El~Jefe~ » Fri Dec 25, 2015 3:28 pm

Ah, I found it! My 1090T is very similar to the 1100T, so that is what I looked for. It looks like I don't have much to worry about at 1440p with high settings.

It clearly does make a difference, but not a huge one. It seems worth upgrading at some point this year. I wish AMD would release Zen. Darn. The 1090T is an amazing processor.

washu
Posts: 571
Joined: Thu Nov 19, 2009 10:20 am
Location: Ottawa

Re: Does it make a rat's ass of a difference at 1440p?

Post by washu » Fri Dec 25, 2015 7:45 pm

It also depends a great deal on what else you do with your PC. Do you do anything else processor-intensive with your computer? What kind of CPU would those tasks benefit from?

At a previous job the standard-issue developer workstation was a 1055T. Due to a lack of spares, I had to temporarily swap one for a Celeron G550 while I waited for a warranty replacement. There was much complaining about the lowly Celeron, but by the end of the day the developers were fighting over it because of how much faster it was in real-world tasks. The user outright refused to give it up once their AMD machine was fixed.

~El~Jefe~
Friend of SPCR
Posts: 2887
Joined: Mon Feb 28, 2005 4:21 pm
Location: New York City zzzz
Contact:

Re: Does it make a rat's ass of a difference at 1440p?

Post by ~El~Jefe~ » Sat Dec 26, 2015 1:46 am

Compressing music and videos sometimes, SETI@home, some multi-core games. The 1055T is 400 MHz slower, which makes a notable difference in most things.

I have used a Celeron like that. I really don't know what that guy was smoking, or what minimal stuff he was doing; I can't imagine. Maybe he was hitting parts of the chip that are better optimized for his tasks - it's Sandy Bridge as I understand it. I think I would rather stab myself in the eye than take a two-core Celeron and do anything with it other than throw it at someone's head, compared to a true six-core.

~El~Jefe~
Friend of SPCR
Posts: 2887
Joined: Mon Feb 28, 2005 4:21 pm
Location: New York City zzzz
Contact:

Re: Does it make a rat's ass of a difference at 1440p?

Post by ~El~Jefe~ » Sat Dec 26, 2015 2:06 am

I was looking at a DX12 benchmark on your Tech Report site. It was showing problems when more Intel cores were active. I wonder at times whether games and OSes are written so poorly that this sort of thing happens. It seems nothing is optimized unless a few special instructions are targeted specifically in the programming, and then it runs smoothly. I have found that this 1090T can seem dated for certain things, and then boom, all of a sudden the six cores crush a task. It is nice to have that. It seems Intel's Hyper-Threading works so well for them that no one needs true cores anymore.

The $350 price tag for the 6700K is a bit steep at the moment. But I don't think it is worth getting another processor, given that I only upgrade every 4-5 years now. I was on a best-of-AMD kick for a while: an A64 3200, then a socket 939 X2 (which was awesome), then I deviated for a bit to an Intel Conroe Core 2 Duo 6600. The Intel board I put it on, then an Asus board, then switched RAM - that was the crappiest bunch of crashes and problems. My AMD 3200 still works in my basement as a Linux machine. I have been impressed by AMD board manufacturers and their lifespan. I have had my 1090T since it came out. I know that single-threaded it sort of sucks and isn't optimized for many of the small functions that are most used now in programming.

Vicotnik
*Lifetime Patron*
Posts: 1831
Joined: Thu Feb 13, 2003 6:53 am
Location: Sweden

Re: Does it make a rat's ass of a difference at 1440p?

Post by Vicotnik » Sat Dec 26, 2015 2:45 am

Sandy Bridge was where Intel really solidified its dominance over AMD, I think. A cheap dual-core Celeron has been the standard choice in most of my builds since: two cores, no HT, and a slightly smaller cache, but still lots of power for standard tasks. Cheap and power efficient. Not the build you want for encoding or any other task where lots of cores are useful, though.

If you need more power you need more cores, and Intel will make sure you pay for that. Since they dominate the market so completely, they can segment their products to the extreme. At one point a friend of mine went with AMD for a gaming rig. In the end it was a little more powerful than the Intel alternative for the same money, but he ended up with idle power consumption worse than my rig at load. ;) And looking at his build, I experienced a bit of nostalgia while we dealt with some overheating problems. That's what really keeps me from going with AMD today: the amazing power efficiency of the Intel stuff.
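To put a rough number on why idle power matters (the wattages below are hypothetical, just to show the arithmetic - neither build was actually measured):

idle_delta_watts = 60                       # hypothetical: the AMD build idles 60 W higher
hours_per_year = 20 * 365                   # assume the box sits mostly idle ~20 hours a day
kwh_per_year = idle_delta_watts * hours_per_year / 1000
print(f"{kwh_per_year:.0f} kWh/year extra")                 # ~438 kWh
print(f"~{kwh_per_year * 0.15:.0f} per year at 0.15/kWh")   # ~66 in whatever currency

Over a few years that difference is real money, on top of the extra heat and fan noise.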

washu
Posts: 571
Joined: Thu Nov 19, 2009 10:20 am
Location: Ottawa

Re: Does it make a rat's ass of a difference at 1440p?

Post by washu » Sat Dec 26, 2015 7:38 am

~El~Jefe~ wrote: I have used a Celeron like that. I really don't know what that guy was smoking, or what minimal stuff he was doing; I can't imagine. Maybe he was hitting parts of the chip that are better optimized for his tasks - it's Sandy Bridge as I understand it. I think I would rather stab myself in the eye than take a two-core Celeron and do anything with it other than throw it at someone's head, compared to a true six-core.
It's not complicated. While the Celeron only has two cores, they are much faster than the AMD chip's cores, which makes for a much more pleasant end-user experience. Sure, compiling took longer on the Celeron, but compiling was still a significant wait on the AMD, so it was an interruption either way.

AMD Workflow: Less pleasant use of PC, take coffee break while compiling
Celeron Workflow: More pleasant use of PC, take longer coffee break while compiling

Another good example of this effect is the RAW photo editor Lightroom. While it can use all your cores for batch processing, its interface and manual editing are very much single-threaded. It is also rather slow at the best of times, so for a pleasant experience you want the highest single-core performance you can get. A low-core-count Intel chip will give a much better user experience than a many-core AMD chip, even if the batch processing takes longer once you are done editing.
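A toy model of that trade-off (the numbers are invented, just to show the shape of it): the editing you sit through scales with single-core speed, while the export you walk away from scales with total throughput.

def session_time(edit_work, batch_work, per_core_speed, cores):
    editing = edit_work / per_core_speed              # interactive work: effectively one core
    batch = batch_work / (per_core_speed * cores)     # parallel export/compile uses them all
    return editing, batch

# Hypothetical chips, relative per-core speed only:
for name, speed, cores in [("fast dual-core", 2.0, 2), ("slow hex-core", 1.0, 6)]:
    editing, batch = session_time(edit_work=100, batch_work=120,
                                  per_core_speed=speed, cores=cores)
    print(f"{name}: {editing:.0f} min you sit through, {batch:.0f} min coffee break")

The part you actually sit through is halved on the fast dual-core; the part that gets longer is the part you weren't watching anyway.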

~El~Jefe~
Friend of SPCR
Posts: 2887
Joined: Mon Feb 28, 2005 4:21 pm
Location: New York City zzzz
Contact:

Re: Does it make a rat's ass of a difference at 1440p?

Post by ~El~Jefe~ » Sat Dec 26, 2015 2:00 pm

OK, that makes a lot of sense.

I use my CPU mainly for gaming at 1440p, converting music and video between formats, and sometimes SETI@home. I compress a lot of stuff at times. I also do a lot of I/O-heavy tasks.

I wonder if there is a board and chip combination that handles things like I/O tasks with the least impact.

It seems, then, that I would do well for the cost with a Skylake i5, the 6600K. It would give me good single-threaded performance for games and four cores for workloads. I would think four medium-clocked Skylake cores would be faster than my 1090T's six cores for whatever I do with my current setup.

CA_Steve
Moderator
Posts: 7651
Joined: Thu Oct 06, 2005 4:36 am
Location: St. Louis, MO

Re: Does it make a rat's ass of a difference at 1440p?

Post by CA_Steve » Sat Dec 26, 2015 2:40 pm

Here's a very brief comparison to give you a feel for single-thread vs. multi-thread performance.

washu
Posts: 571
Joined: Thu Nov 19, 2009 10:20 am
Location: Ottawa

Re: Does it make a rat's ass of a difference at 1440p?

Post by washu » Sat Dec 26, 2015 2:48 pm

For 80%+ of what you do, a newer i3 would be better, and a newer i5 would be better at everything. The hex-core Phenoms were great for multi-threaded applications in their time, especially considering they never got a real replacement from AMD. However, their single-thread performance lags so much now that they really cannot compete. Even adjusting for clock speed, a single Haswell core has about 2x the performance of a Phenom core, and Skylake pushes that even further.

If you are really concerned about I/O, then go Skylake. The DMI bus is 8 GB/sec, up from 4 GB/sec in Sandy Bridge through Broadwell, and the chipset has far more PCIe lanes than previous versions. You can also hang I/O devices directly off the CPU's PCIe lanes; they don't have to be used only for video cards. There is of course Haswell-E with 40 PCIe lanes if you want to spend the money.
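To put rough numbers on that (treating the ~2x-per-core figure above as an assumption rather than a measurement), even total throughput ends up favoring the four-core chip:

phenom_per_core = 1.0    # Phenom II core as the baseline
skylake_per_core = 2.0   # assumed: roughly 2x per core, per the estimate above

phenom_total = 6 * phenom_per_core     # 1090T: six cores
skylake_total = 4 * skylake_per_core   # i5-6600K: four cores

print(f"Phenom II X6 aggregate:   {phenom_total:.1f}")   # 6.0
print(f"Skylake i5 aggregate:     {skylake_total:.1f}")   # 8.0
print(f"Single-thread difference: {skylake_per_core / phenom_per_core:.1f}x")

So even the all-cores-busy case comes out ahead, and anything single-threaded is roughly twice as fast.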

~El~Jefe~
Friend of SPCR
Posts: 2887
Joined: Mon Feb 28, 2005 4:21 pm
Location: New York City zzzz
Contact:

Re: Does it make a rat's ass of a difference at 1440p?

Post by ~El~Jefe~ » Sat Dec 26, 2015 4:34 pm

Oh wow, nice. So it looks like a 6600K will probably be my best bet. I imagine there will be a 6800K one day for $380 that runs at 4.5 GHz; something like that always happens.

Nice to know the I/O is so cranked up and that there are lots of PCIe lanes.

I was also thinking of getting the 5775C. Tech Report noticed that it has the LOWEST chance of stuttering in a game. That 128MB of eDRAM is sick.

That would probably be my ideal chip. It looks rare though, sadly, especially at Christmas time.
