
All times are UTC - 8 hours




 Post subject: Does it make a rat's ass of a difference at 1440p?
PostPosted: Fri Dec 25, 2015 2:28 pm 
Offline
Friend of SPCR

Joined: Mon Feb 28, 2005 4:21 pm
Posts: 2887
Location: New York City zzzz
I have a very snazzy 1440p 27" monitor.

Does it make a rat's ass of a difference what CPU is used to get acceptable frame rates at high settings?

I have been reading for months and cannot find a benchmark of a NON high-end CPU paired with a GTX 970. Everyone tests them with a 5775C, or now Skylake, or various CPUs at 1080p gaming. I don't know who the hell builds a new system to game at 1080p. I mean, I had 1200p... SEVEN years ago. I was above 1080p gaming back then (yes, only by a little, but it still proves the point).

I can't see gaming at 1080p. Reviewers and people talking about it give me rage.

Anyone have any good-ish reviews of the GTX 970 (a 980 or 960 would be OK too) tested with various CPUs? I see lots of forum posts all over, but no benchmarks.

I have an AMD 1090T chip and am figuring on just buying a 970 instead of upgrading the whole system. I will also be adding a 1TB Samsung 850 EVO.


 Post subject: Re: Does it make a rat's ass of a difference at 1440p?
PostPosted: Fri Dec 25, 2015 2:57 pm 
Offline
Moderator

Joined: Thu Oct 06, 2005 4:36 am
Posts: 6651
Location: Monterey Bay, CA
Look in a different direction - some games are CPU dependent and some are more GPU dependent. The easy way out for a review site is to use a high-end CPU in their graphics card reviews so the CPU doesn't have a noticeable effect on frame rates. On the other hand, if you look at game benchmark reviews, some sites will show how much CPU horsepower is required before it becomes irrelevant. Techspot does a decent job. For a couple of examples, here are their reviews of Witcher 3 and Project Cars. You can get a feel for fps vs. platform and vs. clock speed.
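A toy way to picture why reviewers pair graphics cards with high-end CPUs - a minimal sketch, with completely made-up frame-rate numbers, not measurements:

```python
def effective_fps(cpu_fps_limit, gpu_fps_limit):
    """Delivered frame rate is roughly capped by whichever stage is the
    bottleneck: the CPU preparing frames or the GPU rendering them."""
    return min(cpu_fps_limit, gpu_fps_limit)

# At 1080p the GPU is fast, so CPU differences show up in the results:
print(effective_fps(cpu_fps_limit=70, gpu_fps_limit=120))  # 70 (CPU-bound)

# At 1440p the GPU limit drops below the CPU limit, hiding CPU differences:
print(effective_fps(cpu_fps_limit=70, gpu_fps_limit=60))   # 60 (GPU-bound)
```

That's why a mid-range CPU can look "good enough" at 1440p even when the same chip clearly trails in a 1080p test.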

_________________
1080p Gaming build: i5-4670K, Mugen 4, MSI Z87-G45, MSI GTX 760 2GB Gaming, 8GB 1866 RAM, Samsung Evo 250GB, Crucial MX100 256GB, WD Red 2TB, Samsung DVD burner, Fractal Define R4, Antec True Quiet 140 (2 front + rear) case fans, Seasonic X-560. 35-40W idle, 45-55W video streaming, 170-200W WoW, 200-230W Rift, 318W stress test (Prime95 + Furmark)

Support SPCR through these links: NCIX, Amazon and Newegg


 Post subject: Re: Does it make a rat's ass of a difference at 1440p?
PostPosted: Fri Dec 25, 2015 3:28 pm 
Offline
Friend of SPCR

Joined: Mon Feb 28, 2005 4:21 pm
Posts: 2887
Location: New York City zzzz
AH!! I found it! My 1090T is very similar to the 1100T, so that is what I looked for. Looks like I don't have much to worry about at 1440p with high settings.

It clearly does make a difference, but not a huge one. It seems worth upgrading at some point this year. I wish AMD would release Zen. Darn. The 1090T is an amazing processor.


 Post subject: Re: Does it make a rat's ass of a difference at 1440p?
PostPosted: Fri Dec 25, 2015 7:45 pm 
Offline

Joined: Thu Nov 19, 2009 10:20 am
Posts: 571
Location: Ottawa
It also depends a great deal on what else you do with your PC. Do you do anything else processor intensive with your computer? What kind of CPU will those tasks benefit from?

At a previous job the standard-issue developer workstation was a 1055T. Due to a lack of spares, I had to temporarily swap one for a Celeron G550 while I waited for a warranty replacement. There was much complaining about the lowly Celeron, but by the end of the day the developers were fighting over it because of how much faster it was in real-world tasks. The user outright refused to give it up once their AMD machine was fixed.


 Post subject: Re: Does it make a rat's ass of a difference at 1440p?
PostPosted: Sat Dec 26, 2015 1:46 am 
Offline
Friend of SPCR

Joined: Mon Feb 28, 2005 4:21 pm
Posts: 2887
Location: New York City zzzz
Squishing music and videos sometimes, SETI@home, some multi-core games. The 1055T is 400 MHz slower; that makes a notable difference in most things.

I have used a Celeron like that. I really don't know what the guy was smoking, or what minimal stuff he was doing. I can't imagine. I imagine there were more advanced parts of the chip being used, since it's a more optimized chip. It's Sandy Bridge as I understand it. I think I would rather stab myself in the eye than take a dual-core Celeron and do anything with it, other than throw it at someone's head, compared to a true six-core.


 Post subject: Re: Does it make a rat's ass of a difference at 1440p?
PostPosted: Sat Dec 26, 2015 2:06 am 
Offline
Friend of SPCR

Joined: Mon Feb 28, 2005 4:21 pm
Posts: 2887
Location: New York City zzzz
I was looking at a DX12 benchmark on the Tech Report site. It showed problems when more Intel cores were active. I wonder sometimes if things are written so poorly in games and OSes that this sort of thing happens. It seems nothing is optimized for anything aside from a few special instructions targeted specifically in programming, and then it runs smooth. This 1090T can seem dated for certain things, and then BOOM, all of a sudden the six cores crush a task. It is nice to have that. It seems Intel's hyperthreading works so well for them that no one needs true cores anymore.

The 350 dollar price tag for the 6700K is a bit steep at the moment. But I don't think it's worth getting another processor, given that I only upgrade once every 4-5 years now. I was on a best-of-AMD kick for a while: an A64 3200, then a socket 939 X2 (was awesome). I deviated for a bit to an Intel Conroe Core 2 Duo 6600. The Intel board I put it on, then an ASUS board, switched RAM... that was the crappiest bunch of crashes and problems. My AMD 3200 still works in my basement as a Linux machine. I have been impressed by AMD board manufacturers and their lifespan. I have had my 1090T since it came out. I know its single-threaded performance sorta sucks and it isn't optimized for many of the dinky functions that are most used now in programming.


 Post subject: Re: Does it make a rat's ass of a difference at 1440p?
PostPosted: Sat Dec 26, 2015 2:45 am 
Offline
*Lifetime Patron*

Joined: Thu Feb 13, 2003 6:53 am
Posts: 1753
Location: Sweden
Sandy Bridge was where Intel really solidified its dominance over AMD, I think. A cheap dual-core Celeron has been the standard choice in most of my builds since. Two cores, no HT, and a slightly smaller cache, but still lots of power for standard tasks. Cheap and power efficient. Not the build you want for encoding or any other specific task where lots of cores are useful, though.

If you need more power you need more cores, and Intel will make sure you pay for that. Since they dominate the market so completely, they can segment their products to the extreme. At one point a friend of mine went with AMD for a gaming rig. In the end it was a little more powerful than the Intel alternative for the same money. But he ended up with idle power consumption worse than my rig at load. ;) And looking at his build, I experienced a bit of nostalgia while we dealt with some overheating problems. That's what really keeps me from going with AMD today: the amazing power efficiency of the Intel stuff.

_________________
Main: ASRock B85M-ITX | i3-4330 | 16GB DDR3 | Intel 730 240GB | HDPLEX H1-S | picoPSU | No moving parts | Idle 13.9W
HTPC: ASRock H81M-ITX | Pentium G3420 | 4GB DDR3 | Intel 535 120GB | HDPLEX H1-S | picoPSU | No moving parts | Idle 11.2W
Gaming: Intel DH77EB | i5-3570K | GTX 1060 6GB | 16GB DDR3 | TJ08-E | RM750X
Server: ASRock N3150-ITX | ~30TB | G-360 | Idle ~25W


 Post subject: Re: Does it make a rat's ass of a difference at 1440p?
PostPosted: Sat Dec 26, 2015 7:38 am 
Offline

Joined: Thu Nov 19, 2009 10:20 am
Posts: 571
Location: Ottawa
~El~Jefe~ wrote:
I have used a Celeron like that. I really don't know what the guy was smoking, or what minimal stuff he was doing. I can't imagine. I imagine there were more advanced parts of the chip being used, since it's a more optimized chip. It's Sandy Bridge as I understand it. I think I would rather stab myself in the eye than take a dual-core Celeron and do anything with it, other than throw it at someone's head, compared to a true six-core.

It's not complicated. While the Celeron only has two cores, they are much faster than two cores of the AMD chip. That makes for a much more pleasant end-user experience. Sure, compiling took longer on the Celeron, but it was still a significant wait on the AMD, so it was an interruption either way.

AMD workflow: less pleasant use of PC, take coffee break while compiling
Celeron workflow: more pleasant use of PC, take longer coffee break while compiling

Another good example of this effect is the RAW photo editor Lightroom. While it can use all your cores for batch processing, its interface and manual editing are very much single-threaded. It's also rather slow at the best of times, so for a pleasant experience you want the highest single-core performance you can get. A low-core-count Intel chip will give a much better user experience than a many-core AMD chip, even if batch processing takes longer once you are done editing.
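The tradeoff above is basically Amdahl's-law intuition. A rough sketch with invented numbers (the "2x per core" figure is assumed, the work units are arbitrary): the serial, interactive part of a job only sees per-core speed, while the parallel batch part sees cores times speed.

```python
def task_time(serial_work, parallel_work, per_core_speed, cores):
    """Total time: the serial part runs on one core, the parallel
    part is split evenly across all cores."""
    return serial_work / per_core_speed + parallel_work / (per_core_speed * cores)

# Invented relative speeds: Celeron core ~2x a Phenom core.
celeron = task_time(serial_work=100, parallel_work=100, per_core_speed=2.0, cores=2)
phenom  = task_time(serial_work=100, parallel_work=100, per_core_speed=1.0, cores=6)

print(celeron)  # 75.0
print(phenom)   # ~116.7
```

With a workload that's half serial, the fast dual-core finishes sooner overall; tilt the mix far enough toward parallel batch work and the six slow cores win back the total, but the interactive half still feels worse.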


 Post subject: Re: Does it make a rat's ass of a difference at 1440p?
PostPosted: Sat Dec 26, 2015 2:00 pm 
Offline
Friend of SPCR

Joined: Mon Feb 28, 2005 4:21 pm
Posts: 2887
Location: New York City zzzz
OK, that makes a lot of sense.

I use my CPU for three things: gaming at 1440p, converting music and video formats, and at times SETI@home. Compressing a lot of stuff at times. I also do a lot of I/O tasks.

I wonder if there is a better board and chip that handles things like I/O tasks with the least impact.

It seems then I would do well, for the cost, with a Skylake i5 6600K. It would give me good single-threaded performance for games and four cores for workloads. I would think four medium-clocked Skylake cores would be faster than my 1090T's six cores for whatever I would do with my current setup.


 Post subject: Re: Does it make a rat's ass of a difference at 1440p?
PostPosted: Sat Dec 26, 2015 2:40 pm 
Offline
Moderator

Joined: Thu Oct 06, 2005 4:36 am
Posts: 6651
Location: Monterey Bay, CA
Here's a very brief comparison to give you a feel for the single thread vs multi-thread performance.

_________________
1080p Gaming build: i5-4670K, Mugen 4, MSI Z87-G45, MSI GTX 760 2GB Gaming, 8GB 1866 RAM, Samsung Evo 250GB, Crucial MX100 256GB, WD Red 2TB, Samsung DVD burner, Fractal Define R4, Antec True Quiet 140 (2 front + rear) case fans, Seasonic X-560. 35-40W idle, 45-55W video streaming, 170-200W WoW, 200-230W Rift, 318W stress test (Prime95 + Furmark)

Support SPCR through these links: NCIX, Amazon and Newegg


 Post subject: Re: Does it make a rat's ass of a difference at 1440p?
PostPosted: Sat Dec 26, 2015 2:48 pm 
Offline

Joined: Thu Nov 19, 2009 10:20 am
Posts: 571
Location: Ottawa
For 80%+ of what you do, a newer i3 would be better, and a newer i5 would be better at everything. The hex-core Phenoms were great for multi-threaded applications in their time, especially considering they don't have any real replacement from AMD. However, their single-thread performance lags so much now that they really cannot compete. Even adjusting for clock speed, a single Haswell core has about 2x the performance of a Phenom core. Skylake pushes that even further.

If you are really concerned about I/O, then go Skylake. The DMI bus is 8 GB/sec, up from 4 GB/sec in Sandy Bridge through Broadwell. Plus the chipset has far more PCIe lanes than previous versions. You can also connect your I/O devices directly to the CPU-provided PCIe lanes; they don't have to be used only for video cards. There is of course Haswell-E with 40 PCIe lanes if you want to spend the money.
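Back-of-envelope on what doubling DMI bandwidth buys you - an idealized sketch that ignores protocol overhead and uses the round 4 and 8 GB/sec figures quoted above:

```python
def transfer_seconds(size_gb, bandwidth_gb_per_s):
    """Idealized time to move size_gb across a link, ignoring overhead
    and assuming the devices on both ends can keep up."""
    return size_gb / bandwidth_gb_per_s

# Moving 40 GB between two fast drives hanging off the chipset:
print(transfer_seconds(40, 4))  # 10.0 s on the older 4 GB/sec DMI
print(transfer_seconds(40, 8))  # 5.0 s on Skylake's 8 GB/sec DMI
```

In practice the drives themselves are usually the limit before the link is, but the wider link matters once several chipset devices are busy at once.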


 Post subject: Re: Does it make a rat's ass of a difference at 1440p?
PostPosted: Sat Dec 26, 2015 4:34 pm 
Offline
Friend of SPCR

Joined: Mon Feb 28, 2005 4:21 pm
Posts: 2887
Location: New York City zzzz
Oh wow, nice. So it looks like a 6600K will probably be my best bet. I imagine there will be a 6800K one day for 380 dollars that runs at 4.5 GHz. Something like that always happens.

Nice to know the I/O is so cranked and has lots of PCIe lanes.

I was thinking of getting the 5775C chip. The Tech Report noticed that it has the LOWEST chance of stuttering in a game. That 128MB eDRAM stuff is sick, sick.

That would probably be my ideal chip. Looks rare though, sadly, especially at Christmas time I'm guessing.


