EVGA GTX 750 Ti FTW Graphics Card Review

NVIDIA has another architecture release day for us! The Maxwell GPU has arrived at Overclockers in the form of the EVGA GTX 750 Ti FTW. The Maxwell architecture promises great power efficiency, low power consumption, and performance that surpasses its predecessors. This particular card also comes with EVGA's ACX cooler, which has tested very well on other EVGA video cards in the past. So, let's find out whether EVGA put this new technology to good use as we run the GTX 750 Ti FTW through our testing procedure.

Specifications and Features

As we look at the specifications below, right off the bat we can see that EVGA has factory overclocked the card well beyond reference design speeds. The reference design has a base clock of 1020 MHz and a boost clock of 1085 MHz, but here we have 1189 MHz and 1268 MHz, respectively. I'd say that's a pretty hefty overclock right from the start. The card also features 2 GB of GDDR5 memory set to a quad-pumped (effective) speed of 5400 MHz. Another added bonus is the inclusion of a 6-pin PCI-E power connector, whereas no such connector is found on the reference design cards. Specifications provided by EVGA.

EVGA GTX 750 Ti FTW Specifications
Graphics Processing Clusters 1
Streaming Multiprocessors 5
CUDA Cores 640
Texture Units 40
ROP Units 16
Base Clock 1189 MHz
Boost Clock 1268 MHz
Memory Clock (Data rate) 5400 MHz
L2 Cache Size 2048KB
Total Video Memory 2048MB GDDR5
Memory Interface 128-bit
Total Memory Bandwidth 86.4 GB/s
Texture Filtering Rate (Bilinear) 40.8 GigaTexels/sec
Fabrication Process 28 nm
Transistor Count 1.87 Billion
Connectors 1 x Dual-Link DVI
1 x HDMI
1 x DisplayPort 1.2
Form Factor 2 Slots
Power Connectors 1 x 6-Pin PCI-E
Recommended Power Supply 400 Watts/20A +12V
Thermal Design Power (TDP) 60 Watts
Thermal Threshold 95° C
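As a quick sanity check, the bandwidth and fill-rate figures in the table follow directly from the clock and bus-width specs. Here's a minimal sketch in Python using the numbers above; note the bilinear filtering rate appears to be derived from the 1020 MHz reference base clock rather than the FTW's 1189 MHz:

```python
# Memory bandwidth: effective data rate (MT/s) * bus width (bytes per transfer)
data_rate_mts = 5400          # GDDR5 quad-pumped rate from the spec table
bus_width_bytes = 128 // 8    # 128-bit interface = 16 bytes per transfer
bandwidth_gbs = data_rate_mts * bus_width_bytes / 1000
print(bandwidth_gbs)          # 86.4 GB/s, matching the table

# Bilinear texture fill rate: texture units * clock (GHz)
texture_units = 40
reference_base_ghz = 1.020    # reference base clock, not the FTW's 1189 MHz
texel_rate = round(texture_units * reference_base_ghz, 1)
print(texel_rate)             # 40.8 GigaTexels/sec
```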

A quick glance at GPU-Z confirms much of what we see above. History tells us that the actual boost clock usually comes in quite a bit higher than the official specification. This held true with this card too, and it actually boosted up to 1345 MHz when under load.

evga_gtx750ti (39)

For features, we'll first look at those that come from NVIDIA itself and are commonly found on most of their newer GPUs.

evga_gtx750ti (1)

We already mentioned the GTX 750 Ti FTW is outfitted with the ACX cooler. I've tested EVGA cards that use the ACX cooler in the past and have always come away impressed with how well it works. On the software side, EVGA's Precision X is quickly becoming the go-to utility for overclocking NVIDIA-based GPUs. OC Scanner X is another useful tool for checking overclock stability and monitoring vital information.

When you buy into the EVGA product line, you get one of the best online community experiences found anywhere. From EVGA's game servers, social networking, and 24/7 tech support to the EVGA forums, you're always in the loop and can quickly find help when needed.

evga_gtx750ti (2) evga_gtx750ti (3)
evga_gtx750ti (4) evga_gtx750ti (5)
evga_gtx750ti (6) evga_gtx750ti (7)
evga_gtx750ti (8)

I attended a web conference with NVIDIA on February 13th, where they presented information on the new Maxwell platform. I'd be remiss if I didn't at least relay some of that information here. Maxwell does in fact bring quite a few new features to the GPU world. The first item of note is that the GTX 750 Ti and GTX 750 will be the first cards to use the new Maxwell GM107 GPU core. The GTX 750 Ti is aimed squarely at AMD's R7 260X, while the GTX 750 stacks up against the R7 260. The GeForce GTX 750 Ti will replace the GeForce GTX 650 Ti in NVIDIA's lineup, but the GTX 650 and GTX 660 will continue to be produced.

Buying into the NVIDIA GTX family of graphics cards allows the user to take advantage of several GTX Gaming technologies. ShadowPlay, G-Sync, and SHIELD are just a few of the unique NVIDIA technologies that promise enhanced gaming options not found elsewhere. Whether your intent is to stream game play, obtain smooth and stutter free visuals, or even make a video of your game session, you’ll find all the tools you need to accomplish this within the NVIDIA ecosystem.

As NVIDIA put it during the web conference, "The soul of Maxwell is improving performance per watt." The Maxwell architecture brings a new design to the Streaming Multiprocessor (SM) that improves performance per watt and performance per area. Logic partitioning, workload balancing, clock-gating granularity, the number of instructions issued per clock cycle, and compiler-based scheduling are just a few of the many improvements over the Kepler architecture. The number of SMs has increased to five from the GK107's two, with only a 25% increase in die area. The L2 cache sees a huge increase, from Kepler's 256 KB to Maxwell's 2048 KB. With more L2 cache on chip, there is a large drop in requests to DRAM, which reduces power demand and improves overall performance. To further maximize energy efficiency, NVIDIA says its engineers aggressively tuned the implementation of each unit in the Maxwell GPU down to the transistor level. In a nutshell, this all boils down to NVIDIA's claim that a Maxwell-based GM107 GPU can deliver twice the performance per watt of a Kepler-based GK107 GPU, while continuing to use the same 28 nm manufacturing process.

evga_gtx750ti (9)
evga_gtx750ti (10)

One GPC, five Maxwell Streaming Multiprocessors (SMM), and two 64-bit memory controllers (128-bit total) make up the contents of the GM107 GPU. This is the uncut implementation of the GM107 and is what’s found on the GTX 750 Ti. Below is the full chip block diagram showing the GPU configuration.

evga_gtx750ti (11)
Some of you may be interested in the block diagram of the SMM itself, so here that is.

evga_gtx750ti (12)
For a down-and-dirty comparison, here is a chart summarizing the major improvements the Maxwell GPU provides over the Kepler GPU. Keep in mind, a few of these numbers vary depending on NVIDIA partner card designs, especially when it comes to clock speeds.

GPU GK107 (Kepler) GM107 (Maxwell)
CUDA Cores 384 640
Base Clock 1058 MHz 1020 MHz
GPU Boost Clock N/A 1085 MHz
GFLOPs 812.5 1305.6
Texture Units 32 40
Texel fill-rate 33.9 Gigatexels/sec 40.8 Gigatexels/sec
Memory Clock 5000 MHz 5400 MHz
Memory Bandwidth 80 GB/sec 86.4 GB/sec
ROPs 16 16
L2 Cache Size 256KB 2048KB
TDP 64W 60W
Transistors 1.3 Billion 1.87 Billion
Die Size 118 mm² 148 mm²
Manufacturing Process 28nm 28nm
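The GFLOPs rows in the chart fall out of the usual peak-FLOPs formula: CUDA cores × 2 operations per clock (one fused multiply-add per core) × base clock. A quick check against the table's figures, sketched in Python:

```python
def sp_gflops(cuda_cores, base_clock_mhz):
    """Peak single-precision GFLOPs: cores * 2 ops/clock (FMA) * clock."""
    return cuda_cores * 2 * base_clock_mhz / 1000

print(sp_gflops(384, 1058))   # GK107 (Kepler): 812.544, the table's 812.5
print(sp_gflops(640, 1020))   # GM107 (Maxwell): 1305.6, matching the table
```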

I wanted to touch on two other aspects I found appealing about the Maxwell-based GTX 750 Ti. The first is how it stacks up against the R7 260X in performance and efficiency. Based on the graphs below, NVIDIA claims the GTX 750 Ti outperforms the R7 260X while using just a fraction of the power.

evga_gtx750ti (14)
evga_gtx750ti (13)

Secondly, the extremely low power draw makes it an attractive option for HTPC, mITX, or basic store-bought systems that may have smaller-wattage PSUs. If you buy a reference design GTX 750 Ti, there is no 6-pin power connector needed, which opens up a wider variety of potential uses. Currently, the GTX 750 Ti is the fastest available graphics card that does not require a power connector. As is the case with today's review sample, you'll probably find that NVIDIA partners will add a 6-pin power connector to some of their Maxwell models. The other thing that makes the GTX 750 Ti an easy upgrade for a wide variety of systems is its physical size. In reference form, the card measures only 5.75″ long. Even the EVGA GTX 750 Ti FTW measures just a smidgen over 9″ long. Either way, the card will fit into almost any system design out there.

evga_gtx750ti (15)

Upgrade Your Basic PC

As you can see, the new Maxwell GPU brings a lot of new technologies to the table, and it should make for interesting times ahead!

…Now, back to our regularly scheduled program.

Packaging and First Look

The information on the box does a nice job of explaining the product within. On the front, you see a few high-level features mentioned, as well as the card's "FTW" family branding. Around back, the features and specifications we covered above are displayed. The box sides serve as a placard for additional branding and a multilingual list of key features.


Inside the box, you will find the GTX 750 Ti FTW well protected in a bubble wrap envelope. Also included are lots of documentation, a DVI to VGA adapter, a PCI-E adapter cable, and the driver/utility CD. EVGA also tosses in a poster, case badge, and a couple different stickers.


First Look/Photo Op

Here are a few pictures of the EVGA GTX 750 Ti FTW taken from different angles. We’ll have a closeup look at the card as the review progresses.

EVGA GTX 750 Ti FTW

The EVGA GTX 750 Ti FTW Up Close

The first thing worth noting is the lack of SLI support on the GTX 750 Ti series of cards. It’s understandable that SLI wouldn’t be a feature of this card, as it’s probably not in the class of cards most would consider for that type of setup anyway. The card supports up to three concurrent displays through the available dual-link DVI, HDMI, and DisplayPort connections. As we mentioned earlier, EVGA opted to include a 6-pin PCI-E power connector. Adding this power connector is said to provide a 25 watt boost (30% increase) in power delivery, which should improve overclocking potential.

Display Connectivity

6-Pin PCI-E Power Connector

The ACX cooler features a dual-fan design that sits atop the aluminum fin stack. Two large copper heatpipes pass through the cooling block as they weave their way through the fin stack. This isn't the most robust implementation of the ACX cooler I've seen, but given the smaller size of the card and the Maxwell GPU's low power draw, it should work just fine. We'll find out later in the review.

ACX Cooler and Shroud Removed

Heatpipe Design

With the ACX cooler removed, we can get a clear view of the PCB layout. It appears we have a 3+1 power phase design implemented here (3 GPU + 1 Memory).

Bare PCB

GPU Power Phases

Memory Power Phase

The 2 GB of onboard memory is courtesy of the Samsung K4G41325FC-HC03 GDDR5 memory ICs. Finally, we have a close-up look at the Maxwell GM107 GPU.

Samsung Memory

Maxwell GM107 GPU Core

EVGA’s Precision X Software

Precision X has evolved into one of the better GPU overclocking utilities over the past couple of years. It allows for real-time monitoring of vital GPU information, overclocking adjustments, and control of the ACX cooler fans. The fan control option lets you set the speed manually or use a fan curve. On this particular card, there is no power target adjustment available; however, we do have the option to raise the temperature target as high as 95 °C. There is a slight core voltage adjustment available as well, but it's a mere +31 mV. Another item of note is the ability to save up to ten different profiles.

Precision X

Precision X Voltage Option

To complement Precision X, you might want to grab a copy of OC Scanner X too. It's a quick way to test GPU overclocking stability and check for artifacting issues. It even comes with its own monitoring capabilities.

OC Scanner X

Overclocking For Stability

NVIDIA mentioned that the GTX 750 Ti should reach a 1250 MHz base clock without much trouble, and that's exactly where this card landed while still completing every test in our suite. That might not sound like much compared to the 1189 MHz factory overclock, but against the reference clock of 1020 MHz it's actually pretty impressive. The actual boost clock landed at 1407 MHz with the GPU overclocked to 1250 MHz. On the memory side, I was able to push an additional 200 MHz, which landed us at 1550 MHz (6200 MHz quad pumped). Nothing to complain about there either!

1250 MHz GPU/1550 MHz Memory Stable
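Put in percentage terms, those clocks work out as follows (a quick sketch in Python; reference, factory, and overclocked speeds taken from the numbers above):

```python
def pct_gain(new, old):
    """Percent increase from the old clock to the new clock."""
    return (new - old) / old * 100

ref_base, factory_base, oc_base = 1020, 1189, 1250
mem_stock, mem_oc = 1350, 1550   # actual memory clocks; effective rate is 4x

print(round(pct_gain(oc_base, ref_base), 1))      # 22.5% over reference
print(round(pct_gain(oc_base, factory_base), 1))  # 5.1% over the factory OC
print(round(pct_gain(mem_oc, mem_stock), 1))      # 14.8% memory overclock
```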

Ok, now that we have our 24/7 stable overclock established, let’s get to the benchmarks!

Benchmarks

Test System

System Components
Motherboard ASUS Maximus VI Formula
CPU Intel i7 4770K Haswell
Memory G.Skill TridentX DDR3-2666 2x4GB
SSD Samsung EVO 500 GB SSD
Power Supply Corsair HX1050 Professional Series
Video Card EVGA GTX 750 Ti w/ACX Cooler
Cooling Swiftech Apogee HD CPU Water Block – 360 mm Radiator – MCP35X Pump

We’ll adhere to the Overclockers.com GPU test procedure we have been using since the release of the Haswell platform. If you’re not familiar with our testing method, click on the link provided for more information. For quick reference, below is a synopsis of what we do.

Minimum System Requirements

  • i7 4770K @ 4 GHz
  • Dual Channel DDR3-1866 9-9-9-24
  • GPU @ stock and overclocked
  • Monitor capable of 1920×1080

Synthetic Tests

  • 3DMark Vantage – DirectX 10 benchmark running at 1280×1024 – Performance preset.
  • 3DMark 11 – DirectX 11 benchmark running at 1280×720 – Performance preset.
  • 3DMark Fire Strike – DirectX 11 benchmark running at 1920×1080 – Standard test (not Extreme).
  • Unigine Heaven (HWBot version) – DirectX 11 benchmark – Extreme setting.

Game Tests

  • Batman: Arkham Origins – 1920×1080, 8x MSAA, PhysX off, V-Sync off, the rest set to on or DX11 enhanced.
  • Battlefield 4 – 1920×1080, Ultra preset, V-Sync off.
  • Bioshock Infinite – 1920×1080, Ultra DX11 preset, DOF on.
  • Crysis 3 – 1920×1080, Very High settings, 16x AF, 8x MSAA, V-Sync off.
  • Final Fantasy XIV: A Realm Reborn – 1920×1080, Maximum preset.
  • Grid 2 – 1920×1080, 8x MSAA, Intel-specific options off, everything else set to the highest available option.
  • Metro: Last Light – 1920×1080, DX11 preset, SSAA on, Tessellation very high, PhysX off.

For the sake of comparison, I've chosen four current-generation AMD cards. Because NVIDIA claims better performance than the R7 260 series, we'll use the R7 260X and R7 260 in the comparison graphs. Taking a step up in price and performance, I also included the R9 270 and R9 270X. How close can the mid-range GTX 750 Ti FTW come to the higher-priced R9 270 series? Let's find out.

Our synthetic tests show a pecking order that actually held true throughout our entire test suite. The EVGA GTX 750 Ti FTW had no trouble topping the R7 260 and R7 260X in all the tests. Once it was overclocked, it closed the gap substantially when compared to the R9 270 and R9 270X.

HWBot Heaven Results

3DMark Fire Strike Results

3DMark 11 Results

3DMark Vantage Results

The game benchmarks show the same pattern with the EVGA GTX 750 Ti FTW topping the R7 260 series cards handily. Of note here are the Bioshock Infinite results, where the EVGA GTX 750 Ti FTW actually topped the R9 270 at stock speed. Then, when overclocked, it topped the R9 270X just for good measure! The Battlefield 4 and Crysis 3 results also show this card holding very close to the R9 270 series cards.

Batman: Arkham Origins Results

Battlefield 4 Results

Bioshock Infinite Results

Crysis 3 Results

FFXIV: ARR Results

Grid 2 Results

Metro: Last Light Results

If you were paying attention, you'll have noticed that every game except Crysis 3 and Metro: Last Light surpassed the 30 FPS threshold we call "playable". That's pretty impressive stuff for a card in this class. In the end, a pretty sweet showing for the EVGA GTX 750 Ti FTW!

Power Consumption and Temperatures

With the aid of a Kill-A-Watt meter, I used HWBot Heaven and the Combined Test within 3DMark 11 to capture the maximum power draw of the video card. I've said it before, and I'll say it again: I'm amazed when I see how little power these modern PCs use. Even with the EVGA GTX 750 Ti FTW overclocked and under load, total system draw remained under 200 watts. When idle, the entire system used no more wattage than a standard 100 watt light bulb. Yeah, amazing!

Power Consumption

Equally impressive are the temperatures this card runs at. With readings normalized to 25 °C ambient, the card never even sniffed its 95 °C thermal threshold. While it's true we didn't have much voltage manipulation at our disposal, it's easy to see that the thermal improvements Maxwell brings do indeed work quite well. With temperature readings like these, there is no reason to take the fan off auto control, even when overclocked. The ACX cooler performs admirably, to say the least.

Temperature Readings

Pushing the Limits

I wasn’t able to get a whole lot more out of the card before stability issues arose, but I did manage a 3DMark Fire Strike run with an additional 15 MHz added to the GPU and 25 MHz added to the memory speed. This gave us just short of another 100 points added to our previous overclocked score.

evga_gtx750ti (55)

Conclusion

I must admit, I was thoroughly impressed with the EVGA GTX 750 Ti FTW from the moment I started working with it. From impressive overclocking to the great-performing ACX cooler, everything worked terrifically right out of the box. The improvements the Maxwell GPU brings are impressive, especially on the performance-per-watt front. EVGA has done a great job with its first implementation of a Maxwell-based graphics card. The card is aesthetically pleasing and will fit into just about any PC application I can think of, bringing a good gaming experience to systems that couldn't handle the power demands before.

Pricing on Maxwell-based GPUs will run anywhere from $119 up to $169, depending on how much of a premium is put on partner cards' factory overclocks and proprietary coolers. Here is the breakdown of suggested reference design pricing NVIDIA passed along to us.

  • GTX 750 Reference – $119
  • GTX 750 Ti 1 GB Reference – $139
  • GTX 750 Ti 2 GB Reference – $149

As for the EVGA GTX 750 Ti FTW, it will be available at a $169 MSRP. That means a $20 premium for the factory overclock and the ACX cooler, which is a bargain in my opinion. Maxwell-based GPUs should be available at e-tailers on launch day. While the price is slightly higher than the competing R7 260X, the performance gains easily make up the difference. To that end, I think the pricing lands right where it should.

The highest sales volume for video cards is in the sub-$200 market, and EVGA certainly has a very attractive offering in that category with the release of the GTX 750 Ti FTW. Price, performance, overclocking potential, and the latest Maxwell technologies... it's all there! It's a no-brainer this time around - Overclockers Approved.

Click the stamp for an explanation of what this means.

- Dino DeCesari (Lvcoyote)


75 Comments:

ivanlabrie's Avatar
Do you know if there will be any 4gb versions of the card?
Can you do sli with these? (I want 4, or a multi gpu card equipped with them)
Performance per watt is incredible!

I'd love one/four

Also, can you try some yacoin mining using cudaminer?
cullam3n's Avatar
Thanks for the review, looks like an awesome budget card

How do I know you didn't read?

Lvcoyote's Avatar
No 4 GB versions and no SLI support.
ivanlabrie's Avatar
I did read, but I'm at work, skipped to the overclocking part...(all that matters!)
GTXJackBauer's Avatar
Thanks for the review! I might grab one or two of these.
ATMINSIDE's Avatar
This makes me happy for the 800 series.

Apparently another review site broke into the ~1420MHz range with their 750Ti.
Massive OC potential on these.
ivanlabrie's Avatar
And that whilst still keeping the gpu at 60-70w give or take!

Heads up: Kepler Bios Tweaking tool works with these...

https://www.dropbox.com/s/3sgcundg3j...048.140113.rom

That's a tweaked bios for the Asus card with increased TDP limits and no throttling.
TeknoBug's Avatar
The 750Ti has me interested for my HTPC FM2+ system for the low power consumption and still being able to play games casually on the TV.

And no there won't be a 4GB version, unless Nvidia wants to sell them to dumb consumers (128bit memory doesn't like anything over 2GB AFAIK).
EarthDog's Avatar
If by "like" you mean it's too slow of a bus, sure. It can use it, but it's pointless, really, due to the bus.
ivanlabrie's Avatar
It would boost my yacoin hashrate twofold lol...gimme 4gb Nvidia dammit
EarthDog's Avatar
More available ram on the same card increases hash rates? Interesting...
ssjwizard's Avatar
The lack of SLI I highly doubt has any effect on Ivan's intended purpose for the cards. The 2Gb limit is fine for GPU purposes, but some of these new exotic cryptos really like a lot of vRAM even if it's slow. Ya never know, some manufacturer might put together a 4Gb version. Heck, they made a 4Gb GTX450, for what purpose who knows.
ivanlabrie's Avatar
Yacoin / high N factor uses more memory per thread...each shader adds more memory requirements, ergo, low shader count (but faster per shader) cards will outperform big cores with small amounts of ram at yacoin mining.

R7 240 4GB = 2.8kh/s
7850 1GB = 1.5kh/s

(I can game on one card whilst mining on the other 3 too... )
GTXJackBauer's Avatar
WAIT A MIN!!!!!!!!!!! Doesn't "Unified Virtual Memory" mean that if you SLI now it will add up the ram now? I've been told this before by a few folks that seemed they knew what they were talking about unless they were wrong. This is the solution. Imagine 2 of these @ 4 GBs and 2 GPUs.



SOB!!! It can't SLI! I just wasted a lot of energy. Maybe it doesn't need a SLI adapter like AMD?

This happens when I read the most recent comments even though I read the whole thread earlier and forgot about a few key things. lol
ivanlabrie's Avatar
Sli doesn't add up ram, it combines it and mirrors what one card runs in the next.
ssjwizard's Avatar
Not sure about SLI and ram it may have changed, though I dont think so. However, even if it were additive you still would not be changing the shaders / ram ratio so it would have no performance improvement.
EarthDog's Avatar
In SLI it is still mirror images on the vram, so 2GB each on two cards is still 2GB.

Thanks for the info guys.
GTXJackBauer's Avatar
Sorry I should have been more accurate in my wording. From my understanding, from now on (Maxwell) when you add more than one GPU to SLI, it will combine the Vram now as opposed to the way it has been. That's what I am told by some when you see Nvidia talk about "Unified Virtual Memory" in its roadmap. I didn't believe it at first too but was convinced thats what it meant.
EarthDog's Avatar
http://www.anandtech.com/show/7515/n...emory-for-cuda

I dont see where it talks about that...but it may have been a bit over my head on the first read, LOL!
GTXJackBauer's Avatar
Yeah, see now you got me thinking again. Not sure if that paragraph was taken out of context, plus I am not sure how all that stuff works exactly so you're probably right.
MattNo5ss's Avatar
To me, it looks like that article is about a unified System RAM and GPU RAM pool, and copying from system memory to the GPU memory pool. I don't think that has anything to do with SLI, just a new "quality of life" change for CUDA devs.
EarthDog's Avatar
It was taken out of context, sure, but read the whole article to see the surrounding context. I wouldn't cherry pick to prove a point, just to get to the point... but as I said, not sure what it all really means without reading more about it.

EDIT: I dont see much about SLI except when it says performance and the memory?

EDIT2: My bad, doesn't even say that... not sure then at all to be honest... But this is an overview of the UVM that was mentioned and it doesn't say anything about SLI as I can more clearly see now (thanks MattNo!).
ivanlabrie's Avatar
I don't think that applies to mining...not that I know of, the cudaminer developer didn't bring that up yet.
ssjwizard's Avatar
Well memory copies are happening, and the older version of CM probably does this manually. What it sounds like from the descriptions and the bits of the article that I read through is that they have added a new layer of abstraction to the CUDA memory stack. One that can automate memory management like so many other high level programming languages carry as standard in the tool box anymore, and probably should have been implemented a long time ago to CUDA.
bob4933's Avatar
I think I found my next card!!!

Was looking around, some of these non reference cards are going 30+% overclocks on them. Thats incredible!
Angrychair's Avatar
The power to performance ratio on the maxwell platform is incredible, it's exactly what I've been hoping for. The roadmaps from late last year state that the 800 series cards should be rolling out in February (now) any word on if that's still true?

The GTX 860 if it follows the 760 (sub $300, 256bit memory, 150 watt thermal) looks to be quite the monster. Hopefully the cryto currency miners don't make the nvidia market crazy like they have the radeon.
EarthDog's Avatar
Last I heard was VERY late 1H 14 or early 2H 14...

Nvidia isn't nearly as profitable at the moment, so until that changes, the market should remain the same for them.
ivanlabrie's Avatar
Oh, it sure is now! But people don't know how to tweak the cards nor which coins to mine. I won't tell the masses xD
EarthDog's Avatar
It is profitable, but not close to AMD, even with those tweaks... right? At least that is what I took away from IMOG's thread on it...
Silver_Pharaoh's Avatar
Cudaminer has come a long way.........
GTXJackBauer's Avatar
SSHHHHHhhhh lol You're going to startle the market!
ivanlabrie's Avatar
If anything gpu prices will normalize if miners get enough cards from both camps.
Angrychair's Avatar
Haven't they already normalized? At significantly above MSRP...
Silver_Pharaoh's Avatar
Not a chance. Prices for Radeons are still high
ATMINSIDE's Avatar
Just because they're high doesn't mean that it hasn't normalized.
Angrychair's Avatar
Yes, but that's where they're staying for now.

You can thank the criminals who need untraceable money and crypto-currency speculators for that.
Silver_Pharaoh's Avatar
I mine cryptos, and I'm no criminal.
J/k all the miners like me want these good cards
EarthDog's Avatar
True. If the cards remain with little stock and their value remains high due to mining, things won't change a bit and this is the new 'normal'.

It will take gobs of stock, mining to come back down to earth, or NVIDIA cards to be a lot more profitable then they are now to bring prices down a lot closer to MSRP.

Silly principle of supply and demand!
Angrychair's Avatar
I didn't mean to imply that people mining cryptocurrency are criminals. That's not what I was trying to say.

I meant that criminals wanting a way to launder money and make untraceable transactions are serving to run up the price of the currency along with speculators. In turn the increased value of the currency makes it worthwhile for jackwagons to run up the price of radeon video cards purely for the purpose of hashing instead of for those of us that actually want to play some PC games.
TeknoBug's Avatar
Drug and illegal weapon rings have been using cryptocoin for transactions (since they can't be traced and can be transferred to real cash later).

Besides that, no, the AMD cards have not yet normalized and probably won't for a while; the 280X, 290, and 290X are waaaaaaaay above MSRP while the GTX 770, 780, and 780Ti are still around the same price they were listed at 5 months ago.

A 280X is a refreshed 7970GE and they sold for $270-290, with the R9 280X release they were listed $280-300 (bought mine for $299) today the 280X is listed for $470, that's R9 290 MSRP price range and for that much I probably just might as well buy a 780 which is a big step up from a 280X in comparison.
ivanlabrie's Avatar
Yeah, I mine and I would never pay current radeon pricing...hence I go to greener pastures.
ATMINSIDE's Avatar
I see what you did there.
EarthDog's Avatar
Looks at thread title... looks at content...
ATMINSIDE's Avatar
When are you getting a GTX 880 for review? Hmm?

You know, getting back on architecture.
EarthDog's Avatar
I would gather in a few months, when they release...(late 1H, early 2H).
ATMINSIDE's Avatar
So, mid year. That gives me time to save for GPU/block.
GTXJackBauer's Avatar
Rumors have it its possible they won't come out any time soon. Apparently they are still working on the 20nm fabrication. Probably the same time frame as the 780+ is my guess. UNLESS ED has some news he can't talk about just yet.
KonaKona's Avatar
Yo it was mentioned in the very first post, but is it possible for Lvcoyote to do some testing on various scrypt algs with this card? I've seen reviews with SHA 256 and scrypt results posted but that stuff is old and busted. Scrypt jane and it's variants are the new hotness and where all the money is to be made.

I'm sure the crypto subforum would love to help with getting all the settings perfect as well, as I believe we all want to know how well one of these would mine UTC and PTS and YAC and so on.
GTXJackBauer's Avatar
I want to know first before everyone else so they don't sellout or ridiculous price increases! lol
ivanlabrie's Avatar
I'll have to wait ten days for mine to get here...the investor I work with is sending me a few to test them.
GTXJackBauer's Avatar
Let us know how it goes. Keep the intel under the rug if you will.
ivanlabrie's Avatar
They will probably suck heh
RoXQi3x's Avatar
My first 24hrs with the 750Ti: Got the MSI Gaming one, replaced a 760 next to my 780 (weird, no power cables), newest drivers, manually got my gpus.txt, and...lots of crashing! Or failing, rather.

ERROR:exception: Force RMSE error of 406.817 with threshold of 5
Folding@home Core Shutdown: BAD_WORK_UNIT

Finally running - Core 15 :\ And after that WU, need to pause/reset to try again.

I have no separate available rig to test atm, but I'd love to hear what others experience!
TeknoBug's Avatar
Did you reinstall the drivers, always best to uninstall and fresh install when swapping video cards.
EarthDog's Avatar
Im wondering if Maxwell is even supported yet...
ATMINSIDE's Avatar
Probably not since its a new architecture.
ivanlabrie's Avatar
Hmmm, but maybe it'll work with kepler stuff?
ATMINSIDE's Avatar
When I still folded daily it used my 770 without a hitch.
ivanlabrie's Avatar
I mean for the 750ti. Maxwell card.
EarthDog's Avatar
Just for clarity...

750Ti = maxwell
770 = Kepler

Is Core15 for Maxwell? If not, then that is why I said what I posted. The 770 should mine fine being Kepler.
TeknoBug's Avatar
Someone posted a chart for mining, claims that the 750Ti is only a few Kh/s behind the 770. This kind of worries me considering the market seeing how retailers price gouge AMD cards.

Happy now Earthdog?
ATMINSIDE's Avatar
And with it taking 1/3 the power of a 770? That's impressive.
Angrychair's Avatar
Our greatest hope is that the price of crypto currency crashes soon to make this insanity stop.

All it would take is governments cranking down the regulations on exchanges to make it less appealing for criminals and therefore less appealing to miners and speculators.
Silver_Pharaoh's Avatar
No.

None of the miners want this, and as a miner I am banking on BTC to help me out with fuel costs
Angrychair's Avatar
I'm not talking about what miners want, I'm talking about what everyone else who isn't a miner and wants to play some PC games wants.
Silver_Pharaoh's Avatar
Yeah, I know your pain. Even though I am a miner, buying a card to mine on was ridiculous. These prices are outrageous.

On the other hand, if mining flops tomorrow I lose out big, and other miner lose huge. At the same time, the prices will come down so everyone else can actually buy a card....

It's like a win-lose-win situation...
GTXJackBauer's Avatar
Eventually the markets will correct themselves due to supply and demand the next time around. At least that's what I believe will take place. There was a surprise in demand when supply wasn't there because of mining. I just hope these companies utilize the extra features that make "hashes" go up. So a company can advertise, "Buy my card! Because it mines better than this guy's," or maybe make specific GPUs just for mining and some for gaming. Kind of like the Titan. (Quadro/GeForce hybrid)
magellan's Avatar
Is it normal for a GPU manufacturer to release its flagship tech in a mid-range tier? Has this ever been done before? Isn't the release of new GPU tech usually released as a top tier product? I had thought the GTX 770 was considered mid-range? Doesn't the GTX 770 beat out the 750 TI?
EarthDog's Avatar
Yes.
Yes.
It varies.
It's a rebrand of a 680 essentially.
Easily, yes.
TeknoBug's Avatar
I gave my Zotac 750Ti a whirl, it plays every game I have no problems with a little tweaking of settings (about the same as settings I use with the 560Ti), and Shadowplay is quite impressive with it (Bandicam killed the framerates while recording).
Dave Long's Avatar
I have the card specified in the review. I left a rather lengthy review over at Newegg.com in the comments so you can get more specifics there, but suffice to say that as an upgrade from a 5770 on a box with a 350W power supply, it's been stellar.

It's a little noisier than the 5770 was, but with two fans that's expected. I haven't even touched the overclocking potential yet, but even with the stock clock this thing screams. Skyrim and Battlefield 3 were the main tests so far. Very happy.
Leave a Comment