Page 1 of 2 (Results 1 to 20 of 24)
  1. #1
    Registered
    Join Date
    May 2010
    Posts
    28

    Will my Sony Bravia KDL-40EX500 cripple the GTX 580?

    Hey all,

    I currently have a Sony Bravia KDL-40EX500 LCD tv.

    I'm making a build which will include the GTX 580, and I was wondering if my TV, which will be used as a monitor via HDMI, will cripple the card. I'm guessing not, considering the TV boasts 120Hz, 1080p and it's 40 inches diagonal.

    I just don't want to dump the cash into the 580 and then realize that my tv ends up being the bottleneck here.

    I wish it was a 3D tv, but I love this tv, so I'm sticking with it.

  2. #2
    Member
    Join Date
    Jul 2010
    Posts
    411
    A bottleneck is when something limits the performance of something else. Less resolution, better performance... Your TV is very capable from what I can see. Not going to be a problem at all.

  3. #3
    Member kayson's Avatar
    Join Date
    Jan 2005
    Location
    USA
    Posts
    3,061
    A display can't really be a bottleneck... If you mean you're wondering whether the GTX 580 can handle running at that resolution/refresh rate, then you'll still be totally fine. That gpu is a beast.
    Project Silver
    Intel Core i7-4770k (4.3GHz @ 1.265V)
    Asus Maximus VI Hero
    G.Skill Sniper 2x4GB DDR3-1866
    EVGA nVidia GeForce GTX 760
    Corsair HX850
    Samsung 840 Pro 256GB
    Thermalright Venomous X || 2x 120mm Panaflo || Arctic Silver V
    Lian-Li PC-A77FA || Vantec Nexus Fan Controller
    (Build Journal)
    Heatware

  4. #4
    Registered
    Join Date
    May 2010
    Posts
    28
    Thanks for the help guys!

    Sorry if I used the wrong term.

    What I meant was, will I get almost everything out of the GTX 580 with the Bravia, minus the 3D? What is a display called if it isn't considered a bottleneck? I think I see what you mean though.

  5. #5
    Member kayson's Avatar
    Join Date
    Jan 2005
    Location
    USA
    Posts
    3,061
    Quote Originally Posted by qcom View Post
    Thanks for the help guys!

    Sorry if I used the wrong term.

    What I meant was, will I get almost everything out of the GTX 580 with the Bravia, minus the 3D? What is a display called if it isn't considered a bottleneck? I think I see what you mean though.
    I see what you're saying. Technically, I guess you're using the right term, but I've never heard of anyone thinking about performance in that direction. I suppose yes, if you don't have a big enough display, your GTX580 will run at some ungodly high fps, wasting its potential. Keep in mind, though, that this depends a lot on what kind of video/games you're running on it and what the settings are.

    Suppose you run your game at max settings: it's certainly possible that you will get an fps much higher than you need. 1080p = 1920x1080, and you can find 24" desktop monitors with higher resolutions which would tax the gpu far more than a 1080p TV over hdmi. Honestly, though, at such high performance levels, any differences would not be noticeable.
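    To put rough numbers on the resolution point (a sketch only; the baseline fps is an illustrative assumption, not a benchmark):

    ```python
    # Back-of-the-envelope: for fill-rate-bound rendering, fps scales roughly
    # with the inverse of pixel count. Real games are not purely fill-rate
    # limited, so treat this as a rough guide only.

    def pixels(width, height):
        """Total pixels per frame at a given resolution."""
        return width * height

    res_1080p = pixels(1920, 1080)   # the Bravia's native resolution
    res_1600p = pixels(2560, 1600)   # a high-end 30" desktop monitor

    # If the card manages some baseline fps at 1080p, estimate fps at the
    # larger resolution by scaling with the pixel ratio.
    baseline_fps = 100.0             # hypothetical 1080p figure
    est_fps_1600p = baseline_fps * res_1080p / res_1600p

    print(f"1080p:     {res_1080p:,} px")
    print(f"2560x1600: {res_1600p:,} px ({res_1600p / res_1080p:.2f}x the pixels)")
    print(f"Estimated fps at 2560x1600: {est_fps_1600p:.1f}")
    ```

    2560x1600 pushes almost twice the pixels of 1080p, which is why those big desktop panels tax a gpu so much harder than a 1080p TV.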
    Project Silver
    Intel Core i7-4770k (4.3GHz @ 1.265V)
    Asus Maximus VI Hero
    G.Skill Sniper 2x4GB DDR3-1866
    EVGA nVidia GeForce GTX 760
    Corsair HX850
    Samsung 840 Pro 256GB
    Thermalright Venomous X || 2x 120mm Panaflo || Arctic Silver V
    Lian-Li PC-A77FA || Vantec Nexus Fan Controller
    (Build Journal)
    Heatware

  6. #6
    mxthunder's Avatar
    Join Date
    Dec 2004
    Location
    Northeast Ohio
    Posts
    1,756
    Yeah, hooking that card to anything LESS than that TV would be putting the card to waste.
    i7 3770K @ 4.8GHz
    Gigabyte GA-Z77X-UP7
    2x4GB Ripjaws @ 1600 7-8-7-24-24 - 1T
    XSPC res + D5, RS360 + EK 120mm rad
    GeForce GTX780Ti w/custom bios @1300mhz
    Corsair TX950
    WD Black 750GB/Hitachi 7k1000/Intel 330 SSD cache drive
    LG 27" 1080p
    Corsair Obsidian 700D

    My GPU collection

  7. #7
    Registered
    Join Date
    May 2010
    Posts
    28
    Quote Originally Posted by kayson View Post
    I see what you're saying. Technically, I guess you're using the right term, but I've never heard of anyone thinking about performance in that direction. I suppose yes, if you don't have a big enough display, your GTX580 will run at some ungodly high fps, wasting its potential. Keep in mind, though, that this depends a lot on what kind of video/games you're running on it and what the settings are.

    Suppose you run your game at max settings: it's certainly possible that you will get an fps much higher than you need. 1080p = 1920x1080, and you can find 24" desktop monitors with higher resolutions which would tax the gpu far more than a 1080p TV over hdmi. Honestly, though, at such high performance levels, any differences would not be noticeable.
    OK, but yes, I would be using the latest games with my 1920x1080 resolution, at the complete MAX settings.

    Quote Originally Posted by mxthunder View Post
    Yeah, hooking that card to anything LESS than that TV would be putting the card to waste.
    OK, so are you saying that the Bravia will be able to push the GTX 580 relatively well?

    So, in other words, should I get a less expensive card because my TV won't be able to push it well enough? Or do you think the TV will be ok?

  8. #8
    Member Tokae's Avatar
    Join Date
    Jun 2010
    Location
    London Canada
    Posts
    2,050
    1920 x 1080 is the same whether it's on a 20" monitor or on a 40" tv. Your display, regardless of it being a tv or monitor, has no bearing on how 'hard' the video card works. If anything it will look better on the tv because they usually have a much better contrast ratio than a typical lcd monitor. You will see better colors, in other words. I had my 5870 hooked up to my bravia and I'm now trying to find a way to use it 'permanently' as my main monitor. The missus, though, doesn't seem to think that we should be looking at another tv. Guess who wins!
    Gaming / Streaming Rig:
    CPU: i7 980x @ 4.2
    Cooler: NH-12U
    RAM: GSkill 1600Mhz 12GB
    Mobo: Asus R3E
    GPU: MSI R9 290x
    Storage: OS: Adata 128GB SSD Data: 4 x 1TB in RAID5
    Case:800D
    PSU: Revolution 85+ 1020w

  9. #9
    Member

    Dooms101's Avatar
    Join Date
    Dec 2009
    Location
    under a heatsink
    Posts
    1,667
    I think the question you're asking is "would a GTX 580 be overkill for a 1080p display?" The answer is probably yes. Much cheaper cards would be able to run at that relatively low resolution (1920x1080) with a good framerate and maxed settings. The GPU's performance depends on your CPU, RAM, and motherboard to maintain good bandwidth, not the display. The resolution can't "cripple" a card in terms of hindering its actual performance; a card will perform just the same on any 1080p screen with the same refresh rate. However, the resolution does affect the framerate you get. Btw, a card's performance is usually best measured in fps, so the higher the resolution, the lower the fps. The GTX 580 would eat almost anything at 1080p for breakfast without even a wince. Therefore a GTX 580 is pretty much overkill, but you wouldn't have to upgrade for a while. Also, you would be able to run a high-res monitor (say... 2560x1600) in the future at max settings and good framerates.
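    For what it's worth, 1080p over hdmi is nowhere near the link's limit either. A quick sketch (assuming 24-bit color, ignoring blanking intervals, and taking HDMI 1.3's effective video bandwidth as roughly 8.16 Gbit/s):

    ```python
    # Uncompressed video payload for a given mode, in Gbit/s. Blanking
    # intervals are ignored, so the real rate on the wire is somewhat higher.

    def video_bitrate_gbps(width, height, refresh_hz, bits_per_pixel=24):
        return width * height * refresh_hz * bits_per_pixel / 1e9

    HDMI13_VIDEO_GBPS = 8.16  # approximate effective limit for HDMI 1.3

    rate = video_bitrate_gbps(1920, 1080, 60)
    print(f"1080p60 payload: ~{rate:.2f} Gbit/s "
          f"({rate / HDMI13_VIDEO_GBPS:.0%} of the HDMI 1.3 video budget)")
    ```

    So the cable isn't the bottleneck either; 1080p60 uses well under half of what HDMI 1.3 can carry.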
    [Dell Latitude E6530]............................[i7-3540M 3Ghz, 16GB, 750GB, 1080p LED]
    [Hackintosh].......................[i3-3225 3.3Ghz, 8GB DDR3, 250GB SSD, GTX650 1GB]
    Area51 Case Mod- - - - - - - - ODD Window Mod - - - - - - - -NZXT Phantom Triple Rad

  10. #10
    Registered
    Join Date
    May 2010
    Posts
    28
    Sounds good.

    I think what I'll do is wait until the AMD HD 6900 series GPUs come out, and check out their specs and benchmarks in comparison to the 580.

    Not only may it surpass the 580, it may be cheaper as well.

  11. #11
    Member Tokae's Avatar
    Join Date
    Jun 2010
    Location
    London Canada
    Posts
    2,050
    Quote Originally Posted by qcom View Post
    Sounds good.

    I think what I'll do is wait until the AMD HD 6900 series GPUs come out, and check out their specs and benchmarks in comparison to the 580.

    Not only may it surpass the 580, it may be cheaper as well.
    You never know with AMD!

    I might sell my 5870 and grab the new card.. I'm hoping they are beasts..
    Gaming / Streaming Rig:
    CPU: i7 980x @ 4.2
    Cooler: NH-12U
    RAM: GSkill 1600Mhz 12GB
    Mobo: Asus R3E
    GPU: MSI R9 290x
    Storage: OS: Adata 128GB SSD Data: 4 x 1TB in RAID5
    Case:800D
    PSU: Revolution 85+ 1020w

  12. #12
    Member kayson's Avatar
    Join Date
    Jan 2005
    Location
    USA
    Posts
    3,061
    Ok nowwww it all makes sense. It's not that the tv is a bottleneck but the 580 is way freaking overkill. At 1080p you're probably fine with a 470 or 480 or one of the amd 6900's when they come out
    Project Silver
    Intel Core i7-4770k (4.3GHz @ 1.265V)
    Asus Maximus VI Hero
    G.Skill Sniper 2x4GB DDR3-1866
    EVGA nVidia GeForce GTX 760
    Corsair HX850
    Samsung 840 Pro 256GB
    Thermalright Venomous X || 2x 120mm Panaflo || Arctic Silver V
    Lian-Li PC-A77FA || Vantec Nexus Fan Controller
    (Build Journal)
    Heatware

  13. #13
    Registered
    Join Date
    May 2010
    Posts
    28
    Quote Originally Posted by kayson View Post
    Ok nowwww it all makes sense. It's not that the tv is a bottleneck but the 580 is way freaking overkill. At 1080p you're probably fine with a 470 or 480 or one of the amd 6900's when they come out
    Yeah, sorry about my OP phrasing before.

    Yeah, that's what I meant.

    I think I still might get it for coolness factor, and not only that, but I might update displays somewhat soon.

  14. #14
    Member kayson's Avatar
    Join Date
    Jan 2005
    Location
    USA
    Posts
    3,061
    Quote Originally Posted by qcom View Post
    Yeah, sorry about my OP phrasing before.

    Yeah, that's what I meant.

    I think I still might get it for coolness factor, and not only that, but I might update displays somewhat soon.
    Not gonna lie. Sony bravia tvs are amazing!
    Project Silver
    Intel Core i7-4770k (4.3GHz @ 1.265V)
    Asus Maximus VI Hero
    G.Skill Sniper 2x4GB DDR3-1866
    EVGA nVidia GeForce GTX 760
    Corsair HX850
    Samsung 840 Pro 256GB
    Thermalright Venomous X || 2x 120mm Panaflo || Arctic Silver V
    Lian-Li PC-A77FA || Vantec Nexus Fan Controller
    (Build Journal)
    Heatware

  15. #15
    Member JonSimonzi's Avatar
    Join Date
    Jan 2008
    Location
    Providence, RI
    Posts
    2,207
    Quote Originally Posted by kayson View Post
    Not gonna lie. Sony bravia tvs are amazing!
    I just bought the exact same model the OP is talking about; it's rather nice.
    Gigabyte Sniper M3 - i7 2700K - Corsair H80
    Asus GTX780 DCII - Dell U2712HM & Dell U2212HM
    16GB G.Skill Sniper - 256GB Crucial M4 - 3TB Hitachi 7K3000
    Silverstone ST65F-G - Silverstone SG10
    Heatware

  16. #16
    Registered
    Join Date
    May 2010
    Posts
    28
    Quote Originally Posted by kayson View Post
    Not gonna lie. Sony bravia tvs are amazing!
    I know, right! They're quite b0$$!

    Quote Originally Posted by JonSimonzi View Post
    I just bought the exact same model the OP is talking about, it's rather nice.
    I'm assuming you mean the exact model TV, right?

    If so, what do you think of it?

    I have mine set up in retail mode, with some settings configured the way I like.

  17. #17
    Member

    Dooms101's Avatar
    Join Date
    Dec 2009
    Location
    under a heatsink
    Posts
    1,667
    1080p looks fine to me lol... I use 1440x900 =D
    I would wait for the 6970, it'll probably stomp on a 580 and cost less (that's what ATi loves doing)
    [Dell Latitude E6530]............................[i7-3540M 3Ghz, 16GB, 750GB, 1080p LED]
    [Hackintosh].......................[i3-3225 3.3Ghz, 8GB DDR3, 250GB SSD, GTX650 1GB]
    Area51 Case Mod- - - - - - - - ODD Window Mod - - - - - - - -NZXT Phantom Triple Rad

  18. #18
    Badbonji's Avatar
    Join Date
    Apr 2008
    Location
    Birmingham, UK
    Posts
    3,520
    ATi's single gpu cards generally sit behind Nvidia's top end card, but the HD6990 should be able to beat the GTX580.
    i7 3770K 4.5GHz 1.2V | Phanteks PH-TC14PE | 16GB TeamGroup Elite 1600MHz | GIGABYTE G1.Sniper M3 | EVGA GTX680 SC 1254/1500MHz | M4 256GB + 150GB Raptor | Corsair HX+ 850W | Antec 1200
    i7 965 | 8GB OCZ Gold 1600MHz | GIGABYTE Extreme X58 | HD5450 | M4 64GB + 7TB | OCZ 500W
    MacBook Pro 15" | i7 2.3GHz | 8GB 1600MHz

  19. #19
    Member

    Dooms101's Avatar
    Join Date
    Dec 2009
    Location
    under a heatsink
    Posts
    1,667
    I agree with that, but if the rumored specs of the 6970 (although they seem ridiculous) are true, then it would certainly perform better than the 580.
    [Dell Latitude E6530]............................[i7-3540M 3Ghz, 16GB, 750GB, 1080p LED]
    [Hackintosh].......................[i3-3225 3.3Ghz, 8GB DDR3, 250GB SSD, GTX650 1GB]
    Area51 Case Mod- - - - - - - - ODD Window Mod - - - - - - - -NZXT Phantom Triple Rad

  20. #20
    Member JonSimonzi's Avatar
    Join Date
    Jan 2008
    Location
    Providence, RI
    Posts
    2,207
    Quote Originally Posted by qcom View Post
    I'm assuming you mean the exact model TV, right?

    If so, what do you think of it?

    I have mine set up in retail mode, with some settings configured the way I like.
    Yep, same TV. $650 from Ultimate Electronics, worth every penny. I was using a crappy 720p 32" Magnavox TV before that. I love the TV; now I just need to get a PS3 to play Blu-rays.
    Gigabyte Sniper M3 - i7 2700K - Corsair H80
    Asus GTX780 DCII - Dell U2712HM & Dell U2212HM
    16GB G.Skill Sniper - 256GB Crucial M4 - 3TB Hitachi 7K3000
    Silverstone ST65F-G - Silverstone SG10
    Heatware

