GeForce 500 series

The GeForce 500 series is a series of graphics processing units developed by Nvidia as a refresh of the Fermi-based GeForce 400 series. It was first released on November 9, 2010, with the GeForce GTX 580.

GeForce 500 Series
Release date: November 9, 2010
Codename: GF11x
Architecture: Fermi
Models: GeForce series
  • GeForce GT series
  • GeForce GTX series
Transistors:
  • 292M, 40 nm (GF119)
  • 585M, 40 nm (GF108)
  • 1,170M, 40 nm (GF116)
  • 1,950M, 40 nm (GF114)
  • 3,000M, 40 nm (GF110)
Cards:
  • Entry-level: GeForce 510, GT 520, GT 530
  • Mid-range: GT 545, GTX 550 Ti, GTX 560, GTX 560 Ti, GTX 560 Ti 448 Cores
  • High-end: GTX 570, GTX 580
  • Enthusiast: GTX 590
API support:
  • Direct3D: Direct3D 12.0 (feature level 11_0)[1]
  • OpenCL: OpenCL 1.1
  • OpenGL: OpenGL 4.6
History:
  • Predecessor: GeForce 400 series
  • Successor: GeForce 600 series

Overview

The Nvidia GeForce 500 series graphics cards are significantly modified versions of the GeForce 400 series cards with respect to performance and power management. Like the GeForce 400 series, the GeForce 500 series supports DirectX 11.0, OpenGL 4.6, and OpenCL 1.1.

The refreshed Fermi chip includes 512 stream processors, grouped into 16 streaming multiprocessors (SMs) of 32 CUDA cores each, and is manufactured by TSMC on a 40 nm process.

The Nvidia GeForce GTX 580 was the first card in the GeForce 500 series to use a fully enabled chip based on the refreshed Fermi architecture, with all 16 streaming multiprocessors and all six 64-bit memory controllers active. The new GF110 GPU was enhanced with full-speed FP16 filtering (the previous-generation GF100 could only do half-speed FP16 filtering) and improved z-culling units.
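As a small illustration of how the fully enabled GF110 configuration adds up, a minimal sketch using only the per-SM and per-controller figures given above:

```python
# Minimal sketch: deriving the headline GTX 580 (full GF110) figures from the
# per-unit counts described above. Purely illustrative arithmetic.

SM_COUNT = 16               # streaming multiprocessors on a full GF110 die
CUDA_CORES_PER_SM = 32      # CUDA cores (stream processors) per SM
MEMORY_CONTROLLERS = 6      # memory controllers on the die
CONTROLLER_WIDTH_BITS = 64  # width of each memory controller

cuda_cores = SM_COUNT * CUDA_CORES_PER_SM                      # 16 * 32 = 512
memory_bus_width = MEMORY_CONTROLLERS * CONTROLLER_WIDTH_BITS  # 6 * 64 = 384

print(f"CUDA cores: {cuda_cores}")                   # 512
print(f"Memory bus width: {memory_bus_width}-bit")   # 384-bit
```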

On January 25, 2011, Nvidia launched the GeForce GTX 560 Ti to target the "sweet spot" segment, where the price/performance ratio is considered important. With an improvement of more than 30% over the GTX 460 and performance between the Radeon HD 6870 and the Radeon HD 6950 1 GB, the GTX 560 Ti directly replaced the GeForce GTX 470.

On February 17, 2011, it was reported that the GeForce GTX 550 Ti would launch on March 15, 2011. Although the card is based on the mainstream GF116 chip, Nvidia chose to name it the GTX 550 Ti rather than the GTS 550. Performance was shown to be at least comparable to, and up to 12% faster than, the Radeon HD 5770. Price-wise, the new card moved into the range occupied by the GeForce GTX 460 (768 MB) and the Radeon HD 6790.[2]

On March 24, 2011, the GTX 590 was launched as the flagship graphics card for Nvidia. The GTX 590 is a dual-GPU card, similar to past releases such as the GTX 295, and boasted the potential to handle Nvidia's 3D Vision technology by itself.[3]

On April 13, 2011, the GT 520 was launched as the bottom-end card in the range, with lower performance than the equivalently numbered cards of the two previous generations, the GT 220 and the GT 420. However, it supported DirectX 11 and was more powerful than the GeForce 210, the GeForce 310, and the integrated graphics options on Intel CPUs.

On May 17, 2011, Nvidia launched a less expensive (non-Ti) version of the GeForce GTX 560 to strengthen its price/performance position in the $200 range. Like the faster GTX 560 Ti that came before it, this card was also faster than the GeForce GTX 460. Standard versions performed comparably to the AMD Radeon HD 6870 and eventually replaced the GeForce GTX 460. Premium, factory-overclocked versions ran at higher speeds and were slightly faster than the Radeon HD 6870, approaching the performance of basic versions of the Radeon HD 6950 and the GeForce GTX 560 Ti.

On November 28, 2011, Nvidia launched the "GTX 560 Ti With 448 Cores".[4] It does not, however, use the silicon of the GTX 560 series: it is a GF110 chip with two shader blocks disabled. The most powerful version of the 560 line, it was widely known to be a limited-production card and served as a marketing tool that traded on the popularity of the GTX 560 brand for the 2011 holiday season. Its performance sits between the regular GTX 560 Ti and the GTX 570.

Counterfeit Usage

Cards of this generation, particularly the shorter GTX 550 Ti, are common choices for counterfeit resellers, who modify the cards' BIOS so that they report as more modern models such as the GTX 1060 and "1060 Ti". These cards are then sold through eBay, Taobao, AliExpress and Wish.com. They may retain a minimum of functionality so that they appear legitimate at first glance, but defects caused by the fake BIOS, manufacturing and software issues almost always cause crashes in modern games and applications; even when they do not, performance remains extremely poor.[5]

Products

An EVGA GTX 590 Classified

GeForce 500 (5xx) series

  • 1 Unified shaders : Texture mapping units : Render output units
  • 2 Each streaming multiprocessor (SM) in a GF110-architecture GPU contains 32 SPs and 4 SFUs; each SM in a GF114/116/118-architecture GPU contains 48 SPs and 8 SFUs. Each SP can perform up to two single-precision FMA operations per clock, and each SFU up to four special-function (SF) operations per clock; the approximate ratio of FMA to SF operations is 4:1. The theoretical single-precision shader performance FLOPS_sp (in GFLOPS) of a card with shader count n and shader frequency f (in GHz) is estimated as FLOPS_sp ≈ f × n × 2, or equivalently FLOPS_sp ≈ f × m × (32 SPs × 2 FMA), where m is the SM count. Including the SFUs, the total processing power is FLOPS_sp ≈ f × m × (32 SPs × 2 FMA + 4 × 4 SFUs), i.e. FLOPS_sp ≈ f × n × 2.5 (see the worked sketch after this list).
  • 3 Each SM in the GF110 contains 4 texture filtering units for every texture address unit; the complete GF110 die contains 64 texture address units and 256 texture filtering units.[6] Each SM in the GF114/116/118 architecture contains 8 texture filtering units for every texture address unit, but doubles both the addressing and filtering unit counts.
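As a worked example of the FMA estimate in note 2, a minimal sketch (the clock and shader figures are taken from the specification table below):

```python
# Minimal sketch of the FMA throughput estimate from note 2:
#   FLOPS_sp ≈ f (GHz) × n (shader count) × 2
# Figures taken from the specification table below.

def gflops_fma(shader_clock_ghz: float, shader_count: int) -> float:
    """Theoretical single-precision FMA throughput in GFLOPS."""
    return shader_clock_ghz * shader_count * 2

print(gflops_fma(1.645, 384))  # GeForce GTX 560 Ti -> ~1263.4 GFLOPS
print(gflops_fma(1.544, 512))  # GeForce GTX 580    -> ~1581.1 GFLOPS
```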

All products are produced using a 40 nm fabrication process. All products support Direct3D 12.0 (feature level 11_0), OpenGL 4.6, and OpenCL 1.1.

| Model | Launch | Code name | Transistors (million) | Die size (mm²) | Bus interface | SM count | Core config1,3 | Core clock (MHz) | Shader clock (MHz) | Memory clock (MHz) | Pixel fillrate (GP/s) | Texture fillrate (GT/s) | Memory size (MB) | Memory bandwidth (GB/s) | DRAM type | Bus width (bit) | Compute capability | GFLOPS (FMA)2 | TDP (W) | Launch price (USD) | Launch price (GBP) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| GeForce 510 | ? | GF119 | ? | 79 | PCIe 2.0 x16 | 1 | 48:8:4 | 523 | 1046 | 1796 | 2.09 | 4.18 | 1024 / 2048 | 14.0 | DDR3 | 64 | 2.1 | 100.4 | 25 | OEM | OEM |
| GeForce GT 520 | April 12, 2011 | GF119 | ? | 79 | PCIe 2.0 x16 | 1 | 48:8:4 | 810 | 1620 | 1800 | 3.24 | 6.5 | 1024 / 2048 | 14.1 | DDR3 | 64 | 2.1 | 155.5 | 29 | $59 | £45 |
| GeForce GT 530 | May 14, 2011 | GF108 | 585 | 116 | PCIe 2.0 x16 | 2 | 96:16:8 | 700 | 1400 | 1800 | 5.6 | 11.2 | 1024 / 2048 | 28.8 | DDR3 | 128 | 2.1 | 268.8 | 50 | OEM | OEM |
| GeForce GT 545 DDR3 | May 14, 2011 | GF116 | 1170 | 238 | PCIe 2.0 x16 | 3 | 144:24:24 | 720 | 1440 | 1800 | 17.28 | 17.28 | 1536 / 3072 | 43 | DDR3 | 192 | 2.1 | 417.7 | 70 | $109 | ? |
| GeForce GT 545 GDDR5 | May 14, 2011 | GF116 | 1170 | 238 | PCIe 2.0 x16 | 3 | 144:24:16 | 870 | 1740 | 3996 | 13.92 | 20.88 | 1024 | 62.4 | GDDR5 | 128 | 2.1 | 501.1 | 105 | OEM | OEM |
| GeForce GTX 550 Ti | March 15, 2011 | GF116 | 1170 | 238 | PCIe 2.0 x16 | 4 | 192:32:24 | 900 | 1800 | 4104 | 21.6 | 28.8 | 1024 / 2048 | 98.5 | GDDR5 | 192 | 2.1 | 691.2 | 116 | $149 | ? |
| GeForce GTX 555 | May 14, 2011 | GF114 | 1950 | 332 | PCIe 2.0 x16 | 6 | 288:48:24 | 736 | 1472 | 3828 | 8.8 | 35.3 | 1024 | 91.9 | GDDR5 | 192 | 2.1 | 847.9 | 150 | OEM | OEM |
| GeForce GTX 560 SE | March 2012 | GF114 | 1950 | 360 | PCIe 2.0 x16 | 6 | 288:48:24 | 736 | 1472 | 3828 | 17.7 | 35.3 | 1024 | 92 | GDDR5 | 192 | 2.1 | 847.9 | 150 | ? | £100 |
| GeForce GTX 560 | May 17, 2011 | GF114 | 1950 | 360 | PCIe 2.0 x16 | 7 | 336:56:32 | 810-950 | 1620-1900 | 4004-4488 | 25.9 | 45.4-49.8 | 1024 / 2048[7] | 128 | GDDR5 | 256 | 2.1 | 1088.6-1276.8 (1075[8]) | 150 | $199 | ? |
| GeForce GTX 560 Ti | January 25, 2011 | GF114 | 1950 | 360 | PCIe 2.0 x16 | 8 | 384:64:32 | 822 | 1645 | 4008 | 26.3 | 52.61 | 1024 / 2048 | 128.27 | GDDR5 | 256 | 2.1 | 1263.4 | 170 | $249 | ? |
| GeForce GTX 560 Ti 448 Core | November 28, 2011 | GF110 | 3000 | 520 | PCIe 2.0 x16 | 14 | 448:56:40 | 732 | 1464 | 3800 | 29.28 | 41 | 1280 | 152 | GDDR5 | 320 | 2.0 | 1311.7 | 210 | $289 | £239.99 |
| GeForce GTX 570 | December 7, 2010 | GF110 | 3000 | 520 | PCIe 2.0 x16 | 15 | 480:60:40 | 732 | 1464 | 3800 | 29.28 | 43.92 | 1280 / 2560 | 152 | GDDR5 | 320 | 2.0 | 1405.4 | 219 | $349 | ? |
| GeForce GTX 580 | November 9, 2010 | GF110 | 3000[9] | 520[9] | PCIe 2.0 x16 | 16 | 512:64:48 | 772 | 1544 | 4008 | 37.06 | 49.41 | 1536 / 3072 | 192.4 | GDDR5 | 384 | 2.0 | 1581.1 | 244[10] | $499 | £399 |
| GeForce GTX 590 | March 24, 2011 | 2× GF110 | 2× 3000 | 2× 520 | PCIe 2.0 x16 | 32[11] | 1024:128:96[12] | 607 | 1215 | 3414[12] | 58.75[12] | 77.7 | 2× 1536[11] | 327.7 | GDDR5 | 2× 384 | 2.0 | 2488.3 | 365 | $699 | £570[11] |
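The pixel fillrate, texture fillrate and memory bandwidth columns follow directly from the core configuration, clock rates and memory interface (the GFLOPS column follows the formula sketched earlier). A minimal sketch, checked here against the GeForce GTX 580 row:

```python
# Minimal sketch: deriving the GTX 580 row's fillrate and bandwidth figures
# from its core config (512:64:48), core clock and memory interface.

tmus, rops = 64, 48                       # texture mapping units, render output units
core_mhz = 772                            # core clock
mem_effective_mhz, bus_bits = 4008, 384   # effective memory clock, bus width

pixel_fillrate = rops * core_mhz / 1000               # GP/s -> ~37.06
texture_fillrate = tmus * core_mhz / 1000             # GT/s -> ~49.41
bandwidth = mem_effective_mhz * bus_bits / 8 / 1000   # GB/s -> ~192.4

print(pixel_fillrate, texture_fillrate, bandwidth)
```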

GeForce 500M (5xxM) series

The GeForce 500M series comprises GPUs designed for notebooks.

All models support Direct3D 12.0 (feature level 11_0), OpenGL 4.6 and OpenCL 1.1; Vulkan is not supported.

| Model | Launch | Code name | Fab (nm) | Bus interface | Core config1 | Core clock (MHz) | Shader clock (MHz) | Memory clock (MHz) | Pixel fillrate (GP/s) | Texture fillrate (GT/s) | Memory size (MiB) | Memory bandwidth (GB/s) | Bus type | Bus width (bit) | Processing power2 (GFLOPS) | TDP (W) | Notes |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| GeForce GT 520M | January 5, 2011 | GF119 | 40 | PCIe 2.0 x16 | 48:8:4 | 740 | 1480 | 1600 | 2.96 | 5.92 | 1024 | 12.8 | DDR3 | 64 | 142.08 | 12 | |
| GeForce GT 520M | | GF108 | 40 | PCIe 2.0 x16 | 96:16:4 | 515 | 1030 | 1600 | 2.06 | 8.24 | 1024 | 12.8 | DDR3 | 64 | 197.76 | 20 | Noticed in Lenovo laptops |
| GeForce GT 520MX | May 30, 2011 | GF119 | 40 | PCIe 2.0 x16 | 48:8:4 | 900 | 1800 | 1800 | 3.6 | 7.2 | 1024 | 14.4 | DDR3 | 64 | 172.8 | 20 | |
| GeForce GT 525M | January 5, 2011 | GF108 | 40 | PCIe 2.0 x16 | 96:16:4 | 600 | 1200 | 1800 | 2.4 | 9.6 | 1024 | 28.8 | DDR3 | 128 | 230.4 | 20-23 | |
| GeForce GT 540M | January 5, 2011 | GF108 | 40 | PCIe 2.0 x16 | 96:16:4 | 672 | 1344 | 1800 | 2.688 | 10.752 | 1024 | 28.8 | DDR3 | 128 | 258.048 | 32-35 | |
| GeForce GT 550M | January 5, 2011 | GF108 | 40 | PCIe 2.0 x16 | 96:16:4 | 740 | 1480 | 1800 | 2.96 | 11.84 | 1024 | 28.8 | DDR3 | 128 | 284.16 | 32-35 | |
| GeForce GT 555M | January 5, 2011 | GF106 / GF108 | 40 | PCIe 2.0 x16 | 144:24:24 / 144:24:16 / 96:16:4 | 590 / 650 / 753 | 1180 / 1300 / 1506 | 1800 / 1800 / 3138 | 14.6 / 10.4 / 3 | 14.6 / 15.6 / 12 | 1536 / 2048 / 1024 | 43.2 / 28.8 / 50.2 | DDR3 / DDR3 / GDDR5 | 192 / 128 / 128 | 339.84 / 374.4 / 289.15 | 30-35 | |
| GeForce GTX 560M | May 30, 2011 | GF116 | 40 | PCIe 2.0 x16 | 192:32:16 / 192:32:24 | 775 | 1550 | 2500 | 18.6 | 24.8 | 2048 / 1536, 3072 | 40.0 / 60.0 | GDDR5 | 128 / 192 | 595.2 | 75 | |
| GeForce GTX 570M[13] | June 28, 2011 | GF114 | 40 | PCIe 2.0 x16 | 336:56:24 | 575 | 1150 | 3000 | 13.8 | 32.2 | 1536 | 72.0 | GDDR5 | 192 | 772.8 | 75 | |
| GeForce GTX 580M | June 28, 2011 | GF114 | 40 | PCIe 2.0 x16 | 384:64:32 | 620 | 1240 | 3000 | 19.8 | 39.7 | 2048 | 96.0 | GDDR5 | 256 | 952.3 | 100 | |


Discontinued support

Nvidia announced that, after the Release 390 drivers, it would no longer release 32-bit drivers for 32-bit operating systems.[14]

Nvidia announced in April 2018 that Fermi would transition to legacy driver support status and be maintained until January 2019.[15]


Notes

  • David Kanter (September 30, 2009). "Inside Fermi: Nvidia's HPC Push". realworldtech.com. Retrieved December 16, 2010.

References
