GDDR4 SDRAM
GDDR4 SDRAM, an abbreviation for Graphics Double Data Rate 4 Synchronous Dynamic Random-Access Memory, is a type of graphics card memory (SGRAM) specified by the JEDEC semiconductor memory standards.[1][2] It competes with Rambus's XDR DRAM. GDDR4 is based on DDR3 SDRAM technology and was intended to replace the DDR2-based GDDR3, but it ended up being superseded by GDDR5 within a year.
History
- On October 26, 2005, Samsung announced that it had developed the first GDDR4 memory, a 256-Mbit chip running at 2.5 Gbit/s. Samsung also revealed plans to sample and mass-produce GDDR4 SDRAM rated at 2.8 Gbit/s per pin.[3]
- In 2005, Hynix developed the first 512-Mbit GDDR4 memory chip.[4]
- On February 14, 2006, Samsung announced the development of 32-bit 512-Mbit GDDR4 SDRAM capable of transferring 3.2 Gbit/s per pin, or 12.8 GB/s for the module.[5]
- On July 5, 2006, Samsung announced the mass-production of 32-bit 512-Mbit GDDR4 SDRAM rated at 2.4 Gbit/s per pin, or 9.6 GB/s for the module. Although designed to match the performance of XDR DRAM on high-pin-count memory, it would not be able to match XDR performance on low-pin-count designs.[6]
- On February 9, 2007, Samsung announced mass-production of 32-bit 512-Mbit GDDR4 SDRAM, rated at 2.8 Gbit/s per pin, or 11.2 GB/s per module. This module was used for some AMD cards.[7]
- On February 23, 2007, Samsung announced 32-bit 512-Mbit GDDR4 SDRAM rated at 4.0 Gbit/s per pin, or 16 GB/s for the module, and said it expected the memory to appear on commercially available graphics cards by the end of 2007 (the per-pin-to-module arithmetic behind these figures is sketched after this list).[8]
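The module bandwidth figures quoted in these announcements follow directly from the per-pin data rate and the 32-bit interface of each device. A minimal sketch of that arithmetic (the function name is mine; the rates are the ones quoted above):

```python
def module_bandwidth_gb_s(per_pin_gbit_s: float, bus_width_bits: int = 32) -> float:
    """Peak bandwidth of a 32-bit GDDR4 device: per-pin rate times pin count, bits to bytes."""
    return per_pin_gbit_s * bus_width_bits / 8

# Per-pin rates from the announcements above -> the quoted module bandwidths
for rate in (2.4, 2.8, 3.2, 4.0):
    print(f"{rate} Gbit/s per pin -> {module_bandwidth_gb_s(rate):.1f} GB/s")
# 9.6, 11.2, 12.8 and 16.0 GB/s respectively
```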
Technologies
GDDR4 SDRAM introduced DBI (Data Bus Inversion) and Multi-Preamble to reduce data transmission delay. The prefetch was increased from 4 to 8 bits, and the maximum number of memory banks was increased to 8. Because the prefetch was doubled, a GDDR4 core needs to run at only half the clock speed of a GDDR3 core to deliver the same raw bandwidth. Core voltage was decreased to 1.5 V.
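As a rough illustration of why the doubled prefetch halves the required core clock, here is a minimal sketch (the 2000 Mbit/s per-pin rate is an illustrative figure, not taken from the standard):

```python
def core_clock_mhz(per_pin_mbit_s: float, prefetch_bits: int) -> float:
    """Approximate DRAM core (array) clock needed to sustain a given per-pin data rate.

    Each core cycle fetches `prefetch_bits` bits per data pin, so the core only
    has to run at the data rate divided by the prefetch depth.
    """
    return per_pin_mbit_s / prefetch_bits

rate = 2000  # Mbit/s per pin, same raw bandwidth in both cases
print(core_clock_mhz(rate, prefetch_bits=4))  # 4n prefetch (GDDR3-style): 500 MHz core
print(core_clock_mhz(rate, prefetch_bits=8))  # 8n prefetch (GDDR4): 250 MHz core
```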
Data Bus Inversion adds an additional active-low DBI# pin to the address/command bus and to each byte of data. If more than four bits of a data byte are 0, the byte is inverted and the DBI# signal is driven low. In this way, the number of 0 bits across all nine pins is limited to four.[9]:9 This reduces power consumption and ground bounce.
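A minimal sketch of the per-byte DBI decision described above (the function name and the boolean-for-active-low convention are illustrative, not from the JEDEC specification):

```python
def dbi_encode(byte: int) -> tuple[int, bool]:
    """Apply Data Bus Inversion to one data byte.

    Returns (byte_to_transmit, dbi_low). If the raw byte carries more than four
    0 bits, it is inverted and DBI# is driven low, so the data pins plus DBI#
    never carry more than four 0 bits in total.
    """
    zero_bits = 8 - bin(byte & 0xFF).count("1")
    if zero_bits > 4:
        return (~byte) & 0xFF, True   # invert the data, assert DBI# (low)
    return byte & 0xFF, False         # transmit unchanged, DBI# stays high

print(dbi_encode(0b00000001))  # 7 zero bits -> (254, True): inverted to 0b11111110, DBI# low
print(dbi_encode(0b11110000))  # 4 zero bits -> (240, False): sent unchanged, DBI# high
```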
On the signaling front, GDDR4 widens the chip's I/O buffer to 8 bits per two clock cycles, allowing greater sustained bandwidth during burst transfers, but at the expense of a significantly higher CAS latency (CL) than GDDR3, caused mainly by the halved number of address/command pins and the half-clocked DRAM core. The address pins removed relative to GDDR3 were reassigned to power and ground, which also contributes to the higher latency. Another advantage of GDDR4 is power efficiency: running at 2.4 Gbit/s, it uses 45% less power than GDDR3 chips running at 2.0 Gbit/s.
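To make the efficiency figure concrete, a back-of-the-envelope comparison of energy per transferred bit (only the 45% figure and the two data rates come from the text; the normalised power numbers are assumptions):

```python
# Normalise GDDR3 power draw to 1.0 at 2.0 Gbit/s per pin (arbitrary unit)
gddr3_power, gddr3_rate = 1.0, 2.0
# GDDR4 at 2.4 Gbit/s is stated to draw 45% less power
gddr4_power, gddr4_rate = gddr3_power * (1 - 0.45), 2.4

ratio = (gddr4_power / gddr4_rate) / (gddr3_power / gddr3_rate)
print(f"GDDR4 energy per bit is roughly {ratio:.0%} of GDDR3's")  # ~46%
```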
In Samsung's GDDR4 SDRAM datasheet, the part is referred to as 'GDDR4 SGRAM', or 'Graphics Double Data Rate version 4 Synchronous Graphics RAM'. However, the block write feature essential to SGRAM is not available, so it is not classified as SGRAM.
Adoption
The memory manufacturer Qimonda (formerly Infineon's Memory Products division) stated that it would "skip" the development of GDDR4 and move directly to GDDR5.[10]
References
- "Standards & Documents Search: sgram". www.jedec.org. Retrieved 9 September 2013.
- "Standards & Documents Search: gddr4". www.jedec.org. Retrieved 9 September 2013.
- "Samsung Electronics Develops Industry's First Ultra-Fast GDDR4 Graphics DRAM". Samsung Semiconductor. Samsung. October 26, 2005. Retrieved 8 July 2019.
- "History: 2000s". SK Hynix. Retrieved 8 July 2019.
- "Samsung Develops Ultra-fast Graphics Memory: A More Advanced GDDR4 at Higher Density".
- "Samsung sends GDDR4 graphics memory into mass production".
- "Samsung releases fastest GDDR-4 SGRAM". Archived 2007-02-12 at the Wayback Machine.
- "Samsung accelerates graphics memory to 2000 MHz".
- Choi, J.S. (2011). DDR4 Mini Workshop (PDF). Server Memory Forum 2011. This presentation is about DDR4 rather than GDDR4, but both use data bus inversion.
- Softpedia report