GeForce GTX 660M

GeForce GTX 660M

NVIDIA

G3D Mark

1,452

March 22, 2012
50 W
Kepler (2012–2018)


About G3D Mark

G3D Mark is PassMark's standard GPU benchmark, measuring graphics performance across real-world gaming scenarios. It makes comparing cards from different brands straightforward: higher scores generally translate to higher frame rates and smoother gameplay.

Overview

The GeForce GTX 660M is a mobile GPU manufactured by NVIDIA, released on March 22, 2012. Built on the Kepler architecture using a 28 nm process, it has 384 shading units, a core clock ranging from a base of 835 MHz to a boost of 950 MHz, and a thermal design power (TDP) of 50 W. Its G3D Mark benchmark score is 1,452 points.

Value Upgrade Path

This is the official ChipVERSUS Value Rating, which compares raw performance (G3D Mark) per dollar. The GeForce GTX 660M sits at rank #190 in our cost-efficiency ranking, representing lower cost-benefit for your build. Components listed above it deliver better value for money.

MSRP is the manufacturer's suggested retail price.
Avg price is the current average price collected from markets across the web.

Performance Per Dollar GeForce GTX 660M

#180
Radeon RX 550X (Mobile)
MSRP: $35 | Avg: $35
374%
#182
339%
#183
338%
#187
GeForce GTX 1050 (Mobile)
MSRP: N/A | Avg: $50
307%
#188
Radeon RX 6300
MSRP: $60 | Avg: $40
305%
#190
GeForce GTX 660M
MSRP: N/A | Avg: N/A
100%
#195
GeForce GT 625M
MSRP: N/A | Avg: $45
97%
#197
96%
#203
GeForce GT 730M
MSRP: N/A | Avg: $45
93%
Based on actual market prices and performance benchmarks.
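ChipVERSUS does not publish the exact normalization behind the percentages above, but the underlying idea is G3D Mark points per dollar spent. A minimal sketch of such a metric (the $30 price below is a hypothetical used-market figure, since this page lists the GTX 660M's price as N/A):

```python
def perf_per_dollar(g3d_mark: float, price_usd: float) -> float:
    """G3D Mark points obtained per dollar; higher means better value."""
    if price_usd <= 0:
        raise ValueError("price must be positive")
    return g3d_mark / price_usd

# GTX 660M scores 1,452; at a hypothetical $30 used price:
print(round(perf_per_dollar(1452, 30), 1))  # 48.4 points per dollar
```

Ranking cards by this ratio (and normalizing against a baseline card) reproduces the shape of the percentage list above.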

Technical Analysis

Detailed breakdown of GeForce GTX 660M specifications and capabilities.

Graphics Performance

The GeForce GTX 660M scores 1,452 in the G3D Mark benchmark (the standard measure of rasterization GPU performance), placing it in the Entry Level tier as a Legacy-generation graphics card. It is built on the Kepler (2012–2018) architecture (codename: GK107) and manufactured on a 28 nm process; smaller process nodes improve power efficiency and transistor density. It packs 384 shader units (the primary compute units for rendering pixels), 32 TMUs (texture mapping units), and 16 ROPs (render output units that handle final pixel output). Raw FP32 compute power is 0.7296 TFLOPS, the theoretical peak single-precision floating-point throughput. The boost clock reaches 950 MHz from a base of 835 MHz.
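The compute and fill-rate figures follow directly from the unit counts and boost clock, using the standard formulas (shader count × 2 FLOPs per fused multiply-add × clock for FP32; TMU count × clock for texture fill rate). A quick sketch:

```python
# Theoretical peak figures for the GTX 660M, derived from the
# unit counts and boost clock listed above.
shaders = 384            # shader (CUDA) cores
tmus = 32                # texture mapping units
boost_clock_ghz = 0.950  # boost clock in GHz

# Each shader can execute one fused multiply-add (2 FLOPs) per cycle.
fp32_tflops = shaders * 2 * boost_clock_ghz / 1000
print(f"{fp32_tflops:.4f} TFLOPS")  # 0.7296 TFLOPS

# Each TMU samples one texel per cycle.
fill_rate = tmus * boost_clock_ghz
print(f"{fill_rate:.2f} GTexel/s")  # 30.40 GTexel/s
```

Both results match the spec table, which suggests the listed figures were computed from the boost clock rather than the base clock.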

Specification | GeForce GTX 660M
G3D Mark (PassMark 3D graphics benchmark score) | 1,452
Architecture (GPU microarchitecture generation) | Kepler (2012–2018)
Codename (internal GPU die codename) | GK107
Manufacturing Process (fabrication node; smaller = more efficient) | 28 nm
Shader Units (core compute units for rendering/compute) | 384
TMUs (Texture Mapping Units; handle texture sampling) | 32
ROPs (Render Output Units; final pixel/color blending) | 16
Boost Clock (maximum GPU frequency under load) | 950 MHz
Base Clock (guaranteed minimum GPU frequency) | 835 MHz
Compute FP32 (peak single-precision floating-point performance) | 0.7296 TFLOPS
Texture Fill Rate (textures processed per second) | 30.40 GTexel/s
PCIe Interface (CPU↔GPU data link generation) | PCIe 3.0 x16
Transistor Count | 1,270 million
Performance Rank (position in global GPU performance ranking) | #774

Video Memory (VRAM)

The GeForce GTX 660M is equipped with 2 GB of GDDR5 video memory connected via a 128-bit memory bus, with memory operating at an effective 8 Gbps. No bandwidth figure is listed for this card.
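Although the bandwidth is listed as unknown, peak memory bandwidth follows from the bus width and effective data rate (bus width in bytes × transfer rate). A sketch using the 128-bit / 8 Gbps figures quoted above; the result is only as accurate as those listed values:

```python
bus_width_bits = 128  # memory bus width, as listed above
effective_gbps = 8    # effective data rate per pin, as listed above

# Peak bandwidth = (bus width in bytes) x (effective transfer rate).
bandwidth_gb_s = bus_width_bits / 8 * effective_gbps
print(f"{bandwidth_gb_s:.0f} GB/s")  # 128 GB/s
```

This is a theoretical peak; sustained throughput in practice is lower, and a different effective memory speed would change the result proportionally.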

Specification | GeForce GTX 660M
VRAM Size (video memory capacity; more helps with high-res textures) | 2 GB
VRAM Type (memory technology generation) | GDDR5
Memory Bus (width of data path; wider = more bandwidth) | 128-bit
Memory Speed (effective memory data rate) | 8 Gbps
Bandwidth (maximum data throughput between GPU and VRAM) | Unknown
L1 Cache | 32 kB
L2 Cache | 256 kB

Advanced Features & APIs

Graphics API support: DirectX 12 (feature level 11_0), Vulkan 1.2, OpenGL 4.5. Newer API versions enable more advanced rendering features; at feature level 11_0, however, this card does not support features such as ray tracing or mesh shaders.

Specification | GeForce GTX 660M
NVIDIA Reflex | ❌ Not Supported
DLSS Frame Generation | ❌ Not Supported
DLSS Super Resolution | ❌ Not Supported
DirectX (Microsoft graphics API version) | 12 (11_0)
Vulkan (cross-platform graphics API) | 1.2
OpenGL (legacy graphics API) | 4.5
Compute API (GPGPU framework: CUDA/ROCm/OpenCL) | CUDA

Display & Media

Display outputs: 1x Laptop Dependent (the available ports depend on the laptop design); supports up to 4 simultaneous displays. Hardware encoder: NVENC 1st Gen; decoder: PureVideo HD VP5. This dedicated hardware for video encoding/decoding accelerates streaming, recording, and video playback. Supported codecs: H.264, VC-1, MPEG-2, MPEG-4 ASP.

Specification | GeForce GTX 660M
Laptop Dependent Ports (number of laptop-dependent display connectors) | 1x
Max Displays (maximum simultaneously connected monitors) | 4
Hardware Encoder (video encoding engine version) | NVENC 1st Gen
Hardware Decoder (video decoding engine version) | PureVideo HD VP5
Codecs (supported video compression formats) | H.264, VC-1, MPEG-2, MPEG-4 ASP

Power & Dimensions

The GeForce GTX 660M draws 50 W under load (TDP); a 350 W PSU minimum is recommended. Power connector: 1x 6-pin. As a mobile GPU, it is integrated into the laptop's board and occupies no expansion slots, so desktop case clearance is not a concern. Typical temperatures: 35°C at idle, 60°C under load.

Specification | GeForce GTX 660M
TDP (power consumption under load) | 50 W
Recommended PSU (minimum power supply wattage) | 350 W
Power Connector (required PCIe power cable type) | 1x 6-pin
Slot Width (expansion slots occupied) | 0-slot
Idle Temperature (typical temperature at desktop idle) | 35°C
Load Temperature (typical temperature under gaming load) | 60°C

Value Analysis

It holds Rank #190 in the cost-benefit ranking — a comprehensive score that weighs raw performance against current market pricing. Overall performance ranking position: #774 among all indexed GPUs.

Specification | GeForce GTX 660M
Cost-Benefit Rank (overall value ranking position) | #190
Performance Rank (position in global GPU performance ranking) | #774
Release Date | March 22, 2012
Release Year | 2012

Similar Performance

Graphics cards with benchmark scores closest to the current hardware, grouped by manufacturer.

Our Recommendation for GeForce GTX 660M

Suggested pairings based on performance balance

Compare GeForce GTX 660M

See how this GPU stacks up against similar alternatives

Compare with other GPUs