Xbox Series X 6nm GPU vs L40S


Primary details

GPU architecture, market segment, value for money and other general parameters compared. Throughout this comparison, values are listed as L40S vs Xbox Series X 6nm GPU.

Place in the ranking: 65 vs not rated
Place by popularity: not in top-100 vs not in top-100
Power efficiency: 11.19 vs no data
Architecture: Ada Lovelace (2022-2024) vs RDNA 2.0 (2020-2024)
GPU code name: AD102 vs Scarlett 6nm
Market segment: Workstation vs Desktop
Release date: 13 October 2022 (2 years ago) vs 15 October 2024 (less than a year ago)
Launch price (MSRP): no data vs $599

Detailed specifications

General parameters such as the number of shaders, GPU base and boost clock speeds, manufacturing process, and texturing and compute throughput; a short worked calculation of the throughput figures follows the table. Note that the power consumption of some graphics cards can well exceed their nominal TDP, especially when overclocked.

Pipelines / CUDA cores: 18176 vs 3328
Core clock speed: 1110 MHz vs 1825 MHz
Boost clock speed: 2520 MHz vs no data
Number of transistors: 76,300 million vs 15,300 million
Manufacturing process technology: 5 nm vs 6 nm
Power consumption (TDP): 300 Watt vs 200 Watt
Texture fill rate: 1,431 GTexel/s vs 379.6 GTexel/s
Floating-point processing power: 91.61 TFLOPS vs 12.15 TFLOPS
ROPs: 192 vs 64
TMUs: 568 vs 208
Tensor Cores: 568 vs no data
Ray Tracing Cores: 142 vs no data
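
The texture fill rate and floating-point figures above follow directly from the unit counts and clock speeds in the table. A minimal sketch in Python, assuming the usual peak-throughput formulas (2 FP32 operations per shader per clock, 1 texel per TMU per clock); the Xbox value uses its core clock since no boost clock is listed:

# Peak-throughput sketch using the values from the table above.
# Real-world performance also depends on architecture and memory bandwidth.

def fp32_tflops(shaders, clock_mhz):
    # 2 FP32 operations (one fused multiply-add) per shader per clock
    return shaders * 2 * clock_mhz / 1e6

def texture_fill_gtexel_s(tmus, clock_mhz):
    # 1 texel filtered per TMU per clock
    return tmus * clock_mhz / 1e3

# L40S: 18176 shaders, 568 TMUs, 2520 MHz boost clock
print(fp32_tflops(18176, 2520))          # ~91.6 TFLOPS
print(texture_fill_gtexel_s(568, 2520))  # ~1431 GTexel/s

# Xbox Series X 6nm GPU: 3328 shaders, 208 TMUs, 1825 MHz core clock
print(fp32_tflops(3328, 1825))           # ~12.15 TFLOPS
print(texture_fill_gtexel_s(208, 1825))  # ~379.6 GTexel/s

Both results match the spec-sheet values above within rounding.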

Form factor & compatibility

Information on compatibility with other computer components, useful when choosing a future computer configuration or upgrading an existing one. For desktop graphics cards this means the interface and bus (motherboard compatibility) and any additional power connectors (power supply compatibility).

Interface: PCIe 4.0 x16 vs no data
Length: 267 mm vs 301 mm
Width: 2-slot vs no data
Supplementary power connectors: 1x 16-pin vs no data

VRAM capacity and type

Parameters of the VRAM installed: its type, size, bus width, clock and resulting bandwidth; a worked bandwidth calculation follows the table. Integrated GPUs have no dedicated video RAM and use a shared part of system RAM.

Memory type: GDDR6 vs GDDR6
Maximum RAM amount: 48 GB vs 10 GB
Memory bus width: 384 Bit vs 320 Bit
Memory clock speed: 2250 MHz vs 1750 MHz
Memory bandwidth: 864.0 GB/s vs 560.0 GB/s
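
The listed bandwidth follows from the bus width and memory clock. A minimal sketch, assuming GDDR6's usual 8 transfers per pin per listed memory clock (so 2250 MHz corresponds to an 18 Gbps effective data rate):

def gddr6_bandwidth_gb_s(bus_width_bits, memory_clock_mhz):
    # effective data rate per pin in MT/s, then bytes moved across the whole bus
    effective_mt_s = memory_clock_mhz * 8
    return bus_width_bits / 8 * effective_mt_s / 1000

print(gddr6_bandwidth_gb_s(384, 2250))  # 864.0 GB/s (L40S)
print(gddr6_bandwidth_gb_s(320, 1750))  # 560.0 GB/s (Xbox Series X 6nm GPU)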

Connectivity and outputs

Types and number of video connectors present on the reviewed GPUs. As a rule, this data is precise only for desktop reference cards (so-called Founders Edition for NVIDIA chips). OEM manufacturers may change the number and type of output ports, while for notebook cards the availability of certain video outputs depends on the laptop model rather than on the card itself.

Display Connectors: 1x HDMI 2.1, 3x DisplayPort 1.4a vs 1x HDMI 2.1
HDMI: yes vs yes

API compatibility

List of supported 3D and general-purpose computing APIs, including their specific versions.

DirectX: 12 Ultimate (12_2) vs 12 Ultimate (12_2)
Shader Model: 6.7 vs 6.8
OpenGL: 4.6 vs 4.6
OpenCL: 3.0 vs 1.2
Vulkan: 1.3 vs 1.2
CUDA: 8.9 vs not supported

Pros & cons summary


Recency: 13 October 2022 vs 15 October 2024
Maximum RAM amount: 48 GB vs 10 GB
Chip lithography: 5 nm vs 6 nm
Power consumption (TDP): 300 Watt vs 200 Watt

L40S has a 380% higher maximum VRAM amount, and a 20% more advanced lithography process.

Xbox Series X 6nm GPU, on the other hand, is 2 years newer and draws about 33% less power (200 Watt vs 300 Watt).
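
The percentage figures in this summary are plain ratios of the spec-sheet values; a minimal sketch (the helper function is illustrative, not part of the site):

def percent_higher(a, b):
    # how much larger a is than b, in percent
    return (a - b) / b * 100

print(percent_higher(48, 10))    # 380.0 -> "380% higher maximum VRAM amount"
print(percent_higher(6, 5))      # 20.0  -> "20% more advanced lithography process"
print(percent_higher(300, 200))  # 50.0  -> the L40S draws 50% more power,
                                 #          i.e. the Xbox GPU draws about 33% less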

We could not pick a winner between the L40S and the Xbox Series X 6nm GPU, as we have no test results to judge by.

Be aware that L40S is a workstation graphics card while Xbox Series X 6nm GPU is a desktop one.


Should you still have questions about choosing between the reviewed GPUs, ask them in the Comments section and we will answer.



Community ratings

Here you can see the user ratings of the compared graphics cards.


L40S: 4.3 out of 5 (28 votes)

Xbox Series X 6nm GPU: 4.3 out of 5 (9 votes)


Questions & comments

Here you can ask a question about this comparison, agree or disagree with our judgements, or report an error or mismatch.