
VIRTUAL MACHINES

Explore our extensive range of GPU and CPU-only virtual servers

GPU SERVICE LIST

NVIDIA A100 SXM4
  Single-Precision (FP32): up to 19.5 TFLOPS
  Mixed-Precision (FP16): up to 156 TFLOPS

NVIDIA A40
  Single-Precision (FP32): up to 27.8 TFLOPS
  Mixed-Precision (FP16): up to 55.6 TFLOPS

NVIDIA V100
  Single-Precision (FP32): up to 15.7 TFLOPS
  Mixed-Precision (FP16): up to 125 TFLOPS

H100 SXM5
  Single-Precision (FP32): up to 60 TFLOPS
  Mixed-Precision (FP16): up to 120 TFLOPS

H200 SXM5
  Single-Precision (FP32): not yet officially listed; expected to exceed the H100 given its newer architecture
  Mixed-Precision (FP16): expected to exceed the H100, with significant gains for AI and HPC workloads

T4
  Single-Precision (FP32): up to 8.1 TFLOPS
  Mixed-Precision (FP16): up to 65 TFLOPS

A30
  Single-Precision (FP32): up to 10.3 TFLOPS
  Mixed-Precision (FP16): up to 20.6 TFLOPS

A10
  Single-Precision (FP32): up to 24.6 TFLOPS
  Mixed-Precision (FP16): up to 49.2 TFLOPS

A16
  Single-Precision (FP32): up to 19.2 TFLOPS
  Mixed-Precision (FP16): up to 38.4 TFLOPS
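The peak figures above follow the usual vendor formula: CUDA cores multiplied by boost clock, times 2 FLOPs per cycle (one fused multiply-add). A minimal sketch, assuming that formula; `peak_fp32_tflops` is a hypothetical helper, and the T4 core count and clock used below are NVIDIA's published figures for that card:

```python
# Peak FP32 throughput = CUDA cores x boost clock x 2 FLOPs per cycle (one FMA).
# Hypothetical helper for illustration; not part of any vendor API.
def peak_fp32_tflops(cuda_cores: int, boost_clock_ghz: float) -> float:
    return cuda_cores * boost_clock_ghz * 2 / 1000  # GFLOPS -> TFLOPS

# NVIDIA T4: 2560 CUDA cores at roughly 1.59 GHz boost
print(round(peak_fp32_tflops(2560, 1.59), 1))  # -> 8.1, matching the listing
```

The same arithmetic reproduces most of the FP32 entries in the list, which is why the figures are labeled "up to": they assume every core sustains the boost clock.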

GPU COMPUTING POWER RANKING

SINGLE-PRECISION (FP32) COMPUTING POWER RANKING

NO.1  NVIDIA RTX 4090 / 24GB: 82.58 TFLOPS (100%)
NO.2  NVIDIA RTX 4090D / 24GB: 73.54 TFLOPS (89%)
NO.3  NVIDIA L20 / 48GB: 59.35 TFLOPS (72%)
NO.4  NVIDIA RTX 3090Ti / 24GB: 40 TFLOPS (48%)
NO.5  NVIDIA A40 / 48GB: 37.42 TFLOPS (45%)
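The "COMPUTING POWER" percentage in each ranking is simply a card's peak throughput as a fraction of the top-ranked card's. A short sketch reproducing the FP32 ranking above (the card list is copied from the table; nothing else is assumed):

```python
# Cards and peak FP32 TFLOPS, taken from the ranking above.
cards = [
    ("NVIDIA RTX 4090 / 24GB", 82.58),
    ("NVIDIA RTX 4090D / 24GB", 73.54),
    ("NVIDIA L20 / 48GB", 59.35),
    ("NVIDIA RTX 3090Ti / 24GB", 40.0),
    ("NVIDIA A40 / 48GB", 37.42),
]
top = max(t for _, t in cards)  # 82.58, the NO.1 entry
for rank, (name, tflops) in enumerate(sorted(cards, key=lambda c: -c[1]), start=1):
    # Percentage column = throughput relative to the top card, rounded.
    print(f"NO.{rank}  {name}: {tflops} TFLOPS ({tflops / top:.0%})")
```

Running this prints the same 100% / 89% / 72% / 48% / 45% column shown in the table.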

SINGLE-PRECISION (FP32) RANKING

NO.1  H100 SXM5: 60 TFLOPS (100%)
NO.2  NVIDIA A40: 27.8 TFLOPS (46%)
NO.3  A10: 24.6 TFLOPS (41%)
NO.4  NVIDIA A100 SXM4: 19.5 TFLOPS (33%)
NO.5  A16: 19.2 TFLOPS (32%)

MIXED-PRECISION (FP16) RANKING

NO.1  NVIDIA A100 SXM4: 156 TFLOPS (100%)
NO.2  NVIDIA V100: 125 TFLOPS (80%)
NO.3  H100 SXM5: 120 TFLOPS (77%)
NO.4  T4: 65 TFLOPS (42%)
NO.5  NVIDIA A40: 55.6 TFLOPS (36%)


GPU SPECIFICATION DETAILS

NVIDIA A100 SXM4 / 80GB
  Single-Precision (FP32): 19.5 TFLOPS
  Half-Precision (FP16): 312 Tensor TFLOPS

NVIDIA A40 / 48GB
  Single-Precision (FP32): 37.42 TFLOPS
  Half-Precision (FP16): 149.7 Tensor TFLOPS

NVIDIA V100 / 32GB
  Single-Precision (FP32): 15.7 TFLOPS
  Half-Precision (FP16): 125 Tensor TFLOPS

NVIDIA RTX 4090 / 24GB
  Single-Precision (FP32): 82.58 TFLOPS
  Half-Precision (FP16): 165.2 Tensor TFLOPS

NVIDIA RTX 4090D / 24GB
  Single-Precision (FP32): 73.54 TFLOPS
  Half-Precision (FP16): 147 Tensor TFLOPS

NVIDIA RTX 3090 / 24GB
  Single-Precision (FP32): 35.58 TFLOPS
  Half-Precision (FP16): 71 Tensor TFLOPS

NVIDIA RTX A5000 / 24GB
  Single-Precision (FP32): 27.77 TFLOPS
  Half-Precision (FP16): 117 Tensor TFLOPS

NVIDIA RTX 3080Ti / 12GB
  Single-Precision (FP32): 34.10 TFLOPS
  Half-Precision (FP16): 70 Tensor TFLOPS

NVIDIA RTX A4000 / 16GB
  Single-Precision (FP32): 19.17 TFLOPS
  Half-Precision (FP16): 76.7 Tensor TFLOPS

NVIDIA RTX 3080 / 10GB
  Single-Precision (FP32): 29.77 TFLOPS
  Half-Precision (FP16): 59.5 Tensor TFLOPS

NVIDIA RTX 2080Ti / 11GB
  Single-Precision (FP32): 13.45 TFLOPS
  Half-Precision (FP16): 53.8 Tensor TFLOPS

NVIDIA TITAN Xp / 12GB
  Single-Precision (FP32): 12.15 TFLOPS
  Half-Precision (FP16): 12.15 TFLOPS
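One practical way to use these figures is to shortlist cards that meet a half-precision throughput budget. A minimal sketch; `meets_fp16_budget` is a hypothetical helper, and the spec tuples are a subset of the (GPU, FP32 TFLOPS, FP16 Tensor TFLOPS) rows above:

```python
# (name, FP32 TFLOPS, FP16 Tensor TFLOPS), copied from the specification list.
specs = [
    ("NVIDIA A100 SXM4 / 80GB", 19.5, 312.0),
    ("NVIDIA RTX 4090 / 24GB", 82.58, 165.2),
    ("NVIDIA V100 / 32GB", 15.7, 125.0),
    ("NVIDIA RTX A4000 / 16GB", 19.17, 76.7),
]

# Hypothetical helper: keep only cards whose FP16 Tensor throughput meets the budget.
def meets_fp16_budget(specs, min_tensor_tflops):
    return [name for name, _, fp16 in specs if fp16 >= min_tensor_tflops]

# The A100, RTX 4090, and V100 clear a 120 TFLOPS half-precision budget.
print(meets_fp16_budget(specs, 120.0))
```

Note that the shortlist would differ for FP32-heavy workloads, where the RTX 4090 leads by a wide margin.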
