NVIDIA GPU Compute Performance From The GeForce GTX 680 To The TITAN RTX


A few days ago we published early Linux benchmarks of the NVIDIA TITAN RTX, NVIDIA's newest flagship graphics card, released just days earlier. That initial performance review covered TensorFlow and other compute tests along with some Vulkan Linux gaming benchmarks. In this article is a more diverse range of GPU compute benchmarks while testing thirteen NVIDIA graphics cards going back to the Kepler days of the GTX 680.

In addition to looking at the raw Linux GPU compute performance across this diverse range of NVIDIA cards, complementing this performance data are AC system power consumption and performance-per-Watt metrics as well as thermal data. All of this data was generated in a fully automated and reproducible manner using the open-source Phoronix Test Suite benchmarking software. The AC system power data was polled by PTS using a WattsUp Pro power meter.
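For those wanting to carry out similar automated runs themselves, a minimal sketch of driving the Phoronix Test Suite from the command line is below. The specific test-profile names used here are assumptions for illustration; check `phoronix-test-suite list-available-tests` for the exact identifiers on your system.

```shell
#!/bin/sh
# Sketch: automated GPU compute benchmarking with the Phoronix Test Suite.
# PTS can poll supported sensors (such as an attached WattsUp Pro meter)
# during a run when the MONITOR environment variable is set.
export MONITOR=sys.power

# Only attempt the run when PTS is actually installed; the profile names
# below (octanebench, luxmark) are assumed examples, not verified IDs.
if command -v phoronix-test-suite >/dev/null 2>&1; then
    phoronix-test-suite benchmark octanebench luxmark
else
    echo "phoronix-test-suite not installed; skipping run"
fi
```

Setting `MONITOR` is what lets PTS fold the power readings into the result files alongside the raw performance numbers, which is how performance-per-Watt figures can be derived from a single automated run.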

All tests were carried out on an Intel Core i9 9900K system running Ubuntu 18.04.1 LTS with the Linux 4.19 kernel, the NVIDIA 415.23 driver, and CUDA 10.0.

Today's tests ranged from OpenCL desktop workloads like Darktable to OctaneBench 4.0 to various CUDA / OpenCL scientific programs such as FAHBench and LuxMark. Again, if you're interested in TensorFlow performance with different models and precisions, check out last week's article for all of those numbers. The cards tested in this comparison included:

– GTX 680
– GTX 780 Ti
– GTX 970
– GTX 980
– GTX 980 Ti
– GTX 1060
– GTX 1070
– GTX 1080
– GTX 1080 Ti
– RTX 2080
– RTX 2080 Ti

The NVIDIA compute tests were done with the cards I had available that weren't busy on other test systems, sans the RTX 2070, which is currently having issues. I'm also still in the process of testing the Radeon ROCm 2.0 release and will have some comparison benchmarks in the days ahead. Without further ado, let's check out this green GPU compute performance this Christmas.

