Hello,
In upgrading a z620 for 3D modeling/rendering and some compute functions using Wolfram Mathematica, MATLAB, and ArcGIS, I'm running my first two-GPU configuration. At the moment, this is a Quadro K2200 + Tesla M2090.
The original $270 system:
HP z620 (Original): Xeon E5-1620 (4-core @ 3.6/3.8GHz) / 8GB (1X 8GB DDR3-1333) / AMD FirePro V5900 (2GB) / Seagate Barracuda 750GB + Samsung 500GB + WD 500GB
[ Passmark System Rating= 2408 / CPU= 8361 / 2D= 846 / 3D = 1613 / Mem =1584 / Disk = 574 ] 7.13.16
And upgraded- about +$1,100:
HP z620 (2012) (Rev 3): 2X Xeon E5-2690 (8-core @ 2.9/3.8GHz) / 64GB (DDR3-1600 ECC reg) / Quadro K2200 (4GB)
+ Tesla M2090 (6GB) / HP Z Turbo Drive (256GB) + Seagate Constellation ES.3 1TB / 800W > Windows 7 Professional 64-bit > HP 2711x (27" 1920 x 1080)
[ Passmark: System Rating= 5675 / CPU= 22625 / 2D= 815 / 3D = 3580 / Mem = 2522 / Disk = 12640 ] 9.25.16
This GPU combination has been quite successful so far, producing an OctaneBench score of 83.74, which is similar to a Quadro M5000 or GTX 970.
For comparison, these are the results for the Quadro K4200 in the HP z420:
The OctaneBench reference GPU is the GTX 980, so the K2200/M2090 is working at about 75-85% of a GTX 980, whereas the K4200 is at about 35-45%.
The problem with using an M2090 in a workstation is that the M2090 uses a large passive heatsink for cooling, depending on the very high airflow of server environments (75-85 CFM fans). Plugging in an M2090 without additional cooling will result in thermal shutdown after a couple of minutes of load. There are two versions sold: one a chassis with the heatsink exposed, and the other with a rear-panel bracket and heatsink enclosure. The enclosure is useful, as airflow through it is focused through the cooling fins.
In my tests, the cooling setup is casual: a Thermaltake A1888 (up to 47 CFM at 3,000 RPM) external fan set in place behind the M2090. This PWM fan is powered by USB and has a speed control:
The finned card above the M2090 is an HP Z Turbo Drive 256GB AHCI, which contains a Samsung M.2 SM951. The OctaneBench scores above and some tests using Passmark and Cinebench were made with the Thermaltake fan set at 42%. The A1888 simply has a small knob to control the fan, but apparently it can also be PWM controlled. Also, Cinebench results:
[Cinebench R15: OpenGL = 119.23 fps / CPU = 2209 cb / Single core = 130 cb / MP Ratio = 16.84x]
10.31.16
I'm surprised the cooling works at all, and keep waiting for it to pass its limit. The M2090 has two temperature sensors that normally display only under server management tools, not in Windows. However, if I'm reading HWMonitor correctly, the M2090 appears as "Remote 1" and "Remote 2". The maximum temperature recorded so far has been 80C, whereas the M2090 will throttle at about 90C. I've tried some renderings using both CPU and RT GPU rendering. With the CPU running on all 32 threads at near 100%, the E5-2690s peaked at 52C and seemed to hover in the 3.3-3.5GHz range, with occasional excursions to the full 3.8GHz when not all cores were loaded. The K2200 peaked at 39C in GPU rendering, and the average of the M2090's remote sensors was 65C. So far, in benchmarking and rendering tests, the thermal limits of the M2090 have not been approached. I noticed that the difference on the M2090 between the fan at 42% and 100% was not dramatic, perhaps 1-2C, so I'm running at 42% and it's hardly audible. That's the main advantage of setting the fan inside.
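For anyone who wants to log the Tesla's temperatures without relying on HWMonitor, nvidia-smi can be polled from a script. This is only a sketch: the query fields below are the standard nvidia-smi ones, and Fermi-era Teslas like the M2090 may report "N/A" for some of them, so I haven't assumed every field is numeric.

```python
import csv
import io
import subprocess

def query_gpu_temps():
    """Poll nvidia-smi for per-GPU temperatures in CSV form.

    Returns a list of (name, temperature_C) tuples. Older cards
    such as the M2090 may report 'N/A' for some fields, so any
    non-numeric value is passed through unchanged.
    """
    out = subprocess.run(
        ["nvidia-smi",
         "--query-gpu=name,temperature.gpu",
         "--format=csv,noheader"],
        capture_output=True, text=True, check=True).stdout
    return parse_gpu_csv(out)

def parse_gpu_csv(text):
    """Parse 'name, temp' CSV lines as emitted by nvidia-smi."""
    rows = []
    for name, temp in csv.reader(io.StringIO(text), skipinitialspace=True):
        temp = temp.strip()
        rows.append((name, int(temp) if temp.isdigit() else temp))
    return rows
```

Run in a loop with a sleep, this would give a simple text log of both cards during a long render.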
My first plan was to eventually build an external enclosure in 1/16" Plexi that would draw air more specifically through the heatsink:
This is based on an idea I saw in a YouTube video, where the poster had added a Tesla K80 to a scientific/research system. See: "How to cool the nVidia Tesla K80 Cheaply" https://www.youtube.com/watch?v=UcyP3ASRgBc . I should like a somewhat more finished cooler than the one shown in the video. However, since an internal solution is preferable if it works, I'm going to try a cardboard internal ducted version as well. If that is effective, it would be quieter and less ungainly than the external design.
As I need maximum GPU power for a large project (visualization, rendering, and Wolfram Mathematica), I'd like to move the Quadro K4200 from the HP z420 to the z620. A poster here, Brian1965, is using a K4200 + M2090 in a z620 and those results are excellent, in the Quadro K6000 category. See his very informative posts and the very nicely worked out (proper) liquid cooling solution he built for the M2090.
Query: The thing is, the 225W M2090 is using both of the z620's 6-pin power connectors, one with a 6-pin to 8-pin adapter. The z620 motherboard diagram shows two internal "USB 2.0" connectors, but in my system these turn out to be SATA power takeoffs. They can be seen in the system view above, just under the square chipset heatsink. My question is what the specification of these connectors might be; one source suggests one is 5V and the other 12V. Can I simply add a SATA to 6-pin adapter and plug it into the Quadro K4200? I'm not seeing any spare Molex; is it hiding somewhere?
First tests with this configuration, in 3D navigation and rendering, are very encouraging. If this configuration works as well as it seems to and doesn't overheat under stress, I'll be looking for the next step up in Teslas, a K8 or K10, before I spend a fortune on an M5000 or K6000. The K4200 cost $520 and the M2090 $86 used ($606 total), whereas a used M5000 is still in the $1,400-$1,600 category. I've also thought of possibly changing to a z820 and having the K4200 and two M2090s, providing 16GB of video memory and 1344 + (2X 512) = 2,368 CUDA cores. So, I think of this idea as saving about $800-$1,000. Of course, if the applications can run on GTX, buying a GTX 1070 would be even better value, but I use several Quadro-specific applications.
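The z820 tally above is just arithmetic on the published specs (K4200: 1344 CUDA cores / 4GB; M2090: 512 cores / 6GB), which a quick sketch confirms:

```python
# Published specs for the cards being considered: (CUDA cores, memory in GB).
CARDS = {
    "Quadro K4200": (1344, 4),
    "Tesla M2090": (512, 6),
}

def tally(config):
    """Sum CUDA cores and video memory over a list of card names."""
    cores = sum(CARDS[name][0] for name in config)
    mem = sum(CARDS[name][1] for name in config)
    return cores, mem

# Proposed z820 configuration: K4200 plus two M2090s.
cores, mem = tally(["Quadro K4200", "Tesla M2090", "Tesla M2090"])
print(cores, mem)  # 2368 cores, 16 GB
```

Worth remembering that GPU renderers like Octane keep a full copy of the scene on each card, so the 16GB is an aggregate figure, not a single pool.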
I'm thinking of trying the M2090 in the z420, which is a slight gamble as that system has a 600W PSU.
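The 600W gamble can at least be sanity-checked on paper. Here's a rough budget: the 225W M2090 and 108W K4200 figures are NVIDIA's board-power specs, but the CPU, motherboard, and drive numbers are ballpark assumptions on my part, not measured draws, and whether both cards would actually sit in the z420 together is itself hypothetical.

```python
# Rough power budget for a z420 with the M2090 added.
# GPU figures are NVIDIA board-power specs; the rest are assumptions.
Z420_PSU_WATTS = 600

components = {
    "Xeon CPU": 130,             # typical E5-series TDP (assumption)
    "Tesla M2090": 225,          # NVIDIA board-power spec
    "Quadro K4200": 108,         # NVIDIA board-power spec
    "Motherboard/RAM/fans": 60,  # assumption
    "Drives": 25,                # assumption
}

total = sum(components.values())
headroom = Z420_PSU_WATTS - total
print(f"Estimated draw {total}W, headroom {headroom}W "
      f"({100 * total / Z420_PSU_WATTS:.0f}% of PSU)")
```

On those assumptions the worst-case draw lands around 90% of the PSU rating, which is why it's a gamble rather than an obvious no.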
Cheers,
BambiBoomZ