Mac Pro – Best Graphics Card for 50” 4K Ultra HD LED TV as Monitor

Tags: display, gpu, mac pro

I have a 2009 Mac Pro 4.1, with the following specifications:

2 × 2.93 GHz quad-core Intel Xeon processors
32 GB RAM
2 × 1 TB hard drives
1 Mac Pro RAID card
1 Nvidia GeForce GT 120 512 MB

The work I do benefits from lots of screen real estate. Prior to switching to the Mac Pro, I used a pair of 27" iMacs extending my desktop across the two displays.

I have started thinking about whether I could/should get a 50” 4K Ultra HD LED TV and use it as a monitor. The size would work well in my office, and it would give me more square inches of display than the pair of iMacs.

I know little about graphics cards and displays, but I understand that I want both a display and a graphics card that support 60 Hz to avoid things like cursor/mouse movement lag.

My work is mostly text and some screen drawing, but no extensive graphics. Recreationally, some videos.

So, some questions.

Does a 50” 4K Ultra HD LED TV as a monitor make sense?
Positives vs Negatives?
Other recommendations?

What graphics card(s) will work in the Mac Pro for what I want to cobble together?

Any suggestions on the best value given my use?

Best Answer

TL;DR: Don't get a TV, get (almost) whatever graphics card you want

Using a TV as a monitor

Let's start with this issue. There's nothing inherently wrong with using a TV as a monitor, but it's probably not a good purchasing decision.

Here are the issues I'll go over:

  1. Physical space
  2. Usability
  3. Resolution and PPI
  4. Color
  5. Response time
  6. Price

Let's start from the top and work our way down.

Assuming you have the physical space to fit a 50" beast, you're fine. With that said, you'll probably be tilting your head up a lot, as you'll have more vertical space than you'll probably want. You'd be better off with a wider aspect ratio or a combination of narrower monitors placed side by side to minimize strain.

You'll obviously be losing out on some nice features of a monitor if you use a TV: certain inputs, USB hubs, onboard audio, better out-of-the-box color calibration, nicer stand/buttons/bezel, support for easy brightness changes, etc. These are the sorts of things you can probably work around, but might be annoying.

Even at 4K (3840×2160), a 50" TV gives you a mere 88.12 PPI. That's not awful (72 PPI is generally considered the minimum), but it's not good (the 27" 1440p iMac sits at 108.79 PPI), and you'll probably notice. You'd probably be happier spending that same money on a smaller display with higher PPI, even at the expense of total resolution or screen space.
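If you want to run the numbers for other candidate screens, the PPI figure is just the pixel diagonal divided by the physical diagonal. Here's a minimal sketch (the function name is just for illustration):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: pixel diagonal divided by physical diagonal."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(3840, 2160, 50), 2))  # 50" 4K TV      -> 88.12
print(round(ppi(2560, 1440, 27), 2))  # 27" 1440p iMac -> 108.79
```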

Color on modern TVs is generally quite good, especially if you go the OLED route, but out-of-the-box calibration can vary (although that happens with monitors as well). However, it sounds like you don't really care about color accuracy, so this isn't too important.

One place where you're very likely to suffer is response time. TVs usually have awful response times, some above 100 ms. That's getting to the point where it may be legitimately annoying even for some work. Given that you're not gaming, you'll be fine with a higher response time, but still, make sure you know what you're getting.

Price can go either way: depending on exactly what you buy, either option can end up costing more. But chances are, you won't save money by getting a TV.

With that said, all this will change based on exactly what TV you get. But this is a pretty fair generalization.

One final word of advice: you said you had two 27" iMacs. Why not keep using those via Target Display Mode? You won't be spending any more money and you'll have two beautiful iMacs with a lot of real estate. Throwing that out is throwing out good money. You can add a few more monitors on top of that, but you don't need to replace them.

TL;DR: Use yo iMacs, and if you really need more real estate, buy a proper monitor.

Graphics Card

Graphics cards + Mac Pro = fun.

There are two kinds of graphics cards we'll be looking at here: flashed and unflashed cards.

Flashing is the process of putting a different piece of firmware on a graphics card. Why would you need to do that? Well, if you've ever looked at Mac graphics cards, they're very expensive compared to their PC counterparts, and generally the only difference is the firmware. Most PC cards can be reflashed with the Mac firmware. I'll drop this here as further reading that does a much better job explaining what I'm trying to get at.

Sound too complicated? Don't worry, you probably don't need to flash your card at all. Most modern cards (usually AMD Tahiti and newer; Nvidia is a bit weirder) will work without being flashed once you get to the OS X desktop. This does mean, though, that certain things won't work (for example, your screen will stay black until the OS finishes booting).

If you want a Mac card that's flashed without the hassle, you can pay MacVidCards to do it for you. They charge a decent amount, but they have solid firmwares that work.

Outside of making sure it works with your Mac, just get whatever is appropriate for your workload. It doesn't sound like you need a lot of compute power, just a lot of pixels pushed, so a mid-tier graphics card should work.
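Once a card is installed, one quick sanity check is whether OS X actually reports the TV at 3840 x 2160 and 60 Hz. Here's a minimal sketch, assuming the stock `system_profiler` tool and the Python that ships with OS X (the field names in the report vary a bit between OS X versions, so treat the keyword filter as an approximation):

```python
import subprocess

# SPDisplaysDataType lists every GPU and attached display, including the
# resolution (and usually the refresh rate) each display is running at.
report = subprocess.check_output(
    ["system_profiler", "SPDisplaysDataType"],
    universal_newlines=True,  # decode bytes to str; works on Python 2 and 3
)

# Keep only the lines we care about: GPU model, resolution, refresh rate.
for line in report.splitlines():
    if any(key in line for key in ("Chipset Model", "Resolution", "Refresh")):
        print(line.strip())
```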

TL;DR: Get the appropriate card, it'll probably work.

Final Words

Feel free to add comments and I'll respond to them. There's a lot of subjective content here, so what's best for you != what's best for the average person.