Hmmm... honestly, I'm not quite sure what went wrong earlier, but I have it working now, so I'll go through my steps one by one.
First, simply connect the target monitor directly via HDMI to the graphics card to make sure it works. You might need to fuss with the "screen resolution" options in Windows or your graphics card's settings to make sure the HDMI monitor is being used and is the primary display (the primary display matters because full-screen games always appear on it, so that's what gets mirrored).
Once you're sure the HDMI connection is working, introduce the splitter: connect the graphics card to the splitter's In port and the monitor to one of its Out ports, and confirm everything works the same. Next, run a second HDMI cable from another splitter Out port to the TV. Confirm all is working.
At this point you'll want to note that every time the HDMI input on the TV changes, your monitors may flicker. This seems to just be a quirk of HDMI and/or how graphics cards (or maybe just mine?) handle HDMI hot-plug events. It shouldn't be anything to worry about, as your PC's displays should return to normal after the flicker; it's just annoying. It seems to only happen when my HDMI switch flicks back to my PC's HDMI out, even if the TV isn't on.
For what it's worth, I have a fairly exotic setup: the HDMI signal goes from the graphics card to a splitter, one splitter output feeds the monitor and the other feeds a switch, and the switch feeds a second splitter that runs to a capture card, the TV, and a second monitor. Even then, the setup works fine.
If at any point the monitor's image is underscanned (black bars at all edges), make sure to disable underscan in AMD Vision/Catalyst Control Center under My Digital Flat-Panels > Scaling Options. The slider should read "0", all the way to the right. Make sure the TV/monitor is set to "Just scan", "1:1 pixel ratio", or whatever your vendor calls "no overscan". Not sure if this step is needed on Nvidia cards, but if it is, the setup is probably similar.
Computer monitors are meant to show the whole image and most TVs are not, so the underscanning was an attempt by the graphics card to keep everything on screen. However, ideally you should set both the graphics card and the TV/monitor to 1:1 scaling, since scaling degrades image quality.
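To make the trade-off concrete, here's a quick sketch of what overscan costs you. The 1920x1080 resolution and the 5% crop per edge are my own illustrative assumptions (5% is a common TV overscan default), not figures from the post above:

```python
# Illustrative numbers: a 1080p source on a TV that crops 5% per edge.
src_w, src_h = 1920, 1080
overscan = 0.05  # fraction of the image the TV crops at each edge

# Overscan: the TV enlarges the picture so the edges fall off-screen,
# leaving only the central region of the source visible.
visible_w = int(src_w * (1 - 2 * overscan))  # 1728
visible_h = int(src_h * (1 - 2 * overscan))  # 972
print(f"visible region with overscan: {visible_w}x{visible_h}")

# Underscan is the GPU's workaround: it shrinks the picture so everything
# fits, but then source pixels no longer map 1:1 onto physical pixels,
# which blurs text and fine detail. Setting both ends to 1:1 ("Just scan"
# on the TV, 0% underscan on the card) avoids any rescaling at all.
lost_pixels = src_w * src_h - visible_w * visible_h
print(f"source pixels cropped by overscan: {lost_pixels}")
```

Either way you lose something (cropped edges or a softened image), which is why 1:1 on both ends is the only setting with no downside.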
Apparently having Hyper-V installed will cause this issue.
I still cannot get HDCP to work over DisplayPort, but after uninstalling Hyper-V I do get HDCP over DVI.
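For reference, one way to remove Hyper-V is through PowerShell rather than the "Turn Windows features on or off" dialog. This is a standard Windows cmdlet, but the exact feature name can vary by Windows edition, so treat it as a starting point; it needs an elevated prompt and a reboot:

```powershell
# Remove the Hyper-V optional feature (run as Administrator; reboot afterwards).
Disable-WindowsOptionalFeature -Online -FeatureName Microsoft-Hyper-V-All
```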