According to Wikipedia:
The AirTunes part of the AirPlay protocol stack uses UDP for streaming
audio and is based on the RTSP network control protocol.[12] The
streams are encrypted with AES, requiring the receiver to have access
to the appropriate private key to decrypt the streams.[13] The AirPort
Express' streaming media capabilities use Apple's Remote Audio Output
Protocol (RAOP), a proprietary variant of RTSP/RTP. Using
WDS-bridging,[14] the AirPort Express can allow AirPlay functionality
(as well as Internet access, file and print sharing, etc.) across a
larger distance in a mixed environment of wired and up to 10 wireless
clients.
See the page here.
Notice the part where it says that UDP is used for streaming audio.
UDP does no error correction or retransmission. With TCP (which most traffic uses), when I send a packet from A to B, each packet carries a checksum, and B acknowledges what it receives. If the checksum doesn't match — meaning the data was malformed somewhere on its way to B — or if no acknowledgement ever comes back, I send that packet again.
If I have to send that packet a bunch of times before it is successfully read on the other side, that would "sound like" something unexpected. Lag is a bit different: lag happens when there's a delay of some sort, or the line is saturated and it actually takes time for the signal to get there. That can happen under either model.
The difference is, with UDP, if packets get lost or corrupted, it doesn't matter. iTunes will send whatever data is required to represent the song now. And now, and now. If you miss a packet or it gets malformed, oh well — iTunes will keep sending only what is pertinent now.
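That "fire and forget" behavior is easy to see at the socket level. Here's a minimal sketch in Python (not anything AirPlay-specific — just plain UDP datagrams over loopback, with made-up "audio chunk" payloads): `sendto()` returns immediately, with no connection, no acknowledgement, and no retransmission.

```python
import socket

# A UDP "receiver" bound to any free port on loopback.
recv = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
recv.bind(("127.0.0.1", 0))
recv.settimeout(2)                    # don't hang forever if a datagram is lost
port = recv.getsockname()[1]

# The "sender" just fires datagrams and moves on. If one were lost or
# corrupted in transit, the sender would never know and would simply
# keep sending the next chunk -- exactly the streaming behavior above.
send = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
for seq in range(3):
    send.sendto(f"audio-chunk-{seq}".encode(), ("127.0.0.1", port))

received = []
for _ in range(3):
    data, _addr = recv.recvfrom(1024)  # each datagram arrives whole, or not at all
    received.append(data.decode())
print(received)

send.close()
recv.close()
```

On loopback nothing actually gets lost, so all three chunks arrive; over a real Wi-Fi link, a dropped datagram would just be a momentary glitch in the audio rather than a stall while the sender retransmits.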
Long story short: you should be fine. Any glitches will probably be short and sweet. And, out of sync? Probably not discernible to any human.
You wouldn't necessarily get "sharper" resolution — splitting a 1080p HDTV into two monitors side-by-side would result in two 960x1080 displays (1920 ÷ 2 = 960):
+----------------+ +--------+--------+
| | | | |
| | -> | | |
| | -> | | |
| | | | |
+----------------+ +--------+--------+
Is that what you're looking for? Despite the admonishments in your question, I don't think that's really what you're looking for. You'll get the exact same thing by just resizing your FCP windows to take up half the screen and putting both of them on the same screen. (You're stuck with the 1920x1080 pixels your display has any way you slice it.)
In (more-direct) answer to your question: no, I don't think such programs exist.
Note: I'm not familiar with Final Cut Pro, so I'm assuming its multi-screen behavior is implemented with multiple windows.
This is not possible.
Apple does not "link" audio devices and video devices in any way. As far as macOS is concerned, you have discrete audio devices and discrete video (display) devices. In fact, you cannot even control the audio volume from the Mac for a device connected via HDMI or DisplayPort.
Some apps (e.g., Vox Music Player) allow you to select a different audio device from the default. If the app you're using (Safari, for example) is playing a YouTube video, it would have to be "aware" of the other audio devices (it isn't).
Your app would not only have to be aware of the additional audio devices, but also of the location of its window in relation to the monitor it's displayed on. It's highly unlikely app programmers would include that functionality.