The screen resolution difference with a Retina display

Tags: applescript, display, resolution

I can get the screen resolution of my MacBook Pro using the AppleScript code from this Stack Overflow question:

https://stackoverflow.com/questions/1866912/applescript-how-to-get-current-display-resolution

on getScreenSize()
    -- from https://stackoverflow.com/questions/1866912/applescript-how-to-get-current-display-resolution
    set resolutions to {}
    repeat with p in paragraphs of ¬
        (do shell script "system_profiler SPDisplaysDataType | awk '/Resolution:/{ printf \"%s %s\\n\", $2, $4 }'")
        set resolutions to resolutions & {{word 1 of p as number, word 2 of p as number}}
    end repeat
    -- return the last one, assuming it is the external (non-laptop) screen
    return last item of resolutions
end getScreenSize

The returned value matches the information from "About This Mac".

[Screenshot: About This Mac showing the display resolution]

However, when I inspect the window size in a debugger, it shows half of the value reported by the system. I make a window fill the screen and check its bounds.

[Screenshot: debugger showing window bounds at half the system resolution]

This is confusing and problematic, as I'm writing AppleScript code that aligns multiple windows using the set bounds command.

set bounds of s to {x1, y1, x2, y2}

What causes the difference between the display resolution reported by the system and the bounds of a window? Is it safe to assume that (system resolution)/2 should be used when setting bounds on Retina displays?

Best Answer

A MacBook Pro has an effective resolution that depends on the scaling you set up for the display. system_profiler reports the panel's native pixel resolution, but window bounds are measured in points; at the default Retina scale factor of 2x, one point corresponds to two pixels, which is why the debugger shows half of the system value.

[Screenshot: Displays preferences showing the scaled resolution options]

You need to use the effective resolution in your code. You can get it with the snippet below, but it only works when no external monitors are connected.

tell application "Finder"
    set screen_resolution to bounds of window of desktop
end tell
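
If external monitors are connected, one workaround is to parse the effective resolution directly from system_profiler. On Retina Macs, recent macOS versions print a "UI Looks like:" line for each display next to the native "Resolution:" line; the sketch below is a variation of the handler from the question that reads that line instead. The field name is an assumption that may differ across macOS versions, so check the output of system_profiler SPDisplaysDataType on your machine first.

on getEffectiveScreenSizes()
    -- A sketch, assuming system_profiler prints a "UI Looks like:" line
    -- per display with the scaled (point) resolution; verify on your macOS version
    set resolutions to {}
    repeat with p in paragraphs of ¬
        (do shell script "system_profiler SPDisplaysDataType | awk '/UI Looks like:/{ printf \"%s %s\\n\", $4, $6 }'")
        set resolutions to resolutions & {{word 1 of p as number, word 2 of p as number}}
    end repeat
    -- one {width, height} pair per display, in points
    return resolutions
end getEffectiveScreenSizes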

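Once you have the effective resolution, aligning windows is just arithmetic on point coordinates. Below is a minimal sketch that tiles two windows of an app side by side; "Safari" and the window indices are placeholders for whatever windows you want to align, and it assumes a single display so the Finder trick above applies.

-- get the effective resolution of the main display (single-monitor case)
tell application "Finder" to set {x1, y1, x2, y2} to bounds of window of desktop

tell application "Safari" -- placeholder app; must have two open windows
    -- left half of the screen
    set bounds of window 1 to {x1, y1, x2 div 2, y2}
    -- right half of the screen
    set bounds of window 2 to {x2 div 2, y1, x2, y2}
end tell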