macOS – How to diagnose a Python program being killed due to an out-of-memory error

Tags: macos, memory, virtual-memory

I've been building a neural network that analyzes a large amount of data (about 40 GB), and my iMac kills the process after it has been running for about a day.

In the past, on Linux, I've created a large swap file to get around memory limitations.
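
For reference, that Linux workaround looks roughly like this (a sketch from memory; the size and path are just examples):

    # Create an 8 GB swap file, restrict its permissions, format it, and enable it
    sudo fallocate -l 8G /swapfile
    sudo chmod 600 /swapfile
    sudo mkswap /swapfile
    sudo swapon /swapfile
    swapon --show    # confirm the new swap space is active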

I see:

$ python processor.py
[...maybe some std out messages, specific to what I'm doing...]
Killed.

I've come to know this as the "you've used up too much memory, goodbye" message. Again, on Linux I've been able to solve it with a large swap file.
How can I increase the swap limit on my Mac so that processes that use large amounts of memory don't get killed?

I'm not sure how to get more information about why the process was killed.
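
One way to at least confirm that the kernel sent SIGKILL (rather than the program exiting on its own) is to check the shell's exit status right after the run; the unified-log search below is a guess at the relevant subsystem name:

    python processor.py
    echo "exit status: $?"    # 137 = 128 + SIGKILL(9): the process was killed, it did not crash on its own

    # assumption: kernel out-of-memory kills are logged under the "memorystatus" subsystem
    log show --last 1d --predicate 'eventMessage CONTAINS[c] "memorystatus"' | tail -n 20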

Best Answer

Here are some ways to check for issues, though I fear they may not be a complete or even the correct solution without more peeking and poking:

In another shell after you start your python process (or if you screen / tmux it), run the following; a sketch that strings them together into an hourly log follows the list:

  1. df / (free disk space on the root volume, which the swap files need in order to grow)
  2. top -l 1 -S | head -12 (a single top sample, including swap and purgeable-memory statistics)
  3. vm_stat and vm_stat 600 (virtual-memory counters, once and then every 600 seconds)
  4. sudo du -sm /var/vm/* (the current size, in MB, of the swap files)
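
A rough sketch that strings those commands together into an hourly log (the interval and log path are arbitrary; add sudo back on the du line if it cannot read /var/vm):

    #!/bin/sh
    # Record disk, memory, and swap statistics once an hour so you can see
    # how things trend while the neural net runs.
    LOG="$HOME/nn-memory.log"
    while true; do
        {
            date
            df /                     # free disk space (the swap files need room to grow)
            top -l 1 -S | head -12   # one top sample, including swap and purgeable-memory stats
            vm_stat                  # virtual-memory counters (pageouts, compression, ...)
            du -sm /var/vm/*         # current size of the swap files
            echo "----"
        } >> "$LOG" 2>&1
        sleep 3600
    done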

Once you have a good baseline, you can watch things over time to see how the neural net is behaving each hour for a while. If you think things are about to stop, you can run sysdiagnose python (or use the process number if you have more than one python process running). Also, if you don't want to wait a day for things to bulk up, you can inflict memory_pressure on the system before or after starting the neural net in Python. See this answer for how to watch Activity Monitor while you run it.
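
For example (the memory_pressure flags below are from memory of its man page, so double-check them with man memory_pressure before relying on them):

    # Capture a full diagnostic snapshot, focused on the python process
    sudo sysdiagnose python          # or: sudo sysdiagnose <pid> if several python processes are running

    # Print the current system-wide memory free percentage and pressure statistics
    memory_pressure

    # Simulate a critical memory-pressure level while you watch how the neural net reacts
    memory_pressure -S -l critical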