PostgreSQL – Can't restart Postgres on Windows 10 with external drive

postgresql | windows-10

I've been working on a postgres database, with my data directory on my 8TB external USB drive. Occasionally, I need to restart the postgres instance to change configuration settings, but for some reason I can never get the service to restart.

  • Everything works fine in Windows 7
  • Can't restart postgres in Windows 10
  • Data directory in D:\postgresdata\
  • Database is empty at the moment, just the default postgres instance

  • When Windows starts, the database engine starts automatically and functions correctly

  • Running "show data_directory" returns D:/postgresdata/
  • Restarting my Postgres service from services.msc stops it, but it never successfully starts again.
  • Running pg_ctl.exe start -D D:\postgresdata fails with these errors:

FATAL: could not access status of transaction 0
DETAIL: Could not open file "pg_notify/0000": Invalid argument.

Things I've tried:

  • I've tested the same scenario on my C: drive, and it always starts and restarts correctly.
  • I've run chkdsk on the D: drive, nothing is wrong with it.
  • I've played around with permissions on D: and nothing makes any difference.
  • I've made a second partition from my C: drive and verified that a data directory on it always starts and restarts correctly.
  • Initializing a new database with pg_ctl init -D D:\someotherpath\ on the D: drive does not work either; it fails with the same error.

I'm at a loss for what else to try. The weirdest part is that the database always starts up and works correctly at system boot, but never in any other circumstance.

Update:
I attempted to run Postgres against a USB thumb drive, and the error is the same.

Best Answer

What type of partitioning scheme and USB chipset (USB-SATA bridge) does the drive use? There are all sorts of unexpected limits you can still hit on a modern USB drive with certain partition types, sizes, and chipsets. I've run into 2 TB limits imposed by some USB drive controllers: you can format a partition larger than that, but when you try to read or write a block beyond that boundary you get errors.
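If you're not sure, PowerShell can report the partition style directly. A minimal sketch, assuming Windows 8 or later where the Get-Disk cmdlet is available:

    # List each disk with its partition style (MBR/GPT/RAW) and raw size
    Get-Disk | Format-Table Number, FriendlyName, PartitionStyle, Size -AutoSize

An 8TB drive reporting an MBR partition style would be a strong hint that you're hitting exactly this kind of limit, since MBR addressing tops out around 2 TB with 512-byte sectors.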

You can also hit limits imposed by the older MS-DOS (MBR) partition scheme, which can be overcome by reformatting the drive as GPT (obviously move/back up your data first).
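If the disk does turn out to be MBR, a rough PowerShell sketch of the conversion is below. Clear-Disk wipes the disk entirely, and the disk number 1 here is only an assumption, so confirm it with Get-Disk first:

    # DESTRUCTIVE: wipes the whole disk. Back up your data first.
    # Disk number 1 is an assumption - verify with Get-Disk before running.
    Clear-Disk -Number 1 -RemoveData -Confirm:$true
    Initialize-Disk -Number 1 -PartitionStyle GPT
    New-Partition -DiskNumber 1 -UseMaximumSize -AssignDriveLetter |
        Format-Volume -FileSystem NTFS -NewFileSystemLabel "postgresdata"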

There are more in-depth explanations here and here.

So your solution might simply be to buy a more modern/high-end external enclosure.

Another possibility is that you're hitting some sort of bug related to write-back caching. You could try adjusting the drive's cache settings as described in this article (in Windows, write caching is controlled from the drive's Properties > Policies tab in Device Manager).

The fact that it works in Windows 7 but not Windows 10 could also indicate a bad driver, but I'm not sure there's anything you can do about that other than checking the manufacturer's website.

You can test all of the above theories by writing some really big files to the drive and then checking them for corruption. One way is to record a giant MPEG with your webcam or with VLC screen capture; a scripted alternative is sketched below. To be a useful test, the files need to be really big (2 TB+).
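Here is a rough PowerShell sketch of that test: it writes a fixed 1 MB pattern repeatedly until the file crosses the suspected 2 TB boundary, then reads it back and flags any block that doesn't match. The path and size are assumptions; point them at your external drive:

    # Write a known 1 MB pattern past the suspected 2 TB limit, then verify it.
    # $path and $sizeGB are assumptions - adjust for your drive.
    $path   = 'D:\bigtest.bin'
    $sizeGB = 2200                                  # a bit past 2 TB
    $block  = New-Object byte[] (1MB)
    (New-Object System.Random 42).NextBytes($block) # fixed, reproducible pattern

    $fs = [System.IO.File]::OpenWrite($path)
    for ($i = 0; $i -lt $sizeGB * 1024; $i++) { $fs.Write($block, 0, $block.Length) }
    $fs.Close()

    # Read the file back and compare each block's hash against the pattern's
    $md5  = [System.Security.Cryptography.MD5]::Create()
    $good = [Convert]::ToBase64String($md5.ComputeHash($block))
    $fs   = [System.IO.File]::OpenRead($path)
    $buf  = New-Object byte[] (1MB)
    $i    = 0
    while (($n = $fs.Read($buf, 0, $buf.Length)) -gt 0) {
        if ([Convert]::ToBase64String($md5.ComputeHash($buf, 0, $n)) -ne $good) {
            Write-Warning ("Corrupt block $i, about {0:N1} GB into the file" -f ($i / 1024))
        }
        $i++
    }
    $fs.Close()

If the warnings start appearing only past a consistent offset (say, right around the 2 TB mark), that points at the enclosure's controller rather than at Postgres.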