Debian – Create Windows 10 Bootable USB Drive Using Terminal (dd)

Tags: dd, debian, gparted, linux, usb-flash-drive

My goal is simple, the title says it all, but every way I've tried has failed. I've read instructions on various sites (besides here) and they all seem to be missing something. This is what I have:

  • a 16 GB SanDisk USB 3 drive
  • a Debian Jessie machine
  • a Windows machine and a MacBook Pro

Though I can easily create a bootable Windows 10 USB with Rufus, my goal is more educational: I want to understand what is going on, what the source of my failure is, and, if possible, how to make it work.

To create a Win10 drive from the terminal, I first tried this command:

sudo dd if=Windows10.iso of=/dev/sdc1 bs=512k

I get a partition that seems to mount on Debian, but is otherwise unrecognized on Windows and Mac. GParted reports the filesystem as unknown (screenshot).
By comparison, another normally working USB flash drive (I have 4) reads correctly (screenshot).

I'd read in some places that you're not supposed to output to the partition (sdc1) but to the drive (sdc), so I tried this:

sudo pv Windows10.iso | sudo dd of=/dev/sdc bs=5M

(For those following along, that is the same write, but piped through pv for a progress bar, and targeting sdc instead of sdc1.)
This appears to nuke the entire partition table, as fdisk shows (screenshot of my terminal output).

This was upsetting, but I decided to start fresh. I rebooted and ran the following commands:

sudo umount /dev/sdc1
sudo wipefs -a /dev/sdc
sudo fdisk -l
sudo fdisk /dev/sdc
n, p, 1, [enter], [enter], t, 7, w

That should create a new partition and change its type from the default (Linux) to type 7 (NTFS), which is what I need. Then I run:

lsblk 

and make the NTFS filesystem with this command:

sudo mkfs -t ntfs /dev/sdc1

After that, I tried running dd again, but with an additional option, conv=fdatasync (which some folks say ensures nothing stays in the cache and may solve this problem):

pv Windows10.iso | sudo dd of=/dev/sdc conv=fdatasync bs=512k

(I reduced the block size in case that was the problem.) Regardless of how I do it, I noticed the following:

  • dd does write the files and filesystem, and the result is readable in Linux (I can open the files), but it is useless: lsblk and gparted both say there is nothing there!
  • whether I choose sdc1 or sdc seems only to affect how badly the drive is wrecked: one damages the partition, while the other makes the whole drive appear unallocated.
  • the drive itself is fine: I went into Windows with the same 'wrecked' USB drive, copied over the same file, and verified that it boots up and works fine.

Keep in mind that the same dd approach works with GParted Live. I ran the following:

sudo wipefs -a /dev/sdc
sudo fdisk /dev/sdc
lsblk
sudo mkfs -t vfat /dev/sdc1
pv gparted-live-1.1.0-1amd64.iso|sudo dd of=/dev/sdc bs=4M conv=fdatasync

and got a fully working GParted Live drive.
This is confusing the heck out of me, so I thought I'd ask for help. I know I'd save myself the trouble if I just stuck with Rufus, but this is not about taking the easy route; it's about understanding what is going on. I know a few GUI tools on Linux might solve the problem, but again, my hope is to do it from the good old Unix terminal if possible. If it's not possible, I'd like to know why.
So, to summarize:

  1. Why isn't it working? What am I doing wrong?
  2. Why does dd wreck the partitions for Windows, yet work fine for GParted Live?
  3. Where can I learn more about this less common use of copying images to flash drives?

Thank you so much for all your help! You'll save me hours of headaches!

Best Answer

I want to understand what is going on

Rufus developer here.

What way too many people fail to understand (because Linux ISOs apply this method, even though it is essentially a MAJOR HACK called 'IsoHybrid') is that, in most cases, you cannot simply take an ISO image, copy it byte for byte to a USB drive, and expect it to boot.

That is because the ISO format and the underlying file systems it uses (ISO9660 or UDF) are designed for optical boot, which is a completely different beast from regular HDD or USB boot. For one thing, optical media, and therefore (regular) ISO images, don't have a partition table, which is (usually) essential for HDD or USB boot, and they also (usually) don't have a Master Boot Record, a.k.a. MBR, which is essential for BIOS boot.

This means that, if you do a 1:1 copy of a regular ISO, such as a Windows one, onto a disk, and try to boot from it, this is what's going to happen:

  • A BIOS system, or a UEFI system in Legacy/CSM mode, will not see an MBR, and in particular it will not see the 0x55 0xAA sequence in the very last 2 bytes of the MBR that indicates a disk is BIOS-bootable. Therefore it won't be able to boot that disk in BIOS mode.
  • A UEFI system will (usually) not mount UDF or ISO9660 partitions from a disk or flash drive, because, even if it has drivers for these file systems, the resulting disk you created will be missing an MBR or GPT partition table. When booting a regular disk, UEFI is designed to first look for a partition, and then look for a bootloader (e.g. /efi/boot/bootx64.efi) on that partition. So if there is no MBR or GPT partition table on the media, which will be the case for a regular ISO, it doesn't matter whether the ISO contains a bootloader file, because the UEFI firmware will not be able to mount the partition it resides on.
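As a quick illustration of the first point, you can check for the 0x55 0xAA boot signature yourself. This is just a hedged sketch; the file name is an example:

```shell
# Read the two bytes at offset 510-511, where an MBR stores its boot
# signature; a plain (non-IsoHybrid) ISO will not have 0x55 0xAA there.
sig=$(od -An -tx1 -j 510 -N 2 Windows10.iso | tr -d ' \n')
if [ "$sig" = "55aa" ]; then
    echo "boot signature present: BIOS may consider this disk bootable"
else
    echo "no boot signature: a 1:1 dd copy will not BIOS-boot"
fi
```

Running this against a typical Linux IsoHybrid image should report the signature, while a plain optical ISO should not.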

So, what utilities like Rufus do when creating a bootable disk from a Windows ISO (which is a completely standard optical media image) is:

  • Create a partition table, either MBR or GPT according to what the user selected, and create at least one partition that will typically use FAT32 or NTFS as its file system (note that these are completely different file systems from what an ISO uses).
  • If MBR is used, Rufus writes a bit of code in the MBR that locates the secondary boot loader on the relevant MBR partition, which is designed to start the execution of the Windows kernel, in disk mode, from that partition. Oh, and it also ensures that the 0x55 0xAA boot marker is added at the end of the MBR so that the BIOS sees the disk as bootable. Then it copies the content of the ISO onto a FAT32 or NTFS partition.
  • If GPT is used, Rufus verifies that a UEFI bootloader file, such as /efi/boot/bootx64.efi, actually exists (well, actually it does that before it allows you to select GPT, because there's not much point in trying to create a GPT bootable drive if there is no UEFI bootloader) and then copies it, along with the rest of the ISO files, typically onto a FAT32 partition, since booting from a FAT32 partition is a mandatory requirement of UEFI (but that does not mean UEFI can't boot from NTFS or exFAT if you have the relevant UEFI drivers, which can come in handy if you have a Windows ISO with a file larger than 4 GB, as FAT32 cannot accommodate such files).

Now, the above only works when the secondary bootloaders (i.e. the ones that come from Windows and which Rufus doesn't modify) are designed to support both optical and regular boot, which typically means they need to handle both UDF or ISO9660 and FAT32 or NTFS file systems, as well as the other differences that present themselves when booting from disk vs. from optical media. But Microsoft did design its bootloaders precisely for that, which is the smart thing to do, because, if your target system is UEFI, it means you (usually, as long as FAT32's 4 GB maximum file size doesn't rear its ugly head) don't need a utility to convert an ISO to a bootable USB: you can just format that USB drive to FAT32, copy the ISO files onto it (file copy, not byte copy), and you have bootable media.

And now that we have gone through all of the above, I can get into a rant and explain why I believe that the Linux distro maintainers, who are usually smarter than this, are actually doing their users a disservice, even as they are trying to help them:

Almost all recent Linux distros use a MAJOR HACK called "IsoHybrid", where someone figured out a way to make an ISO9660 optical image masquerade as a regular disk image, with a partition table, an MBR and everything... In other words, most Linux ISOs you find these days abuse the ISO9660 file system to make it look like something it was never designed to be: a dual disk and optical image.

Obviously, the goal is to create an ISO that can also be used with the dd command, even though an ISO should never be able to work that way. And I agree that, in theory, this sounds awesome, because being able to use a single image for completely different purposes should be great for users; but in practice, it leads to issues that are often overlooked:

  • A lot of Linux distro maintainers don't want to bother using a secondary file system that Windows can mount (e.g. they will use ext as the "secondary" file system on top of ISO9660), which means that a lot of Windows users, creating a bootable drive to try Linux for the first time, are super confused as to why they can no longer access the content of their flash drive. It's even worse if the IsoHybrid also includes an EFI System Partition (ESP), because then these users get the impression that their drive has completely shrunk in size. If you go on Reddit or elsewhere, you will find many posts from users who are utterly confused as to what happened to their USB media, which doesn't make for a great Linux first impression...
  • A lot of Linux distro maintainers focus so much on making IsoHybrid work that they completely disregard the option of creating UEFI bootable media by simply copying the content onto a FAT32-formatted partition, which, really, should always be the preferred method of creating UEFI bootable drives (because it's usually a lot less risky to format a partition and then copy files than it is to use the dd command). Because of this, we've seen several issues that make for a subpar user experience with Manjaro, Ubuntu... This is actually my main point of contention with IsoHybrid: it should not serve as an excuse to ditch established means of creating bootable media!
  • GPT and IsoHybrid can be problematic because the secondary GPT table will be seen as corrupted when using dd... which actually leads to a BSOD on Windows 7 (though that's really a Windows bug rather than an IsoHybrid issue). Still, not the best experience for Windows folks creating bootable drives...
  • And finally, because IsoHybrids are presented as if they were the most natural media in the world (which they certainly aren't), people like yourself are led to believe that every ISO image can be written with dd, when it's the exception rather than the rule. This is very unfortunate, because it creates TONS of user confusion, with some Linux users telling people who want to create Windows bootable media that they should just be able to use dd, when that most certainly will never work! Also, if you pick any Linux ISO from 10 years ago, I'm pretty confident you'll find that almost none of them can actually be used to create bootable media with dd, because this "IsoHybrid" thing is a relatively recent development.

As far as I know, Microsoft has no plans to switch to the "hack" that is IsoHybrid for its Windows ISOs, which means you're unlikely to ever be able to use dd to create bootable Windows USB media. Therefore, if you want to create Windows bootable media from an ISO, you either:

  • (UEFI) Need to format a drive with a file system that Windows can boot from (NTFS, FAT32, or more recently exFAT) and extract the ISO files onto it. Note that if you use NTFS or exFAT, you may have to do a little extra work as well...
  • (BIOS/Legacy) Need to format a drive with a file system that Windows can boot from (NTFS or FAT32 -- exFAT will not work because Microsoft never published BIOS bootloaders for it), and then create the relevant bootloader chain, from MBR boot code to volume boot records.
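For the UEFI route, the one file that commonly breaks the FAT32 approach is sources/install.wim, which on recent Windows ISOs can exceed FAT32's 4 GB limit. One hedged workaround on Debian is to split it with wimlib; the paths and split size below are examples, assuming the ISO and the FAT32 drive are mounted at /mnt/iso and /mnt/usb:

```shell
sudo apt install wimtools                  # Debian package providing wimlib-imagex
# Copy everything except the oversized image file...
sudo rsync -r --exclude=sources/install.wim /mnt/iso/ /mnt/usb/
# ...then split install.wim into .swm chunks under the 4 GB limit;
# Windows Setup picks up the .swm parts automatically.
sudo wimlib-imagex split /mnt/iso/sources/install.wim \
     /mnt/usb/sources/install.swm 3800     # chunk size in MiB
```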

It's actually not that complicated to achieve, but it does take a bit more work than a 1:1 copy from an ISO file.

Hope that answers your question.
