I am trying to import the list of the 10,000 most common passwords into a table, in order to ban those passwords from being used in a given system.
The file is a single-column TXT file. Loading it with LOAD DATA INFILE
into a table with a single column gives me a primary key error. So I thought I'd add an id column with AUTO_INCREMENT and pass NULL
for each row so the keys would be generated automatically. That doesn't work either (or I have a syntax problem), because it gives me 10,000 rows with numbered ids and NULL password fields.
Can this be done, or should I just write a Python script that prefixes each text line with an index, and then import it that way?
Here's what I am trying to do:
LOAD DATA INFILE '/tmp/passwords.txt' INTO TABLE `common_passwords` (password,id) SET id=@id+1,password=@password;
And here's the table:
CREATE TABLE `common_passwords` (
`id` int(11) NOT NULL AUTO_INCREMENT,
`password` varchar(45) DEFAULT NULL,
PRIMARY KEY (`id`)
) ENGINE=InnoDB AUTO_INCREMENT=16384 DEFAULT CHARSET=latin1;
It seems like we should be able to do this rather easily in SQL…
Best Answer
To quote the manual:

"If you specify no FIELDS or LINES clause, the defaults are the same as if you had written this:

FIELDS TERMINATED BY '\t' ENCLOSED BY '' ESCAPED BY '\\'
LINES TERMINATED BY '\n' STARTING BY ''"

-- https://dev.mysql.com/doc/refman/8.0/en/load-data.html
That is, if any of the passwords contain tabs or backslashes, the data will be mangled as it is read, because those characters are treated as the default field terminator and escape character. This may explain the problem you are having.
I guess this may be the fix:
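A sketch of such a fix, assuming one password per line with '\n' line endings (the file path and table are taken from the question): list only the password column so the AUTO_INCREMENT id is generated for you, and set ESCAPED BY '' so literal backslashes in passwords are loaded verbatim rather than interpreted as escapes:

```
-- Load each line into `password` only; `id` is filled by AUTO_INCREMENT.
-- ESCAPED BY '' disables backslash escaping so a password like "a\b" loads intact.
LOAD DATA INFILE '/tmp/passwords.txt'
INTO TABLE common_passwords
FIELDS TERMINATED BY '\t' ESCAPED BY ''
LINES TERMINATED BY '\n'
(password);
```

If the server rejects the file path, check the secure_file_priv system variable, which restricts where LOAD DATA INFILE may read from; LOAD DATA LOCAL INFILE is an alternative when loading from the client side.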