Sure, a simple script can parse the results of the tool and insert rows into the database.
You can make composite unique keys in MySQL (the table name here is a placeholder):
CREATE UNIQUE INDEX foo ON yourtable (mac, time);
If you try to insert a dup, it will fail (but will you ever have a duplicate time?).
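A minimal sketch of the idea, using Python's sqlite3 for illustration (the same principle applies to MySQL's CREATE UNIQUE INDEX; the table and column names are assumptions):

```python
import sqlite3

# sqlite3 stands in for MySQL here; "readings" is a hypothetical table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE readings (mac TEXT, time INTEGER)")
# Composite UNIQUE index: no two rows may share the same (mac, time) pair.
conn.execute("CREATE UNIQUE INDEX foo ON readings (mac, time)")

conn.execute("INSERT INTO readings VALUES ('aa:bb:cc', 1000)")
dup_rejected = False
try:
    # Same (mac, time) pair again -> the insert fails.
    conn.execute("INSERT INTO readings VALUES ('aa:bb:cc', 1000)")
except sqlite3.IntegrityError:
    dup_rejected = True
```

If you want the script to skip duplicates silently instead of failing, MySQL offers INSERT IGNORE for exactly this case.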
Even though 10,000 rows are not a lot, I would go for a slightly different solution that will perform a lot better.
Keep your master table the same: id, name.
Create a new alias/search table: itemid, alias.
Create an index on alias and a foreign key constraint on itemid.
Do not store a delimited list of aliases; instead, have a row for each individual alias, as well as one for the original (authoritative?) name.
Then run the searches against this table, and remove the wildcard from the beginning of the LIKE comparison. A wildcard at the beginning of the LIKE value forces the query to ignore the index.
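The schema described above could be sketched roughly as follows, using sqlite3 for illustration (the answer targets MySQL; the exact table names are assumptions):

```python
import sqlite3

# Sketch of the two tables: a master table and an alias/search table.
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # sqlite needs this per connection
conn.execute("CREATE TABLE items (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("""
    CREATE TABLE itemaliases (
        itemid INTEGER REFERENCES items(id),  -- foreign key to the master table
        alias  TEXT
    )
""")
# Index on alias so a LIKE 'keyword%' search can use it.
conn.execute("CREATE INDEX idx_alias ON itemaliases (alias)")
```

One row per alias (rather than a delimited list in one column) is what lets the index do the work on lookups.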
So for the example data you have, you would end up with an alias table like the following:
itemid | alias
--------------------------------------
1 | Computer
1 | PC
1 | Mac
2 | Kraft Dinner
3 | Chesterfield
3 | Couch
3 | Sofa
3 | Settee
4 | BMW
4 | Beamer
5 | Microsoft SQL Server
5 | MSSQL
5 | SQLServer
5 | SQL Server
This could then be queried with:
SELECT i.id, i.name, a.alias searchmatch
FROM items i
INNER JOIN itemaliases a ON i.id = a.itemid
WHERE a.alias like 'keyword%'
ORDER BY i.name
LIMIT 15;
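Here is the query above run end to end against a slice of the sample data, again with sqlite3 standing in for MySQL (a sketch only; searching for "SQL" matches through the alias table):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("CREATE TABLE itemaliases (itemid INTEGER, alias TEXT)")
conn.execute("CREATE INDEX idx_alias ON itemaliases (alias)")
conn.executemany("INSERT INTO items VALUES (?, ?)",
                 [(1, "Computer"), (5, "Microsoft SQL Server")])
conn.executemany("INSERT INTO itemaliases VALUES (?, ?)",
                 [(1, "PC"), (1, "Mac"), (5, "MSSQL"), (5, "SQL Server")])

keyword = "SQL"
rows = conn.execute(
    """SELECT i.id, i.name, a.alias AS searchmatch
       FROM items i
       INNER JOIN itemaliases a ON i.id = a.itemid
       WHERE a.alias LIKE ? || '%'   -- wildcard only at the end
       ORDER BY i.name
       LIMIT 15""", (keyword,)).fetchall()
print(rows)  # -> [(5, 'Microsoft SQL Server', 'SQL Server')]
```

Note that 'MSSQL' does not match, because the wildcard sits only at the end of the pattern; that is the trade-off that keeps the lookup index-friendly.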
Best Answer
The easiest way to remove duplicates is to join the table with itself. All duplicates will be deleted except the one with the lowest id.