Duplicates in a list for file audit
I have been tasked with auditing files on our P Drive at work and checking them against an online portal. The complication is that someone else attempted this a couple of months ago and deleted a heap of files, so my work had a restore created from before the damage was done. I now need to check three locations for the same files.

I can get lists for the local locations using `dir /b` in cmd, and I plan on exporting a PDF of the online portion, giving me three lists of files across the three locations. My issue is how to compare these three lists and identify missing files, especially since the lists will not be in the same order. I am open to anything that will speed this up, as it's currently a very manual process. I am hoping to use a template that I can open for John Smith, paste in the files from all three locations, and have it flag what's missing from which location. Below is a table of something I have in mind, but it doesn't need to look like this. I am assuming I will need to use UNIQUE and VSTACK or something similar to build a master list of all files and then check that against the list below.
| Restore | Current | Online | Missing |
|---|---|---|---|
| ab | ab | Current | |
| bc | ab | bc | |
| cd | bc | Current | |
| de | de | Online | |
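If an Excel template ends up being too fiddly, the same comparison is easy to script. Below is a minimal Python sketch under the assumption that each location's list has been saved as a plain-text file with one filename per line (e.g. `dir /b > restore.txt` for the local drives, and the portal's filenames copied out of the PDF export). The file names `restore.txt`, `current.txt`, and `online.txt` and the helper names are hypothetical, not from the original post.

```python
# Sketch: compare three file lists and flag which location(s) each file
# is missing from. Assumes one filename per line in each input file.

def load_list(path):
    """Read a list file into a set: one name per line, blanks skipped,
    case-folded so 'Report.PDF' and 'report.pdf' match."""
    with open(path, encoding="utf-8") as f:
        return {line.strip().lower() for line in f if line.strip()}

def audit(restore, current, online):
    """Build the union of all filenames (like UNIQUE over VSTACK in
    Excel), then report {filename: [locations it is missing from]}."""
    every_file = restore | current | online
    report = {}
    for name in sorted(every_file):
        missing = [label for label, files in (("Restore", restore),
                                              ("Current", current),
                                              ("Online", online))
                   if name not in files]
        if missing:
            report[name] = missing
    return report

if __name__ == "__main__":
    # With real exports you would use load_list("restore.txt") etc.;
    # inline sets here just to show the output shape.
    restore = {"ab", "bc", "cd", "de"}
    current = {"ab", "bc", "de"}
    online = {"ab", "cd"}
    for name, missing in audit(restore, current, online).items():
        print(f"{name}: missing from {', '.join(missing)}")
```

Because everything goes through sets, the lists can be in any order and duplicate lines in an export collapse automatically, which also sidesteps the duplicates problem in the title.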