I just bought a couple of new hard drives. After I install them, I want to take a lot of time to sort through an old external 4TB drive, another 3TB HDD in an external enclosure, and my current 500GB SSD, all containing lots of duplicate content due to backups here, backups there, backups all over the place.

I want to create a file system to organize these files in a more sensible manner and get rid of all the dupes.

So I'm going to be using a 500GB SSD to organize files into that new folder system I've yet to create, but I want to make sure I'm not putting dupes together again; then, as I finish each 500GB-ish chunk, I'll move it onto my new 8TB backup drive as I go. But as I transfer from one drive to the next I want to make sure that, A, I don't have dupes in the new sorted folders, and B, I don't have dupes on the new 8TB backup drive.

Once I have all the files sorted/organized (which will likely only be about 2.8-3.2 TB of data once the dupes are removed), I'll throw them onto the 4TB drive as a second external backup: if the 8TB drive dies, I'll have it on the 4TB drive, and if the 4TB drive dies, I'll have it on the 8TB drive. The 3TB drive will contain copies of the files I'm actively using rather than those I simply will never delete.

This is how I was framing it in my brain anyway; if anyone else has a better method/software to make this whole process easier, preferably open source/free, I'd appreciate it. You may find some software that will do most of what you require. You can also use DOS, PowerShell, Python, etc., maybe in a series of planned efforts that focus on one particular step or requirement.
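Since Python is one of the options mentioned, here's a minimal sketch of the dupe-checking step: hash every file across the drives you're consolidating and group files whose contents match. The paths and function names are just placeholders for illustration, not any particular tool's API, and for multi-terabyte drives you'd probably want to pre-filter by file size before hashing.

```python
import hashlib
from collections import defaultdict
from pathlib import Path

def file_hash(path, chunk_size=1 << 20):
    """Return the SHA-256 hex digest of a file, read in 1 MB chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def find_dupes(*roots):
    """Group files from all roots by content hash; keep groups with >1 file."""
    by_hash = defaultdict(list)
    for root in roots:
        for p in Path(root).rglob("*"):
            if p.is_file():
                by_hash[file_hash(p)].append(p)
    return {h: ps for h, ps in by_hash.items() if len(ps) > 1}

# Hypothetical usage -- drive letters are placeholders for your drives:
# dupes = find_dupes("D:/old-4tb", "E:/old-3tb", "C:/sorted")
# for h, paths in dupes.items():
#     print(h[:12], *paths, sep="\n  ")
```

You could run the same function against the sorted folders and the 8TB backup together, so both the "no dupes within the sort" and "no dupes on the backup" checks fall out of one pass.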