I’m in the process of importing groups of files from my deeply-nested Finder hierarchy, which process has caused me to wonder if it’s necessary to replicate the hierarchy within EagleFiler. If I follow a consistent naming convention, and if I tag the files carefully, can I safely put tens of thousands of files into a single directory, or would it be prudent to bring the folders into EagleFiler, also?
Yes, it’s safe. It doesn’t really matter to EagleFiler, except that macOS (at least on HFS+; I’m not sure about APFS) is a bit slower with lots of files in the same folder, and a very large folder can cause bigger inefficiencies for some other software, such as Time Machine.
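If you want a rough feel for the enumeration slowdown on your own volume, a quick sketch along these lines times a simple folder listing (the path is just a made-up example; point it at one of your own folders):

```python
# Hypothetical sketch: rough timing of a folder listing, to get a feel for
# how enumeration scales with the number of items. The path is an example.
import os
import time

folder = os.path.expanduser("~/EagleFiler Library/Files")  # hypothetical path

start = time.perf_counter()
count = len(os.listdir(folder))
elapsed = time.perf_counter() - start
print(f"Listed {count} items in {elapsed * 1000:.1f} ms")
```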
I use both Time Machine and Carbon Copy Cloner. Are there any benchmarks that suggest the point at which it would be best to subdivide the content?
None that I’m aware of. Carbon Copy Cloner should be fine. The issue with Time Machine is that if you have 10,000 files in a folder and change one of them, the backup will contain a new folder with 10,000 entries instead of just a single hard link for the whole folder. While this doesn’t use a huge amount of extra space (9,999 of them are hard links), an excessive number of files can slow down backups, slow down HFS+, and make directory corruption more likely.
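Purely as an illustration (this assumes an HFS+ Time Machine destination, where backups are plain folders of hard links rather than APFS snapshots, and all of the paths below are made up), you can see the sharing from the inode numbers: an unchanged file shows up in many snapshots but occupies only one set of blocks on disk, whereas a changed file gets a fresh copy each time:

```python
# Hypothetical sketch: count how many snapshots contain a given file versus
# how many distinct copies actually exist on disk (hard links share an inode).
# All paths are examples; substitute your own backup volume, machine, and file.
import os

snapshots_dir = "/Volumes/Backup/Backups.backupdb/MyMac"  # hypothetical
relative_path = "Macintosh HD/Users/me/EagleFiler Library/Files/example.pdf"  # hypothetical

copies = 0
inodes = set()
for snapshot in sorted(os.listdir(snapshots_dir)):
    path = os.path.join(snapshots_dir, snapshot, relative_path)
    if os.path.isfile(path):
        copies += 1
        inodes.add(os.stat(path).st_ino)

print(f"Found in {copies} snapshots, {len(inodes)} distinct copies on disk")
```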
Also, modifying the metadata of one file causes EagleFiler to write a new metadata backup file for the whole folder. Having Time Machine keep lots of copies of that large XML file could use a substantial amount of disk space, since Time Machine does not do any compression, nor does it recognize that 99% of the file hasn’t changed.
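As a back-of-the-envelope illustration (the numbers here are made up, not measurements): if the metadata backup for a 10,000-file folder came to, say, 20 MB and changed before every backup, each retained snapshot would store a full uncompressed copy:

```python
# Back-of-the-envelope estimate with made-up numbers: each Time Machine
# snapshot keeps a full, uncompressed copy of the metadata XML if it changed.
metadata_size_mb = 20        # hypothetical size of the per-folder metadata backup
retained_backups = 24 + 30   # e.g. 24 hourly + 30 daily snapshots retained

total_mb = metadata_size_mb * retained_backups
print(f"~{total_mb} MB consumed by copies of one metadata file")  # ~1080 MB
```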
Helpful. Thanks for the detailed explanation, and for your multiple replies to me today. I’ll try not to make a habit of being high-maintenance.