I know that when I import a whole folder structure with a lot of files into EagleFiler (EF), files that already exist in the library will not be imported again unless I set a specific option that allows duplicates.
Is there another way of importing those duplicates without actually using up the space? Aliases, perhaps?
My use case is this: I have a folder that contains backups of a series of documents. EagleFiler’s encrypted storage option is the perfect place to store all my project data, including these weekly backups. When importing them, EagleFiler ignores duplicates, which is awesome, as it saves me a lot of disk space.
Now imagine a situation where I have to restore something from these backups. Say 100 files haven’t changed in years, 20 were changed last month, and 2 last week. What I would do is look at the most recent backup folder and use the files there to restore the data from the backup into the corresponding live system. However, files would be missing, as they were skipped during import because of the duplicate handling.
Comparing all the folders (I am talking about hundreds of folders with thousands of files) is not an option. Now, you say I could use tags to make the same file show up in multiple places. Can that happen automatically during import, or with smart folders, etc.?
After reading your use case, I don’t think the tag workaround is suitable for you. I’m investigating ways that a future version of EagleFiler could better handle this sort of situation. EagleFiler could perhaps notice that a file it’s importing is a duplicate and replace it with an alias. Is that really what you want, though? It sounds like for this use case you’d want the folder to contain the actual file, not an alias. Maybe a hard link (though those can be fragile)?
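To make the idea concrete, here’s a rough Swift sketch of an import-time duplicate check that substitutes a hard link. The names here (`sha256Hex`, `importFile`, `libraryIndex`) are hypothetical, and this is not EagleFiler’s actual code:

```swift
import Foundation
import CryptoKit

// Hypothetical sketch only: hash each incoming file; if an identical
// file already exists in the library, hard-link to it instead of
// storing a second copy. `libraryIndex` maps checksums to stored files.

func sha256Hex(of url: URL) throws -> String {
    let data = try Data(contentsOf: url)
    return SHA256.hash(data: data).map { String(format: "%02x", $0) }.joined()
}

func importFile(at source: URL, to destination: URL,
                libraryIndex: inout [String: URL]) throws {
    let digest = try sha256Hex(of: source)
    if let existing = libraryIndex[digest] {
        // Duplicate: no new data on disk, just another directory entry.
        try FileManager.default.linkItem(at: existing, to: destination)
    } else {
        try FileManager.default.copyItem(at: source, to: destination)
        libraryIndex[digest] = destination
    }
}
```

The point is that the hard link consumes no additional data blocks, only another directory entry, so every backup folder could still list the file.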
You are right: having something like a hard link in this very folder (and in every other folder where the file should appear), while importing the data (the actual file) only once, would be the perfect solution. (Of course, this comes with a bunch of questions, such as: what happens if I accidentally delete the oldest folder, where the actual file data is stored? Etc.)
But it would also solve another big problem I currently have with EagleFiler: hundreds, no, thousands of empty folders.
I think hard links would handle that case properly. But one issue is that hard links don’t survive many methods of copying files. I think even dragging and dropping in the Finder splits them into independent copies. And Dropbox and similar services would also duplicate the files.
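For example (a small demo with hypothetical paths under /tmp/hardlink-demo): two hard links share one inode, and thus one copy of the data, while a copy-based transfer produces a new inode and the linkage is lost:

```swift
import Foundation

// Demonstration of the fragility described above (hypothetical paths).
let fm = FileManager.default
let dir = URL(fileURLWithPath: "/tmp/hardlink-demo", isDirectory: true)
try? fm.removeItem(at: dir)  // start fresh on repeated runs
try? fm.createDirectory(at: dir, withIntermediateDirectories: true)

let original = dir.appendingPathComponent("report.txt")
let linked   = dir.appendingPathComponent("report-link.txt")
let copied   = dir.appendingPathComponent("report-copy.txt")

func inode(of url: URL) throws -> UInt64 {
    let attrs = try fm.attributesOfItem(atPath: url.path)
    return (attrs[.systemFileNumber] as? NSNumber)?.uint64Value ?? 0
}

do {
    try "weekly backup".write(to: original, atomically: true, encoding: .utf8)
    try fm.linkItem(at: original, to: linked)  // second name, same data on disk
    try fm.copyItem(at: original, to: copied)  // what copy-based transfers do

    print(try inode(of: original) == inode(of: linked))  // true: one file, two names
    print(try inode(of: original) == inode(of: copied))  // false: the link is "split"
} catch {
    print("error: \(error)")
}
```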
I’ll look into it some more. There may be a way to make it work.
I don’t think there’s currently a good way to import all the files without them taking up the full amount of disk space. An encrypted Time Machine drive is about the closest thing I can think of (although it doesn’t provide data integrity checking like EagleFiler).
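By integrity checking I mean checksum-based verification, roughly along these lines; this is a sketch of the general technique, not EagleFiler’s actual implementation:

```swift
import Foundation
import CryptoKit

// Sketch of checksum-based integrity checking: record a SHA-256 hash
// per file at import time, then re-hash later to spot silent corruption.
// (Illustrative only; EagleFiler's real mechanism may differ.)

func sha256Hex(of url: URL) throws -> String {
    let data = try Data(contentsOf: url)
    return SHA256.hash(data: data).map { String(format: "%02x", $0) }.joined()
}

/// Returns the files whose current contents no longer match the
/// checksum recorded for them.
func corruptedFiles(in recorded: [URL: String]) -> [URL] {
    recorded.compactMap { url, expected in
        ((try? sha256Hex(of: url)) == expected) ? nil : url
    }
}
```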