I use Hazel to copy files into the EagleFiler (EF) folder structure based on rules that scan for keywords and then target corresponding folders. For instance, when Hazel sees a file that contains my checking account number, a rule fires that renames the file, prepends the created date to the filename, copies the file into EagleFiler, and then tells EF to scan for new files.
But I’ve found that duplicate files can get in. The Hazel rules are set to “Throw away if duplicate,” but this doesn’t seem to work.
EF has a great system to prevent duplicates, but it only runs when a file is imported into EF directly, i.e., via the capture key (Option-Fn-F1 in my case).
So I’m thinking of modifying my Hazel import rules to run an AppleScript that taps the EF scripting API. Instead of telling Hazel to copy the file to the right folder based on the keyword-matching rules, I could tell Hazel to run an AppleScript that tells EagleFiler to import the file, and thus hopefully leverage the database-backed checksum duplicate checking built into EagleFiler.
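Here is roughly what I have in mind, as a minimal sketch. Hazel predefines theFile for embedded scripts, and the import command is modeled on what the sample scripts bundled with EagleFiler appear to do, so treat the exact syntax as an assumption to verify against EF’s scripting dictionary:

```
-- Minimal sketch of a script embedded in a Hazel rule.
-- Hazel predefines theFile (the file the rule matched).
tell application "EagleFiler"
	-- Assumption: "import files" takes a list of files, as in the
	-- sample import scripts that come with EagleFiler. With no
	-- destination specified, the file should land at the library's
	-- top level.
	import files {theFile}
end tell
```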
EagleFiler ships with many import scripts as examples. And Hazel supports embedding AppleScripts in its rules, passing an “inputAttributes” list that could potentially be used to read the matched attributes and automate the destination folder designation, as in the sketch below. Or I could edit the script to hard-code the destination folder for each matching keyword rule…
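For the folder targeting, something like this is what I’m picturing. The library and folder names are placeholders from my own setup, and the container parameter plus the by-name record lookup are my reading of how EagleFiler’s dictionary ought to work, not verified syntax:

```
-- Sketch using Hazel's inputAttributes: if the rule's match conditions
-- capture a keyword as a custom attribute, it arrives here as item 1.
-- "Finance.eflibrary" and "Checking" are placeholders; the container
-- parameter and the by-name lookup are assumptions to check in
-- Script Editor against EagleFiler's scripting dictionary.
set destFolderName to "Checking" -- fallback if nothing was captured
if (count of inputAttributes) > 0 then
	set destFolderName to item 1 of inputAttributes
end if
tell application "EagleFiler"
	set theLibrary to library document "Finance.eflibrary"
	set theFolder to library record destFolderName of theLibrary
	import files {theFile} container theFolder
end tell
```

Either way, the rule’s copy step would be replaced by this script action, so EagleFiler itself creates the file inside the library.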
Can anyone with mad AppleScript skills firm these sketches up into a script that can be embedded into Hazel rules to do these things?
All of this is only worthwhile if EagleFiler will be given a chance to scan for duplicates and reject the import if a duplicate is found.
Thanks to anyone who can help with both the EagleFiler and Hazel sides of this integration.