I love this idea and I would jump on it immediately but for one deal-killer. When the program stores a webpage, it doesn’t store the URL, which for a researcher makes it pretty close to useless, since the time involved in adding that manually is prohibitive. This just seems to be a problem that has never been worked out to my satisfaction in any operating system.
Here is the crux. Even if the program is designed to store the URL as part of its database, as a tag for example, that metadata will be lost if I ever decide to switch to another archival method. Since my current database of webpages is about 5 GB, that is not trivial. So, I need a program that will somehow “stamp” the URL/title/date onto the page itself. Currently, I save webpages to DevonThink, which has a script for doing just that. Unfortunately, you have to save the pages as RTF, which can be a problem. It would be much better if this could be done with HTML. When I used to work in Windows (gasp), there was a neat little toolbar that did just that, but the dev eventually went under and the program stopped working with SP2.
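For what it’s worth, the kind of stamping I have in mind doesn’t seem hard in principle. Here’s a rough sketch (purely hypothetical, not anything EagleFiler actually does) of inserting a header with the URL, title, and save date into a saved HTML file, assuming the page is a plain HTML document:

```python
def stamp_html(html: str, url: str, title: str, saved: str) -> str:
    """Insert a visible banner with the source URL, title, and save date
    just after the opening <body> tag of a saved HTML page.

    Hypothetical sketch: naively matches a bare "<body>" tag and falls
    back to prepending the banner if no such tag is found.
    """
    banner = (
        '<div style="font: 12px sans-serif; padding: 4px; '
        'border-bottom: 1px solid #ccc;">'
        f'Saved {saved} | {title} | <a href="{url}">{url}</a>'
        '</div>'
    )
    marker = "<body>"
    idx = html.find(marker)
    if idx == -1:
        # No plain <body> tag (e.g. it has attributes); just prepend.
        return banner + html
    insert_at = idx + len(marker)
    return html[:insert_at] + banner + html[insert_at:]


page = "<html><body><p>Article text.</p></body></html>"
stamped = stamp_html(page, "https://example.com/article",
                     "Example Article", "2009-03-15")
```

The point is that because the banner lives in the HTML itself, the metadata survives no matter what archival program I move the files to later.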
So, bottom line: if EagleFiler could ever save webpages with the URL/date/title stored on the actual page, I would sing its praises to anybody who would listen. As it stands, no can do.