So in that case, for us the workflow for checking newly added files would be (a rough sketch follows the list):
- Run the filescanner every day (at a set time)
- Check every day for new files (maybe using Panopticon)
- When there are new files, first check whether there were any (automatic) extension updates. Mark those files as trusted and figure out which files remain that are not part of the extension updates.
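Just to make the idea concrete, here is a minimal sketch of what that daily new-file check could look like if it ran from cron next to the scanner. This is not Admin Tools' or Panopticon's actual CLI; the site path and the state file are placeholders I made up:

```python
#!/usr/bin/env python3
"""Hypothetical daily new-file check, meant to run from cron.
SITE_ROOT and STATE_FILE are placeholder assumptions, not part of
Admin Tools or Panopticon."""
import json
import time
from pathlib import Path

SITE_ROOT = Path("/var/www/example-site")              # assumption: site docroot
STATE_FILE = Path("/var/lib/filecheck/last_run.json")  # assumption: our own state

def new_files_since(root: Path, since: float) -> list[Path]:
    """Return files whose change time is newer than the previous run."""
    return [p for p in root.rglob("*")
            if p.is_file() and p.stat().st_ctime > since]

def main() -> None:
    last_run = 0.0
    if STATE_FILE.exists():
        last_run = json.loads(STATE_FILE.read_text())["last_run"]

    for path in new_files_since(SITE_ROOT, last_run):
        print(f"NEW: {path}")  # in practice: mail or report these somewhere

    # Remember when we last looked, for the next run.
    STATE_FILE.parent.mkdir(parents=True, exist_ok=True)
    STATE_FILE.write_text(json.dumps({"last_run": time.time()}))

if __name__ == "__main__":
    main()
```

With something like this in cron (e.g. `0 3 * * * python3 /usr/local/bin/newfile-check.py` - again, a made-up path) the daily run itself would be automated; the problem remains that someone still has to look at the output.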
But I don't think that is going to work for us:
- In my experience, extension updates happen very frequently; maybe not every day, but once a day would be a good guess.
- We are not able to check the filescanner every day. Checking it once or twice a week is the best we can do.
- Differentiating between 'extension update' files and 'other' files might not be that easy.
So I am still afraid that the new files check is - in our situation - maybe not that useful for finding 'bad' files. I am not saying the tool does not work, or that it won't be useful when used in the way you suggest. I'm just saying I don't think it will work in our workflow without investing more time and effort in the processing of the new files.
I just want to add this in general:
We have been using an external service to do file checks (new, modified, suspicious). In the last 10 years we have never detected an actual 'bad' file. I know we need to put in the effort to keep sites secure, but there needs to be some kind of balance between the time it takes to use the scanning tools and the results they provide. So again, there is nothing wrong with the tools themselves. I just struggle to implement them in our security workflow without it taking too much time to check files manually.
I think we have set up our server and website maintenance workflow quite securely. Maybe that's the reason why we have not seen any bad / hacked files in a long time. We have a dedicated server for our sites, which we have bought and manage ourselves. Our clients cannot:
- Install extensions (they mainly have access to content-related extensions on the site)
- Connect to the server in any way (like FTP)
- Access their site's database
Also:
- We have a pretty restrictive firewall
- We use CloudLinux to 'cage' users / websites on the server.
- We have set up good password protection for our own access
- We use 2FA for Super Users on the websites.
- And of course we use Akeeba Admin Tools ;-)
- Our server is managed by professionals who give great support.
So this is probably why we don't have many (any) issues with hacked sites / bad files. And so we need to see how the file scans can be used effectively, to prevent us from spending time on checking files that don't need checking. Every minute we spend on one site is 100 minutes in total for all our sites.
I think it would be very beneficial - for us / our workflow - to somehow 'auto-approve' files that are installed by us (manually as a Super User or using remote tools like Panopticon), so the scanner can focus on files that may have been changed in some other fashion. This would also focus our attention better and probably reduce the time it takes to process notifications in the filescanner. Something like the sketch below is what I have in mind.
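To be concrete about the 'auto-approve' idea, here is a minimal sketch of how it could work, assuming nothing about Admin Tools or Panopticon internals: take a hash snapshot of the site right after we deliberately install or update something, then later use it to filter a scanner's flagged files down to the ones our own changes cannot explain. All names and paths below are made up for illustration:

```python
#!/usr/bin/env python3
"""Hypothetical 'auto-approve' helper: snapshot the site right after WE
install or update something, then filter flagged files against it.
This is not an Admin Tools or Panopticon feature; paths are placeholders."""
import hashlib
import json
from pathlib import Path

SITE_ROOT = Path("/var/www/example-site")                    # assumption
TRUSTED_MANIFEST = Path("/var/lib/filecheck/trusted.json")   # assumption

def sha256(path: Path) -> str:
    """Hash a whole file in memory; fine for a sketch, not huge files."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def snapshot() -> None:
    """Run immediately after a deliberate install or extension update."""
    manifest = {str(p.relative_to(SITE_ROOT)): sha256(p)
                for p in SITE_ROOT.rglob("*") if p.is_file()}
    TRUSTED_MANIFEST.parent.mkdir(parents=True, exist_ok=True)
    TRUSTED_MANIFEST.write_text(json.dumps(manifest))

def unexplained(flagged: list[str]) -> list[str]:
    """Given relative paths flagged by a scanner, keep only files the
    trusted snapshot does not account for (unknown, or hash changed)."""
    trusted = json.loads(TRUSTED_MANIFEST.read_text())
    remaining = []
    for rel in flagged:
        p = SITE_ROOT / rel
        if not p.is_file():
            continue  # already gone; nothing to review
        if trusted.get(rel) != sha256(p):
            remaining.append(rel)  # not covered by our own update snapshot
    return remaining
```

The snapshot is taken at a moment when we know every file on disk is ours, so anything flagged later that the snapshot cannot explain is exactly the short list worth checking manually.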
Again, I don't have any issues with the tool itself, I just struggle to fit it into our security workflow.