#40726 Changed and duplicate file WebconfigmakerController.php

Posted in ‘Admin Tools for Joomla! 4 & 5’

Environment Information

Joomla! version
5.1.0
PHP version
8.3
Admin Tools version
7.5.3

Latest post by jjst135 on Tuesday, 21 May 2024 06:56 CDT

jjst135

Hi!

PHP filescanner -> After marking all files as 'safe' I ran another scan, expecting to see no changed or suspicious files. But there is one file that was modified:

administrator/components/com_admintools/src/Controller/WebconfigmakerController.php

Also, it shows up twice.

Maybe this file is involved in the scanning process itself? Would it be possible to exclude this file (or maybe other files) manually?

nicholas
Akeeba Staff
Manager

Look at your screen. You have two files with similar names but different letter case:

  • WebConfigMakerController.php
  • WebconfigmakerController.php

Only the latter is the correct file. The former should be deleted.

Admin Tools is correct in marking these two files as a problem. When restoring this site on a case-insensitive filesystem (e.g. Windows, or macOS with default settings) one file would "shadow" the other by overwriting it, due to the way case-insensitive filesystems work. The only way we can bring your attention to them is to report them both as modified when their contents and timestamps are not identical, which is exactly what happened here.
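
If you want to hunt down such duplicates yourself, here is a minimal PHP sketch (illustration only, not part of Admin Tools; the path is a placeholder) that walks a folder tree and reports file names differing only in letter case, i.e. exactly the files that would overwrite one another on a case-insensitive filesystem:

    <?php
    // Minimal sketch: report files whose paths collide when letter case
    // is ignored. The path below is a placeholder, not an Admin Tools API.
    $root = '/path/to/site';
    $seen = [];

    $files = new RecursiveIteratorIterator(
        new RecursiveDirectoryIterator($root, FilesystemIterator::SKIP_DOTS)
    );

    foreach ($files as $file) {
        if (!$file->isFile()) {
            continue;
        }

        // Case-insensitive filesystems (Windows, macOS by default) treat
        // two paths which only differ in case as one and the same file.
        $key = strtolower($file->getPathname());

        if (isset($seen[$key])) {
            echo "Case collision:\n  {$seen[$key]}\n  {$file->getPathname()}\n";
        } else {
            $seen[$key] = $file->getPathname();
        }
    }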

Nicholas K. Dionysopoulos

Lead Developer and Director

🇬🇷Greek: native 🇬🇧English: excellent 🇫🇷French: basic • 🕐 My time zone is Europe / Athens
Please keep in mind my timezone and cultural differences when reading my replies. Thank you!

tampe125
Akeeba Staff

Hello,

I think the problem comes from the letter case of the file name. Linux filesystems are case-sensitive, so those are indeed two different files as far as Linux is concerned.

You can safely ignore those entries.

Davide Tampellini

Developer and Support Staff

🇮🇹Italian: native 🇬🇧English: good • 🕐 My time zone is Europe / Rome (UTC +1)
Please keep in mind my timezone and cultural differences when reading my replies. Thank you!

jjst135

Ah, now I see ;-) I overlooked the letter casing... 

I deleted the 'wrong' file on the server and now this file does not show up anymore.

Before I scanned again I updated one of the extensions on the site. This of course shows modified files (about 50 in this case). So when we regularly update extensions, this list will always contain a whole lot of files. Going through them all would be almost impossible (it takes a lot of time) for all our sites. So this makes me wonder: what is the point of this metric? Or am I missing something? Focusing only on the possible threats seems more productive?

I suppose there is no way to skip files from extension updates since the last scan?

And since I am asking stuff: I was thinking, would it be possible to somehow combine the backup (Akeeba Backup, of course) and the file scanning? That way the server only needs to access / process the files once instead of at different times? Just a thought...

Kind regards,
Jip

nicholas
Akeeba Staff
Manager

So when we regularly update extensions, this list will always contain a whole lot of files. Going through them all would be almost impossible (it takes a lot of time) for all our sites. So this makes me wonder: what is the point of this metric? Or am I missing something? Focusing only on the possible threats seems more productive?

You didn't read the link I provided; I explain that in there. The TL;DR is that once you update extensions, you run a scan. Then you know that the only changed files are those from the update. Insofar as you trust the extension developer, you mark those changed files as safe. The same goes for core Joomla! updates.

The idea is that when you see a changed file at a time when you don't expect any files to have changed, you need to start looking closer. The Threat Score tells you how likely it is to be something malicious.

I suppose there is no way to skip files from extension updates since the last scan?

They are not recorded by Joomla, so no.

And since I am asking stuff: I was thinking, would it be possible to somehow combine the backup (Akeeba Backup, of course) and the file scanning? That way the server only needs to access / process the files once instead of at different times? Just a thought...

In theory? Yes. In practice? That would be really bad for your backup.

You need to run the backup as fast as possible. Ideally, you should be running a backup with your web server shut down to external traffic to ensure backup consistency. Since this is not practical, we go with running the backup as fast as possible when it comes to database data and .php files, i.e. what would most likely be problematic for consistency.

If you add the scanning to the backup process you will be slowing down the backup of .php files by one or two orders of magnitude. This increases the chance of consistency issues during backup from "it practically doesn't matter" to "there's a one in a hundred chance the restored site won't work". The latter is unacceptable, hence the scanner runs as its own, separate process which needs to be temporally removed from the backup process.

Nicholas K. Dionysopoulos

Lead Developer and Director

🇬🇷Greek: native 🇬🇧English: excellent 🇫🇷French: basic • 🕐 My time zone is Europe / Athens
Please keep in mind my timezone and cultural differences when reading my replies. Thank you!

jjst135

Thanks, Nicholas.

I am still a bit confused about how we would best use the scanning tool to effectively find added files that would indicate 'bad behaviour'.

You say we should run a scan after the updates we perform and then mark everything as safe. And before we start updating again, we first check for newly added files, because we then know they were not added by updates. Correct?

I do get that, but when we auto-update extensions on our sites (using an external tool like Panopticon) this procedure won't work, because updates will be happening all through the day. Right?

nicholas
Akeeba Staff
Manager

You will not have updates every day, but you are going to be running the scanner every day.

Nicholas K. Dionysopoulos

Lead Developer and Director

🇬🇷Greek: native 🇬🇧English: excellent 🇫🇷French: basic • 🕐 My time zone is Europe / Athens
Please keep in mind my timezone and cultural differences when reading my replies. Thank you!

jjst135

So in that case, for us the workflow would be (for checking newly added files):

- Run the file scanner every day (at a set time)
- Check the files every day for new ones (maybe using Panopticon)
- When there are new files, first check if there were any (automatic) extension updates. Mark those files as trusted and figure out which files remain that are not part of the extension updates.

But I don't think that is going to work for us.

- In my experience updates for extensions happen very frequently; maybe not every day, but I think once a day would be a good guess.
- We are not able to check the file scanner every day. Checking it once or twice weekly is the best we can do.
- Differentiating between 'extension update' files and 'other' files might not be that easy.

So I am still afraid the new files check is - in our situation - maybe not that useful for finding 'bad' files. I am not saying the tool does not work; it will be useful when used in the way you suggest. I'm just saying I don't think that will work in our workflow without investing more time and effort in processing the new files.

I just want to add this in general:

We have been using an external service to do file checks (new, modified, suspicious). In the last 10 years we have never detected an actual 'bad' file. I know we need to put in the effort to keep sites secure, but there needs to be some kind of balance between the time it takes to use the scanning tools and the results they provide. So again, there is nothing wrong with the tools themselves. I just struggle to implement them in our security workflow without it taking too much time to check files manually.

We have set up our server and website maintenance workflow pretty securely, I think. Maybe that's the reason why we have not seen any bad / hacked files in a long time. We have a dedicated server for the sites, which we bought and manage ourselves. Our clients cannot:

- Install extensions (they mainly have access to content-related extensions on the site)
- Connect to the server in any way (like FTP)
- Access their site's database

Also:

- We have a pretty restrictive firewall.
- We use CloudLinux to 'cage' users / websites on the server.
- We have set up good password protection for our own access.
- We use 2FA for Super Users on the websites.
- And of course we use Akeeba Admin Tools ;-)
- Our server is managed by professionals who give great support.

So this is probably why we don't have many (any) issues with hacked sites / bad files. And so we need to see how the file scans can be used effectively, to prevent us from spending time checking files that don't need checking. Every minute we spend on one site is 100 minutes in total across all our sites.

I think it would be very beneficial - for us / our workflow - to somehow 'auto-approve' files that are installed by us (manually as a super admin, or using remote tools like Panopticon) so the scanner can focus on files that may have been changed in some other fashion. This would also focus our attention better and probably reduce the time needed to process notifications in the file scanner.

Again, I don't have any issues with the tool itself, I just struggle to fit it into our security workflow.

nicholas
Akeeba Staff
Manager

What you ask is impractical to develop, especially as mass distributed software. It only fits your unique use case.

Beyond that, .php files are source code. There is absolutely no tool whatsoever which can correctly identify a source code file as legitimate or malicious -- and yes, I am aware of LLMs ("AI"), as well as their limitations in deriving context from single source files without being fed the entire environment they execute in.

Moreover, in many if not most cases, that distinction would be meaningless. A knife can be used to cut bread, or someone's throat. Is a knife legitimate or malicious? Even taken in context, a knife's legitimate place in a kitchen doesn't preclude the sous-chef from using it to murder a waiter in that same kitchen. Some things can be used as a weapon of opportunity. For example, any bugs in the update code of any CMS are treated as security issues because they are knives in the kitchen; if they can be abused, they can inflict significant injury. This is why any tool that scans source code files comes up with a score, not a binary malicious/innocent verdict.

Hosted services suck in the contents of your sites as part of their scanning. Therefore, they can always tell you whether a specific file, let's say /administrator/components/com_example/src/Model/Foo.php, is likely to be legitimate by comparing it to the same /administrator/components/com_example/src/Model/Foo.php file installed on other sites. If it's identical across enough sites, it's likely to be legitimate, and therefore not marked as potentially dangerous.
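
In principle, that comparison boils down to something like this sketch (the hash list, the file path, and the 100-site threshold are made-up examples, not any service's actual implementation):

    <?php
    // Sketch of the cross-site comparison idea. The hash list and the
    // "seen on at least 100 sites" threshold are made-up examples.
    $knownHashes = [
        // sha256 of the file => number of sites where that exact copy was seen
        'e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855' => 1234,
    ];

    $path = '/administrator/components/com_example/src/Model/Foo.php';
    $hash = hash_file('sha256', $path);

    if (isset($knownHashes[$hash]) && $knownHashes[$hash] >= 100) {
        echo "Identical copy seen on many sites; likely legitimate.\n";
    } else {
        echo "Unknown or rare file; it needs a closer look.\n";
    }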

And here is the crux of the issue.

ARE YOU OKAY WITH A THIRD PARTY HAVING ACCESS TO ALL OF YOUR SITES' FILES WITHOUT ANY NDA AGREEMENT?

Think about it. The only way these services can work is if they have full access to all of your sites, at any time. Even if you trust them to be legitimate, do you trust them to never be hacked?

As to why we couldn't make such a service for Panopticon: sure we could. It's easy. It would cost quite a bit, but you could all chip in -- which is problem number one. Problem number two is that, given Panopticon's nature (mass-distributed software), there is no way to know which requests to the service reporting file locations and contents are legitimate, and which are sent by malicious actors trying to spam the service with hundreds of thousands of fake instances of their malicious file, so that when they attack your site their malicious file does not raise any alarms. The only way to prevent that would be if you had to pay us not a single fee to use the service, but a per-site fee to register each of your sites with it. At that point, the whole point of using Panopticon starts becoming moot. It would also turn us into what we hate: the people who have all of your sites' contents, which is definitely not something I am keen on becoming.

So, here's your choice.

Use a third-party site monitoring service which makes things easy for you BUT has zero privacy or non-compete guarantees.

- or -

Use self-hosted site monitoring software which makes things a bit harder for you BUT guarantees absolute privacy, and then the matter of non-competition is not even relevant anymore.

Nicholas K. Dionysopoulos

Lead Developer and Director

🇬🇷Greek: native 🇬🇧English: excellent 🇫🇷French: basic • 🕐 My time zone is Europe / Athens
Please keep in mind my timezone and cultural differences when reading my replies. Thank you!

jjst135

After reading your reply I definitely feel more comfortable with the second option, 'self-hosted site monitoring'. I also understand the thought behind Panopticon and the way it is set up. Making this service free on top of that almost makes it a no-brainer.

I just have to decide on HOW to use the tool(s) in our use case. I now tend to say: the file scanner is working great as it is, but will it actually add another layer of protection, or is it more likely to only take up time? That is, by the way, a question to ask of self-hosted and external services alike... In our use case I tend to rely on our other security measures and spend the time we would need to analyze the file scanner's findings on other areas where we can improve security. So I could decide to use Panopticon for its other features and not the file scanner.

Thanks for listening, and for your patience, Nicholas.

There are some more questions I have about Panopticon (sorry about that... but I want to make sure we adopt a workflow that will work well for us for the next few years), but I will ask those questions on GitHub.

Thanks for now!

nicholas
Akeeba Staff
Manager

So, there are two schools of thought.

One school of thought is that the more redundant layers of security you have, the better. If you subscribe to that school of thought, you need the PHP File Change Scanner, just in case.

The other school of thought is that you need to have a practical amount of security. If you subscribe to that school of thought, you don't really want to use the PHP File Change Scanner, because it's unlikely it will catch something that's not caught during the attack phase by your other layers of security.

In very practical terms, the PHP File Change Scanner protects you from "slow burn" attacks. I have seen cases where the site was compromised at one point, the attacker didn't immediately make their presence felt, and a few months (or years!) later they used their backdoor to use the site for something nefarious, typically serving malware and/or phishing pages. You can't catch this kind of attack with anything but the File Change Scanner or a similar tool, especially if the backdoor is well-written so as not to raise any icky-code alerts and the attacker was careful enough.

In your particular use case I am inclined to say that you probably don't need the file change scanner. However, I'd feel more confident telling you that if I knew that plain FTP was turned off on your servers, SFTP access is only possible using certificates, and you are religiously using 2FA/MFA for all kinds of privileged access including the hosting control panel. Basically, if you guarantee you have the absolute minimal exposure to human stupidity, you can maybe afford to shed a protection layer that's troublesome.

Nicholas K. Dionysopoulos

Lead Developer and Director

🇬🇷Greek: native 🇬🇧English: excellent 🇫🇷French: basic • 🕐 My time zone is Europe / Athens
Please keep in mind my timezone and cultural differences when reading my replies. Thank you!

jjst135
  • Plain FTP turned off -> Yes
  • SFTP access only possible using certificates -> No, but we use very secure passwords and SFTP is only used by ourselves.
  • 2FA/MFA for the hosting panel -> Yes. Also, the control panel is only accessible to us, not our clients; not even for their own hosting.
  • We also have a pretty tight firewall on the server.
  • Exposure to human stupidity -> very, very low, except our own ;-)

File scanner: we might consider just scanning the site files once a month to see if we can filter out some files that we don't 'know' to be safe. This does take some time and research (to check / compare the files), but doing this on one or two sites could give us a list of most likely 'safe' files and some kind of baseline to check other sites against. Other than that, I think for us the 'practical amount of security' would be not to use the file scanner on a daily basis for all our sites.
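
Something like this minimal sketch is what I have in mind (hypothetical, and separate from the PHP File Change Scanner): record a path-to-hash manifest on a reference site once, then only review the differences on other sites:

    <?php
    // Hypothetical baseline sketch, separate from the PHP File Change Scanner.
    // First run (no baseline given) prints a manifest of the site's .php files:
    //   php baseline.php /path/to/reference/site > baseline.json
    // Later runs diff another site against that manifest:
    //   php baseline.php /path/to/other/site baseline.json
    $root     = rtrim($argv[1], '/');
    $baseline = isset($argv[2]) ? json_decode(file_get_contents($argv[2]), true) : null;

    $manifest = [];
    $files    = new RecursiveIteratorIterator(
        new RecursiveDirectoryIterator($root, FilesystemIterator::SKIP_DOTS)
    );

    foreach ($files as $file) {
        if ($file->isFile() && strtolower($file->getExtension()) === 'php') {
            $relative = substr($file->getPathname(), strlen($root));
            $manifest[$relative] = hash_file('sha256', $file->getPathname());
        }
    }

    if ($baseline === null) {
        // First run: print the manifest for later comparisons.
        echo json_encode($manifest, JSON_PRETTY_PRINT), "\n";
    } else {
        // Later runs: only report files that are new or differ from the baseline.
        foreach ($manifest as $path => $hash) {
            if (!isset($baseline[$path])) {
                echo "NEW      $path\n";
            } elseif ($baseline[$path] !== $hash) {
                echo "CHANGED  $path\n";
            }
        }
    }

Only the NEW and CHANGED entries would then need a manual look.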

Thanks for your insights, Nicholas.
