> To prevent similar incidents from happening in the future, Pidgin announced that, from now on, it will only accept third-party plugins that have an OSI Approved Open Source License, allowing scrutiny into their code and internal functionality.
This is an understandable policy, but how would it have stymied the attacker in this case? It's unlikely that Windows users would be building from source (and Darkgate appears to be Windows only). Unless there's a policy that Pidgin extensions are strictly reproducible, it seems unlikely that the presence of an adjacent, benign source artifact would have increased the likelihood of early discovery.
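For what it's worth, reproducibility would at least make tampering detectable: anyone could rebuild the plugin from the published source and compare digests with the shipped binary. A toy sketch of that comparison (the byte strings are made-up stand-ins for real build artifacts):

```python
import hashlib

def same_artifact(built: bytes, shipped: bytes) -> bool:
    # A reproducible build lets anyone verify that the shipped binary
    # matches a local rebuild, byte for byte, by comparing digests.
    return hashlib.sha256(built).hexdigest() == hashlib.sha256(shipped).hexdigest()

# Toy stand-ins for a locally rebuilt plugin and the distributed one:
print(same_artifact(b"plugin-bytes", b"plugin-bytes"))          # -> True: same build
print(same_artifact(b"plugin-bytes", b"plugin-bytes+payload"))  # -> False: tampered
```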
The idea is to slow them down and make it harder. We don't have the time, resources, or expertise to examine every plugin, which is precisely why we don't host or provide binaries for external plugins.
> The moral is obvious. You can't trust code that you did not totally create yourself. (Especially code from companies that employ people like me.) No amount of source-level verification or scrutiny will protect you from using untrusted code.
>
> — Ken Thompson, Reflections on Trusting Trust, 1984
Or, you can run untrusted code in a restricted sandbox. Sadly, Linux distributions don't set one up out of the box, for unclear reasons, unlike browsers, for example, which run every page in a sandboxed process.
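As a rough illustration of the principle, here's a minimal sketch (assuming a POSIX system; the child command is just a stand-in for an untrusted binary) that constrains a subprocess with hard CPU-time and memory limits. Resource limits are only one small piece of what a browser-style sandbox does; real sandboxes also use namespaces, seccomp filters, and filesystem isolation.

```python
import resource
import subprocess
import sys

def run_restricted(cmd, cpu_seconds=2, mem_bytes=512 * 1024 * 1024):
    # Apply hard rlimits in the child just before it execs the untrusted
    # program. POSIX only: preexec_fn is unsupported on Windows.
    def limit():
        resource.setrlimit(resource.RLIMIT_CPU, (cpu_seconds, cpu_seconds))
        resource.setrlimit(resource.RLIMIT_AS, (mem_bytes, mem_bytes))
    return subprocess.run(cmd, preexec_fn=limit, capture_output=True, text=True)

# Stand-in for an untrusted binary: a short Python one-liner.
result = run_restricted([sys.executable, "-c", "print('hello from the box')"])
print(result.stdout.strip())
```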
What I want is a system where I can run anything without any risk.
>Linux distributions do not implement it out of the box
There are several distributions that _do_ apply by-default restrictions to all running software, using things like cgroups and grsecurity. There are even distributions dedicated to isolating everything, drivers included, like Qubes.
I think quoting RoTT in this context is a little clichéd: as a practical matter, we're all trusting immense amounts of code that we haven't read. The question is what to do about that practical reality, other than "give up because of the existential threat of a compiler backdoor."
The answer is to procure your binaries from sources you trust:
* Commercial vendors like Microsoft, Intel, Valve, etc. who have a vested financial interest in your continued patronage.
* Private vendors like the guys behind WINE, Notepad++, ffmpeg, etc. who are reputable and have that reputation on the line.
Speaking practically, if you don't trust your source to begin with, you aren't going to waste your time auditing their code and compiling it yourself either.
I know Gentoo Linux is not for everyone and doesn't fix the issue of there being way too much source to ever personally check it all, but I think there is something to be said for the fact that the source is readable in the clear for most parts of the system, and a lot of it has even been looked over by the package/ebuild maintainers. Not trying to say there's no risk, but I think it might reduce it quite a bit if you have the patience! The #gentoo IRC channel is, in my experience, incredibly helpful, blowing most corporate support out of the water! (Of course that only works because hardly anyone uses Gentoo.. but I think the point still stands!)