A Note on Vendor Application Security
No need for tinfoil hats when it comes to application security; we’re all too painfully aware of what can happen. From data breaches to destructive attacks, the potential impacts couldn’t be clearer.
Web applications in particular are interesting because of their exposed position – it’s not uncommon for sensitive web applications to be secured “only” by their application logic.
This means that a logical flaw in a single function, be it login, authorisation, or access control, could have a devastating impact.
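To make that concrete, here is a minimal, hypothetical sketch of such a flaw: an endpoint that checks who you are, but not what you are allowed to see. The framework (Flask), routes, data, and user handling are invented purely for illustration and are not drawn from any particular product.

```python
# A minimal sketch (not taken from any real product) of a logical
# access-control flaw: the endpoint identifies the caller but never
# checks whether the requested record belongs to them.
from flask import Flask, abort, jsonify, request

app = Flask(__name__)

# Toy in-memory data standing in for a real database.
ORDERS = {
    1: {"owner": "alice", "item": "laptop"},
    2: {"owner": "bob", "item": "phone"},
}

def current_user() -> str:
    # Stand-in for real session handling, assumed for illustration only.
    return request.headers.get("X-User", "anonymous")

@app.route("/orders/<int:order_id>")
def get_order(order_id: int):
    order = ORDERS.get(order_id)
    if order is None:
        abort(404)
    # FLAW: any authenticated user can read any order by iterating IDs.
    # The missing control is a one-line ownership check:
    #     if order["owner"] != current_user():
    #         abort(403)
    return jsonify(order)

if __name__ == "__main__":
    app.run()
```

Nothing here is broken in the crash-and-burn sense; the logic is simply wrong, which is exactly why this class of flaw slips past purely functional testing.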
Application Security Audits
Penetration testing is a common way to assess the “security” of an application or system: it entails trying to break in or perform unwanted actions. The result is that the application owner, be it the developer or whoever bought the software, gains a better understanding of what flaws are present.
The final report will almost always include recommendations on how to fix the underlying problems, or otherwise reduce the risk.
Verify According to Your Needs
The proverb “trust, but verify” (or perhaps “never trust, always verify”) applies to application security for two unfortunate reasons:
- It is easy to save effort by skipping “proper” security testing; and
- The product might be secure, but your deployment and installation might not be (a sketch below illustrates this).
I put “proper” in quotation marks because what is a reasonable level of security for one organisation may not be acceptable for another. The depth of analysis and testing is tied to the level of verification required.
Unfortunately, this might mean that the vendor’s risk appetite and security level do not match those of your organisation. It doesn’t mean the vendor did a poor job of securing their product, only that the extent to which they did may not be sufficient for you.
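On the second point above: a product can ship with sensible defaults and still be undermined by how it is deployed. The following sketch is purely hypothetical; the configuration keys and checks are invented to illustrate the gap between vendor defaults and an actual installation.

```python
# Hypothetical illustration: the vendor's defaults are sane, but the
# deployment overrides them. All names and keys are invented.

# What the (fictional) vendor ships.
VENDOR_DEFAULTS = {
    "DEBUG": False,
    "ADMIN_PASSWORD": None,   # operator must set one
    "ALLOWED_HOSTS": [],      # operator must restrict this
}

# What an operator might actually run after a rushed install.
deployment = {
    **VENDOR_DEFAULTS,
    "DEBUG": True,                 # stack traces exposed to end users
    "ADMIN_PASSWORD": "changeme",  # never changed after installation
    "ALLOWED_HOSTS": ["*"],        # accepts traffic from anywhere
}

def deployment_findings(config):
    """The kind of trivial check a verification exercise might automate."""
    findings = []
    if config["DEBUG"]:
        findings.append("debug mode enabled in production")
    if config["ADMIN_PASSWORD"] in (None, "", "changeme", "admin"):
        findings.append("default or missing admin password")
    if "*" in config["ALLOWED_HOSTS"]:
        findings.append("host allow-list is wide open")
    return findings

if __name__ == "__main__":
    for finding in deployment_findings(deployment):
        print("FINDING:", finding)
```

The product’s defaults were fine; the verification still has to cover your deployment of it.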
Always verify according to your needs.