OWASP Cheat Sheet Series
cheatsheetseries.owasp.org

I'm a product security engineer. I reference these all the time during my own work to make sure I didn't miss something stupid, but I also hand out links to them to engineers when we do find bugs in their code. Most of the time I think they're ignored.
If most engineers just took a second to read the ones directly pertinent to their projects and tried to be cognisant of some mitigations, I'd find substantially fewer low-hanging-fruit vulnerabilities in the first review pass. Doing so actually makes my job significantly more difficult, and forces me to dig deeper - which is a good thing. Instead of writing up some input validation spiel for the 100th time, I can spend time searching for more complex bugs, writing protocol fuzzers, and doing real analysis in the time I have for the review.
I've been on both sides of this game - worked as a developer and also worked as a penetration tester. I've seen pen testers laugh at the stupid vulns that developers introduce, and I've laughed at a few myself. But I've also seen the deadlines as a developer, the bugs bringing the whole system down and costing the company lots of money, the legacy code that is extremely hard to understand and makes it difficult to even get your feature working.
Those on the security side often only think "it's really not that difficult to make it secure, just follow these guidelines and you'll be fine", but they don't realise the myriad of other issues the developers are dealing with.
EDIT: Security needs to be encouraged from the top down. If management is on board with following secure practices, then they also need to understand that that means things might take a little longer to complete.
What guidelines? I've never seen a vulnerability report that included how to trigger the vulnerability, what it gives the attacker, or how to remediate it in the language/framework being used.
When programmers run into frustrating repetition, they automate. The way to reach people is not with scolding but with scanners that are easy to use, helpful, and don't have lots of false positives.
> The way to reach people is not with scolding
I think it's an unfair assumption to say that I'm scolding the teams I work with by trying to educate them on security issues - the fact that that is your takeaway from my last post says a lot about another issue with security organisations: there often exists an adversarial dynamic between software engineering and product security. I moved into security from product engineering. I try my best to be an ally to the engineers, and to educate them on bug classes so they can learn to deal with issues throughout the software development lifecycle, from early threat modelling and design issues to creating implementations that minimize weaknesses.
A helpful solution, as the child poster says, is to have frameworks that stop you from doing the dumb stuff, but smaller organisations sometimes don't have that luxury, and even then you can shoot yourself in the foot with your frameworks (using front-end frameworks as an example, I saw way too many extraneous uses of dangerouslySetInnerHTML and DomSanitizer bypasses when I was a consultant.)
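To make that concrete, here's a minimal sketch of the kind of extraneous use I mean - the component, prop, and payload are made up, and the second variant assumes a library like DOMPurify if you genuinely need to render user-supplied HTML:

```tsx
import React from "react";
import DOMPurify from "dompurify";

// Unsafe: user-controlled `bio` is injected as raw HTML, so a payload like
// '<img src=x onerror="alert(document.cookie)">' executes in the page.
export function ProfileBioUnsafe({ bio }: { bio: string }) {
  return <div dangerouslySetInnerHTML={{ __html: bio }} />;
}

// If rich text is genuinely required, sanitize before opting out of the
// framework's default escaping.
export function ProfileBio({ bio }: { bio: string }) {
  return <div dangerouslySetInnerHTML={{ __html: DOMPurify.sanitize(bio) }} />;
}
```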
> scanners that are easy to use, helpful, and don't have lots of false positives.
I've written a fair amount of automation tooling to glue together COTS/OSS SAST/DAST applications. Most of even the better commercial tools still yield insane amounts of false positives and require human interaction to make sure that the bug is actually exploitable. Common web security tools such as Burp Suite Pro's scanner are effectively useless for most modern web apps. Some languages and architectures are better than others, some companies' internal rulesets are better than others, and it's a struggle to get something that works for even the majority case. Some of the most mature technology companies I've worked at are still trying to build these tools across their infrastructure, and they have better success in some languages/platforms than others, and they don't have the resources to keep up with the new languages / platforms / frameworks engineers want to use for this project or that. It's an uphill battle, and education is still just as important as tooling.
I've been involved with several security teams, either as a builder getting my stuff reviewed by them, or as someone brought in to help them do the reviews.
Besides myself, I've never come across another developer on an InfoSec team. Their backgrounds range from networking, desktop jockeying, manual sysadmin, audit, script-running pentesting, and managerial, but never really development, nor anyone who has set up automation for an operations function.
I think this is partly because I contract, and the sort of orgs that bring me on are already struggling, but I think it's also just a common theme that InfoSec teams don't build, and so people that do don't want to be there.
This is what leads to a lot of the things we don't like: demands to follow processes that don't really help, buying random products and demanding you integrate them, etc. Their simple lack of knowledge in product development leads to a lot of bad habits.
Much like you suggest, my job is really too easy. The builders also flee these orgs, because dealing with bullshit bureaucracy isn't fun, so with what's left all I can really do is suggest: use a framework that deals with security considerations, and don't deviate; follow guidance such as the CIS Benchmarks; use scanning tools and look into the input; basically basic stuff, then come back to me when I have something to look at.
I would love to have someone in or next to my team to teach me/us about the security flaws that I produce. I sometimes look into the basic stuff from OWASP and hope to catch the most common things, but I miss having someone dedicated with a lot of experience. The biggest hurdle that I see so far is educating the teams to plan in the time to actually check for vulnerabilities. The constant pressure to deliver new features in the SCRUM environment makes many things that should be part of proper software development a tough sell.
Repeating yourself to other people may not seem like scolding to you, even if you feel they're ignoring you for some reason you can't pinpoint. So, poor word choice on my part?
What I found helps the most is to have frameworks that make implementing secure code hard. XSS is much rarer now that most people use Angular or React (plugins can be trouble though).
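As a made-up illustration of that default protection: React escapes interpolated values, so a classic payload lands in the page as inert text instead of executing, without the developer having to think about it.

```tsx
import React from "react";

// Even if `body` is '<script>steal(document.cookie)</script>', React inserts
// it as a text node rather than parsing it as markup, so nothing executes.
export function Comment({ body }: { body: string }) {
  return <p>{body}</p>;
}
```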
I assume you mean implementing insecure code hard.
Yes, it should be insecure.
The thing that I find difficult with OWASP: there don't always seem to be comprehensive examples of what these attack surfaces could be used for. That makes it difficult to both understand the impact of a particular issue and test for it.
As an example: https://cheatsheetseries.owasp.org/cheatsheets/AJAX_Security...
I'm fascinated to know how this could actually be exploited. But there's no hint or reference to that. It's just "don't do this".
The biggest pain point in these security guidelines is context. For example, this Array override issue was fixed in major browsers 11 years ago [0]. Unless someone is coding for IE6, I'd consider this not a real problem.
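For anyone curious, here's a rough sketch of what that attack looked like. The endpoint and data are invented, and as noted it hasn't worked in mainstream browsers for a long time, since engines stopped routing array-literal construction through a user-redefined Array constructor:

```ts
// Victim: a cookie-authenticated GET endpoint that returns a *bare* JSON
// array, e.g. ["alice@example.com", "bob@example.com"].
//
// Attacker page: redefine Array before sourcing that endpoint.
const captured: unknown[] = [];

(window as any).Array = function (...items: unknown[]) {
  // In old engines, constructing the array literal went through this
  // function (real PoCs used legacy setter syntax to capture each element
  // as it was assigned), handing the attacker the victim's data.
  captured.push(...items);
};

// The attacker then includes the endpoint as an ordinary script tag:
//
//   <script src="https://victim.example/api/contacts"></script>
//
// The browser sends the victim's cookies with that request, and because a
// bare JSON array is a valid JavaScript expression, the response executes.
// The cheat sheet's mitigation (always return JSON with an object on the
// outside, e.g. {"data": [...]}) makes the response a syntax error when
// loaded as a script, even on ancient browsers.
```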