I worked for the state once and the number of times I had to put my foot down for security was appalling. We’re talking months just to get a web service updated to use basic password auth, with management pressuring me to skip it because some asshat using the service didn’t want to update their 30-year-old batch file to deal with auth. Other people would regularly push things that could easily expose thousands of people’s identifying info just to get management off their backs. A couple of projects, I think, I was specifically kept away from because they were “mission critical” and they didn’t want me slowing them down with trivial stuff like not leaking unencrypted databases…
Very stark contrast to a typical day at my job.
“Looks like there’s a broken link on this page. No problem, we can get that fixed up in a day or two after we tackle the 32 vulnerabilities that cropped up since the last time we changed that page."
That is something I just don’t get. I’m a hobbyist turned pro turned hobbyist. The only people I ever offered my services to were either after one of my very narrow specialties, where I was actually an expert, or literally could not afford a “real” programmer.
I never found proper security to have any impact on my productivity. Even going back to my peak years in the first decade of this century, there was so much easily accessible information, so many good tutorials, and so many good products that even my prototypes incorporated the basics:

Encrypt the data at rest.
Encrypt the data in transit.
No shared accounts at any level of access.
Full logging of access and activity.
Before rollout, backup and recovery procedures had to be demonstrated effective and fully documented.
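To make the first item concrete: encryption at rest really can be a few lines. Here is a minimal sketch, assuming Python and the third-party `cryptography` package (my choice of library for illustration, not anything named in the thread; key storage is deliberately out of scope):

```python
# Sketch: symmetric encryption for data at rest, using the
# `cryptography` package's Fernet recipe (AES-CBC + HMAC).
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # in practice: keep this in a secrets manager, never in code
f = Fernet(key)

record = b"name=Jane Doe;id=000-00-0000"
token = f.encrypt(record)     # this opaque token is what actually lands on disk
assert f.decrypt(token) == record
```

If an attacker steals the database file, they get tokens, not records — which is the whole point of the "basics" argument above.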
Edited to add:
It’s like safety in the workplace. If it’s always an add-on, it will always be of limited effectiveness and will reduce productivity. If it’s built into the process from the ground up, it’s extremely effective, and those doing things unsafely will be the productivity drain.
Did you remember to plan for a zero downtime encryption key rotation?
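Fair question — though for what it’s worth, zero-downtime rotation is a solved pattern in some libraries. A hedged sketch with `cryptography`’s `MultiFernet` (my example, not something from the thread): readers accept tokens under the old or new key while a background job re-encrypts stored tokens with the new one.

```python
# Sketch: zero-downtime key rotation with MultiFernet.
# Old tokens stay decryptable throughout; rotate() re-encrypts
# them under the new (first-listed) key.
from cryptography.fernet import Fernet, MultiFernet

old_key, new_key = Fernet.generate_key(), Fernet.generate_key()
token = Fernet(old_key).encrypt(b"legacy record")

mf = MultiFernet([Fernet(new_key), Fernet(old_key)])  # new key listed first
assert mf.decrypt(token) == b"legacy record"          # old tokens still readable
rotated = mf.rotate(token)                            # now under the new key
assert Fernet(new_key).decrypt(rotated) == b"legacy record"
```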
Do you know when account passwords expire? Have you thought about password rotation?
That sounds like a good practice until you have 20 (or even 2,000) backend server requests per end-user operation.
All of those are taken from my experience.
Security is like an invasive medical procedure: it is very painful in the short term but prevents dire complications in the long term.
All excellent points. I never worked at those scales or under those conditions, nor should I have been permitted to; I had enough self-awareness to keep myself away from anything like that.
I guess when I read about this breach or that one, the real damage seems to result from not having the basics covered. Whatever “basic” might mean at different scales of operation, missing encryption at rest seems to be at the root of most public harm from data theft, and it strikes me that if that can’t be managed at a particular scale, then operating at that scale should not be considered.
Dependencies, scope creep, feature creep, off-by-one errors, misconfiguration, unclear or unenforced contracts and invariants… Most of those are trivial to solve at small scale, but the more moving parts you have, the more complex it becomes.
Of course, but that just makes the case for security as a foundational principle even stronger.
Mistakes happen. They always will. That’s not a reason to just leave security as the afterthought it so often is.
None of the things I mentioned have anything to do with errors and scope creep, but everything to do with building using sound principles and practices always. As in, you know, always. In class, during bootcamps, during design meetings, when writing sample code, when writing reference implementations, during the construction of the prototype that, let’s face it, almost always goes into production. Always.
Sure, and then the big client bankrolling your company needs the feature in production for next week.
If you’re GAFAM, you can tell them to get screwed and say you need more time. But at least in my experience I’ve always been on the other side of the table, and sometimes you gotta change a setting in a production DB because the related GUI change wasn’t approved: the guy doing the review was sick, and the other reviewer wasn’t sure which shade of green to use somewhere on the page.
I agree that security is not something you add on the side, but circumstances change and things are not always under your control. You say mistakes happen, but not everything I mentioned is caused by mistakes; sometimes the shortcut is completely intentional. Companies only care about security when it’s too late, at which point they will blame you for writing unsafe software. But if your company or your job are at stake, that’s often a risk you have to take.
Take all the risks you want. Just be sure that you’re the one actually taking the risk, not the people whose data you manage. I get really tired of people and companies who claim that it was a necessary risk when they’re not the ones paying for the bad outcomes.
You risk something by standing your ground, not by agreeing to something that puts me at risk.