Security
-
Ever hear the phrase "when all you have is a hammer, everything looks like a nail"? I apologize, and no offense to the OP, but I'm guessing this post is along those lines.
That's not really it at all. I fully understand that not all software requires the same level of security, but that shouldn't negate the thought or the process. It's a balancing act between cost, usability, performance, and resources. Nobody has just one security hole in their system. If folks take the attitude of "it's not our problem, we rely on something else, our software doesn't need it, we're not big enough to get hacked", there isn't enough motivation. It just screams of naivety. It's an attitude that needs to change, especially as more apps move away from the desktop to the cloud or onto mobile devices, or employees bring in USB drives and download e-mail attachments at work.
-
Microsoft recently released a study stating that only 20% of software developers/engineers employ any form of security or security process within their applications or code. As a security engineer, I am interested to know why that is. I've written a lot of crappy, insecure code that could quite potentially have cost previous employers a lot of money; however, they too were not that interested in security. Isn't it time we get serious about security? Why do we not take it seriously?
-
Quote:
Or employees bring in USB drives and download e-mail attachments at work.
Sounds like a computer use policy to me, easily fixed by editing the security policies on the computer. I'm not saying software security should be "not our problem", and developers definitely shouldn't purposely leave things gaping open, but on the other hand the developer shouldn't focus on security issues; when a company or product line gets big enough, they hire people like you to help. Until that point comes, a developer should focus on usability and stability (while not being downright stupid with security) and let the rest sort itself out.
I'm not naive about the software I release. I know what the security flaws are, but I also know the chance of them being exploited is something like 0.0001% across my total shipped software. That means I should not dedicate more than 0.0002% of my time to fixing or avoiding that hole. Microsoft and big companies take the same road and for good reason. They know when they release software that it has security holes. They know what (most of) the holes are, but the big holes are plugged and the little ones get taken care of as the USER prioritizes them. The only way to do that is to rely on user reports.
Developers have a sense of perfection when it comes to code, and unfortunately that's poor business practice. It's up to managers to decide how "perfect" code has to be before they put it in the revenue stream. Good managers realize this: the code isn't perfect despite the developer, and they get it out there anyway. This pisses developers off (we are all infallible perfectionists), but it keeps them in the job.
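For the USB drive part specifically, here is a minimal sketch of the kind of lockdown such a policy pushes out (assuming Windows, Python, and admin rights; it just sets the stock USBSTOR service key that administrators normally manage centrally, and the function name is illustrative only):

import winreg  # Windows-only standard library module for registry access

# Standard service key for the Windows USB mass-storage driver.
USBSTOR_KEY = r"SYSTEM\CurrentControlSet\Services\USBSTOR"

def set_usb_storage(enabled: bool) -> None:
    """Set the USBSTOR start type: 3 = start on demand (enabled), 4 = disabled."""
    start_value = 3 if enabled else 4
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, USBSTOR_KEY, 0,
                        winreg.KEY_SET_VALUE) as key:
        winreg.SetValueEx(key, "Start", 0, winreg.REG_DWORD, start_value)

if __name__ == "__main__":
    set_usb_storage(False)  # block USB mass storage on this machine

In practice an admin would push this through Group Policy rather than a script, but the point stands: it's an administrative control, not application code.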
-
According to my manager, security is not an issue...until it becomes an issue. Then we'll fix it.
-NP Never underestimate the creativity of the end-user.
-
I think what the OP is talking about and what you are talking about are two different topics. App licensing isn't really security. You should probably start your own topic in an appropriate programming forum to ask this.
-
Ron Beyer wrote:
Sounds like a computer use policy to me, easily fixed by editing the security policies on the computer.
Absolutely, it's a computer use policy issue. However, the point is: how many companies really enforce it, or have proper policies in place? With the prevalence of BYOD and employees accessing private e-mail, my point was simply that developers shouldn't leave security entirely up to network ops or IT to enforce behind firewalls. A BYOD device or an e-mail could exploit a security hole in your "internal" application and make its data external pretty quickly. I am still very much a developer as much as a security person today; I just develop security-centric solutions to help the rest of the development team.
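To make that concrete, the fix lives in the application itself, not at the perimeter. Here is a minimal sketch of the classic case (a hypothetical employee-lookup function using Python's built-in sqlite3; the table and function names are made up for illustration):

import sqlite3

def find_employee_unsafe(conn: sqlite3.Connection, name: str):
    # Vulnerable: a name like "x' OR '1'='1" turns the query into "return everything".
    query = "SELECT id, name FROM employees WHERE name = '" + name + "'"
    return conn.execute(query).fetchall()

def find_employee_safe(conn: sqlite3.Connection, name: str):
    # Parameterized: the driver treats `name` strictly as data, never as SQL.
    return conn.execute(
        "SELECT id, name FROM employees WHERE name = ?", (name,)).fetchall()

The firewall never sees the difference between those two queries; only the application code can.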
Ron Beyer wrote:
Microsoft and big companies take the same road and for good reason
Not really. Perhaps that's what Microsoft used to do, but they've drastically changed their practices.
-
LOL. No, I have a good job that pays well. I am just a security evangelist and believe too many software developers/engineers don't take security seriously and don't think about it, nor do their managers and companies.
CdnSecurityEngineer wrote:
I am just a security evangelist
Oh boy. :rolleyes: Don't get me wrong. Security is important, but to me that statement reeks of troll-speak.
Soren Madsen
"When you don't know what you're doing it's best to do it quickly" - Jase #DuckDynasty
-
I understand. The difference is, I am a software engineer; security isn't all that I think about. But by the same token, it's something that not enough developers think about.
-
Quote:
Not really. Perhaps that's what Microsoft used to do, but they've drastically changed their practices.
Judging by what my Windows Update log looks like on Tuesdays (every IT guy dreads Patch Tuesday), I would challenge that. Just this month, Microsoft released around 25 security updates for Windows 7 alone that were relevant to my machine. Add to that the number of Service Packs they put out to address "vulnerabilities", as Microsoft calls them, and I think they are on the same track they've been on for years.
-
I disagree. You wrote the malware, it's your fault, and if it ever got to that point, a court of law would side with me. If I leave a loaded gun on the table, it's not my fault you shoot somebody with it. [edit] Yes, I would be negligent for leaving it there, but you would be guilty of the crime. [/edit] But here's the bigger question: are you going to spend your time writing a security exploit for XYZ Inc's Video Gadget, or for Microsoft Movie Maker? Which do you think will get you further in your goal of taking over the world? I'm guessing it's not Video Gadget... And because of Video Gadget's small user base, I'm guessing they can get a fix out for a reported exploit faster than Movie Maker, so there is no incentive for me as XYZ Inc's product manager to dedicate resources to pre-release holes when software doesn't make money sitting in the IDE.
Ron Beyer wrote:
If I leave a loaded gun on the table, it's not my fault you shoot somebody with it. [edit] Yes, I would be negligent for leaving it there, but you would be guilty of the crime. [/edit]
But if you knowingly leave the loaded gun on a table in front of a small child, you are guilty of an even greater crime (negligence, child endangerment, possibly even second-degree murder or accessory to murder in the US). Basically, the level of criminality depends on your foreknowledge of the situation. Going back to software: if you know of an exploit in your software that leads to your users getting pwned, you are responsible for fixing it. In response, some software companies intentionally ignore security issues so they can claim innocence. And some pointy-haired managers actually believe ignoring security will save money and time. As most of us know, a security hole can be a HUGE money and time sink, not to mention the loss of reputation. Many bean-counters only realize how expensive it is after they have already been burned. Security is a priority only after it is already too late. :((
// mght ToDo:
// Put Signature Here
-
Yes, about the gun, maybe I was being too general there... Let's take a couple of examples:
Mydoom Exploit ($38 billion in damages)[^] - uses back-doors in Windows Explorer to install sub-processes and take over the system.
Conficker Worm ($9 billion in damages)[^] - uses flaws in Windows to perform attacks on the system and set up a botnet.
There are a lot of examples of the authors of such viruses being jailed for creating them (for example, the ILOVEYOU virus written by a Filipino student), but I dare you to find an example of a software company being sued over security holes (Microsoft would spend all its time in court). There are examples where software companies had known security holes at release that were later exploited before a security patch could be put into place. Even some of the viruses above exposed those holes, but Microsoft failed to completely patch them, so the next generation of the virus was still effective.
So yes, I would agree that I'm responsible for fixing the security hole, but I'm not responsible for exploiting it or for the damage that comes from it being exploited. At least right now, U.S. tort law does not obligate me to be responsible for damages caused by holes in software, nor do courts even take up such lawsuits. (The only exception is class-action lawsuits for financial information breaches, and that is typically not a software issue but a network security one.) In order to claim negligence for my leaving software holes open, you have to prove that I owed a duty to you. You should read a license and its "fitness for purpose" section. Determining the definition of the word "duty" is hard, because it requires that there is a "foreseeability" of harm and that the user took reasonable actions with computer security to prevent it. If you run my software without a firewall or anti-virus, I no longer have a duty to protect you. I could go on and on, but here is a good paper[^] on why software companies aren't currently responsible for defects and security holes.
-
Excellent examples Ron, thanks for that. In my mind, there are really three different types of responsibility for security issues:
Ethical Responsibility - Software vendors have an ethical responsibility to fix their code when they know of a security hole being actively exploited. Whether there is an ethical responsibility to fix security holes that have been discovered but not yet exploited really depends on the risk to their customers. A banking app provider has a high ethical responsibility because of the financial risk, while a photo editing app carries less. I agree that the vendor's ethical responsibility covers their software and stored data only - they are not responsible for repairing users' computers.
Contractual Responsibility - Some vendors have a contract or license with their users laying out the vendor's responsibility and remedies for security issues. In the case of most shrink-wrap agreements, this should be renamed Contractual Non-Responsibility.
Legal Responsibility - In some US states, data breach laws create a legal responsibility to notify users when their information is released. The laws may provide other remedies such as fines and credit monitoring. Sometimes a civil class-action lawsuit can find the software vendor negligent. However, in most cases there is no legal responsibility to take any action. The paper you linked to goes into more detail.
// mght ToDo:
// Put Signature Here