 Posted by Michael Trausch on 04/03/06 21:29
David Haynes wrote:
 > Mike:
 >
 > I understand all that.
 >
 > There is a level of weak security sometimes called 'security through
 > obscurity' that URL hiding falls into. I agree it's not sufficient but,
 > then again, nothing is absolute when talking about security.
 >
 
 Yes, however, security through obscurity *always* provides a false
 sense of security -- even when coupled with other mechanisms.  URL
 hiding adds no real benefit.  More on that in a second.
 
 > At best, you build walls within walls to increase the technical
 > knowledge required to defeat the system. Sometimes you can add to the
 > fun by adding false information to the mix. For example, if I change my
 > php mapping to, say, asp, an inexperienced hacker will spend time
 > chasing a blind alley (i.e. attempting asp exploits against a php system).
 >
 
 Yes, and no.  Note that most of the vulnerabilities that *do* exist
 and are not application bugs tend to be things like a buffer overflow
 in the server itself.  ASP can run on Microsoft's IIS using VBScript,
 JScript, or Perl, and there is also at least one setup that allows you
 to use ASP on Apache, so long as the language variant used is Perl
 (http://www.apache-asp.org/install.html).  There are various versions
 of both, and yet compromises still happen, so you can conclude that
 the obscurity of that "security" is not effective.
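
 For what it's worth, the kind of remapping David describes takes only
 a line or two of Apache configuration.  A minimal httpd.conf sketch --
 assuming mod_php is loaded, and purely as an illustration rather than
 a recommendation -- might look like this:

     # Hand *.asp URLs to mod_php, so the site advertises ASP while
     # the code behind it is really PHP.
     AddType application/x-httpd-php .asp

     # Stop volunteering server details in headers and error pages.
     ServerTokens Prod
     ServerSignature Off

 Even then, a default PHPSESSID cookie, an X-Powered-By header, or a
 single PHP-style error message gives the game away to anyone who
 bothers to look.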
 
 > Smarter hackers will not trust the asp signature and probe for other
 > corroborating information, but we have reduced the total population of
 > hackers hitting the site - which is one of the objectives of security.
 > Yes, it fails if the hacker is persistent, but the profile of the
 > amateur hacker is one of quick in/quick out. If they don't crack it
 > immediately, they tend to move on to easier prey unless there is some
 > compelling reward for continuing their efforts.
 >
 
 Perhaps, but you never really /truly/ know whether the server is
 telling the truth.  For example, if I really, *really* wanted to, I
 could configure Apache so that every *.php file in the web namespace
 on that particular server is actually run as a CGI program, which
 could be written in bash, C, C++, Ruby, Python, Perl...
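
 As a concrete sketch -- the directory path is only an example, and it
 assumes mod_cgi is available and nothing else has already claimed the
 .php extension -- the httpd.conf fragment for that could be as little
 as:

     # Treat every *.php file under this directory as a CGI program,
     # so the "PHP" URLs could really be shell scripts, compiled C
     # binaries, Perl, or anything else Apache can execute.
     <Directory "/var/www/html">
         Options +ExecCGI
         AddHandler cgi-script .php
     </Directory>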
 
 And that could actually *worsen* security, if I were using arbitrary
 C programs and trying to hide them as PHP ones.  A remote exploit,
 say through a buffer overflow, would be easier to find in a poorly
 written C program than in a PHP script actually running at that
 location.
 
 > Organizations use security through obscurity all the time. They will
 > order equipment to be delivered to a sidewalk location with instructions
 > to 'drop it and leave the area'. The goal here is that the delivery
 > person has no idea of the final destination of the goods making it much
 > more difficult for the delivery person to supply location information to
 > some third party. Often buildings with security requirements are hidden
 > by mislabeling them or having no identification information on them at
 > all. URL masking is like dropping the package on the sidewalk or
 > mislabeling the building - it hides information from the attacker.
 >
 
 So, just because the majority of newcomers to the Internet like to
 top-post, does that mean that it's okay?  How about the fact that many
 people like to use that highly annoying thing called HTML mail and news?
 Does that make it any better?  What if people were jumping off of
 bridges and out of buildings because of something they'd seen on TV or
 heard on the radio?  Would you follow them there, too?  Just because
 people seem to find a false sense of security or meaning in something
 does not make the practice any more correct.  What it does do is clue
 crackers in to the fact that people actually believe these methods
 work.  In many cases, they can correctly assume that "real" security
 concerns have not been looked after, because the people running the
 web servers will take their false sense of security and run with it.
 
 If anything, that means they will improve their cracking tools and try
 harder to get at the information, especially if it can help them in
 some way, such as harvesting e-mail addresses or other similar
 data-harvesting scenarios.  They can teach their software to always
 assume that it is being lied to and brute-force its way into things.
 So the practice, overall, can be argued to actually weaken security.
 
 > You and Jerry seem to be implying that I said that URL hiding
 > represented all the security you needed - which I never said. I was
 > simply objecting to Jerry's (and now your) assertion that URL hiding was
 > not a viable element within the security plan for a site.
 >
 
 Now, I never said that, nor did I imply it.  What I am saying is that
 using anything that breeds a false sense of security is a bad thing.
 While you may know exactly what is going on with such a setup, you're
 also likely to confuse maintenance programmers with non-standard
 configurations, and you'll find, too, that middle- and
 upper-management love to buy into things that project that false sense
 of security, because they don't know any better -- the old expression
 that a little knowledge is a dangerous thing applies here.
 
 The ramifications of such a choice are almost always larger than you
 would expect, and consequently the choice tends not to be worth it.
 Make the application as robust and strong as possible, on its own
 merit.  If you really want something to analyze so that you can
 strengthen it, set up a honeypot and put that data to good use.  That
 can only help the application become stronger as time goes by.
 
 Security through obscurity is a horrible thing.  Microsoft relies on
 it for much of their business to work, and as you can see by doing
 searches on the Internet, it doesn't work out so well
 (http://en.wikipedia.org/wiki/Windows_Genuine_Advantage is one such
 recent example).  Security through obscurity, in many cases, is just
 an unnecessary complication on the part of the programmer(s), and
 really nets no gains or savings down the road.
 
 - Mike