Fog Creek Software
Discussion Board

Diversity = security?

Could someone help me understand the diversity argument?  If you have a diverse network, then aren't you as strong as the weakest link?  I'd think that an attacker would like 5 different ways to modify data X instead of one.

Or is this an argument for having diverse systems that don't communicate much with each other?

"Put all your eggs in one basket, and watch that basket!"  It seems there are different times when either diversity or monoculture is more appropriate.

Tayssir John Gabbour
Tuesday, August 19, 2003

It only makes sense for redundant duplicates.
It increases the penetration risk if the redundant systems are online, since you have enlarged the attack surface (and I'd argue that in practice they would have to be online, since if they are not, chances are pretty high they will fail when needed, having never gone through "live testing" :-)).
However, if you are mostly at risk from vandalism and DoS, then it can make sense, since all the redundant diverse systems would have to be taken out for the service to fail.

Just me (Sir to you)
Tuesday, August 19, 2003

I agree.  Maybe I'm seeing this diversity argument thrown around too often.

Tayssir John Gabbour
Tuesday, August 19, 2003

I think the main argument to be made for system diversity is that one catastrophic bug is unlikely to affect every system simultaneously. So, if you are all Windows and you get hit by a worm, every system gets it. If you run Windows, Mac, and Unix, any one worm/virus will only hit 1/3 of your systems.

It's also a convenient excuse to bash Microsoft, but there's some truth in there.

jason

JasonB
Tuesday, August 19, 2003

Diversity works well for the network at large.

If there were a 33% split in the desktop market between Windows, Linux, and MacOS, any individual virus could only infect 33% of the available unpatched computers.  With Windows holding a large market share, a virus can infect 80-95% of the available unpatched computers.  This provides even more incentive for virus authors, because they will have a wide audience.

So, for the computers on the internet, or any other large-scale system, a certain amount of diversity in design is good. 
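[Editor's note: the market-share arithmetic above can be made concrete with a toy calculation. All numbers here are assumed for illustration, not taken from the thread.]

```python
# Hypothetical illustration of the market-share argument.
# Assume a worm targets exactly one platform: it can reach every
# unpatched machine running that platform, and nothing else.

def reachable_fraction(target_share, unpatched_rate=1.0):
    """Fraction of all machines a single-platform worm can infect."""
    return target_share * unpatched_rate

# Monoculture: one platform holds ~90% of the market.
mono = reachable_fraction(0.90)

# Even three-way split: the best any one worm can do is one third.
split = reachable_fraction(1 / 3)

print(f"monoculture worst case:    {mono:.0%}")
print(f"three-way split worst case: {split:.0%}")
```

With a partial unpatched rate the same function scales both cases down proportionally; the ratio between monoculture and split exposure stays the same.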

Imagine what would happen if everybody had gone over to nuclear energy and we had all decided to use the same reactor design as Chernobyl (which has a set of severe design and safety flaws).  After Chernobyl blew, there would be a lot of unhappy power companies, given that there's really no way to make that type of reactor safe.

Diversity doesn't work so well for an individual group of servers.

If you have two light aircraft, one with two engines and one with a single engine, the single-engine aircraft will be safer.  Because you have all of your eggs in one basket, you make sure that the one engine works.  The second engine doubles the potential for failure because it doubles the complexity of your system (and the associated opportunities for a problem to be missed).  And if one engine on a light twin fails, you have big problems.  If you had two different but identically performing engines on the same airplane, it would be even worse.

If you have diversity, with two servers for your website, one running Windows and one running Linux, you have two sets of patches to keep up with.  This doubles the chance that you will miss a patch and get hacked.  The effort spent on learning the ins and outs of securing each OS is doubled.
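[Editor's note: the distinction being drawn here, that redundancy helps availability but diversity widens the attack surface, can be sketched with a toy probability calculation. The failure and vulnerability rates below are made up and assume independence between boxes.]

```python
# Toy numbers, assumed for illustration: each box independently has a
# 5% chance per period of an outage and a 2% chance of carrying an
# unpatched hole an attacker can use.

p_outage = 0.05
p_hole = 0.02

# Availability: with two redundant boxes, the service is down only if
# BOTH fail at once, so redundancy helps dramatically.
p_service_down_one_box = p_outage
p_service_down_two_boxes = p_outage ** 2

# Security: the site is breachable if EITHER box has a hole, and two
# different operating systems mean two independent patch streams.
p_breach_one_os = p_hole
p_breach_two_os = 1 - (1 - p_hole) ** 2

print(f"down, 1 box:   {p_service_down_one_box:.4f}")
print(f"down, 2 boxes: {p_service_down_two_boxes:.4f}")
print(f"breach, 1 OS:  {p_breach_one_os:.4f}")
print(f"breach, 2 OS:  {p_breach_two_os:.4f}")
```

Under these assumptions the outage probability drops from 5% to 0.25%, while the breach probability nearly doubles (2% to 3.96%), which is exactly the trade-off the post describes.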

The two ways to get security for a corporate network are careful configuration of the network (including firewalls), and monitoring.

Flamebait Sr.
Tuesday, August 19, 2003

Two places I can see absolute justification for diversity:
1) Firewalls
2) DNS

1) meaning that you put, for example, a Cisco PIX as your public firewall, then a Linux router as your private firewall, with the DMZ in between. Thus any vulnerability in either system is backstopped by the other.
2) Only because MS got hammered by having four identical DNS servers on the same segment in the same room running from the same tape loads.

Philo

Philo
Tuesday, August 19, 2003

Assuming the 33% split listed above:

Sure, it's less likely that one worm/virus will wipe out your whole network, but you're exposing yourself to the vulnerabilities of 3 different operating systems.

One could argue that there are more exploits on Microsoft systems, but I would argue that we would see A LOT more exploits geared towards Linux & MacOS if they each had 33% of the desktop market.

Personally, I think it would be a wash (without getting into the inevitable TCO argument of supporting 3 desktop operating systems).

Jeff MacDonald
Tuesday, August 19, 2003

In biological systems, a pathogen that can infect a very large range of hosts consumes more energy than one that can infect only a narrow range. Evolution therefore pushes pathogens toward a middle ground, where they are just infectious enough to maintain the species.

For computer "viruses", there's no such trade-off.  If there were ten widely-used operating systems with different vulnerabilities, it would be no trouble for an author to pack code for exploiting all ten into the same worm.

Seth Gordon
Tuesday, August 19, 2003

I call this space shuttle High Availability.  To make systems HA and redundant on the space shuttle, NASA used multiple teams implementing the same spec.  The theory was that if an error existed in one team's software, upon failure a redundant system could execute the procedure in a different manner, and hopefully correctly.

I believe redundancy only makes sense if the software image isn't identical.  You may have hardware redundancy using the same vendor, but in the case of a software failure, both systems will fail in the same way.  This was the case with the recent Cisco IOS flaw.
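[Editor's note: the shuttle-style N-version idea described above can be sketched in a few lines. Two independently written implementations of the same spec run side by side, and a supervisor only accepts a result they agree on. All names and the choice of spec (integer square root) are made up for illustration.]

```python
# Minimal N-version sketch: two teams implement the same spec with
# deliberately different algorithms, so a bug in one is unlikely to
# be reproduced in the other.

def isqrt_team_a(n: int) -> int:
    """Team A: Newton's method for the integer square root."""
    if n < 0:
        raise ValueError("negative input")
    x = n
    y = (x + 1) // 2
    while y < x:
        x = y
        y = (x + n // x) // 2
    return x

def isqrt_team_b(n: int) -> int:
    """Team B: simple linear search -- slow, but independently derived."""
    if n < 0:
        raise ValueError("negative input")
    r = 0
    while (r + 1) * (r + 1) <= n:
        r += 1
    return r

def voted_isqrt(n: int) -> int:
    """Run both versions; only accept a result both agree on."""
    a, b = isqrt_team_a(n), isqrt_team_b(n)
    if a != b:
        raise RuntimeError(f"version disagreement: {a} != {b}")
    return a

print(voted_isqrt(10))  # 3
```

With only two versions a disagreement can be detected but not resolved; the shuttle approach used more versions so a majority vote could pick a survivor.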

In my opinion, the severity of the problem that resulted last month from Cisco's flaw is the reason that diversity of vendors is vital.  It is highly unlikely that severe flaws will coexist in products from disparate vendors.

I am working on a security product.  It is my intention to use this to my advantage.  If you want to be secure you need to go to two vendors for every critical network path to ensure redundancy.  This effectively opens up the market to multiple vendors. 

For small ISVs, such as the one I am working for, this is an advantage.  For huge vendors like Cisco it is a disadvantage.  In my opinion, if backbone providers look at the core of the problem, Juniper could see a lot more business.

christopher

christopher baus
Tuesday, August 19, 2003

Seth wrote: "If there were ten widely-used operating systems with different vulnerabilities, it would be no trouble for an author to pack code for exploiting all ten into the same worm."

Is this true? I have never seen an executable program that could be "double-clicked" and run on more than one platform. I know there are CDs that can, for example, run on both Macs and Windows, but they have separate files for each system. Is this not a technical limitation?

Jordan Lev
Tuesday, August 19, 2003

Those who don't see the benefit of vendor diversity didn't spend an entire weekend last month emergency-patching Cisco backbone routers like a lot of network admins did.  Most people didn't realize there was a huge race going on between admins and blackhats over the very stability of the internet.  Fortunately the admins won, but only by about a day.  If the backbone providers had been using Juniper routers in parallel with their Cisco routers, this wouldn't have been anywhere near as serious as it was.  If the network had gone dark as a result, it could have had huge repercussions, in my opinion.  It would have been the blackout of the internet.  George Bush would have pushed a bill through Congress to update our internet infrastructure.  Thankfully that didn't happen.

christopher baus
Tuesday, August 19, 2003

There have been viruses that were designed to infect more than one system.  The Great Internet Worm was designed to hit Suns and VAXen, if I remember correctly.

And there were office macro viruses that would work on both the mac and the pc.

It would have to involve a different executable for each platform.  It would end up in a virus writer's toolkit after a while, but it wouldn't be completely trivial to do, and it would greatly increase the size and detectability of a virus.

Flamebait Sr.
Tuesday, August 19, 2003

I did a quick Google to verify a memory that surfaced in response to this monoculture debate. The memory was that someone had created a 'proof of concept' virus targeting both PE (Windows) and ELF (Linux) executables.  It was a poor vector, but it apparently worked well enough to get press.
Here is an article on the virus:  http://www.vnunet.com/News/1124075
This cracker was exploiting the fact that x86 machine code will run on most Linux machines and that Win32 is almost exclusively x86... So if you want to avoid true monoculture, you need to leave Intel-style hardware behind as well.  So what will that be (not Itanium or Hammer, since both will run x86 code): SPARC, PowerPC, Power4...  And who will justify the cost?

A Software Build Guy
Tuesday, August 19, 2003

Philo, even the TLD dns servers are no longer all running bind.

Mike
Tuesday, August 19, 2003

And to all of you who think that uniformity saves endless amounts of money, training, etc. do note that:
[ http://news.com.com/2100-1012_3-1019067.html?tag=cd_mh ]

In this case, the diversity brought each copy of Windows XP down to $36 (and got rid of the dreaded WPA; try that, monoculture supporters).

Security is not the only facet that monoculture hurts and diversity improves.  Overall cost is affected as well.

He-who-would-not-be-named
Wednesday, August 20, 2003

And Jeff, about popularity being responsible for Windows being targeted more often -- that explanation seems plausible until you notice that Apache has, for the last few years, always been more popular than any other web server, and nevertheless only IIS has been a worm magnet.

Compared to the huge holes in IIS, the security flaws previously discovered in Apache are small and very hard to abuse; this is supported by the empirical evidence that, by and large, they weren't.

He-who-would-not-be-named
Wednesday, August 20, 2003

>It only makes sense for redundant duplicates.<

'redundant duplicates'?! Isn't that, well, redundant?

Unanimous Cowboy
Wednesday, August 20, 2003

He-who-would-not-be-named,

I certainly cannot disagree there.  From what I understand (I'm not an Apache admin), that product has a history of being very solid.  Tragically, it's more of an exception than the rule, software-wise.

Jeff MacDonald
Wednesday, August 20, 2003

"'redundant duplicates'?! Isn't that, well, redundant?"

Duplicates are only redundant if each one could handle the total load on its own.

Just me (Sir to you)
Thursday, August 21, 2003
