Forest/Domain in the "DMZ" to accommodate web, front-end servers

Discussion in 'Security Software' started by Marlon Brown, Sep 19, 2005.

  1. Marlon Brown

    Marlon Brown Guest

    Imagine I have an IT guru in my organization (and political forces behind
    him) who won't let me put ANY front-end server in the internal network.

    I mean, I have Exchange 2003 OWA and SharePoint servers that, to this date,
    have been published via ISA 2004, and in my view that provided adequate
    security. Now imagine that I must move all such servers that currently
    reside in the "internal" network into the "DMZ".

    Do you think it makes sense if I set up a "Domain-DMZ", put all such
    front-end servers there, and allow a one-way trust relationship where the
    "domain-dmz" trusts the "corporate domain"?

    Then I would put the SharePoint (front-end) and Exchange (front-end)
    servers in that DMZ domain?

    Do you really believe this would be a good security implementation for a
    mid-size organization (5,000 AD accounts)?
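
    (For concreteness, the one-way trust I have in mind could be set up with
    the netdom tool, roughly as sketched below. The domain names and account
    are made up for illustration, and this assumes the command is run on a
    domain controller in the new DMZ forest.)

        REM Hypothetical names: dmz.example.com is the trusting (DMZ) domain,
        REM corp.example.com is the trusted (corporate) domain. This creates
        REM the one-way trust described above.
        netdom trust dmz.example.com /domain:corp.example.com /add ^
            /userD:CORP\Administrator /passwordD:*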
     
    Marlon Brown, Sep 19, 2005
    #1

  2. Keith I

    Keith I Guest

    Marlon,

    What is the purpose of the network segmentation? Would the front-end
    Exchange and SharePoint Services (SPS) servers now be exposed directly to
    the Internet? If so, you are negating the value of ISA 2004. ISA 2004 has
    the hardened external interface; the other server roles do not by default.
    Do you trust ISA? If not, dump it and use another device for your network
    segmentation control. However, I believe ISA 2004 provides a hardened
    service for Exchange and SPS. That is the objective of using ISA.

    Second, would that DMZ-Domain be trusted by the corporate domain for
    authentication? If you are trusting what should be untrusted, then you
    are again devising a less secure solution than existed prior. The domain
    is not the Windows 200x security boundary; the forest is the boundary.
    So, you'd have to create a new forest with a minimum of two domain
    controllers for redundancy.

    The other solution might be the DMZ-Domain trusting the corporate domain
    for management. While this makes the domain easier to manage, and is
    recommended by some for deployments of 25 or more systems in the DMZ,
    that does not seem to be your case.

    This IT guru is imposing solutions that are just bad ideas, based on
    thinking from 5-10 years ago. Your solution seems right on track. I like
    Microsoft's solution provided at
    http://www.microsoft.com/technet/prodtechnol/isa/2004/plan/workgroup_ee.mspx
    the best.
     
    Keith I, Sep 20, 2005
    #2

  3. MCSEGURU

    MCSEGURU Guest

    I disagree... While the implementation may be poorly thought out, and more
    of a band-aid to satisfy compliance with some directive, I assume network
    segmentation may be only one goal of the implementation. Logging and
    intrusion detection may be the driving force behind his restrictive
    architecture, which is becoming more and more sought after by IT auditors.

    The benefit of a passive firewall device logging all activity is that it's
    a lot harder to spoof at the passive interface, because the attacker
    doesn't realize it's there. Additionally, should a server be compromised,
    its local logging could be totally lost.

    After all, with today's computer threats, our internal employees present a
    much higher risk than the Internet hacker. The reason is that when we fail
    to enforce all the security we could on our internal servers, we leave
    many vulnerabilities subject to accidental or intentional misuse. This
    includes patches, policies, and account management.

    Architecture and infrastructure security teams can't easily force and
    manage the patches, configuration lockdowns, and other common oversights
    that our application teams, business units, and systems teams are
    implementing. So the direction to segment all internal PCs from the server
    segments, and provide restricted port access based on implementation
    design scopes, allows the security manager to manage, document, and
    control exposed vulnerabilities much better.

    It's what I would do. Now, would I use ISA 2004? Probably not. There are
    firewall technologies that inspect the actual header conversations and
    payload data in addition to the standard port/protocol access, which
    allows security managers to really control what's going on with systems at
    the application layer we all wish we could monitor and log at.
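
    (To make the application layer point concrete, here is a minimal Python
    sketch of the kind of filtering being described: only known verbs pass,
    and request data is checked against a few suspicious patterns. The verb
    list and patterns are illustrative assumptions, not any product's actual
    rule set.)

        import re

        ALLOWED_VERBS = {"GET", "POST", "HEAD"}  # anything else is dropped
        SUSPICIOUS = [
            re.compile(rb"\.\./"),        # path traversal attempts
            re.compile(rb"(?i)<script"),  # naive script injection
            re.compile(rb"%00"),          # embedded null bytes
        ]

        def inspect(verb, path, body):
            """Return True if the request passes, False to drop it."""
            if verb.upper() not in ALLOWED_VERBS:
                return False
            return not any(p.search(path) or p.search(body)
                           for p in SUSPICIOUS)

        if __name__ == "__main__":
            print(inspect("GET", b"/owa/", b""))                # True
            print(inspect("TRACE", b"/owa/", b""))              # False: verb
            print(inspect("POST", b"/sps/../web.config", b""))  # False: traversal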

    My 2 cents.
     
    MCSEGURU, Sep 20, 2005
    #3
  4. Steve Clark [MSFT]

    Steve Clark [MSFT] Guest

    Um, I don't know where you came up with the idea that ISA Server doesn't
    perform application layer inspection and filtering, but you are dead
    wrong: it has been doing that since ISA 2000 debuted a number of years
    ago.

    As to your point about the "internal" threat, this has always been the case.
    In addition to that, the network "edge" is essentially dead as a concept and
    the DMZ is deader than Julius Caesar as a security mechanism. Secure the
    transports, and the conversations to/from hosts. Provide isolation of
    trusted hosts from untrusted hosts. Who cares if untrusted hosts compromise
    other untrusted hosts? Who cares about what "normal" looks like on the
    Internet (or on my large corporate WAN for that matter)? I care about the
    hosts, and the data that resides on them. That is what attackers are
    after: the network is simply a means to an end.

    Authenticate users *and* machines. Clearly articulate and document policies
    in companies and provide for enforcement mechanisms for non-compliance.
    Provide enough detail in logging to be useful forensically. Have admins
    work as users unless they are performing administrative functions. Don't
    give admin privileges to non-admins.
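
    (As one sketch of the logging mantra, in Python: each event records who,
    from where, and when, in a structured line. The field names, hostname,
    and account are made-up examples, not a prescribed schema.)

        import logging

        # Every audit line carries a timestamp plus host and user fields.
        logging.basicConfig(
            filename="audit.log",
            format="%(asctime)s host=%(host)s user=%(user)s event=%(message)s",
            level=logging.INFO,
        )
        audit = logging.getLogger("audit")

        audit.info("OWA logon", extra={"host": "fe01.dmz.example.com",
                                       "user": "CORP\\mbrown"})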

    Many, many more mantras could be placed here.

    My point is the network edge is not the place to have all your security.
    Rather, provide defense in depth and let ISA do what it is designed to do,
    and leverage the remaining layer 1-4 hardware to augment that.
     
    Steve Clark [MSFT], Sep 20, 2005
    #4
  5. Marlon Brown

    Marlon Brown Guest

    Please see responses below.

    "If so, you are negating the value of ISA 2004. ISA 2004 has the"
    -----> According to the "IT guru" vision, any server that is accessible
    from the Internet must be in the perimeter network.
    In case of a server compromise, according to him, the compromised server
    would then be isolated.
    ISA 2004 would still be used. I would configure a perimeter NIC on the
    ISA firewall to provide the layer 7 inspection.
    Currently ISA 2004 is doing its job, but as I described, OWA, SharePoint,
    etc. are in the internal network for simplicity.

    If I put this design in place, that should actually be a separate forest,
    correct?

    "If you are trusting what should be untrusted, then you"
     
    Marlon Brown, Sep 21, 2005
    #5
  6. MCSEGURU

    MCSEGURU Guest

    Agreed that there are many different philosophies and approaches to the
    same goal. I also agree, to a point, that there is a trusted/un-trusted
    separation. I do not agree that there has been "focus" on considering
    internal users as risks. Surely for at least half a decade we've been
    cautious about what our "end-users" could access, but how about the
    application teams? How about managing their change control for their
    custom applications? How about enforcing that they use secure code? That
    is definitely a very new focus in the industry, my friend.

    Still, I maintain that things are only trusted until otherwise identified,
    and as implementation designers and engineers, the idea of a trusted
    anything assumes we know all that is possible. So, depending on the system
    design as a whole, things we most commonly define as trusted really should
    not be considered trusted at all. For the sake of thoroughness, why not
    consider everything untrusted, and manage everything that is explicitly
    allowed through defined passive systems that are managed separately?
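
    (A toy Python sketch of that default-deny stance: a flow is dropped
    unless it matches an explicit allow entry. The zones, hosts, and ports
    are made-up examples.)

        # Default deny: only flows explicitly listed here are permitted.
        ALLOWED_FLOWS = {
            ("lan-clients", "fe01.dmz.example.com", 443),   # OWA over HTTPS
            ("lan-clients", "sps01.dmz.example.com", 443),  # SharePoint
        }

        def permit(src_zone, dst_host, dst_port):
            return (src_zone, dst_host, dst_port) in ALLOWED_FLOWS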

    Now, as for ISA 2004 being a seamless application layer inspection
    security device: while I am quite fond of the product, I won't begin to
    agree that its "out of the box" feature set provides the "layman" with the
    manageability at the application layer that many of its competitors do.
    And as a systems engineer, I usually don't have time to drill down to the
    programming of such filters, which in my opinion are much better left to
    the programmers and regression testing teams at the software provider. I
    mean, I could take a Linux box and make an application layer inspection
    device out of it too, but I have 500 servers and 3,000 desktops to worry
    about. I lack the time to be effective and efficient in building and
    testing features in my own custom-made solutions, when I could easily go
    the direction of a full-featured package and leave the updates and
    effectiveness testing to their manpower resources.

    If you are only offering HTTP/HTTPS, FTP, and SMTP, ISA probably serves
    your purposes well at all layers. But if you are considering using it as a
    protection and logging device for "trusted" servers being segmented from
    LAN clients, where the array of protocols is much larger, the scope in
    which you will be defining your "Layer 7" *** I shiver at the techno talk
    *** filters greatly increases. Many managers will have a lot of homework
    to do in terms of verb usage, malformed strings, etc. So I say, for the
    layman implementation engineer, who will likely fail to dig this deep:
    what's the point of managing all the doorways if you're not going to
    enforce the dress code? The real goal is to prevent the unknown by
    allowing only the known, not just log it. Since ISA 200X's competitors
    pre-package these filter features beyond web traffic, I think there is a
    higher probability the application layer enforcement will actually get
    used.

    I mean, we've all been around. How many really savvy security engineers do
    you find out there who are actually responsible for implementation? More
    likely than not, it's the old consultant and has-been hack.

    Again, everyone has an opinion, and understandably so. I mean no insult to
    anyone; this is just my perspective on the industry.
     
    MCSEGURU, Sep 22, 2005
    #6
