Simplify secure infrastructure management with System Center

Published: February 25, 2015

Author: Jayson Ferron, Microsoft MVP, CEH, CISSP, CRISC, CVEi, MCITP, MCSE, MCT, and NSA-IAM


IT security is one of the most difficult challenges that every organization must deal with. Security is much broader than configuration management alone, but you can make the goal of maintaining a secure, well-managed infrastructure easier to achieve by standardizing, and therefore simplifying, your systems. Knowing what programs are installed, how they are configured, and how your systems are built helps you get to that goal.

In this article I will take the 20,000-foot view of how you can accomplish this task by using the Microsoft System Center suite of tools. I will not go into step-by-step detail; instead I will focus on the tools that help you build standard images and so reduce the risks that come with manually building and deploying systems, and I will point to documentation where you can learn more.

As an IT administrator and security professional, there are many questions I ask about security, but for this article I will focus on the following:

  • What systems are you using?
  • Do you have policies and procedures that you follow?
  • How do you verify and confirm that policies are being followed?
  • What tools do you use to support the automation of your processes?
  • How do you test configurations?
  • And my favorite: what services are running on what computers?

The reason I ask these questions is to understand how well documented a company’s IT structure is. Often, when I ask questions like, “How are your servers and desktops configured?” or “Do you have a document that shows what ports, services, and processes are running on your servers or workstations?” the answer I get 90% of the time is, “No.”

This becomes the major focus of IT security and I’ll explain it this way. If you cannot tell me what is running in your environment, then how do you know if I added a new application to your network? If you do not know what services, applications, or ports are in use, how do you know what has been changed? This lack of knowledge can allow a hacker to add applications and remote access tools, and gain access to your company data.
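
If you cannot say today what is running in your environment, a first pass can be surprisingly simple. The following PowerShell sketch is my own illustration, not a System Center feature, and the file path is a placeholder; it records the running services and listening TCP ports on a computer so that you have something concrete to compare against later (Get-NetTCPConnection requires Windows Server 2012 or later).

    # Capture a simple service-and-port snapshot for this computer.
    $snapshot = [pscustomobject]@{
        Computer = $env:COMPUTERNAME
        Date     = Get-Date
        Services = Get-Service | Where-Object Status -eq 'Running' |
                   Select-Object -ExpandProperty Name | Sort-Object
        Ports    = Get-NetTCPConnection -State Listen |
                   Select-Object -ExpandProperty LocalPort -Unique | Sort-Object
    }

    # Persist the snapshot so later scans can be compared against it.
    # The C:\Baselines folder is assumed to exist.
    $snapshot | Export-Clixml -Path "C:\Baselines\$($env:COMPUTERNAME).xml"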

Create a baseline

A baseline is a known configuration that you can test against. Most organizations have a collection of software and settings that should be present on all computers. This article shows you techniques for easily creating, deploying, and maintaining a standardized configuration, which could include operating system patches, applications, security policy settings, antivirus software, and more. If you build an image for a workstation or server, that image becomes your baseline, or master image. You then have a starting point for all future workstations or servers, and as you add more software you can create additional baselines.

By creating a baseline, or master image, you can create multiple new servers or workstations that match your documented build guides. This lets you stamp out identical configurations for systems of the same type, which assists with documentation, testing, and patch management, and during audits it helps verify that systems are built to specification. We are going to use the System Center suite to accomplish this.
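
Building on the hypothetical snapshot sketch above, a stored baseline makes drift detection a short script: compare what is running now against what the master image said should be running.

    # Load the stored baseline and gather the current running services.
    $baseline = Import-Clixml -Path "C:\Baselines\$($env:COMPUTERNAME).xml"
    $current  = Get-Service | Where-Object Status -eq 'Running' |
                Select-Object -ExpandProperty Name | Sort-Object

    # '=>' marks services running now that were not in the baseline;
    # '<=' marks baseline services that are no longer running.
    Compare-Object -ReferenceObject $baseline.Services -DifferenceObject $current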

What’s included with System Center

So let’s first review the System Center suite of products and the primary functionality of each product.

System Center Configuration Manager: Configuration Manager lets you perform tasks such as the following:

  • Deploy operating systems, software applications, and software updates
  • Track and remediate computers for compliance settings
  • Track hardware and software inventory
  • Remotely administer computers

System Center Orchestrator: Orchestrator is a workflow management solution for the data center. Orchestrator lets you automate the creation, monitoring, and deployment of resources in your environment.

System Center Virtual Machine Manager: Virtual Machine Manager (VMM) is a management solution for the virtualized data center that lets you configure and manage your virtualization host, networking, and storage resources in order to create and deploy virtual machines and services to private clouds that you have created.

System Center App Controller: App Controller provides a common self-service experience that can help you easily configure, deploy, and manage virtual machines and services across private and public clouds.

System Center Operations Manager: Operations Manager provides infrastructure monitoring that is flexible and cost-effective, helps ensure the predictable performance and availability of vital applications, and offers comprehensive monitoring for your data center and cloud, both private and public.

System Center Endpoint Protection (included with Configuration Manager): Endpoint Protection provides antimalware and security management for desktops and servers, built on the Configuration Manager infrastructure. It gives you a single, integrated platform for managing security policies, malware protection, and software deployment across your endpoints.

System Center Service Manager: Service Manager provides an integrated platform for automating and adapting your organization’s IT service management best practices, such as those found in Microsoft Operations Framework (MOF) and Information Technology Infrastructure Library (ITIL). It provides built-in processes for incident and problem resolution, change control, and asset lifecycle management.

System Center Data Protection Manager: Data Protection Manager (DPM) backs up servers, client computers, Microsoft workloads, and system state, and supports bare metal recovery (BMR).

Although the full System Center suite is helpful in reducing errors and controlling your environment by the use of automation, the tips in this article focus on Configuration Manager, VMM, and Service Manager.

Using Configuration Manager

To begin the process of building an image, you must first write down everything that has to be included. After you have your checklist you could do it all manually, but with the Operating System Deployment (OSD) functionality in Configuration Manager you can create a series of deployment images to push out to new computers, ensuring that each one, whether physical or virtual, meets the same standards and follows your best practices.

Think about this: if we create all web servers from a master image, then all web servers should have the same ports, services, and applications installed, and we can then watch for changes.
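
For example, a quick consistency check across web servers built from the same image might look like the sketch below (the server names are placeholders; Test-NetConnection requires Windows Server 2012 R2 or later).

    # Confirm that every web server built from the master image answers on port 80.
    'web01', 'web02', 'web03' | ForEach-Object {
        $result = Test-NetConnection -ComputerName $_ -Port 80 -WarningAction SilentlyContinue
        '{0}: port 80 open = {1}' -f $_, $result.TcpTestSucceeded
    }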

Having established the need for enterprise baselines, let's discuss the process. How can you create, manage, and validate configurations through imaging, patching, and control using the System Center components?

Start by writing down everything that has to be included (operating system, antivirus, applications, patches, policies, backup agent, monitoring agent). For example, let’s create a Windows Server 2012 R2 computer with web server and Hyper-V roles, the Data Protection Manager Client, Endpoint Protection Client, and Operations Manager management packs.

You now have your checklist, and the image built from it becomes the basic image for all new web servers. You could build these servers manually, but then each build might be configured slightly differently, and human error would remain a constant factor.
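
To see why scripting beats hand configuration, here is what just the role-installation portion of that checklist looks like in PowerShell on Windows Server 2012 R2 (the agent and management pack installs are omitted):

    # Install the web server and Hyper-V roles from the build checklist.
    # Hyper-V requires a restart before it is usable.
    Install-WindowsFeature -Name Web-Server, Hyper-V -IncludeManagementTools -Restart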

To build desktop images you can use the Windows Assessment and Deployment Kit (Windows ADK) to create the image. If you download the Microsoft Deployment Toolkit (MDT), you can instead use a graphical tool to create standardized images; the MDT documentation provides step-by-step directions.
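
Whichever tool you choose, the underlying flow is the standard generalize-and-capture sequence. A minimal sketch, with illustrative paths and image names, looks like this:

    # On the reference computer: generalize the installation so it can be imaged.
    & "$env:WINDIR\System32\Sysprep\sysprep.exe" /generalize /oobe /shutdown

    # Then, booted into Windows PE, capture the system partition to a WIM file.
    # Drive letters vary by environment.
    dism /Capture-Image /ImageFile:D:\Captures\WebServerBase.wim /CaptureDir:C:\ /Name:"Web server baseline"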

You can also use the Operating System Deployment (OSD) functionality in Configuration Manager; see the Configuration Manager documentation for more information about OSD.

Now you have a series of deployment images that you can push to new servers or workstations. Using Configuration Manager, you can deploy your master image to every new web server and know that each new computer, physical or virtual, meets the same standards and has the same configuration.

Using a third-party scanning tool, you can record which ports are open and capture the results in a baseline document. You can also use the Configuration Manager inventory features to notify you of any software installed on a computer that was not pushed by IT. Then create a document for each server, in Service Manager or another tool, that records any changes or updates to your configuration. This becomes your audit trail: a resource you can check for approved changes and use to document any issues.
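
As a lightweight stand-in for those inventory reports, the hypothetical sketch below reads the installed-programs list from the registry and flags anything not on an approved list you maintain (the approved-list file is illustrative):

    # The approved list would come from your build documentation or Service Manager.
    $approved = Get-Content -Path 'C:\Baselines\ApprovedSoftware.txt'

    # Enumerate installed software from the uninstall registry keys (64- and 32-bit).
    $installed = Get-ItemProperty 'HKLM:\Software\Microsoft\Windows\CurrentVersion\Uninstall\*',
                                  'HKLM:\Software\Wow6432Node\Microsoft\Windows\CurrentVersion\Uninstall\*' -ErrorAction SilentlyContinue |
                 Where-Object DisplayName |
                 Select-Object -ExpandProperty DisplayName -Unique

    # Anything installed but not approved deserves investigation.
    $installed | Where-Object { $_ -notin $approved }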

After you have baseline images that you can push to bare metal or virtual machines, you can add configurations or software by using Group Policy or packages hosted in Configuration Manager. A nice addition to your security portfolio that you may not be aware of is the Windows PowerShell Desired State Configuration (DSC) tool set; you can learn more about DSC in the Windows PowerShell documentation.

DSC can do many things, but for our purposes it does the following:

  • Deploy new software
  • Take a baseline, and then fix configurations that have drifted away from the desired state
  • Discover the actual configuration state on a given server

In addition, you can create custom resources to configure the state of any application or system setting. Once again, be sure to document the newly configured server in Service Manager.
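
To make this concrete, here is a minimal DSC sketch for the web server example used earlier. It ensures the IIS role is present and the web service is running; the node name and output path are illustrative.

    Configuration WebServerBaseline {
        Import-DscResource -ModuleName PSDesiredStateConfiguration

        Node 'localhost' {
            # Ensure the IIS role from the build checklist is installed.
            WindowsFeature IIS {
                Name   = 'Web-Server'
                Ensure = 'Present'
            }

            # Keep the web service running and set to start automatically.
            Service W3SVC {
                Name        = 'W3SVC'
                State       = 'Running'
                StartupType = 'Automatic'
            }
        }
    }

    # Compile the configuration to a MOF file and apply it.
    WebServerBaseline -OutputPath 'C:\DSC\WebServerBaseline'
    Start-DscConfiguration -Path 'C:\DSC\WebServerBaseline' -Wait -Verbose

    # Returns False when the node has drifted from the desired state.
    Test-DscConfiguration

Running Test-DscConfiguration on a schedule gives you the drift report described in the second bullet above.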

Next steps

So, at this point, you have a functional, documented baseline master image for your initial server installation. But things change over time, so how do you handle issues like security patches and updates?

We all know that we should test before putting anything into production, but how? Nobody wants to cause a "Resume Generating Event" by pushing a change that hurts the company.

Before you deploy patches or updates to your servers you should perform the proper tests. By using VMM you can make a copy of your production environment and create an isolated network on your Hyper-V infrastructure. You can then test updates and patches without any danger to your production environment.
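
One way to stand up that isolated copy is sketched below using the plain Hyper-V cmdlets rather than the VMM console. The VM name, paths, and switch name are placeholders, and on Windows Server 2012 R2 the exported configuration file has an .xml extension.

    # Create a private virtual switch; VMs attached to it cannot reach production.
    New-VMSwitch -Name 'PatchTest-Isolated' -SwitchType Private

    # Export a production VM and re-import it as an independent copy.
    Export-VM -Name 'WEB01' -Path 'D:\Exports'
    $config = Get-ChildItem 'D:\Exports\WEB01\Virtual Machines\*.xml' | Select-Object -First 1
    $testVm = Import-VM -Path $config.FullName -Copy -GenerateNewId `
                        -VhdDestinationPath 'D:\TestLab\WEB01' -VirtualMachinePath 'D:\TestLab\WEB01'

    # Attach the copy to the isolated switch before starting it.
    $testVm | Get-VMNetworkAdapter | Connect-VMNetworkAdapter -SwitchName 'PatchTest-Isolated'
    Start-VM -VM $testVm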

As an administrator, you can control when and where to deploy a patch or update by using Configuration Manager. By creating separate development, test, and production organizational units (OUs), you can test and validate patches and updates before they reach production. Only after you verify that the updates work as expected should you approve them for your production systems. Then update both the production computers and the master image so that all new servers have the updates applied, and remember to document that change to the image in Service Manager.

In addition to the tools discussed in this article, there are third-party tools that watch files, folders, and registry changes; these can further support security and add real-time baselines for the applications and servers that require extra vigilance. Such tools can report, and if allowed, revert any unauthorized changes.

In this article I have discussed how to create baseline images and how to test, patch, and document the changes made to your systems. If your systems are not documented, it is nearly impossible to tell when something has changed; and even if you do detect a change, without proper monitoring and auditing you cannot know who made it or whether it was authorized. By using baseline images created with Configuration Manager, and Service Manager to document changes, you are better able to secure your IT infrastructure and reduce security risks.

About the Author

Jayson Ferron (CEH, CISSP, CRISC, CVEi, MCITP, MCSE, MCT, Microsoft MVP, NSA-IAM) is a Principal at Interactive Security Training. Jay's work includes e-commerce, VPN, security audit, workflow process, training, and Windows and Linux enterprise designs. He specializes in operating systems, deployment, virtualization, security, cloud, and high-performance computing.

Jay has authored technical materials such as Architecting Microsoft Server Virtualization Solutions with Hyper-V and System Center Virtual Machine Manager, and has presented at numerous events including user groups, computer trade shows, DoD and federal conferences, ISPCON, TechEd, and WPC. Jay is a global board member of GITCA, past president of ISACA-CT, past president of APCUG, and a Microsoft MVP. You can follow his blog at http://blog.mir.net.

Correction: When we originally published this article, it was inadvertently attributed to the incorrect author. Jayson Ferron, a respected member of our MVP community, is the correct author.
