Microsoft Attack Surface Analyzer & Security Compliance Manager

Microsoft have released an Attack Surface Analyzer as part of their Security Development Lifecycle (SDL) tools. The free product is available for download here.

As well as using the Attack Surface Analyzer tool, I would highly recommend the Microsoft Security Compliance Manager to assist in applying best-practice Group Policies in a Windows domain environment.

Understanding the attack surface of a server and applying best-practice security policies within an enterprise environment are vital parts of a security strategy.

Virtualization: Hyper-V Background

Amazing, isn’t it!

Throughout my career I’ve seen so many changes in the way we deploy our systems, but when I first started working with virtualization technologies in 1999, specifically desktop virtualization and then server virtualization in 2004, it was one of the most exciting technology advancements I had seen up to that point in my career. I remember showing a colleague of mine a laptop running Windows XP and Windows Server 2003 as my development platform for infrastructure projects, and he agreed; in his words, “Blimey, that’s the best thing I’ve seen in IT for a long while”.

Moving ahead to the present day, I will be writing articles specifically on the platform I am certified for: Microsoft Hyper-V and SCVMM.

Microsoft first added the Hyper-V hypervisor to Windows Server 2008 as an add-on delivered via Microsoft Update; it was included in later release builds as standard. Wow! A hypervisor as part of the operating system, with high availability when used with clusters. For Microsoft this was a big step, and one the industry had long awaited. Windows Server 2008 R2 enhanced the functionality of the hypervisor with additional, most welcome improvements. Some of the core improvements introduced in R2 were:

1) Processor compatibility mode: migration across different processor versions within the same processor family.

2) Improved performance of dynamic VHDs (a 95% increase in performance).

3) The ability to dedicate a management network that is not used by virtual machine network adapters (with the option to share a network adapter if required).

4) Cluster Shared Volumes (CSV): the ability for multiple cluster nodes to access the same logical unit number (LUN) on a SAN or shared disk subsystem – excellent for live migration.

5) Live Migration: the capability to move a running VM from one cluster node to another within the same cluster without any VM downtime.
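
To give a feel for managing the platform programmatically, here is a minimal sketch that lists the virtual machines on a Hyper-V 2008/R2 host and their power state through the hypervisor’s WMI provider. It assumes Python with the third-party wmi package (which requires pywin32) running on the host itself; the namespace and state codes are those of the v1 root\virtualization provider:

```python
# Minimal sketch: enumerate Hyper-V VMs via the WMI virtualization
# provider on a Windows Server 2008/R2 host.
import wmi

conn = wmi.WMI(namespace=r"root\virtualization")

# Msvm_ComputerSystem covers the host itself and every VM; filtering on
# Caption="Virtual Machine" keeps only the guests.
STATES = {2: "Running", 3: "Off", 32768: "Paused", 32769: "Saved"}

for vm in conn.Msvm_ComputerSystem(Caption="Virtual Machine"):
    state = STATES.get(vm.EnabledState, "Other (%s)" % vm.EnabledState)
    print("%-30s %s" % (vm.ElementName, state))
```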

Licensing

Microsoft allow up to 4 virtual machines to run on a single Windows Server 2008/R2 Enterprise Edition license. Basically, you purchase the hardware and the operating system and can install up to 4 VMs without requiring an additional OS license. How is that for saving money? Also, if you purchase Windows Server 2008/R2 Datacenter, you can run as many VMs as you like on the platform. What a huge saving!
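
To make the licensing arithmetic concrete, here is a back-of-the-envelope sketch (the VM counts used are hypothetical):

```python
# Each Windows Server 2008/R2 Enterprise license covers up to four
# virtual instances, and licenses can be stacked on one host;
# Datacenter allows unlimited VMs on the licensed hardware.
import math

def enterprise_licenses_needed(vm_count):
    return math.ceil(vm_count / 4)

for vms in (4, 10, 25):
    print("%2d VMs -> %d Enterprise license(s)" % (vms, enterprise_licenses_needed(vms)))
```

Past a certain VM density, a single Datacenter license works out cheaper than stacking Enterprise licenses.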

Maturity

The maturity of the hypervisor used to be something some people were concerned about. Personally I had no concerns, simply because I had used Virtual PC and Virtual Server 2005/R2 for several years before Hyper-V. In fact, during my days as a Senior Consultant, I had a laptop running several VMs for customer demos, personal training, and product development and testing. Prior to deploying solutions on customer sites, I was able to build a proof of concept (POC) of the environment on a fresh set of VMs to ensure the solution was practical and sound from an architectural perspective. Nowadays there is no concern over hypervisor maturity; it’s without a doubt the way to go when building your datacenter.

What about certified hardware and software?

Microsoft have a very extensive Windows Server Virtualization Validation Program in which vendors must conform to Microsoft’s certification rules. Make sure you visit the site when you’re purchasing the hardware for your Hyper-V environment, to ensure it is an approved and certified hardware platform.

You can also visit this Microsoft KB to determine which Microsoft products can be installed in a virtualized environment.

I will be writing more articles and adding news flashes on Microsoft Hyper-V in the near future.

Moving to the Cloud: The Beginning

Over the years the Internet has brought us many new ways to collaborate: business to consumer, business to business and consumer to consumer. I always wonder where the Internet and technology will take us next, perhaps to another galaxy!

On a serious note, the potential of cloud computing is endless. Companies have built their own data centres for many years, purchasing server after server to ensure they have all the processing capability required to run their businesses. Unfortunately, the investment in servers, networking equipment, software, security, applications, server replacement cycles and so on all comes at a huge operating cost. Personally, I believe that SMBs will still keep their core infrastructure within their own data centre and review what the cloud offers with caution at the very beginning. It’s a bit like buying BMWs your whole life as your weekday vehicle and then deciding to sell up and buy a Mercedes, or having a Ferrari as your weekend car and then selling it to buy a Porsche. Some people love one or the other, but not both 🙂 Well, that’s me personally anyway (they are all fantastic car manufacturers though).

There are always applications which make more sense to run on the local network rather than hosted elsewhere. Even though the offerings in the cloud could replace a complete data centre, there are many bridges to cross before taking a huge leap of faith and moving to an architecture which is essentially not controlled by the business. And as for us IT folks, well, if we have a server, we control the application, updates, releases and so on. It’s all under our control, and it’s our sweet spot!

I will be writing articles about cloud computing covering architecture, security considerations and the business processes that need to be considered before moving to the cloud. There will also be topics based on specific issues, and links to the most current articles from various security and cloud computing conferences, to ensure you are kept up to date on recent content.

Watch this space!

The Fight to Bid for a Cloud Contract: Google vs Microsoft

Google Inc & Onix Networking Corporation filed a lawsuit against the U.S. Department of the Interior (DOI) with the U.S. Court of Federal Claims on October 29th, 2010. In its RFQ the DOI specified the requirements for a messaging solution for 88,000 users throughout the agency. From my understanding, this had to support calendars and collaboration and also meet the required privacy and security standards for government use.

Apparently, before the RFQ was released, Google met with the DOI and described the Google Apps solution. The problem was that the RFQ’s background information stated that capabilities directly tied to the BPOS-Federal suite were critical to the success of the solution and the DOI standard; in other words, the background information in the RFQ directly supported Microsoft’s framework. Microsoft announced the BPOS-Federal suite in February 2010. The suite improves on the standard cloud computing security model by providing two-factor authentication and improved encryption technologies, and it meets Federal Information Security Management Act (FISMA) certification requirements.

On the 5th of January 2011 the court ruled in favour of Google’s lawsuit. Google has complained in the past about being shut out of bidding on government contracts.

Google did make several attempts to build a relationship with the DOI in the months prior to the RFQ process. I’m not sure a lawsuit is the best way of building a relationship with the DOI, but this certainly raises questions as to why Google’s solutions were discarded during the process. I’m sure this is not the final battle; I’ll post updates on this subject in the future.

Pen Testing/Security Tool: BackTrack 4

I wanted to share this free download. The BackTrack project has been around for many years, and BackTrack 4 R2 includes many new features and tools.

If you’re interested in security and using a free tool for pen testing, try this one as part of your security toolkit.

Download BackTrack 4 here

Two Factor Authentication: Enterprise Security Architecture

For many years now it’s been standard practice to use a username and a password to access an enterprise service, whether internally in the organization or when accessing resources over the internet via a VPN or web service. Today, the chances are most systems are still accessed using this simple authentication model. It’s like cars in the ’70s: you had one key to unlock your car and start your vehicle. Before the manufacturers installed immobilizers and anti-theft electronics with encrypted two- and three-way remote keys, anyone could get into your vehicle without the deadlocks (and without breaking glass), hotwire your car and be away down the road in less than 25 seconds. I like to think of this example as one-factor authentication on an enterprise network: I have a username and a password (my key to unlock the car) and away I go.

Two-Factor Authentication (TFA, 2FA) has been around for many years. It boils down to something you know (a password or a PIN) and something you have (a token). Now we are moving into the realm of the immobilizer with a key fob. That sounds great, but as always there is a cost associated with the value of one-time passwords provided by tokens, which give you a much stronger authentication architecture. One-time passwords (OTP) are the best method as a security measure. That said, I’m not discussing Multi-Factor Authentication (MFA) in this post, which offers additional security.
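
To show what a token is actually doing, here is a minimal sketch of a time-based one-time password (TOTP, the scheme used by many OTP tokens) using only Python’s standard library. The shared secret is a made-up example value; real deployments add secure provisioning, clock-drift handling and server-side verification:

```python
# Minimal TOTP (RFC 6238-style) sketch: HMAC-SHA1 over a 30-second
# time counter, dynamically truncated to a 6-digit code.
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, digits=6, period=30):
    key = base64.b32decode(secret_b32)
    counter = int(time.time()) // period                # 30-second time step
    msg = struct.pack(">Q", counter)                    # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                          # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

print(totp("JBSWY3DPEHPK3PXP"))  # example secret; the code changes every 30 s
```

Both sides compute the same code from the shared secret and the current time, so the server can verify what the token displays without a static password crossing the wire.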

Over the years the cost of TFA has come down considerably, and it should be considered an inherent part of the security architecture of any organization. If you’re still not convinced, let’s look at the very basics of how someone might try to hack into your system:

1) Well-known passwords: A dictionary attack tries the most common dictionary words, using a dictionary database as part of the cracking program to attempt to recover the password. As the power of computers increases, every password in a dictionary can be exhaustively tried within a matter of minutes. Companies usually include security controls that lock out accounts after a number of failed attempts within a set period of time to protect against this type of attack. Controls and policies are also put in place to force complex passwords (passwords with upper-case and lower-case characters, numbers and other non-alphabetic characters) and to require them to be changed after a set number of days. A short code sketch of the dictionary idea follows this list.

2) Key loggers/malware: Even with the above controls in place, malware (malicious software) can record the keystrokes you type and take screenshots without your knowledge, regularly sending them to a server over the internet. This is a simple example of how standard enterprise security policies can fail.

3) Common usernames and passwords: How often do you create test accounts, which may not have policies applied for account expiration or password changes?

In the past I have found the following username/password combinations during basic security checks:

  • test/test
  • test/test1
  • test/test123
  • testuser/testuser
  • testuser1/test
  • testuser2/testuser2

Whilst this might seem like a simplistic approach to hacking test accounts, how many of you may have used the above combinations in the past? And although this post discusses user accounts, the same applies to well-known default usernames and passwords on network devices; combinations like these are included in the sketch after this list.

4) Shoulder surfing/internal threats: Someone walks by your desk, watches you log in to your account and sees the keystrokes.

5) Written-down passwords: Can you remember where you wrote down your passwords, because you have so many to remember across all the corporate systems?
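
To make points 1 and 3 concrete, here is a minimal sketch of the core of a dictionary attack, assuming a toy wordlist (seeded with the test passwords above) and a made-up “leaked” hash; real cracking tools do exactly this, just with enormous dictionaries at millions of guesses per second:

```python
# Core of a dictionary attack: hash each candidate password and compare
# it against a stolen hash. Wordlist and "leaked" hash are toy examples.
import hashlib

wordlist = ["password", "letmein", "test", "test1", "test123", "testuser"]
leaked_hash = hashlib.sha256(b"test123").hexdigest()  # pretend this was stolen

for candidate in wordlist:
    if hashlib.sha256(candidate.encode()).hexdigest() == leaked_hash:
        print("Password recovered:", candidate)
        break
```

Note that account lockout policies only protect the online login path; once a password hash has leaked, nothing slows the attacker down except password strength.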

As well as two-factor authentication, it would be ideal to integrate all the corporate systems with a common security directory, e.g. Microsoft Active Directory Domain Services (ADDS). The TFA model can then be integrated with ADDS to ensure access to applications is secured using the same model, thus providing an extra factor of security within the enterprise.
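
As a rough illustration of validating the “something you know” factor against ADDS, here is a minimal sketch using the third-party Python ldap3 package to attempt an authenticated LDAP bind. The server name, domain and credentials are hypothetical placeholders, and in a real TFA deployment the token/OTP verification would sit alongside this check:

```python
# Minimal sketch: verify a username/password against Active Directory
# by attempting an authenticated LDAP bind with the ldap3 package.
# The host and domain below are hypothetical placeholders.
from ldap3 import Connection, NTLM, Server

def ad_password_valid(username, password,
                      domain="CORP", host="dc1.corp.example.com"):
    server = Server(host)
    conn = Connection(server, user="%s\\%s" % (domain, username),
                      password=password, authentication=NTLM)
    ok = conn.bind()   # a successful bind means AD accepted the credentials
    conn.unbind()
    return ok

print(ad_password_valid("testuser", "Example-P@ssw0rd"))
```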

Which vendor should you choose?

I have listed the most commonly known vendors below:

  • RSA
  • SafeWord
  • PhoneFactor
  • Deepnet Security
  • Quest Defender
  • Entrust IdentityGuard

Don’t forget that the security architecture should apply internally as well as externally: any externally accessible IP address which provides access to corporate systems should follow the same standards. A question for a future post comes to mind: how does this affect cloud computing? I will be discussing the security standards an enterprise should consider when moving to the cloud.

From Cloud to Cloud: WinWire Technologies

I’ve always wondered how many organizations have considered moving to the cloud, or moved to the cloud and back again, but I’m still researching to find the right numbers. Out of personal interest, I recently came across a case study in which a company moved from one cloud service provider to another: WinWire Technologies had moved from Google Apps to the Microsoft Business Productivity Online Standard Suite. A number of questions are raised when this type of migration occurs, but they usually come down to a few significant areas:

  • Integration Issues
  • Performance Issues
  • Service Outage problems
  • Cost
  • Maturity of the service providers solution
  • Contractual Issues
  • Regulations
  • Data Corruption

After reading the case study it was very clear that one of the main issues was integration. Specifically, WinWire wanted integration with Microsoft Outlook and SharePoint, which unfortunately was an issue with Google Apps. I was surprised to read that they had some difficulty with formatting when moving documents from Microsoft Office applications and SharePoint to Gmail, but this would be a useful test during a pilot of the solution.

The case study can be reviewed here

I will be writing more on considerations a business should take when moving to the cloud in future posts.