Monthly Archives: January 2011

Microsoft Attack Surface Analyzer & Security Compliance Manager

Microsoft have released the Attack Surface Analyzer as part of their Security Development Lifecycle (SDL) tools. The free product is available for download here.

As well as using the Attack Surface Analyzer tool, I would highly recommend using the Microsoft Security Compliance Manager to assist in applying best-practice group policies in a Windows domain environment.

Understanding the attack surface of a server and applying best-practice security policies within an enterprise environment are vital parts of a security strategy.


Virtualization: Hyper-V Background

Amazing, isn’t it!

Throughout my career I’ve seen so many changes in the way we deploy our systems, but when I first started working with virtualization technologies in 1999, specifically desktop virtualization and then server virtualization in 2004, it was one of the most exciting technology advancements I had seen up to that point in my career. I remember showing a colleague of mine a laptop running Windows XP and Windows Server 2003 as my development platform for infrastructure projects, and he agreed; in his words: “Blimey, that’s the best thing I’ve seen in IT for a long while.”

Moving ahead to the present day, I will be writing articles specifically on the platform that I am certified in: Microsoft Hyper-V and SCVMM.

Microsoft first added the Hyper-V hypervisor to Windows Server 2008 as an add-on delivered via a Microsoft update; it was included in later release builds as standard. Wow! A hypervisor as part of the operating system, with high availability when utilized with clusters. For Microsoft this was a big step, and one the industry had long been waiting for. Windows Server 2008 R2 enhanced the functionality of the hypervisor to allow for additional, most welcome improvements. Some of the core improvements introduced in R2 were:

1) Processor compatibility: the ability to migrate across different processors within the same processor family.

2) Improvements in the performance of dynamic VHDs (a 95% increase in performance).

3) The ability to designate a dedicated management network which is not used for virtual machine network adapters (with the option to share a network adapter if required).

4) Cluster Shared Volumes (CSV): the ability for multiple cluster nodes to access the same logical unit number (LUN) in a SAN or shared disk subsystem – excellent for live migration.

5) Live Migration: the capability to migrate a VM from one cluster node to another on the same cluster without any VM downtime (see the PowerShell sketch after this list).
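To illustrate points 4 and 5, here is a rough sketch using the Failover Clustering PowerShell module that ships with Windows Server 2008 R2. The VM and node names are examples only, not from any particular environment.

    # Load the Failover Clustering cmdlets (Windows Server 2008 R2)
    Import-Module FailoverClusters

    # List the Cluster Shared Volumes that every node in the cluster can access
    Get-ClusterSharedVolume

    # Live-migrate a clustered VM to another node with no downtime
    # ("SQL-VM01" and "HVNODE02" are placeholder names)
    Move-ClusterVirtualMachineRole -Name "SQL-VM01" -Node "HVNODE02"

Run this from an elevated PowerShell session on one of the cluster nodes; the same actions are also available from the Failover Cluster Manager console.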

Licensing

Microsoft allow up to 4 virtual machines to run on Windows Server 2008/R2 Enterprise Edition. Basically, you purchase the hardware and the operating system and can install up to 4 VMs without requiring additional OS licenses. How is that for saving money? Also, if you purchase Windows Server 2008/R2 Datacenter, you can run as many VMs as you like on the platform. What a huge saving!

Maturity

The maturity of the hypervisor used to be something some people were concerned about. Personally I had no concerns, simply because I had used Virtual PC and Virtual Server 2005/R2 for several years before Hyper-V. In fact, during my days as a Senior Consultant, I had a laptop running several VMs for customer demos, personal training, and product development and testing. Prior to deploying solutions on customer sites, I was able to proof-of-concept the environment on a fresh set of VMs to ensure the solution was practical and sound from an architectural perspective. Nowadays there is no concern over hypervisor maturity; it’s without a doubt the way to go to build your datacenter.

What about certified hardware and software?

Microsoft have a very extensive Windows Server Virtualization Validation Program where vendors must conform to Microsoft’s certification rules. Ensure you visit the site when you’re purchasing the hardware for your Hyper-V environment to confirm it is an approved and certified hardware platform.

You can also visit this Microsoft KB to determine which Microsoft products can be installed in a virtualized environment.

I will be writing more articles and adding news flashes on Microsoft Hyper-V in the near future.

Moving to the Cloud: The beginning

Over the years the Internet has brought us many new ways to collaborate: business to consumer, business to business and consumer to consumer. I always wonder where the Internet and technology will take us next, perhaps to another galaxy!

On a serious note, the potential of cloud computing is endless. Companies have built their own data centres for many years, purchasing server after server to ensure they have all the processing capability required to run their businesses. Unfortunately, the investment in servers, networking equipment, software, security, applications, server replacement cycles and so on all comes at a huge operating cost. Personally, I believe that SMBs will still keep their core infrastructure within their own data centre, and review what the cloud offers with caution at the very beginning. It’s a bit like buying BMWs your whole life as your weekday vehicle and then deciding to sell up and buy a Mercedes, or having a Ferrari as your weekend car and then selling it to buy a Porsche. Some people love one or the other, but not both 🙂 Well, that’s me personally anyway (they are all fantastic car manufacturers though).

There are always applications which make more sense to run on the local network rather than hosted elsewhere. Even though the offerings in the cloud could replace a complete data centre, there are many bridges to cross before taking a huge leap of faith and moving to an architecture which is essentially not controlled by the business. And as for us IT folks, well, if we have a server we control the application, updates, releases and so on; it’s all under our control and it’s our sweet spot!

I will be writing articles about cloud computing which cover architecture, security considerations and the business processes that need to be considered before moving to the cloud. There will also be topics based on specific issues, and links to the most current articles from various security and cloud computing conferences to ensure you are kept up to date on recent content.

Watch this space!

The fight to bid for a Cloud Contract, Google vs Microsoft

Google Inc. and Onix Networking Corporation filed a lawsuit against the U.S. Department of the Interior (DOI) with the U.S. Court of Federal Claims on October 29th, 2010. The DOI’s RFQ specified a messaging solution for 88,000 users throughout the agency. From my understanding this should support calendars and collaboration and also meet the required privacy and security standards for government use.

Apparently, before the RFQ was released, Google met with the DOI and described the Google Apps solution. The problem was that the RFQ’s background information stated that capabilities directly related to the BPOS-Federal suite were critical to the success of the solution and were the DOI standard; in other words, the background information in the RFQ directly supported Microsoft’s framework. Microsoft announced the BPOS-Federal suite in February 2010. The suite strengthens the cloud computing security model by providing two-factor authentication and improved encryption technologies, and meets the Federal Information Security Management Act (FISMA) certification requirements.

On the 5th of January 2011 the court ruled in favour of Google’s lawsuit. Google has complained in the past about being shut out of bidding on government contracts where Microsoft solutions were specified.

Google made several attempts to build a relationship with the DOI in the months prior to the RFQ process. I’m not sure a lawsuit is the best way of building a relationship with the DOI, but it certainly raises questions as to why Google’s solutions were discarded during the process. I’m sure this is not the final battle; I’ll keep posting updates on this subject in the future.

Two Factor Authentication: Enterprise Security Architecture

For many years now it’s been standard practice to have a username and a password to access an enterprise service, whether internally in the organization or when accessing resources over the internet via a VPN or web service. Today, the chances are most systems are still being accessed using this simple authentication model. It’s like cars in the 70s: you had one key to unlock and start your vehicle. Before the car manufacturers installed immobilizers and anti-theft electronics with encrypted 2/3-way remote keys, anyone could get into your vehicle without the deadlocks (and without breaking glass), hotwire your car and be away down the road in less than 25 seconds. I like to think of this example as a one-factor authenticated user on an enterprise network: I have a username and a password (my key to unlock the car) and away I go.

Two Factor Authentication (TFA, 2FA) has been around for many years. It is the basics of something you know (a password or a PIN) and something you have (a token). Now we are moving into the realm of an immobilizer with a key fob. That sounds great, but as always there is a cost associated with the value of having one-time passwords provided by tokens, providing the ultimate security architecture for authentication. One-time passwords (OTP) are the best method as a security measure. That said, I’m not discussing Multi-Factor Authentication (MFA), which offers additional security, in this post.
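For those curious how a token actually produces those numbers, here is a minimal PowerShell sketch of an HOTP-style derivation: an HMAC over a moving counter, truncated to six digits. The secret and counter values below are made up for illustration; real tokens use vendor-provisioned seeds and hardened implementations.

    # Shared secret and moving counter - both values here are hypothetical
    $secret  = [System.Text.Encoding]::ASCII.GetBytes('shared-secret-seed')
    $counter = 42

    # Encode the counter as an 8-byte big-endian value
    $counterBytes = [System.BitConverter]::GetBytes([int64]$counter)
    [System.Array]::Reverse($counterBytes)

    # HMAC the counter with the shared secret
    $hmac = New-Object System.Security.Cryptography.HMACSHA1 (,$secret)
    $hash = $hmac.ComputeHash($counterBytes)

    # Dynamic truncation: take 4 bytes at an offset given by the last nibble
    $offset = $hash[$hash.Length - 1] -band 0x0F
    $binary = (($hash[$offset] -band 0x7F) * 16777216) +
              ($hash[$offset + 1] * 65536) +
              ($hash[$offset + 2] * 256) +
               $hash[$offset + 3]

    '{0:D6}' -f ($binary % 1000000)   # the six-digit one-time password

The authentication server performs the same calculation with its own copy of the secret and counter, so a captured one-time password is useless once the counter moves on.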

Over the years the cost of TFA has come down considerably, and it should be considered an inherent part of the security architecture in any organization. For example, if you’re still not convinced, let’s look at the very basics of how someone might try to hack into your system:

1) Well-known passwords: A dictionary attack tries the most common dictionary words. A dictionary database is used as part of the hacking program to attempt to crack the password, and as the power of computers increases, every password in a dictionary can be exhaustively tried within a matter of minutes. Companies usually include security controls to lock out accounts after a number of failed attempts within a set period of time to protect against this type of attack. Controls and policies are also put in place to force complex passwords (passwords with upper-case and lower-case characters, numbers and other non-alphabetic characters) and require them to be changed after a set number of days (a minimal sketch of such a complexity check appears after this list).

2) Key loggers/malware: Even with the above controls in place, malware (malicious software) can record the keystrokes you type and capture screenshots without your knowledge, and regularly send them back to a server over the internet. This is a basic example of how basic enterprise security policies can fail.

3) Try common usernames and passwords: How often do you create test accounts which may not have account-expiration or password-change policies applied?

In the past I have found the following username/password combinations during basic security checks:

  • test/test
  • test/test1
  • test/test123
  • testuser/testuser
  • testuser1/test
  • testuser2/testuser2

Whilst this might seem like a simplistic approach to hacking test accounts, how many of you may have used the above combinations in the past? Even though this post discusses usernames and passwords, the same applies to well-known default usernames and passwords on network devices.

4) Shoulder surfing/internal threat: Someone walks by your desk, watches you log in to your account and sees the keystrokes.

5) Written-down passwords: Can you remember where you wrote down your passwords, because you have so many to remember to access all the corporate systems?
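As a quick illustration of the complexity rules mentioned in point 1, here is a rough PowerShell sketch of the kind of check a password policy enforces. The function name and rules are illustrative only and no substitute for the domain policy itself.

    # Illustrative complexity check: minimum length, upper case, lower case,
    # a digit and a non-alphanumeric character
    function Test-PasswordComplexity {
        param([string]$Password, [int]$MinimumLength = 8)

        ($Password.Length -ge $MinimumLength) -and
        ($Password -cmatch '[A-Z]') -and
        ($Password -cmatch '[a-z]') -and
        ($Password -match '[0-9]') -and
        ($Password -match '[^a-zA-Z0-9]')
    }

    Test-PasswordComplexity 'Summer2011'    # False - no special character
    Test-PasswordComplexity 'S!mmer2011x'   # True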

As well as two-factor authentication, it would be ideal to integrate all the corporate systems with a common security directory, e.g. Microsoft Active Directory Domain Services (ADDS). The TFA model can then be integrated with ADDS so that access to applications is secured using the same model, providing an extra factor of security within the enterprise.
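Once everything hangs off ADDS, auditing for the weak accounts described in point 3 also becomes straightforward. A rough sketch using the ActiveDirectory PowerShell module (available with the Windows Server 2008 R2 remote administration tools) might look like this; the account-name pattern is illustrative only.

    # Load the Active Directory cmdlets
    Import-Module ActiveDirectory

    # User accounts whose passwords never expire - often forgotten test or service accounts
    Search-ADAccount -PasswordNeverExpires -UsersOnly |
        Select-Object Name, SamAccountName, Enabled

    # Obvious test-style account names worth reviewing against the lockout and expiry policies
    Get-ADUser -Filter 'SamAccountName -like "test*"' |
        Select-Object SamAccountName, Enabled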

Which vendor should you choose?

I have listed the most commonly known vendors below:

  • RSA
  • SafeWord
  • PhoneFactor
  • Deepnet Security
  • Quest Defender
  • Entrust IdentityGuard

Don’t forget that the security architecture should apply internally as well as externally; any externally accessible IP address which provides access to the corporate systems should follow the same standards. Now a question comes to mind for a future post: how does this affect cloud computing? That’s another post for a later date, where I will discuss the security standards an enterprise should consider when moving to the cloud.

From Cloud to Cloud: WinWire Technologies

I’ve always wondered how many organizations have considered moving to the cloud, or have moved to the cloud and back again, but I’m still researching to find the right numbers. Out of personal interest I recently came across a case study of a company which moved from one cloud service provider to another: WinWire Technologies had moved from Google Apps to the Microsoft Business Productivity Online Standard Suite. A number of questions are raised when this type of migration occurs, but they usually come down to a few significant and important areas:

  • Integration Issues
  • Performance Issues
  • Service Outage problems
  • Cost
  • Maturity of the service providers solution
  • Contractual Issues
  • Regulations
  • Data Corruption

After reading the case study it was very clear that one of the main issues was integration. Specifically, WinWire wanted integration with Microsoft Outlook and SharePoint, which unfortunately was an issue with Google Apps. I was surprised to read that they had some difficulty with formatting when moving documents from Microsoft Office applications and SharePoint to Gmail, but this would be a useful test during a pilot of the solution.

The case study can be reviewed here.

I will be writing more on considerations a business should take when moving to the cloud in future posts.

Cloud Computing, The Basics

When it comes to ‘Cloud Computing’ there are a few acronyms which are referenced in most articles, and I will explain them in my first official blog post about cloud computing basics. The main acronyms are described below.

SaaS (Software as a Service): A service provided by a vendor which is typically delivered as a packaged solution to multiple customers. The service is usually, but not exclusively, accessed through a web browser. The vendor provides the service over the internet and manages and maintains it, so the customer does not need to worry about upgrades, patching or the security architecture of the service. Examples of SaaS include Facebook, Microsoft Online and Google Apps. SaaS has been around for a number of years.

IaaS (Infrastructure as a Service): A service model where a company outsources the servers, network and storage to a service provider. All the hardware is owned and managed by the service provider and the resources are provided over the internet. The service provider can also provide the operating system, messaging and databases. The company obtaining the services usually pays on a transaction or per-use basis. Examples of IaaS include Amazon Web Services (AWS), Microsoft Hyper-V private cloud, Apple’s iWork.com and IBM’s Blue Cloud services. Utilizing IaaS effectively allows the architecture of a dynamic datacenter which can flex to an organization’s requirements.

PaaS (Platform as a Service): An architectural framework that provides a complete development platform to build and assemble solutions, similar to SaaS but with development tools for customization. The underlying operating system and hardware are still provided by the service provider. PaaS offers the ability to run full, rich applications over the internet as utility computing, and the model is still usually provided on a pay-per-use or subscription basis. Rich internet applications can be developed by businesses utilizing a robust platform with faster application delivery times, and PaaS includes modules which can be integrated to build the applications necessary for the business. Examples of PaaS include Microsoft Azure, Salesforce.com, Rollbase, Google App Engine and BungeeConnect.

CloudStream: An integration template which provides the required nuts and bolts to secure, govern and manage the communication between two services at the Application Programming Interface (API) level. The integration can be enterprise-to-cloud or cloud-to-cloud. The CloudStream captures configuration information for cloud brokers and packages it to connect the endpoints together. CloudStream will become the standard for integration across the cloud and the enterprise. For on-premise systems, appliance/software solutions can help with cloud integration, such as the Vordel Cloud Service Broker, Forum Sentry SOA Security Gateway, Layer 7 CloudSpan products, PingFederate connectors and Microsoft Active Directory Federation Services 2.0.

SharePoint Server 2010 Service Applications

The architecture of SharePoint 2010 has completely changed from its predecessor, Microsoft Office SharePoint Server 2007 (MOSS). In the previous version of SharePoint, a Shared Services Provider (SSP) was used to provide services for the group of web applications associated with that SSP. These included the following services:

  • Office SharePoint Server Search: Necessary to crawl web applications in order to index content into a single index.
  • Excel Services: Used to provide access to Excel workbooks in trusted data connection libraries
  • My Sites: Provide a method for web applications to leverage the My Site functionality
  • Usage Data: A central location to store site usage data
  • Business Data Catalog: A schema for stored business data

A web application in this case would only be associated with one SSP.

In SharePoint Server 2010 the architecture has been redesigned to be far more flexible and scalable. ‘Service applications’ now provide what the shared services above did, with some additional services included in the new version. These are as follows:

 (Read the full TechNet article here: http://technet.microsoft.com/en-us/library/cc560988.aspx)

  • Access Services: Lets users view, edit, and interact with Access 2010 databases in a Web browser.
  • Business Data Connectivity service: Gives access to line-of-business data systems.
  • Excel Services Application: Lets users view and interact with Excel 2010 files in a Web browser.
  • Managed Metadata service: Manages taxonomy hierarchies, keywords and social tagging infrastructure, and publishes content types across site collections.
  • PerformancePoint Service Application: Provides the capabilities of PerformancePoint.
  • Search service: Crawls content, produces index partitions, and serves search queries.
  • Secure Store Service: Provides single sign-on authentication to access multiple applications or services.
  • State service: Provides temporary storage of user session data for SharePoint Server components.
  • Usage and Health Data Collection service: Collects farm-wide usage and health data, and provides the ability to view various usage and health reports.
  • User Profile service: Adds support for My Site Web sites, profile pages, social tagging and other social computing features.
  • Visio Graphics Service: Lets users view and refresh published Visio 2010 diagrams in a Web browser.
  • Web Analytics service: Provides Web service interfaces.
  • Word Automation Services: Performs automated bulk document conversions.
  • Microsoft SharePoint Foundation Subscription Settings Service: Provides multi-tenant functionality for service applications. Tracks subscription IDs and settings for services that are deployed in partitioned mode. Deployed through Windows PowerShell only.
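Since that last service application can only be provisioned from the command line, here is a rough sketch of what that looks like in the SharePoint 2010 Management Shell. The account, application pool and database names are placeholders.

    # Application pool for the service application (account and names are placeholders)
    $account = Get-SPManagedAccount "CONTOSO\sp_services"
    $appPool = New-SPServiceApplicationPool -Name "SettingsServiceAppPool" -Account $account

    # Create the Subscription Settings service application and its database
    $app = New-SPSubscriptionSettingsServiceApplication -ApplicationPool $appPool `
              -Name "Subscription Settings Service" -DatabaseName "SubscriptionSettingsDB"

    # The proxy is what web applications (and other farms) actually connect to
    New-SPSubscriptionSettingsServiceApplicationProxy -ServiceApplication $app

The other service applications can be created from Central Administration as well as from PowerShell.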


The farm services are now far more extensible and scalable and can be shared with other farms via service application proxies.

I’ll be writing more about service applications in future posts.