MOSS MVP

I've moved my blog to http://blog.falchionconsulting.com! Please update your links. This blog is no longer in use; you can find all posts and comments at my new blog. I will no longer be posting to this site and comments have been disabled.

Monday, November 15, 2010

Real World SharePoint 2010 Released!

Today I got the great news that the Real World SharePoint 2010 book has finally been released! This is my first-ever publication and I’m truly excited to see it finally available. To have my first publication be this particular book, where so many stellar folks have contributed, is just an incredible honor for me. I was one of the few contributors who had a very early delivery date for their chapters, and as such I finished my chapter in December of 2009, so I’ve been waiting a long time (almost a full year) for this sucker to finally get out. Now I just have to get my second publication finished :)

Thanks again to everyone who contributed or otherwise helped organize and make this book happen and a special thanks to all those who choose to purchase it :)

Monday, November 8, 2010

November 2010 SharePoint Connections Decks and Code

The past few weeks have been very busy for me as I’ve just finished presenting at two different conferences, and since I don’t typically present at conferences this was kind of a big deal for me. You can get my slides for SPTechCon, the conference I spoke at in late October, from my earlier post, and if you’re looking for the slides and code from SharePoint Connections, which I just returned from, you can find them below. I know I promised several folks that I’d have these available on Friday, but I decided to take some long-overdue family time instead.

November 2010 SharePoint Connections Slide Decks:

November 2010 SharePoint Connections Demo Code:

Now that I’m done with conferences for the year I guess it’s time to get back to writing my book :)

-Gary

Friday, October 22, 2010

SPTechCon Decks

I just wrapped up a couple of quick days at SPTechCon in Boston and had a great time! I don’t make it to too many conferences so it was great to see old friends again, and of course I loved visiting Boston; once upon a time I lived not too far from the city and worked in the heart of its financial district for quite some time.

While at the conference I presented two talks – as promised you can download them below:

In a couple of weeks I’ll be at SharePoint Connections in Las Vegas – if you’re there please stop by one of my sessions and say hi!

-Gary

Monday, October 4, 2010

Service Accounts and Managed Service Accounts in SharePoint 2010

With SharePoint 2010 we now have the ability to let SharePoint manage various service accounts, thus forgoing the need for IT administrators to manually manage password changes. This new feature is a great benefit to SharePoint administrators and security-conscious admins in general, as it allows us to easily enforce corporate security policies by changing these passwords on a schedule. The administrators don’t even know what the password is, so the likelihood of a compromise due to a disgruntled admin, though not eliminated, is somewhat reduced.

But the introduction of this new feature isn’t all good. The complication comes from the fact that SharePoint 2010 doesn’t implement this capability consistently, so an account that is configured as a Managed Service Account and set to have its password changed automatically could also be used in certain places that don’t understand the managed account concept. When the managed account password is changed, any feature that uses that account and only knows the username and password (that is, it does not use the managed account details) will effectively be broken. As an example, if you configure the Enterprise Search Service to use a managed account whose password is scheduled to be changed every 30 days, and you use that same account for the content crawl account, then when that password is changed the content crawl will cease to function as it will be unable to authenticate the account. It’s important to note, however, that this issue only comes to light when you configure the managed account to have its password changed automatically.
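
As a quick sanity check you can list which of your managed accounts have automatic password changes enabled. This is just a minimal sketch, assuming you’re running it from the SharePoint 2010 Management Shell; it relies on the AutomaticChange and PasswordExpiration properties exposed by the managed account objects (verify the property names in your environment):

# Show each managed account and whether SharePoint will change its password automatically
Get-SPManagedAccount | Select-Object Username, AutomaticChange, PasswordExpiration | Format-Table -AutoSize

Any account that shows up here with automatic changes enabled, and that is also used in one of the non-managed spots listed below, is a candidate for the broken-crawl scenario described above.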

So what things can be managed accounts and what cannot? The following lists what I’ve come across so far (if I’ve missed anything please leave a comment so I can update these lists):

Managed Service Accounts:

  • All Service Application Pool Accounts
    • Access Service Application
    • BCS Service Application
    • Excel Services Service Application
    • Metadata Service Application
    • PerformancePoint Service Application
    • Enterprise Search Service Application
    • Secure Store Service Application
    • Subscription Settings Service Application
    • User Profile Service Application
    • Visio Services Service Application
    • Web Analytics Service Application
    • Word Automation Service Application
    • Word Viewing Service Application
    • PowerPoint Viewing Service Application
    • Security Token Service Application
  • All Content Web Application Pools
  • Service Instances
    • Claims to Windows Token Service
    • Document Conversion Launcher Service
    • Document Conversion Load Balancer Service
    • Microsoft SharePoint Foundation Sandboxed Code Service
    • SharePoint Foundation Help Search
    • SharePoint Server Search (Enterprise Search)
    • Web Analytics Data Processing Service

Service Accounts (should not be managed):

  • Search Crawl Accounts
    • For Foundation Search and Server (Enterprise) Search
  • Unattended User Accounts
    • Excel Services Service Application
    • Visio Services Service Application
    • PerformancePoint Service Application
    • (in general, any Secure Store application credentials)
  • Object Cache Portal Accounts
    • Super User Account
    • Super Reader Account
  • User Profile
    • Synchronization Service Account (listed incorrectly on the FarmCredentialManagement.aspx page)
    • Synchronization Connection Account
  • Server Search Custom Crawl Rule Accounts
    • Any crawl rule that specifies an account other than the default crawl account

Again, these are just the accounts that I’ve personally bumped up against, so this may not be a complete listing.

Viewing and Creating Managed Accounts

To see the current list of Managed Service Accounts using Central Admin go to Security –> Configure managed accounts:

Configure Managed Accounts

You can edit the settings for any managed account by simply clicking the edit icon associated with the account you wish to modify. Once on the Manage Account screen you can configure the automatic password change settings:

Configure Managed Account

To perform the same tasks using Windows PowerShell we can use the Get-SPManagedAccount cmdlet to retrieve the list of managed accounts:

Get-SPManagedAccount

Or we can retrieve a specific account using the -Identity parameter or by passing in a Web Application or Service:

Get-SPManagedAccount -Identity "localdev\spfarm"
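
For example, you can also pull back the managed account tied to a particular Web Application; the URL below is just a placeholder for one of your own:

Get-SPManagedAccount -WebApplication "http://content"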

To change the settings for a Managed Account we can use the Set-SPManagedAccount cmdlet:

Set-SPManagedAccount
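
Here’s a minimal sketch of what that might look like; the account name is a placeholder and the schedule string is only an example of the SharePoint recurrence-string format, so adjust both to match your environment and password policy:

$account = Get-SPManagedAccount -Identity "localdev\spservice"

# Let SharePoint generate a new random password on a weekly schedule,
# changing it 2 days before the current password would expire
Set-SPManagedAccount -Identity $account `
    -AutoGeneratePassword:$true `
    -Schedule "weekly between Sat 02:00:00 and Sat 03:00:00" `
    -PreExpireDays 2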

To create a new Managed Account we use the New-SPManagedAccount cmdlet. In the example below I’m manually creating a PSCredential object so that I can specify my password (pa$$w0rd) in script (very useful for building out dev or test environments – otherwise you should use Get-Credential to prompt for the password so that it is not hard coded anywhere):

New-SPManagedAccount
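
A minimal sketch of that approach; the account name is a placeholder and, per the note above, hard coding the password like this is only appropriate for throwaway dev or test builds:

# Build the credential in script (dev/test only - the password is in plain text)
$password = ConvertTo-SecureString 'pa$$w0rd' -AsPlainText -Force
$credential = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList "localdev\spservice", $password

# Register the account as a Managed Account
New-SPManagedAccount -Credential $credential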

Applying Managed Accounts

Once you have your Managed Accounts created you can begin to use them for things such as Service Instances and Service and Content Application Pools. To associate a managed account with a specific Service Instance using Central Admin you can go to Security –> Configure service accounts. On the Service Accounts page you can set the account used for the Farm Account, Service Instances, Web Content Application Pools, and Service Application Pools. The Service Instances are highlighted in the following image:

Configure Service Accounts

Service Instances

To set the account associated with a particular Service Instance using Windows PowerShell we simply get the ProcessIdentity property of the Service Instance’s underlying Service and set its Username property. Once set, we call Update() to update the Configuration Database and then Deploy() to push the change out to all Service Instances. To make this easier I put this code in a function that I can call by passing in the Service Instance and the username to use:

function Set-ServiceIdentity($svc, $username)
{
    # Get the process identity of the underlying service, update the account,
    # persist the change to the Configuration Database, and push it out to all servers
    $pi = $svc.Service.ProcessIdentity
    if ($pi.Username -ne $username) {
        $pi.Username = $username
        $pi.Update()
        $pi.Deploy()
    }
}

Here’s an example of how you can call this function:

Set-ServiceIdentity
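
For example, to point the Sandboxed Code Service at a different account (the account below is a placeholder and is assumed to already be registered as a Managed Account):

$svc = Get-SPServiceInstance -Server $env:ComputerName |
    where {$_.TypeName -eq "Microsoft SharePoint Foundation Sandboxed Code Service"}
Set-ServiceIdentity $svc "localdev\spservice"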

Service Application Pools

To create a new Service Application pool we use the New-SPServiceApplicationPool cmdlet and pass in the name of the Application Pool to create and the Managed Account to assign as the Application Pool identity:

New-SPServiceApplicationPool
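
For example (the pool name and account are placeholders for your own values):

$account = Get-SPManagedAccount -Identity "localdev\spservice"
New-SPServiceApplicationPool -Name "SharePoint Service Applications" -Account $account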

It’s extremely important to note that the application pool you create using the New-SPServiceApplicationPool cmdlet cannot be used for your content Web Applications. Unfortunately there is no out-of-the-box equivalent for creating Application Pools for Web Applications.

Web Application Pools

As previously noted there is no cmdlet for creating Application Pools for Web Applications. Instead what you need to do is first check if the Application Pool you need already exists by using the SPWebService’s ContentService static property. If it exists then pass in just the name of the Application Pool to the New-SPWebApplication cmdlet, otherwise pass in the name and the Managed Account to use as the Application Pool’s identity:

New-SPWebApplication
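
Here’s a sketch of that check-then-create logic; the URL, pool name, account, and database name are all placeholders, and it assumes the ApplicationPools collection on the content service can be indexed by pool name:

$appPoolName = "SharePoint Content App Pool"
$contentService = [Microsoft.SharePoint.Administration.SPWebService]::ContentService
$appPool = $contentService.ApplicationPools[$appPoolName]

if ($appPool -ne $null) {
    # The pool already exists so we only need to reference it by name
    New-SPWebApplication -Name "Content" -Url "http://content" -Port 80 `
        -ApplicationPool $appPoolName -DatabaseName "WSS_Content"
} else {
    # The pool doesn't exist yet so provide the Managed Account to use as its identity
    $account = Get-SPManagedAccount -Identity "localdev\spcontent"
    New-SPWebApplication -Name "Content" -Url "http://content" -Port 80 `
        -ApplicationPool $appPoolName -ApplicationPoolAccount $account `
        -DatabaseName "WSS_Content"
}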

Applying Service Accounts

When it comes to applying non-managed accounts to the various features things get a little more complicated. Let’s start with the Crawl Accounts.

SharePoint Foundation Search Service

For SharePoint Foundation Search we can set the crawl account (or content access account) using Central Admin by navigating to the Services on Server page and clicking the SharePoint Foundation [Help] Search link, which takes you to the settings page where you can set the crawl account:

SharePoint Foundation Search Settings

To set the same information using Windows PowerShell we actually have to go old-school and use STSADM as there’s no PowerShell equivalent cmdlet. Here’s a snippet of PowerShell code that I use to accomplish this:

function ConvertTo-UnsecureString([System.Security.SecureString]$string) 
{
    $unmanagedString = [System.Runtime.InteropServices.Marshal]::SecureStringToGlobalAllocUnicode($string)
    $unsecureString = [System.Runtime.InteropServices.Marshal]::PtrToStringUni($unmanagedString)
    [System.Runtime.InteropServices.Marshal]::ZeroFreeGlobalAllocUnicode($unmanagedString)
    
    return $unsecureString
}

$searchSvcAccount = Get-Credential "localdev\spsearchsvc"
$crawlAccount = Get-Credential "localdev\spcrawl"

$stsadmArgs = "-o spsearch -action start " + `
    "-farmserviceaccount `"$($searchSvcAccount.Username)`" " + `
    "-farmservicepassword `"$(ConvertTo-UnsecureString $searchSvcAccount.Password)`" " + `
    "-farmcontentaccessaccount `"$($crawlAccount.Username)`" " + `
    "-farmcontentaccesspassword `"$(ConvertTo-UnsecureString $crawlAccount.Password)`" " + `
    "-databaseserver `"spsql1`" " +  `
    "-databasename `"SharePoint_FoundationSearch`""

Write-Host "Running: stsadm $stsadmArgs"
$stsadmoutput = cmd /c "stsadm $stsadmArgs" 2>&1
if ($lastexitcode -ne 0) {
    throw "Unable to start Foundation Search Service.`n$stsadmoutput"
}

Note that I’m using a helper function to convert the secure password to a static string which I can then pass to the STSADM spsearch command.

SharePoint Server Search Service

To manage the crawl account for the SharePoint Server Search Service (also known as the Enterprise Search Service) using Central Admin we simply need to navigate to the Search Administration page of the Service Application that we wish to modify and click the link for the Default content access account. This will bring up the following screen:

Default content access account

Note that by default this account will be set to be the same account you used for the Search Service Instance which is a Managed Account. If you do not change this account and you have configured SharePoint to manage the account password then your crawls will fail when the password changes. To make this change using Windows PowerShell we use the Set-SPEnterpriseSearchServiceApplication cmdlet:

# Get the Search Service Application to modify (the name will vary by environment)
$searchApp = Get-SPEnterpriseSearchServiceApplication "Search Service Application"
$crawlAccount = Get-Credential "localdev\spcrawl"
$searchApp | Set-SPEnterpriseSearchServiceApplication -DefaultContentAccessAccountPassword $crawlAccount.Password -DefaultContentAccessAccountName $crawlAccount.Username

Remember not to do this step until after you have provisioned the Administration Component.

Object Cache Accounts

Many administrators, when they first configure SharePoint 2010 and hit a Web Application for the first time, are likely to see a recurring event in the event log stating that the object cache has not been configured correctly. The specific error is as follows:

Object Cache: The super user account utilized by the cache is not configured. This can increase the number of cache misses, which causes the page requests to consume unnecessary system resources.

This is essentially telling you that you have missed a manual configuration step in which you need to run some PowerShell to set two accounts for SharePoint to use to access the object cache:

function Set-WebAppUserPolicy($webApp, $userName, $userDisplayName, $perm) {
    [Microsoft.SharePoint.Administration.SPPolicyCollection]$policies = $webApp.Policies
    [Microsoft.SharePoint.Administration.SPPolicy]$policy = $policies.Add($userName, $userDisplayName)
    [Microsoft.SharePoint.Administration.SPPolicyRole]$policyRole = $webApp.PolicyRoles | where {$_.Name -eq $perm}
    if ($policyRole -ne $null) {
        $policy.PolicyRoleBindings.Add($policyRole)
    }
    $webApp.Update()
}
$webApp = Get-SPWebApplication "http://content"
$portalSuperUserAccount = Get-Credential "localdev\SPSuperUser"
$webApp.Properties["portalsuperuseraccount"] = $portalSuperUserAccount.UserName
Set-WebAppUserPolicy $webApp $portalSuperUserAccount.UserName $portalSuperUserAccount.UserName "Full Control"

$portalSuperReaderAccount = Get-Credential "localdev\SPSuperReader"
$webApp.Properties["portalsuperreaderaccount"] = $portalSuperReaderAccount.UserName 
Set-WebAppUserPolicy $webApp $portalSuperReaderAccount.UserName $portalSuperReaderAccount.UserName "Full Read"

Make sure that you do not use the same account for both the super user and super reader. (And of course make sure you change the URL and account names to match your environment). For more information about these settings see the following TechNet article: http://technet.microsoft.com/en-us/library/ff758656.aspx

Unattended Accounts

There are some services, specifically the Visio Services Service Application, the Excel Services Service Application, and the PerformancePoint Service Application, that allow us to set an account to use for accessing data sources behind the scenes. These are called unattended access accounts. To set these accounts we must create a new target application in the Secure Store Service Application and associate the target application’s ID with the appropriate Service Application. The following PowerShell code demonstrates how to do this for the Visio Services Service Application (the Excel Services Service Application is virtually identical and just uses cmdlets specific to Excel rather than Visio; PerformancePoint is a lot simpler):

#Get the Visio Service App
$svcApp = Get-SPServiceApplication | where {$_.TypeName -like "*Visio*"}
#Get the existing unattended account app ID
$unattendedServiceAccountApplicationID = ($svcApp | Get-SPVisioExternalData).UnattendedServiceAccountApplicationID
#If the account isn't already set then set it
if ([string]::IsNullOrEmpty($unattendedServiceAccountApplicationID)) {
    #Get our credentials
    $unattendedAccount = Get-Credential "localdev\SPUnattended"

    #Set the Target App Name and create the Target App
    $name = "$($svcApp.ID)-VisioUnattendedAccount"
    Write-Host "Creating Secure Store Target Application $name..."
    $secureStoreTargetApp = New-SPSecureStoreTargetApplication -Name $name `
        -FriendlyName "Visio Services Unattended Account Target App" `
        -ApplicationType Group `
        -TimeoutInMinutes 3

    #Set the group claim and admin principals
    $groupClaim = New-SPClaimsPrincipal -Identity $svcApp.ApplicationPool.ProcessAccount.Name -IdentityType WindowsSamAccountName
    $adminPrincipal = New-SPClaimsPrincipal -Identity "$($env:userdomain)\$($env:username)" -IdentityType WindowsSamAccountName

    #Set the account fields
    $usernameField = New-SPSecureStoreApplicationField -Name "User Name" -Type WindowsUserName -Masked:$false
    $passwordField = New-SPSecureStoreApplicationField -Name "Password" -Type WindowsPassword -Masked:$false
    $fields = $usernameField, $passwordField

    #Set the field values
    $secureUserName = ConvertTo-SecureString $unattendedAccount.UserName -AsPlainText -Force
    $securePassword = $unattendedAccount.Password
    $credentialValues = $secureUserName, $securePassword
    
    #Get the service context
    $subId = [Microsoft.SharePoint.SPSiteSubscriptionIdentifier]::Default
    $context = [Microsoft.SharePoint.SPServiceContext]::GetContext($svcApp.ServiceApplicationProxyGroup, $subId)

    #Check to see if the Secure Store App already exists
    $secureStoreApp = Get-SPSecureStoreApplication -ServiceContext $context -Name $name -ErrorAction SilentlyContinue
    if ($secureStoreApp -eq $null) {
        #Doesn't exist so create.
        Write-Host "Creating Secure Store Application..."
        $secureStoreApp = New-SPSecureStoreApplication -ServiceContext $context `
            -TargetApplication $secureStoreTargetApp `
            -Administrator $adminPrincipal `
            -CredentialsOwnerGroup $groupClaim `
            -Fields $fields
    }
    #Update the field values
    Write-Host "Updating Secure Store Group Credential Mapping..."
    Update-SPSecureStoreGroupCredentialMapping -Identity $secureStoreApp -Values $credentialValues 

    #Set the unattended service account application ID
    $svcApp | Set-SPVisioExternalData -UnattendedServiceAccountApplicationID $name
}

When it comes to PerformancePoint we have a lot less work to do, as the product team was nice enough to have the Set-SPPerformancePointSecureDataValues cmdlet do all the work of setting up the target application for us (note, though, that they did screw up how the Service Application is passed into the cmdlet, requiring you to pass in the ID of the Service Application rather than the actual Service Application object):

# Credential for the unattended data access account
$unattendedServiceAccount = Get-Credential "localdev\SPUnattended"
$secureValues = Get-SPPerformancePointSecureDataValues -ServiceApplication $svcApp.Id
if ($secureValues.DataSourceUnattendedServiceAccount -ne $unattendedServiceAccount.UserName) {
    Write-Host "Setting unattended service account $($unattendedServiceAccount.UserName)..."
    $svcApp.Id | Set-SPPerformancePointSecureDataValues -DataSourceUnattendedServiceAccount $unattendedServiceAccount
}

User Profile Synchronization Service Identity

One thing to watch out for is setting the account for the User Profile Synchronization Service. This service wants you to use the Farm Account as the identity. This means that your Farm Admin account cannot have its password managed by SharePoint if you intend to use this service (or at least, it shouldn’t be unless you don’t mind manually fixing this service every time your password changes – good luck with that, BTW). Your Farm Admin account will always be a Managed Account (you can’t change that), so be extra careful when changing this account’s password (either manually or automatically). To set this account using Central Admin you can click Start next to the User Profile Synchronization Service entry on the Services on Server page.

User Profile Synchronization Service

To accomplish the same thing using PowerShell we need to get an instance of the Synchronization Service, set a few properties, and call the SetSynchronizationMachine method, passing in the username and password of the Farm Admin account (note that it requires the password to be passed in as a standard string and not a secure string, so I use my previously defined ConvertTo-UnsecureString function):

$syncMachine = Get-SPServer "sp2010dev"
$profApp = Get-SPServiceApplication | where {$_.Name -eq "User Profile Service Application 1"}
$account = Get-Credential "localdev\spfarm"
if ($syncMachine.Address -eq $env:ComputerName) {
    $syncSvc = Get-SPServiceInstance -Server $env:ComputerName | where {$_.TypeName -eq "User Profile Synchronization Service"}
    $syncSvc.Status = [Microsoft.SharePoint.Administration.SPObjectStatus]::Provisioning
    $syncSvc.IsProvisioned = $false
    $syncSvc.UserProfileApplicationGuid = $profApp.Id
    $syncSvc.Update()
    $profApp.SetSynchronizationMachine($syncMachine.Address, $syncSvc.Id, $account.UserName, (ConvertTo-UnsecureString $account.Password)) 
}

if ($syncSvc.Status -ne "Online") {
    Write-Host "Starting User Profile Synchronization Service..."
    Start-SPServiceInstance $syncSvc
}
do {Start-Sleep 2} while ((Get-SPServiceInstance -Server $env:ComputerName | where {$_.TypeName -eq "User Profile Synchronization Service"}).Status -ne "Online")

Summary

As you can see, setting the accounts that are used throughout SharePoint 2010 is anything but consistent and in some cases a real pain in the a$$. I know I didn’t cover how to set every account (custom crawl rule accounts, user profile sync connection accounts, others?) but hopefully someone out there has already documented these, or if not perhaps they’d be nice enough to post a comment here for others to benefit from (maybe one day I’ll add them myself but for now I think this post is quite long enough). As always, please let me know if I’ve missed something or otherwise got something wrong, as I certainly don’t claim to have all the answers.

Happy PowerShelling! :)

Tuesday, August 24, 2010

Announcing Aptillon

Today I finally get to spill the beans on a secret I’ve been wanting to share for a while: today myself and six of my good SharePoint friends are announcing that we have formed a new consulting company! You can read the full announcement on the News page of our new site http://www.aptillon.com/ but I’ve also quoted it here:

Today we are happy to publicly announce a project that we’ve been working on for several months – Aptillon – a new SharePoint consulting company created to fill the need for a dedicated, talented, and respected SharePoint consulting firm with the track record for successful delivery. While the name Aptillon is new, we think you’ll recognize the names of our principals: Todd Baginski, Darrin Bishop, Dan Holme, Gary Lapointe, David Mann, Matthew McDermott, and Maurice Prather.

Each of our principals has depth in many areas of SharePoint. Together our team has depth and breadth across the entire product as well as the supporting technologies like Office, Silverlight, Windows Server, Exchange, SQL Server and more. Between us, we have over 50 years of experience with SharePoint. We are Microsoft Certified Masters, Most Valuable Professionals and Microsoft Certified Professionals aligned to deliver SharePoint solutions for projects of any size. We are passionate about “technology done right”, applying technology smarts to business solutions and having fun while we do it.

We are open for business and already delivering solutions for clients. Please let us know if we can help you.

To help celebrate the launch of Aptillon, we are giving away a “SharePoint 2010 Bookshelf” – 5 books on SharePoint 2010 that the Aptillon founders had a hand in producing. If you’re attending the Best Practices Conference in Washington DC this week, track down Darrin, David or Maurice (they’re all delivering sessions) and exchange business cards with them. After the conference, we’ll draw one lucky winner and ship them the books. It’s just one little way that Aptillon is helping the SharePoint community.

Look for more information in the next few weeks.

- Aptillon founders: Dan, Darrin, David, Gary, Matthew, Maurice, Todd

So I know what you’re thinking: “but Gary, you just went independent! What about Falchion? I could never pronounce it but you’ve got that sword and those cool business cards, what’s going on?” Have no fear – Falchion isn’t going away. I, like the others, am still an independent consultant and still own my own company. Aptillon is just sliding in on top – it is a way for us to work as a team when it makes sense to do so – we can go after projects together that we couldn’t as individuals, and we can do so as one company, making it significantly easier on our collective clients.

For me the relationship is a perfect one – I still determine what projects I work on but now I have an extraordinary team that I can leverage and call upon to help me with projects or to feed me projects. We all have a vested interest in seeing each other succeed as our success is Aptillon’s success and Aptillon’s success is our success. To date we’ve already had several projects that we’ve jointly worked on in various capacities and they’ve all gone incredibly well. With this team of individuals I have full confidence that there will be many more to come…

Thursday, August 12, 2010

Getting an Inventory of All SharePoint Documents Using Windows PowerShell

I got an email today asking if I had anything that would generate a report detailing all the documents throughout an entire SharePoint Farm. As this wasn’t the first time I’ve been asked this same question I decided that I’d just go ahead and post the script for generating such a report.

The script is really quite straightforward – it simply iterates through all Web Applications, Site Collections, Webs, Lists, and finally, List Items. I skip any List that is not a Document Library (as well as the Central Admin site) and then build a hash table containing all the data I want to capture. I then convert that hash table to an object which is written to the pipeline.

All of this is placed in a function which I can call and then pipe the output to something like the Out-GridView cmdlet or the Export-Csv cmdlet. I also wrote the script so that it works with either SharePoint 2007 or SharePoint 2010 so that I don’t have to maintain two versions (I could have used cmdlets such as Get-SPWebApplication, Get-SPSite, and Get-SPWeb but there was little benefit to doing so and the script would be limited to SharePoint 2010).

One word of caution – in a large Farm this script should be run off-hours, or at least from a back-end server (not your WFE) – it’s going to generate a lot of traffic to your database.

function Get-DocInventory() {
    [void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint")
    $farm = [Microsoft.SharePoint.Administration.SPFarm]::Local
    foreach ($spService in $farm.Services) {
        if (!($spService -is [Microsoft.SharePoint.Administration.SPWebService])) {
            continue;
        }
        
        foreach ($webApp in $spService.WebApplications) {
            if ($webApp -is [Microsoft.SharePoint.Administration.SPAdministrationWebApplication]) { continue }
            
            foreach ($site in $webApp.Sites) {
                foreach ($web in $site.AllWebs) {
                    foreach ($list in $web.Lists) {
                        if ($list.BaseType -ne "DocumentLibrary") {
                            continue
                        }
                        foreach ($item in $list.Items) {
                            $data = @{
                                "Web Application" = $webApp.ToString()
                                "Site" = $site.Url
                                "Web" = $web.Url
                                "list" = $list.Title
                                "Item ID" = $item.ID
                                "Item URL" = $item.Url
                                "Item Title" = $item.Title
                                "Item Created" = $item["Created"]
                                "Item Modified" = $item["Modified"]
                                "File Size" = $item.File.Length/1KB
                            }
                            New-Object PSObject -Property $data
                        }
                    }
                    $web.Dispose();
                }
                $site.Dispose()
            }
        }
    }
}
Get-DocInventory | Out-GridView
#Get-DocInventory | Export-Csv -NoTypeInformation -Path c:\inventory.csv

Monday, August 9, 2010

MSDN Visual Studio Ultimate Contest Winners

I’m a little late in getting this post out due to getting a bit slammed last week with work, book writing, and last-minute preparations for SharePoint Saturday Denver (which was a fantastic event – thank you to everyone who attended, sponsored, spoke, or volunteered!).

Recently I blogged about a contest I wanted to run to celebrate my new adventures as an independent consultant. There were some great comments submitted and many of them, besides being enjoyable to read and quite flattering, also proved very useful in providing Microsoft some real-world feedback for SharePoint 15 planning (who knows, maybe we’ll even see some of my extensions built into the next version of the product – that would be really cool!).

It was certainly difficult to pick the winners so I had my daughter help me out – and the winners are Frode A and Station! (if you guys could send me an email I’ll get you the activation key you’ll need).

Congratulations to the winners and thank you everyone who submitted a comment!