Serge van den Oever [Macaw]

SharePoint RIP. Azure, Node.js, hybrid mobile apps

  • A templating engine using PowerShell expressions

    [Update January 5, 2008: fixed a small bug: content was always saved in ANSI encoding, resulting in the loss of special characters.

    Changed the line:

    Set-Content -Path $destination -value $expandedText

    to

    Set-Content -Path $destination -value $expandedText -encoding $Encoding

    The attached file is updated as well.

    ]

    While working on our Software Factory for SharePoint 2007 solutions I needed to do some simple template expansion. My first choice would be to use the Text Templates from the DSL toolset available in the Visual Studio 2005 SDK, and write the templates in the T4 template language. The problem is that the template expansion I need to do right now needs to expand variables, functions and expressions in the PowerShell language. So I created a small PowerShell script that implements the Template-Expand command to do just that. First some simple explanatory but useless examples:

    $a='template'
    function MyFunction($action) { if ($action -eq 1) { 'a function'} else { 'WRONG!' }}
    ./Template-Expand -text 'This is a [[$a]] test to execute [[MyFunction -action 1]] and to add 2+3=[[2+3]] '
    

    Results in:

    This is a template test to execute a function and to add 2+3=5

    You can also assign the output to a variable, as in the following example. In this example I changed the default left and right markers [[ and ]] to the syntax used in the T4 template language:

    $result = ./Template-Expand -leftMarker '<#=' -rightMarker '#>' -text 'This is a <#= $a #> test to execute <#= MyFunction -action 1 #> and to add 2+3=<#= 2+3 #>'

    The variable $result now contains the expanded template text.

    Note that the markers are used to construct the matching regular expression as follows: [regex]"$leftMarker(.*?)$rightMarker", so special regular expression characters in the marker strings must be escaped. The default value for the left marker is, for example, "\[\[".

    I also added some extra options, like the possibility to read the template from a file, and write the expanded template text to a destination file using the options -path and -destination.

    If you have a template file template.txt with the following content:

    <values>
        <value>
    [[    
        $a=10
        $b=20
        $a*$b
        for ($i=0; $i -lt 3; $i++)
        {
            $i*5
            $i*10
        }
    ]]    
        </value>
    </values>

    You can execute the template expansion with the following command:

    ./template-expand -path template.txt -destination templateexpanded.txt

    This will result in a file templateexpanded.txt with the following content:

    <values>
        <value>
    200 0 0 5 10 10 20    
        </value>
    </values>

    I know, the example is useless, but you get the drift;-) An important thing to notice in the example: expressions can consist of multiple lines!

    You can also define functions within your template as in the following example:

    [[
    function SayHelloWorld
    {
        "Hello world!"
    }
    ]]
    And then he said:
    [[SayHelloWorld]]

    If you want to have the configuration of variables and functions in a separate PowerShell file, use the -psConfigurationPath option. The specified file (which must have a .ps1 extension) will be sourced, so variables and functions don't have to be defined in the global context.
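    For example, with a hypothetical configuration file templateconfig.ps1 (the file name and its contents are just an illustration, not part of the attached script):

    # templateconfig.ps1 - variables and functions used by the templates (illustrative)
    $customer = 'Macaw'
    function Greeting { "Hello $customer!" }

    You could then expand a template that uses them like this:

    ./Template-Expand -text 'The greeting is: [[Greeting]]' -psConfigurationPath ./templateconfig.ps1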

    Thanks to this blog entry by the PowerShell team I got the needed delegation stuff working.

    And now the code. Happy templating, and let me know if it works for you or which features you are missing!

    Save the code below to Template-Expand.ps1. I also added this file as an attachment to this blog post.

    ---------------cut-----------------cut--------------cut-------------cut------------cut---------cut-------------

    # ==============================================================================================
    # 
    # Microsoft PowerShell Source File -- Created with SAPIEN Technologies PrimalScript 4.1
    # 
    # NAME: Template-Expand.ps1
    # 
    # AUTHOR : Serge van den Oever, Macaw
    # DATE   : December 30, 2006
    # VERSION: 1.0
    #
    # I needed a MatchEvaluator delegate, and found an example at http://blogs.msdn.com/powershell/archive/2006/07/25/678259.aspx
    # ==============================================================================================
    

    # Template-Expand
    #
    # Simple templating engine to expand a given template text containing PowerShell expressions.
    #
    # Arguments:
    #   $text (optional): The text of the template to do the expansion on (use either $text or $path)
    #   $path (optional): Path to template to do the expansion on (use either $text or $path)
    #   $destination (optional): Destination path to write the expansion result to. If not specified,
    #                            the expansion result is returned as text
    #   $psConfigurationPath (optional): Path to file containing PowerShell code. File will be
    #                                    sourced using ". file", so variables can be declared
    #                                    without global scope
    #   $leftMarker (optional): Left marker for detecting expand expressions in the template
    #   $rightMarker (optional): Right marker for detecting expand expressions in the template
    #   $encoding (optional): Encoding to use when reading the template file
    #
    # Simple usage:
    #   $message="hello"; ./Template-Expand -text 'I would like to say [[$message]] to the world'

    param
    (
        $text = $null,
        $path = $null,
        $destination = $null,
        $psConfigurationPath = $null,
        $leftMarker = "\[\[",
        $rightMarker = "\]\]",
        $Encoding = "UTF8"
    )

    # ==============================================================================================
    # Code below from http://blogs.msdn.com/powershell/archive/2006/07/25/678259.aspx
    # Creates a delegate scriptblock
    # ==============================================================================================

    # Helper function to emit an IL opcode
    function emit
    {
        param
        (
            $opcode = $(throw "Missing: opcode")
        )

        if (!($op = [System.Reflection.Emit.OpCodes]::($opcode)))
        {
            throw "emit: opcode '$opcode' is undefined"
        }

        if ($args.Length -gt 0)
        {
            $ilg.Emit($op, $args[0])
        }
        else
        {
            $ilg.Emit($op)
        }
    }

    function GetDelegate
    {
        param
        (
            [type]$type,
            [ScriptBlock]$scriptBlock
        )

    # Get the method info for this delegate invoke...
    $delegateInvoke = $type.GetMethod("Invoke")
    
    # Get the argument type signature for the delegate invoke
    $parameters = @($delegateInvoke.GetParameters())
    $returnType = $delegateInvoke.ReturnParameter.ParameterType
    
    $argList = new-object Collections.ArrayList
    [void] $argList.Add([ScriptBlock])
    foreach ($p in $parameters)
    {
        [void] $argList.Add($p.ParameterType);
    }
    
    $dynMethod = new-object reflection.emit.dynamicmethod ("",
        $returnType, $argList.ToArray(), [object], $false)
    $ilg = $dynMethod.GetILGenerator()
    
    # Place the scriptblock on the stack for the method call
    emit Ldarg_0
    
    emit Ldc_I4 ($argList.Count - 1)  # Create the parameter array
    emit Newarr ([object])
    
    for ($opCount = 1; $opCount -lt $argList.Count; $opCount++)
    {
        emit Dup                    # Dup the array reference
        emit Ldc_I4 ($opCount - 1); # Load the index
        emit Ldarg $opCount         # Load the argument
        if ($argList[$opCount].IsValueType) # Box if necessary
        {
            emit Box $argList[$opCount]
        }
        emit Stelem ([object])  # Store it in the array
    }
    
    # Now emit the call to the ScriptBlock invoke method
    emit Call ([ScriptBlock].GetMethod("InvokeReturnAsIs"))
    
    if ($returnType -eq [void])
    {
        # If the return type is void, pop the returned object
        emit Pop
    }
    else
    {
        # Otherwise emit code to convert the result type which looks
        # like LanguagePrimitives.ConvertTo(value, type)
    
        $signature = [object], [type]
        $convertMethod =
            [Management.Automation.LanguagePrimitives].GetMethod(
                "ConvertTo", $signature);
        $GetTypeFromHandle = [Type].GetMethod("GetTypeFromHandle");
        emit Ldtoken $returnType  # And the return type token...
        emit Call $GetTypeFromHandle
        emit Call $convertMethod
    }
    emit Ret
    
    #
    # Now return a delegate from this dynamic method...
    #
    
    $dynMethod.CreateDelegate($type, $scriptBlock)
    

    }

    # ==============================================================================================

    Write-Verbose "Template-Expand:"

    if ($path -ne $null)
    {
        if (!(Test-Path -Path $path))
        {
            throw "Template-Expand: path '$path' can't be found"
        }

        # Read text and join the returned Object[] with newlines
        $text = [string]::join([environment]::newline, (Get-Content -Path $path -Encoding $Encoding))
    }

    if ($text -eq $null)
    {
        throw 'Template-Expand: template to expand should be specified through -text or -path option'
    }

    if ($psConfigurationPath -ne $null)
    {
        # Source the powershell configuration, so we don't have to declare variables
        # in the configuration globally
        if (!(Test-Path -Path $psConfigurationPath))
        {
            throw "Template-Expand: psConfigurationPath '$psConfigurationPath' can't be found"
        }
        . $psConfigurationPath
    }

    $pattern = New-Object -Type System.Text.RegularExpressions.Regex -ArgumentList "$leftMarker(.*?)$rightMarker",([System.Text.RegularExpressions.RegexOptions]::Singleline)

    $matchEvaluatorDelegate = GetDelegate System.Text.RegularExpressions.MatchEvaluator {
        $match = $args[0]
        $expression = $match.get_Groups()[1].Value # content between markers
        Write-Verbose " -- expanding expression: $expression"
        trap
        {
            Write-Error "Expansion on template '$name' failed. Can't evaluate expression '$expression'. The following error occurred: $_"
            break
        }
        Invoke-Expression -command $expression
    }

    # Execute the pattern replacements and return the result

    $expandedText = $pattern.Replace($text, $matchEvaluatorDelegate)

    if ($destination -eq $null)
    {
        # Return as string
        $expandedText
    }
    else
    {
        Set-Content -Path $destination -value $expandedText -encoding $Encoding
    }

  • PowerShell pitfalls: reading text from file using get-content

    I had a really strange effect in PowerShell that puzzled me for hours!

    I have the following script:

    $a = @'
    One
    Two
    '@
    $a
    $p = [regex]"One"
    $p.Replace($a, "OneReplaced")
    

    $b = get-content -path templ.txt
    $b
    $q = [regex]"One"
    $q.Replace($b, "OneReplaced")

    And a file templ.txt containing the following text:

    One
    Two

    When I execute the script I get the following output:

    One
    Two
    OneReplaced
    Two
    One
    Two
    OneReplaced Two

    So what happens:

    I initialize a variable $a with two lines of text: line 1: One, line 2: Two. When I display variable $a it shows One and Two on two separate lines. I then replace One with OneReplaced. The output of the replacement is two lines of text: line 1: OneReplaced, line 2: Two.

    Everything ok so far.

    I now read the contents of variable $b from the file templ.txt. This file contains two lines of text: line 1: One, line 2: Two. When I display variable $b it shows One and Two on two separate lines. I then replace One with OneReplaced. The output of the replacement is ONE LINE of text: OneReplaced Two.

    This is not what I expected.

    After a lot of debugging I found out why this happened. When you do $b = get-content -path templ.txt you don't get a string back, but an object array. You can see that when you do: (get-content -path templ.txt).GetType(), this displays:

    IsPublic IsSerial Name                                     BaseType
    -------- -------- ----                                     --------
    True     True     Object[]                                 System.Array

    If you inspect the Object[] variable $b, you see that $b[0] = "One" and $b[1] = "Two".

    When the command $q.Replace($b, "OneReplaced") is executed, the variable $b of type Object[] is cast to a string. This cast joins the strings in the Object[] with a space between them.
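    A minimal sketch of this behavior, using nothing beyond standard PowerShell:

    $b = 'One', 'Two'        # an Object[], just like Get-Content returns
    $joined = [string]$b     # the cast joins the elements with a space
    $joined                  # displays: One Two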

    So what is the simple solution to all this: when reading the content, join all lines with a newline, as in the following code line:

    $b = [string]::join([environment]::newline, (get-content -path templ.txt))

    Such a pity that this cost me 4 hours; I thought the problem was in the regular expression replacement:-(

    But the good thing is that I can now solve that nasty bug in my Template Engine using PowerShell expressions.

  • PowerShell: calculating a relative path

    Sometimes you need a simple thing like calculating the relative path of a file given its full path and a base path. For example, if you have a file c:\a\b\c\d\e.doc and a base path c:\a\b\c, the relative path is d\e.doc. I use the following PowerShell function to do this, actually using only .Net framework calls;-)

    function global:RelativePath
    {
        param
        (
            [string]$path = $(throw "Missing: path"),
            [string]$basepath = $(throw "Missing: base path")
        )
        
        return [system.io.path]::GetFullPath($path).SubString([system.io.path]::GetFullPath($basepath).Length + 1)
    }    
    

    Note that I use GetFullPath to get rid of things like .. in a path, as in c:\a\b\..\c\d\e.
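    A quick usage sketch of the function above:

    RelativePath -path 'c:\a\b\c\d\e.doc' -basepath 'c:\a\b\c'        # returns d\e.doc
    RelativePath -path 'c:\a\b\..\b\c\d\e.doc' -basepath 'c:\a\b\c'   # same result, the .. is normalized away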

  • PowerShell and debugging

    I still have not found a good debugging environment for my PowerShell development. The thing I use most is a command I found in this blog post, part of a great series on PowerShell debugging on the Windows PowerShell blog by the PowerShell team. The command allows you to set a breakpoint at any location in your script and enter an interactive shell where you can do whatever you need to do. I made a minimal modification to the command so I can see at which breakpoint I am:

    # Start-Debug (alias: bp)
    # Stop running current script and go into interactive mode so values of variables can be inspected
    function global:Start-Debug
    {
        param
        (
            $name = ""
        )
           $scriptName = $MyInvocation.ScriptName
           function prompt
           {
              "Debugging [{0}]>" -f $(if ([String]::IsNullOrEmpty($scriptName)) { "globalscope:$name" } else { "$scriptName:$name" } )
           }
           $host.EnterNestedPrompt()
    }
    Set-Alias bp Start-Debug -Scope "global"

    You can now set a breakpoint in your code by adding just the command bp, or by adding a parameter like in bp "new piece of code" so you get a prompt indicating at which breakpoint you are.
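    For example, in a script fragment like the following (the breakpoint name is just an illustration):

    $total = 0
    foreach ($i in 1..3)
    {
        $total += $i
    }
    bp "after summation"   # drops into a nested prompt such as: Debugging [myscript.ps1:after summation]>
    # in the nested prompt, inspect $total, then type 'exit' to continue the script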

    If you enter the nested prompt you can do things like listing all variables with the command ls variable:*, or show the values of all the currently defined environment variables with ls env:*.

  • PowerShell: "Cleaning" a path name, searching for smarter solution...

    When you construct a path in PowerShell with, for example, Join-Path, you can get things like ".." in your path.

    For example:

    Join-Path -Path "c:\program files" -ChildPath "..\Temp"

    results in:

    "c:\program files\..\Temp"

    Instead of:

    "c:\Temp"

    I had to solve this problem, and came up with the dirty solution below. Any cleaner solutions are appreciated;-)

    # CleanPathName
    # Clean a given path from elements like .. in the path and trailing '\',
    # so c:\program files\..\temp\ becomes c:\temp.
    # The given Path must exist.
    # Should be rewritten if a cleaner approach is found
    # CleanPathName
    # Clean a given path from elements like .. in the path and trailing '\',
    # so c:\program files\..\temp\ becomes c:\temp.
    # The given Path must exist.
    # Should be rewritten if a cleaner approach is found
    function global:CleanPathName
    {
        param
        (
            [string]$path = $(throw "Missing: path")
        )
        $orgLocation = Get-Location
        Set-Location -Path $path
        $cleanPath = Get-Location

        # restore original location
        Set-Location -Path $orgLocation

        return $cleanPath.Path
    }

  • PowerShell and using .Net enum types

    [NOTE: Because this page is the first hit in Google when you search on Powershell + enum, and I landed on this page too often myself, I decided to expand the page with some additional information] 

    Scripting is heaven when you can utilize the complete .Net framework. One thing that was not directly clear to me was how to use enum values when calling .Net functions. It happens to be really easy: just cast the string representation of the enum value.

    $myString = "/A/B/C//D/E//F/G"

    $myParts = $myString.Split("/", [System.StringSplitOptions]"RemoveEmptyEntries")

    results in an array of A,B,C,D,E,F,G

    UPDATE: it happens to be even easier, you can say:

    [System.Text.RegularExpressions.RegexOptions]::Singleline

    And you can even binary-or them together:

    [System.Text.RegularExpressions.RegexOptions]::Singleline -bor [System.Text.RegularExpressions.RegexOptions]::ExplicitCapture

    It is also possible that an enum is nested within an enclosing type; in this case use [<namespace>.<enclosing type>+<nested type>]::EnumValue (thanks Alex)

    For example:

    [Microsoft.SharePoint.SPViewCollection+SPViewType]::Gantt

    It is also possible to create a new real .net enum from PowerShell script. See http://blogs.msdn.com/powershell/archive/2007/01/23/how-to-create-enum-in-powershell.aspx

    And in PowerShell 2.0 you can do it even cleaner: http://thepowershellguy.com/blogs/posh/archive/2008/06/02/powershell-v2-ctp2-making-custom-enums-using-add-type.aspx

     

  • PowerShell Community Extensions 1.0 Released @ CodePlex

    Keith Hill released the PowerShell Community Extensions version 1.0 at http://www.codeplex.com/PowerShellCX. This well-documented set of extra cmdlets, aliases and scripts provides you with a lot of goodies that make life in both PowerShell as a shell and PowerShell as a scripting language way easier.

    One of the nice goodies is the Get-CmdletMaml cmdlet, which reflects over a snapin assembly and produces PowerShell MAML help. It uses a number of attributes defined by PowerShell and .NET as well as some defined by PSCX. MAML is an XML format to describe help on PowerShell cmdlets and their arguments.

    It also contains the latest version of TabExpansion functionality I wrote about in this blog post.

    See the Documentation for details on all the new commands. Check this out if you live your life in PowerShell.

    And by the way: CodePlex provides the source code to the goodies, so there is a lot of code here to learn from about how to do PowerShell development!

  • PowerShell: $null and parameter type

    One of those things that can take you ages to find out: I created a function with parameters that can either have a string value or be $null:

    function doit
    {
      param
      (
        [string]$a = $( throw "Missing: parameter a"),
        $b = $( throw "Missing: parameter b")
      )

      if ($a -eq $null) { Write-Host "a is null" } else { write-Host "a is not null" }
      if ($b -eq $null) { Write-Host "b is null" } else { Write-Host "b is not null" }
    }

    If I call this function with: doit $null $null

    I get the following result:

    a is not null
    b is null

    What happened: because I specified the type [string] for parameter a, the $null value gets cast to a string with value "".
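    You can see the cast at work directly at the prompt:

    [string]$null -eq $null   # False: the cast turned $null into ""
    [string]$null -eq ''      # True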

  • PowerShell: strict mode and the variable provider

    When working in script languages where declaration of variables is not required, there is always the problem of typos in names. PowerShell has the possibility to use 'strict' mode: when a variable is used without an initial assignment you get an error:

    set-psdebug -strict -trace 0

    But now I have problems with third-party scripts that check for the existence of variables by comparing them to $null like this:

    if ($var -eq $null) { ... }

    This throws an error in strict mode.

    To solve this problem use the variable provider. The variable provider gives you access to all variables. There is also a provider for functions, the environment variables, etc. You can even write your own providers.

    Check this out:

    ls variable:*

    ls function:*

    ls env:*

    To prevent the error in the variable existence check, do the following:

    if (!(Test-Path variable:var)) { ... }
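    Note that Test-Path on the variable provider checks for existence, not for a $null value:

    Test-Path variable:neverDeclared   # False: the variable does not exist
    $declared = $null
    Test-Path variable:declared        # True: it exists, even though its value is $null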

  • PowerShell: Tab Expansion wonders

    I assume that by now everyone is using PowerShell as their default shell; if not, time to get rid of that ancient cmd.exe thingy;-) If you are working within PowerShell, one of the powerful features is tab expansion. When you start a command you can type the first letters of the command, press tab, and voila. But it does not have to stop there! Tab expansion can be extended, and that is what a lot of people are doing!

    For some background on tab expansion see the blog entry by the powershell team on this topic: http://blogs.msdn.com/powershell/archive/2006/04/26/584551.aspx

    You can find a great series on developing tab expansion at monadblog.blogspot.com.

    And /\/\o\/\/ has another great series at www.thepowershellguy.com

    I'm currently using a tab expansion script from http://powershell.wiki.com/TabExpansion, it's wonderful!

  • WPF/E is there, but one thing puzzles me...

    The samples of WPF/E look stunning, the kind of effects we are used to seeing in Flash applications. I had a quick look at the WPF/E SDK, and one thing puzzled me: it looks like WPF/E currently only provides a DOM, and that the Javascript runtime available in your browser is used to access the WPF/E DOM. This means that you still have to solve all the Javascript language differences between the different browser platforms. It also means that things like interaction with the server (for example AJAX calls) must be handled through the browser's Javascript. If my quick observations are correct, it means that for WPF/E we are still dependent on the same cross-browser AJAX solution libraries as we are using for our current AJAX sites: libraries like prototype, scriptaculous, AJAX .NET Professional, and of course ASP.NET Ajax.

    On one side this is great: all knowledge of this technology can be reused and the possibilities are infinite. On the other side: there is not a well-defined boundary to do your programming in, where you are sure it works on all platforms. I think this is one of the advantages of a platform like Flash.

    On the other hand: in a future release a micro version of the .Net framework will be embedded; maybe this will provide that boundary. In the meantime I foresee all the cross-browser problems we all love in our current "old technology" web solutions.

  • ohloh.net: actual facts on 3000 open-source projects with 220 million lines of source code - impressive!!

    Thanks to born2code (a Dutch blogger) I was introduced to the impressive site http://ohloh.net, a directory of open-source projects where the code repositories of the projects are crawled to gather all kinds of statistical and historical facts.

    http://ohloh.net was launched by ex-Microsoft employees to evaluate open-source projects.

    Why is this site interesting? It can provide you with information on why you could put trust in a project you want to include in your own solution: because it has an active community of developers, or because it is developed in languages that your development team can support. Or it can warn you to be careful, because the project is a one-man show.

    For example the Mono project has the following statistics:

    And if you need some "facts" on what it would cost you if you had to do the same development yourself:

    For more background information on the site see the following articles:

    Check it out!

  • PowerShell and some adventures in environment variables, quotes and output wrapping

    Summary: Solving issues with implementing a PowerShell script that generates PowerShell code with correct quoting and no output wrapping, and calling this PowerShell generator script from a cmd.exe batch script.

    The story:

    In my adventures with PowerShell I have such a simple problem that is giving me a headache for hours now.

    I want to write out a set of environment variables to a PowerShell file as PowerShell variables, each with the same name as the environment variable and assigned the value of the environment variable.

    So there is an environment variable set as follows:

    set MyFirstLittleEnvironmentVariable=Hello Amsterdam!

    And I want to write this out to:

    $MyFirstLittleEnvironmentVariable = 'Hello Amsterdam!'

    Powerful as PowerShell is, this is simple. For example, to write out all environment variables starting with "My":

    Get-Item -path env:My* | foreach { [String]::Format("{0}{1} = {2}{3}{4}", "`$", $_.Name, "`'", $_.Value, "`'") }

    Note all the trickery to get the quotes around the value; if you know a smarter way, please let me know. This cost me another hour:-(

    This all works nice and sweet, if I execute this command from a PowerShell prompt I get exactly what I want.

    Now I want to redirect this output into a file. I save the above command to file SaveMyEnvironmentVariables.ps1, and then I execute the following command:

    SaveMyEnvironmentVariables.ps1 > "c:\My Script Files\MyEnvironmentVariables.ps1"

    And what happens: the outputted lines are wrapped at 80 characters, not something you want when generating code!

    After some digging I found some links that helped me out a little bit, but still did not solve the problem:

    In my situation my output goes through Out-Host, and Out-Host has a default formatting width of 80 characters. See also help about_display.xml in your PowerShell command prompt.

    I want to save my output by redirecting the output of my PowerShell script to another file. I could not get this working.

    My current solution is:

    Get-Item -path env:My* | foreach { [String]::Format("{0}{1} = {2}{3}{4}", "`$", $_.Name, "`'", $_.Value, "`'") } | Out-File -FilePath $Args[0] -width 2147483647

    Where $Args[0] is the first parameter specified to the script and 2147483647 is the maximum width (it's a signed 32-bit parameter).

    UPDATE: Thanks to The PowerShell Guy I could bring my solution back to the way more readable version below:

    Get-Item -path env:My* | foreach { "`$$($_.Name) = `'$($_.Value)`'" } | Out-File -FilePath $Args[0] -width 2147483647

    And for really good examples of the usage of PowerShell, have a look at http://www.thepowershellguy.com.

    I now have to call my script as follows from the PowerShell prompt:

    SaveMyEnvironmentVariables.ps1 "c:\My Script Files\MyEnvironmentVariables.ps1"

    But actually I need to call it from a good old cmd.exe batch script. And there it got complex; that is why I initially decided to solve my problem by redirecting my output. Examine the following statement carefully and especially look at the quotes;-) It took me another half an hour to solve all the problems you get with spaces in paths:

    PowerShell -Command "& 'c:\My Script Files\SaveMyEnvironmentVariables.ps1' 'c:\My Script Files\MyEnvironmentVariables.ps1'"

  • NAnt XmlList command updated

    A while ago I wrote a handy NAnt task to select data from XML files using XPath expressions. A few days ago I got a reaction on the blog from Matt, who wants to try to get it into NAntContrib, and suddenly I got a new and improved version from Jonni Faiga through e-mail!!!

    I also included a small zip file with the source code, a small build script, and a dll (probably for .Net 2.0).

    Extract the zip file somewhere, go to the directory in a command shell, execute NAnt, and you have a tested dll for your .Net platform. Copy the resulting dll Macaw.XmlList.dll next to your NAnt executable, and for the rest of your programming life you have the power of the xmllist command at your fingertips!

    Matt, I hope you can get it included in NAntContrib!

    The new and improved version:

    // Serge van den Oever (serge@macaw.nl)
    // Based on idea from weblog entry: http://blogs.geekdojo.net/rcase/archive/2005/01/06/5971.aspx combined with the code of xmlpeek.
    // Feedback by Matt (http://weblogs.asp.net/soever/archive/2005/05/08/406101.aspx)
    // Extended by Jonni Faiga [december 1, 2006]
    // Publication of this source in weblog entry: http://weblogs.asp.net/soever/archive/2006/12/01/nant-xmllist-command-updated.aspx
    

    using System;
    using System.Globalization;
    using System.IO;
    using System.Text;
    using System.Xml;
    using System.Collections.Specialized;

    using NAnt.Core;
    using NAnt.Core.Attributes;
    using NAnt.Core.Types;

    namespace Macaw
    {
        /// <summary>
        /// Extracts text from an XML file at the locations specified by an XPath
        /// expression, and return those texts separated by a delimiter string.
        /// </summary>
        /// <remarks>
        /// <para>
        /// If the XPath expression specifies multiple nodes the nodes are separated
        /// by the delimiter string, if no nodes are matched, an empty string is returned.
        /// </para>
        /// </remarks>
        /// <example>
        /// <para>
        /// The example provided assumes that the following XML file (xmllisttest.xml)
        /// exists in the current build directory.
        /// </para>
        /// <code>
        /// <![CDATA[
        /// <?xml version="1.0" encoding="utf-8" ?>
        /// <xmllisttest>
        ///     <firstnode attrib="attrib1">node1</firstnode>
        ///     <secondnode attrib="attrib2">
        ///         <subnode attrib="attribone">one</subnode>
        ///         <subnode attrib="attribtwo">two</subnode>
        ///         <subnode attrib="attribthree">three</subnode>
        ///         <subnode attrib="attribtwo">two</subnode>
        ///     </secondnode>
        ///     <thirdnode xmlns="http://thirdnodenamespace">namespacednode</thirdnode>
        ///     <fourthnode>${myproperty}</fourthnode>
        ///     <fifthnode>${myproperty=='Hi'}</fifthnode>
        /// </xmllisttest>
        /// ]]>
        /// </code>
        /// </example>
        /// <example>
        /// <para>
        /// The example reads numerous values from this file:
        /// </para>
        /// <code>
        /// <![CDATA[
        /// <?xml version="1.0" encoding="utf-8" ?>
        /// <project name="tests.build" default="test" basedir=".">
        ///     <target name="test">
        ///         <!-- TEST1: node exists, is single node, get value -->
        ///         <xmllist file="xmllisttest.xml" property="prop1" delim="," xpath="/xmllisttest/firstnode"/>
        ///         <echo message="prop1=${prop1}"/>
        ///         <fail message="TEST1: Expected: prop1=node1" unless="${prop1 == 'node1'}"/>
        ///
        ///         <!-- TEST2: node does not exist -->
        ///         <xmllist file="xmllisttest.xml" property="prop2" delim="," xpath="/xmllisttest/nonexistantnode" />
        ///         <echo message="prop2='${prop2}'"/>
        ///         <fail message="TEST2: Expected: prop2='empty'" unless="${prop2 == ''}"/>
        ///
        ///         <!-- TEST3: node exists, get attribute value -->
        ///         <xmllist file="xmllisttest.xml" property="prop3" delim="," xpath="/xmllisttest/firstnode/@attrib" />
        ///         <echo message="prop3=${prop3}"/>
        ///         <fail message="TEST3: Expected: prop3=attrib1" unless="${prop3 == 'attrib1'}"/>
        ///
        ///         <!-- TEST4: nodes exist, get multiple values -->
        ///         <xmllist file="xmllisttest.xml" property="prop5" delim="," xpath="/xmllisttest/secondnode/subnode" />
        ///         <echo message="prop5=${prop5}"/>
        ///         <fail message="TEST4: Expected: prop5=one,two,three,two" unless="${prop5 == 'one,two,three,two'}"/>
        ///
        ///         <!-- TEST5: nodes exist, get multiple attribute values -->
        ///         <xmllist file="xmllisttest.xml" property="prop5" delim="," xpath="/xmllisttest/secondnode/subnode/@attrib" />
        ///         <echo message="prop5=${prop5}"/>
        ///         <fail message="TEST5: Expected: prop5=attribone,attribtwo,attribthree,attribtwo" unless="${prop5 == 'attribone,attribtwo,attribthree,attribtwo'}"/>
        ///
        ///         <!-- TEST6: nodes exist, get multiple values, but only unique values -->
        ///         <xmllist file="xmllisttest.xml" property="prop6" delim="," xpath="/xmllisttest/secondnode/subnode" unique="true"/>
        ///         <echo message="prop6=${prop6}"/>
        ///         <fail message="TEST6: Expected: prop6=one,two,three" unless="${prop6 == 'one,two,three'}"/>
        ///
        ///         <!-- TEST7: nodes exist, get multiple attribute values -->
        ///         <xmllist file="xmllisttest.xml" property="prop7" delim="," xpath="/xmllisttest/secondnode/subnode/@attrib" unique="true"/>
        ///         <echo message="prop7=${prop7}"/>
        ///         <fail message="TEST7: Expected: prop7=attribone,attribtwo,attribthree" unless="${prop7 == 'attribone,attribtwo,attribthree'}"/>
        ///
        ///         <!-- TEST8: node exists, is single node, has namespace http://thirdnodenamespace, get value -->
        ///         <xmllist file="xmllisttest.xml" property="prop8" delim="," xpath="/xmllisttest/x:thirdnode">
        ///             <namespaces>
        ///                 <namespace prefix="x" uri="http://thirdnodenamespace" />
        ///             </namespaces>
        ///         </xmllist>
        ///         <echo message="prop8=${prop8}"/>
        ///         <fail message="TEST8: Expected: prop8=namespacednode" unless="${prop8 == 'namespacednode'}"/>
        ///
        ///         <!-- TEST9: node exists, is single node, get value expanded via current nant properties -->
        ///         <property name="myproperty" value="Hi"/>
        ///         <xmllist file="xmllisttest.xml" property="prop9" delim="," xpath="/xmllisttest/fourthnode"/>
        ///         <echo message="prop9=${prop9}"/>
        ///         <fail message="TEST9: Expected: prop1=${myproperty}" unless="${prop9 == myproperty}"/>
        ///
        ///         <!-- TEST10: node exists, is single node, get value expanded via current nant function -->
        ///         <xmllist file="xmllisttest.xml" property="prop10" delim="," xpath="/xmllisttest/fifthnode"/>
        ///         <echo message="prop10=${prop10}"/>
        ///         <fail message="TEST10: Expected: prop10=True" unless="${prop10 == 'True'}"/>
        ///     </target>
        /// </project>
        /// ]]>
        /// </code>
        /// Result when you run this code:
        /// <code>
        /// <![CDATA[
        /// test:
        ///
        /// [echo] prop1="node1"
        /// [echo] prop2="''"
        /// [echo] prop3="attrib1"
        /// [echo] prop5="one,two,three,two"
        /// [echo] prop5="attribone,attribtwo,attribthree,attribtwo"
        /// [echo] prop6="one,two,three"
        /// [echo] prop7="attribone,attribtwo,attribthree"
        /// [echo] prop8="namespacednode"
        /// [echo] prop9="Hi"
        /// [echo] prop10="True"
        ///
        /// BUILD SUCCEEDED
        /// ]]>
        /// </code>
        /// </example>
        [TaskName("xmllist")]
        public class XmlListTask : Task
        {
            #region Private Instance Fields

        <span style="color: rgb(0,0,255)">private</span> <span style="color: rgb(0,128,128)">FileInfo</span> _xmlFile;
        <span style="color: rgb(0,0,255)">private</span> <span style="color: rgb(0,0,255)">string</span> _xPath;
        <span style="color: rgb(0,0,255)">private</span> <span style="color: rgb(0,0,255)">string</span> _property;
        <span style="color: rgb(0,0,255)">private</span> <span style="color: rgb(0,0,255)">string</span> _delimiter = <span style="color: rgb(128,0,0)">","</span>;
        <span style="color: rgb(0,0,255)">private</span> <span style="color: rgb(0,0,255)">bool</span> _unique = <span style="color: rgb(0,0,255)">false</span>; <span style="color: rgb(0,128,0)">// assume we return all values
    

    private XmlNamespaceCollection _namespaces = new XmlNamespaceCollection(); private bool _expandProps = true;

    #endregion Private Instance Fields

    #region Public Instance Properties /// <summary> /// The name of the file that contains the XML document /// that is going to be interrogated. /// </summary> [TaskAttribute("file", Required=true)] public FileInfo XmlFile { get { return _xmlFile; } set { _xmlFile = value; } }

        <span style="color: rgb(128,128,128)">///</span><span style="color: rgb(0,128,0)"> </span><span style="color: rgb(128,128,128)">&lt;summary&gt;
    

    /// The XPath expression used to select which nodes to read. /// </summary> [TaskAttribute ("xpath", Required = true)] [StringValidator (AllowEmpty = false)] public string XPath { get { return _xPath; } set { _xPath = value; } }

        <span style="color: rgb(128,128,128)">///</span><span style="color: rgb(0,128,0)"> </span><span style="color: rgb(128,128,128)">&lt;summary&gt;
    

    /// The property that receives the text representation of the XML inside /// the nodes returned from the XPath expression, seperated by the specified delimiter. /// </summary> [TaskAttribute ("property", Required = true)] [StringValidator (AllowEmpty = false)] public string Property { get { return _property; } set { _property = value; } }

        <span style="color: rgb(128,128,128)">///</span><span style="color: rgb(0,128,0)"> </span><span style="color: rgb(128,128,128)">&lt;summary&gt;
    

    /// The delimiter string. /// </summary> [TaskAttribute ("delim", Required = false)] [StringValidator (AllowEmpty = false)] public string Delimiter { get { return _delimiter; } set { _delimiter = value; } }

        <span style="color: rgb(128,128,128)">///</span><span style="color: rgb(0,128,0)"> </span><span style="color: rgb(128,128,128)">&lt;summary&gt;
    

    /// If unique, no duplicate vaslues are returned. By default unique is false and all values are returned. /// </summary> [TaskAttribute ("unique", Required = false)] [BooleanValidator()] public bool Unique { get { return _unique; } set { _unique = value; } }

        <span style="color: rgb(128,128,128)">///</span><span style="color: rgb(0,128,0)"> </span><span style="color: rgb(128,128,128)">&lt;summary&gt;
    

    /// Namespace definitions to resolve prefixes in the XPath expression. /// </summary> [BuildElementCollection("namespaces", "namespace")] public XmlNamespaceCollection Namespaces { get { return _namespaces; } set { _namespaces = value; } } /// <summary> /// If true, the any nant-style properties on the result will be /// expanded before returning. Default is true. /// </summary> [TaskAttribute("expandprops")] [BooleanValidator()] public bool ExpandProperties { get{ return _expandProps; } set { _expandProps = value; } }

    #endregion Public Instance Properties

    #region Override implementation of Task

        <span style="color: rgb(128,128,128)">///</span><span style="color: rgb(0,128,0)"> </span><span style="color: rgb(128,128,128)">&lt;summary&gt;
    

    /// Executes the XML reading task. /// </summary> protected override void ExecuteTask() { Log(Level.Verbose, "Looking at '{0}' with XPath expression '{1}'.", XmlFile.FullName, XPath);

            <span style="color: rgb(0,128,0)">// ensure the specified xml file exists
    

    if (!XmlFile.Exists) { throw new BuildException(string.Format(CultureInfo.InvariantCulture, "The XML file '{0}' does not exist.", XmlFile.FullName), Location); } try { XmlDocument document = LoadDocument(XmlFile.FullName); Properties[Property] = ExpandProps(GetNodeContents(XPath, document)); } catch (BuildException ex) { throw ex; // Just re-throw the build exceptions. } catch (Exception ex) { throw new BuildException(string.Format(CultureInfo.InvariantCulture, "Retrieving the information from '{0}' failed.", XmlFile.FullName), Location, ex); } }

    #endregion Override implementation of Task

    #region private Instance Methods

        <span style="color: rgb(128,128,128)">///</span><span style="color: rgb(0,128,0)"> </span><span style="color: rgb(128,128,128)">&lt;summary&gt;
    

    /// Loads an XML document from a file on disk. /// </summary> /// <param name="fileName">The file name of the file to load the XML document from.</param> /// <returns> /// A <see cref="XmlDocument">document</see> containing /// the document object representing the file. /// </returns> private XmlDocument LoadDocument(string fileName)
    { XmlDocument document = null;

            <span style="color: rgb(0,0,255)">try</span> 
            {
                document = <span style="color: rgb(0,0,255)">new</span> XmlDocument();
                document.Load(fileName);
                <span style="color: rgb(0,0,255)">return</span> document;
            } 
            <span style="color: rgb(0,0,255)">catch</span> (<span style="color: rgb(0,128,128)">Exception</span> ex) 
            {
                <span style="color: rgb(0,0,255)">throw</span> <span style="color: rgb(0,0,255)">new</span> BuildException(<span style="color: rgb(0,0,255)">string</span>.Format(<span style="color: rgb(0,128,128)">CultureInfo</span>.InvariantCulture,
                    <span style="color: rgb(128,0,0)">"Can't load XML file '{0}'."</span>, fileName), Location, 
                    ex);
            }
        }
    
        <span style="color: rgb(128,128,128)">///</span><span style="color: rgb(0,128,0)"> </span><span style="color: rgb(128,128,128)">&lt;summary&gt;
    

    /// Gets the contents of the list of nodes specified by the XPath expression. /// </summary> /// <param name="xpath">The XPath expression used to determine the nodes.</param> /// <param name="document">The XML document to select the nodes from.</param> /// <returns> /// The contents of the nodes specified by the XPath expression, delimited by /// the delimiter string. /// </returns> private string GetNodeContents(string xpath, XmlDocument document) { XmlNodeList nodes;

            <span style="color: rgb(0,0,255)">try</span> 
            {
                XmlNamespaceManager nsMgr = <span style="color: rgb(0,0,255)">new</span> XmlNamespaceManager(document.NameTable);
                <span style="color: rgb(0,0,255)">foreach</span> (XmlNamespace xmlNamespace <span style="color: rgb(0,0,255)">in</span> Namespaces) 
                {
                    <span style="color: rgb(0,0,255)">if</span> (xmlNamespace.IfDefined &amp;&amp; !xmlNamespace.UnlessDefined) 
                    {
                        nsMgr.AddNamespace(xmlNamespace.Prefix, xmlNamespace.Uri);
                    }
                }
                nodes = document.SelectNodes(xpath, nsMgr);
            } 
            <span style="color: rgb(0,0,255)">catch</span> (<span style="color: rgb(0,128,128)">Exception</span> ex) 
            {
                <span style="color: rgb(0,0,255)">throw</span> <span style="color: rgb(0,0,255)">new</span> BuildException(<span style="color: rgb(0,0,255)">string</span>.Format(<span style="color: rgb(0,128,128)">CultureInfo</span>.InvariantCulture,
                    <span style="color: rgb(128,0,0)">"Failed to execute the xpath expression {0}."</span>, xpath), 
                    Location, ex);
            }
    
            Log(Level.Verbose, <span style="color: rgb(128,0,0)">"Found '{0}' nodes with the XPath expression '{1}'."</span>,
                nodes.Count, xpath);
    
            <span style="color: rgb(0,128,0)">// collect all strings in a string collection, skip duplications if Unique is true
    

    StringCollection texts = new StringCollection(); foreach (XmlNode node in nodes) { string text = node.InnerText; if (!Unique || !texts.Contains(text)) { texts.Add(text); } }

            <span style="color: rgb(0,128,0)">// Concatenate the strings in the string collection to a single string, delimited by Delimiter
    

    StringBuilder builder = new StringBuilder(); foreach (string text in texts) { if (builder.Length > 0) { builder.Append(Delimiter); } builder.Append(text); }

            <span style="color: rgb(0,0,255)">return</span> builder.ToString();
        }
        <span style="color: rgb(128,128,128)">///</span><span style="color: rgb(0,128,0)"> </span><span style="color: rgb(128,128,128)">&lt;summary&gt;
    

    /// Expands project properties in the string /// </summary> /// <param name="result"></param> /// <returns></returns> private string ExpandProps(string result) { if (Properties == null || !ExpandProperties) { return result; } return Properties.ExpandProperties(result, null); } #endregion private Instance Methods } }
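    The node-collection logic in GetNodeContents above (collect the inner text of the selected nodes, skip duplicates when Unique is set, join with the delimiter) can be sketched in a few lines of Python. This is only an illustrative translation: the xml_list helper and the sample document are made up for the example, and ElementTree's limited path support stands in for full XPath (attribute selections like /@attrib would need a real XPath engine).

    ```python
    # Illustrative sketch (not part of the NAnt task) of the xmllist logic:
    # select nodes with a path, collect their inner text, optionally skip
    # duplicates, and join the values with a delimiter.
    import xml.etree.ElementTree as ET

    def xml_list(xml_text, path, delim=",", unique=False):
        root = ET.fromstring(xml_text)
        texts = []
        for node in root.findall(path):
            text = node.text or ""
            if not unique or text not in texts:
                texts.append(text)
        return delim.join(texts)

    # The same document shape as the xmllisttest.xml example above.
    doc = ("<xmllisttest><secondnode>"
           "<subnode>one</subnode><subnode>two</subnode>"
           "<subnode>three</subnode><subnode>two</subnode>"
           "</secondnode></xmllisttest>")

    print(xml_list(doc, "./secondnode/subnode"))               # one,two,three,two
    print(xml_list(doc, "./secondnode/subnode", unique=True))  # one,two,three
    ```

    The two calls mirror TEST4 and TEST6 from the doc comment: same nodes, with and without de-duplication.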

     

  • PowerShell: calling a function with parameters

    I just started with PowerShell to do some complex scripting. As a beginner in this new language I will probably run into all the quirks the language has, but hey, that's the fun of learning something new. The first quirk: calling a function with parameters.

    function f([string]$a, [string]$b)
    {
      Write-Host "a:", $a, " b:", $b
    }

    f("hello", "world") # Results in: a: hello world b:
    f "hello" "world"   # Results in a: hello b: world

    If you put something between parentheses, it is evaluated as a single expression first: f("hello", "world") passes the two-element array ("hello", "world") as parameter $a, leaving $b empty. PowerShell functions are called like commands, with space-separated arguments.

    For more information on what you can do with functions, execute the following command in your PowerShell: Get-Help about_function

  • Running Wss2 and Wss3 side-by-side

    I didn't know that it was possible, but Microsoft posted a document describing how to do it. I didn't test it out yet; please share your experiences if you do try it out. It would be great on a development box where you are developing things for both platforms.

     

  • SharePoint 2007: using the masterpage from your site in custom _layouts pages

    I got a question from Jeff:

    I'm wondering if you have experimented with creating application pages that will both run in the sharepoint context and that can use the default.master of the web. When creating application pages in the _layouts folder these pages can use the application.master, but cannot use the default.master.

    I tried to do some theoretical thinking on this topic, but it could be that I'm way off. So here are my thoughts. Please let me know if you tried it out, if you were successful, and what the best solution to this interesting problem is.

    Hi Jeff,

    If I understand you correctly, you want to create application pages running in _layouts that use the same master page as the site in whose context the page is running.

    First thing: if you want to use a master page from the site context, your page needs to provide the same content placeholders as the master page expects.

    Master pages can be loaded dynamically by assigning a master page file to the MasterPageFile property of the Page object. This property may only be assigned in the Page PreInit event, which is the first event in the page execution lifecycle.

    SharePoint has a set of static and dynamic tokens for specifying the masterpage to use:

    ~masterurl, ~site, and ~sitecollection. I assume you already tried ~site; that would be the easiest solution.

    Assuming that ~site does not work, one problem remains: how can we access the master page file that is in the site context? I don't know if it works to specify a path pointing to a file in the site, because we are running in a different virtual directory. Otherwise you could implement a VirtualPathProvider that allows you to access files in the SPWeb that is your current context.

    It could be that you first have to assign a dummy master page that has all the correct placeholders, and that this dummy master page must be stored in _layouts as well.

    Anyone?

    UPDATE: From the comments, Roni Hofer confirms that he got it working as follows:

    protected override void OnPreInit(EventArgs e)
    {
        base.OnPreInit(e);
        SPWeb myWeb = SPControl.GetContextSite(Context).OpenWeb();
        string strUrl = myWeb.ServerRelativeUrl + "/_catalogs/masterpage/my.master";
        this.MasterPageFile = strUrl;
    }

     

    Where "my.master" has to be stored in the master page gallery of the site.

  • SharePoint Solution Generator - part 2: the internals of the created site definition project

    In the first part of this series on the SharePoint Solution Generator I went through the creation of a site definition project, compiling it, deploying it, and creating a new site based on our new site definition. Now that we know that that part works, it is time to look into what exactly we are getting in the site definition project as created by the SharePoint Solution Generator. The SharePoint Solution Generator is part of Windows SharePoint Services 3.0 Tools: Visual Studio 2005 Extensions, a set of tools and templates for creating solutions for SharePoint 2007 that recently came out in beta. See this blog post for more information.

    As a quick recall what we are looking at: I created a site based on the out of the box team site site definition, created a site definition solution from it with the SharePoint Solution Generator, which resulted in a compilable and deployable Visual Studio 2005 C# project with the following structure:

    A good look at onet.xml

    A site definition is described by its onet.xml file. A small recap from the Windows SharePoint Service SDK on the function of onet.xml:

    Functions of ONET.XML

    ONET.XML has the following functions:

    • Defines the top and side navigation areas that appear on the home page and in list views.

    • Specifies the list definitions that are used in the site definition and whether they are available for creating lists on the Create page.

    • Specifies document templates that are available for creating document library lists on the New Document Library page and specifies the files used in the document templates.

    • Defines the base list types from which default Microsoft Windows SharePoint Services lists are derived.

    • Specifies the configurations of lists and modules that are used within site definitions.

    Site Definition Tasks with ONET.XML

    The following kinds of tasks can be performed in ONET.XML to customize a site definition:

    • Specify an alternate cascading style sheet (CSS) file, JavaScript file, or ASPX header file for a site definition.

    • Modify navigation areas for the home page and list pages.

    • Add a list definition as an option to the Create page.

    • Add a document template for creating document libraries.

    • Define a configuration for a site definition, specifying the lists, modules, files, and Web Parts that are included when a site is instantiated.

    Because we created our site definition solution from an untouched instance of the wss team site, it is an interesting exercise to compare the onet.xml file in our site definition solution with the onet.xml file in the wss team site site definition (sts).

    A good way to do such a comparison is with a good diff tool; I used the SuperDiff power toy for Visual Studio 2005:

    During the comparison the following things came to my attention:

    1. Resources get expanded. The original site definition uses resource files for all texts in the site definition. Our new site definition has all the texts expanded into the language we selected when creating the instance of the site we created our site definition project from (see picture above). This is understandable: on creation of the site, the resource references in the onet.xml are expanded into the selected language. But it is a pity and a bit of a design flaw in WSS; the design would have been better if resource references were possible in the site instance as well. In MOSS 2007 there is such a feature for the PublishingWeb sites, called variations. What this means is that it is not possible to take an instance of an existing language-agnostic site definition, make some changes, and publish it again in a language-agnostic way. The only way to accomplish this is to create a tool that rewrites the expanded text strings back to their resource file references. A good opportunity for a third-party tool? There is one exception where the expanded resource is turned into a resource reference again: the NavBarPage element with the link to the Home of the site, probably because that one is assumed to be always there.
    2. Only one configuration. The SharePoint built-in site definitions have a concept called Configurations: based on the same site definition and list definitions, different configurations can be specified, where a configuration describes which lists, modules, site features and web features to include when creating an instance of that configuration of the site definition. For example, the sts site definition has three configurations: Default (STS#0), Blank (STS#1) and DWS = Document WorkSpace (STS#2). A site definition project based on an existing site only knows of one configuration: the configuration used for instantiating the site. This configuration is always called the Default configuration, with ID 0.
    3. Web parts have all properties. A web part has a large set of properties, and most properties have a default value. An example of such a property is IsVisible with a default value of true. In the built-in site definitions only the required properties are included; in the created site definition solution all properties are included, but hey, who cares! Another thing is that web part properties can use resource strings as well, but those are expanded in the site definition solution.

    Are the above points a show stopper? No, absolutely not! If you want to create a really language-agnostic version of your site definition that utilizes resource files for your different supported languages, you have to do some extra work. In most cases you will be creating a solution for a customer in a chosen language.

    The Site Provisioning Handler

    The Site Provisioning Handler is a feature that enables the execution of code on provisioning an instance of a site based on the site definition. The feature has web scope and is defined as follows:

    <Feature  Title="TeamSite" Id="fe034860-4954-4b13-859f-892267dc0045" Description="" Version="1.0.0.0" 
              Scope="Web" Hidden="TRUE" DefaultResourceFile="core" 
              ReceiverAssembly="TeamSite, Version=1.0.0.0, Culture=neutral, PublicKeyToken=3b0f9fb38a73e4fc" 
              ReceiverClass="TeamSite.TeamSite" xmlns="http://schemas.microsoft.com/sharepoint/">
      <ElementManifests>
        <ElementFile Location="provisioner.xml" />
      </ElementManifests>
    </Feature>

    The ReceiverAssembly is the assembly created by compiling the code in the site definition solution. The relevant code for the provisioning feature can be found in the partial files SiteProvisioning.cs and SiteProvisioning.internal.cs. Especially the internal file is interesting; it contains the code as written by the developers of the SharePoint Solution Generator. The code in this file does the following on provisioning of a new site:

    1. Restore web properties for the site
    2. Add custom CSS files available in the site in a folder with the name _styles to the site using the SPWeb.CustomizeCSS method (what does this do? I assume writing out links to these CSS files in all pages rendered for the web.)
    3. Restore Data View Web Part guids outside web part zones
    4. Restore Data View Web Part guids inside web part zones

    Creating a site definition containing Data View Web Parts has always been a mess. Data View Web Parts use all kinds of GUIDs to reference list instances in a site, but at site definition time you don't know these GUIDs yet. Another problem is the GUIDs for web part connections. This code seems to fix these problems, but I need more time to dive into the exact inner workings.

    The feature uses a provisioner.xml file containing specifications of things to fix. Our site definition has the following provisioner.xml file:

    <SiteSettings>
      <!-- _filecategory="Provisioner" _filetype="File" _filename="provisioner.xml" _uniqueid="b8b1d607-d3ea-4a77-9302-2dc5c6d57e0f" -->
      <ListInstances>
        <ListInstance Id="9d519060-66f4-4bd6-9216-be608493d134" Title="Announcements" FeatureId="00bfea71-d1ce-42de-9c63-a44004ce0104" />
        <ListInstance Id="f7a6a984-f2bb-4ddb-a131-cd9ae58e645e" Title="Calendar" FeatureId="00bfea71-ec85-4903-972d-ebe475780106" />
        <ListInstance Id="258accc5-3f74-460d-8ce8-2d682f5de4df" Title="Links" FeatureId="00bfea71-2062-426c-90bf-714c59600103" />
        <ListInstance Id="69f5806c-ebcf-40ec-bce8-bcfa65eb8e58" Title="Master Page Gallery" FeatureId="00000000-0000-0000-0000-000000000000" />
        <ListInstance Id="65a1e744-6ba6-433f-a1c1-a75d31a0f713" Title="Shared Documents" FeatureId="00bfea71-e717-4e80-aa17-d0c71b360101" />
        <ListInstance Id="655db2ef-1ae9-4ea0-9e31-d8ed15402539" Title="Tasks" FeatureId="00bfea71-a83e-497e-9ba0-7a5c597d0107" />
        <ListInstance Id="587c255b-ffde-4cd0-b042-fba67f20ba65" Title="Team Discussion" FeatureId="00bfea71-6a49-43fa-b535-d15c05500108" />
      </ListInstances>
      <WebProperties>
        <WebProperty Key="vti_extenderversion" Value="12.0.0.4407" />
        <WebProperty Key="vti_defaultlanguage" Value="en-us" />
        <WebProperty Key="vti_categories" Value="Business Competition Expense\ Report Goals/Objectives Ideas In\ Process Miscellaneous Planning Schedule Travel VIP Waiting" />
        <WebProperty Key="vti_approvallevels" Value="Approved Rejected Pending\ Review" />
      </WebProperties>
    </SiteSettings>
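    As a quick sketch of how this file is structured, the list instances and web properties can be pulled out with a few lines of Python. This is illustrative only; the element and attribute names are taken from the file above, trimmed to one entry each.

    ```python
    # Read ListInstance titles and WebProperty key/value pairs from a
    # provisioner.xml fragment (element names as in the file shown above).
    import xml.etree.ElementTree as ET

    provisioner = """<SiteSettings>
      <ListInstances>
        <ListInstance Id="9d519060-66f4-4bd6-9216-be608493d134" Title="Announcements" FeatureId="00bfea71-d1ce-42de-9c63-a44004ce0104" />
      </ListInstances>
      <WebProperties>
        <WebProperty Key="vti_defaultlanguage" Value="en-us" />
      </WebProperties>
    </SiteSettings>"""

    root = ET.fromstring(provisioner)
    titles = [li.get("Title") for li in root.iter("ListInstance")]
    props = {wp.get("Key"): wp.get("Value") for wp in root.iter("WebProperty")}
    print(titles)  # ['Announcements']
    print(props)   # {'vti_defaultlanguage': 'en-us'}
    ```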

    The exact possibilities within this file will be the topic of a future blog post, if I ever get to it.

    The resulting solution file: TeamSite.wsp

    The final result of all the work is a SharePoint solution file with the extension .wsp. See this blog post by Chris Johnson for some more background info. This solution file can be deployed to your development server or your server farm.

    A .wsp file is actually just a renamed CAB file. If you rename it to a file with the .cab extension you can have a peek into it:


    But this is actually the exact same structure as you can find in your build output directory:

     

    Is this all?

    There is a short answer and a long answer.

    First the short answer: no. I only exported a site I made no modifications to, so it was still very close to its underlying site definition. This was on purpose, to see how closely it would match its underlying site definition. If you have a more complex site with modifications, the whole thing becomes way more complex.

    Now the long answer: no. Given your site definition project, you can extend it with additional features, your own site provisioning code, your own content types, new list definitions, additional web parts, custom field controls, additional modules, etc. The Visual Studio 2005 Extensions for SharePoint give you all the tools to do exactly this. More on this in a future blog post.

  • SharePoint Solution Generator - part 1: create a site definition from an existing site

    This is part 1 in a series of blog post on the SharePoint Solution Generator.

    The SharePoint Solution Generator is a stand-alone application that can convert a Wss3 web (SPWeb) into a Visual Studio 2005 site definition project, which can be compiled into a SharePoint solution for deployment into your SharePoint farm. The SharePoint Solution Generator is part of Windows SharePoint Services 3.0 Tools: Visual Studio 2005 Extensions, a set of tools and templates for creating solutions for SharePoint 2007 that recently came out in beta. See this blog post for more information. This blog post documents the steps I took in creating a site definition from an instantiation of a standard Windows SharePoint Services team site, and all the things I noticed about the created site definition. For me it is a way of documenting my findings so I can find them back when I Google for information on this topic later on. I have a short memory;-)

    OK, let's get started. I created a site called TeamSite based on the standard Team Site template. I have three language packs installed: English, German and Japanese. I chose the English version.

     

    Without making any modification to the team site I fire up the SharePoint Solution Generator and start creating the site definition solution.

     The result is a C# site definition solution with the following elements in it:

    The project has a SharePoint-specific properties tab with a tree view of all features and the site definition in this project. If we had modified lists in the team site, like adding new columns and new views, we would probably also have had list definitions included in this tree view. Below are screen shots of all the configuration screens, so you get a feeling of what configuration capabilities are dynamically created:

    To prevent clashes on deployment, the specified Folder Name gets a GUID appended when the folder is created on the server.

    Note that the Language is set to 1033 (English); this is the language we created our instance of the TeamSite in.

    Microsoft advises using unique values greater than 10,000 for the ID attribute of your site template. The value is set to 10002 as you can see in the picture above. This is because I created a test site definition before with ID 10003 and deployed it to the server. I hope that the SharePoint Solution Generator makes a roundtrip to the server to check for the highest site definition ID with a minimum value of 10,000, and adds 1 to it. I wonder what happens if all site definition creators in the world start creating site definitions with the same IDs due to the usage of this tool;-) You can also specify the image to display on template selection, and the name of the template selection tab.

    Creating the SharePoint Solution file TeamSite.wsp and deploy it to our development server

    Visual Studio can deploy our project (menu: Build -> Deploy TeamSite, or Deploy in the context menu of the project) to the development server, assuming you have Visual Studio running on your SharePoint developer server. The following appears in the Visual Studio output window:

    ------ Build started: Project: TeamSite, Configuration: Debug Any CPU ------
    C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\Csc.exe /noconfig /nowarn:1701,1702 /errorreport:prompt /warn:4 /define:DEBUG;TRACE /reference:"C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\12\ISAPI\Microsoft.SharePoint.dll" /reference:"C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\12\ISAPI\Microsoft.SharePoint.Security.dll" /reference:C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\System.dll /reference:C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\System.Web.dll /reference:C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\System.XML.dll /debug+ /debug:full /keyfile:Properties\Temporary.snk /optimize- /out:obj\Debug\TeamSite.dll /target:library Properties\AssemblyInfo.cs "Site Provisioning Handler\SiteProvisioning.cs" "Site Provisioning Handler\SiteProvisioning.Internal.cs"
    

    Compile complete -- 0 errors, 0 warnings
    TeamSite -> F:\Sources\SharePointProjects\TeamSiteSiteDefinition\TeamSite\bin\Debug\TeamSite.dll
    ------ Deploy started: Project: TeamSite, Configuration: Debug Any CPU ------
    ------ Generate TeamSite.wsp file and setup batch file ------
    Creating solution ...
    Operation completed successfully.

    Creating setup batch file ... Operation completed successfully.

    ------ Add and deploy TeamSite.wsp to the SharePoint ------
    Adding solution ...
    Operation completed successfully.

    Deploying solution ... Operation completed successfully.

    ------ Activate features in solution if necessary ------
    No features in this solution were activated

    Restarting IIS ... Operation completed successfully.

    ========== Build: 1 succeeded or up-to-date, 0 failed, 0 skipped ==========
    ========== Deploy: 1 succeeded, 0 failed, 0 skipped ==========

    As you can see, the project is compiled, a SharePoint solution file TeamSite.wsp is created, including a batch script to simplify installation, the solution is deployed to the server, and IIS is restarted so the new site definition becomes active.

    This is a really simple solution; in more complex solutions additional steps are taken with respect to feature activation.

    Create an instance of our new site definition

    We can now create an instance of our new site definition. If we go to the create site screen, an extra template selection tab called "Development" has appeared, where our new site definition shows up:

    And it all just works! I'm amazed.

    In the next blog post I will dive deeper in what is actually created in the site definition project. This is absolutely not trivial, so please continue reading to get a better understanding of the inner workings.

    [NOTA BENE: ALL INFORMATION IN THIS BLOG POST IS BASED ON A BETA VERSION OF THE PRODUCT, AND MAY NOT REFLECT THE FUNCTIONALITY AND BEHAVIOUR OF THE FINAL VERSION]

  • Really useful PowerShell help application

    When you get started with PowerShell you get overwhelmed by the number of new commands to learn. PowerShell has a built-in help command that gives you an overview of all available commands, and per command you can get help on its exact syntax. You get something like:

    But it is difficult to get a direct overview of which commands are available, and what exactly they do.

    Tonight I stumbled upon a great little tool on CodePlex that gives you the same help information in a simple Windows application: ShinyPower.

    The good thing is that it reads its help information from PowerShell itself, so if you add new cmdlets, they automatically show up in ShinyPower.

  • It is possible to run VMware images and Windows Server 2003 on Amazon's EC2!!

    I just got a reaction from Reuven on my blog post Microsoft and virtualisation: Amazon EC2 functionality using Windows Hypervisor technology code-named Viridian?. See http://developer.amazonwebservices.com/connect/thread.jspa?threadID=12540&tstart=15 for more information. Of course there are some issues:

    1. Licensing

    2. Qemu running in the AMI is used to virtualize Windows.

    The first steps are there... let's see where it goes!

  • Windows SharePoint Services 3.0 Tools: Visual Studio 2005 Extensions available for download, it's cool!

    Creating complex SharePoint solutions and deploying those solutions has always been suboptimal in the old versions of SharePoint. In SharePoint 2007 (WSS3, MOSS 2007) our trouble is over. We now have powerful deployment capabilities in the form of features and SharePoint solutions.

    But creating SharePoint 2007 solutions and creating the feature and solution configuration files was still something for the experts only, until today...

    Rumors have been around for a while that Microsoft would provide Visual Studio 2005 extensions to help us create SharePoint solutions. In the meantime people had their own shot at making development and deployment easier. A good example is this blog post by Tony Bierman.

    Tonight I got a pointer from Mark Arend (thanks Mark!) to the November CTP version of the Windows SharePoint Services 3.0 Tools: Visual Studio 2005 Extensions.

    I directly downloaded the stuff, and must say I was impressed. It does a lot of the things I was currently working on in the construction of a SharePoint Software Factory, and a lot more.

    From the download page:

    This Community Technology Preview (CTP) of the Visual Studio 2005 Extensions for Windows SharePoint Services contains the following tools to aid developers in building SharePoint applications:
    Visual Studio 2005 Project Templates

    • Web Part
    • Team Site Definition
    • Blank Site Definition
    • List Definition

    Visual Studio 2005 Item Templates (items that can be added into an existing project)
    • Web Part
    • Custom Field
    • List Definition (with optional Event Receiver)
    • Content Type (with optional Event Receiver)
    • Module

    SharePoint Solution Generator
    • This stand-alone program generates a Site Definition project from an existing SharePoint site. The program enables developers to use the browser and Microsoft Office SharePoint Designer to customize the content of their sites before creating code by using Visual Studio.

    Based on the elements in your project, web part manifests, features and a solution file are automatically created and published when you do an explicit publish, or when you do F5 debugging.

    If you have questions or want to discuss this new stuff: http://www.microsoft.com/technet/community/newsgroups/dgbrowser/en-us/default.mspx?dg=microsoft.public.sharepoint.development_and_programming

     

    In the coming days I will blog a lot more about my experiences with these extensions; I already did some deep diving. But now it is time to get some sleep.

    One small teaser, the SharePoint Solution Generator in action:

    And to show it is still beta:

    But this looks like something that can be easily solved.

    [NOTA BENE: ALL INFORMATION IN THIS BLOG POST IS BASED ON A BETA VERSION OF THE PRODUCT, AND MAY NOT REFLECT THE FUNCTIONALITY AND BEHAVIOUR OF THE FINAL VERSION]

  • SharePoint 2007: Accessing information on the SharePoint Web Server Extensions folder

    Information on SharePoint and the path where it manages its information can be found in the registry. For example, the registry entry SOFTWARE\Microsoft\Shared Tools\Web Server Extensions\12.0\Location contains the path to the Web Server Extensions folder for SharePoint 2007, and SOFTWARE\Microsoft\Shared Tools\Web Server Extensions\12.0\Version contains the current version of SharePoint 2007. Yes I know, RTM is out, and I'm still running on Beta 2 TR :-(

     

    I came across the following code by Microsoft for getting the Features folder:

     

        private string GetSharePointFeaturesDirectory()
        {
            string key = @"SOFTWARE\Microsoft\Shared Tools\Web Server Extensions\12.0";
            string name = "Location";

            string featuresDir = String.Empty;
            try
            {
                RegistryKey regKey = Registry.LocalMachine.OpenSubKey(key);
                string value = regKey.GetValue(name) as string;
                regKey.Close();
                featuresDir = Path.Combine(value, @"template\features");
            }
            catch (SecurityException)
            {
                featuresDir = String.Empty;
            }
            catch (ArgumentNullException)
            {
                featuresDir = String.Empty;
            }
            catch (ArgumentException)
            {
                featuresDir = String.Empty;
            }
            catch (ObjectDisposedException)
            {
                featuresDir = String.Empty;
            }
            catch (IOException)
            {
                featuresDir = String.Empty;
            }
            catch (UnauthorizedAccessException)
            {
                featuresDir = String.Empty;
            }

            return featuresDir;
        }
    
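    The Location value read from the registry is just a base path; the features folder is obtained by appending template\features to it, which is exactly what the Path.Combine call does. A minimal Python sketch of that path logic (illustrative only; on a real server you would first read the Location value from the registry, for example with Python's winreg module):

```python
import ntpath  # Windows-style path joining, regardless of host OS

def features_directory(location):
    """Mirror of the C# snippet's Path.Combine call: given the
    Web Server Extensions 'Location' registry value, return the
    features folder path. Returns "" when no location is known."""
    if not location:
        return ""
    return ntpath.join(location, r"template\features")

base = r"C:\Program Files\Common Files\Microsoft Shared\web server extensions\12"
print(features_directory(base))
```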

  • SharePoint 2007 - /_layouts and how to create pages that run in site context

    Ages ago, in the time that SharePoint 2007 was still beta, I dived into how to create "in site context" pages that should be hosted in the /_layouts directory of SharePoint. My adventures from back then can be found in this blog post. I don't want to take the default Microsoft approach where all server-side code is included in the aspx pages themselves; developing this way is much more difficult than using code-behind files. I found a solution by creating a Visual Studio 2005 web site in the /_layouts virtual directory of my SharePoint web site, which points to the physical folder C:\Program Files\Common Files\Microsoft Shared\web server extensions\12\TEMPLATE\LAYOUTS. In this approach all code-behind files are part of the solution, and are compiled and cached on page request. Although this approach works, I don't really like it. I prefer the Visual Studio 2003 approach where all code-behind is compiled into a single assembly that can be deployed. Another problem is the location of referenced assemblies. I had my referenced assemblies in the GAC, but I prefer to deploy to a bin folder so no IISRESET or recycling of the SharePoint application pool is needed on recompilation.

    What I really want to achieve is the following:

    Create a web application project that can be deployed to the SharePoint /_layouts virtual directory, so my code is executed in the context of a site.

    The solution happens to be really easy:

    Create a web application project, either directly in the /_layouts folder or somewhere else and copy over all files needed to run your application.

    The *.dll and *.pdb files produced as build output must be placed in the bin folder of your SharePoint web site. In my test situation this is the folder C:\Inetpub\wwwroot\wss\VirtualDirectories\3a938f6a-15f2-49ae-be78-328ad78974f5\bin. You can find this folder in your Internet Information Services (IIS) Manager as follows:

    • Right-click the SharePoint web site
    • Select properties
    • Go to the Home Directory tab

    The value in Local Path specifies the path to the virtual directory, and in this virtual directory you find a folder bin.

    If you create your web application project within the /_layouts virtual directory, you can set the build output path directly to this bin folder.
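    If you keep the project outside the virtual directory, copying the build output can be automated, for example in a post-build step. A hedged sketch of such a copy step (the helper name and paths are mine, not part of the extensions):

```python
import glob
import os
import shutil

def deploy_build_output(build_dir, site_bin):
    """Copy the *.dll and *.pdb build output into the SharePoint
    web site's bin folder; returns the names of the copied files."""
    os.makedirs(site_bin, exist_ok=True)
    copied = []
    for pattern in ("*.dll", "*.pdb"):
        for path in glob.glob(os.path.join(build_dir, pattern)):
            shutil.copy2(path, site_bin)
            copied.append(os.path.basename(path))
    return sorted(copied)

# Example (paths from my test situation, adjust to your environment):
# deploy_build_output(r"bin\Debug",
#     r"C:\Inetpub\wwwroot\wss\VirtualDirectories\<site guid>\bin")
```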

    Note that you can't use the Publish Web feature of the web application project, because you can't specify a separate path to deploy your assemblies to:

    For my test I created the following project:

    I added some really simple code to the Default.aspx and Default.aspx.cs files to prove that it works:

    Default.aspx:

    <%@ Page Language="C#" AutoEventWireup="true" CodeBehind="Default.aspx.cs" Inherits="SergeLayoutsTest._Default" %>
    

    <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">

    <html xmlns="http://www.w3.org/1999/xhtml">
    <head runat="server">
        <title>Site title test</title>
    </head>
    <body>
        <form id="form1" runat="server">
        <div>
            Title of this site: <asp:Label ID="LabelTitle" runat="server" Text="Label"></asp:Label>
        </div>
        </form>
    </body>
    </html>

     Default.aspx.cs:

    using System;
    using Microsoft.SharePoint;
    

    namespace SergeLayoutsTest
    {
        public partial class _Default : System.Web.UI.Page
        {
            protected void Page_Load(object sender, EventArgs e)
            {
                SPWeb web = SPContext.Current.Web;
                LabelTitle.Text = web.Title;
            }
        }
    }

    There is one more thing to do: exclude the selection of the authentication mode from your web.config file:

    web.config:

    <?xml version="1.0"?>
    

    <configuration>
        <appSettings/>
        <connectionStrings/>
        <system.web>
            <compilation debug="true" />
            <!-- <authentication mode="Windows" /> -->
        </system.web>
    </configuration>

     We can now run the page in the context of two different sites to see that it works:

  • Microsoft and virtualisation: Amazon EC2 functionality using Windows Hypervisor technology code-named Viridian?

    A weblog post with info on two amazing services of Amazon:  S3 (Amazon Simple Storage Service) and EC2 (Amazon Elastic Compute Cloud), virtualized computing power. Could Microsoft deliver comparable functionality using their Hypervisor technology code-named Viridian, the new virtualization technology from Microsoft?

    A few weeks ago I attended a presentation by Werner Vogels, CTO of Amazon.com. He stated that Amazon is more than just an online bookshop; it is an IT company. He talked about the possibility of utilizing Amazon's computing power at 10 dollar cents an hour. After some browsing on the Amazon site I found the two amazing services he was mentioning:

    Amazon Simple Storage Service (Amazon S3)

    Amazon S3 is storage for the Internet. It is designed to make web-scale computing easier for developers.

    Amazon S3 provides a simple web services interface that can be used to store and retrieve any amount of data, at any time, from anywhere on the web. It gives any developer access to the same highly scalable, reliable, fast, inexpensive data storage infrastructure that Amazon uses to run its own global network of web sites. The service aims to maximize benefits of scale and to pass those benefits on to developers.

    Amazon S3 Functionality

    Amazon S3 is intentionally built with a minimal feature set.

    • Write, read, and delete objects containing from 1 byte to 5 gigabytes of data each. The number of objects you can store is unlimited.
    • Each object is stored and retrieved via a unique, developer-assigned key.
    • Authentication mechanisms are provided to ensure that data is kept secure from unauthorized access. Objects can be made private or public, and rights can be granted to specific users.
    • Uses standards-based REST and SOAP interfaces designed to work with any Internet-development toolkit.
    • Built to be flexible so that protocol or functional layers can easily be added. Default download protocol is HTTP. A BitTorrent(TM) protocol interface is provided to lower costs for high-scale distribution. Additional interfaces will be added in the future.

    Pricing

    • Pay only for what you use. There is no minimum fee, and no start-up cost.
    • $0.15 per GB-Month of storage used.
    • $0.20 per GB of data transferred.
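    With only two meters, estimating a monthly S3 bill is simple arithmetic. A quick sketch using the prices quoted above (the function name is mine):

```python
def s3_monthly_cost(storage_gb_months, transfer_gb):
    """Monthly S3 cost at the prices quoted in the post:
    $0.15 per GB-month stored, $0.20 per GB transferred."""
    return storage_gb_months * 0.15 + transfer_gb * 0.20

# e.g. 100 GB stored for a month plus 50 GB of transfer
print(round(s3_monthly_cost(100, 50), 2))  # 25.0
```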

    Amazon Elastic Compute Cloud (Amazon EC2) - Limited Beta

    Amazon Elastic Compute Cloud (Amazon EC2) is a web service that provides resizable compute capacity in the cloud. It is designed to make web-scale computing easier for developers.

    Just as Amazon Simple Storage Service (Amazon S3) enables storage in the cloud, Amazon EC2 enables "compute" in the cloud. Amazon EC2's simple web service interface allows you to obtain and configure capacity with minimal friction. It provides you with complete control of your computing resources and lets you run on Amazon's proven computing environment. Amazon EC2 reduces the time required to obtain and boot new server instances to minutes, allowing you to quickly scale capacity, both up and down, as your computing requirements change. Amazon EC2 changes the economics of computing by allowing you to pay only for capacity that you actually use.

    Amazon EC2 Functionality

    Amazon EC2 presents a true virtual computing environment, allowing you to use web service interfaces to requisition machines for use, load them with your custom application environment, manage your network's access permissions, and run your image using as many or few systems as you desire.

    To use Amazon EC2, you simply:

    • Create an Amazon Machine Image (AMI) containing your applications, libraries, data and associated configuration settings. Or use our pre-configured, templated images to get up and running immediately.
    • Upload the AMI into Amazon S3. Amazon EC2 provides tools that make storing the AMI simple. Amazon S3 provides a safe, reliable and fast repository to store your images.
    • Use Amazon EC2 web service to configure security and network access.
    • Use Amazon EC2 web service to start, terminate, and monitor as many instances of your AMI as needed.
    • Pay for the instance hours and bandwidth that you actually consume.

    Service Highlights

    • Elastic
      Amazon EC2 enables you to increase or decrease capacity within minutes, not hours or days. You can commission one, hundreds or even thousands of server instances simultaneously. Of course, because this is all controlled with web service APIs, your application can automatically scale itself up and down depending on its needs.

    • Completely Controlled
      You have complete control of your instances. You have root access to each one, and you can interact with them as you would any machine. Each instance predictably provides the equivalent of a system with a 1.7Ghz Xeon CPU, 1.75GB of RAM, 160GB of local disk, and 250Mb/s of network bandwidth.

    • Designed for use with Amazon S3
      Amazon EC2 works in conjunction with Amazon Simple Storage Service (Amazon S3) to provide a combined solution for computing and storage across a wide range of applications.

    • Reliable
      Amazon EC2 offers a highly reliable environment where replacement instances can be rapidly and reliably commissioned. The service runs within Amazon's proven network infrastructure and datacenters.

    • Secure
      Amazon EC2 provides web service interfaces to control network security. You define groups of instances and their desired accessibility.

    • Inexpensive
      Amazon EC2 passes on to you the financial benefits of Amazon's scale. You pay a very low rate for the compute capacity you actually consume. Compare this with the significant up-front expenditures traditionally required to purchase and maintain hardware, either in-house or hosted. This frees you from many of the complexities of capacity planning, transforms what are commonly large fixed costs into much smaller variable costs, and removes the need to over-buy "safety net" capacity to handle periodic traffic spikes.

    Pricing

    • Pay only for what you use.
    • $0.10 per instance-hour consumed (or part of an hour consumed).
    • $0.20 per GB of data transferred outside of Amazon (i.e., Internet traffic).
    • $0.15 per GB-Month of Amazon S3 storage used for your images (charged by Amazon S3).

    Data transferred within the Amazon EC2 environment, or between Amazon EC2 and Amazon S3, is free of charge (i.e., $0.00 per GB).

    Amazon S3 usage is billed separately from Amazon EC2; charges for each service will be billed at the end of the month.

    It is my feeling that solutions like S3 and EC2 will be the future of software development and deployment. Especially startup companies can benefit from these kinds of solutions: at $72/month you have a server up and running, and you can scale up the number of servers when needed. It is especially useful if you need huge computing power for short periods of time, for example a three-day online campaign.
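    The $72/month figure follows directly from the hourly rate. A quick check, with the rate taken from the pricing above and a month approximated as 30 days:

```python
RATE_PER_INSTANCE_HOUR = 0.10  # from the EC2 pricing above
HOURS_PER_MONTH = 24 * 30      # 30-day month approximation

def ec2_monthly_cost(instances):
    """Compute cost for a number of instances running a full month."""
    return instances * HOURS_PER_MONTH * RATE_PER_INSTANCE_HOUR

print(round(ec2_monthly_cost(1)))  # 72
```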

    After some reading I found out that the EC2 virtualization technology is Xen. Currently only Linux images can be hosted on EC2.

    Viridian, Microsoft’s new virtualization technology

    Although I have been using Microsoft’s virtualization technology for a few years with Virtual Server and Virtual PC, I was never very impressed by its performance compared to VMware. I mostly use it to run SharePoint in a virtual machine, and as you can see in this weblog post by Todd Baginski, VMware is the better option with respect to performance.

    Wouldn’t it be great if we could have EC2-like functionality using Microsoft Windows Server 2003 images? VMware has such capabilities with VMware ESX Server. Microsoft is currently working on similar technology with their Hypervisor-based product code-named Viridian, whose architecture seems to be similar to that of Xen, a technology Microsoft Research contributed to.

    And now someone please build a hosted virtual server model similar to EC2, where we can build our applications using Microsoft .NET platform technology at prices comparable to Amazon's!

    For more info on Viridian: http://www.google.com/search?num=100&hl=en&q=Viridian+virtualization

  • Comparing the features of Microsoft Office SharePoint Server 2007 to SharePoint Portal Server 2003

    On the official blog of the SharePoint product group a link is provided to a spreadsheet comparing the features of MOSS 2007 to those of SPS 2003. It provides a feature comparison between the following products:

    • SharePoint Portal Server 2003
    • Windows SharePoint Services 3.0
    • Office SharePoint Server 2007 for Search
    • Office Forms Server 2007
    • Office SharePoint Server 2007 Standard CAL
    • Office SharePoint Server 2007 Enterprise CAL or for Internet Site

    It is an extensive overview subdivided in the following categories:

    • Collaboration
    • Portal
    • Search
    • Content Management
    • Business Process and Forms
    • Business Intelligence
    • Management
    • Platform

    The Search category definitely proves me wrong on the rumors I had heard with respect to the search functionality available in the Standard CAL versus the Enterprise CAL: only the BDC search functionality is part of the Enterprise CAL.

  • MOSS 2007 - Search only in Enterprise CAL?

    Like WSS2, WSS3 will be free of charge, covered by the standard Windows Client Access License (CAL).

    In the new version of Microsoft Office SharePoint Server 2007 (MOSS 2007) there will be two different CALs: Standard CAL and Enterprise CAL.

    Users of SharePoint Portal Server 2003 (SPS) will be “migrated” to the MOSS 2007 Standard CAL. Extra money has to be paid for the Enterprise CAL.

    No sweat until this point. But rumours have reached me…

    Standard CAL: Workflow, Document Management, Web Content Management, Site Model and Security

    Enterprise CAL: Data Integration (BDC), E-Forms (Forms Server), Excel Services and…. Search

    Although the search in WSS3 (the basis for MOSS 2007) is way better than the current WSS2 search (which is completely different from SPS search) and can now search over complete site collections, I’m more than surprised that the full Search is no longer part of the Standard CAL.

    Current customers of SPS 2003 who use features of the full Search, like indexing Exchange, web sites and the file system, and defining custom properties for search, will not be amused (to say the least).

    Is there someone out there who can confirm this?

    UPDATE: Dustin Miller pointed me to a weblog entry by Arpan Shah, a Group Product Manager for SharePoint Products and Technologies who has more details on Microsoft Enterprise Search and SKU breakdown.

    The following quote is important:

    It's important to note that there are different SKUs available that contain SharePoint search. Depending on your business needs, you want to explore the following.

    1. SharePoint Server for Search. This is a "Search SKU" and the licensing model is dollars/server. It comes in two versions (Standard and Enterprise) and depending on the number of documents, you want to choose the appropriate one. This SKU builds on top of Windows SharePoint Services (WSS) v3 and has all the extensibility that SPS 2003 has today. There's also an upgrade path from the search SKU to the Standard and Enterprise versions of SharePoint Server.

    2. SharePoint Server w/ Standard CAL. This is a Server/CAL licensing model and provides a lot more functionality across the board than the "search SKU". This is also built on WSS v3 and gives you portal and enterprise content management features (for specifics on what SharePoint Server provides, read this post). From a search perspective, this adds the ability to search people and introduces a new extensibility and UI experience known as the Search Center. People search can be enhanced with Knowledge Network for SharePoint which is an add-on if you own this SKU.

    3. SharePoint Server w/ Enterprise CAL. This is a Server/CAL licensing model and in addition to the capabilities provided w/ Standard, this provides rich Business Intelligence and Forms capabilities. From a search perspective, above and beyond what Standard provides, this introduces a new feature known as the Business Data Catalog (BDC) that allows you to easily search structured LOB systems without writing code. Out of the box, we plan to provide integration with SAP and Siebel as well as any database via ADO.NET. Any LOB system that exposes information via XML Web Services can be connected to.

    Please note: While #2 and #3 provide search functionality, they provide rich information management features that help information workers share, collaborate, find and retain information end-to-end. It's more than search!

    In recap (all will be available in the Office 2007 timeframe):

    1. Office SharePoint Server for Search follows a per server licensing model, is extensible, crawls file shares, sharepoint sites, web sites, exchange pfs, lotus notes databases out of the box. it comes in two versions: standard and enterprise - you choose depending on how many docs. it upgrades to the other office sharepoint server skus.

    2. Office SharePoint Server Standard follows a CAL/server licensing model. It provides much richer features than the search SKU. From a search perspective, it provides people search and the Search Center.

    3. Office SharePoint Server Enterprise follows a CAL/server licensing model. It provides more features than Standard such as BI and Forms. From a search perspective, it introduces the Business Data Catalog (BDC) that allows you to connect to LOB systems without writing code.

    Let's hope this SKU approach is true; it would mean that the only search functionality exclusive to the Enterprise version is search through the BDC in LOB systems.

     

  • MOSS2007, Wss3 and extending stsadm.exe

    When you fire up stsadm.exe, the administrative “do it all” tool for SharePoint, you get the idea that it must be possible to add new commands because you see different available commands in different situations.

    I already dived into this a while ago, but I stopped because there were so many other things to look into. Tony Bierman didn’t stop! He went all the way, and was even willing to share it with us in a great blog post including a sample solution with sample commands.

  • SharePoint 2007: using ASP.NET server side code in your pages

    Remember the problems you had in SharePoint 2003 pages because it was not possible to plug in a simple piece of server-side script in your pages? That you always had to write custom controls to accomplish this? Those times could be over, as long as you approach this with great care.

    The web.config file in the SharePoint virtual directory contains the following section:

      <SharePoint>
        <SafeMode MaxControls="200" CallStack="false" DirectFileDependencies="10" TotalFileDependencies="50" AllowPageLevelTrace="false">
          <PageParserPaths>
          </PageParserPaths>
        </SafeMode>
        :
      </SharePoint>

    By default the node <PageParserPaths> is empty. You can add <PageParserPath> nodes to specify the virtual paths where you want to allow server side scripts:

    <PageParserPaths>
            <PageParserPath VirtualPath="/pages/*" CompilationMode="Always" AllowServerSideScript="true" IncludeSubFolders="true"/>
    </PageParserPaths>

    Where CompilationMode is one of the following values:

    • Always: the page should always be compiled (default value)
    • Auto: ASP.NET will not compile the page, if possible
    • Never: the page or control should never be dynamically compiled

    I assume that the AllowServerSideScript and IncludeSubFolders flags speak for themselves.
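    To make the flag semantics concrete, here is a small sketch of how a VirtualPath wildcard plus IncludeSubFolders could be evaluated. This is my own illustration of the apparent semantics, not SharePoint's actual implementation:

```python
def path_matches(virtual_path, pattern, include_subfolders):
    """Match a request path against a PageParserPath VirtualPath
    pattern such as "/pages/*". With include_subfolders=False only
    direct children match; with True anything under the prefix does."""
    if not pattern.endswith("*"):
        return virtual_path == pattern
    prefix = pattern[:-1]  # "/pages/*" -> "/pages/"
    if not virtual_path.startswith(prefix):
        return False
    remainder = virtual_path[len(prefix):]
    return include_subfolders or "/" not in remainder

print(path_matches("/pages/default.aspx", "/pages/*", False))      # True
print(path_matches("/pages/sub/default.aspx", "/pages/*", False))  # False
print(path_matches("/pages/sub/default.aspx", "/pages/*", True))   # True
```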

    Be careful with the virtual paths you specify in your PageParserPaths: anyone who can modify or add a page under the virtual path can insert code that will be executed server-side with no restrictions.

    A good location to specify as a PageParserPath is the location where you store your masterpages, for example /_catalogs/masterpage. You can now add server side script to your masterpages, which makes it available in all pages using this masterpage.

    <PageParserPaths>
            <PageParserPath VirtualPath="/_catalogs/masterpage/*" CompilationMode="Always" AllowServerSideScript="true" IncludeSubFolders="true"/>
    </PageParserPaths>

    There is no documentation available on this functionality. I found two references in the Microsoft SharePoint documentation that mention variations of it: http://msdn.microsoft.com/en-us/library/ms562040.aspx and http://msdn.microsoft.com/en-us/library/ms551625.aspx.

    Maurice Prather also describes the PageParserPath functionality in this blog post.

    Thanks to Stramit for pointing me in the right direction in this blog post on SharePoint navigation.