Wednesday, September 9, 2015

Things to watch on SharePoint migrations

I've recently worked on a large migration project from SharePoint 2007 to 2013/Office 365.

One of the goals of the initial phase of the project was to establish whether the sites could migrate entirely to Office 365, or whether an on-premises installation would be required alongside Office 365. Naturally, an on-premises installation would imply a hybrid scenario, with all the infrastructure overhead that comes with it.

There is an array of tools out there that can collect the majority of the information you need on your content: DocAve from AvePoint, Metalogix, or ShareGate reports.
These tools can go as deep as reporting on:

  • All features installed, by site URL
  • Size of sites (GB)
  • The last time a site was used
  • Site templates
  • Custom list templates
  • Custom web parts, etc.
In reality, there is not much need for scripting anything in order to find out exactly what your farm(s) contain and how they are used.

The same vendors offer migration tools (e.g. Content Matrix) that allow for a very configurable migration at any level (web application, site collection, or even list level).

The real challenge is customizations and their portability to SharePoint 2013 or Office 365.
When it comes to customizations, one needs to be able to quickly differentiate between customizations that can be replaced with new SharePoint/Office 365 functionality and customizations that will require more digging around and possibly conversations with business users.

Things to watch for that can be easily overlooked:

  • remember un-ghosted forms/pages: not many tools report on these, but you can write a short PS script that will give you the list of those in a farm (see the sketch after this list)
  • InfoPath forms - these can be migrated nicely, but they have data connections that need to be updated (also via script if necessary)
  • workflows
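
For the un-ghosted pages, something along the lines below will produce the farm-wide list. This is a minimal sketch that runs against the server object model directly (so it also works on a MOSS 2007 box, where there is no SharePoint snap-in); the output path is just an example:

    [void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint")

    $unghosted = foreach ($webApp in [Microsoft.SharePoint.Administration.SPWebService]::ContentService.WebApplications) {
        foreach ($site in $webApp.Sites) {
            foreach ($web in $site.AllWebs) {
                # pages sitting directly in the web's root folder
                $files = @($web.Files)
                # plus pages stored in document libraries (e.g. Pages, Site Pages)
                foreach ($list in $web.GetListsOfType([Microsoft.SharePoint.SPBaseType]::DocumentLibrary)) {
                    foreach ($item in $list.Items) { if ($item.File) { $files += $item.File } }
                }
                foreach ($file in $files) {
                    if ($file.CustomizedPageStatus -eq "Customized") {
                        New-Object PSObject -Property @{ Web = $web.Url; Page = $file.ServerRelativeUrl }
                    }
                }
                $web.Dispose()
            }
            $site.Dispose()
        }
    }
    $unghosted | Export-Csv C:\reports\unghosted-pages.csv -NoTypeInformation
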
Possible approaches:
  • work at site collection level OR, if you need to go down to site level, make sure you work within the hierarchy of sites! Labeling a particular sub-site without customizations as "migration-ready" may be incorrect if the parent site collection has custom features.
  • Use a data analysis tool like PowerPivot to aggregate all the reports. This is more about making sense of numerous reports than about developing the reports. Remember, there are tools out there that do this for you!
  1. PowerPivot is a great tool that allows you to aggregate various reports (similarly to a database with relationships and joins)
  2. PowerPivot will allow you to expand/collapse on a hierarchy of sites within site collections
  3. PowerPivot will allow you to slice and dice by custom WSPs, features, site collection admin, etc.
  4. PowerPivot will make the new information architecture (IA) evident to you, especially in the case of huge numbers of sites.
  5. And last, but not least, it will give you the final list of sites and their attributes in CSV/Excel format, so you can take that and "feed" it to any migration tool in an automated manner.
As a result of the slicing and dicing, you will be able to show value to the business users and avoid a hybrid approach.

Have fun migrating!!

Tuesday, August 26, 2014

SharePoint 2010 Enterprise Search Grouping

In the previous post, we discussed sorting the search results of the Core Search Results Web Part by our custom search properties. This can be achieved by extending the web part and overriding the ConfigureDataSourceProperties() method.
It is also possible to group the results by a property or even a combination of properties.

For example, say I have two managed properties called 'MyDepartment' and 'MyReportType' and I would like to see the results grouped by all possible combinations of these two properties, such as:

Asset Management - Account Activity  -

  • Account Activity for Account A and Quarter 1
  • Account Activity for Account B Q4

Asset Management - Asset Acquisitions -

  • Asset Acquisitions 2014
  • Asset Acquisitions Feb 2014

Asset Management - Quarterly Income Reports +
Compliance - Quarterly Compliance Report  +
Investment Management - Risk Analysis +
Investment Management - Risk Decomposition +

Each group contains a suite of reports, sorted within the group by other managed properties, such as a specific date property.
The group-by criteria are shown in bold, and on expand (+) the group shows all of its results.

As a first step, the group-by properties need to be added to the sort-by collection, in their order of relevance. Note that 'MyDepartment' is the first sort criterion, alphabetically ascending, followed by 'MyReportType':

                    // get the datasource and add the group-by properties to the sort order
                    // (sortBy and sd come from the query-string parsing shown in the previous post)
                    CoreResultsDatasource dataSource = this.DataSource as CoreResultsDatasource;
                    if (sortBy != string.Empty)
                    {
                        dataSource.SortOrder.Add("mydepartment", sd);
                        dataSource.SortOrder.Add("myreporttype", sd);
                    }


In the XSLT we generate the presentation of the groups by applying the Muenchian method, found on Jeni Tennison's site:

  1. First we declare a key in the declaration area of the XSLT:
    <xsl:key name="gg" match="/All_Results/Result" use="concat(mydepartment, myreporttype)"/>
  2. We modify the regular body template to perform two for-eaches, one for the group-by criteria and another one for the results within each group-by value:

<xsl:template name="dvt_1.body">
    <xsl:for-each select="/All_Results/Result[count(. | key('gg', concat(mydepartment, myreporttype))[1]) = 1]">
        <td colspan="8">
            <xsl:value-of select="mydepartment"/>
            <xsl:text> - </xsl:text>
            <xsl:value-of select="myreporttype"/>
            <xsl:text> + </xsl:text> <!-- additional html and javascript to achieve expand-collapse -->
        </td>
        <xsl:for-each select="key('gg', concat(mydepartment, myreporttype))">
            <xsl:sort select="myeffectivedate" order="descending"/>
            <xsl:call-template name="dvt_1.rowview"/>
        </xsl:for-each>
    </xsl:for-each>
</xsl:template>

3. Inside the inner for-each, specify the sort criterion to be the date column. The results are rendered by calling the dvt_1.rowview template.

Tuesday, July 29, 2014

SharePoint 2010 Enterprise Search Custom Sort

When you want to sort your search results by something more than just Modified date or Relevance, such as your own custom managed properties, a way to achieve this in SharePoint 2010 is to extend the Core Search Results Web Part, as shown below.
The Core Search Results Web Part only offers page sizes of up to 50 items out of the box, but you can also override that via the extension.

The web part now reads the sort criteria and direction from the query string, as well as an extra parameter that allows the user to specify whether the results should be paged at all or displayed on a single page. If they are paged, rather than using the limit of 50, a custom web part property is used, which in this example defaults to 100.

    public class ExtendedSearchResultsWebPart : CoreResultsWebPart
    {
        int customResultsPerPage = 100;

        [WebDescription("Results per page")]
        [WebDisplayName("Results per page")]
        public int CustomResultsPerPage
        {
            get { return customResultsPerPage; }
            set { customResultsPerPage = value; }
        }

        protected override void ConfigureDataSourceProperties()
        {
            // let the base web part configure the data source first
            base.ConfigureDataSourceProperties();

            if (this.ShowSearchResults)
            {
                try
                {
                    bool viewAll = false;
                    string sortBy = string.Empty;
                    Microsoft.Office.Server.Search.Query.SortDirection sd = Microsoft.Office.Server.Search.Query.SortDirection.Descending;

                    if (this.Page.Request.QueryString["pall"] == "1")
                        viewAll = true;

                    if (this.Page.Request.QueryString["sort"] != null)
                    {
                        sortBy = this.Page.Request.QueryString["sort"];
                        if (this.Page.Request.QueryString["sd"] != null)
                            sd = this.Page.Request.QueryString["sd"] == "ascending"
                                ? Microsoft.Office.Server.Search.Query.SortDirection.Ascending
                                : Microsoft.Office.Server.Search.Query.SortDirection.Descending;
                    }
                    else
                    {
                        sortBy = "MYCUSTOMDEFAULTPROPERTY";
                    }

                    // get the datasource and change the sort order
                    CoreResultsDatasource dataSource = this.DataSource as CoreResultsDatasource;
                    dataSource.ResultsPerPage = (viewAll == false ? CustomResultsPerPage : 5000);
                    if (sortBy != string.Empty)
                        dataSource.SortOrder.Add(sortBy, sd);
                }
                catch (Exception ex)
                {
                    ULSLogging.LogError("MYLOGGING", "Search: " + ex.Message, ex.StackTrace);
                }
            }
        }
    }


Tuesday, February 18, 2014

Custom Actions for list items assigned programmatically

Custom actions are usually deployed in a declarative manner, via an Elements.xml file. They can be deployed to a specific content type, in case your document library inherits from that content type, such as below:

<?xml version="1.0" encoding="utf-8"?>
<Elements xmlns="http://schemas.microsoft.com/sharepoint/">
  <!-- RegistrationId is the target content type ID (0x0101 is the base Document type) -->
  <CustomAction Id="ApproveRejectMonthlyReport" Location="EditControlBlock"
                RegistrationType="ContentType" RegistrationId="0x0101"
                Title="Approve/Reject Monthly Report">
    <UrlAction Url="javascript:MyNamespace.retrieveListItems({ItemId});"/>
  </CustomAction>
</Elements>
However, when your document library gets deployed via code, such as a feature receiver, and especially if the document library is not associated with a content type, there is the option to attach a custom action programmatically, as in the example below.

                    Guid formlibID = web.Lists.Add("My Library", string.Empty, SPListTemplateType.XMLForm);
                    SPList formLib = web.Lists[formlibID];

                    SPUserCustomAction printForm = formLib.UserCustomActions.Add();
                    printForm.Title = "Print Form";
                    printForm.Url = "javascript:MyNamespace.printForm({ItemId},'{ItemUrl}')";
                    printForm.Location = "EditControlBlock";
                    printForm.Update();  // persist the custom action

In both examples, the custom action performs some JavaScript logic (in this case the two functions each open up a modal popup) that takes in the current item's out-of-the-box ID and URL.

Thursday, May 9, 2013

Double AD profiles in User Profile Service?

This is directly related to working with Claims.

The User Profile Service imports AD accounts and sets the 'identifier' of the object to be the sAMAccountName (this is the AD property that looks like sampledomain\johndoe).

When a claims web application accesses the User Profile Service (MySites, for example, if configured to use Claims), it looks up the profile by its own identifier, which is the claims token, in a format like this: i:0#.f|ldapmember|johndoe

The token will not be found, so MySites will generate a new user profile and set the URL to the personal site (yet another property in User Profiles), and this is how you end up with two records that refer to the same person.

The first record is the one generated by the User Profile Service, with properties such as First Name, Last Name, and whatever other properties you have mapped to be imported from AD, while the other is the one generated by MySites, which only has the token and the URL of the personal site set.

The solution is to map the two identifiers to each other, so that when a claims-based app queries the User Profile Service, it finds the profile by token.

The first property, Claim User Identifier, refers to the token, and the mapped AD property called sAMAccountName refers to the domain\user format.

Once mapped, any user profile action will follow this rule. For already-existing duplicates, the token-only record needs to be deleted as clean-up.
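
To find the token-only duplicates, a short script will do. This is a minimal sketch, assuming a SharePoint 2010 farm; the site URL is an example (any site in the web application associated with the User Profile Service proxy works), and the removal line is commented out on purpose so the candidate list can be reviewed first:

    Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

    $site = Get-SPSite "http://mysites.sampledomain.com"
    $context = Get-SPServiceContext $site
    $upm = New-Object Microsoft.Office.Server.UserProfiles.UserProfileManager($context)

    foreach ($profile in $upm) {
        $account = $profile["AccountName"].Value
        if ($account -like "i:0*") {
            # a claims-token account name, i.e. a profile generated by MySites
            Write-Host "Duplicate candidate: $account"
            # $upm.RemoveUserProfile($account)
        }
    }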

Once the mapping is in place, new MySites visits will resolve to the existing AD-imported profile instead of generating a duplicate.

This solution assumes that when you set up the UPS AD synchronization connection, you configure it for Claims as well.

Saturday, May 4, 2013

Migrating CSV data into SharePoint lists

A common scenario for migrating data into SharePoint involves CSV files being imported into SharePoint lists via the Datasheet view.

However, there are some limitations, such as multi-lookup values, where data has to be provided in the following format: "5;#technology;#3;#science". The same format has to be provided for columns of type 'Person or Group', such as "67;#John Doe;#123;#Anne Jackson". As a result, you cannot use the Datasheet view and you have to upload the CSV programmatically.

This post focuses on reading the CSV file programmatically. You have two options:
  • if you use PowerShell, you can use the Import-Csv cmdlet and then access the data like this:

    $csvFile = Import-Csv c:\folder\file.csv
    foreach ($line in $csvFile) { $title = $line.Title }  # each column is addressable by its CSV header name ('Title' here is an example)

  • if you use C#, you can read the CSV file via an OLEDB driver that you need to install locally to enable your 64-bit code (you have to run on 64-bit to be able to execute the SharePoint API calls for the actual lookup of values and list item creation).

    You can find the driver here: Microsoft Access Database Engine 2010 Redistributable.
    This driver is simply a replacement for JET OLEDB for server applications, since JET only runs on 32 bits.

    string connectionString = @"Provider=Microsoft.ACE.OLEDB.12.0;Data Source=c:\folder_name\;Extended Properties='text;HDR=Yes;FMT=Delimited'";

    string csvFileName = "file.csv";

    DataTable csvData = new DataTable();
    using (OleDbConnection myConnection = new OleDbConnection(connectionString))
    using (OleDbCommand myCommand = new OleDbCommand("SELECT * FROM [" + csvFileName + "]", myConnection))
    using (OleDbDataAdapter adapter = new OleDbDataAdapter(myCommand))
    {
        adapter.Fill(csvData);  // each CSV row becomes a DataRow
    }
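
From there, the actual list item creation is a matter of resolving each lookup value to its ID and building the ;#-delimited string. Below is a minimal PowerShell sketch under assumed example names: a target list 'Reports' with a multi-lookup column 'Categories' pointing at a 'Categories' list, and a CSV column holding semicolon-separated category names:

    [void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint")
    $web = (New-Object Microsoft.SharePoint.SPSite("http://server/sites/target")).OpenWeb()
    $targetList = $web.Lists["Reports"]
    $lookupList = $web.Lists["Categories"]

    foreach ($line in (Import-Csv C:\folder\file.csv)) {
        $item = $targetList.Items.Add()
        $item["Title"] = $line.Title
        # resolve each name in the semicolon-separated CSV column to an "id;#value" pair
        $pairs = foreach ($name in ($line.Categories -split ';')) {
            $match = $lookupList.Items | Where-Object { $_.Title -eq $name.Trim() } | Select-Object -First 1
            if ($match) { "{0};#{1}" -f $match.ID, $match.Title }
        }
        $item["Categories"] = $pairs -join ";#"   # e.g. "5;#technology;#3;#science"
        $item.Update()
    }
    $web.Dispose()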


Sunday, January 13, 2013

How to use SharePoint's OOB error page to show a friendly error message

In many cases, such as a feature activation, it is best not to overwrite the out-of-the-box error page with a custom one, but to have the page show a custom message so that users know where the custom code you wrote failed.

Envision the following scenario: you have an OOB site collection and on custom feature activation, you try to:

  • bind custom fields deployed by the feature to a metadata term store
  • add these custom fields to content types
  • create subsites programmatically
  • set permissions programmatically, etc.
Wrap each action in its own try-catch statement and, inside the catch, call your custom error handler method that writes to the ULS logs.

In addition to writing to the ULS logs, have your method also do a forced Response.Redirect to the OOB error page, with "ErrorText" in the QueryString object:

  //write to log first, then:

   string idcurr = CorrelationId.GetCurrentCorrelationToken().ToString();
   errorMessage = HttpUtility.UrlEncode(errorMessage);
   HttpContext.Current.Response.Redirect("/_layouts/error.aspx?ErrorText="+ errorMessage + "&ErrorCorrelationId=" + idcurr);

The ErrorCorrelationId QueryString parameter will ensure that the ID you see on the error page matches what you see in the ULS logs.
The actual error handling, if custom, will generate its own ID when writing to the logs.
These two need to match, so that the custom error message written to the logs can be associated with the ID shown on the page.
The only way to grab the latest correlation ID generated is via the class below; I am referencing this blog, where I found the solution to the issue:

public class CorrelationId
{
    public const uint EVENT_ACTIVITY_CTRL_GET_ID = 1;

    [DllImport("advapi32.dll")]  // the correlation ID is the current ETW activity ID
    public static extern uint EventActivityIdControl(uint controlCode, ref Guid activityId);

    public static Guid GetCurrentCorrelationToken()
    {
        Guid g = Guid.Empty;
        EventActivityIdControl(EVENT_ACTIVITY_CTRL_GET_ID, ref g);
        return g;
    }
}