Saturday, June 18, 2011

How to set up database mirroring on a SharePoint live system

Scenario:
            Let's take a SharePoint 2010 farm installation with two front-ends, WFE01 and WFE02, and two SQL Server 2008 R2 instances called SQLDB01 and SQLDB02.
            Let's install a third SQL Server instance (which can be SQL Express) to serve as the witness in this mirroring scenario. We will host it on the SQLDB02 machine, and the instance will be called SQLDB02\Witness.
            When installing the SharePoint farm, point both web front ends to the same SQL Server, SQLDB01. We will keep SQLDB02 separate from the farm, with SQLDB01 mirrored onto SQLDB02 for fail-over in case of disaster.

The instructions below cover a high-safety synchronous mirroring configuration. This means a SQL Server instance acting as witness must be present to enable automatic fail-over; without a witness, manual intervention is needed in case of disaster. For reference, database mirroring has three operating modes: high safety with automatic fail-over (synchronous, with a witness), high safety without automatic fail-over (synchronous, no witness) and high performance (asynchronous).

Two main actions need to be taken to enable mirroring:
  • Configure mirroring in SharePoint Central Administration at web application level - by using the UI
  • Configure SQL mirroring on the SQL Server instances - by running several SQL scripts

In detail:
  1. Configure mirroring in SharePoint for the web application on port 80 with database WSS_Content. In Central Administration, open the settings of the content database and set the "Failover Database Server" to SQLDB02.
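Step 1 can also be scripted. A minimal PowerShell sketch, run from the SharePoint 2010 Management Shell and assuming the content database and server names from the scenario above:

```powershell
# Tell SharePoint which server to fail over to when the principal is unavailable.
# WSS_Content and SQLDB02 are the names from the scenario above.
$db = Get-SPDatabase | Where-Object { $_.Name -eq "WSS_Content" }
$db.AddFailoverServiceInstance("SQLDB02")
$db.Update()
```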


  2. Run step by step:
  • Run the following on SQLDB01 for each database you want mirrored. Only mirrored databases will be available in case of disaster. This script sets the full recovery model on the main WSS_Content database to be mirrored. Full recovery is required in order to take a transaction log backup, which mirroring needs.
alter database WSS_Content set recovery full
go
  • Take full backups of all SharePoint databases and place the backups into the folder “U:\Backups\”.
backup database WSS_Content to disk = 'U:\Backups\WSS_Content.bak' 
go
  • Take transactional log backups and save them with “*_log.bak” into the folder “U:\Backups\”.
BACKUP LOG WSS_Content to disk = 'U:\Backups\WSS_Content_log.bak'
go
  • Copy the backup files over to SQLDB02, then restore all databases there in “NORECOVERY” mode.
restore database WSS_Content from disk = 'T:\restores\WSS_Content.bak' with norecovery, 
move 'WSS_Content' TO 'T:\DATA\WSS_Content.mdf',
move 'WSS_Content_log' TO 'U:\LOGS\WSS_Content_log.ldf' 
go
  • Restore all transactional logs on SQLDB02 in “NORECOVERY” mode:
restore log WSS_Content from disk = 'T:\restores\WSS_Content_log.bak' with norecovery
go
  • Enable mirroring by following the instructions:
--enable mirroring on SQLDB01 - principal, SQLDB02- mirror and SQLDB02\Witness as witness
--run this on SQLDB01 and on SQLDB02
CREATE ENDPOINT Mirroring
     STATE=STARTED
     AS TCP(LISTENER_PORT=5022)
     FOR DATABASE_MIRRORING(
     AUTHENTICATION=WINDOWS NEGOTIATE,
     ENCRYPTION=SUPPORTED,
     ROLE=PARTNER)
     GO 
--run on SQLDB02\Witness - a different port is needed because the witness shares the host (and IP) with the mirror
CREATE ENDPOINT Mirroring
     STATE=STARTED
     AS TCP(LISTENER_PORT=5023)
     FOR DATABASE_MIRRORING(
     AUTHENTICATION=WINDOWS NEGOTIATE,
     ENCRYPTION=SUPPORTED,
     ROLE=WITNESS)
     GO
--run this on the mirrored instance SQLDB02
ALTER DATABASE WSS_Content
SET PARTNER= N'TCP://SQLDB01:5022'
GO
--run this on SQLDB01
ALTER DATABASE WSS_Content
SET PARTNER= N'TCP://SQLDB02:5022'
GO
--run this on SQLDB01 to assign the witness
ALTER DATABASE WSS_Content
SET WITNESS= N'TCP://SQLDB02:5023'
GO
  • Test that mirroring is working by checking that each database is marked “Principal” or “Mirror”.
  • Test fail-over by stopping the SQL Server service on SQLDB01 and verifying that the site on port 80 still works. When SQLDB01 is down, the SQLDB02 databases should no longer show “Mirror”; they take over as principal.
  • If you then restart the service on SQLDB01 and stop it on SQLDB02, SQLDB01 once again becomes the principal. If you restart SQLDB02 after that, it becomes the mirror. The witness ensures that the surviving partner is always the principal, so the databases keep serving the latest data.
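The Principal/Mirror check above does not have to be done by eye; a query against the sys.database_mirroring catalog view shows the current role and state of every mirrored database:

```sql
-- Run on either partner; rows exist only for mirrored databases
SELECT db_name(database_id) AS database_name,
       mirroring_role_desc,
       mirroring_state_desc,
       mirroring_witness_state_desc
FROM sys.database_mirroring
WHERE mirroring_guid IS NOT NULL;
```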

Friday, June 17, 2011

Folder-level document migration in SharePoint 2010

How many times have you encountered a scenario where, as part of a site collection migration, a cleanup of the document libraries was also much needed? From the maze of folders and sub-folders, only particular levels of identified data need to be migrated, possibly into a completely different folder structure or even a different document library. In many cases, the new folder structure does not exist yet, or would take a long time to create due to the number of sub-levels.

Unfortunately, SharePoint 2010 offers only coarse-grained tools for document migration, such as the commonly used PowerShell commands Export-SPWeb and Import-SPWeb, which let you import and export sites, lists, document libraries and items.
Similarly, the Central Administration Backup/Restore utility only works at site or list level.


So after lots of searching, I started writing my own tool for migrating data at folder level, recursively if specified, and with automatic creation of the folder structure in the destination site.
The tool expects a single CSV file (read through the Excel/Jet OLE DB provider) with a list of source folder/list names and the desired destination structures.
The file should have the following header (these are the column names the code reads): SourceList, SourceFolder, Mode, DestinationSite, DestinationList, DestinationFolder

So what happens in code?
  1. Read the CSV file for instructions on what to migrate
    string connectionString = ConfigurationSettings.AppSettings["ExcelConnectionString"];
    string excelSheetName = ConfigurationSettings.AppSettings["ExcelSheetName"];

    // ds is a DataSet field that step 3 iterates over
    ds = new DataSet();
    using (OleDbConnection myConnection = new OleDbConnection(connectionString))
    {
        myConnection.Open();
        OleDbCommand myCommand = new OleDbCommand("SELECT * FROM " + excelSheetName + ";", myConnection);
        OleDbDataAdapter adapter = new OleDbDataAdapter(myCommand);
        adapter.Fill(ds);
    }

  2. Read the source and destination web application URLs
    public void ReadWebApps()
    {
        // Fall back to an empty string if a key is missing from config
        sourceurl = ConfigurationSettings.AppSettings["sourceurl"] ?? string.Empty;
        desturl = ConfigurationSettings.AppSettings["desturl"] ?? string.Empty;
    }
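For illustration, the appSettings consumed in steps 1 and 2 might look like the sample App.config below. The connection string uses the Jet text driver, where Data Source is the folder containing the CSV and the table name is the file name with the dot replaced by '#'; all paths and URLs here are assumptions for this example:

```xml
<configuration>
  <appSettings>
    <!-- Example values only; adjust paths and URLs to your environment -->
    <add key="ExcelConnectionString"
         value="Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\migration\;Extended Properties=&quot;text;HDR=Yes;FMT=Delimited&quot;" />
    <add key="ExcelSheetName" value="[input#csv]" />
    <add key="sourceurl" value="http://sourcewebapp" />
    <add key="desturl" value="http://destwebapp" />
  </appSettings>
</configuration>
```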

  3. Migrate recursively by following the CSV line by line
    In the Mode column, "-" means "migrate the specified root folder only", "*" means "recursively migrate all structures underneath", and any other value names a single file to migrate.
    Make sure you use a logging method, such as my "WriteLogAndConsole" below, to track which rows fail. I've removed most logging calls for the purpose of this post.

    using (SPSite sourcesite = new SPSite(sourceurl))
    {
        WriteLogAndConsole("Source web app found: " + sourcesite.Url + ".");
        using (SPWeb sourceweb = sourcesite.RootWeb)
        {
            // Disable form digest checks for the duration of the migration
            webApp = sourcesite.WebApplication;
            webApp.FormDigestSettings.Enabled = false;
            foreach (DataRow row in ds.Tables[0].Rows)
            {
                // Read the CSV row values
                sourcelistname = row["SourceList"].ToString();
                sourcefolderurl = row["SourceFolder"].ToString();
                destsite = row["DestinationSite"].ToString();
                destlistname = row["DestinationList"].ToString();
                destfolderurl = row["DestinationFolder"].ToString();
                if (destfolderurl == string.Empty)
                {
                    destfolderurl = sourcefolderurl;
                    WriteLogAndConsole("Destination folder not set. A new destination folder will be created with the source folder name and URL.");
                }
                string modevalue = row["Mode"].ToString();
                if (modevalue == string.Empty)
                {
                    mode = migrationMode.None;
                    WriteLogAndConsole("Mode value set to nothing. Nothing will be migrated.");
                }
                else
                {
                    // "*" = recursive, "-" = files in the root folder only,
                    // anything else is the name of a single file to migrate
                    mode = (modevalue == "*" ? migrationMode.Recursively : (modevalue == "-" ? migrationMode.FilesOnly : migrationMode.File));
                    if (mode == migrationMode.File)
                        sourcefile = modevalue;
                }
                WriteLogAndConsole("Source List: " + sourcelistname + "; Source Folder: " + sourcefolderurl + "; Migration Mode: " + mode.ToString() + "; destination site: " + destsite + "; destination list: " + destlistname + "; destination folder: " + destfolderurl + ".");

                if (mode != migrationMode.None)
                {
                    // Find the source list, then the source folder by URL
                    sourcelist = sourceweb.Lists[sourcelistname];
                    if (sourcelist != null)
                    {
                        foreach (SPListItem fld in sourcelist.Folders)
                        {
                            sourcefolderurl = (sourcefolderurl.StartsWith("/") ? sourcefolderurl : "/" + sourcefolderurl);
                            if (fld.Folder.Url == sourcelist.RootFolder.Name + sourcefolderurl)
                            {
                                // For each CSV row (source folder) perform a migration:
                                // recursively, files only, or just the specified file
                                MigrateFiles(mode, fld, sourceweb, destsite, destlistname, destfolderurl, sourcefile);
                            }
                        }
                    }
                }
            }
            webApp.FormDigestSettings.Enabled = true;
        }
    }



  4. The actual migration logic

    foreach (SPFile file in folder.Files)
    {
        // This will be the new URL of each file
        destURL = destweb.Url + "/" + destfolder.Url + "/" + file.Name;
        // Migrate the file's earlier versions first
        AddVersions(file, folder.Url, destfiles, sourceweb, destweb);
        SPFile copyFile;

        string modby = (file.Properties["vti_modifiedby"] != null ? file.Properties["vti_modifiedby"].ToString() : string.Empty);
        string modtime = (file.Properties["vti_timelastmodified"] != null ? file.Properties["vti_timelastmodified"].ToString() : string.Empty);
        SPUser spusr = null;
        try
        {
            // Adds the user to the destination web if it does not already exist;
            // throws an exception if the user cannot be resolved there
            spusr = destweb.EnsureUser(modby);
        }
        catch (Exception)
        {
            spusr = destweb.CurrentUser;
        }

        // Check whether the current version is a major (published) version,
        // e.g. 1.0, and publish the copied file accordingly
        if (file.Level.ToString() == "Published")
        {
            WriteLogAndConsole("Adding and publishing file '" + file.Url + "'.");
            copyFile = destfiles.Add(destURL, file.OpenBinaryStream(), file.Properties, spusr, spusr, Convert.ToDateTime(modtime), Convert.ToDateTime(modtime), file.CheckInComment, true);
            copyFile.Publish(file.CheckInComment + " date modified: " + modtime + " modified by: " + modby);
        }
        else
        {
            WriteLogAndConsole("Adding file '" + file.Url + "'.");
            copyFile = destfiles.Add(destURL, file.OpenBinaryStream(), file.Properties, spusr, spusr, Convert.ToDateTime(modtime), Convert.ToDateTime(modtime), file.CheckInComment + " date modified: " + modtime, true);
        }
    }
    if (mod == migrationMode.Recursively)
    {
        // Call this function on subfolders as well
        foreach (SPFolder subfld in folder.SubFolders)
        {
            // Create the destination subfolder first
            WriteLogAndConsole("Begin recursive migration of subfolders at level '" + subfld.Name + "'.");
            WriteLogAndConsole("Creating destination subfolder with same name as source '" + subfld.Name + "'.");
            SPFolder destfd = destfolder.SubFolders.Add(subfld.Name);
            RecursivelyMigrate(subfld, mod, destfd.Files, sourceweb, destweb, destfd);
        }
    }

  5. Migrate versions of documents
    There is an excellent blog post on how to iterate through each document version and migrate it here. You can find most of the code, with full comments, at the referenced post; the method below adds my own changes for the modified-by/modified-date fields:


    public void AddVersions(SPFile file, string folderURL, SPFileCollection destfiles, SPWeb sourceweb, SPWeb destweb)
    {
        try
        {
            SortedList myList = new SortedList();
            ICollection items = myList.Keys;
            if (file.Versions.Count != 0)
            {
                WriteLogAndConsole("File '" + file.Url + "' has versions to migrate.");
                // Strip the version URL down to its internal version number and
                // collect the numbers in a sorted list, so versions migrate oldest-first
                foreach (SPFileVersion ver in file.Versions)
                {
                    string tempKey = Regex.Replace(ver.Url, "_vti_history/", "");
                    tempKey = Regex.Replace(tempKey, "/" + folderURL, "");
                    tempKey = Regex.Replace(tempKey, "/" + file.Name, "");
                    myList.Add(int.Parse(tempKey), "");
                }
            }
            foreach (object key in items)
            {
                foreach (SPFileVersion newVer in file.Versions)
                {
                    string temp = Regex.Replace(newVer.Url, "_vti_history/", "");
                    temp = Regex.Replace(temp, "/" + folderURL, "");
                    temp = Regex.Replace(temp, "/" + file.Name, "");
                    if (temp == key.ToString())
                    {
                        SetVersion(int.Parse(temp), destfiles, newVer, sourceweb, destweb);
                    }
                }
            }
        }
        catch (Exception ex)
        {
            WriteLogAndConsole("Add versions failed with error: " + ex.Message + ex.StackTrace);
        }
    }
    public void SetVersion(int num, SPFileCollection destfiles, SPFileVersion newVer, SPWeb web, SPWeb destweb)
    {
        try
        {
            Stream verFileStream = newVer.OpenBinaryStream();

            // SharePoint stores the internal version number as major * 512 + minor,
            // so a major version (1.0, 2.0, ...) is an exact multiple of 512
            int baseNum = 512;
            int i = (num / baseNum) * baseNum;

            string modby = (newVer.Properties["vti_modifiedby"] != null ? newVer.Properties["vti_modifiedby"].ToString() : string.Empty);
            string modtime = (newVer.Properties["vti_timelastmodified"] != null ? newVer.Properties["vti_timelastmodified"].ToString() : string.Empty);

            SPUser spusr = null;
            try
            {
                spusr = destweb.EnsureUser(modby);
            }
            catch (Exception)
            {
                spusr = destweb.CurrentUser;
            }

            // Major version (e.g. 1.0, 2.0, 3.0): add and publish
            if (num == i)
            {
                SPFile copFileVers = destfiles.Add(destURL, verFileStream, newVer.Properties, spusr, spusr, Convert.ToDateTime(modtime), Convert.ToDateTime(modtime), newVer.CheckInComment, true);
                copFileVers.Publish(newVer.CheckInComment + " date modified: " + modtime + " modified by: " + modby);
            }
            // Minor version (e.g. 0.1, 1.1, 2.3): just add
            else
            {
                destfiles.Add(destURL, verFileStream, newVer.Properties, spusr, spusr, Convert.ToDateTime(modtime), Convert.ToDateTime(modtime), newVer.CheckInComment + " date modified: " + modtime, true);
            }
        }
        catch (Exception ex)
        {
            WriteLogAndConsole("Set version failed with error: " + ex.Message + ex.StackTrace);
        }
    }
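As an aside, the arithmetic in SetVersion relies on SharePoint's internal version numbering, where each version is stored as major * 512 + minor. A small Python illustration of that encoding (not part of the tool itself):

```python
def split_version(ui_version):
    """Return (major, minor) for a SharePoint internal version number,
    which is stored as major * 512 + minor."""
    return divmod(ui_version, 512)

def is_major(ui_version):
    """Major versions (1.0, 2.0, ...) are exact multiples of 512,
    which is what SetVersion's num == i check detects."""
    return ui_version % 512 == 0

# 513 decodes to version 1.1; 1024 decodes to the major version 2.0
```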