How to Upload Files to SharePoint 2013 using C# for CRM Dynamics 2013

May 8, 2015

In a previous post I mentioned that we use SharePoint 2013 to store our documents from CRM Dynamics 2013. In addition to users manually placing documents in SharePoint themselves, we have written some NT services that automatically move files from various network locations, as well as emails from Microsoft Exchange mailboxes, into these SharePoint folders. The catch was that we didn't want to install the NT services on the SharePoint servers just to be able to use the SharePoint server object model, so we needed an alternative solution. I found bits and pieces of this code in various places around the web and can't take credit for figuring out all of the details involved.

Here is a great resource for these and other operations in C#:

https://msdn.microsoft.com/en-us/library/fp179912.aspx

And some PowerShell options:

http://www.codeproject.com/Articles/762291/Upload-Files-to-Library-in-SharePoint-using-PowerS

In order to use this solution, you need to download one of the following SharePoint Server 2013 Client Components SDKs, depending on your environment (online or on-premises):

http://www.microsoft.com/en-ca/download/details.aspx?id=35585

or

http://www.microsoft.com/en-us/download/details.aspx?id=42038

Once these components are installed, you can use the following C# code to upload a file from any computer or server that does not have SharePoint installed:

bool fileUploaded = FileUploadUtil.UploadFile(
    fileName, fileContent, "ProjectDocuments", docLocationSubFolderUrl,
    "https://team.yoursite.net", credential, true);
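The credential argument can be any System.Net.ICredentials. Here is a quick sketch of the two common ways to build one; the account name, password, and domain below are placeholders, not values from this post:

```csharp
using System.Net;

static class CredentialExamples
{
    // Run as the account the NT service already uses
    // (works when that account has rights on the SharePoint site).
    public static ICredentials DefaultCredential()
    {
        return CredentialCache.DefaultNetworkCredentials;
    }

    // Explicit credentials; user, password, and domain here are placeholders.
    public static ICredentials ExplicitCredential()
    {
        return new NetworkCredential("svc.upload", "yourPassword", "YOURDOMAIN");
    }
}
```

If the NT service already runs under an account with rights on the site, the default-credentials option avoids storing a password anywhere.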

Here is the content of the FileUploadUtil class. The class has a couple of options: you can call it with the path to a file and it will read the bytes for you, or if you already have the file content as a byte array you can pass that in directly:

using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.Net;
using Microsoft.SharePoint.Client;
using SbiLib.Diagnostics;

namespace SbiLib.SharePoint.Client
{
    public static class FileUploadUtil
    {
        private const string DOCUMENT_LIBRARY = "DocumentLibrary";

        /// <summary>
        /// Use this routine to simply get a list of document libraries in the specified site, for example if you want to offer a dropdown list to the user
        /// </summary>
        /// <param name="siteUrl">The root site (or subsite) url</param>
        /// <returns>A string based list of document libraries that were found</returns>
        public static List<string> GetDocumentLibraryList(string siteUrl)
        {
            List<string> libraries = new List<string>();

            try
            {
                using (ClientContext clientcontext = new ClientContext(siteUrl))
                {
                    // Load Libraries from SharePoint     
                    clientcontext.Load(clientcontext.Web.Lists);
                    clientcontext.ExecuteQuery();
                    foreach (List list in clientcontext.Web.Lists)
                    {
                        try
                        {
                            if (list.BaseType.ToString() == DOCUMENT_LIBRARY)
                            {
                                // only load the items with a base type of a document library
                                libraries.Add(list.Title.ToString());
                            }
                        }
                        catch (Exception ex)
                        {
                            Logger.Log(TraceLevel.Error, "Failed while adding library to collection.", ex);
                        }
                    }
                }
            }
            catch (Exception ex)
            {
                Logger.Log(TraceLevel.Error, "Failed to get list of libraries at the siteUrl specified.", ex);
            }

            return libraries;
        }

        /// <summary>
        /// Use this routine to upload a file to a SharePoint document library if you only have the path of the file you want uploaded
        /// </summary>
        /// <param name="uploadFilePath">The file path including filename to the file that you want uploaded</param>
        /// <param name="destLibraryName">The name of the document library that the file will be placed in</param>
        /// <param name="folderUrl">The sub-folder url within the library (use an empty string for the library's root folder)</param>
        /// <param name="siteUrl">The root site (or subsite) url</param>
        /// <param name="credential">Credentials for accessing the SharePoint site, you can use Default credentials if the process has permissions</param>
        /// <param name="overWriteIfExists">To replace the file if it already exists or not</param>
        /// <returns>success</returns>
        public static bool UploadFile(string uploadFilePath, string destLibraryName, string folderUrl, string siteUrl, 
            ICredentials credential, bool overWriteIfExists)
        {
            // make sure the file exists
            if (System.IO.File.Exists(uploadFilePath))
            {
                try
                {
                    // convert the file in to a byte array and get the actual file name
                    Logger.Log(TraceLevel.Info, "Preparing to convert the file to a byte array and get filename...");
                    byte[] fileContent = System.IO.File.ReadAllBytes(uploadFilePath);
                    string fileName = System.IO.Path.GetFileName(uploadFilePath);

                    Logger.Log(TraceLevel.Info, "Preparing to call the full UploadFile() with the file content...");
                    return UploadFile(fileName, fileContent, destLibraryName, folderUrl, siteUrl, credential, overWriteIfExists);
                }
                catch (Exception ex)
                {
                    Logger.Log(TraceLevel.Error, "Failed to convert or upload the file, look for any previous errors.", ex);
                    return false; 
                }
            }
            else
            {
                // file was not found, return an error
                Logger.Log(TraceLevel.Error, "File not found at path specified, file not uploaded.");
                return false;
            }
        }

        /// <summary>
        /// Use this routine to upload a file to a SharePoint document library if you have the binary component of the file
        /// </summary>
        /// <param name="fileName">File name</param>
        /// <param name="fileContent">Binary file content</param>
        /// <param name="destLibraryName">The name of the document library that the file will be placed in</param>
        /// <param name="folderUrl">The sub-folder url within the library (use an empty string for the library's root folder)</param>
        /// <param name="siteUrl">The root site (or subsite) url</param>
        /// <param name="credential">Credentials for accessing the SharePoint site, you can use Default credentials if the process has permissions</param>
        /// <param name="overWriteIfExists">To replace the file if it already exists or not</param>
        /// <returns>success</returns>
        public static bool UploadFile(string fileName, byte[] fileContent, string destLibraryName, string folderUrl, string siteUrl,
            ICredentials credential, bool overWriteIfExists)
        {
            bool result = false;

            Logger.Log(TraceLevel.Info, "Creating ClientContext...");
            try
            {
                using (ClientContext ctx = new ClientContext(siteUrl))
                {
                    ctx.Credentials = credential;
                    var web = ctx.Web;

                    Logger.Log(TraceLevel.Info, "Creating FileCreationInformation class...");
                    var newFileInfo = new FileCreationInformation();
                    newFileInfo.Content = fileContent;
                    newFileInfo.Overwrite = overWriteIfExists;
                    newFileInfo.Url = fileName;

                    Logger.Log(TraceLevel.Info, "Getting document library by name...");
                    List docs = web.Lists.GetByTitle(destLibraryName);

                    Microsoft.SharePoint.Client.File uploadedFile;
                    if (folderUrl.Length > 0)
                    {
                        Logger.Log(TraceLevel.Info, "Getting document library sub folder by url, and then adding file...");
                        uploadedFile = docs.RootFolder.Folders.GetByUrl(folderUrl).Files.Add(newFileInfo);
                    }
                    else
                    {
                        Logger.Log(TraceLevel.Info, "Adding file to root folder...");
                        uploadedFile = docs.RootFolder.Files.Add(newFileInfo);
                    }
                    ListItem item = uploadedFile.ListItemAllFields;
                    item.Update();

                    // get the current user's context
                    Microsoft.SharePoint.Client.User sharepointUser = ctx.Web.CurrentUser;

                    Logger.Log(TraceLevel.Info, "Preparing to upload file to SharePoint...");
                    try
                    {
                        ctx.Load(sharepointUser);
                        ctx.Load(uploadedFile);
                        // nothing is sent to the server until ExecuteQuery() runs
                        ctx.ExecuteQuery();
                        Logger.Log(TraceLevel.Info, "File uploaded.");
                        result = true;
                    }
                    catch (Exception ex)
                    {
                        Logger.Log(TraceLevel.Error, "Failed to upload the file to the SharePoint siteUrl and document library specified.", ex);
                    }
                }
            }
            catch (Exception ex)
            {
                Logger.Log(TraceLevel.Error, "Failed outside of just uploading the file to sharepoint, error in ClientContext area.", ex);
            }
            return result;
        }
    }
}

Good luck uploading your files! Email me with questions…

-Eric Niemiec

eric@techinternals.com

SharePoint 2013 Keep-Alive or Warm-Up Script – PowerShell Scheduled Task (for CRM Dynamics Document Management)

May 5, 2015

We have built a custom integration between CRM Dynamics 2013 and SharePoint 2013 that enhances the built-in capabilities of the document management feature already in CRM Dynamics 2013. Each time that specific entities are created or updated, the plugin creates a folder for that Lead, Account, Opportunity or Contact and adds the SharePoint document path to the Document Locations for that Entity. The plugin will also rename the folder if required; this allows the user to store related documents at any level in CRM/SharePoint and access the files through CRM, Windows Explorer, SharePoint, and any other SharePoint sync programs that you might be using.

Due to this integration, we have faced speed issues in our CRM Dynamics 2013 plugins because the SharePoint sub-sites being referenced in the code are not spun up and ready to react to incoming requests. While looking for the right solution, I borrowed some code from several others who have solved this issue in the past for their own environments (as we all have slightly different environments and needs).

Some things to keep in mind if you are using CRM Dynamics in this capacity: the CRM Dynamics 2013 List component really doesn't work as designed in child site collections (i.e. mysharepointsite.com/sites/someothersite), but it does work well in the root or in sub-sites – hopefully they will fix this in later releases of CRM Dynamics or the List component. Also, be careful with your plugin design: these plugins can easily be called too often, which could slow down other functions or have adverse effects on your system.

Here are some of the folks I borrowed code from, so you can look at the differences and come up with your own solution:

http://sharepointryan.com/2011/03/29/warming-up-sharepoint-2010-using-powershell/

https://sharepointobservations.wordpress.com/2014/02/21/sharepoint-2013-warm-up-script/

Here is the final script that we are running as a scheduled task on all of the front ends in the farm. I have introduced a sleep command because I would rather have it take its time going through the almost 200 sub-sites we have than add artificial load on our servers.

 

Add-PSSnapin Microsoft.SharePoint.PowerShell
 
Start-SPAssignment -Global 

$username = "SBI\proc.xrm"
$password = "133T*&" | ConvertTo-SecureString -asPlainText -Force
$cred = New-Object System.Management.Automation.PSCredential($username, $password)

$apps = Get-SPWebApplication
foreach ($app in $apps)
{
    $sites = Get-SPSite -WebApplication $app.url -Limit ALL -ErrorAction SilentlyContinue -WarningAction SilentlyContinue
    foreach ($site in $sites) 
    {
        $url = $site.Url + "/SitePages/Home.aspx"
        $res = Invoke-WebRequest -URI $url -Credential $cred
        $status = $url + ": " + $res.StatusCode
        Write-Host $status;
        Start-Sleep -m 1000

        foreach ($subsite in $site.AllWebs) 
        {
            $url = $subsite.Url + "/SitePages/Home.aspx"
            $res = Invoke-WebRequest -URI $url -Credential $cred
            $status = $url + ": " + $res.StatusCode
            Write-Host $status;
            Start-Sleep -m 1000
        }
    }
}
 
Stop-SPAssignment -Global

While using this script, if you receive a '401 Unauthorized' message from 'Invoke-WebRequest', pay close attention to the line '$url = $site.Url + "/SitePages/Home.aspx"' – the request failed for me when I was only referencing the root of the site or sub-site. This has to do with the 'Minimum Download Strategy' feature on the site and how you have it set. I spent a few hours chasing my tail on the 401 error, trying all sorts of suggestions I found online as to what might be causing it, including modifying the 'Policy for Web Application' area of SharePoint under Application Management.

I would also strongly suggest running this process under a user that has permissions to perform these tasks (and using -UseDefaultCredentials) rather than saving the password in the file. I will most likely convert this PowerShell script to C# and add it to other NT services that I already have running, rather than proliferating the 'Scheduled Task' nightmare that some system administrators create for themselves and for those who fill their shoes after they leave the company.
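As a sketch of what that C# conversion might look like – the class and method names here are mine, and this assumes the list of site URLs would come from configuration rather than the SharePoint cmdlets:

```csharp
using System;
using System.Net;
using System.Threading;

static class WarmUp
{
    // Append the home page so the request is not rejected with a 401
    // when hitting only the root of the site (see the note above).
    public static string HomePageUrl(string siteUrl)
    {
        return siteUrl.TrimEnd('/') + "/SitePages/Home.aspx";
    }

    public static void WarmSite(string siteUrl)
    {
        HttpWebRequest req = (HttpWebRequest)WebRequest.Create(HomePageUrl(siteUrl));
        req.UseDefaultCredentials = true; // no password stored in the code
        using (HttpWebResponse res = (HttpWebResponse)req.GetResponse())
        {
            Console.WriteLine("{0}: {1}", req.RequestUri, (int)res.StatusCode);
        }
        Thread.Sleep(1000); // same pacing as the PowerShell version
    }
}
```

Running this under the service account with UseDefaultCredentials keeps the password out of the file entirely, which was the point of the suggestion above.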

Good Luck!

Eric Niemiec

eric@techinternals.com

How to move all Microsoft SharePoint 2013 Databases to a New Instance of SQL Server

April 22, 2015

I wrote a blog post late last year on how to move SQL Server instances from one cluster to another, and recently found myself needing to move a SharePoint instance from a standalone Hyper-V machine to our bare-metal physical database cluster. The main difference from the last instance move is that the SERVER\INSTANCE_NAME would change this time around rather than staying the same and just moving to a different home. If the SERVER\INSTANCE_NAME didn't need to change, things would have been easier. I started by referencing my old post, and let the journey begin:

https://ericniemiec.wordpress.com/2014/11/25/how-to-move-a-sql-server-2008-r2-clustered-database-instance-to-a-new-sql-server-2012-cluster/

There are two main ways to move SharePoint databases: the easy network-alias method, where you don't really tell SharePoint that you moved the databases (and then have to live with that alias forever), or the real way, in which you do tell SharePoint that you moved the databases so it is aware of what you did. The main process for both methods can be found in the KB article below, but it is never as easy as just following their steps. I believe the key to success on this move is to keep the old instance up and running while you perform the changes – then you can stop the SQL Server services on the old instance to test (pay attention to this below). DO NOT start hacking the registry in a brute-force attempt to just make it work.

https://technet.microsoft.com/en-us/library/cc512725.aspx

To get started, I followed a lot of the steps in my previous blog post, which I will leave out here and only cover the important ones:

  1. Make the SharePoint instance unavailable to your users – I did this through my load balancers. You want to make sure that no one is changing the databases after you have made your backups, in case you need to roll back.
  2. Just to be sure, I also stopped IIS and all related SharePoint services on my web front ends.
  3. Make sure that all your SQL users from the old instance have been moved over to the new instance; follow this KB article to help you out: http://support.microsoft.com/kb/918992 (this is a bit of an art, as you need to modify the file before you execute it – you may have old users, references to default databases that don't exist yet, etc.)
  4. I run a script that puts all the databases into Simple recovery, shrinks them and their log files, and then backs up ALL the databases in my old SharePoint 2013 instance (SQL Server 2012). This makes the file copy to the new server go much faster.
  5. Copy the backup files to the new instance on the cluster (SQL Server 2014) and run the restore scripts (as seen in my last post).
  6. Once the databases are restored on the new instance, test access to them with the same user that SharePoint will use from the web front ends (use osql to test if you need to). You are then ready to move on to the SharePoint-related steps – because up to this point, you have not really touched SharePoint.
  7. On the SharePoint server, open the SharePoint 2013 Management Shell and run the following command to pipe all your databases out to a text file: Get-SPDatabase | Out-File C:\databases.txt
  8. Once you have the text file with all of the SharePoint databases in it, use it to build a script that makes all the changes for you in a single pass. Take a single line like this and repeat the process for each line in databases.txt (the GUID is the only thing you really keep from each source line):

    StateService_fee01336… 04d5b4bc-5341-4388-a870-7922a01928fe Microsoft.Offi…

    And convert it to this:

    $db = Get-SPDatabase -Identity 04d5b4bc-5341-4388-a870-7922a01928fe

    $db.ChangeDatabaseInstance("<YourNewServer\InstanceNameHere>")

    $db.Update()

    Write-Host $db.DatabaseConnectionString

  9. Once you have the file created/converted, you can either test one database or run the entire file. I suggest testing a single database first and confirming that the new SERVER\INSTANCE_NAME has indeed been changed (check $db.DatabaseConnectionString). Make sure you have followed these rules to this point: no manual registry changes AND the old database instance is still online and accessible!

  10. The next step is to make ONE SINGLE registry change: update the SERVER\INSTANCE_NAME at this ONE location with the new name you used above:

[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Shared Tools\Web Server Extensions\15.0\Secure\ConfigDB]
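For reference, the value being updated under that key holds the configuration database connection string (on SharePoint 2013 installations this is typically the REG_SZ value named dsn – verify the value name in your own registry before editing anything). A before/after sketch with placeholder server names:

```
"dsn"="Data Source=OLDSERVER\OLDINSTANCE;Initial Catalog=SharePoint_Config;Integrated Security=True"
```

becomes

```
"dsn"="Data Source=NEWSERVER\NEWINSTANCE;Initial Catalog=SharePoint_Config;Integrated Security=True"
```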

  11. Now that the heavy lifting is done, follow these steps to wrap it all up and test it.
    1. Close all SharePoint 2013 Management Shells that are currently open
    2. Stop the SQL Server services on the OLD instance that you are moving away from (or shut the machine down for now – and hopefully for good)
    3. Open a new SharePoint 2013 Management Shell and run the following command: Get-SPDatabase
      1. If you see your list of databases returned, you know it is connected to the new database instance and everything should work fine
      2. If you see a bunch of red errors, something is wrong and it can’t connect to the new database instance
    4. Now you can either restart IIS and all the related SharePoint 2013 services on your web front ends, or just reboot them (my suggestion is to reboot them) and wait for the system to come back online
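The line-by-line conversion described in step 8 is mechanical enough to script. Here is a rough C# sketch that generates the PowerShell commands from databases.txt; the class name and the GUID regex are mine, so check its output against your own file before running anything:

```csharp
using System.IO;
using System.Text;
using System.Text.RegularExpressions;

static class MoveScriptGenerator
{
    // Matches the database GUID in each line of the Get-SPDatabase output.
    static readonly Regex GuidPattern = new Regex(
        @"\b[0-9a-fA-F]{8}-[0-9a-fA-F]{4}-[0-9a-fA-F]{4}-[0-9a-fA-F]{4}-[0-9a-fA-F]{12}\b");

    // Emits the four PowerShell lines from step 8 for every GUID found.
    public static string Generate(string databasesTxt, string newInstance)
    {
        var sb = new StringBuilder();
        foreach (string line in databasesTxt.Split('\n'))
        {
            Match m = GuidPattern.Match(line);
            if (!m.Success) continue;
            sb.AppendLine("$db = Get-SPDatabase -Identity " + m.Value);
            sb.AppendLine("$db.ChangeDatabaseInstance(\"" + newInstance + "\")");
            sb.AppendLine("$db.Update()");
            sb.AppendLine("Write-Host $db.DatabaseConnectionString");
        }
        return sb.ToString();
    }
}
```

Usage would be reading C:\databases.txt with File.ReadAllText, passing your new SERVER\INSTANCE_NAME, and pasting the result into the Management Shell.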

If you have any issues, first check your database connectivity to the old and new instances while you are running the PowerShell commands. You can very easily roll the entire process back by putting your old instance name in the PowerShell commands and re-running them to point everything back to your old instance, updating the one item in the registry, and rebooting your web front ends.

It is really not a complicated process, but the KB article doesn't cover all the bases. You should be comfortable with SQL Server, SharePoint and PowerShell in order to get through this without too much heartache.

Good Luck!

-Eric Niemiec

Disabling SSL v3 on Windows 2008 R2, 2012 R2 Server & ASP.Net Applications

February 15, 2015

We had been seeing warnings from our vendors for weeks, telling us that they were turning off support for SSL v3. Not sure how this would affect us, we basically ignored the emails. Several weeks later, we found ourselves having to deal with the problem head on, and now we know how it affects folks like us.

Not only do we sell gift certificates from our web sites but we also have several integrations with outside vendors. We quickly identified that it was an SSL v3 issue causing the problem, but could not find an all-inclusive article on how we should resolve the problem in our environment.

Here are some of the errors that we were seeing from our C# and ASP.Net 4.0 applications:

  1. The underlying connection was closed: An unexpected error occurred on a send. —> System.IO.IOException: The handshake failed due to an unexpected packet format.
  2. The underlying connection was closed: An unexpected error occurred on a receive.
  3. The client and server cannot communicate, because they do not possess a common algorithm
  4. The handshake failed due to an unexpected packet format.

What first set me on the right path, as we too were having a problem with a third-party payment gateway, was this article (my responses are at the bottom of it):

http://stackoverflow.com/questions/26742054/the-client-and-server-cannot-communicate-because-they-do-not-possess-a-common-a

After applying partial fixes from various articles, we found that the solution was simple – as long as you implemented the entire solution.

First, in your .Net code, you need to include the line:

ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls (for all .Net 4.0 and lower applications)

ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls12 (for all .Net 4.5 and higher applications)

For reference: https://msdn.microsoft.com/en-us/library/system.net.securityprotocoltype(v=vs.110).aspx

I simply put this line in the constructor of every class that performed some type of encrypted communication with a vendor. In some classes I already had the line, but it was set to the Ssl3 option. Also note that ONLY TLS 1.0 is available in .Net 4.0 and below; you can only use TLS 1.1 or TLS 1.2 in .Net 4.5 and above.
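As a concrete sketch of where that line lives – the class here is illustrative, only the ServicePointManager assignment matters:

```csharp
using System.Net;

public class GatewayClient
{
    public GatewayClient()
    {
        // .NET 4.0 and lower: TLS 1.0 is the best available option.
        ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls;

        // .NET 4.5 and higher, use TLS 1.2 instead:
        // ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls12;
    }
}
```

The setting is process-wide static state, so one assignment early in the application's life (e.g. Application_Start) works just as well as repeating it per class.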

The second and final fix was to make sure that the protocols were correctly configured on the servers, Server 2012 R2 in my case. What I found was a mix of people assuming that some protocols were already set one way, and only setting the ones that they thought they cared about for their situation – especially when other teams might have been using the server for a .Net application of a different version.

Here is my complete registry settings file; it turns off SSL v2 and SSL v3 for both client and server applications and enables the TLS protocols.

After applying these registry settings, adding that single line of code to my .Net 4.0 applications and rebooting the server – ALL PROBLEMS went away. I have repeated this fix across multiple servers including Windows 2008 R2, 2012 and 2012 R2.

The only thing you need to be aware of is that when you upgrade your applications to .Net 4.5 and above, you need to change the SecurityProtocolType in your code and modify the registry settings to disable TLS 1.0 and TLS 1.1, keeping only TLS 1.2 enabled.

Here is a good resource to test your servers (and browser) to make sure everything is configured as you need it to be, it will show you how you rank and what you should fix. Be careful not to over-fix your server for your situation otherwise you will break everything again. The initial goal is to disable SSL v3, not TLS 1.0.

https://www.ssllabs.com/ssltest/

Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\SSL 2.0\Client]
"DisabledByDefault"=dword:00000001
"Enabled"=dword:00000000

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\SSL 2.0\Server]
"DisabledByDefault"=dword:00000001
"Enabled"=dword:00000000

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\SSL 3.0\Client]
"DisabledByDefault"=dword:00000001
"Enabled"=dword:00000000

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\SSL 3.0\Server]
"DisabledByDefault"=dword:00000001
"Enabled"=dword:00000000

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.0\Client]
"DisabledByDefault"=dword:00000000
"Enabled"=dword:00000001

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.0\Server]
"DisabledByDefault"=dword:00000000
"Enabled"=dword:00000001

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.1\Client]
"DisabledByDefault"=dword:00000000
"Enabled"=dword:00000001

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.1\Server]
"DisabledByDefault"=dword:00000000
"Enabled"=dword:00000001

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.2\Client]
"DisabledByDefault"=dword:00000000
"Enabled"=dword:00000001

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.2\Server]
"DisabledByDefault"=dword:00000000
"Enabled"=dword:00000001


How to get the Server Name, Instance Name, Computer Name, Data Path, Log Path and Backup Path from a SQL Server 2014 Cluster

December 17, 2014

From time to time I need to write dynamic scripts that need the current instance name, and sometimes more specifically, the actual Computer Name of the cluster node that I am on. Here are a couple of helpful scripts that get me that information.

Some of the SERVERPROPERTY calls do not work on SQL Server 2008 R2, and you may need to fall back to the registry calls if you really need the data.

-- gives you the 'machine name': 'SQL-MYINSTANCE'
SELECT SERVERPROPERTY('MachineName')

-- gives you the 'server name': 'SQL-MYINSTANCE\MYINSTANCE'
SELECT SERVERPROPERTY('ServerName')

-- gives you the local 'ComputerName' of the server in the cluster
SELECT SERVERPROPERTY('ComputerNamePhysicalNetBIOS')

-- this also gives you the 'physical machine name' of the machine that you are running the command on: 'S-TOL-SQL3'
DECLARE @dir VARCHAR(4000)
EXEC master.dbo.xp_regread
    N'HKEY_LOCAL_MACHINE',
    N'SYSTEM\CurrentControlSet\Control\ComputerName\ComputerName',
    'ComputerName',
    @dir OUTPUT
SELECT @dir

-- gets the default data path
SELECT SERVERPROPERTY('INSTANCEDEFAULTDATAPATH')

-- gets the default log path
SELECT SERVERPROPERTY('INSTANCEDEFAULTLOGPATH')

-- gets the default backup path
DECLARE @dir VARCHAR(4000)
EXEC master.dbo.xp_instance_regread
    N'HKEY_LOCAL_MACHINE',
    N'Software\Microsoft\MSSQLServer\MSSQLServer',
    'BackupDirectory',
    @dir OUTPUT
SELECT @dir


How to move a SQL Server 2008 R2 or SQL Server 2012 clustered instance to a new SQL Server 2014 cluster

November 25, 2014

Here are the steps that I took while upgrading a SQL Server 2008 R2, 2012 & 2014 mixed version SQL Server cluster that was running on Windows Server 2008 R2 to a new cluster running strictly SQL Server 2014 and Windows Server 2012 R2. I don’t cover all the steps (such as moving AS and SSRS), but I will try to give a good overview of my environment and what I needed to do and prepare for.

We started with a two-node Windows 2008 R2 Server cluster running SQL Server 2008 R2 with 8 instances on it. Over time, we slowly upgraded some SQL Server instances to SQL Server 2012 and some all the way to SQL Server 2014, so at the end of the day I had all three versions of SQL Server running on this cluster. With our load increasing, it was time to add another node to the cluster for performance and HA purposes, and we took this as a good opportunity to upgrade the entire cluster in the process. This post is dedicated to the first step of that move: taking one instance from the old cluster and moving it to the new cluster.

We built the new Dell R920 as a single-node cluster running Windows Server 2012 R2 and ONLY SQL Server 2014, atop a new VNX2 SAN. Our goal is to move half of our instances over to the new cluster, tear down one of the Dell R910 servers, reinstall it with Windows Server 2012 R2, and add it to the new cluster. Then we will move the other half of the instances over, tear down the remaining Dell R910, reinstall it with Windows Server 2012 R2, and add it as well, forming a new three-node Windows 2012 R2 cluster running only SQL Server 2014. We have purposefully decided to skip upgrading the pre-2014 instances to SQL Server 2014 to save time during this process – and because we can! (When you restore a database from a previous version of SQL Server, the restore process automatically upgrades the database for you.) During this type of upgrade, we will temporarily lose some of our HA capabilities, as one cluster or the other will be a single-node cluster until the entire process is complete.

I tried to document the entire process, so here we go…

  1. Since the instance that I am moving is named SQL-PRIVATE\PRIVATE, I simply named the new cluster SQL-PRIVATEX\PRIVATE since both will be renamed at the end. My plan is to move all of the databases to the new instance, rename the old instance to SQL-PRIVATEZ\PRIVATE, then rename the new instance back to the original name SQL-PRIVATE\PRIVATE. Doing it this way, no clients needed to know about the change as the connection strings never changed (DNS just takes a few minutes to catch up on the new IP – see commands below to help speed this up).
  2. Since SQL Server 2014 can now handle CSV volumes for the database engine but not AS, I made a rule that I would use shared disks for any instance that had AS installed, and CSV’s for all the others. When Microsoft finally decides to support AS in CSV’s, I will move the files. We are however using a local SSD drive for our TempDB (new feature starting in SQL Server 2012).  Be aware that the current version of Microsoft DPM DOES NOT support backing up databases that live on CSV volumes.  As we are currently migrating from DPM to DataDomain and Networker for our backups, I had to back up to Azure using the new Backup to URL feature in SQL Server 2014 until our new backup solution is in place.
  3. I made sure that the new instance on the new cluster was configured exactly the way I wanted it, very similar if not identical to the old instance on the old cluster. This included things such as the following (I have created scripts for most of these steps):
    1. Making sure that all my users from the old instance have been moved over to the new instance; this KB article will help: http://support.microsoft.com/kb/918992 (this is a bit of an art, as you need to modify the generated file before you execute it – you may have old users, references to default databases that don’t exist yet, etc.)
    2. Made sure all my instance level options were set such as fill factor, compression and paths
    3. Setup all my agent operators
    4. Configured my database mail settings
    5. Set my custom Agent properties, using the above operators and database mail settings, among other things.
    6. Recreated all the linked servers that existed on the old instance
    7. Be aware of any reporting server or analysis server configurations.
    8. Took a look at all the jobs that would need to be recreated on the new instance after the databases were moved – including any backup jobs.
    9. In my example, this instance did not have AS or SSRS installed to worry about.
  4. I created a script to help me generate the backup scripts and restore scripts from the old instance on the old cluster. It worked well enough for me – I just needed to update the paths when restoring to the new instance, but it got me close enough, especially for instances with a lot of databases on them, such as SharePoint sites. Here is how it worked:
    1. Run one script to generate the backup folders and the backup commands
    2. Execute the backup commands to backup the databases on the old instance
    3. Copy the backup files to the new instance
    4. Run the second script to generate the restore commands on the old instance
    5. Copy the resulting commands to the new instance and fix the restore paths as they will be slightly different.
    6. Execute the restore commands on the new instance

      (please keep in mind that the databases in this example were simple (not small) databases – ONE data file and ONE log file. You will have to change the script or the restore commands' WITH MOVE options if you have more than that.)

  5. I practiced everything above on the new instance before getting to the point where I tested renaming the clustered instances. In my case, I had a full clustered test instance to test the entire process with before running through it on a production instance.
  6. Here is how my night went on the final run:
    1. Stopped all applications and web sites that were talking with any of the databases. You might need to put your old instance in single user mode to stop applications from talking to it. Check Activity Monitor to make sure all conversations have stopped.
    2. Stopped all Agent and backup jobs, including things like DPM or other third party backups programs.
    3. Backed up the databases using the scripts mentioned above
    4. Took all of the databases in the old instance offline to make sure any services or applications were not connecting to them (I would rather have failed connections than to be modifying data in ‘old’ databases).
    5. Copied the database backups to the new server
    6. Restored the databases using the scripts mentioned above
    7. Updated the database owner and compatibility level
    8. Set the recovery model to simple, then shrank the data and log files
    9. Renamed the old instance to SQL-PRIVATEZ\PRIVATE using this KB article: http://msdn.microsoft.com/en-us/library/ms178083.aspx, making sure to run these commands on the server where I performed the rename:
      1. ipconfig /flushdns
      2. ipconfig /registerdns
      3. nbtstat -RR
    10. Waited about 5 minutes for the change to propagate in DNS
    11. Renamed the new instance to SQL-PRIVATE\PRIVATE and ran these commands on the new instance server:
      1. ipconfig /flushdns
      2. ipconfig /registerdns
      3. nbtstat -RR
    12. Waited about 5 minutes for the change to propagate in DNS
    13. Checked my routes on the new server and confirmed that other servers could see the new IP address
    14. Tested one of my applications from a remote server
    15. Set the databases back into the full recovery model, re-enabled my backups, and performed a full backup to kick off the backup set
    16. Re-enabled (after recreating) my Agent jobs
    17. Tested my Linked servers (after creating them)
    18. Enabled all the applications, load balancers and websites
    19. Monitored the new instance!
  7. A few things to think about and consider, as most folks have many instances and have to repeat the same steps many times:
    1. Create generic scripts to do your work – ones that are not instance specific
    2. Make sure to upgrade all of the databases to the SQL Server 2014 compatibility level and set sa as the owner rather than the domain account that you were logged in with (or some other owner – remember, if your database is owned by a domain user and that domain user goes away, you will have problems).
    3. Make sure to use the same port if a static port was used on the old instance
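To make a few of the night-of steps above concrete, here is a minimal T-SQL sketch. The database name is hypothetical; you would run the first two statements on the old instance and the last two on the new instance after the restore:

```sql
-- On the OLD instance: stop new connections, then take the database offline
-- (hypothetical database name used for illustration)
ALTER DATABASE [YourDatabase] SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
ALTER DATABASE [YourDatabase] SET OFFLINE;

-- On the NEW instance, after the restore: upgrade the compatibility level
-- to SQL Server 2014 and make sa the owner instead of a domain account
ALTER DATABASE [YourDatabase] SET COMPATIBILITY_LEVEL = 120;
ALTER AUTHORIZATION ON DATABASE::[YourDatabase] TO sa;
```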

It really did turn out to be an easy process once the research was done. Note: renaming the old clustered instance did generate an error (Windows 2008 R2), but it worked just fine after hitting Cancel on the error.

Good luck!

-Eric Niemiec

Creates the backup folders and the script to execute to create the backups:

(I am sure there are better ways to write some of the components of these scripts)

USE MASTER
GO
-- To allow advanced options to be changed.
EXEC sp_configure 'show advanced options', 1
GO
-- To update the currently configured value for advanced options.
RECONFIGURE
GO
-- To enable the feature.
EXEC sp_configure 'xp_cmdshell', 1
GO
-- To update the currently configured value for this feature.
RECONFIGURE
GO

EXEC sp_MSforeachdb @command1='
-- set the database context to the current database
USE [*]
IF (DB_NAME() NOT IN (''master'', ''model'', ''tempdb'', ''msdb''))
BEGIN
    DECLARE @logLogicalFileName VARCHAR(100)
    DECLARE @logFilePathName VARCHAR(1000)
    DECLARE @dataLogicalFileName VARCHAR(100)
    DECLARE @dataFilePathName VARCHAR(1000)
    DECLARE @tmpMD VARCHAR(1000)
    DECLARE @databaseName VARCHAR(500) = DB_NAME()
    DECLARE @defaultBackupDirectory VARCHAR(1000) = REPLACE(CONVERT(VARCHAR(1000), SERVERPROPERTY(''INSTANCEDEFAULTDATAPATH'')), ''\DATA\'', ''\Backup\'')

    -- fileid 1 is the primary data file, fileid 2 is the log file
    SELECT @dataLogicalFileName = b.name, @dataFilePathName = b.filename FROM master.dbo.sysdatabases a INNER JOIN master.dbo.sysaltfiles b ON a.dbid = b.dbid WHERE a.dbid = DB_ID() AND b.fileid = 1
    SELECT @logLogicalFileName = b.name, @logFilePathName = b.filename FROM master.dbo.sysdatabases a INNER JOIN master.dbo.sysaltfiles b ON a.dbid = b.dbid WHERE a.dbid = DB_ID() AND b.fileid = 2

    -- create the backup folders via xp_cmdshell
    SET @tmpMD = ''MD '' + @defaultBackupDirectory + ''TempTransfer\''
    EXEC master.dbo.xp_cmdshell @tmpMD
    SET @tmpMD = ''MD '' + @defaultBackupDirectory + ''TempTransfer\'' + REPLACE(@databaseName, '' '', ''_'') + ''\''
    EXEC master.dbo.xp_cmdshell @tmpMD

    SET @defaultBackupDirectory = @defaultBackupDirectory + ''TempTransfer\'' + REPLACE(@databaseName, '' '', ''_'') + ''\'' + REPLACE(@databaseName, '' '', ''_'') + ''_'' + CONVERT(VARCHAR(2), DATEPART(MONTH, GETDATE())) + ''_'' + CONVERT(VARCHAR(2), DATEPART(DAY, GETDATE())) + ''_'' + CONVERT(VARCHAR(4), DATEPART(YEAR, GETDATE())) + ''.bak''

    PRINT ''BACKUP DATABASE ['' + @databaseName + ''] TO DISK = N'''''' + @defaultBackupDirectory + '''''' WITH COPY_ONLY, NOFORMAT, INIT, COMPRESSION, STATS = 5, NAME = N'''''' + @databaseName + '''''' ''
END
', @replacechar='*'
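For illustration, if the script above were run on, say, May 8, 2015 against a hypothetical database named Projects, the printed backup command would look something like the following (the E:\MSSQL\... path is an assumption for the example – the actual path depends on your instance's default data path):

```sql
BACKUP DATABASE [Projects] TO DISK = N'E:\MSSQL\Backup\TempTransfer\Projects\Projects_5_8_2015.bak'
    WITH COPY_ONLY, NOFORMAT, INIT, COMPRESSION, STATS = 5, NAME = N'Projects'
```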

Creates the scripts to restore the databases on the new server (you will need to fix the beginning of the paths):

USE MASTER
GO
-- To allow advanced options to be changed.
EXEC sp_configure 'show advanced options', 1
GO
-- To update the currently configured value for advanced options.
RECONFIGURE
GO
-- To enable the feature.
EXEC sp_configure 'xp_cmdshell', 1
GO
-- To update the currently configured value for this feature.
RECONFIGURE
GO

EXEC sp_MSforeachdb @command1='
-- set the database context to the current database
USE [*]
IF (DB_NAME() NOT IN (''master'', ''model'', ''tempdb'', ''msdb''))
BEGIN
    DECLARE @logLogicalFileName VARCHAR(100)
    DECLARE @logFilePathName VARCHAR(1000)
    DECLARE @dataLogicalFileName VARCHAR(100)
    DECLARE @dataFilePathName VARCHAR(1000)
    DECLARE @databaseName VARCHAR(500) = DB_NAME()
    DECLARE @defaultBackupDirectory VARCHAR(1000) = REPLACE(CONVERT(VARCHAR(1000), SERVERPROPERTY(''INSTANCEDEFAULTDATAPATH'')), ''\DATA\'', ''\Backup\'')

    -- fileid 1 is the primary data file, fileid 2 is the log file
    SELECT @dataLogicalFileName = b.name, @dataFilePathName = b.filename FROM master.dbo.sysdatabases a INNER JOIN master.dbo.sysaltfiles b ON a.dbid = b.dbid WHERE a.dbid = DB_ID() AND b.fileid = 1
    SELECT @logLogicalFileName = b.name, @logFilePathName = b.filename FROM master.dbo.sysdatabases a INNER JOIN master.dbo.sysaltfiles b ON a.dbid = b.dbid WHERE a.dbid = DB_ID() AND b.fileid = 2

    SET @defaultBackupDirectory = @defaultBackupDirectory + ''TempTransfer\'' + REPLACE(@databaseName, '' '', ''_'') + ''\'' + REPLACE(@databaseName, '' '', ''_'') + ''_'' + CONVERT(VARCHAR(2), DATEPART(MONTH, GETDATE())) + ''_'' + CONVERT(VARCHAR(2), DATEPART(DAY, GETDATE())) + ''_'' + CONVERT(VARCHAR(4), DATEPART(YEAR, GETDATE())) + ''.bak''

    PRINT ''RESTORE DATABASE ['' + @databaseName + ''] FROM DISK = N'''''' + @defaultBackupDirectory + '''''' WITH FILE = 1, '' +
        ''MOVE N'''''' + @dataLogicalFileName + '''''' TO N'''''' + @dataFilePathName + '''''', '' +
        ''MOVE N'''''' + @logLogicalFileName + '''''' TO N'''''' + @logFilePathName + '''''', '' +
        ''NOUNLOAD, REPLACE, STATS = 5''
END
', @replacechar='*'
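Again for illustration, the second script would print a restore command along these lines for the same hypothetical Projects database (the logical file names and paths are assumptions; remember to fix the beginning of the paths to match the new server):

```sql
RESTORE DATABASE [Projects] FROM DISK = N'E:\MSSQL\Backup\TempTransfer\Projects\Projects_5_8_2015.bak'
    WITH FILE = 1,
    MOVE N'Projects' TO N'E:\MSSQL\DATA\Projects.mdf',
    MOVE N'Projects_log' TO N'E:\MSSQL\DATA\Projects_log.ldf',
    NOUNLOAD, REPLACE, STATS = 5
```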


How to Install SQL Server 2008 R2 or SQL Server 2012 Reporting Services on a Failover Cluster

November 21, 2013

As many of you are aware, SQL Server Reporting Services is not cluster aware – so they say. If you try adding it to a failover cluster installation, setup does not give you the option because you are dealing with a cluster – but there is a way to do this. To be clear, SQL Server Reporting Services does not participate in Windows Failover Clustering, and is therefore considered not cluster aware from that standpoint.

Before you install a SQL Server failover cluster, you should do some planning. For example, Analysis Services can’t be added after the first node has been installed without uninstalling the entire instance. Reporting Services, however, can be added later since it is not technically cluster aware.

My recommendation when installing a new SQL Server failover cluster node is to include Analysis Services and native Reporting Services – you can always disable them later if you choose not to use them.

Let’s focus on Reporting Services for a minute. When installing your first node, you should include native Reporting Services. If you don’t, and want to go back and add it later, use the following setup switch, which allows you to add it on any node of the cluster:

:\Setup.exe /SkipRules=StandaloneInstall_HasClusteredOrPreparedInstanceCheck /Action=Install

Even if you install SQL Server Reporting Services on the first node, when you install a second node, SQL Server Reporting Services will not be installed – you must use the above switch to add it after the node has been added.

There are two fundamental approaches you can use when installing SQL Server Reporting Services: install it on each node of the cluster that the instance is installed on, OR install it on servers that are not running SQL Server at all. Let’s go through some of the pros and cons of each.

If you have a small implementation and simply have a cluster set up for redundancy purposes, and server load is not a concern, installing SQL Server Reporting Services on each of the clustered nodes is not a bad thing, as long as it is installed on every node the clustered instance can be active on and they are all joined in a scale-out deployment. This allows you to use the SQL cluster network name to access the Reporting Services web service and Report Manager. It acts as a poor man’s load balancer, but the catch is that it always brings you to the Reporting Services instance on the active node and doesn’t really load balance at all – it’s a convenience thing (no need to set up NLB or use something like a KEMP load balancer).

On the other hand, in a heavy-use environment where load on the production SQL Servers is an issue, you should install SQL Server Reporting Services on dedicated servers, point them to a database on the clustered instance, and join them in a scale-out deployment. SQL Server Reporting Services does not have to be installed on a database server; it just needs a database to store its configuration in. While SQL Server Reporting Services appears to be instance aware, it’s just a website that you configure with a URL that typically happens to have the instance name in it (which it does not have to have). Once you have the servers installed and joined in a scale-out deployment, you can put NLB or load balancers over the top of them to perform REAL load balancing.
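As a quick sanity check after joining servers to a scale-out deployment, you can list the report servers that have joined by querying the Keys table in the Reporting Services catalog database (this sketch assumes the default ReportServer catalog database name):

```sql
-- Run against the Reporting Services catalog database (default name assumed)
USE [ReportServer]
GO
SELECT MachineName, InstanceName
FROM dbo.Keys
WHERE MachineName IS NOT NULL
```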

There are always multiple ways to skin a cat – consider all your options before settling on any one approach to your reporting needs. For example, consider off-loading reporting to a replica server in SQL Server 2012 (there are several options to consider in both 2008 and 2012).

Hope this helps!

-Eric Niemiec
