Channel: Secure Infrastructure Blog

How to configure SQL Database mail to send emails using Office 365 (Exchange Online): A walkthrough


Introduction

SQL Server has a feature called Database Mail, which allows the database server to send emails to any external recipient through an SMTP server. The problem arises when you have an on-premises SQL Server and your Exchange server is online in the cloud (Office 365). How can you use this cloud-hosted Exchange server to send database emails?

This blog post provides a complete walkthrough on how to configure this. This is based on the description provided in the KB article http://support.microsoft.com/kb/2600912.

The Walkthrough

The walkthrough consists of the following steps.

Step 1: Get the SMTP settings for your Exchange online server

1-      Go to http://dev.office.com to sign up for a trial account for Office 365.

2-      After the Exchange service is provisioned, go to the tenant administration page and click Outlook
clip_image002[4]

3-      Click the settings (gear) icon and then click Options
clip_image005[4]

4-      Click on “Settings for POP or IMAP access…”
clip_image007[4]

5-      Take note of the SMTP server settings
clip_image009[4]

Take note of these server settings; they will be used in the next step.

Step 2: Install and configure an On-prem SMTP server

Next you will need to install an SMTP server in your network to relay to Exchange Online. I am using Windows Server 2012, but you can use any SMTP server.

1-      Configure the SMTP server role on your local server.
clip_image011[4]

2-      Open the IIS 6.0 management console. Right click on the SMTP server and open the properties window

3-      Click on the delivery tab
clip_image013[4]

4-      Click “Outbound Security” and enter the login credentials you use for Exchange Online (Office 365) as below
clip_image015[4]
Remember to enable “TLS encryption”

5-      Click “OK” then click “Advanced”. Enter the SMTP server URL you got in the previous step in the Smart host edit box
clip_image017[4]
then click “OK”

6-      Click “Outbound connections” and set the port to 587 (or as appropriate for your SMTP settings)
clip_image019[4]

7-      Click “OK” twice to apply the settings on the local SMTP server.

Step 3: Configure the SQL Mail

1-      Open the SQL management studio and connect to your local server

2-      Expand the “Management node” and then right click the “Database Mail” node and click “Configure Database Mail”
clip_image021[4]

3-      Follow the wizard and the critical part is to configure the access account as per the below screen
clip_image023[4]
Note that the outgoing server should be entered as localhost, and the email address should be the Office 365 (Exchange Online) address of the same account you used in the delivery configuration of the local SMTP server.

4-      Once the configuration is finished, test sending an email. You should now be able to send emails to any external recipient using Exchange Online as the relay.
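If you prefer to test from PowerShell rather than the wizard, a minimal sketch like the following calls msdb.dbo.sp_send_dbmail against the local instance. It assumes the SqlServer (or SQLPS) module that provides Invoke-Sqlcmd is installed, and the profile name and recipient address are placeholders; replace them with your own values.

# Hypothetical Database Mail profile name and recipient - replace with your own.
$profileName = "O365Profile"
$recipient   = "someone@example.com"

$query = @"
EXEC msdb.dbo.sp_send_dbmail
    @profile_name = '$profileName',
    @recipients   = '$recipient',
    @subject      = 'Database Mail test via Office 365 relay',
    @body         = 'If you received this, the local SMTP relay to Exchange Online is working.';
"@

# Requires the SqlServer (or SQLPS) module for Invoke-Sqlcmd.
Invoke-Sqlcmd -ServerInstance "localhost" -Database "msdb" -Query $query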

Conclusion

In this post I have shown how an on-premises SQL Server can use a cloud-hosted Exchange server (Exchange Online) to send database emails.


Windows Azure – Remove Virtual Disks


This is a very common issue when you create many virtual machines on Windows Azure: after deleting the VMs, the virtual disks (VHD files) are still there.

So far there is no way to remove these VHD files using the GUI, so I started searching for how to do it through PowerShell. The challenge with Azure PowerShell is that most of the resources out there target developers using Visual Studio, and for me anything that starts with Visual Studio is something of a mystery (one of my developer friends genuinely defines a SharePoint site collection as “a site that contains collections”) that is best left to the wise men, like this friend, who can understand it. :)

For infrastructure guys like me, the following is a step-by-step guide on how to use PowerShell to connect to Azure and then delete the VHDs:

Step 1: Management Certificate:

In simple words, we need a certificate to connect PowerShell to Azure. This certificate should be at least 2048 bits; a self-signed certificate is a good option for testing and labs.

Creating a self-signed certificate can be done in many ways; the easiest is plain PowerShell:

1. Open PowerShell (as an Administrator).

2. Type the following cmdlet:

New-SelfSignedCertificate -DnsName azure -CertStoreLocation cert:\LocalMachine\My

This cmdlet will create a new self-signed certificate and place it in the local machine store; check the following snapshot:

image

Let’s make sure that the certificate is created:

1. From Run, type mmc.

2. Click File > Add/Remove Snap-in.

3. Select Certificates and click Add.

4. Select Computer Account > Finish.

5. Under Personal > Certificates, you should find the new certificate “azure”:

image

The next step is to upload this certificate with its private key (the PFX file) to Windows Azure. To do that, you will first need to export the PFX file (the certificate with the private key):

1. From the same Snap-in.

2. Right click the certificate > All Tasks > Export.

3. Click Next > Select “Yes Export the Private Key” > Click Next.

4. On Export File Format click next > Then provide the password.

5. Browse to the location where you will save the certificate.
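If you prefer to stay in PowerShell for the export as well, a minimal sketch along these lines should produce the same PFX file. It assumes the certificate subject is "azure" as created above and that you run it as administrator; the output path and password are placeholders.

# Find the "azure" certificate created earlier in the local machine store.
$cert = Get-ChildItem Cert:\LocalMachine\My | Where-Object { $_.Subject -eq "CN=azure" }

# Export it together with its private key to a password-protected PFX file.
$password = ConvertTo-SecureString -String "P@ssw0rd!" -Force -AsPlainText
Export-PfxCertificate -Cert $cert -FilePath "C:\Temp\azure.pfx" -Password $password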

From the Azure Management Portal, create a new cloud service to use with the certificate:

image

Once we have the PFX file, we need to upload it to Windows Azure:

  1. Log into the Management Portal.

  2. In the navigation pane, click Cloud Services, and then click the service for which you want to add a new certificate.

  3. On the ribbon, click Certificates, and then click Upload a Certificate.

  4. In the Upload certificate dialog, click Browse For File, go to the directory containing your certificate, and select the .pfx  file to upload.

  5. Type the password of the private key for the certificate.

  6. Click OK.

image

Step 2: Install Azure PowerShell

To install Windows Azure PowerShell:

  1. Download Windows Azure PowerShell: http://go.microsoft.com/?linkid=9811175&clcid=0x409 
  2. Install Windows Azure PowerShell.

Step 3: Connect to Azure:

Before we start using the Windows Azure cmdlets to remove the VHDs or anything else, we need to configure connectivity between the workstation and Windows Azure. This is done by downloading the PublishSettings file (this file sets up the PowerShell environment to use Windows Azure) from Windows Azure and importing it. The settings for Windows Azure PowerShell are stored in: <user>\AppData\Roaming\Windows Azure PowerShell.

  1. From Windows Azure PowerShell type the following cmdlet:

    Get-AzurePublishSettingsFile

    A browser window opens at https://windows.azure.com/download/publishprofile.aspx, where you can sign in to Windows Azure.

  2. Sign in to the Windows Azure Management Portal, and then follow the instructions to download your Windows Azure publishing settings. Use your browser to save the file as a .publishsettings file to your local computer. Note the location of the file.
  3. In the Windows Azure PowerShell window, run the following command:

    Import-AzurePublishSettingsFile FileName.publishsettings

image

Step 4: Remove the VHDs (Finally)

Now that we are connected to Azure, we can manage it with PowerShell and do anything we need. If you forgot the purpose of this article, this is the time to scroll up and read it again. :)

Run the following cmdlet to see all the virtual disks (the VHD file):

Get-AzureDisk

disk

The snapshot above shows the disk that I want to remove (make sure it is not attached to any virtual machine); then run the following cmdlet:

Remove-AzureDisk <DiskName>

image

Now you will probably decide to leave the VHDs and not delete them, which is easier :) and I don’t blame you for that. But you will have this article as a reference in case someone asks you one day: how can I remove the unused VHD files?
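If you do want to clean everything up in one pass, a short sketch like the following (using the same classic Azure PowerShell cmdlets, after importing your publish settings file) should remove every disk that is not attached to a VM. The -DeleteVHD switch also deletes the underlying blob from the storage account, so use it carefully.

# List all disks that are not attached to any virtual machine.
$orphans = Get-AzureDisk | Where-Object { $_.AttachedTo -eq $null }

# Remove each orphaned disk and delete its VHD blob from the storage account.
foreach ($disk in $orphans) {
    Remove-AzureDisk -DiskName $disk.DiskName -DeleteVHD
}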

For more details about Azure PowerShell: http://msdn.microsoft.com/en-us/library/windowsazure/jj156055.aspx

For more information about the Azure management certificate: http://msdn.microsoft.com/en-us/library/windowsazure/gg981929.aspx

 

Securing Dynamic Data ASP.NET SQL Azure Published Web Site with ACS and Facebook as an Identity Provider


The Scenario

I wanted to implement an Azure web site that uses the Azure Access Control Service (ACS) and integrates with an external identity provider to authenticate and authorize users. At first I thought of using Windows Live ID, but the only claim offered by WLID is the unique identifier, which is simply a number and tells you nothing about the user. Then I thought, why not make things more interesting and use Facebook? Things got more interesting than I expected. :)

I wanted to implement a Dynamic Data web site that generates views on top of an existing SQL Azure database and lets the end user manipulate and filter the database tables, using LINQ to SQL classes.

I am using the latest version of Visual Studio 2012.

The Steps highlights

The steps at a glance of how to get this up and going are as below:

1-     Create your project:

a.      Create your database.

b.      Create a new Dynamic ASP.NET project.

c.      Add a Linq-To-SQL model to your database.

d.      Change the Framework version of the project.

e.      Set the “ScaffoldAllTables” to true.

2-     Download the latest “Identity and access tool”

3-     Create a new Azure web site and download the publishing settings.

4-     Set up your identity provider:

a.      Create a new Azure ACS namespace

b.      Create a new Facebook application

c.      Configure your Facebook application.

d.      Add your Facebook application as an identity provider.

e.      Configure your claims rules

5-     Set the ACS settings in the “Identity and access tool”

6-     Implement your custom claims authorization manager

7-     Complete the web.config configuration

8-     Publish your web site.

Solution Walkthrough and description

Create your project

This is the first step and as I described I wanted to create a dynamic ASP.NET site based on a custom database.

Create the Database

First I created the database in SQL Azure.

clip_image002

I created a new SQL Server, provided the administrator credentials, and allowed Azure services to access this server.

clip_image004

Then I allowed access from my IP address to this server so I could manage the database.

clip_image006

Then I started to design my database (this can be done either online or using Visual Studio).

clip_image008

clip_image010

Or from Visual Studio 2012

clip_image012

clip_image014

Once the database is created and ready to be used, move on to the next step.

Create the Dynamic ASP.NET project

Open Visual Studio and click New Project. You will need to switch to .NET Framework 4.0 to see the ASP.NET Dynamic Data (LINQ to SQL) template.

clip_image016

clip_image018

clip_image020

clip_image022

In the “Global.asax” file, uncomment the line that registers the data context so that it reads as follows:

DefaultModel.RegisterContext(typeof(TestDbDataClassesDataContext), new ContextConfiguration() { ScaffoldAllTables = true });

Change the .NET Framework version to 4.5 to be able to see the Identity and Access Tool link.

clip_image024

Download the Identity and Access tool

From the Visual Studio Tools menu, click “Extensions and Updates”

clip_image025

Search for the “Identity and Access Tool”, install it, and restart Visual Studio.

clip_image027

Create the Azure Web Site

Open the Azure management site and click on new web site.

clip_image028

Click on “Download the publish profile”

clip_image030

Save this file on your Hard disk. Now in Visual Studio click on the project and then Publish

clip_image031

Click on “Import”

clip_image033

Now select the publish settings file already downloaded.

Complete the publishing and test that the application is now published and working.

clip_image034

The next step is to configure Facebook as an identity provider.

Setup Facebook as an Identity Provider

Create the Facebook application

Logon to your Facebook account and then open the link http://developers.facebook.com

Register yourself as a developer.

clip_image036

Now click on Apps and then create a new App

clip_image038

Give your application a name

clip_image039

Take note of your application ID and secret.

Also enter the URL of the ACS namespace you will create on the Azure ACS web site in the next step (It is better to create that namespace first and then return to this step later).

clip_image041

Click save Changes.

Create Your ACS Namespace

Open the Azure portal and click New to create a new ACS namespace.

clip_image043

Once created you can click on the Manage link to start managing it.

clip_image045

Configure your ACS service

Start by adding the Facebook application as an identity provider. Click on identity providers and then “Add”

clip_image046

clip_image048

Enter the application ID and secret and click “Save”

clip_image050

Configure Your Project to Link ACS Service

While on the ACS management site, click “Management service” to see all the management settings required to communicate with ACS.

clip_image052

clip_image054

clip_image056

Click on “Show Key” and then copy the symmetric key generated.

These will be the namespace and the management key of the namespace.

Right click on your project and then click “Identity and Access tool”

clip_image057

Select “Use the Windows Azure Access Control Service”, then configure it with the namespace and the key you copied earlier.

clip_image059

clip_image061

 

Now select the settings as below

clip_image063

Finally click “Ok”. This will configure your ACS service and the relying party application with the required pass-through rule for all provider claims as shown below.

clip_image065

The next steps are to create and implement the Claims authorization manager and configure your web.config.

Implement a Custom Claims Authorization Manager

Since we are using .NET Framework 4.5, this is a little different from what we used to do in 3.5, because WIF is now fully integrated into the framework.

Add reference to the System.IdentityModel assembly

clip_image067

Add and implement the new class “MyAuthorizationManager”

This is done so that the code of the file would be as follows.

using System.IO;
using System.Xml;
using System.Collections.Generic;
using System;
using System.Web;
using System.Linq;
using System.Security.Claims;

namespace TestDbWebApplication
{
    public class MyAuthorizationManager : ClaimsAuthorizationManager
    {
        private static Dictionary<string, string> m_policies = new Dictionary<string, string>();

        public MyAuthorizationManager()
        {
        }

        public override void LoadCustomConfiguration(XmlNodeList nodes)
        {
            foreach (XmlNode node in nodes)
            {
                // Find the name claim in the policy element in web.config, read its value,
                // and add it to the module-scope m_policies dictionary.
                XmlTextReader reader = new XmlTextReader(new StringReader(node.OuterXml));
                reader.MoveToContent();
                string resource = reader.GetAttribute("resource");
                reader.Read();
                string claimType = reader.GetAttribute("claimType");
                if (claimType.CompareTo(ClaimTypes.Name) != 0)
                {
                    throw new ArgumentNullException("Name Authorization is not specified in policy in web.config");
                }
                string name = "";
                name = reader.GetAttribute("Name");
                m_policies[resource] = name;
            }
        }

        public override bool CheckAccess(AuthorizationContext context)
        {
            // Get the identity and compare it with the policy.
            string allowednames = "";
            string requestingname = "";
            Uri webPage = new Uri(context.Resource[0].Value);
            ClaimsPrincipal principal = (ClaimsPrincipal)HttpContext.Current.User;
            if (principal == null)
            {
                throw new InvalidOperationException("Principal is not populated in the context - check configuration");
            }
            ClaimsIdentity identity = (ClaimsIdentity)principal.Identity;
            if (m_policies.ContainsKey(webPage.PathAndQuery))
            {
                allowednames = m_policies[webPage.PathAndQuery];
                requestingname = (from c in identity.Claims
                                  where c.Type == ClaimTypes.Name
                                  select c.Value).FirstOrDefault();
            }
            else if (m_policies.ContainsKey("*"))
            {
                allowednames = m_policies["*"];
                requestingname = (from c in identity.Claims
                                  where c.Type == ClaimTypes.Name
                                  select c.Value).FirstOrDefault();
            }
            if (allowednames.ToLower().Contains(requestingname.ToLower()))
            {
                return true;
            }
            return false;
        }
    }
}

This would authorize users based on their login name reported by the identity provider (Facebook).

Configure the Authorization manager in the Web.Config

The required steps are to add the following lines:

  <system.webServer>
    <modules>
      <add name="ClaimsAuthorizationModule" type="System.IdentityModel.Services.ClaimsAuthorizationModule, System.IdentityModel.Services, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089" preCondition="managedHandler" />
    </modules>
  </system.webServer>

Note the ClaimsAuthorizationModule registration above, as it is key to making this work.

<system.identityModel>
  <identityConfiguration>
    <securityTokenHandlers>
      <remove type="System.IdentityModel.Tokens.SessionSecurityTokenHandler, System.IdentityModel, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089" />
      <add type="System.IdentityModel.Services.Tokens.MachineKeySessionSecurityTokenHandler, System.IdentityModel.Services, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089" />
    </securityTokenHandlers>
  </identityConfiguration>
</system.identityModel>


Note the MachineKeySessionSecurityTokenHandler registration, as publishing to an Azure web site will not work without these lines.

Finally, add the claims authorization manager configuration:

<system.identityModel>
  <identityConfiguration>
    <claimsAuthorizationManager type="TestDbWebApplication.MyAuthorizationManager, TestDbWebApplication">
      <policy resource="*">
        <claim claimType="http://schemas.xmlsoap.org/ws/2005/05/identity/claims/name" Name="Mohamed Malek" />
      </policy>
    </claimsAuthorizationManager>
  </identityConfiguration>
</system.identityModel>

Publish and test your site

Now that you have completed the site configuration, publish it. You should be able to authenticate using Facebook, authorize the application to access your Facebook profile, and finally have the user authorized by the custom claims authorization manager.

clip_image068

clip_image070

 

Logon to Facebook as usual.

clip_image072

The app will request the user’s permission to pass the user’s details to the ACS service.

clip_image074

The web site will then work as required.

clip_image076

Happy coding :)

Install Active Directory in Windows Azure in four simple steps


Introduction:

There is a long and complex way to install Active Directory in a Windows Azure environment, described in detail in the Windows Azure documentation here: http://www.windowsazure.com/en-us/manage/services/networking/active-directory-forest/. What I propose in this post is a simple way to install Active Directory in a new Windows Azure environment without going through all the steps listed in the documentation, which include creating the domain controller virtual machine using PowerShell.

Install Steps:

The installation steps are simple:

1. Create a storage account in the same region as the affinity group; more details about how to create a storage account in Windows Azure can be found here: http://www.windowsazure.com/en-us/manage/services/storage/how-to-create-a-storage-account/.

2. Create a new virtual network associated with the storage account created in step 1, create a subnet (for example 172.16.0.0/24), and set the DNS server to the subnet address ending in “.4” (following the same example, 172.16.0.4); more details about how to create a virtual network in Windows Azure can be found here: http://www.windowsazure.com/en-us/manage/services/networking/create-a-virtual-network/.

3. Create the first virtual machine and associate it with the storage account and network created in steps 1 and 2. This virtual machine will automatically take the subnet IP address ending in “.4” (following the same example, 172.16.0.4); more details about how to create a virtual machine in Windows Azure can be found here: http://www.windowsazure.com/en-us/manage/windows/tutorials/virtual-machine-from-gallery/.

4. Install Active Directory (including the DNS service) on the virtual machine created in step 3; see the PowerShell sketch below.

Now your Windows Azure environment is ready with Active Directory, and all new virtual machines created later will automatically point to the domain controller’s IP address as their primary DNS server.
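For step 4, a minimal sketch of the role installation and forest creation with PowerShell (run inside the new VM on Windows Server 2012, with "contoso.local" as a hypothetical domain name you should replace) might look like this:

# Install the AD DS role together with the DNS server (run inside the new VM).
Install-WindowsFeature AD-Domain-Services, DNS -IncludeManagementTools

# Promote the server to the first domain controller of a new forest.
# "contoso.local" is a hypothetical domain name - replace it with your own.
Install-ADDSForest -DomainName "contoso.local" `
    -SafeModeAdministratorPassword (Read-Host -AsSecureString -Prompt "DSRM password") `
    -InstallDns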

Conclusion:

In conclusion, there is the long, fully documented way to install Active Directory, and in this post you have a simple and fast way to install it in your new Windows Azure environment.

I hope this helps you build your new Windows Azure environment in less than an hour.

Use Facebook as an Identity Provider for SharePoint 2013 – Part 1


Introduction:

This blog post describes in detail how to use Facebook as an identity provider to log in to your SharePoint application.

There are many ways to integrate your SharePoint application with Facebook: you can develop your own authentication and authorization mechanisms, or you can use the Windows Azure Access Control Service (ACS), which provides an easy way of authenticating users who need to access your SharePoint application.

How does ACS work with SharePoint and Facebook?

Steps

  1. The user requests authentication against a relying party (in our case a SharePoint web application) and chooses the required identity provider from a drop-down list.
  2. The user is redirected to the chosen identity provider (which is Facebook in our case).
  3. The user enters his/her username and password.
  4. Facebook generates and sends a security token to the user that holds claims and other properties.
  5. Facebook redirects the user to ACS, and the user sends the generated security token to ACS.
  6. ACS validates the security token and generates a new security token.
  7. ACS redirects the user to the SharePoint web application and sends the new security token to the user.
  8. The user sends the security token to the SharePoint web application.
  9. The SharePoint application validates the security token and then redirects the user to the required page.

 

Configuration Steps:

PS: I will assume you only have a configured SharePoint machine that is connected to the internet.

First of all, let us create a windows azure account:

1

  • click on Sign up for a free trial.

2

  • Choose your country and click the next arrow

3

  • Enter your mobile number and click Send text message; within a few minutes you will receive a verification code.

4

  • Enter your verification code and click Verify code, then click the next arrow.

5

  • Enter your credit card and your billing information; (You will not be charged for this, remember the first 90 days are FREE).

 

6

  • Click Next.
  • Click on Portal on the top right of the page.

image

  • Click on ACTIVE DIRECTORY on the left navigation then click on ACCESS CONTROL NAMESPACES.
  • Click on CREATE A NEW NAMESPACE.

8

  • Fill in the namespace (I named it MySharePointLogin; you can choose your own) and click Create.

9

  • The URL of your Access Control Namespace will be like this:
    • http://MySharePointLogin.accesscontrol.windows.net (Mine)
    • http://YourNamespaceTitle.accesscontrol.windows.net (Your namespace title)

 

 

 

1

  • From the Facebook developer site, click Create New App

2

  • Fill the required information and click Continue

3

  • Fill the required Captcha
  • Fill the Access Control Namespace URL in the Site URL, and Click Save.
  • PS You will require the App ID and App Secret in the next phase.

5

 

Go to Part 2

Use Facebook as an Identity Provider for SharePoint 2013 – Part 2


At this stage we are done configuring the Facebook part.

Continue Configuration Steps:

  • Now we need to create a token-signing certificate. This is used to sign tokens issued to SharePoint web applications.
  • Open a command prompt and browse to MakeCert.exe, which can be found in the \Bin folder of the Microsoft Windows Software Development Kit (SDK) installation path.
    • If MakeCert.exe is missing, download and install the Windows SDK from here
  • Run the following command:

MakeCert.exe -r -pe -n "CN=mysharepointlogin.accesscontrol.windows.net" ^

-sky exchange -ss my -len 2048 -e 05/29/2014

 

  • After the operation succeeds, go to Control Panel –> Administrative Tools –> Manage Computer Certificates.
  • Expand Certificates – Current User > Personal > Certificates. You will find the newly created token-signing certificate.

2

  • Right Click on the new certificate go to All Tasks –> Export.
  • Choose No, do not export the private key, and click next.

3

  • Choose Base-64 encoded X.509 (.CER), and click Next.

4

  • Save the Certificate on the Desktop, ex: "C:\Users\Administrator\Desktop\MySharePointLogin.cer"
  • Go again to Control Panel –> Administrative Tools –> Manage Computer Certificate.
  • Browse to the same certificate again (Current User –> Personal -> click on Certificate).
  • Right Click on the new certificate go to All Tasks –> Export.
  • Choose Yes, export the private key, and click next.

5

  • Choose Personal Information Exchange –PKCS #12(.PFX) and click Next.

6

  • Choose Password, and choose a password; remember this password as it will be used later.

7

  • Save the Certificate on the Desktop, ex: "C:\Users\Administrator\Desktop\MySharePointLogin.pfx”

 

  • Go to your Access Control Namespace URL:
    • http://MySharePointLogin.accesscontrol.windows.net (Mine)
    • http://YourNamespaceTitle.accesscontrol.windows.net (Your namespace title)
  • Click on Identity Providers.

11

  • Click Add

12

  • Select Facebook and click Add

13

14

 

  • Click on Relying Party Applications from the left navigation, then click Add.

15

 

  • Fill the related information for the relying party (SharePoint)
    • Name –> Web Application Host Header (ex: SharePointLogin.com)
    • Realm –> http://WebApplicationHostHeader (ex: http://SharePointLogin.com)
    • Return URL –> Http://WebApplicationHostHeader/_trust (ex: http://SharePointLogin.com/_trust)
    • Token Format: SAML 1.1

image

  • Fill the related information for the relying party (SharePoint)
    • Token encryption policy –> None
    • Token lifetime (secs) –> 4000
    • Choose Facebook as Identity Provider.
    • Check Create New Rule Group
    • Browse to the certificate you exported from the previous step; choose the certificate with .PFX extension.
    • Enter the password you created when you exported the certificate.
    • Click Save.

18

  • Click Rule Groups from the left navigation and then click on Default Rule Group for MySharePointLogin.com

19

  • Click Generate

20

  • Choose Facebook and click Generate

21

  • Click Save

22

Go to Part 1

Go to Part 3

Use Facebook as an Identity Provider for SharePoint 2013 – Part 3


At this stage, we are done configuring the Azure Part

Continue Configuration Steps:

  • Go to your SharePoint Farm
  • Create a new web application
    • Make sure the claim authentication will be as the following:
      • Enable Windows Authentication = Checked
      • Integrated Windows authentication = Checked
      • Select NTLM
  • PS: The Facebook authentication will be enabled later

1

4

  • After creating the web application, go and create a site collection

5

  • Run the following script:
    • The parameter values in the first three lines need to be changed depending on your configuration
$realm = "http://mysharepointlogin.com"
$signinurl = "https://mysharepointlogin.accesscontrol.windows.net:443/v2/wsfederation?wa=wsignin1.0&wtrealm=http%3a%2f%2fmysharepointlogin.com%2f"
$certlocation = "C:\Users\Administrator\Desktop\MySharePointLogin.cer"
$rootcertificate = Get-PfxCertificate $certlocation
New-SPTrustedRootAuthority "MSharePointLogin" -Certificate $rootcertificate
$certificate = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2($certlocation)
$ClaimTypingMapping1 = New-SPClaimTypeMapping -IncomingClaimType "http://schemas.xmlsoap.org/ws/2005/05/identity/claims/emailaddress" -IncomingClaimTypeDisplayName "Email" -SameAsIncoming
$ClaimTypingMapping2 = New-SPClaimTypeMapping -IncomingClaimType "http://schemas.xmlsoap.org/ws/2005/05/identity/claims/name" -IncomingClaimTypeDisplayName "Display Name" -LocalClaimType "http://schemas.xmlsoap.org/ws/2005/05/identity/claims/givenname"
$ClaimTypingMapping3 = New-SPClaimTypeMapping -IncomingClaimType "http://www.facebook.com/claims/AccessToken" -IncomingClaimTypeDisplayName "Access Token" -SameAsIncoming
$ClaimTypingMapping4 = New-SPClaimTypeMapping -IncomingClaimType "http://schemas.xmlsoap.org/ws/2005/05/identity/claims/nameidentifier" -IncomingClaimTypeDisplayName "Name Identifier" -LocalClaimType "http://schemas.xmlsoap.org/ws/2005/05/identity/claims/upn"
$ClaimTypingMapping5 = New-SPClaimTypeMapping -IncomingClaimType "http://schemas.microsoft.com/ws/2008/06/identity/claims/expiration" -IncomingClaimTypeDisplayName "Expiration" -SameAsIncoming
New-SPTrustedIdentityTokenIssuer -Name "Facebook Authentication" -Description "Facebook Identity Provider" -Realm $realm -ImportTrustCertificate $certificate -ClaimsMappings $ClaimTypingMapping1,$ClaimTypingMapping2,$ClaimTypingMapping3,$ClaimTypingMapping4,$ClaimTypingMapping5 -SignInUrl $signinurl -IdentifierClaim $ClaimTypingMapping1.InputClaimType
 
  • $realm = the URI or URL associated with the SharePoint web application that is configured to use a SAML token-based provider (the SharePoint web application URL).
  • $signinurl = the Access Control namespace URL that was created in Windows Azure.
  • $certlocation = the physical path of the certificate. Make sure to select the .cer file.
  • $rootcertificate = the certificate object loaded from that file and used for the trusted root authority.
  • New-SPTrustedRootAuthority = creates a trusted root authority.
  • $certificate = represents an X.509 certificate.
  • $ClaimTypingMapping = maps a claim in SharePoint to the incoming claim from ACS.
  • New-SPTrustedIdentityTokenIssuer = creates a new identity provider named “Facebook Authentication”.

 

  • After running the script successfully – > Go and select the SharePoint Web Application you created – > General Setting from the top ribbon.

7 - Copy

  • Go to Claims Authentication Section and check Facebook Authentication

6

  • Go back and select the web application –> Click User Policy

7

  • In the pop-up window –> Click Add Users

8

  • Select All zones and click Next

9

  • Click Browse Users

10

  • Select All Users –> All Users (Facebook Authentication) –> Click Add –> Click OK.

11

  • Select Full read – Has Full read-only access.

12

  • Click OK

13

Go to Part 2

Go to Part 4

Use Facebook as an Identity Provider for SharePoint 2013 – Part 4


At this stage, we are done with configuring the SharePoint Part

Demo:

  • Go to your SharePoint Site Collection you created in part 3.
  • From the drop down list, select Facebook Authentication

1

  • You will be redirected to the Facebook page to enter your credentials. The good thing is that you don't share your credentials with the third-party application :)
  • Click OK to allow your Facebook application to access your public profile.

2

  • After clicking OK, watch the URL bar in Internet Explorer; it will take you to the Access Control namespace URL and then redirect you back to your SharePoint site collection URL.

3

  • Now you are logged in to SharePoint using your Facebook credentials, with full read access.

4

 

Conclusion:

ACS gives us the ability to authenticate users of our application through different public identity providers such as Hotmail, Outlook, Facebook, Google, and Yahoo, which saves us the time and effort of developing authentication code to connect to each of their APIs.

Go back to Part 3


ITPro: How to Convert your Surface RT to Surface Pro – Part 2

In Part 1 I explained how to install Remote Desktop Services on the server and fix a common error that you may face; that was the preparation for this part, where we are going to configure RemoteApp.
 
The following steps show how to configure Remote Desktop Services and publish RemoteApp programs:

1.     Follow these steps to create a session collection:

a.     From the Server Manager, click Remote Desktop Services on the left side.

b. Click Collections> click Tasks, and then click Create Session Collection. Check the following snapshot:

clip_image002

 

c.     On the Before You Begin page of the Create Collection wizard, click Next.

d.    On the Name the collection page, type SessionCollection in the Name box, and then click Next, as the following snapshot:

clip_image004

 

e.     On the Specify RD Session Host servers page, select the server name, click the right arrow, and then click Next.

clip_image006

 

f. On the Specify user groups page, accept the default selections (Domain Users), and then click Next.

g. On the Specify user profile disks page, clear the Enable user profile disks check box, and then click Next.

h. On the Confirm selections page, click Create.

2.     Configure Remote Desktop Web Access parameters: change the URL that users will use to connect. In Windows Server 2008 R2 this was very easy; in Windows Server 2012, however, it can’t be changed from the GUI. From Server Manager click Edit Deployment Properties, and check the following snapshot:

clip_image008
 

a.     When you open the properties, the RD Web Access URL can’t be edited:

clip_image010
 

b.     To change the name that users will connect to, we need to get this PowerShell script: http://gallery.technet.microsoft.com/Change-published-FQDN-for-2a029b80, and use the cmdlet Set-RDPublishedName ts.meamcs.com

 
clip_image012
 

3.     Allow required ports: configure the endpoints on Azure to allow access for HTTPS (443) and RDP (3389); from the virtual machine, click Endpoints.

a.     Create a new rule for HTTPS and modify the existing Remote Desktop endpoint, as in the following snapshot:

 
clip_image014
 

b.     The following snapshot shows the final settings for the endpoints:

 
clip_image016
 

c.      Don’t forget to change your RDP connection to use port 3389 now instead of the random public port you used to connect through before.
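The same endpoints can also be added with the classic Azure PowerShell cmdlets; a rough sketch, assuming a cloud service and VM both named "ts" (hypothetical names; the default RDP endpoint name may also differ on your VM), would be:

# Add an HTTPS endpoint and set the RDP public port to 3389 on the VM.
# The RDP endpoint may be named "Remote Desktop" instead, depending on how the VM was created.
Get-AzureVM -ServiceName "ts" -Name "ts" |
    Add-AzureEndpoint -Name "HTTPS" -Protocol tcp -LocalPort 443 -PublicPort 443 |
    Set-AzureEndpoint -Name "RemoteDesktop" -Protocol tcp -LocalPort 3389 -PublicPort 3389 |
    Update-AzureVM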

4.     Configure the commercial certificate: to avoid HTTPS warnings that the certificate is not trusted, we need a certificate issued by a commercial CA (or an internally generated certificate, with your devices including the Surface configured to trust the root CA). In this scenario we will use a publicly issued certificate:

a.     Import the certificate to IIS: open IIS, under server name select Server Certificates:

clip_image018
 

b.     Import the certificate.

c.      Configure the binding with the new certificate: click the Default Web Site and, from the Actions pane, select Bindings:

clip_image020
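If you prefer to script the import and binding, a rough sketch using the PKI and WebAdministration modules might look like the following; the PFX path, password, and host name are placeholders to adjust for your certificate:

# Import the commercial certificate into the local machine store.
$pfxPassword = ConvertTo-SecureString -String "PfxPassword" -Force -AsPlainText
$cert = Import-PfxCertificate -FilePath "C:\Certs\ts.meamcs.com.pfx" `
            -CertStoreLocation Cert:\LocalMachine\My -Password $pfxPassword

# Create an HTTPS binding on the Default Web Site and attach the certificate to it.
Import-Module WebAdministration
New-WebBinding -Name "Default Web Site" -Protocol https -Port 443
$binding = Get-WebBinding -Name "Default Web Site" -Protocol https
$binding.AddSslCertificate($cert.Thumbprint, "My")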
 

5.     Configure Remote Desktop to use the commercial certificate, so you don’t get any warning from RemoteApp (the RDP connection) that the server is not trusted (the default installation uses a self-signed certificate):

a.     Open Server Manager.

b.     Browse to Remote Desktop Services.

c.      Click Edit Deployment Properties.

clip_image021
 

d. Under Certificates, browse to the PFX file of your commercial certificate, as in the following snapshot:

clip_image023
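The same assignment can be scripted with the RemoteDesktop module; a minimal sketch, with the PFX path, password, and connection broker FQDN as placeholders, could be:

# Apply the commercial certificate to each Remote Desktop role service.
# Replace the PFX path/password and use your connection broker's FQDN.
$pfxPassword = ConvertTo-SecureString -String "PfxPassword" -Force -AsPlainText
foreach ($role in "RDWebAccess", "RDRedirector", "RDPublishing") {
    Set-RDCertificate -Role $role -ImportPath "C:\Certs\ts.meamcs.com.pfx" `
        -Password $pfxPassword -ConnectionBroker "rds-server.domain.local" -Force
}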
 

6.     Install programs on the RDSH server: To ensure that an application is installed correctly to work in a multiuser environment, you must put the RD Session Host server into a special installation mode before you install the application on the RD Session Host server. This special installation mode ensures that the correct registry entries and .ini files that are needed to support running the application in a multiuser environment are created during the installation process.

To put an RD Session Host server into this special installation mode:

a.      Open Control Panel > open Install Application on Remote Desktop Server

clip_image025
 

b.     Click Next on Install Program from Floppy disk or CD-ROM (I didn’t know there were still programs on floppy disks :), which reminded me of the golden days when we had Windows 3.11 on 9 floppy disks).

clip_image027
 

c.      Browse to the application that you want to install, in this scenario I’m installing Project Pro.

clip_image029
 

d.     Click next to install the program.

e.     Repeat these steps for all the programs that you want to install on your server.
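The Control Panel wizard is essentially a wrapper around the install mode switch; if you prefer the command line, the equivalent (run in an elevated command prompt or PowerShell session) is roughly:

# Put the RD Session Host into install mode before running the setup.
change user /install

# ...run the application's setup program here...

# Return the server to normal execution mode when the installation finishes.
change user /execute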

7.     Publish RemoteApp Programs:

a.     Open Server Manager > On the left side of the window, click Remote Desktop Services.

b.     Under Collections, click SessionCollection.

c.      In the REMOTEAPP PROGRAMS tile, click Tasks, and then click Publish RemoteApp Programs.

clip_image031
 

d.     On the Select RemoteApp programs page, select the programs that you want to publish; in my scenario these are Project, Visio, PowerShell ISE, the PowerShell modules for AD and Azure, and Windows Live Writer. Check the following snapshot:

clip_image033
 

e.     On the Confirmation page, click Publish.

clip_image035
 

f.       When the RemoteApp program is published, click Close.
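Publishing can also be scripted; a minimal sketch with the RemoteDesktop module, using Visio as a hypothetical example (adjust the collection name and executable path to your environment), might be:

# Publish a single application to the SessionCollection collection.
New-RDRemoteApp -CollectionName "SessionCollection" `
    -DisplayName "Visio 2013" `
    -FilePath "C:\Program Files\Microsoft Office\Office15\VISIO.EXE"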

8.     Configure the File type Association:

a. Open Server Manager > On the left side of the window, click Remote Desktop Services.

b. Under Collections, click SessionCollection.

c. Under the REMOTEAPP PROGRAMS heading, right-click each program, and then click Edit Properties.

clip_image037

d. Click File Type Association.

e. Select the required extensions as the following snapshot:

clip_image039

f.       Click OK.

g.      Repeat the steps for all programs.
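File type associations can also be set from PowerShell; a sketch along these lines (the app alias, extension, and collection name are placeholders, and the alias can be checked with Get-RDRemoteApp) might work:

# Associate the .vsdx extension with the published Visio RemoteApp.
# "visio" is the app alias assigned when the program was published;
# verify it first with Get-RDRemoteApp -CollectionName "SessionCollection".
Set-RDFileTypeAssociation -CollectionName "SessionCollection" `
    -AppAlias "visio" -FileExtension ".vsdx" -IsPublished $true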

 
In this part we did all the required configuration for RemoteApp; in the next part we will test from the client side and configure the Surface RT with RemoteApp.
 
Links to All Parts:
Part 1: Install Remote Desktop Services
Part 2: Configure RemoteApp
Part 3: Configure Surface

ITPro: How to Convert your Surface RT to Surface Pro – Part 1


 

For a long time now I have had all my labs on Windows Azure (I will share that experience in another blog post); my main motive was to swap my heavy, ugly laptop for a light and thin one. My first pilot was to use a Surface as the replacement for my laptop.
 
For some reason (I don’t remember it now) I decided to get the Surface RT, not the Surface Pro; I’m trying to convince myself now that it was because the RT is lighter, at 676 grams compared to 907 grams for the Pro.
 
But anyway, here it is, my RT device. I loved the experience and the display, but on the second day I looked for PowerShell ISE and couldn’t find it. PowerShell ISE was my magical solution for connecting to Office 365, so it’s mandatory for me. The verdict: PowerShell ISE is not built into Windows RT and can’t be installed from the Store.
OK, no ISE; let’s at least install the PowerShell module for Azure (the Windows Azure VM cmdlets) and the Windows Azure AD module (the Office 365 module). Obviously, you can’t install anything on RT.
 
The big one came when I started looking for Visio and Project Professional; neither is part of the Office edition installed on the Surface, and neither is available in the Store. Now it’s serious: I can’t work without these programs.
 
I started to look for a solution, and the first thing that came to mind was to RDP to another machine running on Azure that has everything I need. A nice idea, and it worked fine, but it was annoying to share files (copying them or sending them by email) between the two PCs; I didn’t like the overall experience.
I was reading some of the new stuff about Windows Server 2012 when I thought of RemoteApp: what if I had a server on Azure running RemoteApp and could use all my applications from it?
 
First things first, what is RemoteApp:
RemoteApp programs are programs that are accessed remotely through Terminal Services and appear as if they are running on the end user's local computer. Instead of being presented to the user in the desktop of the remote terminal server, the RemoteApp program is integrated with the client's desktop, running in its own resizable window with its own entry in the taskbar. Users can run RemoteApp programs side-by-side with their local programs. If a user is running more than one RemoteApp program on the same terminal server, the RemoteApp programs will share the same Terminal Services session.
 
In simple words: Visio will be installed on the server in Azure, and when I click Visio on my Surface, an RDP session opens in the background, runs Visio on the server, and presents it on my Surface RT as if it were installed locally. That is the magic solution I was looking for.
 
Users can access RemoteApp programs in several ways. They can:

1. Access a link to the program on a Web site by using TS Web Access.

2. Double-click a Remote Desktop Protocol (.rdp) file that has been created and distributed by their administrator.

3. Double-click a program icon on their desktop or Start menu that has been created and distributed by their administrator with a Windows Installer (.msi) package.

4. Double-click a file where the file name extension is associated with a RemoteApp program. This can be configured by their administrator with a Windows Installer package

 
I always go for the easiest solution, so of course number 3 is my choice: the programs will be on my Start menu as if they were installed on my Surface.
 
My target audience for this article is the IT Pro; if you have an MSDN subscription you can activate your free hours on Windows Azure, check this for the details:
 
In my lab on Azure I decided to use one of my servers as the RemoteApp server.
 
To have RemoteApp we need the following components (logical roles, not physical servers):

1.  RD Session Host: Remote Desktop Session Host (RD Session Host), formerly Terminal Server, enables a server to host Windows-based programs or the full Windows desktop. Users can connect to an RD Session Host server to run programs, to save files, and to use network resources on that server.

2.   RD Web Access: Remote Desktop Web Access (RD Web Access), formerly TS Web Access, enables users to access RemoteApp and Desktop Connection through the Start menu on a computer that is running Windows 7 or through a Web browser. RemoteApp and Desktop Connection provides a customized view of RemoteApp programs and virtual desktops to users.

When a user starts a RemoteApp program, a Remote Desktop Services session is started on the RD Session Host server that hosts the RemoteApp program.

3.  RD Connection Broker [Optional]: The RD Connection Broker database stores session state information that includes session IDs, their associated user names, and the name of the server where each session resides. When a user with an existing session connects to an RD Session Host server in the load-balanced farm, RD Connection Broker redirects the user to the RD Session Host server where their session exists. This prevents the user from being connected to a different server in the farm and starting a new session.

RD Connection Broker is also used to provide users with access to RemoteApp and Desktop Connection.

4.   DC: of course a Domain Controller is needed.

5.  Trusted Certificate [Optional]: if you don’t want to get a warning on the certificate you will need to get a commercial certificate. Otherwise you can use an internal generated certificate and configure your Surface to trust the Root/Issuing CA.

6.   DNS record: you will need a DNS record that points to your server.

 
The RD Connection Broker is optional in our setup because we will use a single server that hosts everything, so the same server can monitor the sessions without needing the RDCB. However, in Windows Server 2012 both the standard and quick setups include the RDCB as the recommended installation, so we are going to use it. If you are using Windows Server 2008 R2, or if you use a custom installation, you can choose not to install it and only use the RDSH and RDWA roles.
 
Let’s not waste any more time; here are the steps:

1.  Install the required roles: I will use Windows Server 2012, so on the dedicated machine:

a. Click Add Roles and Features > select Remote Desktop Services installation:

clip_image002
 

b. Then select Standard deployment:

clip_image004
 

c. Select Session-based desktop Deployment:

clip_image006
 

d.  Under Role services the wizard will install the three roles: RDSH, RDWA and RDCB, as the following snapshot:

clip_image008

 

e.  Click Next.

f.  Only if you get the following error, continue with steps g and h; if you didn’t get the error, skip to step i. The error is:

“Unable to connect to the server by using Windows PowerShell remoting”

 
The following snapshot shows the error:
clip_image010

 

g.      Start PowerShell as Administrator, and run the following cmdlets:

Enable-PSRemoting -force
Set-Item WSMan:\localhost\Shell\MaxMemoryPerShellMB 1000

h.      The following snapshot shows the cmdlets:

clip_image012
 

i.   In RD Connection Broker: select the server name as the following snapshot:

clip_image014
 

j.    Do the same by adding the selected server for the RD Web Access and RD Session Host roles.

k.   On the confirmation page, check the option to restart the server after the installation and click Deploy to start the installation.

clip_image016
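For reference, the same standard deployment can also be created from PowerShell with the RemoteDesktop module; a minimal sketch, using a hypothetical FQDN for the single server that hosts all three roles, would be:

# Deploy RD Connection Broker, Web Access and Session Host on one server.
# "rds01.contoso.local" is a hypothetical FQDN - replace it with your server's.
New-RDSessionDeployment -ConnectionBroker "rds01.contoso.local" `
    -WebAccessServer "rds01.contoso.local" `
    -SessionHost "rds01.contoso.local"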
 
In this part we installed Remote Desktop Services; in the next part we will configure the server for RemoteApp.
 
Links to All Parts:
Part 1: Install Remote Desktop Services
Part 2: Configure RemoteApp
Part 3: Configure Surface

ITPro: How to Convert your Surface RT to Surface Pro – Part 3

In Part 1 I explained how to install Remote Desktop Services on the server and fix a common error you may face; in Part 2 we finished configuring Remote Desktop Services and publishing RemoteApp programs.
In this part we will test from a client-side machine by browsing to the RD Web Access site, and will configure the Surface with RemoteApp.
 
The following steps show how to test the connection to the Remote Desktop Web Access and test RemoteApp from browser:

1.     Test client access: the first test to make sure that the configuration is working is through the browser:

a.     From the browser, open the URL that points to your server; in my scenario this is https://ts.meamcs.com/rdweb

clip_image002
 

b.     You will see all the published programs:

clip_image004
 

c. For testing, open any of the programs. You will be prompted that the website is trying to run a RemoteApp program, as in the following snapshot; click Connect.

d. When prompted for credentials, enter the user name and password; also notice that the connection uses the new name configured earlier:

clip_image005
 

e.     Now the Active Directory module for PowerShell is working, as shown by the marked cmdlet:

clip_image007
 
Now is the time to configure our Surface RT with RemoteApp:
Configure RemoteApp on Surface:

1.     From the Start Menu > open Remote Desktop application.

2.     Either click on Access RemoteApp and Desktop Connection or from Settings charm click on Manage RemoteApp and desktops.

clip_image009
 

3.     Click Add New Connection, then enter the URL for your server https://URL/rdweb/feed/webfeed.aspx and click Connect.

clip_image011
 

4.     When prompted for credentials, enter the username and password:

clip_image013
 

5. Now your Surface is connecting to the server running RemoteApp and retrieving all the programs:

clip_image015
 

6.     Click OK after retrieving the list of programs:

clip_image017
 

7.     This is the list of the available programs which now appears also in the start menu:

clip_image019
 

8.     From the list of programs or from the start menu let’s start PowerShell ISE:

clip_image021
 

9.     As you can see, we have PowerShell ISE running with the Windows Azure AD module (Set-MsolUser is an Office 365 cmdlet) and with the Azure module (Add-AzureAccount is a Windows Azure cmdlet).

 
Now I’m enjoying the Surface RT with all the programs that I need, and this blog post has been uploaded using Live Writer from my Surface RT (or is it a Pro now? :))
 
 
Links to All Parts:
Part 1: Install Remote Desktop Services
Part 2: Configure RemoteApp
Part 3: Configure Surface

VHD Disk Requirements in Windows Azure Pack (WAP)


There are some prerequisites for VHD disks that need to be in place before you present a VHD disk in WAP for virtual machine provisioning; verifying all of the following will save you a lot of troubleshooting time:

  1. The library share that hosts the VHD must be added to the list of read-only shares in the cloud properties in VMM.
  2. The VHD must have the “FamilyName” property configured.
  3. The VHD must have the “Release” property set in a format like n.n.n.n (e.g. 1.0.0.0).
  4. The VHD must have the Operating System property set.
  5. The VHD must have the tags required by the gallery item; the tag requirements can be found in the gallery item’s readme. This property can only be configured through VMM PowerShell using the Set-SCVirtualHardDisk cmdlet (see the sketch below); note that data VHD disks do not require a tag.
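A rough sketch of setting these properties from the VMM PowerShell console might look like the following; the VHD name, family name, OS name, and tag are placeholders taken from a typical gallery item readme, so adjust them to match yours:

# Load the VHD object from the VMM library.
$vhd = Get-SCVirtualHardDisk -Name "WS2012R2-Gallery.vhdx"

# Resolve the operating system object expected by the gallery item.
$os = Get-SCOperatingSystem | Where-Object { $_.Name -eq "Windows Server 2012 R2 Standard" }

# Stamp the FamilyName, Release, OperatingSystem and Tag properties on the VHD.
Set-SCVirtualHardDisk -VirtualHardDisk $vhd `
    -FamilyName "Windows Server 2012 R2 Standard" `
    -Release "1.0.0.0" `
    -OperatingSystem $os `
    -Tag @("WindowsServer2012R2")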

More resources, with more details for each step (including videos), can be found here:

Troubleshoot Gallery Item:

http://blogs.technet.com/b/privatecloud/archive/2013/11/27/troubleshooting-windows-azure-pack-and-gallery-items-part-2.aspx

Troubleshoot WAP:

http://blogs.technet.com/b/privatecloud/archive/2013/11/08/troubleshooting-windows-azure-pack-spf-amp-vmm.aspx

Troubleshoot SPF:

http://blogs.technet.com/b/scvmm/archive/2013/11/12/general-troubleshooting-list-for-windows-azure-pack-wap-and-spf-integration.aspx

In conclusion, be aware of all the above prerequisites and verify them before you start testing VM creation from a VHD disk or from the gallery.

Windows Azure Pack – Implementation Issue – ACTIVE (OUT OF SYNC)


Recently I faced an issue while building a complete Windows Azure Pack solution to provide Infrastructure as a Service (IaaS). Initially everything was working properly while I was testing after each step of the WAP deployment: I was able to create three subscriptions, create some virtual machines using those subscriptions, and access them. Then, after finishing the whole deployment and configuring all post-deployment activities, I found an issue when trying to create any new subscription: the subscription status shows as “ACTIVE (OUT OF SYNC)” with the detailed error below:

image

“One or more errors occurred while contacting the underlying resource providers. The operation may be partially completed. Details: Failed to create subscription. Reason: Message : An error occurred while processing this request., Innermessage: <!DOCTYPE HTML PUBLIC ”-//W3C//DTD HTML 4.01//EN””https://www.w3.org/TR/html4/strict.dtd”> <HTML><HEAD><TITLE>Bad Request</TITLE> <META HTTP-EQUIV=”Connect-Type” Content=”text/html;charset=us-ascii”></HEAD> <BODY><h2>Bad Request – Invalid Hostname</h2> <hr><p>HTTP Error 400. The request hostname is invalid.</p> </BODY></HTML>

So I started troubleshooting with the steps below:

  1. First I checked in the VMM console, under “Settings/Security/User Roles”, and found that the new subscription administrator that should be created along with the new subscription is not there (only the admin users for the three subscriptions I created in the initial testing exist). This told me that the request did not reach SCVMM, so the issue is most likely in Service Provider Foundation (SPF).
  2. I followed the troubleshooting steps documented very well here: http://blogs.technet.com/b/privatecloud/archive/2013/11/08/troubleshooting-windows-azure-pack-spf-amp-vmm.aspx; however, the issue was not solved and I still got the same error.
  3. I then looked at each WAP component to make sure the same cumulative update level is applied on each WAP role (internal WAP tier, external WAP tier, SCVMM, SPF), and found they are all the same.
  4. I went a step deeper and enabled debugging in both SCVMM and SPF, following the steps documented here http://support.microsoft.com/kb/2850280 and here http://support.microsoft.com/kb/2913445/en-us. The result of this debugging made me 100% sure that the issue is in SPF, not in SCVMM, simply because the error is logged in SPF only; part of the SPF log file after enabling debugging, showing the same error, is below:

[1]0AB8.0ACC::‎2015‎-‎01‎-‎08 15:39:33.118 [Microsoft-ServiceProviderFoundation]Component: Provider     Activity [WebAuthentication Call, id {f13f5bc4-d696-4beb-be36-60fc99d01c82}]  Parent activity [none, id {00000000-0000-0000-0000-000000000000}]    Elapsed: 0ms  Context: {9a298a09-5e36-4e1f-b163-6e91f54a8b14}    Message : An error occurred while processing this request., InnerMessage: <!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01//EN""http://www.w3.org/TR/html4/strict.dtd">  <HTML><HEAD><TITLE>Bad Request</TITLE>  <META HTTP-EQUIV="Content-Type" Content="text/html; charset=us-ascii"></HEAD>  <BODY><h2>Bad Request - Invalid Hostname</h2>  <hr><p>HTTP Error 400. The request hostname is invalid.</p>  </BODY></HTML>

  • Then I followed some recommendations from the community and changed the Service Provider Foundation registration to use a local SPF user (this local user must be created with the same name and password on all SPF servers); however, after this change the same issue was still there.

  • Finally, I started tracking all the changes made in the environment after the first successful test of creating new subscriptions, and rolled back one of them: replacing the self-signed certificate on the SPF web site with a new certificate issued by the internal enterprise CA. After rolling back and returning the SPF web site to the self-signed certificate, the issue was gone; I could create new subscriptions, and I was also able to sync all the subscriptions created while troubleshooting the issue.

So it was a Service Provider Foundation issue caused by replacing the self-signed certificate with a new certificate from the enterprise CA, even though all WAP components are members of the same Active Directory domain as the CA. The most important thing is that I went through all the troubleshooting techniques related to Windows Azure Pack to solve the issue, and this will save me time in future Windows Azure Pack deployments.

Good luck with your WAP deployment; I hope this post will help other consultants deploying Windows Azure Pack and Service Provider Foundation.

More Windows Azure Pack solved implementation issues can be found below:

Windows Azure Pack – Implementation Issue – Failed to create VM from sysprep vhdx


I went through another very interesting issue during a WAP implementation while stabilizing a WAP production environment. Although we were able to create standalone virtual machines from different templates, and also from a Windows Server 2008 R2 sysprep VHDX, WAP failed to create a standalone virtual machine from a Windows Server 2012 R2 sysprep VHDX. The error shown in WAP was as follows:

Virtual Machine Manager cannot detect a heartbeat from the specified virtual machine. Either the virtual machine is not running or Virtual Machine Additions is not installed. Verify the status of the virtual machine by connecting to it using Virtual Machine Remote Client (VMRC), and then try the operation again. Please contact your system administrator with this error ID.

When checking in SCVMM, the error is as follows:

Error (609)

Virtual Machine Manager cannot detect a heartbeat from the specified virtual machine. Either the virtual machine is not running or Virtual Machine Additions is not installed.

Recommended Action

Verify the status of the virtual machine by connecting to it using Virtual Machine Remote Client (VMRC), and then try the operation again.


What was a little different in the details of the SCVMM error is that the failed step is called “Customize Virtual Machine”.

So I decided to use the same sysprep vhdx to create a virtual machine directly from SCVMM and found that the VM creation went through without issues, which means that something sent in the Windows Azure Pack request was responsible for the issue.

After some discussions with folks, I realized that something in the WAP plan responsible for virtual machine naming/renaming could be the source of the issue, so I went through all the plan settings in the WAP admin site and changed the custom settings by selecting the “Use template to define computer name” check box, as shown in the screenshot below.

clip_image002

Once the plan was saved, and after logging out and back in to the tenant site with one of the tenant administrator accounts, I was able to create a new VM from the same sysprep vhdx that had been giving the error.

So, although the error was generated by SCVMM, the source of the issue was the Windows Azure Pack plan; once the plan settings were fixed, the issue was solved.

More Windows Azure Pack solved implementation issues can be found below:

Good luck with your WAP implementation, and stay tuned for more solved issues.

Windows Azure Pack – Implementation Issue – G2 Hardware Profile could not be selected when creating a VM from sysprep vhdx


Another minor issue I found while stabilizing a Windows Azure Pack IaaS environment: when trying to create a standalone virtual machine by selecting a sysprep vhdx from the gallery in WAP, you could not select any of the available hardware profiles configured for virtual machine Generation 2, while all Generation 1 hardware profiles could be selected normally. To troubleshoot, I created a new hardware profile with the same configuration as the Generation 2 profile I could not select, but this time configured as Generation 1, and added it to the plan used by the same tenant admin I was testing with. The new hardware profile could be selected, which confirmed that the issue came from the Generation 2 setting in the hardware profile. I raised the issue to the Windows Azure Pack product community and was told that this is by design: the only way to create a Generation 2 VM from WAP is to create it from a template, not from a sysprep vhdx.

So it is not an issue as such, but a by-design restriction in the current WAP version that prevents you from using a hardware profile configured for Generation 2 VMs when creating a standalone virtual machine from a sysprep VHDX; the only way to create a Generation 2 virtual machine in WAP is to create it from a template configured with Generation 2 VM settings.

More Windows Azure Pack solved implementation issues can be found below:

Good luck with your WAP implementation, and stay tuned for more solved issues.


Enable .NET Framework 3.5 on a Windows Server 2012 R2 VM created in the Azure environment


To install SQL Server on a Windows Server 2012 R2 VM in Azure, .NET Framework 3.5 must be installed first on the same server.
.NET Framework 3.5 can normally be installed using one of the following two methods:
1. DISM /Online /Enable-Feature /FeatureName:NetFx3 /All /LimitAccess /Source:"D:\sxs"
2. Enable the feature in the Add Roles and Features Wizard and specify the alternate source path "D:\sxs"

Both of the above methods require source files that are part of the installation media, but in Azure VMs the installation disk is not available to use as the source path.

The error below appears if the installation fails:

Feature installation
Installation of one or more roles, role services, or features failed.
“The source files could not be found. Try installing the roles, role services, or features again in a new Add Roles and Features Wizard session, and on the Confirmation page of the wizard, click "Specify an alternate source path" to specify a valid location of the source files that are required for installation. The location must be accessible by the computer account of the destination server.”

Solution to enable the .NET Framework 3.5 feature in Azure: download and install the latest updates using Windows Update on the Windows Server 2012 R2 VM in Azure.


Follow the steps below to enable the .NET Framework 3.5 feature on a Windows Server 2012 R2 VM created in the Azure environment:
Go to Control Panel\System and Security\Windows Update
Check for the latest updates (note: try disabling IE Enhanced Security Configuration under Server Manager if you are unable to connect to the internet to check for updates)
Download and install the latest updates
Restart the server if required.

Once the server is restarted, enable .NET Framework 3.5 using the Add Roles and Features Wizard under Server Manager by following the steps below:

Open Server Manager and click Dashboard
In the right pane, click Add roles and features
Click Next and choose Role-based or feature-based installation
Click Next until the Features page appears, check the box for the .NET Framework 3.5 feature and click Next
Click Install without specifying an alternate source path to install the .NET Framework 3.5 feature.

The above issue occurs in Azure because we cannot point the alternate source path at installation media, which is not available in the VM. So, to enable the .NET Framework 3.5 feature on a Windows Server 2012 R2 VM in Azure as a prerequisite for SQL Server or any other application, check for and download the latest Windows updates on the VM and then enable the feature without using an alternate source path.
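For scripted builds, the same result can usually be achieved from an elevated PowerShell session once the VM has been updated as described above; this is a minimal sketch and not part of the original walkthrough:

# Assumes Windows Update has already brought the VM up to date, as described above
Import-Module ServerManager
Install-WindowsFeature -Name NET-Framework-Core    # no alternate source path is specified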

Run two PowerShell scripts on the same VM through the custom script extension at different stages of deployment in ARM


Introduction – This blog post illustrates a method through which you can run two different PowerShell scripts on the same VM through the custom script extension at different stages/times of a deployment in ARM. Currently, it is not possible to have two custom script extensions perform two different tasks on the same VM.

Assumptions – Here we assume that you are familiar with the basics of deploying resources in the Azure preview portal in ARM mode and with the construction and use of JSON templates.

Problem statement – I had a requirement where I had to deploy an IaaS infrastructure in ARM through a PowerShell orchestration script and a JSON template. This included the creation of IaaS VMs such as domain controllers and SQL VMs. At one point I had to create AD domain users on the domain controller through a PowerShell script, and at the final stage of the deployment (after the SQL VMs were configured) I had to push group policy onto the same domain controller through another PowerShell script. After adding a resource block in the JSON for the second PowerShell script, I ran the complete deployment. As expected, at the final stage, while pushing the GPO PowerShell script into the DC, the deployment failed with the following error:

New-AzureResourceGroup : 11:12:54 AM - Resource Microsoft.Compute/virtualMachines/extensions 'adgptst02/ADGPO' failed with message 'Multiple VMExtensions per handler not supported for OS type 'Windows'. VMExtension 'ADGPO' with handler 'Microsoft.Compute.CustomScriptExtension' already added or specified in input.'

This was expected, as the VM already had a custom script extension injected previously for creating domain users at an earlier stage of the deployment.

Resolution/Workaround – Follow the steps below to resolve this:

1. Remove the resource block for the second PowerShell script from the JSON template for now.

2. In the PowerShell orchestration script from which you run the deployment, add the command below to remove the custom script extension once the deployment is done. This command goes right after the New-AzureRmResourceGroupDeployment command that performs the deployment:

Remove-AzureRmVMCustomScriptExtension -ResourceGroupName $ResourceGroupName -VMName $CustVMname -Name $customscriptname -Force

Note – Replace the variables with actual values. This removes the custom script extension from the VM and has no effect on the configuration already performed by the earlier custom PowerShell script.

3. Create a new JSON template (from the same template you are using for the deployment) which has only one resource block, for the second custom script extension (delete all other resource blocks, as those tasks will already have been completed). You don't need to change the parameters and variables sections of the template, as most of the values will not be used and will have no effect; you also don't need to change the template parameter file. Below is what the resource block for the second custom script extension looks like:

"resources": [


       {
           "type": "Microsoft.Compute/virtualMachines/extensions",
           "name": "[concat(parameters('ADVirtualMachine'),'/ADGPO')]",
           "apiVersion": "2015-06-15",
           "location": "[parameters('location')]",
           "properties": {
               "publisher": "Microsoft.Compute",
               "type": "CustomScriptExtension",
               "typeHandlerVersion": "1.4",
               "settings": {
                   "fileUris": [
                       "[variables('ADGPOScriptFileUri')]"
                   ],
                   "commandToExecute": "[variables('ADGPOToExecute')]"
               },
               "protected Settings": {
                   "storageAccountName": "[variables('ADcustomScriptStorageAccountName')]",
                   "storageAccountKey": "[listKeys(variables('ADaccountid'),'2015-05-01-preview').key1]"
               }

 

           }
       }


   ]

4. Save the JSON template with a different name, alongside the parent template for the deployment.

5. In the PowerShell orchestration script, after Remove-AzureRmVMCustomScriptExtension, run New-AzureRmResourceGroupDeployment once again, this time with the new JSON template. In short, the process is to remove the custom script extension first and then add it again with the required script; the overall flow is sketched below.
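As a minimal sketch of the full orchestration flow (template names and variable values below are placeholders, not the ones from the original deployment):

# Stage 1: main deployment, which injects the first custom script extension (AD user creation)
New-AzureRmResourceGroupDeployment -ResourceGroupName $ResourceGroupName -TemplateFile ".\azuredeploy.json" -TemplateParameterFile ".\azuredeploy.parameters.json"

# Remove the first custom script extension so the extension handler slot is free again
Remove-AzureRmVMCustomScriptExtension -ResourceGroupName $ResourceGroupName -VMName $CustVMname -Name $customscriptname -Force

# Stage 2: second deployment using the trimmed template that contains only the GPO custom script extension
New-AzureRmResourceGroupDeployment -ResourceGroupName $ResourceGroupName -TemplateFile ".\azuredeploy-gpo.json" -TemplateParameterFile ".\azuredeploy.parameters.json"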

 

Thanks folks, hope it is useful.

Happy blogging!

Import Database schema in Azure SQL DB from .SQL files programmatically with SQLCMD


Introduction – This blog post illustrates a method through which you can import your database schema and tables into an empty Azure SQL DB (PaaS) programmatically. Currently, Azure SQL DB supports import from a BACPAC file in PowerShell and the GUI, but not from .SQL files.

Assumptions – Here we assume that you already have .SQL files generated from an on-premises database and ready to upload to Azure SQL DB.

Problem statement – I had a requirement where I needed to import the schema and tables into an empty Azure SQL DB from .SQL files. Currently Azure only provides import of BACPAC files out of the box from PowerShell, the GUI, and SQL Server Management Studio, but the requirement here was to do it programmatically every time the ARM deployment script creates a new Azure SQL DB.

 

Resolution/Workaround – Follow the steps below:

1. Install SQLCMD on the VM/desktop from which you are running the script or deployment. The SQLCMD utility is used to deploy the .SQL files into Azure SQL DB. The ODBC driver is required before installing SQLCMD:

ODBC driver - http://www.microsoft.com/en-in/download/details.aspx?id=36434

SQLCMD - http://www.microsoft.com/en-us/download/details.aspx?id=36433

2. Save all the .SQL files into a folder on the local VM.

3. Get the public IP of your local VM/desktop using the code below.

$IP = Invoke-WebRequest checkip.dyndns.com
$IP1 = $IP.Content.Trim()
$IP2 = $IP1.Replace("<html><head><title>Current IP Check</title></head><body>Current IP Address: ","")
$FinalIP = $IP2.Replace("</body></html>","")

4. Create a new firewall rule to allow your public IP to connect to the SQL server.

New-AzureRmSqlServerFirewallRule -FirewallRuleName $rulename -StartIpAddress $FinalIP -EndIpAddress $FinalIP -ServerName $SQLservername -ResourceGroupName $resourcegroupname

5. Save the SQL server's full name and the sqlcmd path into variables.

$Fullservername = $SQLservername + '.database.windows.net'
$sqlcmd = "C:\Program Files\Microsoft SQL Server\Client SDK\ODBC\110\Tools\Binn\SQLCMD.EXE"

6. Save the SQL server credentials and the Azure SQL DB name in variables.

$username = "SQLusername"

$password = "SQLpassword"

$dbname = "databasename"

7. Run the command below for each .SQL file if you want to import them sequentially (a loop-based variant is sketched after these commands).

& $sqlcmd -U $username -P $password -S $Fullservername -d $dbname -I -i "C:\SQL\file1.sql"

& $sqlcmd -U $username -P $password -S $Fullservername -d $dbname -I -i "C:\SQL\file3.sql"

& $sqlcmd -U $username -P $password -S $Fullservername -d $dbname -I -i "C:\SQL\filen.sql"
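If there are many files, a short loop over the folder does the same thing; this is a sketch that assumes the files can be applied in alphabetical order:

# Run every .sql file in the folder against the Azure SQL DB, in name order
Get-ChildItem -Path "C:\SQL" -Filter *.sql | Sort-Object Name | ForEach-Object {
    & $sqlcmd -U $username -P $password -S $Fullservername -d $dbname -I -i $_.FullName
}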

 

NOTE – You can consolidate all of the above code and use it in deployment scripts along with functions and error logging.

 

Thanks folks, hope it is useful.

Happy blogging!

Install and Configure Exchange 2013 Edge Transport Role


When Exchange 2013 was released a couple of years back, there were many changes in the Exchange 2013 architecture, notably the consolidation of the many Exchange roles from previous versions of Exchange into the CAS and Mailbox roles in Exchange 2013.

However, one role was discontinued: the Edge Transport role, which was responsible for providing the first line of defence against malware, spam and viruses. Malware and spam filtering in Exchange 2013 was done by the CAS together with the Mailbox role, which was good, but it didn't address the security scenario for every organization, so with the release of Exchange 2013 SP1 Microsoft reintroduced the Edge Transport role in Exchange 2013.

In this article we will talk about how to install the Edge Transport role and configure it.

Overview: The Edge Transport server role is one of the three roles now available with Exchange 2013 SP1. The main purpose of the Edge Transport role is to minimize the attack surface by handling all Internet-facing mail flow, providing SMTP (Simple Mail Transfer Protocol) relay and smart host services for your Exchange organization. The Edge Transport server is usually placed in the perimeter network or DMZ zone. Edge Transport has some additional transport agents that are not installed on Mailbox servers. Here is the complete list of transport agents for Edge Transport:

In comparison, here is the list for the Mailbox server role.

Prerequisites for Installing Edge Transport.

Note: The Edge server doesn't need to be a domain-joined machine, although you can also use a domain-joined machine for installing the Edge Transport role. However, a workgroup machine still needs to resolve the Mailbox server name, and the Mailbox server must be able to resolve the Edge server, so an FQDN is required.

Configure Edge Server Primary DNS Suffix

  1. Change the computer name and also provide the primary DNS suffix, as seen in the screenshot below.

 

  2. Create an A (host) record in DNS for the Edge server.

Note: Make sure the Edge Transport server is reachable (pingable) from the internal network.

 

Open Firewall ports.

 

  1. Port TCP 25 (SMTP) inbound/outbound between the internet and the Edge Transport server
  2. Port TCP 25 (SMTP) inbound/outbound between the Edge Transport server and the internal network

  3. Ports TCP 50636 and 50389 from the internal network to the Edge Transport server for EdgeSync

 

Installing Active Directory Lightweight Directory Services (AD LDS)

Note: The Edge server doesn't have access to Active Directory; however, the Edge server sometimes needs information from AD, such as the Mailbox server configuration and recipient info.

AD LDS is required to store the configuration and recipient information used by the Edge Transport server.

 

  1. Open PowerShell on the Edge server and run it as Administrator

2. Import the Server Manager PowerShell module: Import-Module ServerManager

3. Run Install-WindowsFeature ADLDS.

 

Installing Exchange Server 2013 Edge Transport Role

 

  1. Open Command Prompt and run it as Administrator
  2. Type Setup.exe /m:Install /r:et /IAcceptExchangeServerLicenseTerms

 

3. Restart the Server once Installation is complete.

              

 

 

Configuring Edge Subscription for Exchange Server 2013

An Edge Transport server doesn't have direct access to Active Directory. The configuration and recipient information the Edge Transport server uses to process messages is stored locally in AD LDS. Creating an Edge Subscription establishes secure, automatic replication of information from Active Directory to AD LDS. The Edge Subscription process provisions the credentials used to establish a secure LDAP connection between Exchange 2013 Mailbox servers and a subscribed Edge Transport server. The Microsoft Exchange EdgeSync service (EdgeSync) that runs on Mailbox servers performs periodic one-way synchronization to transfer up-to-date data to AD LDS. This reduces the administration tasks you perform in the perimeter network by letting you configure the Mailbox server and then synchronize that information to the Edge Transport server.

You subscribe an Edge Transport server to the Active Directory site that contains the Mailbox servers responsible for transferring messages to and from your Edge Transport servers. The Edge Subscription process creates an Active Directory site membership affiliation for the Edge Transport server. The site affiliation enables Mailbox servers in the Exchange organization to relay messages to the Edge Transport server for delivery to the Internet without having to configure explicit Send connectors.

One or more Edge Transport servers can be subscribed to a single Active Directory site. However, an Edge Transport server can't be subscribed to more than one Active Directory site. If you have more than one Edge Transport server deployed, each server can be subscribed to a different Active Directory site. Each Edge Transport server requires an individual Edge Subscription.

 

Create an Edge Subscription file:

Log on to your Edge Transport server, open the Exchange Management Shell, and run it as administrator.

Type the command below to create the subscription file.
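The screenshot with the exact command is not reproduced in this copy of the post; a typical example (the file path is a placeholder) is:

# On the Edge Transport server: export the Edge Subscription file
New-EdgeSubscription -FileName "C:\EdgeSubscriptionInfo.xml"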

Copy the Edge Subscription file to a Mailbox server

  1. Copy the Edge Subscription file to a Mailbox server or a file share that's accessible from the Active Directory site containing your Mailbox servers.
  2. Log on to the Mailbox server.
  3. Open the Exchange Management Shell and run the command below
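Again, the original screenshot is not reproduced; a typical import command (the file path and Active Directory site name are placeholders) is:

# On the Mailbox server: import the Edge Subscription and associate it with the AD site
New-EdgeSubscription -FileData ([System.IO.File]::ReadAllBytes('C:\EdgeSubscriptionInfo.xml')) -Site "Default-First-Site-Name"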

 

Remove External Send Connector

The Edge Subscription creates two Send connectors for relaying mail over the internet. If you have previously configured internet-bound Send connectors, you will need to remove them after you have deployed the Edge Transport server role.

To get the Send connectors, run the command below

To remove any Send connector, use the command below
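The screenshots of these commands are not included in this copy of the post; typical examples (the connector name is a placeholder) are:

# List the Send connectors in the organization
Get-SendConnector

# Remove a previously created internet-bound Send connector
Remove-SendConnector -Identity "Internet Send Connector"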

SCOM Advanced Authoring : Powershell Discovery from CSV file – Explained using “TCP Port Monitoring” Scenario


SCOM is an exceptional tool that allows IT administrators to customize monitoring scenarios to any extent. To build a customized monitoring solution, one must understand the authoring capabilities in SCOM, so that the solution can be easily implemented, is highly optimized, and has less overhead on SCOM management servers and agents.

In this post we will discuss PowerShell discovery from a centrally located configuration file (CSV format), with an example scenario involving TCP port monitoring. We will also discuss the impact of this method on end users and IT/SCOM administrators.

Basics – Classes, Objects, Targets and Discoveries:

An object is the basic unit of management in Operations Manager. An object typically represents something in your computing environment, such as a computer, a logical disk, or a database. A class represents a kind of object, and every object in Operations Manager is considered an instance of a particular class. A target in the Operations console represents all instances of a particular class. A discovery is a special kind of rule to populate the class with instances.

Scenario:

Consider a scenario where you have a datacenter with 1000+ Windows and Unix servers. We, as SCOM administrators, are requested to configure monitoring for various TCP ports from different watcher nodes across various servers in the datacenter.

This can be accomplished using the TCP Port template in the Authoring pane of the console, but the drawbacks of using this template are:

Each port for each server needs to be configured manually.

Each entry creates a bunch of classes, groups and overrides, and the number of workflows increases, which will impact SCOM performance.

There is no central configuration/information on what is being monitored and the monitoring criteria.

Future changes need to be manually configured in the console.

If a watcher node is decommissioned, each port monitored by that watcher node needs to be moved to another watcher node manually.

Each time the application team has a new request to add, delete or modify, the SCOM administrator needs to make changes. In a real environment, this includes change requests, approvals, etc., which can consume considerable time.

Effective Solution:

To overcome these issues, it is better to keep the configuration in a central location and pull the information into SCOM at regular intervals. This way, once the initial configuration is set up:

The application team can maintain the list and can follow their own approval process to add/modify/delete.

The list can be mass updated.

The addition/modification/deletion is automatically sync’ed with SCOM at regular intervals.

No new workflows are added to SCOM for every addition, and hence the impact on SCOM performance is minimal. Thus, with a handful of monitors and rules, thousands of objects can be monitored.

The information is available centrally.

The monitoring solution is self supported and cost effective in terms of support hours.

Below is the step-by-step process, with XML fragments included. The entire MP XML file is attached to the blog; you can download it and test it in your lab.

Step 1: Create a new management pack “GKLab.TCP.Port.Monitoring”

Step 2: Add “Microsoft.SystemCenter.SyntheticTransactions.Library.mp” as reference.

Here is the XML fragment for Step 1 and Step 2:

<?xml version="1.0" encoding="utf-8"?>
<ManagementPack ContentReadable="true" SchemaVersion="2.0" OriginalSchemaVersion="1.0" xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <Manifest>
    <Identity>
      <ID>GKLab.TCP.Port.Monitoring</ID>
      <Version>1.0.0.0</Version>
    </Identity>
    <Name>GKLab.TCP.Port.Monitoring</Name>
    <References>
      <Reference Alias="SystemCenter">
        <ID>Microsoft.SystemCenter.DataWarehouse.Library</ID>
        <Version>7.1.10226.0</Version>
        <PublicKeyToken>31bf3856ad364e35</PublicKeyToken>
      </Reference>
      <Reference Alias="Windows">
        <ID>Microsoft.Windows.Library</ID>
        <Version>7.5.8501.0</Version>
        <PublicKeyToken>31bf3856ad364e35</PublicKeyToken>
      </Reference>
      <Reference Alias="MicrosoftSystemCenterSyntheticTransactionsLibrary">
        <ID>Microsoft.SystemCenter.SyntheticTransactions.Library</ID>
        <Version>7.1.10226.1090</Version>
        <PublicKeyToken>31bf3856ad364e35</PublicKeyToken>
      </Reference>
      <Reference Alias="Performance">
        <ID>System.Performance.Library</ID>
        <Version>7.0.8433.0</Version>
        <PublicKeyToken>31bf3856ad364e35</PublicKeyToken>
      </Reference>
      <Reference Alias="System">
        <ID>System.Library</ID>
        <Version>7.5.8501.0</Version>
        <PublicKeyToken>31bf3856ad364e35</PublicKeyToken>
      </Reference>
      <Reference Alias="SC">
        <ID>Microsoft.SystemCenter.Library</ID>
        <Version>7.0.8433.0</Version>
        <PublicKeyToken>31bf3856ad364e35</PublicKeyToken>
      </Reference>
      <Reference Alias="Health">
        <ID>System.Health.Library</ID>
        <Version>7.0.8433.0</Version>
        <PublicKeyToken>31bf3856ad364e35</PublicKeyToken>
      </Reference>
    </References>
  </Manifest>

 

Step 3: Create a custom class “GKLab.TCP.Port.Monitoring.Class” to store TCP Port monitoring configuration. The base class is “Microsoft.SystemCenter.SyntheticTransactions.TCPPortCheckPerspective”

<TypeDefinitions>
  <EntityTypes>
    <ClassTypes>
      <ClassType ID="GKLab.TCP.Port.Monitoring.Class" Accessibility="Internal" Abstract="false" Base="MicrosoftSystemCenterSyntheticTransactionsLibrary!Microsoft.SystemCenter.SyntheticTransactions.TCPPortCheckPerspective" Hosted="true" Singleton="false" Extension="false">
        <Property ID="ServerName" Type="string" AutoIncrement="false" Key="true" CaseSensitive="false" MaxLength="256" MinLength="0" Required="false" Scale="0" />
        <Property ID="Port" Type="int" AutoIncrement="false" Key="true" CaseSensitive="false" MaxLength="256" MinLength="0" Required="false" Scale="0" />
        <Property ID="NoOfRetries" Type="int" AutoIncrement="false" Key="false" CaseSensitive="false" MaxLength="256" MinLength="0" Required="false" Scale="0" />
        <Property ID="TimeWindowInSeconds" Type="int" AutoIncrement="false" Key="false" CaseSensitive="false" MaxLength="256" MinLength="0" Required="false" Scale="0" />
      </ClassType>
    </ClassTypes>
  </EntityTypes>

Step 4: Now we need to create the discovery data source. Before that, we will discuss the CSV file format we will use to store the configuration data.

We will name it “TCPPortMonitoringList.csv”. The CSV has ServerName, PortNumber, WatcherNode, IntervalSeconds, NoOfRetries and TimeWindowInSeconds as its header.

ServerName – Monitored Server Name (NetBIOS or FQDN)

PortNumber – Port Number to be monitored in monitored server.

WatcherNode – Computer/SCOM Agent that needs to monitor the port in monitored server.

IntervalSeconds – Monitoring Interval in seconds.

NoOfRetries – Number of times the monitor should fail before the alert is generated. This will reduce the alerts generated due to network latency. (Minimum value – 2)

TimeWindowInSeconds – Total time interval within which the monitor has to fail to generate an alert. (Minimum value = IntervalSeconds)

image
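For reference, an illustrative CSV with made-up values (the values in the original screenshot are not reproduced here) would look like this:

ServerName,PortNumber,WatcherNode,IntervalSeconds,NoOfRetries,TimeWindowInSeconds
APPSRV01,1433,WATCHER01,300,2,600
WEBSRV02.contoso.local,443,WATCHER01,300,3,900
UNIXSRV03,22,WATCHER02,600,2,1200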

Step 5: Since we will use Powershell Script discovery, create a custom data source with a System.SimpleScheduler module and a Microsoft.Windows.PowerShellDiscoveryProbe probe module.

Since we have a centralized configuration CSV file, we can run the discovery from any one management server and populate the objects. In SCOM 2012, we will target the discovery at the All Management Servers Resource Pool, so that any one MS will pick up the workflow. The discovery is thus highly available. The CSV file should be on a shared path so that it can be accessed from any MS.

Below is the XML fragment for the custom discovery module with the embedded PowerShell script.

1 <ModuleTypes> 2 <DataSourceModuleType ID="GKLab.TCP.Port.Monitoring.Discovery.DataSource" Accessibility="Internal" Batching="false"> 3 <Configuration> 4 <xsd:element minOccurs="1" name="IntervalSeconds" type="xsd:integer" xmlns:xsd="http://www.w3.org/2001/XMLSchema" /> 5 <xsd:element minOccurs="1" name="SyncTime" type="xsd:string" xmlns:xsd="http://www.w3.org/2001/XMLSchema" /> 6 <xsd:element minOccurs="1" name="filePath" type="xsd:string" xmlns:xsd="http://www.w3.org/2001/XMLSchema" /> 7 </Configuration> 8 <OverrideableParameters> 9 <OverrideableParameter ID="IntervalSeconds" Selector="$Config/IntervalSeconds$" ParameterType="int" /> 10 <OverrideableParameter ID="FilePath" Selector="$Config/filePath$" ParameterType="string" /> 11 </OverrideableParameters> 12 <ModuleImplementation Isolation="Any"> 13 <Composite> 14 <MemberModules> 15 <DataSource ID="DS" TypeID="System!System.SimpleScheduler"> 16 <IntervalSeconds>$Config/IntervalSeconds$</IntervalSeconds> 17 <SyncTime>$Config/SyncTime$</SyncTime> 18 </DataSource> 19 <ProbeAction ID="Probe" TypeID="Windows!Microsoft.Windows.PowerShellDiscoveryProbe"> 20 <ScriptName>TCPPortMonitoringConfigDiscovery.ps1</ScriptName> 21 <ScriptBody> 22 param( 23 [string] $sourceId, 24 [string] $managedEntityId, 25 [string] $filePath ) 26 27 #Initialize SCOM API 28 29 $api = new-object -comObject 'MOM.ScriptAPI' 30 $discoveryData = $api.CreateDiscoveryData(0, $SourceId, $ManagedEntityId) 31 write-eventlog -logname "Operations Manager" -Source "Health Service Script" -EventID 999 -Message "TCP Port Monitoring: looking for CSV file" -EntryType Information 32 # $filePath variable contains UNC path of CSV Config file 33 if (test-path $filePath) { 34 write-eventlog -logname "Operations Manager" -Source "Health Service Script" -EventID 999 -Message "TCP Port Monitoring: Accessing CSV file" -EntryType Information 35 $contents = Import-Csv $filePath 36 try{ 37 $Path = (Get-ItemProperty "HKLM:SOFTWARE\Microsoft\System Center Operations Manager\12\Setup\Powershell\V2").InstallDirectory 38 $Path1 = $Path + "OperationsManager\OperationsManager.psm1" 39 if (Test-Path $Path1) 40 { 41 Import-Module $Path1 42 } 43 else 44 { 45 Import-Module OperationsManager 46 } 47 New-SCOMManagementGroupConnection 48 #Retrieve all windows computers which can be used as watcher nodes 49 $allServers = Get-SCClass | where { $_.Name -eq ("Microsoft.Windows.Computer")} | get-scommonitoringobject 50 } 51 catch{ 52 write-eventlog -logname "Operations Manager" -Source "Health Service Script" -EventID 999 -Message "TCP Port Monitoring: $_" -EntryType Information 53 } 54 #Read line by line from configuration file and create instance of TCP Port Monitoring Class 55 $contents | ForEach-Object{ 56 $ServerName = $_.ServerName 57 $PortNumber = $_.PortNumber 58 $WatcherNode = $_.WatcherNode 59 $NoOfRetries = $_.NoOfRetries 60 $TimeWindowInSeconds = $_.TimeWindowInSeconds 61 $Config = "$ServerName"+":"+"$PortNumber" # Will be used as display name 62 write-eventlog -logname "Operations Manager" -Source "Health Service Script" -EventID 555 -Message "Checking servers" -EntryType Information 63 $allServers | ForEach-Object{ 64 #Create instance only if the watcher node is managed by SCOM as the instance will hosted by the watcher node. 65 #The hosting object is windows computer whose display name is equal to watcher node value from CSV 66 #If there is no matching windows computer managed by SCOM, then the instance cannot be hosted. Hence the instance is not discovered. 
67 if((($_.DisplayName).toLower()).contains($WatcherNode.toLower())){ 68 write-eventlog -logname "Operations Manager" -Source "Health Service Script" -EventID 555 -Message "Creating Instance for $Config" -EntryType Information 69 $instance = $discoveryData.CreateClassInstance("$MPElement[Name='GKLab.TCP.Port.Monitoring.Class']$") 70 $instance.AddProperty("$MPElement[Name='GKLab.TCP.Port.Monitoring.Class']/ServerName$", $ServerName) 71 $instance.AddProperty("$MPElement[Name='GKLab.TCP.Port.Monitoring.Class']/Port$", $PortNumber) 72 $instance.AddProperty("$MPElement[Name='GKLab.TCP.Port.Monitoring.Class']/NoOfRetries$", $NoOfRetries) 73 $instance.AddProperty("$MPElement[Name='GKLab.TCP.Port.Monitoring.Class']/TimeWindowInSeconds$", $TimeWindowInSeconds) 74 #The hosting object is windows computer whose display name is equal to watcher node value from CSV 75 $instance.AddProperty("$MPElement[Name='Windows!Microsoft.Windows.Computer']/PrincipalName$", $_.DisplayName) 76 $instance.AddProperty("$MPElement[Name='System!System.Entity']/DisplayName$", $Config) 77 $discoveryData.AddInstance($instance) 78 return 79 } 80 } 81 } 82 } 83 $discoveryData 84 Remove-variable api 85 Remove-variable discoveryData 86 </ScriptBody> 87 <Parameters> 88 <Parameter> 89 <Name>sourceId</Name> 90 <Value>$MPElement$</Value> 91 </Parameter> 92 <Parameter> 93 <Name>managedEntityId</Name> 94 <Value>$Target/Id$</Value> 95 </Parameter> 96 <Parameter> 97 <Name>filePath</Name> 98 <Value>$Config/filePath$</Value> 99 </Parameter> 100 </Parameters> 101 <TimeoutSeconds>300</TimeoutSeconds> 102 </ProbeAction> 103 </MemberModules> 104 <Composition> 105 <Node ID="Probe"> 106 <Node ID="DS" /> 107 </Node> 108 </Composition> 109 </Composite> 110 </ModuleImplementation> 111 <OutputType>System!System.Discovery.Data</OutputType> 112 </DataSourceModuleType> 113 </ModuleTypes> 114 </TypeDefinitions>

Step 6: Now that we have created the discovery data source, we will create a discovery, GKLab.TCP.Port.Monitoring.Discovery.

Below is the discovery xml fragment. The UNC Path is mentioned in filePath.

<Monitoring>
  <Discoveries>
    <Discovery ID="GKLab.TCP.Port.Monitoring.Discovery" Enabled="false" Target="SC!Microsoft.SystemCenter.AllManagementServersPool" ConfirmDelivery="true" Remotable="true" Priority="Normal">
      <Category>Discovery</Category>
      <DiscoveryTypes>
        <DiscoveryClass TypeID="GKLab.TCP.Port.Monitoring.Class" />
      </DiscoveryTypes>
      <DataSource ID="DS" TypeID="GKLab.TCP.Port.Monitoring.Discovery.DataSource">
        <IntervalSeconds>500</IntervalSeconds>
        <SyncTime>00:00</SyncTime>
        <filePath>\\SCOM2012R2\Configs\TCPMonitoringConfig.csv</filePath>
      </DataSource>
    </Discovery>
  </Discoveries>
</Monitoring>

Step 8: Add Language Packs and close the ManagementPack tag.

<LanguagePacks>
  <LanguagePack ID="ENU" IsDefault="true">
    <DisplayStrings>
      <DisplayString ElementID="GKLab.TCP.Port.Monitoring">
        <Name>GKLab TCP Port Monitoring</Name>
        <Description>This Management pack monitors the list of ports discovered from config file.</Description>
      </DisplayString>
      <DisplayString ElementID="GKLab.TCP.Port.Monitoring.Class">
        <Name>GKLab TCP Port Monitoring Class</Name>
        <Description>Class Contains Instances of TCP Ports that needs to be monitored from specific watcher nodes</Description>
      </DisplayString>
      <DisplayString ElementID="GKLab.TCP.Port.Monitoring.Class" SubElementID="NoOfRetries">
        <Name>No Of Retries</Name>
      </DisplayString>
      <DisplayString ElementID="GKLab.TCP.Port.Monitoring.Class" SubElementID="Port">
        <Name>Port</Name>
      </DisplayString>
      <DisplayString ElementID="GKLab.TCP.Port.Monitoring.Class" SubElementID="ServerName">
        <Name>Server Name</Name>
      </DisplayString>
      <DisplayString ElementID="GKLab.TCP.Port.Monitoring.Class" SubElementID="TimeWindowInSeconds">
        <Name>Time Window In Seconds</Name>
      </DisplayString>
      <DisplayString ElementID="GKLab.TCP.Port.Monitoring.Discovery">
        <Name>GKLab TCP Port Monitoring Discovery</Name>
        <Description>Discovers TCP Port Monitoring Configs from given CSV file.</Description>
      </DisplayString>
      <DisplayString ElementID="GKLab.TCP.Port.Monitoring.Discovery.DataSource">
        <Name>GKLab TCP Port Monitoring Discovery Data Source</Name>
        <Description>Data Source used by TCP Port Monitoring Discovery Rule</Description>
      </DisplayString>
    </DisplayStrings>
  </LanguagePack>
</LanguagePacks>
</ManagementPack>

Step 9: Now import the management pack into SCOM and check whether the configuration entries from the CSV are discovered.

Go to Discovered Inventory in SCOM Console and change target to “GKLab TCP Port Monitoring Class” to view the discovered items.

image

Step 10: Now you can develop custom monitors and rules targeting this class.

Thus the entire configuration can be maintained in a CSV file in a shared location. For any new or modified requirement, the CSV file can be updated accordingly. No changes are required on the SCOM side unless additional headers are added and need to be absorbed in SCOM.

I will post details about monitors and rules for TCP Port monitoring in future posts.

 

Happy SCOMing!!!

GKLab.TCP.Port.Monitoring.xml
