Adding All Services to an Existing Office 365 User License

When working with our clients, we often find that they have enabled only some of the services within an Office 365 license.  Some companies, for example, may enable E3 licenses for a subset of users but leave Lync Online disabled.  While it’s very easy to add a service from within the Office 365 Admin Center, that method is not very efficient when a company has to modify several hundred or even thousands of accounts; in those cases it makes sense to leverage Windows PowerShell instead.

By combining the New-MsolLicenseOptions and Set-MsolUserLicense cmdlets, it’s possible to remove and add individual services.  In the following example, the account has been assigned all E3 services except for Office 365 ProPlus (OFFICESUBSCRIPTION) and Lync Online (Plan 2) (MCOSTANDARD).

The company wants to add the Office 365 ProPlus service, but keep the Lync Online service disabled.  Running the following cmdlet will set the disabled service to only “MCOSTANDARD”:

$LicenseOptions = New-MsolLicenseOptions -AccountSkuId "company:ENTERPRISEPACK" -DisabledPlans MCOSTANDARD

Running this next cmdlet will change the license settings:

Get-MsolUser -UserPrincipalName john.doe@company.com | Set-MsolUserLicense -LicenseOptions $LicenseOptions

Since the “OFFICESUBSCRIPTION” service was not explicitly excluded in the “DisabledPlans” parameter, it will now be enabled by default.

Note that the “ProvisioningStatus” for OFFICESUBSCRIPTION changed from “Disabled” to “PendingInput”.  When viewing the license settings in the Admin Center, the service will now show as enabled under the E3 license details.

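If you would rather confirm the per-service status from PowerShell than from the Admin Center, a minimal sketch (reusing the same example account and SKU from above) looks like this:

$license = (Get-MsolUser -UserPrincipalName john.doe@company.com).Licenses |
    Where-Object { $_.AccountSkuId -eq "company:ENTERPRISEPACK" }
# Show each service plan in the E3 license and its current ProvisioningStatus
$license.ServiceStatus |
    Format-Table @{Name="ServicePlan";Expression={$_.ServicePlan.ServiceName}}, ProvisioningStatus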

Now, again consider the scenario where a company has assigned E3 licenses but left the Office 365 ProPlus and Lync Online (Plan 2) services disabled for all E3-licensed users.  The company now wants to enable all services, without excluding any.  In the past, Microsoft support has typically advised that the only way to accomplish this is to remove the license and then reassign it without any “LicenseOptions”, effectively enabling all services.  While this method is perfectly safe, some companies are apprehensive about making this change to a large number of accounts at once, for fear of disconnecting the users’ mailboxes and causing a service outage.

Instead of removing and re-adding the license, it’s possible to accomplish the same task by setting the “DisabledPlans” parameter to “$Null” within the “New-MsolLicenseOptions” cmdlet.  Example:

$LicenseOptions = New-MsolLicenseOptions -AccountSkuId "company:ENTERPRISEPACK" -DisabledPlans $Null
Get-MsolUser -UserPrincipalName john.doe@company.com | Set-MsolUserLicense -LicenseOptions $LicenseOptions

Note that both the OFFICESUBSCRIPTION and the MCOSTANDARD “ProvisioningStatus” have changed to “PendingInput”, and the services will show as enabled under the E3 license details in the Admin Center.

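For the bulk scenario that prompted this post, the same license options can be applied to every E3-licensed user.  The following is only a sketch, assuming the same “company:ENTERPRISEPACK” SKU as above; test it against a handful of accounts before running it tenant-wide:

$LicenseOptions = New-MsolLicenseOptions -AccountSkuId "company:ENTERPRISEPACK" -DisabledPlans $Null

# Find every user holding the E3 SKU and re-apply the license options,
# enabling all service plans without removing the license
Get-MsolUser -All |
    Where-Object { $_.Licenses | Where-Object { $_.AccountSkuId -eq "company:ENTERPRISEPACK" } } |
    ForEach-Object { Set-MsolUserLicense -UserPrincipalName $_.UserPrincipalName -LicenseOptions $LicenseOptions }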

I hope you find this tip useful when managing your Office 365 licenses with Windows PowerShell.

Barry Thompson
Principal Consultant

How do I get to all my applications in Windows RT 8.1 Blue Preview?

With Microsoft releasing the Windows 8.1 (Blue) upgrade for download yesterday evening, and with us always wanting to jump into new technology, our first impressions of the upgrade on our test Windows RT tablet were pretty good. There were some good things, and some difficulties. One of those difficulties was getting to our applications using the familiar ways we learned in Windows RT. The following is from one of our consultants’ experiences. Keep checking back often as we blog about our experiences with the Windows Server 2012 R2 and Windows 8.1 previews!

All my apps are gone!!!

For those of you who have installed the 8.1 Blue preview, you may have found it more difficult to find any of your applications that are not pinned to the start screen.

Windows 8.1 Start Screen

Windows 8.1 All Applications

Previously in Windows RT (and on the Surface Pro), you could just swipe up and then click the icon in the corner to view all of your applications.

Windows RT - All apps icon

However, in the update this has been replaced by an icon for customizing the groups of apps on the Start screen (sorting and naming groups). That is easier now than it was before for those functions, but it didn’t get me to what I wanted, which was to access an application tile that isn’t on my Start screen.

Windows RT 8.1 - Customize icon

All was not lost, however. I could still search for an app (swipe from the right and choose Search from the charms menu) and then open it. But to actually get to an app’s tile and pin it to the Start screen, I found the following two ways:

First, the swipe method:

Once on the Start screen, just swipe up from the middle of the screen to be presented with all of your applications. Swiping up or down then switches between the all-apps view and the Start screen. It makes sense, but it wasn’t as intuitive as I expected and took some trial and error to discover.

Second, the more apps icon:

The second isn’t obvious, but if you notice small things it’s pretty easy to catch. If you swipe your Start screen all the way to the right, you will notice an arrow in the lower-left corner pointing down. Clicking on that will take you to all of your applications, the same as the swipe gesture does.

Windows RT 8.1 Start Screen - More Windows RT 8.1 - More apps icon

Takeaway:

While not immediately intuitive, I think my kids would have found these quickly enough, and after using it a few times I find it to be a much faster way to get to my apps without having them on the Start screen.

 

I hope our consultant’s experience can help some of you who are wondering where all of your applications are in the Windows 8.1 preview. We hope to have more of their experiences in the coming posts to give you some exposure to Microsoft’s newest version of Windows 8.

Jason Condo
Principal Consultant

Importing ConfigMgr 2007 task sequences XML to ConfigMgr 2012 ZIP

With the new 2012 import/export functionality, the new file format is a “.zip” file. This compressed file contains not only the task sequence XML but can also include any dependencies of the task sequence, like a boot image. While this is awesome for migrating between a test and production ConfigMgr 2012 environment, it does not help if you are trying to import task sequences from a disconnected 2007 environment.

In my consulting practice, we do a lot of OSD implementations using a base set of task sequences that we already have pre-configured. Once at a customer, we customize our base templates for the specific project and then export the XML or ZIP to the project documentation. Today I was at a client we had previously done work for; they had already performed a 2012 upgrade and removed their old 2007 environment. However, they had not migrated any of the OSD content and were looking for us to re-implement OSD in their new environment. Instead of importing our canned OSD for 2012 and then customizing it for their needs, we wanted to reuse the customized 2007 task sequences we had implemented for their old environment. The first problem was that the only copy of those was the archived XML from the project files we had left them. The second was that you can’t import that XML through the 2012 console. Not to worry though, we can still make it work.

The 2012 exports are just compressed files containing the resources, some configuration files, and the task sequence XML. The 2012 task sequence XML is not the same as the old 2007 format, but we are able to insert the 2007 XML into the appropriate spot to make it useful. This saved us a lot of time that would otherwise have gone into recreating the old task sequence logic. The following is a quick example of how this works.

Start with a 2012 exported task sequence. This is in .ZIP format.

Export a Configuration Manager 2012 task sequence

task sequence exported to .zip

Once exported, open the ZIP file, navigate to the task sequence folder, and copy out the object.xml file.

open the object.xml file

Open the object.xml file and you will see a lot of new XML; however, scrolling almost to the end of the file, you will find a section with embedded task sequence XML.

look for the embedded task sequence xml

This XML is the same task sequence XML as you have in a normal exported task sequence from 2007; however, you need to be sure to grab only the appropriate XML nodes and not the whole task sequence. To do so, in the old 2007 XML, copy the nodes and data from the sequence XML node:

<sequence version="3.00">
…..
</sequence>
copy the 2007 task sequence xml

and paste it into the object.xml in the CDATA section in the 2012 XML replacing the existing embedded sequence node:

<![CDATA[
….
]]>
paste the xml into the 2012 task sequence

You don’t have to worry about the text/line formatting. Save the file and then copy it back into the .ZIP file. You can then import the ZIP file into your 2012 environment and adjust your referenced objects accordingly. This is great when you have a master task sequence of custom tasks and you would like the ability to copy/paste them into your new 2012 task sequences. One thing to remember is that your old task sequences were built on the package/program model for software installs. If you are leveraging the new application model (which you should be), you will have to recreate those specific tasks anyway.
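If you find yourself doing this often, the manual unzip/edit/rezip steps can be scripted. The following is only a rough sketch under a few assumptions: the file paths are illustrative, it assumes the export contains a single task sequence with one embedded <sequence> element in object.xml, and you should verify the result in a lab before importing.

# Assumed, illustrative paths
Add-Type -AssemblyName System.IO.Compression.FileSystem
$exportZip  = 'C:\Temp\BlankTS.zip'      # blank task sequence exported from 2012
$oldTsXml   = 'C:\Temp\Old2007TS.xml'    # archived 2007 task sequence export
$workFolder = 'C:\Temp\TSWork'
$newZip     = 'C:\Temp\MergedTS.zip'

# Unpack the 2012 export and read the archived 2007 <sequence> node
[System.IO.Compression.ZipFile]::ExtractToDirectory($exportZip, $workFolder)
$oldSequence = ([xml](Get-Content $oldTsXml)).sequence.OuterXml

# Replace the embedded <sequence>...</sequence> inside object.xml
$objectXml = (Get-ChildItem $workFolder -Recurse -Filter object.xml).FullName
$content   = Get-Content $objectXml -Raw
$existing  = [regex]::Match($content, '(?s)<sequence version=.*?</sequence>').Value
if (-not $existing) { throw "No embedded <sequence> element found in object.xml" }
$content   = $content.Replace($existing, $oldSequence)
Set-Content -Path $objectXml -Value $content

# Repack and import the new ZIP through the 2012 console
[System.IO.Compression.ZipFile]::CreateFromDirectory($workFolder, $newZip)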

Jason Condo
Principal Consultant

   

June 26th
Additional Notes:

It seems that some people are having problems importing. While I’m not sure what they are seeing specifically, I found that the best option for me was to create a blank default task sequence (not an MDT task sequence) to use as the export template from 2012. I grabbed the sequence node from the old XML and inserted it into the new, replacing the embedded sequence XML node. I don’t see why you couldn’t grab below the sequence node as well (after <sequence version="3.00">). I think this may address some users’ experiences of ending up with 3.10 as a sequence version. Hope that helps, and keep sharing your experiences.

Leveraging SQL Server Profiler to troubleshoot 18456 Events

Many times I am brought in to assist in troubleshooting strange things that the client can’t easily identify on their own. On this particular occasion I was helping support a SharePoint solution, and SQL Server kept generating the following 18456 event every minute in the event log: “Login failed for user ‘NT AUTHORITY\NETWORK SERVICE’. Reason: Token-based server access validation failed with an infrastructure error. Check for previous errors. [CLIENT: <local machine>]”. The client was not sure why this was occurring and thought it may have been from an outage they had recently.

Event 18456 - Login failed for user
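If you want to confirm how often the failure is actually being logged before digging in, a quick sketch against the Application log (assuming a default MSSQLSERVER instance as the event source) might look like this:

# Pull recent 18456 entries to confirm the once-a-minute cadence
Get-EventLog -LogName Application -Source MSSQLSERVER -Newest 500 |
    Where-Object { $_.EventID -eq 18456 } |
    Select-Object TimeGenerated, Message -First 5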

A quick web search of the event showed people who had problems with applications accessing a database, but none with this specific account. That is because this is a generic message showing that some account is accessing some database from some computer and doesn’t have the appropriate permissions to do so. Some of that information is provided, however it doesn’t tell us why it is happening. So how do we get more information so that we can suggest the correct path to resolve it?

On the surface, my first impression was that a service was trying to access a database within SQL Server running as the Network Service, and was not permitted to access it. I gathered this from the fact the login was listed as ‘NT AUTHORITY\NETWORK SERVICE’ and the client was defined as local machine, CLIENT: <local machine>. Going with my first thoughts, I opened the Services console and sorted by login to determine the services running as Network Service.
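If you prefer to do that check from a prompt instead of the Services console, a quick PowerShell equivalent (a sketch, with nothing environment-specific assumed) is:

# List SQL-related services and the accounts they log on as
Get-WmiObject Win32_Service |
    Where-Object { $_.Name -like '*SQL*' } |
    Sort-Object StartName |
    Format-Table Name, StartName, State, StartMode -AutoSize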

This pointed me to what I was pretty sure was the problem: two SQL-related services were configured to run as Network Service. In addition, the client had all of the other SQL services configured to run with a defined service account, so these two were anomalies in not being configured the same way. While I was confident this was most likely the source of the event, I needed to be sure.

SQL Server Profiler to the rescue!

This is where SQL Server Profiler comes in handy. It is a great tool for giving you insight into your SQL environment and what is happening on a transactional basis. You can use it to trace events occurring in SQL to find stored procedures that are having problems, long-running queries, or any number of other issues that you aren’t sure about and need more visibility into. In this case, we are looking for failed login attempts.

For this troubleshooting session, I knew that the event was logged only once every minute. This meant that if I configured the trace correctly, I would not be scrolling through a lot of event instances looking for my event. As well, I would not need to capture a lot of data, so outputting the profiler to a database or file wasn’t necessary.

Getting Started: Setting up the Trace.

To get started, open the Start Menu and navigate to Microsoft SQL Server 2008 R2 > Performance Tools > SQL Server Profiler (also available from SQL Server Management Studio under the Tools menu). When you first launch SQL Server Profiler, it will prompt you for the trace properties. The first tab (General) defines the initial properties of the trace. The “Use the template” section is of most interest to us in this troubleshooting; it defines the most probable list of events and columns that we want to start with for capturing information in the trace. This matters because the actual amount of information we can choose from is vast and can be overwhelming if this is your first look into tracing or if you are not a seasoned SQL admin. The additional fields for saving the output to a file/database and for the trace stop time are not relevant to our isolated troubleshooting, although they can be handy when you are trying to find an intermittent problem and want to run a trace for a long time or are capturing a lot of events.

SQL Server Profiler trace properties

For this troubleshooting let’s start with the Standard (default) template. Once selected, go to the Events Selection tab. This will show you all the events and columns that are selected to be captured and displayed in the trace.

The default template captures a lot of additional data that is probably not relevant to what we are looking for. We were looking for something associated with logins (remember: “Login failed for user ‘NT AUTHORITY\NETWORK SERVICE’…”). With that, I removed the events that I didn’t think would be required. I also unchecked columns of data that I didn’t think would help me once I found the appropriate event (I don’t care which CPU is being used, or the duration, etc.).

Now I could run this trace as-is, and you can even do so just to see the amount of data being captured and the information in the trace session. However, this will not give me the event I was looking for. This is because my specific event is a failed login. This trace will only show me successful logins and logoffs. So how do we get the data I really want?

Finding Audit Login Failed

First, I selected Show all events to display all the possible events that can be traced. You will see that the Security Audit category already has some events selected.

I wanted to be more specific, however. I unchecked the Audit Login and Audit Logoff events and instead chose Audit Login Failed. This selects the standard columns, but it won’t give us all the information we need. For that, I selected Show all columns.

To troubleshoot I then chose NTUserName, SPID (can’t uncheck that one), ApplicationName, DatabaseName, and Error.

I then clicked Run to start tracing the events. Because this event only triggers once a minute, I only had to wait a short time to see the error captured. As you can see, it was the Report Server (Reporting Services Service) accessing the master database. You can also see that we have the matching 18456 event number.

SQL Server Profiler trace output

With that I had the information needed to take back to the client and ask why this service might have had its access removed (not being defined in SQL security), whether it was misconfigured (changed from a specific login to Network Service, perhaps recently added as a feature but set up incorrectly), or whether there was some other explanation.

In this case, it turned out that an engineer troubleshooting an earlier problem wasn’t aware of the state of the services and had set SQL Reporting Services and SQL Integration Services from Disabled to Automatic and started them in an attempt to resolve a SQL problem they were having. It didn’t solve their problem, but because they didn’t document their troubleshooting (or perform proper analysis like the above), they left those services running in a state that caused additional work to troubleshoot and resolve.

While this is a very specific incident and resolution, I hope that this quick view into SQL Server Profiler gives you an additional tool to properly research errors and resolve your problems. For additional information on the tool, please see this MSDN link: http://msdn.microsoft.com/en-us/library/ms181091.aspx

Jason Condo, MCITP
Principal Consultant, Systems Management and Operations

Enabling Users for Office 365 Licensing Made Easy

When transitioning to the cloud, solid answers on some of Microsoft’s new PowerShell commands can be hard to find, and in some cases they are completely different from what most Exchange administrators are used to.  The Set-MsolUser cmdlet, for instance, is not something you would have seen in Exchange 2007 or 2010.  I’m often asked the easiest method for giving a user a license without manually clicking through the Portal for 1000+ users.  Today I’ll share a simple way to get this done and save time during a migration.

These new PowerShell cmdlets make assigning licenses to users very simple and much faster than using the Online Services portal to manually assign licenses. This post will walk you through how to make a basic PowerShell script that reads a CSV file to activate and assign licenses to users in your target Office 365 environment.

Step 1 – Determine the license types you have.

Before we configure the script we’ll need to know what the license types are in order to include them in our script. 

Connect to the Msol service using Connect-MsolService and enter in valid administrator credentials.

Run the following PowerShell command to determine the license types:

Get-MsolAccountSku | Format-Table AccountSkuId, SkuPartNumber

The output should look something like this:


AccountSkuId           SkuPartNumber
------------           -------------
COMPANY:STANDARDPACK   STANDARDPACK
COMPANY:ENTERPRISEPACK ENTERPRISEPACK

I’ve replaced the company name before the “:” here, but your output should look similar.  For this example’s purposes, let’s call the licenses TEST:STANDARDPACK and TEST:ENTERPRISEPACK.

Step 2 – Configure your CSV input file

Now we can start to create the input file.  We’ll need a bit more information about your users, but generally we’ll set the CSV up with three columns: Username, LicenseType, and Location.  The Username column will be the UPN or online login name of the user (these should match).  The LicenseType will be based upon the output we received from Step 1.  The Location column is the usage location that needs to be set for each user, such as US for United States, IN for India, MX for Mexico, and so on.  It should look something like the following.

username,licensetype,location
user01@test.onmicrosoft.com,TEST:STANDARDPACK,US
user02@test.onmicrosoft.com,TEST:ENTERPRISEPACK,IN
user03@test.onmicrosoft.com,TEST:ENTERPRISEPACK,MX
user04@test.onmicrosoft.com,TEST:STANDARDPACK,US

Save the input file as ‘msol_activate.csv’.  Now we can move on to the script.

Step 3 – Configure the Script

Now that our input file has been set up, we can write the script accordingly.  Let’s open a Notepad file.

We’ll need to define our list and just refer to it as $List and use Import-CSV to read the CSV file:

$List = Import-Csv "msol_activate.csv"

Now we’ll start the loop using foreach and call out $User as each line of $List:

foreach ($User in $List)
{

In the first line of the loop we’ll set the UsageLocation, since it is required before setting the license.  The column names from our CSV file become properties of $User, so the script can reference them directly.

Set-MsolUser -UserPrincipalName $User.Username -UsageLocation $User.Location

Now we’ll add the license type:

Set-MsolUserLicense -UserPrincipalName $User.Username -AddLicenses $User.LicenseType

And lastly we’ll set a default password and force the user to change that password the first time they log in, then close the loop:

Set-MsolUserPassword -UserPrincipalName $User.Username -ForceChangePassword $true -NewPassword "Office365Rules"
}

You could also add a column to your CSV file named Password and have the -NewPassword be $User.Password.
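For example, with that hypothetical Password column in place, the last line of the loop would become:

Set-MsolUserPassword -UserPrincipalName $User.Username -ForceChangePassword $true -NewPassword $User.Password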

Put it all together and the script looks like this:


$List = Import-Csv "msol_activate.csv"
foreach ($User in $List)
{
    Set-MsolUser -UserPrincipalName $User.Username -UsageLocation $User.Location
    Set-MsolUserLicense -UserPrincipalName $User.Username -AddLicenses $User.LicenseType
    Set-MsolUserPassword -UserPrincipalName $User.Username -ForceChangePassword $true -NewPassword "Office365Rules"
}
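As an optional follow-up (a sketch, not part of the script itself), you can spot-check the results afterward by confirming that each user in the CSV is now licensed and has a usage location:

$List = Import-Csv "msol_activate.csv"
foreach ($User in $List)
{
    Get-MsolUser -UserPrincipalName $User.Username |
        Select-Object UserPrincipalName, UsageLocation, IsLicensed
}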

Summary

So now we have successfully determined our license types, taken our user data and put it into an input file, and created a script that easily gives our users a license and a temporary password.  This script will help get your users to the cloud and assign licenses much faster than manually using the portal.

 

Peter Gleek
BA Advanced Infrastructure Principal Consultant

“Previous Versions” and Shadow Copies with Very Long Paths

I was working on a server the other day and needed to recover files from a previous version of a folder through Previous Versions.  (A backup was not available on the particular folder for reasons I won’t get into here.)  I ran into a problem that I couldn’t really find documented very well anywhere and thought I would document it for others.

I first made sure there was a previous version available.

In fact, this was through a shadow copy, which will turn out to be very important.

So, we have the previous state of the folder, and it’s in a local snapshot, so I could get it back, right?  Let’s see what happened.  I clicked Open on the Previous Versions tab, then navigated through Explorer to the folder that has the file I want to get.  It turns out this file is nested a few levels deep in a set of long folder names, and has a long filename as its exposed filename:

\\localhost\C$\Users\Administrator\Desktop\Somewhat Long Folder Name\Another really long folder name for a good reason that you do not know\Yet another long nested folder name believe it or not (Today, January 26, 2012, 12 minutes ago)\Long file name here as well that will be a problem for us soon.txt

Why is the long name a problem?  Well, when I tried to copy the folder out of the shadow copy, I got this 100% correct yet not helpful error:

The source file name(s) are larger than is supported by the file system. Try moving to a location which has a shorter path name, or try renaming to shorter name(s) before attempting this operation.

Questionable grammar aside, the error’s suggestions, which relate to changing the source, are useless, because shadow copies are read-only.  So now what?

Well, the reason the path is too long is that, with the shadow copy overhead added to the path, the full name ends up longer than MAX_PATH, or 260 characters.  I suspect Explorer still cares due to backwards compatibility, which is why the 32K Unicode path limit doesn’t come into play, but that’s just a guess.  Anyway, this still leaves the problem of getting a shorter path.

The answer is to surface or expose the shadow copy at a shorter path, such as a drive letter.  There are multiple ways to go about this.  The first one that I thought of, using the diskshadow command that is new in Windows Server 2008, didn’t work as I expected.  Let’s see what happened, then look at a solution.

First, we find the exact name of the shadow copy.  I listed them to a file (I used diskshadow for consistency, although vssadmin would also let me do that piece), then searched the file in Notepad:

C:\Users\Administrator\Desktop>diskshadow /l shadows.txt
Microsoft DiskShadow version 1.0
Copyright (C) 2007 Microsoft Corporation
On computer:  DEMOSERVER,  1/26/2012 11:20:09 AM

DISKSHADOW> list shadows all

… shadow listing here …

Number of shadow copies listed: 196

DISKSHADOW> exit

C:\Users\Administrator\Desktop>notepad shadows.txt

In this case I wanted the 10:41:53 AM snapshot on January 26, 2012 for the C: drive, which looked like this in the log:

* Shadow copy ID = {1cbf48de-1e49-4ae4-9a24-0c75d3dc4c6d}
        - Shadow copy set: {55c21b6c-b34f-4f0c-88df-e03fc952f39e}
        - Original count of shadow copies = 1
        - Original volume name: \\?\Volume{12cba6d6-7540-11e0-bd41-806e6f6e6963}\ [C:\]
        - Creation time: 1/26/2012 10:41:53 AM
        - Shadow copy device name: \\?\GLOBALROOT\Device\HarddiskVolumeShadowCopy266
        - Originating machine: DEMOSERVER
        - Service machine: DEMOSERVER
        - Not exposed
        - Provider ID: {b5946137-7b9f-4925-af80-51abd60b20d5}
        - Attributes:  No_Auto_Release Persistent Client_accessible No_Writers Differential

Next, I want to map the path to a shorter location.  I thought I could do this through diskshadow, but it turns out there’s a restriction that prevents this:

DISKSHADOW> expose {1cbf48de-1e49-4ae4-9a24-0c75d3dc4c6d} P:
Client accessible shadow copies cannot be exposed.

The GUID in the expose command is the “Shadow copy ID” given in the listing.  Because the shadow copy is accessible to the client (through Previous Versions), I couldn’t directly map it to a drive.  So now what?

Well, the trick was on a Microsoft blog — using a symbolic link to get to the shadow copy:

C:\Users\Administrator\Desktop>mklink /d c:\s \\?\GLOBALROOT\Device\HarddiskVolumeShadowCopy266\
symbolic link created for c:\s \\?\GLOBALROOT\Device\HarddiskVolumeShadowCopy266\

The link is from a very short-named local folder to the “Shadow copy device name” given in the listing.  As explained in the blog post, I didn’t forget to add a trailing slash to the mapping (it won’t work if you don’t do that).  Now, I can look at this linked version in Explorer, and copy the data!

Then, I deleted the symbolic link with rmdir c:\s to clean up, and that was that!
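As a side note, the listing and linking can also be done from PowerShell. This is only a sketch of an alternative to the diskshadow steps above; the HarddiskVolumeShadowCopy number is the one from this example and will differ on your system, and the trailing backslash is still required:

# Enumerate shadow copies with their creation times and device paths via WMI
Get-WmiObject Win32_ShadowCopy |
    Select-Object @{Name='Created';Expression={[Management.ManagementDateTimeConverter]::ToDateTime($_.InstallDate)}}, DeviceObject |
    Sort-Object Created |
    Format-Table -AutoSize

# Once you have spotted the snapshot you want, link it to a short path
cmd /c mklink /d C:\s "\\?\GLOBALROOT\Device\HarddiskVolumeShadowCopy266\"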

I hope this helps should you run into the same error trying to copy from a previous version.

— Michael C. Bazarewsky