Troubleshooting Login Enterprise Login Timer Issues

One of the issues we consistently run into with our Continuous Tests is Login Times. In addition to an inordinate number of Login Failures, we also receive many Login Threshold Exceeded events.

While Login Enterprise does a nice job of capturing Login Times and breaking them down into Connection Requested/Established time, GPO processing time, and User Profile Load time, browsing this information in the console is cumbersome. And it’s almost impossible to see it across multiple Tests/Targets/time frames, as you can only view one test’s data at a time.

To that end, I have written a script that pulls the Login Measurements and creates an Excel spreadsheet. I chose an Excel spreadsheet rather than a CSV file (which I would normally use) because I will generally want to create charts so that I can see trends. Excel is just more useful for any kind of meaningful analysis.

Approach

As usual, I am pulling the data directly from the MS SQL database. Using the API has the same limitation as the GUI in that you can only pull the measurements for a single test (technically a test run, but since I’m using Continuous Tests, these are synonymous; for Load and Application tests, they are different GUIDs). Also, as usual, pulling from the SQL db is 3-5 times faster.

Another thing to consider is that the database table that stores these measurements (LoginEnterprise.Measurements) contains all measurements, including application timers, latency, etc. And there is a single record for each individual measurement, rather than a row with columns for each measurement. So, I had to transform this data to make it useful.
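To make that transformation concrete, here is a minimal sketch of the idea in PowerShell: pivot the tall one-row-per-measurement data into one wide row per login using Group-Object. The $rows variable stands in for the query results shown later; this is illustrative, not the exact code used below.

# Sketch: pivot one-row-per-measurement data into one-row-per-login
# ($rows stands in for the query results described later)
$rows | Group-Object UserSessionId | ForEach-Object {
    $byName = @{}
    foreach ($r in $_.Group) { $byName[$r.MeasurementId] = $r.Duration }
    [pscustomobject]@{
        UserSessionId = $_.Name
        Connection    = $byName['connection']
        GPO           = $byName['group_policies']
        UserProfile   = $byName['user_profile']
        Total         = $byName['total_login_time']
    }
}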

I’ll explain as I show each snippet of code.

The Code

The first step is to create the objects to build an Excel workbook and worksheets. In this snippet, I am creating 6 worksheets, one for each of my Login Enterprise pods. The rest of the code will only be shown for the first pod, but it can easily be copied for each pod you are managing. I like to put the timestamp of when I pulled the data in my filenames.

$ExcelObj = New-Object -ComObject Excel.Application
$ExcelObj.Visible = $true
$ExcelObj.SheetsInNewWorkbook = 6
$workbook = $ExcelObj.Workbooks.Add()

$Pod1wksht = $workbook.Worksheets.Item(1)
$Pod1wksht.Name = "Pod1"
$Pod2wksht = $workbook.Worksheets.Item(2)
$Pod2wksht.Name = "Pod2"
$Pod3wksht = $workbook.Worksheets.Item(3)
$Pod3wksht.Name = "Pod3"
$Pod4wksht = $workbook.Worksheets.Item(4)
$Pod4wksht.Name = "Pod4"
$PodGwksht = $workbook.Worksheets.Item(5)
$PodGwksht.Name = "PodG"
$PodVwksht = $workbook.Worksheets.Item(6)
$PodVwksht.Name = "PodV"

$timestamp = Get-Date -Format o | ForEach-Object { $_ -replace ":", "." }
$ExcelFile = "<folder>\LoginTimers-$timestamp.xlsx"
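As an aside, the six repetitive worksheet assignments above could be collapsed into a loop; a quick sketch using the same pod names (you would still want per-pod variables, or a hashtable of worksheets, for the code that follows):

# The same six worksheets, named in a loop instead of one at a time
$podNames = "Pod1","Pod2","Pod3","Pod4","PodG","PodV"
for ($i = 0; $i -lt $podNames.Count; $i++) {
    $workbook.Worksheets.Item($i + 1).Name = $podNames[$i]
}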

Next, I create the PowerShell objects to connect to the SQL database. I’m using global variables so that I know they persist inside and outside of functions (though I wound up NOT using any functions, so feel free to change these back to normal variables).

The SQL query does a few things that I should point out:

Date/Time Stamp: Since LE stores date and time information in UTC, I convert it to my time zone (US Eastern) and also change the format to MM/dd/yyyy, hh:mm:ss AM.

Note: I was having issues sorting by date when I used just the reformatted Timestamp. So, I created the TimeStamp1 column for the formatted value and kept the original Timestamp as well. I actually wound up sorting in Excel after the worksheet is complete, so I will remove this in the future.

Duration: LE stores the duration of each measurement in milliseconds. That is hard to consume, especially when looking at thousands of measurements, so I divide the result by 1000 to get seconds.

How many records to pull: I used the clause “LoginEnterprise.Measurements.Timestamp >= DATEADD(n,-60,getutcdate())” to pull all records from the past 60 minutes. This can be adjusted to your needs; for instance, to pull all of the records for the past day, change the DATEADD parameters to (d,-1,getutcdate()). (Since the Timestamp column is stored in UTC, stick with getutcdate() rather than getdate().)

$global:BSQLServer = "<SQL server>"
$global:db1 = "<database name>"

$global:queryLoginTimers = "select LoginEnterprise.Measurements.UserSessionId,
    LoginEnterprise.Measurements.Timestamp,
    FORMAT (LoginEnterprise.Measurements.Timestamp AT TIME ZONE 'UTC' AT TIME ZONE 'Eastern Standard Time', 'MM/dd/yyyy, hh:mm:ss tt') as Timestamp1,
    LoginEnterprise.Measurements.MeasurementID,
    LoginEnterprise.Measurements.Duration/1000 as Duration,
    LoginEnterprise.UserSessions.Target,
    LoginEnterprise.UserSessions.LauncherName,
    Management.Accounts.Username
from LoginEnterprise.Measurements
Left JOIN LoginEnterprise.UserSessions
ON LoginEnterprise.Measurements.UserSessionId = LoginEnterprise.UserSessions.UserSessionId
Left JOIN Management.Accounts
ON LoginEnterprise.UserSessions.AccountId = Management.Accounts.[Key]
WHERE LoginEnterprise.Measurements.Timestamp >= DATEADD(n,-60,getutcdate())
AND MeasurementId IN ('connection','group_policies','user_profile','total_login_time')
Order By Timestamp Desc, UserSessionId;"
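If you would rather not hard-code the 60-minute window, the DATEADD clause is easy to parameterize. A sketch (the $lookbackMinutes and $whereWindow names are mine):

# Define the window up front, then embed it when composing the query string
$lookbackMinutes = 1440   # e.g., the past day instead of the past hour
$whereWindow = "LoginEnterprise.Measurements.Timestamp >= DATEADD(n,-$lookbackMinutes,getutcdate())"
# ...and use "WHERE $whereWindow AND MeasurementId IN (...)" in the query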

As written, the query returns one row per measurement: the UserSessionId, the raw and reformatted timestamps, the MeasurementID, the Duration in seconds, and the Target, LauncherName, and Username for the session.

Next, I write out the headers for the worksheet. This should be self-explanatory.

$Pod1wksht.Cells.Item(1,1) = "Timestamp"
$Pod1wksht.Cells.Item(1,2) = "ASP"
$Pod1wksht.Cells.Item(1,3) = "User"
$Pod1wksht.Cells.Item(1,4) = "Launcher"
$Pod1wksht.Cells.Item(1,5) = "Connection"
$Pod1wksht.Cells.Item(1,6) = "GPO"
$Pod1wksht.Cells.Item(1,7) = "User Profile"
$Pod1wksht.Cells.Item(1,8) = "Total"

Now, let’s start collecting the data, transforming it, and populating the worksheet with it.

Since we are going to write the data one row at a time, let’s set a variable to hold the current row number and increment it as we progress. I’m going to start writing on row 3, which leaves a blank row between the header row and the data.

$Row = 3

Now, let’s perform our SQL query and store the data in an array.

$LoginTimers1 = Invoke-Sqlcmd -ServerInstance $global:BSQLServer -Database $global:db1 -Query $global:queryLoginTimers -Username "<SQL user>" -Password "<SQL password>" -Verbose -TrustServerCertificate
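One hedge on the plaintext credentials above: newer versions of the SqlServer module’s Invoke-Sqlcmd also accept a -Credential parameter, so you can prompt for the SQL login (or import it from a secure store) instead of embedding it in the script:

# Prompt for the SQL login rather than hard-coding it (requires a recent SqlServer module)
$sqlCred = Get-Credential
$LoginTimers1 = Invoke-Sqlcmd -ServerInstance $global:BSQLServer -Database $global:db1 -Query $global:queryLoginTimers -Credential $sqlCred -TrustServerCertificate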

Since the measurements are stored individually, the query returns four records for each login. But each set of four is tied together by the UserSessionID, so we can group the values using that as the key.

The first step is to create an array that holds the unique set of UserSessionIDs, sorted in descending order by the original (not the reformatted) Timestamp. Note that the sort has to happen on the full records, before the IDs are extracted:

$Sessions1 = ($LoginTimers1 | Sort-Object Timestamp -Descending).UserSessionId | Select-Object -Unique

So, now we will loop through the $Sessions1 array and, for each one, find the four values for that particular login session. We’ll grab each datum and place it in a variable before writing it to the correct cell. I’m also formatting the cells that hold the timer values as two-decimal numbers. Lastly, we increment $Row for the next iteration of the loop.

ForEach ($Session in $Sessions1) {
    # Grab the four measurement records for this login session
    $Timers = $LoginTimers1 | Where-Object UserSessionID -eq $Session
    $ConnectionTimer = $Timers | Where-Object MeasurementID -eq "connection"
    $GPOTimer = $Timers | Where-Object MeasurementID -eq "group_policies"
    $ProfileTimer = $Timers | Where-Object MeasurementID -eq "user_profile"
    $TotalTimer = $Timers | Where-Object MeasurementID -eq "total_login_time"
    # Session-level fields are the same on all four records, so take the first
    $Timestamp = $Timers[0].Timestamp1
    $ASP = $Timers[0].Target
    $Account = $Timers[0].Username
    $Launcher = $Timers[0].LauncherName
    $Pod1wksht.Cells.Item($Row,1) = $Timestamp
    $Pod1wksht.Cells.Item($Row,2) = $ASP
    $Pod1wksht.Cells.Item($Row,3) = $Account
    $Pod1wksht.Cells.Item($Row,4) = $Launcher
    $Pod1wksht.Cells.Item($Row,5) = $ConnectionTimer.Duration
    $Pod1wksht.Cells.Item($Row,5).NumberFormat = "0.00"
    $Pod1wksht.Cells.Item($Row,6) = $GPOTimer.Duration
    $Pod1wksht.Cells.Item($Row,6).NumberFormat = "0.00"
    $Pod1wksht.Cells.Item($Row,7) = $ProfileTimer.Duration
    $Pod1wksht.Cells.Item($Row,7).NumberFormat = "0.00"
    $Pod1wksht.Cells.Item($Row,8) = $TotalTimer.Duration
    $Pod1wksht.Cells.Item($Row,8).NumberFormat = "0.00"

    $Row++
}
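One performance note: writing cells one at a time through COM gets slow once you are into thousands of rows. If that bites, you can buffer the values in a 2D array inside the loop and write the whole block in a single assignment; a rough sketch (the $buffer name is mine, and the timer columns would still need NumberFormat applied afterward):

# Sketch: buffer the rows, then write the whole block at once
$buffer = New-Object 'object[,]' $Sessions1.Count, 8
# ...fill $buffer[$i,0] through $buffer[$i,7] inside the loop instead of writing cells...
$topLeft     = $Pod1wksht.Cells.Item(3, 1)
$bottomRight = $Pod1wksht.Cells.Item(2 + $Sessions1.Count, 8)
$Pod1wksht.Range($topLeft, $bottomRight).Value2 = $buffer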

I found that, even with all of the sorting I was doing in the SQL statement and while processing the data, the results were still not sorted properly. So, I wound up adding a command to sort the worksheet on the Timestamp column after it was completely written. I also resized the columns to auto-fit the width.

$Pod1Range = $Pod1wksht.UsedRange
$Pod1Range2SortBy = $Pod1Range.Range("A:A")
# Sort on column A, descending (the 2 is xlDescending), with a header row (the final 1 is xlYes)
[void]$Pod1Range.Sort($Pod1Range2SortBy,2,$null,$null,1,$null,1,1)
# Adjust the column widths so all data is properly visible
$Pod1Range.EntireColumn.AutoFit() | Out-Null

In my case, I repeated this for the other 5 pods. If you are only dealing with one environment, you can simply save the Excel file.

$workbook.SaveAs($ExcelFile)
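If you want the script to close Excel when it finishes (rather than leaving the workbook open on screen), the usual COM cleanup pattern applies:

# Quit Excel and release the COM reference so the Excel.exe process doesn't linger
$ExcelObj.Quit()
[System.Runtime.InteropServices.Marshal]::ReleaseComObject($ExcelObj) | Out-Null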

This is just a tool for collating the data. There is a ton of login data in my environment, seeing as we run 1400 continuous tests, some scheduled to run every 10 minutes and some every 15 minutes. So, in my first pod, with 1 hour’s worth of data, I had 2400 records.

One hour of data is not really sufficient to see trends or troubleshoot why or when some logins might be taking longer than normal. But this is a good start to gathering the data. I’ll next be working on some repeatable ways to process this data, most likely with charts, so that we can visually compare by time, datacenter, server pool, user, or launcher. If all goes well, I’ll post an article on my approach.
