Capturing Robocopy Metrics: Scanning the Log File.

[This is Part One of a multi-post series on how my team is combining Robocopy, PowerShell, SQL, and Cold Fusion to track the metrics we generate when we use Robocopy for replicating data within our file server environment.]

Good day All

This year in my workgroup we are focusing on capturing metrics on the tasks that we do, and one thing that we do a great amount of is file replication.  Whether we are replacing a server, and need to move all of the data to another server, or are simply load balancing the available space on a server’s data disks, we use Robocopy for the data replication.  Robocopy has a host of features that make it our default copy tool, but the one that relates to our discussion today is its logging capability.
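To sketch what that looks like in practice, here is a hedged example of a replication run with logging enabled; the paths and log file name are hypothetical placeholders, not our actual template (the template we really use is covered later in this series):

```powershell
# Hypothetical example: mirror a source share to a target, writing a log file.
# /MIR mirrors the tree, /R and /W limit retries, /NP suppresses per-file
# progress percentages (which would bloat the log), and /TEE echoes to the console.
robocopy \\SourceServer\S$\Data \\TargetServer\T$\Data /MIR /R:2 /W:5 /NP /TEE `
    /LOG:"C:\RoboLogs\150208_TargetServer_S_Data.TXT"
```

The summary block that this post parses appears at the end of a log file produced this way.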

As stated above, part of the metrics focus for the year is to capture the amount of data that we are replicating.  The primary method for capturing replication metrics is to parse the Robocopy logs we generate when we perform our daily replications.  Using Regular-Expressions and PowerShell, this blog post will examine a method for extracting the text needed for our metrics.  The script file being discussed in this post is called Scan-RobocopyLogs.ps1; it and the test Robocopy log file we are scanning can be downloaded from my Box folder here.  To get started, here is an image of a Robocopy log file.  In this particular screenshot very little new data was discovered, so the number of files shown as moved is pretty small.

Robocopy_LogFile_Totals

Fig. 1: Robocopy Log File

A few notes on the above image:

  • I have renamed the servers in the text file to keep anybody in my company from getting nervous. So it is not an original Robocopy output file, but the values are still valid.
  • You will see a total of four files marked as either ‘New File’ or ‘Newer’.  On a busy day, or a first-time replication, many thousands of files could potentially be displayed.  This is an incremental replication, and only those files that have been added or changed are actually copied.
  • Robocopy summary information is contained in the red rectangle that I have overlaid onto the image of the log file.  This red bordered area will be the primary focus of the metrics parsing.

In the red rectangle the data is presented in rows called Dirs, Files, Bytes, Times, Speed, and Ended.  You should also notice several columns labelled Total, Copied, Skipped, Mismatch, Failed, and Extras.  Notice the number of Copied-Files is 4, and the number of Skipped-Files is 30916. If any files had not been successfully replicated they would have been listed in the Failed-Files number.  So that is the general description of the log file.  Now on to the parsing.

For each Robocopy log file that is scanned a data object will be created.  The data object will be used to hold the various metrics values that we capture using Regular-Expression matching.  Here is a screenshot of the data object properties:

Metrics Object Properties

Fig. 2: Metrics Object Properties

Notice that we are capturing a wide variety of data from the text in the log file.  As we convert the data into objects it becomes much more useful than plain text: now we can perform mathematical functions, and use the data to feed database tables. So how do we capture and convert this text from a log file into objects?  Regular-Expressions!
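Before diving into the script itself, here is a minimal sketch of the idea; the pattern below is simplified for illustration and is not one of the exact expressions used in Scan-RobocopyLogs.ps1:

```powershell
# One summary line from a Robocopy log (the spacing is illustrative)
$line = "   Files :     30920         4     30916         0         0         0"

# -match tests the string against a RegEx; on success the automatic
# $Matches variable holds the numbered capture groups
if ($line -match '^\s+Files\s:\s+(\d+)\s+(\d+)\s+(\d+)') {
    [int]$totalFiles  = $Matches[1]   # 30920
    [int]$copiedFiles = $Matches[2]   # 4
}
```

Once the text is captured this way, it can be cast to numbers and assigned to the data object properties shown in Fig. 2.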

A lot of information is available on Regular-Expressions (hereafter known as RegEx).  There are many methods to perform the type of RegEx filtering we are doing here.  As I write scripts for our team at work, I try to be verbose in the way I do things to make the script readily understandable for all team members. So, the following can be done in a more succinct manner, but I chose this implementation for the sake of clarity.  If you downloaded the file using the Box link provided above, you can follow along with this general line-by-line discussion.

Lines 1 – 10: The parameter input for the script. Only one parameter is used: -FilePath, which provides the full path to the log file being scanned.

Lines 15 – 42: Set up the local folder values, and set up the error-logging mechanism.

Lines 45 – 51: Set up global variables for use in the script.

Lines 53 – 69: Create an object to hold the values parsed from the text read from the log file.

Line 71: Test to see if the path to the log file is valid.

Lines 73 – 92: A function to clear the contents of the parsed-text data object.

Lines 94 – 110: Set the values in the data object to ‘Logfile Corrupted’. This will be used when a log file has been found to be badly formatted.  This can occur when a log file is being written to simultaneously by multiple Robocopy sessions.

Line 112: Read the contents of the log file, and store them in $roboCopyStr.

Lines 113 – 276: Loop to read through each line ($line) of the value stored in $roboCopyStr.  When the text in $line matches various RegEx statements, it is assigned as a value for a property in the data object.

The following table is a summary of the RegEx matches used in this script.  They are accompanied by an example of the type of text they are looking for, and the variable within the script that is affected. The ‘Regular Expression’ column in Table 1 should be compared to Lines 113 – 276 of the script.

Regular Expression | Matched Text | Affected DataObject Property
'^\s+Started\s:.*' | Started : Sun Feb 08 21:00:01 2015 | $dataObject.Start
'^\s+Source\s:.*\\\\.*\\[A-Z]\$.*' | SourceServer\S$ | $sourceServer / $sourceDrive
'^\s+Dest\s:.*\\\\.*\\[A-Z]\$.*' | TargetServer\T$ | $targetServer / $targetDrive
'^\s+Ended\s:.*' | Ended : Sun Feb 08 21:05:53 2015 | $dataObject.End
'^\s+Source\s:.*' | Source : \\SourceServer\S$\… | $dataObject.Source
'^\s+Dest\s:.*' | Dest : \\TargetServer\T$\… | $dataObject.Destination
'^\s*Replicating_SysAdmin' | Replicating_SysAdmin: PP1071 | $replicatingSysAdmin
'^\s+Dirs\s:\s*' | Dirs : 3217 1 3216 … | $dataObject.TotalDirs = $dirs[0]; $dataObject.CopiedDirs = $dirs[1]; $dataObject.FailedDirs = $dirs[4]
'^\s+Files\s:\s[^*]' | Files : 30920 4 30916 … | $dataObject.TotalFiles = $files[0]; $dataObject.CopiedFiles = $files[1]; $dataObject.FailedFiles = $files[4]
'^\s+Bytes\s:\s*' | Bytes : 1.607 g 0 … | $dataObject.TotalMBytes = $tempByteArray[0]; $dataObject.CopiedMBytes = $tempByteArray[1]; $dataObject.FailedMBytes = $tempByteArray[4]
'^\s+Speed\s:.*min.$' | Speed : … Bytes/min. | $dataObject.Speed

Table 1: RegExs and Their Effect in This Script.
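To make the Dirs/Files/Bytes rows in Table 1 concrete, here is one hedged sketch of how such a row can be split into the indexed variables shown above; the actual code in Scan-RobocopyLogs.ps1 may differ in detail:

```powershell
$line = "    Dirs :      3217         1      3216         0         0         0"

# Strip the "Dirs :" label, then split the six counters on runs of whitespace;
# the column order is Total, Copied, Skipped, Mismatch, Failed, Extras
$dirs = ($line -replace '^\s+Dirs\s:\s+', '') -split '\s+'

$dataObject = [PSCustomObject]@{
    TotalDirs  = [int]$dirs[0]   # 3217
    CopiedDirs = [int]$dirs[1]   # 1
    FailedDirs = [int]$dirs[4]   # 0
}
```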

Line 278: When each line of $roboCopyStr has been read, the loop is exited. Each parsed line of the read text that matched one of the RegEx expressions has been stored in the data object $dataColl, or in one of the other variables: $sourceServer, $sourceDrive, $targetServer, $targetDrive, and $replicatingSysAdmin.

Here is a sample run of the script:

.\scan-RobocopyLogs.ps1 -filePath .\150208_TargetServer_S_MSO_R.TXT 

Scan-RobocopyLogs.ps1 Run_Example

Fig. 3: Scan-RobocopyLogs.ps1 Run_Example

Looking at the output of Fig 3, you can see that when we run the script it provides several valuable pieces of information that were parsed from the log file. The output from this script is intended to be used as the return data from a call in another script.

That is all for now. Up next will be:

“Capturing Robocopy Metrics: Setting Up the Replication Log”

Thank you for reading.

Have a great day.

Patrick

Capturing Robocopy Metrics: Overview

Good day All

Today I am feeling ambitious, and I would like to launch a multi-post series on some of the work we are doing this year.  Part of our requirements for the year is to capture the metrics that we create for the daily tasks that we do.  My first effort toward that end has been to create a process for tracking the number and sizes of files that we move as part of our daily load-balancing or server-migration replications.

I have had good success with this endeavor, and wanted to share the results with the loyal followers of TheScriptLad.com. I am presenting my results in the following sequence:

Capturing Robocopy Metrics: Scanning the Log File. Using regular expressions to parse a Robocopy log file; doing this to extract the amount of data moved, and to count any errors to indicate any replication failures.

Capturing Robocopy Metrics: Setting Up the Replication Log. This will be a look into the Robocopy template that we use. This is something that we have tweaked and nudged over the years, and I think we have a great replication process in place.

Capturing Robocopy Metrics: Uploading the Results to a Database. Take the results from the Robocopy log files, and add them into an SQL database.

Capturing Robocopy Metrics: Displaying the Results with Cold Fusion.  The final output. Display the Robocopy replication metrics in a Cold Fusion page. This will work with a standard HTML page, but we are applying the results to a Cold Fusion server.

I’m looking forward to sharing my results with you, and welcome any comments and suggestions you would like to present.

Have a great day,

Patrick

Continual Ping of a Computer with an Alert When Successful.

Good day.
Yesterday was a Friday, and it had been a full, but good week of server fun. I was driving home from dinner, and the last thing I wanted to do was stare at my computer screen. Then the on-call page arrived. A server in California was down because of a power outage at the server’s location. While I could do nothing about the power outage, I was responsible for letting our outage management team know when the server was back online, and our server’s shared resources were available.

Well, I started the classic DOS command “PING SERVERNAME -T”, and then watched as the time-out notices started to scroll up the screen. The power outage was expected to last anywhere from 30 minutes to several hours, and I really didn’t want to watch the computer while the pings kept scrolling on the screen. What I needed was for the ping to continue, and then stop and notify me with an audible alert when the pings started coming back successfully, and the server was back online. This would allow me to enjoy my Friday, and keep an ear tuned to the alert from my PC in the next room.

Some quick Google research got me what I needed, and I was able to create a quick and simple function to make that happen. I started the function, and was able to walk away. Two hours later, I was alerted, and was able to notify our outage management team of the server status, and the fact that the resources were available.

It is now Saturday morning, and I liked the function so much I have polished it up, and enhanced it just a bit. The function name is:
Test-ConnectWithAlert
I have added headers to the function, so to see the syntax of the function type the following:

get-help Test-ConnectWithAlert  -Examples

pp1071_1377176449036

From the screen shot you can see that there are two parameters: -ComputerName and -Voice

  • -ComputerName: The name of the computer you want to “ping”. The default is LOCALHOST.
  • -Voice: If used, this will give a Text-to-Voice alert instead of an audio tone.  I have a Cepstral voice installed, so the voice is more pleasant than the default Microsoft voice.

For both examples that I am showing in the following screenshots I am using my LOCALHOST, so there won’t actually be a time-out delay.

pp1071_1377176449127

First Example: Test-ConnectWithAlert -ComputerName LOCALHOST

pp1071_1377176449141

Second Example: Test-ConnectWithAlert -ComputerName LOCALHOST -voice

Q: So what is actually happening here?  There are three areas of interest.  The first one occurs at line 21.  This Do loop repeats until the Test-Connection cmdlet comes back with a TRUE value.  That occurs when the pinged computer is once again pingable.  Once that condition is met the script will continue on, and notify the user of the computer’s status.

pp1071_1377533498931

DO Loop: Repeats until the target computer is found to be online.
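A minimal sketch of such a loop, assuming a hypothetical function name (the real Test-ConnectWithAlert adds the notification logic, and I have added an optional retry cap here that the original does not have):

```powershell
function Wait-ForHost {
    param(
        [string]$ComputerName = 'LOCALHOST',
        [int]$MaxAttempts = [int]::MaxValue,   # the original loops until success
        [int]$DelaySeconds = 5
    )
    $attempts = 0
    do {
        # -Quiet makes Test-Connection return $true/$false instead of ping objects
        $online = Test-Connection -ComputerName $ComputerName -Count 1 -Quiet
        $attempts++
        if (-not $online -and $attempts -lt $MaxAttempts) {
            Start-Sleep -Seconds $DelaySeconds   # pause before the next ping
        }
    } until ($online -or $attempts -ge $MaxAttempts)
    $online
}
```

Calling Wait-ForHost -ComputerName SERVERNAME then blocks until the host answers a ping, at which point the notification code can run.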

Also of interest are the two responses to the two methods of notification.  The first method uses a standard WAV file.  I have used a chime, but any valid file can be used, including one that you create.  I found the method to play the WAV file here: playing-sounds-in-powershell.html. It is an old post, but a good one, and it details several methods for playing WAV files.

pp1071_1377533498951

WAV Notification: Repeats until the User Responds.

The second notification method uses Text-to-Voice to read a predefined text statement, and convert it to voice. There are some standard Windows voices that you can use, but I have purchased a voice from Cepstral (http://www.cepstral.com/).  I use a lot of voice notifications for script completions, so to me it was worth the slight expense of the voice. I go into more detail on text-to-voice in other posts, but in the following screen shot you can see generally what is going on.

pp1071_1377533498981

Text-to-Voice Notification Do Loop: Repeats until the User Responds.
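For reference, both notification styles can be sketched roughly as follows; these are Windows-only calls, and the WAV path and alert text are placeholders rather than the values used in Test-ConnectWithAlert:

```powershell
# WAV alert: PlaySync blocks until the sound has finished playing
$player = New-Object System.Media.SoundPlayer 'C:\Windows\Media\chimes.wav'
$player.PlaySync()

# Text-to-Voice alert via the Windows SAPI COM object; Cepstral voices
# plug into the same interface once installed
$speaker = New-Object -ComObject SAPI.SpVoice
[void]$speaker.Speak('The server is back online.')
```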

The PS1 file containing the function can be downloaded here: Test-ConnectionWithAlert.PS1, or you can get it from my Box.Net widget at the bottom of this page. Look for the folder called “Test-ConnectionWithAlert”.  I have commented the various parts of the script with what I think is helpful information, and links to other resources.

Thanks for reading. Let me know if you have any questions or comments.

Have a good day,

Patrick

Find Co-Workers of an Account Listed in Active Directory

When managing user accounts in MS Active Directory, one of the frequent requests we get is to relocate the Home Directory for an employee.  There could be a number of different reasons for this, but some common causes for the request are:

  • The relocation of an employee to a new city or work location.
  • A Home Directory has grown too large and needs to be relocated for diskspace load balancing.
  • The Home Directory folder is not in a location that provides fast enough access speed across the WAN.

When we are relocating a Home Directory, one of the common things we need to know is: “What server provides the best access speed for the new location of the requesting employee?”  One easy way to answer this question is by determining what server the employee’s new or current Co-Workers are using.  In this case, Co-Workers means those employees that report to the same manager.

What is a good way to find the homedirectory of an employee and his peers if you are not familiar with their workgroup?

The following PowerShell function provides a solution to that question.

<#
.SYNOPSIS
This script is used to get detailed information on a user account found in Active Directory.
.DESCRIPTION
This script is used to get detailed information on a user account found in Active Directory.
The result can be sent to the output screen and the Out-GridView cmdlet.
Get-QADUser is required; part of the Quest AD management software.
Found here: http://www.quest.com/powershell/activeroles-server.aspx

.PARAMETER <paramName>
UserID – The account that will searched in the domain detailed by -Domain.

.EXAMPLE
Get-UserInfo AA9999
This will search the default domain for an account using the standard ATT account type. The default domain is ITServices.
.EXAMPLE
Get-UserInfo “LastName, FirstName MI”
This will search the default domain for an account using the Full Display Name. The default domain is ITServices.

#>

#This function will provide a list of employees that work with an employee. They all have the same manager.
#Name, displayname, and homedirectory are returned.

function get-coworkers([PSObject] $UserID) {
    if ((Get-QADUser $UserID | Select-Object Manager) -ne $null) {
        Get-QADUser -Manager (Get-QADUser -Identity $UserID | Select-Object Manager).Manager |
            Select-Object Name, DisplayName, HomeDirectory |
            Sort-Object HomeDirectory |
            Format-Table -AutoSize
    }
    else {
        Write-Host "$UserID was not valid, or no manager was listed for the account." -ForegroundColor Red
    }
}

get-coworkers -UserID $Args[0]
#End
#End

Here is a sample of the function being run using my ATTUID as the input:

Sample of the Get-Coworker function being run using my ATTUID as the input (Some Output Obscured)

Another simple way to run it is by applying the “Full Display Name” to the function input:


Sample of the Get-Coworker function being run using my Full Display Name as the input (Some Output Obscured)

If you apply an invalid account to the function, or the account does not have a manager applied to it, you will see the following error:

Sample of the Get-Coworker function being called on an invalid AD account (Some Output Obscured)

To download the full text of this script please go here:  Get-CoWorkers.PS1

Find Users Actively Connected to a Share

The work I do for AT&T deals extensively with performing data migrations; moving user and group data from one server to another.  To make the data transition easier for the active share users, I send emails to them indicating when a share is going to move, and what its new location will be.

Q:How do I capture that active user information?

A:  I use a WMI query using the class Win32_ServerConnection.

With Powershell you can easily query a remote server to find out what accounts are connected to all shares, or a specific share.

One of the very nice things about Powershell is that you can create a “one-liner” to grab the information quickly.  This would be used as a quick reference.  Here is an example of a one-liner to find all of the employees connected to server ServerBravo1:

Get-WmiObject Win32_ServerConnection -ComputerName ServerBravo1 | select username, sharename, computername | sort sharename | Format-Table -AutoSize

Here is the break down of that command:

Get-WmiObject Win32_ServerConnection: Performs the WMI query using the Get-WMIObject cmdlet.

-ComputerName ServerBravo1: Runs the query on the remote server ServerBravo1.  If the -ComputerName parameter is excluded then the command is run on the local computer.

select username, sharename, computername:  This determines which properties are returned from the query.  I find these to be the most useful properties, but there are a lot more that can be returned.

Here is a list of the properties that could be useful:

Name            MemberType
----            ----------
ActiveTime      Property
Caption         Property
ComputerName    Property
ConnectionID    Property
Description     Property
InstallDate     Property
Name            Property
NumberOfFiles   Property
NumberOfUsers   Property
ShareName       Property
Status          Property
UserName        Property

sort sharename: This sorts the results based on the value of the ShareName property.

Format-Table -AutoSize: This formats the output in columns.  The -AutoSize option places the columns in a nice compact presentation.  Other output options include Format-List, and my personal favorite, Out-GridView.

The one-liner is nice, but you have to type the full text each time.  Since I use this command so much, I preferred to make a function where I can type the function name followed by a server name.  The required typing is a lot less for each use, and you don’t really need to remember the specific property names.

Here is how that function would look:

Function to Find Active Share Users on a Server

function get-ShareUsers
{
<#
.SYNOPSIS
Determine which shares are actively being used by employees.

.DESCRIPTION
This provides a live-time view of shares currently being accessed by employees. The output can be to the Powershell screen, the out-gridview window, a CSV file, or all of the above.

.PARAMETER ServerName
Used to determine the server to scan.

.PARAMETER Gridview
Enables the output to the gridview.

.PARAMETER Export
Enables the output to a CSV file using the export-csv cmdlet.

.EXAMPLE
get-ShareUsers S47715C014001

This command scans a server called S47715C014001 for active share users. The result is sent to the Powershell screen.

.EXAMPLE
get-ShareUsers S47715C014001 -Gridview

This command scans a server called S47715C014001 for active share users. The result is sent to the Powershell screen, and to the out-gridview window.

.EXAMPLE
get-ShareUsers S47715C014001 -Gridview -Export

This command scans a server called S47715C014001 for active share users. The result is sent to the Powershell screen, to the out-gridview window, and to a CSV file called S47715C014001_Share_Users.csv.
#>
[CmdletBinding()]
Param
(
    #First parameter
    [parameter(Mandatory=$true,   #Makes this a required parameter. The user will be prompted for this item if it is not provided.
    ValueFromPipeline=$true)]     #Allows the server name to be "piped" into the function.
    [String[]] $ServerName,       #The name against which to run the query.

    #Second parameter - Sends the output to the out-gridview display.
    [switch] $Gridview,

    #Third parameter - Sends the output to a CSV file for later use.
    [switch] $Export
)

#Default output to the Powershell interface.
Get-WmiObject Win32_ServerConnection -ComputerName $ServerName | select username, sharename, computername | sort sharename | Format-Table -AutoSize

if ($Gridview -eq $true) #Use this switch if you want to output to the Out-Gridview window.
{
    Get-WmiObject Win32_ServerConnection -ComputerName $ServerName | select username, sharename, computername | sort sharename | Out-GridView -Title "$ServerName Share Users"
}

if ($Export -eq $true) #Use this switch if you want to output to a CSV file.
{
    [string]$filename = $ServerName + "_Share_Users.csv"
    Get-WmiObject Win32_ServerConnection -ComputerName $ServerName | select username, sharename, computername | sort sharename | Export-Csv -Path $filename -NoTypeInformation
}
}

A few final comments:

  • To make this function available all of the time when you are using PowerShell, paste the function into your PowerShell profile document.  When you do that it will load each time you start PowerShell.
  • Once it is loaded into your PowerShell session, you can find help on this function by typing the following in the PowerShell command line window:

help get-shareusers -Full

This will give examples of how to use the function, and also give detailed information on each of the parameters.

  • Finally, to make it easier to use this function, I have uploaded the text of the script here at my Google page:

Get-ShareUsers.ps1

I hope this is a helpful utility for you.

Please let me know if you have any questions about this, or any of my other posts.

Have a good day.

Patrick

Get-DriveSpace on One or Multiple Computers

One common administrative task is to find the available disk space on a server.  The standard methods to do this are connecting to the computer remotely to verify disk space, or using VBScript to gather the data.  But there is a better way…

With Powershell’s extensive use of WMI and the “piping” options, a very useful one-liner can be used to retrieve the disk space for one or more servers.

Here is the WMI class that we will be using: Win32_LogicalDisk.

To find the disk space available for one computer you would use this one-liner:

get-wmiobject Win32_LogicalDisk -computername Server001 | select __server, Name, Description, FileSystem, @{Label="Size";Expression={"{0:n0} MB" -f ($_.Size/1mb)}}, @{Label="Free Space";Expression={"{0:n0} MB" -f ($_.FreeSpace/1mb)}} | out-gridview -Title "Disk Space Scan Results"

Lets take a look at the different sections of that one-liner.

get-wmiobject Win32_LogicalDisk -computername Server001 |

This is the standard cmdlet for accessing WMI in Powershell.  The number of available WMI classes is quite extensive.  In the above example, -computername would be followed by a valid computer name. The output of that is piped into the Select-Object cmdlet.

select __server, Name, Description, FileSystem, @{Label="Size";Expression={"{0:n0} MB" -f ($_.Size/1mb)}}, @{Label="Free Space";Expression={"{0:n0} MB" -f ($_.FreeSpace/1mb)}} |

Select-Object, in this one-liner aliased as select, allows us to choose which properties of the returned query we want to use.  By default the Win32_LogicalDisk class provides us with the following properties:

DeviceID     : C:

DriveType    : 3

ProviderName :

FreeSpace    : 10997723136

Size         : 21478666240

VolumeName   : C_Drive

We do two things differently with our one-liner. First we grab an extra property that is always available, but not always needed.  We are requesting the “__server” property.  This will be useful when we put everything in columns later on, and really useful when we are asking for the disk space on multiple servers.

The second thing we are doing differently is to apply formatting to the “Size” property and the “Free Space” property. If you notice above, the output is in bytes.  We are used to thinking of hard drives in MB or GB, not bytes.  That is way too weird.  So we are going to create two calculated properties: one for “freespace” and one for “size”.

The calculated property is much simpler than it might first appear. To specify a calculated property we need to create a hash table; that’s what the @{} syntax does for us.  Inside the curly braces we specify the two elements of our hash table: the property Label (in this case Size or Free Space) and the property Expression (that is, the script block we’re going to use to calculate the property value).  The Label property is easy enough to specify; we simply assign a string value to the Label, like so:

Label=”Size” and Label=”Free Space”

And, believe it or not, the Expression property (which is separated from the Label by a semicolon) isn’t much harder to configure; the only difference is that Expression gets assigned a script block rather than a string value:

Expression={"{0:n0} MB" -f ($_.Size/1mb)} and Expression={"{0:n0} MB" -f ($_.FreeSpace/1mb)} are the expressions we are using.  So what actually is going on here?

Using .NET to Format Numbers in Windows PowerShell

Powershell doesn’t have any built-in functions or cmdlets for formatting numbers. But that’s OK; we don’t need any built-in functions or cmdlets. Instead, we can use the .NET Framework formatting methods.

The heart-and-soul of our command is this little construction: “{0:N0}”. That’s a crazy-looking bit of code to be sure, but it’s also a bit of code that can easily be dissected:

The initial 0 (that’s a zero, the one that comes before the colon) represents the index number of the item to be formatted. For the time being, leave that at 0 and everything should work out just fine.

The N represents the type of format to be applied; in this case, the N is short for Numeric. Are there other types of formats we can apply? Yes there are, and we’ll show you a few of those in just a moment.

The second 0 (the one after the N) is known as the “precision specifier,” and, with the Numeric format, indicates the number of decimal places to be displayed. In this case we don’t want any decimal places, so we set this parameter to 0. Suppose we wanted to display three decimal places? No problem; this command takes care of that: “{0:N3}” -f $a (where $a holds the value to format). Run that command and you’ll end up with output that looks like this: 19,385,790,464.000.

That’s about all we have to do; after specifying the format type we tack on the -f (format) operator, then follow that by indicating the value we want to format: $_.FreeSpace or $_.Size.

In our one-liner we are dividing the number variables $_.Size and $_.FreeSpace by the Powershell constant 1mb. We could use 1kb or 1gb as well. It depends on what output you want to see.
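Putting the pieces together, and assuming a byte count like the FreeSpace value shown earlier:

```powershell
$freeSpace = 10997723136                 # bytes, as Win32_LogicalDisk reports it

"{0:N0} MB" -f ($freeSpace / 1mb)        # 10,488 MB
"{0:N2} GB" -f ($freeSpace / 1gb)        # 10.24 GB
```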

For a more extensive discussion on .net formatting, please go here:

http://technet.microsoft.com/en-us/library/ee692795.aspx

http://msdn.microsoft.com/en-us/library/dwhawy9k.aspx

This is the source I used for much of this information.

The final section of the one-liner outputs the results to the Out-Gridview window.  I have modified the “title” property so that it shows what we are attempting to do with the query.

| out-gridview -Title "Disk Space Scan Results"

The above information details how to find the disk space for one server.  Because the -computername parameter of the Get-WmiObject cmdlet will accept either an array or a single string, we can check multiple servers with the same one-liner.  This can be done in two ways.

In the first method, the computer names can be typed directly into the one-liner:

get-wmiobject Win32_LogicalDisk -computername server1, server2, server3

In the second method, the computer names can be read from a text file using the Get-Content cmdlet:

get-wmiobject Win32_LogicalDisk -computername (Get-Content c:\temp\computers.txt) |

select __server, Name, Description, FileSystem, @{Label="Size";Expression={"{0:n0} MB" -f ($_.Size/1mb)}}, @{Label="Free Space";Expression={"{0:n0} MB" -f ($_.FreeSpace/1mb)}} |

out-gridview -Title "Disk Space Scan Results"

I prefer the second method because you can gather the disk information on many servers very quickly.

Please let me know if you have any questions about this procedure.  I would be glad to explain anything, or answer any questions.

Thanks,

Patrick

Test Multiple Network Locations with Test-Path

Frequently we relocate a large number of employee home folders in bulk and need to verify that the move was successful, or we want to test the validity of a large number of network shared folders. This utility does that using the Powershell Test-Path cmdlet.

If you want a basic understanding of how the Test-Path cmdlet works, type this in your Powershell console window:

get-help test-path -Full

Here is a general description of what this script utility, Test-Paths.ps1 does:

.SYNOPSIS

This script will test the validity of paths that are contained in the paths.ini file. Output is generated to a CSV file, and to Out-GridView.

.DESCRIPTION

The targets of the test-path command are pulled from the “paths.ini” file that is collocated with the test-paths.ps1 file.

Each target is tested using the Powershell test-path cmdlet. Results are stored along with the path name in two output methods.

out-gridview and filename.csv

.PARAMETERS

-nogridview: Prevents the script from generating the Out-GridView window.

-noexport: Prevents the script from generating the exported CSV file.

-outfile filename.csv: Use an alternative name for the output file. CSV extension is best. The default if this switch is not added is testpathresult.csv.

.EXAMPLES

This will give two outputs. A file named testpathresult.csv and the out-gridview window:

.\test-paths.ps1

This example will give no out-gridview window, but will save a CSV file named patricks.csv:

.\test-paths.ps1 -nogridview -outfile patricks.csv

This example will give only the out-gridview window:

.\test-paths.ps1 -noexport

Here are a couple of examples with the script in action. In this first one I will get all failed status results for the test-path commands, but that is because I am using simulated directory paths.

Here are the contents of the paths.ini file which is collocated with the script:

Figure 1: Contents of Paths.ini File

Here are a few screen shots of the utility being run, and some of the selected output screen shots.

.\test-paths.ps1

Figure 2: Command Window Output

You can see that the Powershell command window echoes the target currently being tested.

Since the above example did not use either the of the two exclusion switches, both out-gridview and a CSV file were generated. Here are images of both types of output:

Figure 3: Out-gridview

Figure 4: Testpathresult.CSV File

Notice there are two columns in Figure 3: Accessible and HomeDirPath. In each row the Accessible value shows False because the path was not found.

Here is another example, but this one excludes the export to the CSV file.

.\test-paths.ps1 -noexport

I added “c:\windows” to the paths.ini file to show that Test-Path can actually find a valid path. With this one we still see the Out-GridView window, but no CSV file is generated. Notice that we now have True in the Accessible column.

Figure 5: True Path Now Found

And finally, the last example, where an alternate output file name is generated using the -outfile parameter:

.\test-paths.ps1 -outfile february28th.csv -nogridview

With this one no out-gridview window is generated, but the output file is unique and will not be overwritten the next time the utility is run.

Figure 6: Alternate Output File Naming

In summary, this utility provides an easy way to test anywhere from a few to thousands of network paths.

It is run in a Windows PowerShell environment. The target paths are listed in the paths.ini text file, and the command is run as detailed above.

Let me know if you have any questions about this.

Thanks

Patrick Parkison

Below is the code used in the test-paths.ps1 script.

###################################################################################

<#

.NOTES

Author: Patrick Parkison

pp1071@att.com

.SYNOPSIS

This script will test the validity of paths that are contained in the paths.ini file. Output is generated to a CSV file, and to Out-GridView.

.DESCRIPTION

The targets of the test-path command are pulled from the “paths.ini” file that is co-located with the test-paths.ps1 file.

Each target is tested using the PowerShell Test-Path cmdlet. Results are stored along with the path name in two output methods:

Out-GridView and filename.csv

.PARAMETER nogridview

Prevents the script from generating the Out-GridView window.

.PARAMETER noexport

Prevents the script from generating the exported CSV file.

.PARAMETER outfile

Use an alternative name for the output file. A .csv extension works best. The default if this parameter is not supplied is testpathresult.csv

.EXAMPLE

.\test-paths.ps1

Gives two outputs: a file named testpathresult.csv and the out-gridview window.

.EXAMPLE

.\test-paths.ps1 -nogridview -outfile patricks.csv

Gives no out-gridview window, but saves a CSV file named patricks.csv.

.EXAMPLE

.\test-paths.ps1 -noexport

Gives only the out-gridview window.

#>

param([switch] $noGridview, [switch] $noExport, [string]$outfile = "testpathresult.csv")

#Change the title bar of the script window. This is helpful for long running scripts.

$Host.UI.RawUI.WindowTitle = "Running test-path utility."

#Create an empty array to collect one result object per tested path.

$dataColl = @()

#Get the location of the script. This is used to locate the test targets and to save output to the same folder.

function Get-ScriptPath

{

Split-Path $myInvocation.ScriptName

}

#ScriptPath will be used to place the output file.

$scriptPath = get-scriptpath

#Paths.ini is a text file containing a list of targets e.g. \\servername\sharename

$sourcefile = $scriptPath + "\paths.ini"

<#

This is the output CSV file. It is overwritten each time the script is run.

If a historical record is desired, a date can be appended to the file name. See this reference on how to do that: https://thescriptlad.com/?s=date

#>

$outputfile = $scriptPath + "\" + $outfile

foreach ($path in (gc $sourcefile)){

$dataObject = New-Object PSObject

Write-Host "Scanning: $path"

Add-Member -inputObject $dataObject -memberType NoteProperty -name “Accessible” -value (Test-Path $path )

Add-Member -inputObject $dataObject -memberType NoteProperty -name “HomeDirPath” -value $path

$dataColl += $dataObject

}

#This section is used to generate the out-gridview display.

if (!$noGridview)

{

$label = "Test-Path Results. Total Responses: " + $dataColl.count

$dataColl | Out-GridView -Title $label

}

#Output to the CSV file for use in Excel.

if (!$noExport)

{

$dataColl | Export-Csv -noTypeInformation -path $outputfile

}

#Restore the default command window title bar.

$Host.UI.RawUI.WindowTitle = $(get-location)
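As mentioned in the comment block above the $outputfile assignment, appending a date to the file name preserves a historical record of each run. A minimal sketch of that variant (the naming pattern here is my own choice, not part of the script as posted):

```powershell
#Hypothetical date-stamped variant of the $outputfile line, so each run
#writes a new CSV instead of overwriting the previous one.
$stamp = Get-Date -Format 'yyyy-MM-dd'
$outputfile = Join-Path $scriptPath "testpathresult_$stamp.csv"
```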

Voice Spoken Weather Report

Here is a PowerShell script that is fun, and useful if you like to be able to get a spoken weather report quickly.  It uses the SAPI COM object in Windows to convert text to speech. If you have more than one TTS engine on your PC, you can switch between them in the Windows Control Panel.

To start with you will need to create a function that will be used to convert text to voice.

function say

{

$Voice = new-object -com SAPI.SpVoice #Make a voice object using the com object.

$Voice.Speak( $Args[0], 0 )|out-null

}

The second part of this script comes from /\/\o\/\/. You’ll find a detailed post on how to connect to a web-service to capture weather information.  That is located here:

http://thepowershellguy.com/blogs/posh/archive/2009/05/15/powershell-v2-get-weather-function-using-a-web-service.aspx

What I’ve done is to put his work into a function that allows you to select a country or city based on command line parameters, and then speak the results over your computer speakers. If you select the -help parameter and use the -country countryname parameter, all of the cities for your country will be selected.

Here is that function:

Function Get-Weather ([switch]$help, $city, $country, $filter = '')

{

$weather = New-WebServiceProxy -uri 'http://www.webservicex.com/globalweather.asmx?WSDL'

if ($help)

{

write-host "Starting help."

$xml = [xml]$weather.GetCitiesByCountry($Country)

$xml.NewDataSet.table | sort city | Out-GridView

}
else

{

([xml]$weather.GetWeather($City,$country)).CurrentWeather

}

}

Now that the functions are out of the way, here is the “main” portion of the script.
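Before the main logic can receive -help, -full, -city, and -country, the script needs a param block as the very first statement in get-weather.ps1, above the two functions. Here is a sketch of what that block would look like; the Memphis and United States defaults are assumptions inferred from the examples later in this post:

```powershell
#Assumed param block; must be the first statement in get-weather.ps1,
#above the say and Get-Weather functions. Defaults are inferred from
#the examples below.
param(
    [switch]$help,
    [switch]$full,
    [string]$city = 'Memphis',
    [string]$country = 'United States'
)
```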

#Main
#Determine if help switch is active.

switch ($help)

{

{$_ -eq $true }

{

get-weather -help -country $country

break

}

default

{

if ($full) #This is the full text output. No audio.

{

Get-Weather  -country $country -city $city

}

else #Audio output, streamlined for quicker information.

{

$currentWeather = get-weather -city $city

Write-Host $currentWeather

$currentTemperature = $currentWeather.temperature

$currentTemperature = $currentTemperature.split()

$currentSkyConditions = $currentWeather.SkyConditions

$wrsentence = "Here is the weather information you needed. In " + $city + " the temperature is " + [int]$currentTemperature[1] + " degrees Fahrenheit, and it is " + $currentSkyConditions

say $wrsentence

}

}

}

I’ve used the default city of Memphis that I set up in the parameters, but it works just as well if I use the command-line parameters.

I’ve saved all of this in a script called get-weather.ps1

Here is a sample using  Quebec, Canada

.\get-weather.ps1 -country Canada -city Quebec

If  I am not sure of all of the city names that are available for Mexico, then I can type:

.\get-weather.ps1 -country Mexico -help

This will open up an out-grid view with all of the cities available in Mexico.

Since I set Memphis as the default city in the parameters, just typing the following will give me local weather information.

.\get-weather.ps1

If I want just screen output for a city, I can use something like the following:

.\get-weather.ps1 -full -country Canada -city Quebec

Give this a try and see if it works. Let me know if you have any questions.

Thanks,

Patrick

User Home Folder Size and other Information (without Quest)

Frequently during my daily work I need to gather information on users contained in our Active Directory listing.  In a previous blog post, I had a script which does this work using a Quest cmdlet.  Some environments don’t allow this, so it is helpful to have a method for gathering user data without Quest.
This script quickly gathers a user’s home folder path, SAM account name, email address, and the size of their home folder.

I use this a lot when I am moving users’ home folders from one server to another. To save time, I frequently comment out the line that gathers home folder size with #.

The user account names that I am searching for are contained in a text file called accounts.txt, located in the same folder as this script. The output is sent to the screen as well as a log file called output.csv.
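The accounts.txt file is simply one SAM account name per line. A hypothetical example (these IDs are made up for illustration):

```
jdoe1
asmith2
bjones3
```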

#************************************************************

$userArray = @("SAMID,HomeDirectory,EmailAddress,HomeFolderSize")
$allUsers = gc .\accounts.txt
$tempArray = @()
function logfile($strData)
{
Out-File -filepath output.csv -inputobject $strData -append
}
function getAccountInfo($currentUser)
{
$strName = $currentUser
$strFilter = "(&(objectCategory=User)(samAccountName=$strName))"
#Get User AD info
$objSearcher = New-Object System.DirectoryServices.DirectorySearcher
$objSearcher.Filter = $strFilter
$objPath = $objSearcher.FindOne()
$objUser = $objPath.GetDirectoryEntry()
[string]$folder = $objUser.homeDirectory
[string]$email = $objUser.mail
[string]$samID = $objUser.sAMAccountName
[string]$folderSize = getFolderSize($objUser.homeDirectory)
#$objUser.memberOf

$result = "$samID,$folder,$email,$folderSize"
$result #This streams the output out, to be piped as the return from the function.
$folderSize = $null
$fs = $null
}
function getFolderSize($strPath)
{
$fs = New-Object -comobject Scripting.FileSystemObject
#Check validity of $strPath
if ($fs.FolderExists($strPath))
{
[double]$tempSize = ($fs.GetFolder($strPath).size) / 1024 / 1024
$tempSize = '{0:N}' -f [double]$tempSize
$tempSize
}
else
{
$tempSize = "Bad folder path!"
$tempSize
}
}
$header = "SAMID,HomeDirectory,EmailAddress,HomeFolderSize"
Out-File -filepath output.csv -inputobject $header
foreach ($currentUser in $allUsers)
{
$tempOutput = getAccountInfo $currentUser
$tempOutput
$userArray += [string[]]$tempOutput
logfile($tempOutput)
}

#************************************************************
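If the Scripting.FileSystemObject COM object is unavailable, or you prefer to stay in native PowerShell, here is a sketch of an equivalent folder-size function using Get-ChildItem and Measure-Object. This is my own alternative for comparison, not the method used in the script above:

```powershell
#Native-PowerShell alternative to the getFolderSize function above.
function getFolderSizeNative($strPath)
{
    if (Test-Path $strPath)
    {
        #Sum the Length of every file under the path, then convert to MB.
        $bytes = (Get-ChildItem $strPath -Recurse -Force -ErrorAction SilentlyContinue |
            Measure-Object -Property Length -Sum).Sum
        '{0:N}' -f ($bytes / 1MB)
    }
    else
    {
        "Bad folder path!"
    }
}
```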
Let me know if you have any questions about this, or if it is helpful.

Thanks
Patrick

User Home Folder Size and other Information (with Quest)

Here is a function that can be used to quickly gather folder information about a user’s home folder.

There is one stipulation.

For this to work you must have the Quest Active Directory Snap-In configured for your PowerShell session.
This applies to users contained within a Microsoft Active Directory structure.
I have used “^” in place of the Select-Object command. This is an alias that I use to make typing much faster, and a symbol I have never had a conflict with.
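For the “^” shorthand to resolve when the function runs, the alias must already exist in the session; it can be defined like this (in your PowerShell profile, for example):

```powershell
#Define "^" as an alias for Select-Object, used by the gquf function below.
Set-Alias -Name '^' -Value Select-Object
```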

I have called the function GQUF. This is short for Get-QADUserFunction, but you may call it whatever you like of course.

Here is the syntax of the command. There are three options available when the command is run with a second command-line switch:

“GQUF userid -groups” or

“GQUF userid -explorer” or

“GQUF userid -size”

The -groups switch will detail all of the Active Directory groups in which the member is included.

The -explorer switch will open an Explorer window pointed at the user’s home folder.

The -size switch will detail the total size of the user’s home folder.

Here is the code. See a screen shot at the bottom.

#*******************************

#This function looks up a user home drive and home directory

#Uses get-qaduser

function gquf

{

$UserID = $Args[0]
$Domain = $Args[1]

$result = Get-QADUser $Args[0] | Select-Object SamAccountName, homedirectory, homedrive, email, displayname # | ft -autosize

#$result | ft -autosize

Write-Host "Display Name:" $result.displayname -foregroundcolor green

Write-Host "Email Address:" $result.email -foregroundcolor green

Write-Host "HomeDir:" $result.homedrive $result.homedirectory -foregroundcolor green

Write-Host ""

Write-Host "Permissions for " $result.homedirectory -foregroundcolor Yellow

get-aclf $result.homedirectory #get-aclf is a separate helper function of mine, not defined in this script.

switch ($Args[1])

{

{$_ -eq "-groups"}

{

write-host "Member Of:"

(Get-QADUser $Args[0] | ^ memberof).memberof | sort

}

{$_ -eq "-explorer"}

{

explorer $result.homedirectory

}

{$_ -eq "-size"}

{

Write-Host "Calculating the size of the homefolder..." -foregroundcolor red

$fs = New-Object -comobject Scripting.FileSystemObject

$tempSize = $fs.GetFolder($result.homedirectory).size/1024/1024

$tempSize = '{0:N}' -f [double]$tempSize

Write-Host "$tempSize MB"

}

}

}
#*******************************

Here is the screen shot for the –size switch. Sensitive information has been blocked out.

Thank You,

Patrick