My Entry for the Advanced Event #1 of the 2013 Scripting Games

I've been pretty excited about the annual Scripting Games. This is only my second Games, but they have been a terrific PowerShell learning experience for me. This year they're being run by Don Jones and his gang:

http://scriptinggames.org/

http://powershell.org/wp/

Their PHP-based website has already shown itself to be a little buggy, and I will eat road salt before I enable Java in my browser, so I won't be using their chat room, but you have to cut them some slack since it's a brand-new site that has never been used before.

When people comment on the scripts you submit, it can be humbling, but it's also a good way to learn what people wanted out of your script but didn't get. There's no error handling to speak of in this script. I knew that was a risk when I submitted it, but the event description stated that I should "display all errors clearly," which is exactly what the script does when left to its own devices. Still, I could have used error handling to make it a little more elegant. Also, I guess I've got to break down and start doing the -WhatIf and -Confirm stuff, even though I don't exactly want to; it's going the extra mile.
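
For reference, wiring that up is mostly boilerplate: decorate the function with SupportsShouldProcess and guard the destructive call. Here's a minimal sketch of the pattern (illustrative names only; this is not part of my entry):

Function Move-Stuff
{
	[CmdletBinding(SupportsShouldProcess = $True, ConfirmImpact = 'Medium')]
	Param([String]$Path, [String]$Destination)
	# PowerShell generates -WhatIf and -Confirm for free; just ask before doing anything destructive.
	If($PSCmdlet.ShouldProcess($Path, "Move to $Destination"))
	{
		Move-Item -Path $Path -Destination $Destination
	}
}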

Without further ado:

#Requires -Version 3
Function Move-OldFiles
{
	<#
		.SYNOPSIS
			Move files that are older than a specified number of days. (Default is 90 days.)
			Use the -Verbose switch if you want to see output; otherwise, the cmdlet shows only errors.
		.DESCRIPTION
			Move files that are older than a specified number of days (default 90) from the 
			source directory to a destination directory. Directory recursion is on by default,
			but can be disabled with the -NoRecurse switch. The subdirectory structure will be 
			preserved at the destination. By default, all files are moved, but a file pattern
			can be specified with the -Pattern parameter. By default, files that already exist at
			the destination and are readonly are not overwritten, unless the -Force switch is used.
			This cmdlet works with drive letters as well as UNC paths. By default, only errors are shown.
			Use the -Verbose switch if you want to see more output. This function requires PowerShell 3.
		.PARAMETER SourceDirectory
			Specifies the source directory from which you want to move files. E.g. C:\Logs or C:\Logs\
			This must be a valid directory. The alias for this parameter is Source.			
		.PARAMETER DestinationDirectory
			Specifies the destination directory to which you want to move files. E.g. E:\Archives or
			E:\Logs\Old\ or \\SERVER02\Share\Logs. This must be a valid directory. The alias for
			this parameter is Destination.
		.PARAMETER OlderThan
			The number of days that a file's age must exceed before it will be moved. This is
			an optional parameter whose default is 90 days. This parameter must be a positive
			integer. The alias for this parameter is Age.
		.PARAMETER Pattern
			This is an optional filename filter. E.g., *.log or *.txt or Report*.html.
			The alias for this parameter is Filter.
		.PARAMETER NoRecurse
			This is a switch that indicates whether the cmdlet will process files in subdirectories
			underneath the specified source directory. By default, recursion is on. Optional.
		.PARAMETER Force
			This is a switch that indicates whether files that already exist at the destination
			and are readonly will be overwritten. By default they are not overwritten. Optional.
		.EXAMPLE
			PS C:\> Move-OldFiles -Source C:\Application\Log -Destination \\NASServer\Archives -OlderThan 90 -Filter *.log
		.EXAMPLE
			PS C:\> Move-OldFiles C:\Logs \\NASServer\Archives 90 *.log
		.EXAMPLE
			PS C:\> Move-OldFiles -SourceDirectory C:\Logs -DestinationDirectory \\NAS\Archives -Age 31 -Pattern *.log -Force
		.EXAMPLE
			PS C:\> Move-OldFiles C:\Logs \\NAS\Archives
	#>
	[CmdletBinding()]
	Param([Parameter(Position = 0, Mandatory = $True, HelpMessage = 'Source directory, e.g. C:\Logs')]
			[ValidateScript({Test-Path $_ -PathType Container})]
			[Alias('Source')]
			[String]$SourceDirectory,
	      [Parameter(Position = 1, Mandatory = $True, HelpMessage = 'Destination directory, e.g. \\NASServer\Archives')]
			[ValidateScript({Test-Path $_ -PathType Container})]
			[Alias('Destination')]
			[String]$DestinationDirectory,
		  [Parameter(Position = 2, Mandatory = $False, HelpMessage = 'The number of days old the file must be in order to be moved.')]
			[ValidateScript({$_ -GT 0})]
			[Alias('Age')]
			[Int]$OlderThan = 90,
		  [Parameter(Position = 3, Mandatory = $False, HelpMessage = 'The file pattern to match, e.g. *.log')]
			[Alias('Filter')]
			[String]$Pattern = "*",
		  [Parameter(Position = 4, Mandatory = $False, HelpMessage = 'Disable directory recursion, i.e. only copy the directory specified.')]
			[Switch]$NoRecurse = $False,
		  [Parameter(Position = 5, Mandatory = $False, HelpMessage = 'Specify to overwrite existing readonly files at the destination.')]
			[Switch]$Force = $False)
	
	$Start = Get-Date
	# Normalize both paths to end with a trailing backslash so the path math below is predictable.
	If(!($SourceDirectory.EndsWith("\")))
	{
		$SourceDirectory += "\"
	}
	If(!($DestinationDirectory.EndsWith("\")))
	{
		$DestinationDirectory += "\"
	}
	
	Write-Verbose "Source Directory:       $SourceDirectory"
	Write-Verbose "Destination Directory:  $DestinationDirectory"
	Write-Verbose "Move Files Older Than:  $OlderThan Days"
	Write-Verbose "Filename Filter:        $Pattern"
	Write-Verbose "Exclude Subdirectories: $NoRecurse"
	Write-Verbose "Overwrite if Readonly:  $Force"
	
	# Recursion is on unless the -NoRecurse switch was specified.
	$SourceFiles = Get-ChildItem -Path $SourceDirectory -Filter $Pattern -File -Recurse:(!$NoRecurse) | Where-Object LastWriteTime -LT (Get-Date).AddDays(-$OlderThan)
	Write-Verbose "$($SourceFiles.Count) files found in $SourceDirectory matching pattern $Pattern older than $OlderThan days."
	
	[Int]$FilesMoved = 0
	ForEach($File In $SourceFiles)
	{
		Write-Verbose "Moving $($File.FullName)"
		# Swap the source root for the destination root to build the target path, then let
		# Split-Path derive the directory portion (safer than string Replace(), which would
		# also clobber any other occurrence of the filename elsewhere in the path).
		$DestinationFullName = $DestinationDirectory + $File.FullName.Substring($SourceDirectory.Length)
		$DestinationFileDirectory = Split-Path $DestinationFullName -Parent
		If($PSBoundParameters['Verbose'])
		{
			Write-Progress -Activity "Move-OldFiles" `
						   -Status "Moving files..." `
						   -CurrentOperation "Transferring $($File.FullName)`..." `
						   -PercentComplete $([Math]::Round($FilesMoved / $SourceFiles.Count * 100, 0))
		}		
		If(!(Test-Path $DestinationFileDirectory -PathType Container))
		{
			Write-Verbose "Creating directory $DestinationFileDirectory"
			New-Item $DestinationFileDirectory -Type Directory | Out-Null
		}
		# -Force only matters if a readonly copy of the file already exists at the destination.
		Move-Item -Path $File.FullName -Destination $DestinationFullName -Force:$Force
		$FilesMoved++
	}
	$End = Get-Date
	Write-Verbose "$FilesMoved files were moved in $([Math]::Round(((New-TimeSpan $Start $End).TotalSeconds), 1)) seconds."
}

Get-FQDNInfo.ps1

Someone recently asked me if I could write a script for him.  He had a list of several hundred fully qualified domain names (essentially internet hostnames) in a file, and he had to get the IP address(es) of each FQDN and then some whois information about those IP addresses.  Running down a list of names and resolving them scriptomatically is a breeze, of course, but the whois stuff sounded a little trickier.  Luckily, ARIN has a sweet REST API ready to go, which made the whole script a snap.
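
If you're curious, the ARIN lookup is just an unauthenticated HTTP GET, and requesting the .txt representation keeps the parsing simple. A quick test against an example address (8.8.8.8 here) looks like this:

(Invoke-WebRequest http://whois.arin.net/rest/ip/8.8.8.8.txt).Content

That spits back the same NetRange/CIDR/NetName lines you'll see the script parse below.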

I took the time to return all that data as objects so the output can be consumed further along the pipeline, and there is also an optional "save to CSV" parameter.  I think there are a couple more ways the script could be improved, but it works for now.  The output looks like this:

[Screenshot: Get-FQDNInfo.ps1 output]
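
Since the results are real objects, you can reshape them in the pipeline however you like, e.g.:

PS C:\> .\Get-FQDNInfo.ps1 .\fqdns.txt | Format-Table FQDN, CSVSafeList, NetName -AutoSize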

And here's the whole script:

<#
.SYNOPSIS	
	Feed me a bunch of FQDNs, one per line, and I will give you as much info as
	I can about the IP address(es) behind each one.
	
.DESCRIPTION
	This script takes an input file. The input file contains a list of FQDNs, one per line.
	With each FQDN, the script will attempt to resolve the name, and then find as much
	info as it can using ARIN REST services.
	
.PARAMETER InFile
	Specify a text file containing the FQDNs you want to scan.
	Each FQDN goes on a separate line. For example:
	
	host.foo.com
	barney.purpledinosaur.com
	et.phonehome.org

.PARAMETER OutFile
	Optional file to write the results to.

.INPUTS
	[System.String]$InFile  - The name of the input file to read.
	[System.String]$OutFile - Optional, the name of the output file to write.

.OUTPUTS
	[System.Object]$FQDNInfoCollection - Contains resolved FQDNInfo objects.
	[System.IO.File]$OutFile - Optional, the output file to write.
	
.EXAMPLE
	PS C:\> .\Get-FQDNInfo.ps1 .\fqdns.txt outfile.txt

.EXAMPLE
	PS C:\> "fqdns.txt" | .\Get-FQDNInfo.ps1
	
.NOTES
	Name  : Get-FQDNInfo.ps1
	Author: Ryan Ries
	Email : ryanries09@gmail.com
	Date  : April 19, 2013

.LINK	
	http://www.myotherpcisacloud.com
#>

Param([Parameter(Mandatory=$True,ValueFromPipeline=$True)]
      [ValidateScript({Test-Path $_ -PathType Leaf})]
      [String]$InFile,
      [Parameter(Mandatory=$False)]
      [String]$OutFile)
$FQDNInfoCollection = @()
$EntriesProcessed = 0
Foreach($FQDN In Get-Content $InFile)
{
	$FQDNInfo = New-Object System.Object
	$FQDNInfo | Add-Member -Type NoteProperty -Name "FQDN"        -Value $FQDN
	$FQDNInfo | Add-Member -Type NoteProperty -Name "AddressList" -Value $null
	$FQDNInfo | Add-Member -Type NoteProperty -Name "CSVSafeList" -Value $null
	$FQDNInfo | Add-Member -Type NoteProperty -Name "NetRange"    -Value $null
	$FQDNInfo | Add-Member -Type NoteProperty -Name "CIDR"        -Value $null
	$FQDNInfo | Add-Member -Type NoteProperty -Name "NetName"     -Value $null
	$FQDNInfo | Add-Member -Type NoteProperty -Name "NetType"     -Value $null
	$FQDNInfo | Add-Member -Type NoteProperty -Name "RegDate"     -Value $null
	$FQDNInfo | Add-Member -Type NoteProperty -Name "Updated"     -Value $null
	$FQDNInfo | Add-Member -Type NoteProperty -Name "Comment"     -Value $null
	$FQDNInfo | Add-Member -Type NoteProperty -Name "SOA"         -Value $null
	
	# If the name doesn't resolve, AddressList stays $null and we skip the whois lookups below.
	Try	{ $FQDNInfo.AddressList = $([System.Net.Dns]::GetHostEntry($FQDN)).AddressList }
	Catch {	}
	If($FQDNInfo.AddressList -ne $null)
	{
		ForEach($A In $FQDNInfo.AddressList) { $FQDNInfo.CSVSafeList += "$($A)|" }
		$FQDNInfo.CSVSafeList = $FQDNInfo.CSVSafeList.TrimEnd('|')		
		Try
		{
			$ARINData = $(Invoke-WebRequest http://whois.arin.net/rest/ip/$($FQDNInfo.AddressList[0].ToString())`.txt).Content
			$ARINData = $ARINData.Split([Environment]::NewLine)
			Foreach($l In $ARINData)
			{
				If($l.StartsWith("NetRange:"))    { $FQDNInfo.NetRange = $l.SubString(16) }
				Elseif($l.StartsWith("CIDR:"))    { $FQDNInfo.CIDR     = $l.SubString(16) }
				Elseif($l.StartsWith("NetName:")) { $FQDNInfo.NetName  = $l.SubString(16) }
				Elseif($l.StartsWith("NetType:")) { $FQDNInfo.NetType  = $l.SubString(16) }
				Elseif($l.StartsWith("RegDate:")) { $FQDNInfo.RegDate  = $l.SubString(16) }
				Elseif($l.StartsWith("Updated:")) { $FQDNInfo.Updated  = $l.SubString(16) }
				Elseif($l.StartsWith("Comment:")) 
				{ 
					$FQDNInfo.Comment += $l.SubString(16)
					$FQDNInfo.Comment += " "
				}
			}
		}
		Catch { }
		# Scrape the zone's primary name server (SOA) out of nslookup's output.
		& nslookup -q=soa $FQDN 2>&1 > "$Env:UserProfile\temp.txt"
		Foreach($Line In Get-Content "$Env:UserProfile\temp.txt")
		{
			If($Line.Contains("primary name server =")) { $FQDNInfo.SOA = $Line.Split('=')[1].Trim() }
		}
		Remove-Item "$Env:UserProfile\temp.txt" -ErrorAction SilentlyContinue
	}	
	$FQDNInfoCollection += $FQDNInfo
	$EntriesProcessed   += 1
	Write-Host $EntriesProcessed "FQDNs processed."
}

If($OutFile.Length -gt 0)
{
	$FQDNInfoCollection | Export-CSV $OutFile -NoTypeInformation	
}
return $FQDNInfoCollection

Processor Shopping for SQL Server 2012

[Image: AMD vs. Intel]

I almost never talk about SQL Server here, which is a shame, because I think SQL Server is amazing.  If you're planning on deploying SQL Server 2012 and you haven't picked out your hardware yet, I hope this post finds you in time and helps you decide which processor architecture to choose.  (I hope the graphic doesn't give it away...)  Also, keep in mind the date on which this was written - computer hardware changes rapidly.

You know you pretty much have two choices in CPUs: Intel or AMD.  There are several factors to weigh here: performance, hardware cost, and licensing cost.  So let's break those down and compare:

Performance: Keep in mind that we're designing a SQL Server here.  Different SQL Servers are under different types of workloads, but OLTP (online transaction processing) is one very common type.  The TPC (Transaction Processing Performance Council) introduced the TPC-E benchmark in 2007, which simulates an OLTP workload on a SQL Server.  What we end up with is a pretty solid method for benchmarking SQL Servers of varying hardware configurations running identical workloads.  If you visit the TPC website, it's pretty hard not to notice that the top 10 highest-performing servers and the top 10 in price/performance all belong to Intel processors, without exception.  But just for fun, let's see the numbers:

System                    Processor             TPC-E Score   Sockets   Total Cores   Score/Core
HP ProLiant DL380 G7      Intel Xeon X5690      1284.14       2         12            107.01
IBM System x3650 M4       Intel Xeon E5-2690    1863.23       2         16            116.45
HP ProLiant DL385 G7      AMD Opteron 6282SE    1232.84       2         32            38.53
HP ProLiant DL585 G7      AMD Opteron 6176SE    1400.14       4         48            29.17
IBM System x3850 X5       Intel Xeon E7-4870    2862.61       4         40            71.57
NEC Express5800/A1080a    Intel Xeon E7-8870    4614.22       8         80            57.68

The trends evident from that table: AMD prefers more cores per socket, AMD cores tend to perform much worse than Intel cores on an OLTP workload, and crazy numbers of processor cores show diminishing returns regardless of the manufacturer.  So far things are not looking good for AMD.  AMD can pack more cores on a die, but that simply does not make up for its gap in single-threaded performance.

Hardware Cost: Let's get right down to some hardware prices.  I'm going to price only the processors themselves, not entire servers, because there are so many customizable options and accessories to choose from when speccing out a whole server that it would take far longer than I wanted to spend on this blog post.

Processor             CDW.COM Price
Intel Xeon X5690      $1,886.99
Intel Xeon E5-2690    $2,332.99
AMD Opteron 6282SE    $1,287.99
AMD Opteron 6176SE    $1,505.99
Intel Xeon E7-4870    $5,698.99
Intel Xeon E7-8870    $7,618.99

AMD has a bit of a price advantage here, especially as you get to the high-end processors, but it's all for nothing once you take into account the third piece of the puzzle:

Licensing: To be frank, Microsoft SQL Server 2012 Enterprise Edition is very expensive.  SQL used to be licensed per socket; SQL 2012 is now licensed per physical core.  This means "logical" cores, such as those created by Intel's Hyper-Threading, are essentially free as far as SQL 2012 licensing is concerned.  (There is the alternative Server + CAL licensing model, as seen with the Business Intelligence Edition, but that's kinda' out of the scope of this article.  Enterprise Edition is where it's at.)  Each physical socket in your SQL Server must carry a minimum of 4 core licenses, and beyond that you license additional cores two at a time - so a 6-core socket needs 6 core licenses (the 4-core minimum plus one 2-core pack).

If you're thinking ahead, you can already tell this is bad news for AMD-based servers aspiring to run SQL 2012.  AMD processors have more cores, which means higher SQL licensing costs, with lower per-core performance to boot.  Microsoft realized this, and did AMD a favor by specifically giving most AMD processors a 25% discount on licensing costs.  But even with that discount, the numbers speak for themselves, and AMD still comes out way behind:

Processor             Cores Per Socket   License Cost Per Socket   Total Sockets   Total License Cost Per Server
Intel Xeon X5690      6                  $41,244                   2               $82,488
AMD Opteron 6282SE    16                 $82,488                   2               $164,976
Intel Xeon E5-2690    8                  $54,992                   2               $109,984
Intel Xeon E5-4650    8                  $54,992                   4               $219,968
Intel Xeon X7560      8                  $54,992                   4               $219,968
Intel Xeon E7-4870    10                 $68,740                   4               $274,960
AMD Opteron 6180SE    12                 $61,866                   4               $247,464
AMD Opteron 6282SE    16                 $82,488                   4               $329,952
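
You can sanity-check those numbers in a few lines of PowerShell.  The roughly $6,874 retail per-core price is implied by the X5690 row above (it's not an official quote, and your Enterprise Agreement pricing will differ):

$PerCore = 41244 / 6                     # ~$6,874 per core, derived from the X5690 row
'{0:N0}' -f ($PerCore * 6 * 2)           # Intel Xeon X5690:   82,488
'{0:N0}' -f ($PerCore * 16 * 2 * 0.75)   # AMD Opteron 6282SE: 164,976 (25% AMD discount applied)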

It just got really hard for me to recommend an AMD processor for use in a SQL Server 2012 server under almost any circumstances.  Take our Intel Xeon X5690 and our AMD Opteron 6282SE, which post pretty similar TPC-E benchmark scores... only the AMD costs $82,488 more to license!  And that's with AMD's 25% discount!  These are full retail prices, of course, but the concept holds regardless of your Enterprise Agreement.

So, my fellow IT pros... please do the math before you pull the trigger on that new server, and make sure your $2000 in hardware savings isn't steamrolled by $80,000 of extra licensing costs.

* Citation - these numbers are from the book Professional SQL Server 2012 Internals and Troubleshooting by Bolton, Langford, Berry, et al.

AD Recycle Bin and a Eulogy for the Infrastructure Master

Ruminate with me a while, won't you?

Ah, the Infrastructure Master.  Probably the least-appreciated FSMO role of all.  In discussions such as technical job interviews, most people can list the five FSMOs for me... maybe even tell me which are per-forest and which are per-domain... but if you then start asking for specifics about what each one of them actually does, the interviewee usually gets a bit more wobbly.  And I think the Infrastructure Master in particular is probably the most difficult of all to grasp.  I know it was certainly the last one for me to really "get."

I won't spell out here exactly what the IM does - there's plenty of documentation out there if you're really interested.  I would also direct you to this ServerFault post, wherein I give a real-world example of what the IM does and what might happen if the IM is on the wrong domain controller.
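
(If you want to see which domain controller currently holds the role, Get-ADDomain | Select-Object InfrastructureMaster will tell you, as will the old netdom query fsmo.)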

This brings me to the Active Directory Recycle Bin.  The AD Recycle Bin was introduced with Windows Server 2008 R2, and it was a long time coming.  Before it existed, restoring deleted AD objects was a lot more arcane and cumbersome than it is with a good ol' Recycle Bin.  Considering the AD Recycle Bin is going on 5 years old now, even though it's still an optional feature, there's less and less excuse as time goes on for not having it enabled in your forest.

(You don't, do you?)
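
Enabling it is a one-liner, too, assuming your forest functional level is already at 2008 R2 or higher (corp.contoso.com is a placeholder for your own forest, and remember that once enabled, the feature cannot be turned back off):

Import-Module ActiveDirectory
Enable-ADOptionalFeature 'Recycle Bin Feature' -Scope ForestOrConfigurationSet -Target 'corp.contoso.com'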

So here's the interesting bit, which you might not have known (sorry for wasting your time if you did): once you've enabled the AD Recycle Bin, your Infrastructure Master no longer has anything to do.  Nothing.  Not even in an environment where some domain controllers are not also global catalogs.

From TechNet:

When the Recycle Bin optional feature is enabled, every DC is responsible for updating its cross-domain object references in the event that the referenced object is moved, renamed, or deleted. In this case, there are no tasks associated with the Infrastructure FSMO role, and it is not important which domain controller owns the Infrastructure Master role.

So as the AD Recycle Bin becomes more and more commonplace in Active Directory environments, it seems that the Infrastructure Master may slowly dwindle away until only the old guard even remembers what it was, and budding young IT pros will only have 4 FSMOs to remember.

ShareDiscreetlyWebServer v1.0.1.2

Several improvements over the last release in the past few days:

  • Some code optimizations. Pages are rendering about an order of magnitude faster now.
  • Just about all the HTML and JavaScript has been exported to editable files so that an administrator can change up the code, color schemes, branding, etc., without needing to recompile.
  • The server can now send an S/MIME, digitally signed email to the person you want to send the URL to. Unfortunately, some email clients (such as the Gmail web client) don't natively understand S/MIME, but Outlook handles it just fine, and there are various plugins and programs that can read S/MIME emails if your client doesn't. I'd rather lean toward more security-related bells and whistles than maximum compatibility for this project. (A sketch of the signing primitive follows this list.)
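
For anyone who hasn't poked at S/MIME before: the signature is just a CMS/PKCS#7 structure computed over the message content. Here's a minimal PowerShell sketch of that primitive, assuming any certificate with a private key in your CurrentUser\My store (purely illustrative; this is not the server's actual implementation):

Add-Type -AssemblyName System.Security

# Grab a signing cert that has a private key (pick yours deliberately in real life).
$Cert = Get-ChildItem Cert:\CurrentUser\My | Where-Object { $_.HasPrivateKey } | Select-Object -First 1

$MsgBytes    = [System.Text.Encoding]::UTF8.GetBytes("Your one-time URL: https://...")
$ContentInfo = New-Object System.Security.Cryptography.Pkcs.ContentInfo -ArgumentList @(,$MsgBytes)
$SignedCms   = New-Object System.Security.Cryptography.Pkcs.SignedCms -ArgumentList $ContentInfo
$Signer      = New-Object System.Security.Cryptography.Pkcs.CmsSigner -ArgumentList $Cert
$SignedCms.ComputeSignature($Signer)
$Pkcs7Bytes  = $SignedCms.Encode()   # this blob is what rides along as the smime.p7m payload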

You can access the secret server at https://myotherpcisacloud.com.