ShareDiscreetlyWebServer v1.0.0.3

I wrote a web service.  I call it "ShareDiscreetly".  Creative name, huh?

I wrote the server in C#, targeting .NET 4.0.  It runs as a Windows service.

ShareDiscreetlyWebServer serves a single purpose: to allow two people to share little bits of information - secrets - such as passwords, etc., in a secure, discreet manner.  The secrets are protected both in transit and at rest using the FIPS-approved AES-256 algorithm, with the symmetric key itself protected by the asymmetric key pair of an X.509 certificate.
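To make that concrete, here's a minimal sketch of that encrypt-then-wrap pattern in PowerShell.  This is not the server's actual code, and the certificate subject is invented - it just shows the general idea: encrypt the secret with AES-256, then wrap the AES key with the RSA public key from an X.509 certificate.

# Sketch only: the certificate subject below is hypothetical.
$cert = Get-ChildItem Cert:\LocalMachine\My |
    Where-Object { $_.Subject -like '*ShareDiscreetly*' } |
    Select-Object -First 1

# Encrypt the secret with AES-256.
$aes = New-Object System.Security.Cryptography.AesCryptoServiceProvider
$aes.KeySize = 256
$plainBytes  = [System.Text.Encoding]::UTF8.GetBytes('my secret')
$cipherBytes = $aes.CreateEncryptor().TransformFinalBlock($plainBytes, 0, $plainBytes.Length)

# Wrap the AES key with the certificate's RSA public key (OAEP padding).
$wrappedKey = $cert.PublicKey.Key.Encrypt($aes.Key, $true)

# Store or send $cipherBytes, $wrappedKey, and $aes.IV; only the holder of
# the certificate's private key can unwrap the AES key and decrypt the secret.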

Oh, and I made sure that it's thoroughly compatible with PowerShell, so the server can be used in a scriptable, automatable way.
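For example, a script could fetch a secret with Invoke-RestMethod.  To be clear, the host name, route, and secret ID below are purely hypothetical - invented for illustration, not the service's real API:

# Hypothetical invocation; the URI is made up for illustration.
$secret = Invoke-RestMethod -Uri 'https://myserver/ShareDiscreetly/secrets/1234'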

You can read a more thorough description of the server as you try it out here.

Please let me know if you find any bugs, exploits, or if you have any feature requests!

PowerShell: Get-Content Faster with ReadCount!

Do you use PowerShell?  Do you use Get-Content in PowerShell to read files?  Do you sometimes work with large text files?

If you answered yes to any of the questions above, then read on - this post is for you!

I have a very simple tip that I used today in a script I was writing.  Thought I'd share.

Let's say you have a large text file, such as a packet log from a DNS server that you're debugging.  It might be 300 megabytes and millions of lines.  I was writing a script to parse the file and collect some statistics that I was after.

$LogFile = Get-Content $FileName   # reads the entire file into an array, one line per element
ForEach ($Line In $LogFile)
{
    Do-Stuff $Line
}

When I ran this script against a 52MB text file, it executed in about 22 seconds.  When I ran it against a 150MB text file, PowerShell consumed over 3GB of RAM within a few seconds, the script never finished, and after bringing my laptop (Win7 x64, 4GB RAM, 4 CPUs, PS v3, .NET 4.5) to a crawl for about 5 minutes, PowerShell just gave up and returned to the prompt without outputting anything.  At first I figured it was some sort of memory leak.  But come on... a 150MB file is not even that big...

So I started looking through the help for Get-Content, and it turns out there's an easy workaround:

$LogFile = Get-Content $FileName -ReadCount 0   # all lines come back as one plain array
ForEach ($Line In $LogFile)
{
    Do-Stuff $Line
}

The -ReadCount parameter specifies how many lines of content are sent through the pipeline at a time.  The default is 1, and a value of 0 sends all of the content through at once.  That explains the blowup above: with the default, Get-Content emits every line of the file as its own object, each one decorated with extra provider metadata, and on a file with millions of lines that per-line overhead devours time and memory.  With -ReadCount 0, you get back one plain array of strings instead.  (One caveat: with an intermediate value such as -ReadCount 1000, each object sent down the pipeline is itself an array of 1,000 lines, so a ForEach-Object body sees arrays rather than individual lines.)
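You can watch the chunking behavior directly.  With -ReadCount 10, for example, each object coming down the pipeline is an array of ten lines (the file name here is just an example):

# Each pipeline object is now an array of 10 lines, not a single line.
Get-Content .\dns.log -ReadCount 10 |
    Select-Object -First 3 |
    ForEach-Object { "$($_.GetType().Name) of $($_.Count) lines" }
# e.g. "Object[] of 10 lines" printed three times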

Now when I run the script against the 52MB file, it completes in 2.8 seconds, and when I run it on the 150MB text file, it finishes in 10.2 seconds!
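If you want to reproduce the comparison on your own machine, Measure-Command makes it trivial; just point $FileName at a big file of your own:

# Time both approaches; assign to $null so console output doesn't skew the results.
(Measure-Command { $null = Get-Content $FileName }).TotalSeconds
(Measure-Command { $null = Get-Content $FileName -ReadCount 0 }).TotalSeconds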