Here in the US, legislation that enacts radical change in our daily lives so rarely passes that most people become apathetic toward politics altogether.  The government we have built in this land is a sleeping giant: it has the potential to affect our lives in profound ways, but it is usually so deadlocked by compromise and fraught with infighting that we see it as mostly harmless.  We become desensitized to it because it moves so slowly.  We hope that it just leaves us alone, and it doesn't pester us enough to act when we should.  Not without irony, this makes us the frog in slowly heating water that doesn't realize what is happening until it's too late.

I write to an intelligent group of people who are particularly enthusiastic and knowledgeable about technology and who understand the Internet, so I am hoping this missive has a persuasive effect.

There are bills in the House and Senate right now that threaten to change the face of the Web as we know it: PROTECT IP in the Senate and SOPA in the House.  One should not even bother with what the self-parodying acronyms stand for; those words have nothing to do with the actual meaning, intent, or effect of the bills themselves, and serve only to obfuscate the issue and sell the bills to the politicians who will vote on them.

As an example, SOPA seeks to make site owners liable for everything the users of their websites post.  Site owners could face DNS and search-engine blacklisting, jail, and/or heavy fines for content their users post.  Furthermore, the bill denies those site owners due process of law: a DNS blacklisting can be initiated automatically, based solely on an assertion made by an individual copyright or intellectual-property owner, without any notification or forewarning to the site owner.

Imagine all of the websites out there today on which users collaborate and post content: Facebook, YouTube, every forum on the Internet, Twitter, reddit, LinkedIn, Google+, and so on.  Hopefully I’ve mentioned a website that you use and enjoy today.

Now imagine that those websites can no longer exist, or rather can exist only as read-only channels for advertisements.  What website is going to accept liability for everything posted by the public, moderated or not, under penalty of having the entire site shut down, or worse?  Even if such websites switched to an aggressive moderation process in which every single post had to be approved by the site owners before it was displayed, the owners would be no less liable for the one offense the moderators missed.

Perhaps the greatest irony of all is that as the free flow of information and ideas grinds to a halt, the passage of this bill stands to benefit no one, and its supporters will only have shot themselves in the foot.  I’ll never go see that movie in the theater, because I never read the forum post where somebody shared a clip of it and talked about how awesome it was.  I’ll never buy that album, because I never saw the video with that really cool song playing in the background.  I’ll never have that great idea, because the site where like-minded individuals used to get together and share ideas was shut down. Who benefits from any of this nonsense, other than Luddites bent on sending us back to the 20th century?

On that note, we really need to talk to our Congressmen and women.  I know the person reading this post is likely an IT pro and probably really smart, and as such, getting people like us to do something is like herding cats.  But please, go to this site and enter your zip code and street address. (Or your neighbor’s street address, if you’re paranoid.)  It will give you the names and phone numbers of the offices of your local representative and our state senators.  For example:

I know that as a tech-savvy crew we are used to dealing in emails and texts, but phone calls are more personal and will carry more weight than emails, especially with someone who may not fully understand the far-reaching implications of the bills they’re about to vote on. Take some time before you call to think about or write down what you want to say: something polite about how you, as a constituent, would like to urge them to reconsider their support of this bill.


Apropos of my last post, my latest creation is finished and needs some beta-testing now. It's called RSSInYourFace, and it's an RSS feed reader. Yes, I realize that there are already dozens of RSS feed readers out there, but this one is mine.

I wrote it so that I could stay current on all my favorite blogs and news sites without actually having to waste time surfing the web. This is particularly useful at work where you might want to appear as though you're doing something besides surfing the web all day. The gist of the program is that you supply a list of URLs to your favorite feeds, and the program will sit quietly in your system tray until something new gets posted from one of them, at which point it will alert you with a balloon popup about the new item. Subsequently clicking the balloon as it pops up will launch that website/news item in your default browser.
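The detection step can be sketched with .NET's built-in SyndicationFeed class (System.ServiceModel.Syndication). To be clear, this is just an illustrative sketch with made-up names, not RSSInYourFace's actual code; a real reader would download the XML from each feed URL on a timer and pop the balloon when this returns a title:

```csharp
using System;
using System.IO;
using System.Linq;
using System.ServiceModel.Syndication;
using System.Xml;

class FeedCheckSketch
{
    // Returns the title of the newest item if it's newer than lastSeen, else null.
    // A real reader would fetch feedXml from the feed's URL on a timer.
    public static string CheckForNewItem(string feedXml, DateTimeOffset lastSeen)
    {
        using (var reader = XmlReader.Create(new StringReader(feedXml)))
        {
            SyndicationFeed feed = SyndicationFeed.Load(reader);
            SyndicationItem newest = feed.Items
                .OrderByDescending(i => i.PublishDate)
                .FirstOrDefault();
            if (newest != null && newest.PublishDate > lastSeen)
                return newest.Title.Text;
            return null; // nothing new since last check
        }
    }

    static void Main()
    {
        string rss = @"<rss version='2.0'><channel><title>Demo</title>
<item><title>New post</title><pubDate>Tue, 20 Dec 2011 10:00:00 GMT</pubDate></item>
</channel></rss>";
        string title = CheckForNewItem(rss,
            new DateTimeOffset(2011, 12, 19, 0, 0, 0, TimeSpan.Zero));
        Console.WriteLine(title ?? "(nothing new)");
    }
}
```

When CheckForNewItem returns non-null, a NotifyIcon's ShowBalloonTip would surface it to the user, and a click handler would launch the item's link in the default browser.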

It was written and tested on Windows 7 64-bit, but should work on 32-bit as well. .NET 4.0 is required.

I would really appreciate some feedback if you decide to use it. Try to break it, make it crash, etc. I'd really like to squash any bugs or inefficiencies that I haven't caught yet. Also feel free to make feature requests - I can almost certainly make it happen. My main concern right now is maintaining support for the myriad formats of feeds out there. Imagine my dismay when I realized that there wasn't just one standard type of RSS feed. There's never just one standard...

Link to version 1.03 Release Build: RSSInYourFace1.0.3.exe (87.00 kb)
Link to version 1.02 Release Build: RSSInYourFace.1.02.exe (87.00 kb)


.NET Programming Love, Multi-Threadedness, and CPU Usage

I do not want to ever be known as a professional software developer.

It's not because I don't enjoy programming, and certainly not because I don't appreciate the people who do love being professional developers. I do enjoy it. It's fun. I've been writing computer programs since I was drawing multi-colored circles in QuickBasic at 14 years old, and I still find it so fun that I do it as a pastime for no other reason than pure personal pleasure.

It's not because "I'm a server guy, or a hardware guy, or an architecture guy, or a network guy... and therefore I don't need to write programs."  I know plenty of hardware, network and server architecture guys who have entire life-long careers going and don't care to write a line of code ... sometimes not even a line of script!

I don't want to be a professional developer because I'm deathly afraid that if I were forced to code for a living, it might suck all the fun out of it. It's like playing poker for a living. Sure it's fun to play poker with your buddies on a Saturday night and drink and laugh and play ... but what if you needed to play poker to earn a living? What if your performance in Texas Hold 'Em dictated how well you ate or if you made your rent for the month?  All of a sudden, Texas Hold 'Em isn't so fun any more...

You might hear me reiterate that sentiment again throughout the course of this blog, as (at this point in my life) I feel it is one of the defining sentiments of my career as an IT professional.

So, with all that said, I want to share some programming goodness that I encountered tonight, simply as an enthusiast and an amateur programmer. You are more than welcome to comment on what an amateurish, unevolved programmer I am. Actually, I encourage it, as there is no better way for me to learn than by accepting constructive criticism.

I really like .NET. I love it. And what makes that ironic is that years ago, I used to denounce languages such as Java because of their virtual-machine nature. "Why not program against the machine itself? It would obviously be more efficient," I used to say. Well, years later, here I am professing my love for .NET - a CLR-based platform, which basically makes it the Microsoft equivalent of Java! I get the feeling that Russinovich and his ilk don't have much respect for us CLR/JVM coders because we operate in a managed sandbox environment and don't have to worry as much about things like byte-by-byte memory allocation... and god forbid we forget to de-allocate something.

But you know what? .NET gives me time to focus on solving business problems, instead of agonizing over every last little memory leak.  That's what makes .NET the perfect "Systems Engineer's" or "Systems Administrator's" language. I don't care about typecasting every last delegated sub type as an enum... and such as.  (Source: Miss South Carolina)

.NET just lets me focus on doing things and solving problems.

Tonight, I was working on a little program for myself, as opposed to something for the company where I work - just something I was personally interested in.  And I had only recently really grasped the concept of multi-threading my .NET applications. It's something I had struggled with for a long time, and I'm not ashamed to admit that: there are enterprise-grade applications that we use at work today, which our company pays a lot of money for, whose GUIs freeze up hard for several minutes while the application downloads data, leaving you staring at a non-responsive window. Now I know that I can do better than that.

I guess I'll add that to the list of things they should be paying me for, eh?

It's not that creating parallel threads is difficult - least of all in .NET. You can find hundreds of examples out there on Google right now. The tricky part is knowing when and where to put them in your program, and how to design the rest of your application to work with them effectively. It's a design problem, not a technical one.

It's the C# BackgroundWorker that really got me into designing good, multi-threaded GUIs.

BackgroundWorker workerEngine = new BackgroundWorker();
workerEngine.WorkerReportsProgress = true;
workerEngine.ProgressChanged += new ProgressChangedEventHandler(worker_ProgressChanged);
workerEngine.DoWork += new DoWorkEventHandler(worker_DoWork);
workerEngine.RunWorkerAsync(); // kicks off worker_DoWork on a background thread

And now you're executing this code in a parallel thread:

void worker_DoWork(object sender, DoWorkEventArgs e)
{
    // do some stuff
    // Send a "progress report" to the worker_ProgressChanged method
    ((BackgroundWorker)sender).ReportProgress(0, "I'm still working hard over here!");
}

Which can send "updates" to this method:

void worker_ProgressChanged(object sender, ProgressChangedEventArgs e)
{
    // Runs on the main/GUI thread - e.UserState carries the results
    // sent back from the worker via ReportProgress
}

Which can in turn interact with your GUI. You can do interesting things like populate a listbox on your GUI with updates from the background worker. See, that's one of the interesting challenges for us novice coders trying to get into multi-threaded programming: all of your GUI elements live on one thread, and you cannot manipulate them, or even read anything from them, from another thread. The difficulty is in creating a bridge (or a delegate, if you will) between your threads, which we have now done with the BackgroundWorker object.
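The snippets above are fragments, so here's a self-contained console sketch of the full round trip, with hypothetical names of my own choosing. In a WinForms app, ProgressChanged fires on the GUI thread automatically; a console app has no message loop, so this sketch just waits and collects the updates in a list:

```csharp
using System;
using System.Collections.Generic;
using System.ComponentModel;
using System.Threading;

class WorkerSketch
{
    public static List<string> Updates = new List<string>();

    public static void Run()
    {
        var done = new ManualResetEvent(false);
        var worker = new BackgroundWorker();
        worker.WorkerReportsProgress = true;

        worker.DoWork += (sender, e) =>
        {
            // Runs on a background (thread pool) thread
            for (int i = 1; i <= 3; i++)
            {
                Thread.Sleep(50); // simulate a chunk of work
                ((BackgroundWorker)sender).ReportProgress(i * 33, "chunk " + i + " done");
            }
        };

        worker.ProgressChanged += (sender, e) =>
        {
            // In a WinForms app this runs on the GUI thread,
            // so it would be safe to update controls here
            lock (Updates) Updates.Add(e.ProgressPercentage + "%: " + e.UserState);
        };

        worker.RunWorkerCompleted += (sender, e) => done.Set();

        worker.RunWorkerAsync();
        done.WaitOne();
        Thread.Sleep(200); // no message pump here; let queued progress events drain

        foreach (var u in Updates) Console.WriteLine(u);
    }

    static void Main() { Run(); }
}
```

The same three handlers are all you need in a real GUI; the only difference is that you'd append e.UserState to a listbox instead of a List&lt;string&gt;.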

Now when you run this code, be aware that your worker thread will run as fast as he possibly can, which means he can eat up CPU time:

That's my background worker thread doing nothing but spinning in an infinite loop, causing significant CPU spikes on every core. At least the GUI remains responsive. Now let's stick a System.Threading.Thread.Sleep(100) (milliseconds) in the background thread so that perhaps he'll pace himself:

Much better! Now our background thread isn't hogging up the CPU, and our main GUI thread stays responsive and smooth throughout.

And one last thing - never put Thread.Sleep(x) anywhere on your GUI thread. It will obviously cause your GUI to be jerky and unresponsive while it's sleeping, which makes for an awful user experience.

Oh, and Merry Christmas. Hope Santa brings you what you wanted.

I Built a Desk

I realized that I had been complacently tolerating the crappy, flimsy desk that I bought from Walmart 5 years ago, which I never liked. For one, it had a glass top, which meant it was always filthy. Secondly, it had a sliding keyboard tray, which meant that every time I was playing a video game, for instance, and moving the mouse around trying to aim at people's heads, the whole tray would wobble and squeak. The older it got, the more it wobbled and squeaked...

So I finally got fed up and decided to build my own computer desk! (Click to enlarge.)

It's not much to look at, and my father would probably be ashamed at my lack of woodworking skills, (something he's quite good at,) but it gets the job done! I just went to the nearest Home Depot and had some studs and some 3/4" plywood chopped up, and screwed it all together with beige deck screws. Then I spent two days putting polyurethane on the top. I hate polyurethane. It stinks and it takes forever to dry. And it smells bad, too. I'm... just gonna' go take a nap... right over here... *collapses dead from fume inhalation*

By some miracle, it actually turned out square and level. And it's solid as can be. No more wobbling or squeaking!

Here's the Visio diagram of this desk: desk.vsd (68.00 kb)

Here's the PDF for those of you with no Visio viewer: desk.pdf (100.35 kb)

And lastly, if you're interested in what's in that beastly server cabinet on the left, refer to this post.

The Page File

How tired is this topic? I don't want to be "reheating boring leftovers," as Ned puts it, but maybe this post will help finally put it to bed for anyone still wondering.

Ever since I became an "NT geek" a little over 15 years ago, there has seemingly been a great deal of mystery and intrigue shrouding "the page file," also known as the "swap file."  Windows does not really do "swapping" anymore, so I will refer to it only as a page file from here on out. To this day, in 2011, people commonly ask me what size their page file should be. "Does my page file need to be this big?  What happens if I shrink it?  Can I put it on another logical drive? Can I disable it?"  And naturally, the Web can be a cesspool of misinformation, which only serves to further confuse people and cause them to rely on "rules of thumb" with no real reasoning behind them. To add to the mess, the exact ways in which Windows uses the page file have changed slightly over the years as Windows has evolved and the average amount of RAM in our machines has increased. (*cough* Superfetch *cough*)

I'm focusing on Windows Vista and later here. (Win7, Server 2008, R2, etc.)

Just follow the "Advanced"s to get there.



First I want to clear something up: forget any "rules" you have ever heard about how the page file should be 1.5x or 2.0x or 3.14159x the amount of RAM you have.  Any such formula is basically useless and doesn't scale to modern systems with widely varying amounts of RAM. Get your vestigial 20th-century old wives' tales out of my Windows.

Alright, sorry about that. I'm just really tired of hearing those rules of thumb about page file sizing. The only part about this that sort of embarrasses me is that Windows itself still uses a formula like this if you choose to let Windows manage your page file for you. Older versions of Windows use this formula to choose the page file size for you:

System Memory    Minimum Page File    Maximum Page File
< 1 GB           1.5 * RAM            3 * RAM
>= 1 GB          1 * RAM              3 * RAM

Windows 7 and 2008 R2 set the paging file size to the amount of RAM + 300MB.
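To make those defaults concrete, here's the sizing logic from the table above expressed as a hypothetical helper (sizes in MB; these are my function names for illustration, not any Windows API):

```csharp
using System;

class PageFileDefaults
{
    // Sketch of the automatic sizing rules described above.
    // isWin7OrR2: Windows 7 / Server 2008 R2 behavior (RAM + 300 MB);
    // otherwise, the older table's rules apply.
    public static long DefaultInitialSizeMB(long ramMB, bool isWin7OrR2)
    {
        if (isWin7OrR2)
            return ramMB + 300;          // RAM + 300 MB
        if (ramMB < 1024)
            return (long)(1.5 * ramMB);  // < 1 GB: 1.5 * RAM
        return ramMB;                    // >= 1 GB: 1 * RAM
    }

    public static long MaximumSizeMB(long ramMB)
    {
        return 3 * ramMB;                // older default maximum: 3 * RAM
    }

    static void Main()
    {
        // A 192 GB server on Win7-era rules gets a ~192 GB initial page file
        Console.WriteLine(DefaultInitialSizeMB(192 * 1024, true) + " MB");
    }
}
```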

Furthermore, the page file is dynamic by default and can expand on demand. And for the most part that still works just fine for the average home PC user.  But if you have a server with 192GB of RAM, Windows will by default create for you a nice, fat 192GB page file.  Our "rule of thumb" now looks absurd.

No, you do not need (or want) a 192GB page file gobbling up space on your nice 15k SAS drives.  The only reason you might ever want such a huge page file is if you're interested in generating full crash dumps (full memory dumps) when the machine crashes; that requires a page file the size of RAM plus a couple hundred megabytes. In later versions of Windows you can use a dedicated dump file to store memory dumps, further decreasing the need for a huge page file.  Also, you'll still get minidumps, which can contain useful information about system crashes, even if your page file is too small to support a full memory dump.

Other types of crash dumps are explained here.

The truth is the more RAM you have, the less page file you need - with a couple of stipulations.

The answer to "how big should the page file be?" is "just big enough that you don't run out of committable memory." The amount of memory that your system can commit is equal to your RAM + page file. Put your system under a heavy workload and use Performance Monitor to see how much memory you're committing; you can also use Process Explorer by Mark Russinovich and watch the peak commit value. Just make sure you have enough RAM + page file to support that peak usage at all times. Otherwise your page file will be forced to expand if it can, and if it can't, your application might hang, or crash, or any number of systemic weirdnesses will occur.
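That arithmetic is simple enough to sketch. The helper below is hypothetical (these are not Windows APIs); it just restates the rule that commit limit = RAM + page file, so the page file only needs to cover your observed peak commit plus a little headroom:

```csharp
using System;

class CommitCheckSketch
{
    // The commit limit is (roughly) physical RAM plus the page file.
    public static long CommitLimitMB(long ramMB, long pageFileMB)
    {
        return ramMB + pageFileMB;
    }

    // How much page file covers a peak commit observed in Performance
    // Monitor / Process Explorer, with ~20% headroom (integer math)?
    public static long PageFileNeededMB(long peakCommitMB, long ramMB)
    {
        long withHeadroom = peakCommitMB + peakCommitMB / 5;
        return Math.Max(withHeadroom - ramMB, 0); // plenty of RAM: none needed
    }

    static void Main()
    {
        // e.g. 16 GB of RAM and an observed 20 GB peak commit:
        Console.WriteLine(PageFileNeededMB(20 * 1024, 16 * 1024) + " MB of page file");
    }
}
```

Run the numbers for your own peak workload rather than any multiple of RAM; that's the whole point of sizing by commit rather than by rule of thumb.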

Unfortunately, Superfetch complicates this test because it goes around in the background, pre-loading things into memory from disk all the time, so it'll always make it seem like you're running low on memory. In reality, a lot of that is Superfetch putting your memory to work, and it's actually really good at reducing disk seeks on PCs.  But Superfetch is turned off by default if Windows was installed on an SSD (because seek times are near 0 on an SSD anyway,) and it's also disabled in Windows Server. 

Also keep in mind that certain applications such as MSSQL and LSASS have their own memory management systems that can operate outside the purview of the NT Memory Management system, which can lead to things like those applications hogging up more than their fair share of memory.


Process Explorer by Mark Russinovich

On the other hand, don't disable the page file completely, even if you have 192GB of RAM.  (You should be able to move it off of the system drive with no ill effects, though, unless you're dealing with a very poorly written application.) Windows itself can run perfectly fine with no page file if you have plenty of RAM, but some applications may not. Some third-party applications, and even some Microsoft products like AD DS and SQL Server, may simply assume that a page file exists, and if there isn't one, the result can be unpredictable behavior. (Read: hard-to-troubleshoot headaches.)

On the other other hand, keep in mind that a huge page file (like 192GB) will affect system performance if your system actually has to use it a lot.  A large file sitting on your disk won't hurt performance just by being there, but if the system is constantly pushing memory pages out to that file, it will. (And if you find yourself pushing memory pages to a huge file on disk very often, you obviously needed more RAM a long time ago.)

Lastly - yes, there are some "page file optimization" techniques that still apply, such as striping the page file across multiple spindles or setting a static size to eliminate file-system fragmentation. However, with RAM as inexpensive as it is these days, your main concern should be minimizing having to touch the page file at all.