À propos of my last post, my latest creation is finished and needs some beta-testing now. It's called RSSInYourFace, and it's an RSS feed reader. Yes, I realize that there are already dozens of RSS feed readers out there, but this one is mine.

I wrote it so that I could stay current on all my favorite blogs and news sites without actually having to waste time surfing the web. This is particularly useful at work, where you might want to appear as though you're doing something besides surfing the web all day. The gist of the program is that you supply a list of URLs to your favorite feeds, and the program sits quietly in your system tray until something new gets posted to one of them, at which point it alerts you with a balloon popup about the new item. Clicking the balloon as it pops up will launch that website/news item in your default browser.

It was written and tested on Windows 7 64-bit, but should work on 32-bit as well. .NET 4.0 is required.

I would really like some feedback if you would like to use it. Try to break it, make it crash, etc. I'd really like to squash any bugs or inefficiencies that I haven't caught yet. Also feel free to make feature requests - I can almost certainly make it happen. My main concern right now is maintaining support for the myriad different formats of feeds out there. Imagine my dismay when I realized that there wasn't just one standard type of RSS feed. There's never just a standard...

Link to version 1.03 Release Build: RSSInYourFace1.0.3.exe (87.00 kb)
Link to version 1.02 Release Build: RSSInYourFace.1.02.exe (87.00 kb)


.NET Programming Love, Multi-Threadedness, and CPU Usage

I do not want to ever be known as a professional software developer.

It's not because I don't enjoy programming, and certainly not because I don't appreciate the people who do love being professional developers. I do enjoy it. It's fun. I've been writing computer programs since I was drawing multi-colored circles in QuickBasic at 14 years old, and I still find it so fun that I do it as a pastime for no other reason than pure personal pleasure.

It's not because "I'm a server guy, or a hardware guy, or an architecture guy, or a network guy... and so therefore I don't need to write programs."  I know plenty of hardware, network and server architecture guys that have entire life-long careers going and don't care to write a line of code ... sometimes not even a line of script!

I don't want to be a professional developer because I'm deathly afraid that if I were forced to code for a living, it might suck all the fun out of it. It's like playing poker for a living. Sure it's fun to play poker with your buddies on a Saturday night and drink and laugh and play ... but what if you needed to play poker to earn a living? What if your performance in Texas Hold 'Em dictated how well you ate or if you made your rent for the month?  All of a sudden, Texas Hold 'Em isn't so fun any more...

You might hear me reiterate that sentiment again throughout the course of this blog, as (at this point in my life) I feel it is one of the defining sentiments of my career as an IT professional.

So, with all that said, I want to share some programming goodness that I encountered tonight, simply as an enthusiast and an amateur programmer. You are more than welcome to comment on how amateurish and unevolved as a programmer I am. I encourage it actually, as there is no better way for me to learn than by accepting constructive criticism.

I really like .NET. I love it. And what makes that very ironic is that years ago, I used to denounce languages such as Java because of its JVM nature. "Why not program against the machine itself? It would obviously be more efficient," I used to say. Well, years later, here I am professing my love for .NET, a CLR-based platform, which basically makes it the Microsoft equivalent of Java! I get the feeling that Russinovich and his ilk don't have much respect for us CLR/JVM coders because we operate in a managed, sandboxed environment and don't have to worry as much about things like byte-by-byte memory allocation... and god forbid we forget to de-allocate something.

But you know what? .NET gives me time to focus on solving business problems, instead of agonizing over every last little memory leak.  That's what makes .NET the perfect "Systems Engineer's" or "Systems Administrator's" language. I don't care about typecasting every last delegated sub type as an enum... and such as.  (Source: Miss South Carolina)

.NET just lets me focus on doing things and solving problems.

Tonight, I was working on a little program for myself, as opposed to something for the company where I work. Just something that I was personally interested in. And I had only recently really grasped the concept of multi-threading my .NET applications. It's something I had struggled with for a long time, and I'm not ashamed to admit that, because there are enterprise-grade applications that I use at work today, applications our company pays a lot of money for, where the GUI freezes up hard for several minutes while the application downloads data, leaving you staring at a non-responsive window. And now I know that I can do it better than that.

I guess I'll add that to the list of things they should be paying me for, eh?

It's not that creating parallel threads is difficult, least of all in .NET. You can find hundreds of examples out there on Google right now. Knowing when and where to put them in your program, and how to design the rest of your application to work with threads effectively, is the tricky part. It's a design problem, not a technical one.

It's the C# BackgroundWorker that really got me into designing good, multi-threaded GUIs.

BackgroundWorker workerEngine = new BackgroundWorker();
workerEngine.WorkerReportsProgress = true;
workerEngine.ProgressChanged += new ProgressChangedEventHandler(worker_ProgressChanged);
workerEngine.DoWork += new DoWorkEventHandler(worker_DoWork);
workerEngine.RunWorkerAsync(); // nothing happens until you actually start the worker

And now you're executing this code in a parallel thread:

void worker_DoWork(object sender, DoWorkEventArgs e)
{
    // do some stuff
    // Send a "progress report" to the worker_ProgressChanged method
    ((BackgroundWorker)sender).ReportProgress(0, "I'm still working hard over here!");
}

Which can send "updates" to this method:

void worker_ProgressChanged(object sender, ProgressChangedEventArgs e)
{
    // Runs on the main/GUI thread, so it's safe to touch GUI controls here
}

Which can in turn interact with your GUI. You can do interesting things like populate a listbox on your GUI with updates from the background worker. See, that's one of the interesting challenges for us novice coders trying to get into multi-threaded programming... All of your GUI elements exist on one thread, and you cannot manipulate them, or even get any information from them, from another thread. So the difficulty is in creating a bridge (or a delegate, if you will) between your threads, which we have now done with a BackgroundWorker object.
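To make that concrete, here's a minimal sketch of the whole pattern wired into a form. The names here (MainForm, lstUpdates, the fake feed-checking loop) are my own placeholders, not anything from a real app:

```csharp
using System.ComponentModel;
using System.Windows.Forms;

public class MainForm : Form
{
    private readonly ListBox lstUpdates = new ListBox { Dock = DockStyle.Fill };
    private readonly BackgroundWorker workerEngine = new BackgroundWorker();

    public MainForm()
    {
        Controls.Add(lstUpdates);
        workerEngine.WorkerReportsProgress = true;
        workerEngine.DoWork += worker_DoWork;
        workerEngine.ProgressChanged += worker_ProgressChanged;
        workerEngine.RunWorkerAsync(); // spin up the background thread
    }

    void worker_DoWork(object sender, DoWorkEventArgs e)
    {
        // Runs on a thread-pool thread -- never touch GUI controls from here
        for (int i = 1; i <= 5; i++)
        {
            System.Threading.Thread.Sleep(1000); // stand-in for real work
            ((BackgroundWorker)sender).ReportProgress(i * 20, "Checked feed #" + i);
        }
    }

    void worker_ProgressChanged(object sender, ProgressChangedEventArgs e)
    {
        // BackgroundWorker marshals this back to the GUI thread for us,
        // so it's safe to touch controls here
        lstUpdates.Items.Add(e.UserState);
    }
}
```

The magic is that ProgressChanged is raised on the thread that created the worker (the GUI thread), so you never have to write your own Invoke() plumbing.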

Now when you run this code, be aware that your worker thread will run as fast as he possibly can, which means he can eat up CPU time:

That's my background worker thread doing nothing but spinning in an infinite loop, causing significant CPU spikes on every core. At least the GUI remains responsive. Now let's stick a System.Threading.Thread.Sleep(100) (milliseconds) in the background thread so that perhaps he'll pace himself:

Much better! Now our background thread isn't hogging up the CPU, and our main GUI thread stays responsive and smooth throughout.
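The difference between those two runs boils down to whether the loop ever yields the CPU. Here's a self-contained sketch; CheckForNewItems is a hypothetical stand-in for whatever your thread actually does on each pass:

```csharp
using System.Threading;

class PollingLoop
{
    private volatile bool keepRunning = true;

    // Hypothetical placeholder for the thread's real work
    private void CheckForNewItems() { /* ... */ }

    public void Stop() { keepRunning = false; }

    public void RunPaced()
    {
        while (keepRunning)
        {
            CheckForNewItems();
            // Without this line, the loop pegs a CPU core;
            // with it, CPU usage drops to near zero
            Thread.Sleep(100);
        }
    }
}
```

A 100ms nap per pass is imperceptible to the user but gives the scheduler plenty of breathing room.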

And one last thing - never put Thread.Sleep(x) anywhere on your GUI thread. It will obviously cause your GUI to be jerky and unresponsive while it's sleeping, which makes for an awful user experience.

Oh, and Merry Christmas. Hope Santa brings you what you wanted.

I Built a Desk

I realized that I had been complacently tolerating my crappy, flimsy desk that I bought from Walmart 5 years ago, which I never liked. For one, it had a glass top, which meant it was always filthy. Secondly, it had a sliding keyboard tray, which meant that every time I was playing a video game for instance, and moving the mouse around trying to aim at people's heads, the whole tray would be wobbling and squeaking. The older it got, the more it wobbled and squeaked...

So I finally got fed up and decided to build my own computer desk! (Click to enlarge.)

It's not much to look at, and my father would probably be ashamed of my lack of woodworking skills, (something he's quite good at,) but it gets the job done! I just went to the nearest Home Depot, had some studs and some 3/4" plywood chopped up, and screwed it all together with beige deck screws. Then I spent two days putting polyurethane on the top. I hate polyurethane. It stinks and it takes forever to dry. I'm... just gonna' go take a nap... right over here... *collapses dead from fume inhalation*

By some miracle, it actually turned out square and level. And it's solid as can be. No more wobbling or squeaking!

Here's the Visio diagram of this desk: desk.vsd (68.00 kb)

Here's the PDF for those of you with no Visio viewer: desk.pdf (100.35 kb)

And lastly, if you're interested in what's in that beastly server cabinet on the left, refer to this post.

The Page File

How tired is this topic? I don't want to be "reheating boring leftovers," as Ned puts it, but maybe this post will help finally put it to bed for anyone still wondering.

Ever since I became an "NT geek" a little over 15 years ago, there has always seemingly been so much mystery and intrigue shrouding "the page file," also known as the "swap file." Windows does not really do "swapping" anymore, so I will only refer to it as a page file from here on out. Still to this day, in 2011, people commonly ask me what size their page file should be. "Does my page file need to be this big? What happens if I shrink it? Can I put it on another logical drive? Can I disable it?" And naturally, the Web can be a cesspool of misinformation, which only serves to further confuse people and cause them to rely on "rules of thumb" with no real reasoning behind them. To add to this mess, the exact ways in which Windows uses the page file have changed slightly over the years as Windows has evolved and the average amount of RAM in our machines has increased. (*cough* Superfetch *cough*)

I'm focusing on Windows Vista and later here. (Win7, Server 2008, R2, etc.)

Just follow the "Advanced"s to get there.



First I want to clear something up: Forget any "rules" you have ever heard about how the page file should be 1.5x or 2.0x or 3.14159x the amount of RAM you have. Any such formula is basically useless and doesn't scale to modern systems with different amounts of RAM. Get your vestigial 20th century old wives' tales out of my Windows.

Alright, sorry about that. I'm just really tired of hearing those rules of thumb about page file sizing. The only part about this that sort of embarrasses me is that Windows itself still uses a formula like this if you choose to let Windows manage your page file for you. Older versions of Windows use this formula to choose the page file size for you:

System Memory    Minimum Page File    Maximum Page File
-------------    -----------------    -----------------
< 1 GB           1.5 x RAM            3 x RAM
>= 1 GB          1 x RAM              3 x RAM

Windows 7 and 2008 R2 set the paging file size to the amount of RAM + 300MB.

Furthermore, the page file is dynamic by default and can expand on demand. And for the most part that still works just fine for the average home PC user.  But if you have a server with 192GB of RAM, Windows will by default create for you a nice, fat 192GB page file.  Our "rule of thumb" now looks absurd.

No, you do not need (or want) a 192GB page file gobbling up space on your nice 15k SAS drives. The only reason you might ever want such a huge page file is if you're interested in generating full crash dumps (full memory dumps) when the machine crashes; for that, you need a page file that is the size of RAM plus a couple hundred megabytes. In later versions of Windows you can use a dedicated crash dump file to store memory dumps, further decreasing the need for a huge page file. Also, you'll still get minidumps, which can contain useful information about system crashes, even if your page file is too small to support a full memory dump.

Other types of crash dumps are explained here.

The truth is the more RAM you have, the less page file you need - with a couple of stipulations.

The answer to "how big should the page file be?" is "just big enough that you don't run out of committable memory." The amount of memory that your system can commit is equal to your RAM + page file. You can put your system under a heavy workload and use Performance Monitor to see how much memory you're using. You can also use Process Explorer by Mark Russinovich and watch the peak commit value. Just make sure that you have enough RAM + page file to support that peak usage at all times; otherwise your page file will be forced to expand if it can, and if it can't expand, your application might hang, or crash, or any number of systemic weirdnesses will occur.
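If you'd rather script that observation than eyeball Performance Monitor, the same data is exposed through performance counters. Here's a sketch using System.Diagnostics.PerformanceCounter; "Committed Bytes" and "Commit Limit" are the standard counter names in the "Memory" category on an English-language Windows install, and this only works on Windows:

```csharp
using System;
using System.Diagnostics;

class CommitCheck
{
    static void Main()
    {
        // Commit charge = memory promised to processes (backed by RAM + page file)
        // Commit limit  = RAM + current total page file size
        using (var committed = new PerformanceCounter("Memory", "Committed Bytes"))
        using (var limit = new PerformanceCounter("Memory", "Commit Limit"))
        {
            double used = committed.NextValue();
            double max = limit.NextValue();
            Console.WriteLine("Commit charge: {0:N0} MB of {1:N0} MB ({2:P0})",
                used / (1024 * 1024), max / (1024 * 1024), used / max);
        }
    }
}
```

Run that at the peak of your workload and you'll know how close you are to the commit limit, which is exactly the number your page file size should be protecting.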

Unfortunately, Superfetch complicates this test because it goes around in the background, pre-loading things into memory from disk all the time, so it'll always make it seem like you're running low on memory. In reality, a lot of that is Superfetch putting your memory to work, and it's actually really good at reducing disk seeks on PCs.  But Superfetch is turned off by default if Windows was installed on an SSD (because seek times are near 0 on an SSD anyway,) and it's also disabled in Windows Server. 

Also keep in mind that certain applications such as MSSQL and LSASS have their own memory management systems that can operate outside the purview of the NT Memory Management system, which can lead to things like those applications hogging up more than their fair share of memory.


Process Explorer by Mark Russinovich

On the other hand, don't disable the page file completely, even if you have 192GB of RAM. (You should be able to move it off of the system drive with no ill effects, though, unless you're dealing with a very poorly written application.) Windows can run perfectly fine with no page file if you have plenty of RAM. However, some applications may not. Some third-party applications, and even some MSFT applications like AD DS and SQL, may simply assume that there is a page file, and if there isn't one, you can get unpredictable behavior. (Read: hard-to-troubleshoot headaches.)

On the other other hand, keep in mind that a huge page file (like 192GB) will affect system performance if your system actually has to use it a lot. A large file just sitting on your disk doesn't hurt performance by virtue of being there, but constantly pushing memory pages out to it does. (And if you find yourself pushing memory pages to a huge file on disk very often, you obviously needed more RAM a long time ago.)

Lastly - yes, there are some "page file optimization" techniques that still apply, such as striping the page file across multiple spindles, setting a static size to eliminate file system fragmentation, etc. However, with RAM being as inexpensive as it is these days, your main concern should be minimizing having to touch the page file at all anyway.

NT Debugging

I'm not talking about the NT Debugging blog.  This is one of my personal experiences with NT debugging.

A couple weeks ago, I was looking at a Windows VM that was apparently crashing on a somewhat regular basis. Through the usual logfile analysis techniques, we can get some correlations and some probable causes. In this particular case it was plainly evident that the system was working perfectly until some 3rd party software was loaded. Then the regular unexpected shutdowns began, about once every day or two.

The correlation was found through the use of the Reliability and Performance Monitor, which is a very nifty tool: 

We're all familiar with a "stop," or "bugcheck." It produces a memory dump file in %SystemRoot%\MEMORY.DMP, and "minidumps" in %SystemRoot%\Minidump\, unless otherwise configured. It's pretty much, well, a dump of everything that the system had in memory when the offense took place.

But do we really know what to do with an NT memory dump? I have to admit I didn't, and I was a little embarrassed about it. So I set out to figure out what useful information I could really glean from that memory dump. Having that extra bit of tenacity to dig down deep and identify the root of the problem with greater precision, rather than just saying "well, it's some sort of software compatibility problem, better reformat the hard drive!", can help you out in your quest to be the office guru.

Well it turns out there's a nice utility called Windbg.exe. You can get it from the Windows SDK.  To effectively debug an application, you need debugging symbols. Fortunately, Microsoft has provided some public debugging symbols on their own symbol server.  I hear that Microsoft also has a private symbol tree for internal use only, but we'll just have to settle for the public ones.

Here's a KB  that will help you get Windbg and your symbols path set up correctly. 

Now that you have that configured, simply drag the memory dump file into the Windbg window, and it will tell you with much greater certainty exactly what driver and/or interaction caused the BSOD.
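For the record, the typical first few commands after opening a dump look something like this. (`.symfix` points Windbg at Microsoft's public symbol server, `.reload` pulls symbols for the loaded modules, `!analyze -v` runs the automated crash analysis with verbose output, and `lm kv` lists the loaded kernel modules so you can spot anything suspicious.)

```
.symfix
.reload
!analyze -v
lm kv
```

Nine times out of ten, `!analyze -v` alone will point a finger at the offending driver.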

One of the interesting things that Windbg can reveal is that sometimes drivers installed by crashy software still get loaded even after the software has been uninstalled. And if all that machine-code-looking stuff seems scary, Windbg also outputs the simple line "Probably caused by: driver.sys," which can at least give you a lead.

There are also other dump file analyzers, such as WhoCrashed, that may be more to your liking.

And lastly, be careful about sharing your memory dumps, as they might contain password hashes.