Saturday, September 11, 2010

Lightning Fast Builds with Visual Studio 2010 and an SSD

I reduced my Visual Studio 2010 C++ build time from 21 minutes to 5 minutes! You can too. Here's how.

I'm a build performance junkie. If there's one thing I really hate in life, it's sitting around waiting for builds to complete. Fifteen years ago, the very first article I published was titled Speeding Up Visual C++. It was all about making Visual C++ 1.51 go faster on what was then a state of the art computer - an ISA bus Gateway DX2/50. Woo hoo! My recommendations were:
  1. Use Precompiled Headers.
  2. Upgrade to 16MB of memory.
  3. Use 4 megabytes of Disk Cache.
  4. Upgrade your Hard Drive to Fast SCSI or Enhanced IDE.
  5. Turn off Browse Info.
  6. 32 Bit File Access.
Today computers are thousands of times faster, but rotating platter disk drives are still desperately slow. The seek time of 7200RPM drives has changed very little in the last ten years, although the transfer rate for sequential files has risen dramatically. That problem, combined with Visual Studio's desire to create hundreds of .tlog temporary files, quarter gigabyte .sdf files, and project size bloat means that the average build may be even slower today than it was fifteen years ago.

Historically, your CPU would sit idle most of the time, waiting for the hard disk to keep up. Linking performance is based almost entirely on your disk's random access read/write performance. The highly rated Western Digital Caviar Black can only perform about 100 random access IOPS (I/O Operations Per Second). It takes multiple I/O operations per OBJ file, so a significant fraction of the link time is spent waiting for the hard drive to do its job.
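To put those IOPS figures in perspective, here's a back-of-envelope estimate of the time a link spends waiting on random disk I/O. The IOPS numbers come from this article; the OBJ count and the I/O-operations-per-OBJ figure are purely illustrative assumptions, not measurements:

```python
# Rough estimate of linker I/O wait time, assuming serialized random I/O.
# drive_iops figures are from the article; the other numbers are made up
# for illustration only.

def io_wait_seconds(num_obj_files, ios_per_obj, drive_iops):
    """Seconds spent waiting for the drive to service random I/O requests."""
    return (num_obj_files * ios_per_obj) / drive_iops

hdd = io_wait_seconds(num_obj_files=2000, ios_per_obj=5, drive_iops=100)    # Caviar Black
ssd = io_wait_seconds(num_obj_files=2000, ios_per_obj=5, drive_iops=40000)  # Vertex 2

print(f"HDD: ~{hdd:.0f} s of I/O wait")
print(f"SSD: ~{ssd:.2f} s of I/O wait")
```

Even with these rough assumptions, the hard drive contributes minutes of pure waiting while the SSD's contribution rounds to zero.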

Enter the latest generation of SSDs driven by the Sandforce Controller, such as the OCZ Vertex 2. These drives can do over 40,000 IOPS - 400 times faster than a Caviar Black. And Visual Studio 2010 build performance is phenomenal. In fact, these drives are so fast that disk I/O becomes a negligible part of the build time. This SSD will easily hit over 50MB/sec of 4KB random writes. In contrast, the ultra-zippy 10,000 RPM VelociRaptor can only do about 3.5MB/sec. (Note that disk striping or mirroring has minimal impact on build performance because the linker isn't smart enough to use scatter/gather to force the queue depth high enough to let the command queuing on the drive work its magic.)

Now that hard disk performance is no longer the bottleneck, the next constraint is the CPU. You can tell your boss you need one of those monster hyper-threaded quad core i7 processors such as the 875k or, for the money-is-no-object crowd, the hyper-threaded six core 980X. Visual Studio 2010 automatically uses parallel builds. My 875k pegs all eight cores simultaneously at 100%. Compiles proceed eight files at a time and the CPUs stay pegged until the compile is finished. I've never seen a project build so fast.

The next bottleneck is probably your RAM. If you have 4GB RAM running on a 32-bit OS, you are severely limited and you probably won't be able to run eight compiles (much less 24 compiles if you are using parallel MSBuild tasks, as I explain in Part 2.) Upgrade to Windows 7 64-bit with 8GB of RAM.

So it's interesting that the types of recommendations for build performance haven't changed much. Here is my updated list:
  1. Use Precompiled Headers.
  2. Upgrade to 8GB of memory.
  3. Upgrade to 64-bit Windows.
  4. Upgrade your Hard Drive to a Sandforce-based SSD.
  5. Turn off Browse Info. (This is still true. Browse Info is different from Intellisense.)
  6. Check your motherboard's SATA implementation. At SSD speeds, not all controllers are created equal.
In Part 2 of this article, I'll talk about how to tune your build system to keep all that hardware busy.

The system used for this article was:
  • ASUS P7P55D-E Pro motherboard.
  • Intel 875k i7 processor.
  • OCZ Vertex 2 120GB SSD.
  • 8GB RAM.
  • Thermaltake MUX-120 heatsink.
Do not use the SATA 3 Marvell controller on the ASUS motherboard. The standard ICH10 controller is much faster with SSDs.

The prior system that took 21 minutes was a Core 2 Duo 2.4 GHz with ASUS P5B Deluxe and Caviar Black 750GB hard drive.

Project is permanently out of date

I've seen several cases with Visual Studio 2010 and later where my project would be permanently "out of date." I'd build the project, run it, and immediately be told that the project was out of date. Solving this through trial and error is tedious at best. It usually comes down to one of two problems:

Missing file in project

The most common cause is a file that's in your project that you've deleted from the disk. These are easy to find in smaller projects by just looking through the project for a file with an X next to it. If you can't find the file causing the problem, enable detailed Diagnostic output in MSBuild. In Visual Studio, open Tools | Options, select Projects and Solutions | Build and Run, and set MSBuild project build output verbosity to Detailed.

Alternatively, here's a handy Python script that will examine a .vcxproj file and look for any missing files:
http://stackoverflow.com/questions/2762930/vs2010-always-thinks-project-is-out-of-date-but-nothing-has-changed
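A sketch in the same spirit as that script is below: it parses a .vcxproj and lists files the project references that no longer exist on disk. The set of item types covered (ClCompile, ClInclude, None, ResourceCompile) is my assumption of the common cases; extend it as needed for your projects:

```python
# List files referenced by a .vcxproj that are missing from disk.
# Entries that use MSBuild macros like $(SolutionDir) are skipped because
# we can't resolve them without running MSBuild.
import os
import sys
import xml.etree.ElementTree as ET

NS = "{http://schemas.microsoft.com/developer/msbuild/2003}"

def find_missing_files(vcxproj_path):
    project_dir = os.path.dirname(os.path.abspath(vcxproj_path))
    missing = []
    root = ET.parse(vcxproj_path).getroot()
    for tag in ("ClCompile", "ClInclude", "None", "ResourceCompile"):
        for item in root.iter(NS + tag):
            rel = item.get("Include")
            if rel is None or "$(" in rel:  # skip macro-based paths
                continue
            if not os.path.exists(os.path.join(project_dir, rel)):
                missing.append(rel)
    return missing

if __name__ == "__main__" and len(sys.argv) > 1:
    for name in find_missing_files(sys.argv[1]):
        print("missing:", name)
```

Run it as `python check_vcxproj.py MyProject.vcxproj`; any file it prints is a candidate for removal from the project.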

I've also seen this problem when I renamed a file. The old filename is still embedded in the compiler's temporary files, even though the name no longer appears in the project. The solution is to delete all intermediate files in the project and rebuild.

Project upgraded from earlier version of Visual Studio

The StackOverflow page above also details a bug where MSBuild says, "Forcing rebuild of all source files due to a change in the command line since the last build."

The most likely cause is if you used /MP in the Additional Options box in project Properties | Configuration Properties | C/C++ | Command Line. This was often done in Visual Studio 2005. My project started building properly after I fixed this (although it was tricky, because I had set /MP1 on certain files due to bugs in multiprocessor support in VC2005.) This tip came from https://social.msdn.microsoft.com/Forums/vstudio/en-US/aca632fa-e1e0-4511-aa03-a309ae547a5b/how-to-see-the-command-line-forcing-rebuild-of-all-source-files-due-to-a-change-in-the-command?forum=msbuild.

The next problem I saw in the MSBuild log was the use of an old PDB filename. I was using Visual Studio 2013, which uses vc120.pdb. However, some of my project files used the name vc80.pdb, which was the wrong name and so the file was never found. Fix the name in the project's property page under Configuration Properties | C/C++ | Output Files | Program Database File Name. (After you set the value for the first time and reopen the page, it will show the value that the compiler will actually use.)
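A toolset-versioned value for that setting looks something like the line below. The $(PlatformToolsetVersion) macro is my recollection of the default; if it doesn't expand on your toolset, hard-coding the matching name (vc120.pdb for Visual Studio 2013) works too:

```
$(IntDir)vc$(PlatformToolsetVersion).pdb
```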


Unhelpful recommendations

After enabling Detailed output for MSBuild, I saw this error. It didn't appear to affect the problem, but it should probably be fixed to prevent confusion in the future:
Project file contains ToolsVersion="4.0". This toolset may be unknown or missing, in which case you may be able to resolve this by installing the appropriate version of MSBuild, or the build may have been forced to a particular ToolsVersion for policy reasons. Treating the project as if it had ToolsVersion="12.0". For more information, please see http://go.microsoft.com/fwlink/?LinkId=293424.

Another suggestion that didn't work was that intermediate directories must use a relative path, but I use an absolute path in my project and it works fine:
https://social.msdn.microsoft.com/Forums/vstudio/en-US/36df2f40-f6d4-4123-8261-88394790f29e/project-always-needs-complete-rebuild-in-vs-2010?forum=vcgeneral

Sunday, September 5, 2010

Fixing Permissions on an External Hard Drive

Today I pulled a hard drive from my old computer and hooked it up to my new computer, planning to move the data to the new drive and then use the old drive as a backup disk. Mostly this plan worked well, except that I wasn't allowed to delete many files and folders, even though I was an Administrator. Curious.

The problem turned out to be that I was on a workgroup, not a domain, and so the systems didn't have any common notion of "Administrator". Although Explorer knows how to request elevated permissions, this still isn't enough. You have to "Take Ownership" of the files in order to delete them, and there's no way to do this from Explorer.

I found a solution in the winmatrix forum, but it only works for files, not directories. You can't set the Full Control permission on a directory, so you end up being locked out of directories if you try to use these commands:

takeown /f filepath /r
icacls filepath /grant yourusername:f /t


(I've added /r and /t to the commands, which are required for them to operate recursively.)

Instead, I did the steps below. These steps assume that the new drive is on E:.
  1. Open a Command Prompt with Run As Administrator.
  2. Run this command: takeown /f e:\ /r
  3. Right-Click on the root of the copied drive.
  4. Select Properties.
  5. Click the Security tab.
  6. Click the Edit button.
  7. Select Authenticated Users.
  8. Click the checkbox under Allow for Full Control.
  9. Run this command: icacls e:\*.* /reset /t
    This command will force all permissions to mirror the permissions on the root of the drive that you set in steps 6-8. You must include the *.* or the root directory's own permissions will be reset too, which you don't want.
After these commands completed, I was able to delete all of the desired files. Executing these commands can take quite a while if you have many files on your disk.
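If you do this often, the command-line half of those steps can be scripted. The sketch below only assembles and prints the command lines for a given drive letter; it deliberately doesn't execute anything, since takeown and icacls must be run yourself from an elevated Command Prompt on Windows:

```python
# Build the ownership/ACL-reset command lines from the steps above for a
# given drive.  Nothing is executed: copy the printed commands into an
# elevated (Run As Administrator) Command Prompt.

def permission_fix_commands(drive_letter):
    root = drive_letter.rstrip(":\\") + ":"
    return [
        # Step 2: take ownership of everything on the drive, recursively.
        f"takeown /f {root}\\ /r",
        # Step 9: reset all permissions to inherit from the root you edited
        # by hand in steps 3-8.  The *.* matters: without it the root's own
        # permissions would be reset as well.
        f"icacls {root}\\*.* /reset /t",
    ]

for cmd in permission_fix_commands("E"):
    print(cmd)
```

Remember that the GUI steps (3 through 8) still have to be done by hand between the two commands, since they set the root permissions that the icacls reset propagates.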

Friday, September 3, 2010

Windows 7 Network Performance

A few years back, after I installed Vista, I spent quite a bit of time trying to fix my GigE Ethernet performance with Windows Vista talking to Windows Server 2003 R2. My file copy performance hovered around 15 to 18 MB/sec, which was pretty dismal.

I've just built myself a new PC with a Core i7 CPU and Windows 7. I tried copying a file from the old Vista box (Core 2 Duo 2.4GHz on ASUS P5B Deluxe mobo) to the new Windows 7 box. The copy performance over the Ethernet went straight up to 50MB/sec! This was with a single WD Caviar Black drive on the Vista system, no RAID or striping.

When I tried copying from a Windows Server 2008 R2 system to the new Windows 7 system, I peaked at 112MB/sec for data that was cached, then backed off to 50 MB/sec for data that wasn't cached. Windows Server 2008 R2 was installed on the same hardware that used to be running Windows Server 2003 R2, so the performance increase was solely due to the OS upgrade.

I'm seeing similar performance gains over the internet. Talking to our corporate server from the Vista system, I maxed out at 1.5 MB/s.  Under Windows 7 with i7, I'm able to max out the connection at 5 MB/s.

All of this leads me to believe that the GigE networking performance of Windows Server 2003 R2 was awful, given that I'm running Windows Server 2008 R2 on the exact same hardware with a 5x performance increase.

Net result: (no pun intended) I'm very happy with Windows 7.