Thursday, November 19, 2009

Installing PHP on Windows 7

If you've wrestled with installing PHP on IIS before, there's good news. Installing PHP in Windows 7 is easy. Super easy. Microsoft has created the Web Platform Installer (WPI), which supports installing many things, PHP included.

If you look in the IIS Manager, down at the bottom of the rows of icons you should see Web Platform Installer. If you don't see it there, go to your Start menu and search for "platform".

Check off PHP and let it install. There are a couple of dependencies that will be added automatically.

There's one more change you need to make after PHP is installed, which is to add "index.php" to the list of Default Documents.

There are some other things you should know:
  • WPI correctly installs PHP even on a 64-bit system. There's no need to use cscript to separately change configurations.
  • Apparently, there's still no 64-bit version of PHP for Windows. WPI installs the 32-bit version, which works correctly even under the 64-bit version of IIS.
  • There's no uninstall option. However, everything is installed to a single directory and removing the relevant FastCGI settings is straightforward.
  • The correct strategy for upgrading to new versions of PHP isn't obvious. I'm not sure if Microsoft used the stock build of PHP or if they created a custom one.
One small gotcha - WPI installs PHP 5, which has short open tags disabled by default. If this is your first encounter with PHP 5, make sure you use <?php ... ?> as opposed to <? ... ?>.

Monday, November 9, 2009

Handling Exceptions from STL

Recently I tracked down a crash to an out-of-bounds index in an STL vector. The strange thing was that the crash was being handled by Windows instead of by our error reporting code, which is designed to trap everything that the explicit exception handlers did not catch.

Obviously it wasn't.

I created some short test code:
#include <vector>

std::vector<int> vec;   // empty vector
vec[5] = 3;             // out-of-bounds write on an empty vector


When I ran it under the debugger, I received a helpful window that says "vector out of range" with the standard Abort/Retry/Ignore debugger options. All very well and good, but there was no exception being thrown.

I tried using set_unexpected(), which, it turns out, didn't mean what I thought it meant. The function set_unexpected() installs a handler that is called when a function throws an exception that isn't listed in its throw() exception specification. Visual C++ does not support these exception specifications, and so does not support set_unexpected(). This is discussed in the documentation.

So I tried set_terminate(), which is the only other function under Exception Handling Routines that seemed relevant. Unfortunately, my termination function was never called. Since my two-line program certainly was not catching any exceptions, something else was happening.
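For reference, here's the shape of what I tried. A minimal sketch (MyTerminate is a placeholder name); in theory, std::set_terminate() installs a handler that runs when an exception escapes everything else:

#include <exception>
#include <cstdlib>

void MyTerminate()
{
    // Report the failure here, then die deliberately.
    std::abort();
}

int main()
{
    std::set_terminate(MyTerminate);
    // ...the vector out-of-bounds code from above...
    return 0;
}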

I tried running the code again as a Release build instead of a Debug build. It still broke to the debugger, but in a different place than the Debug build.

I tried reading the documentation, which makes no mention that operator[] can throw an exception.

Stranger and stranger. I tried walking the stack with the Release build, which looked like this:
msvcr90.dll!_crt_debugger_hook
msvcr90.dll!_invalid_parameter
msvcr90.dll!_invalid_parameter_noinfo
Exc.exe!wmain


The function _invalid_parameter() had this comment:
// No user handler is defined. Notify the debugger if attached.
So I tried running the Release build without the debugger, which also yielded no useful information.

I don't remember exactly how I figured out what was happening, but it turns out that STL is throwing "out_of_range", which is derived from "logic_error", which is derived from class "exception". I've been using STL for twelve years. I've written articles about it. And I've never heard of these exception classes. Clearly, the authors of the documentation had never heard of them either.

Note that this behavior is different from other implementations of STL, which specifically document that only the at() function throws an exception, not operator[]. At least "out_of_range" is defined in the standard.

[Update 8/13/2010 Visual Studio 2010 behaves according to the standard. In Release builds, vector<>::operator[] does not do bounds checking and does not throw the out_of_range exception. In Debug builds, an assertion is shown because _ITERATOR_DEBUG_LEVEL is set to 2 in yvals.h if _DEBUG is defined.]

But the fun wasn't over. I still didn't know why set_terminate() wasn't working. The documentation for Unhandled C++ Exceptions only discusses set_terminate().

It turns out that unhandled logic_error exceptions are handled by invalid_parameter_handler and not by terminate_handler. This is broadly discussed in the documentation for Invalid Parameter Handler Routine. However, that documentation describes how errno is set, which doesn't appear to happen with STL.
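So the fix for my error-reporting code was to install an invalid parameter handler as well. Here's a minimal sketch (MyInvalidParameterHandler is my own name for it); note that in Release builds the CRT passes NULL/0 for all of the descriptive arguments:

#include <stdlib.h>
#include <stdio.h>

void MyInvalidParameterHandler(const wchar_t* expression,
                               const wchar_t* function,
                               const wchar_t* file,
                               unsigned int line,
                               uintptr_t /*reserved*/)
{
    // In Debug builds the first four arguments describe the failure;
    // in Release builds they are all NULL/0. Report, then terminate.
    fwprintf(stderr, L"Invalid parameter detected.\n");
    abort();
}

int main()
{
    _set_invalid_parameter_handler(MyInvalidParameterHandler);
    // ...application code...
    return 0;
}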

The situation gets even more murky if you consider Windows SEH (Structured Exception Handling) as well as C++ exceptions. The standard way of dealing with SEH is to use SetUnhandledExceptionFilter(). However, Visual Studio 2008 changes this behavior. See a blog entry by Jochen Kalmbach. Related information about C-Runtime behavior is discussed in the MSDN forums.
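For completeness, the usual SEH safety net looks like the sketch below (MyFilter is a placeholder name). Just be aware, per the links above, that the VS2008 C-Runtime can replace the filter behind your back:

#include <windows.h>

LONG WINAPI MyFilter(EXCEPTION_POINTERS* info)
{
    // Write a minidump or log info->ExceptionRecord->ExceptionCode here.
    return EXCEPTION_EXECUTE_HANDLER;   // let the process exit afterwards
}

int main()
{
    SetUnhandledExceptionFilter(MyFilter);
    // ...application code...
    return 0;
}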

Thursday, October 29, 2009

Router for 50Mbps Broadband Service

In my last blog entry I described how my WRT54G was the bottleneck in my new 50Mbps broadband service. Finding a replacement has ended up being substantially more difficult than I expected.

A point of confusion is that there are numerous different speeds. There's the speed of the wired ports, which is generally 10/100/1000. Then there's the speed of the wireless connection, generally 11/54/108/???. Then you have the WAN to LAN speed, which is the speed at which the router can actually route packets. This has *nothing* to do with the numbers above. Most older routers can't route more than 20 to 25Mbps, at which point they max out and become unresponsive. The routers that seem to be designed to have the best WAN to LAN performance are the "gaming" routers.

Very few routers include the specifications for the WAN to LAN performance. Of the ones that do, the specification is usually taken with important features like SPI (Stateful Packet Inspection) turned off. One Cisco router advertises 800Mbps WAN to LAN performance. What they don't tell you is that performance drops to 20Mbps (a 97% drop!) if you turn on IPS (Intrusion Prevention System).

The only reference I've found is the chart at smallnetbuilder.com. Make sure you select WAN to LAN Throughput. Less than 15% of the routers listed break the 150Mbps barrier. I chose this number because 100Mbps broadband is coming and I want some horsepower left over for features like SPI.

So if we look at the top routers, we learn that almost all of them are over $100 and many of them are over $150. This is a pretty big jump over the $40 routers that litter the bottom end of the list. A careful review of the routers on Amazon and NewEgg shows that many of these routers have serious problems. For example, the D-Link DIR-825 has been out for over a year, yet only 35% of its NewEgg reviewers give it five stars. The somewhat better rated Linksys WRT600N is no longer for sale, and its replacement, the WRT610N, took a 20% performance hit and also gets five stars from just 35% of reviewers.

The D-Link DGL-4500, one of the "gaming" routers I mentioned earlier, has a more respectable 55% of five star ratings, has been out for two years, and costs $150. Personally, I've had several bad experiences with poor D-Link firmware in the past, so this isn't my first choice.

You might think I'm just being picky looking at the number of 5 star ratings, but the LinkSys WRT54GL router is a prime example of doing things right. After 2500 reviews, it has 84% five star ratings, and this router is prized by the hard-to-please hardcore techies.

At this point I'm leaning towards the Netgear WNDR3700. It's only been on the market for a couple of months, but has garnered 72% five-star ratings from the early adopters - impressive in an industry that usually requires a year of firmware updates to get things right. I've had troubles with Netgear in the past, but this whole situation seems to be a matter of choosing the best of a dubious lot.

One router that hasn't shipped yet is the WNR3500L. It's based on open source and is generating a lot of buzz, but you can't get one yet. It's worth watching.

If you try to look at the cheaper units, the ratings become even more disparate. There are many more 1 star ratings for the LinkSys WRT120N than there are 5 star ratings. The Belkin N1 Vision F5D8232-4 suffers from similar ratings. The D-Link DGL-4300 is highly rated, but it runs hot. That makes sense - it's four years old, absolutely ancient technically.

Lastly, if you have Macs in-house, the AirPort Extreme seems to generate universal admiration. The MB053LL/A model scores well in the chart mentioned above, but there are several other models that aren't listed. The AirPort can be used with Windows, but it's much easier to configure with a Mac.

Tuesday, October 27, 2009

Cox Ultimate Broadband - 50Mbps

I dumped my old ISP and signed up for Cox's "Ultimate" broadband package, with speeds up to 55Mbps downstream and 5Mbps upstream. This is the bleeding edge, and getting it working at full speed is tricky.

Amazingly, Cox was able to get the DOCSIS 3.0 modem installed and running on the first try. Given that this technology is new for Cox, I was quite surprised.

So I fired up Speedtest.net and obtained:
22 Mbps downstream
9 Mbps upstream

Astute readers will notice that this is half of what I was expecting downstream and almost twice what I was expecting upstream. Strange.

I'll save you the details of four hours of sleuthing and simply present my results:
  • The Router. The problem is almost entirely caused by my WRT54G router running DD-WRT v24-sp1. The router maxes out at 22 Mbps. If one IP connection is running at that speed, the WRT54G won't even accept wired http connections to the status page. I tried installing Tomato, and that maxed out at 28.5Mbps - a respectable improvement, but far short of what I needed. Discussions on dslreports.com indicate that my results are typical and that the WRT54G is simply too slow to meet my needs.
  • Buffer management. You might think a fast broadband connection would be just like a 100Mbps local Ethernet connection, but it's not. Local Ethernet connections typically have submillisecond latencies. Broadband connections can easily have 200ms latencies, which means that over a megabyte of data can be in flight before an ACK is received (50 Mbps x 0.2 s = 10 Mbits, or about 1.25 MB). This requires a major change in how buffers are managed in the Ethernet stack. I found that tuning for 50Mbps broadband required parameters very similar to those for maximum throughput on Gigabit Ethernet.
  • RWIN and MTU. Windows Vista and Windows 7 automatically tune IP parameters, so I didn't need to adjust the RWIN, MTU, or other parameters. However, Windows XP and earlier users will almost certainly need to hand-tune parameters to make things work properly. Verizon has a Speed Optimizer tool that does this automatically, but Cox does not. Don't use Verizon's tool - Verizon uses an MTU of 1492 and Cox uses 1500.
  • Wireless Connections. If you are using 802.11g (as most people do), your connection maxes out at about 20Mbps. (Maybe 25Mbps if your wireless connection is perfect.) Buying 28Mbps or 50Mbps service is simply a waste of money. You need to upgrade to 802.11n to run at full speed.
  • Testing. 50 Mbps is fairly slow for a LAN, but it's the bleeding edge for consumer WAN technology. Most web sites simply can't serve data that fast. Speedtest.net won't go that fast if it's busy. I can only download at about 10Mbps from my corporate server, which has a 100Mbps connection to a backbone. This is probably a TCP tuning problem on the RedHat server, but it demonstrates that both Windows and Linux have default TCP settings that are not well suited to this configuration.
[Update 2/26/2010] Here is the speed test result from the iMac in the office. This was performed without any tuning of OS X: Speedtest.Net Result

Saturday, October 24, 2009

Porting C++ Applications to Visual Studio 2010

[This blog entry was updated on 4/12/2010 to reflect the final release of Visual Studio 2010.]

I've now spent a couple of days porting my C++ applications to VS2010. Here are the top ten things I've learned.

1. STL
If your app makes significant use of STL, prepare to spend some time making it work in VS2010. The new version of STL has been significantly reworked to take advantage of rvalue references and static asserts. I spent several hours reworking some of my derived classes to update the supported operators.
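For background, rvalue references are the C++0x feature behind move semantics, which the new STL uses heavily. A minimal sketch of the idea (Buffer is a hypothetical class, not code from my projects):

#include <cstddef>

class Buffer
{
public:
    Buffer() : data_(0), size_(0) {}
    // Move constructor: steal the source's storage instead of copying it.
    Buffer(Buffer&& other) : data_(other.data_), size_(other.size_)
    {
        other.data_ = 0;
        other.size_ = 0;
    }
    ~Buffer() { delete[] data_; }
private:
    char*       data_;
    std::size_t size_;
};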

2. Boost
Make sure you get version 1.42 of Boost, which includes support for VS2010. Thankfully, Boost 1.42 has not yet been updated with C++0x features, so porting was relatively straightforward.

3. Property Manager
Earlier versions of Visual Studio used the Property Manager to manage certain settings like optimization and DLL usage. Visual Studio 2010 dramatically expands the use of the Property Manager. If you created your own pages in the property manager in an existing project, you'll find that VS2010 migrates the information to .props files instead of .vsprops. The files use different formats, although they are both XML. Don't forget to check these new files into source control. I had trouble with parameters that were lost when the project was migrated, including Additional Include Directories, Additional Libraries, and Delay Load Libraries. Make sure you compare your VS2005 or VS2008 property pages with the equivalent pages in VS2010.

4. Target Name
VS2010 introduces a new "Target Name" property in the General page of Configuration Properties. I still haven't figured out the rationale, but you need to make sure your Target Name matches the Output File in the Linker or Library pages. (Note that Target Name should not include the extension.) One symptom of a mismatch is that you try to run your application and Visual Studio reports that it can't be found, even though the build succeeded.

5. Default Directories
For the last several releases, the default Include and Library search paths were found under Tools/Options. Now they are on the General page of the project Properties, under "VC++ Directories." This is all managed from the Property Manager, so you can override the default directories for a single solution by replacing Microsoft.Cpp.Win32.User (or Microsoft.Cpp.x64.User) with your own custom page. This is a big win over earlier versions of Visual Studio, where the default directories were a global setting that couldn't be overridden.

6. Platform Toolset
You can use the Visual Studio 2008 compiler/linker under the VS2010 IDE, which allows you to get the improved user experience without having to port your code. However, if you are still stuck on VS2003 or VS2005, this won't help you.

7. Target CLR Version
It's possible to target versions of the CLR other than 4.0, but you can't do it from the IDE. See my related blog post for instructions.

8. Show Inactive Code Blocks
The editor is now much more successful at detecting Inactive Code Blocks. Historically this has been a problem because commands such as "Go to declaration" don't work in an Inactive Code Block.

9. MFC Application Size
The size of all of my MFC-based applications (statically linked) has grown by about 1.5MB. I'm not very happy about this. I double-checked all of my build settings and they are correct. A similar question on the Visual C++ Team Blog was given the answer that "The increased capabilities of MFC have introduced a number of additional interdependencies that end up causing more of MFC to be pulled in when linking a minimal application that links with the static MFC libraries." So this problem may not be solvable.

10. Intellitrace
IntelliTrace is the "historical debugging" tool that records the guts of your application as it runs. Unfortunately, IntelliTrace doesn't work for C++. Bummer. It also requires Visual Studio Ultimate, which isn't available to Microsoft Partners. Double bummer.

11. Range-based "for" loops
One of the features I was particularly looking forward to was support for a compiler-based "foreach" syntax, which makes it much simpler to write loops over STL containers. Unfortunately, Visual Studio 2010 doesn't include this feature because the committee standardized it too late in the Visual Studio beta process. More information can be found in the comments section of this blog entry on the Visual C++ Team Blog.
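To illustrate what's missing: the commented-out loop below is the C++0x range-based syntax that VS2010 rejects, while the lambda-based std::for_each (which VS2010 does support) gets you most of the way there:

#include <cstdio>
#include <vector>
#include <algorithm>

void PrintAll(const std::vector<int>& values)
{
    // Not supported by VS2010 - the C++0x range-based loop:
    // for (int n : values) { printf("%d\n", n); }

    // Supported alternative: std::for_each with a lambda.
    std::for_each(values.begin(), values.end(),
                  [](int n) { printf("%d\n", n); });
}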

Wednesday, October 21, 2009

Visual Studio 2010: Review/First Impressions for C++

I downloaded Visual Studio 2010 Ultimate Beta 2 and installed it on a system with Core 2 Duo 3GHz, Windows 7 RC, and 4GB RAM. Here are my first impressions.

First, the scope of this product is huge. There are so many features, it's like counting the stars in the sky. (If you live in the city and can't see any stars, take a look at the Hubble Deep Field to see what I mean.) Once upon a time, you could classify a Visual Studio user as C#/VB/C++, with maybe some database work thrown in. Now we have XAML, Azure, SOAP, SharePoint, IIS, HTML, XML, and much more. The audience that this product caters to is diverse and far-reaching.

I'll limit my discussion to the C++ features, since I primarily do development in C++, MFC and ATL. The "gold standard" for developing in those technologies was Visual C++ 6.0 (VC6), which ran far better on a Pentium II 350 than Visual Studio 2008 ever did on a Core 2 Duo running ten times faster. VC6 was built by and for C++ development, and it was a pleasure to use. I didn't drop VC6 until I was finally forced to use VS2005 to support Vista.

So here are my first impressions:
  • VS2010 is much, much faster than VS2008. Opening the Server pane is now instant, instead of ten seconds or more of thrashing. Everyone was worried about the performance of the UI with the WPF rewrite. Everything I see related to performance is thumbs up.
  • Multiple core parallel compiles are now the default, instead of a hidden teaser like they were in VS2008.
  • The C++ editor is finally able to parse TODO and HACK tags in comments. The C# editor has been able to do this since VS2005.
  • The Help system works. Finally! The Help system last worked properly in VC6. Since VS2003, the help system has been bloated, insanely slow, and almost random in results that it returned. "Help 3.0" in VS2010 returns answers almost instantly. The help system is implemented as an http server running on the local system. When I search on Windows SDK calls, I actually get the result I want instead of useless Sharepoint, .Net and DDK results. This alone makes VS2010 worth the price of admission.
  • Help pages have been reformatted and are far more legible. However, it's not clear what will happen to the Community Content from VS2008.
  • The UI has some cosmetic changes, but the toolbars are basically unchanged. This is a relief after prior releases rearranged everything.
  • Many user interface stupidities have been fixed, like the resource browser closing every time you opened a resource for editing.
  • The resource editor allows you to put in an alpha-blended mockup for reference when editing dialogs. Cool.
  • The resource editor is still lame. Still no in-place editing of text.
  • The dialog editor has problems with locking up for ten seconds at a time. Hopefully this is just a Beta problem. Also, there is no context menu when right-clicking design elements.
  • Trying to search Help for #pragma still doesn't work.
  • New language constructs, like lambda functions and auto declarations, make functional programming much easier.
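As an example of that last point, here's the sort of code auto and lambdas clean up; a small sketch (FindFirstLarge is my own illustrative name):

#include <vector>
#include <algorithm>

const int* FindFirstLarge(const std::vector<int>& values)
{
    // auto deduces the iterator type; the lambda replaces a
    // hand-written functor class.
    auto it = std::find_if(values.begin(), values.end(),
                           [](int n) { return n > 1000; });
    return (it != values.end()) ? &*it : 0;
}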
I haven't been able to test the size of generated code yet. I'm still trying to get my main projects to compile. There are breaking changes in STL, attributed ATL, and the build system that are causing me some rework.

I'm excited to try:

  • The Parallel Patterns Library (PPL). This will be a huge step forward in making use of multiple cores in C++. I've used the .Net Task Parallel Library in C# and was very impressed - it has some fantastic ideas behind its development. (See the sketch after this list.)
  • The unit testing features, which appear to have been expanded since VS2008.
  • The "Basic" install of Team Foundation Server, which should let mere mortals use TFS without having the overhead of specialized servers.
  • Numerous other goodies that I'm still discovering.
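Here's the flavor of PPL code I'm looking forward to writing. A sketch only, based on the documented Concurrency::parallel_for from <ppl.h> (SquareAll is my own example name):

#include <ppl.h>
#include <vector>

void SquareAll(std::vector<double>& values)
{
    // parallel_for splits the index range across the available cores.
    Concurrency::parallel_for(0u, (unsigned)values.size(), [&](unsigned i)
    {
        values[i] = values[i] * values[i];
    });
}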
The other piece of good news is that VS2010 Beta 2 is supposed to have a Go Live license, so you can ship code that it produces. Since the final release won't be until at least March, this makes it easier to start using new features.

Monday, October 12, 2009

NUnit Unit Testing with C++

I switch back and forth between C++ and C#. When doing C# development, NUnit rules the day for unit testing. Whether I'm doing automated tests from the command line or using the GUI to run selective tests (shown below in a screenshot from SourceForge), NUnit is a pleasure to use.



If you've ever tried to run unit tests for C++, the landscape is much less appealing. C++ does not have a reflection API, nor does it have attributes that are embedded in the executable code, so it's much more tedious in C++ to do all of the housekeeping to initialize the framework and it's much more difficult to integrate external GUI tools. In short, unit testing in C++ is a sub-par experience compared to more modern languages.

But I have good news for you - it's possible to use NUnit with C++. There are two minor caveats:
  1. You must be using Microsoft Visual Studio 2005 or later (sorry MinGW and cygwin users.)
  2. The code to be tested must either be a DLL or a static library. If you need to test code in an .EXE, you should factor that code out into its own static library.
The secret is to place your tests in a separate DLL compiled as C++/CLI. You enable this on the project's General page, immediately under Configuration Properties: under Project Defaults, set Common Language Runtime support to "/clr". Don't use any of the other variants - they won't work for this task.

If you haven't built this kind of project before, a C++/CLI project is a curious hybrid that contains all of the power of C++ as well as much of the power of .Net. (Access to certain features, like LINQ, is not available in C++/CLI.)

With a C++/CLI project, you can use NUnit attributes for classes and member functions, just like in C#. When you run NUnit, your target is your test DLL, which implicitly loads either your static library code or your DLL code.
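Here's a minimal sketch of such a test DLL. The attributes are the C++/CLI spelling of the usual NUnit markup, and NativeAdd stands in for whatever function your native library actually exports:

#using <nunit.framework.dll>

using namespace NUnit::Framework;

int NativeAdd(int a, int b);   // implemented in the native library under test

[TestFixture]
public ref class MathTests
{
public:
    [Test]
    void AddsTwoNumbers()
    {
        Assert::AreEqual(5, NativeAdd(2, 3));
    }
};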

This strategy can also be used with the test environment built into Visual Studio if you have the Professional edition or better.

Friday, September 25, 2009

CPAN error: Recursive dependency detected

Yesterday I updated my RedHat Enterprise system with the latest security fixes, after which my primary Perl scripts stopped working.

I went into the CPAN shell and started with 'install Bundle::CPAN', which yielded the following errors:

Recursive dependency detected:
Test::Harness
=> A/AN/ANDYA/Test-Harness-3.17.tar.gz
=> File::Spec
=> S/SM/SMUELLER/PathTools-3.30.tar.gz
=> Scalar::Util
=> G/GB/GBARR/Scalar-List-Utils-1.21.tar.gz
=> Test::More
=> M/MS/MSCHWERN/Test-Simple-0.94.tar.gz
=> Test::Harness.
Cannot continue.

I found several other references to this problem, none of which provided a solution for me.

I traced the problem back to List::Util. One low-level module required version 1.21 (which was installed), but gave an error that version 1.19 was installed. I traced the problem to the fact that my Perl installation had two separate directories where List::Util was stored:

/usr/lib/perl5/5.8.8/List/Util
/usr/lib/perl5/5.8.8/i386-linux-thread-multi/List/Util

The first directory contained version 1.21, the second directory contained version 1.19.

I solved the problem by removing the multi-thread directories for List and for Scalar:

/usr/lib/perl5/5.8.8/i386-linux-thread-multi/List
/usr/lib/perl5/5.8.8/i386-linux-thread-multi/Scalar

After this, I was able to install Bundle::CPAN.

Monday, August 31, 2009

PathMatchSpec Problems

Today I was debugging a problem in my application where a wildcard name failed to match. I traced the problem to significant issues with the implementation of PathMatchSpec() in the Windows API.

In short, do not expect this function to work like the command interpreter Cmd.exe. This function does not handle many boundary conditions, nor does it properly handle empty extensions.

The simplest example is this command, which correctly finds the Windows directory on all versions of Windows and MS-DOS 6.x:

dir c:\windows.*

However, this API call returns false on Vista:

BOOL b = ::PathMatchSpec("C:\\Windows", "C:\\Windows.*");

Other inconsistencies are shown in the table below. The "right" answer to these scenarios is unclear because MS-DOS did not support long filenames or spaces in filenames. Although various versions of Windows are themselves inconsistent, PathMatchSpec() does not agree with any of them. I would argue that the correct behavior is what Windows Vista does.

Command                        ::PathMatchSpec("C:\\Windows", xxx)   MS-DOS                        Win 9x                        Win NT
dir c:\windows.*               False                                 Displays directory name       Displays directory name       Displays directory name
dir c:\windows.                False                                 Displays directory contents   Displays directory contents   Displays directory contents
dir c:\windows...              False                                 Displays directory name       Displays directory contents   Fails
dir "c:\windows "              False                                 n/a                           Displays directory contents   Fails
  (note the trailing space)
dir "c:\windows ."             False                                 n/a                           Displays directory contents   Fails
  (note the trailing space followed by a period)

And yes, I did actually install MS-DOS to create this chart :-)

Thursday, August 27, 2009

Setting the default Windows SDK

Every time I install a new Windows SDK (or worse, a new copy of Visual Studio), I've gone through the painful process of updating all of the project directories for the Windows SDK include directory, lib directory, etc.

Visual Studio 2005 and 2008 are both smart enough to look in the version of the Windows SDK included with those compilers, but I'd never found a way to change the default version. Until now.

The Windows SDK comes with a utility called the Windows SDK Configuration Tool. You can find it in your Start menu.

When you run this, you can set the default SDK to be whichever version you want. Then Visual Studio will automatically reference that version of the SDK without any need to manually update project directories.

[Update 8/31/2011]
Visual Studio 2010 does not seem to pay attention to the Configuration Tool. Instead, this appears to be set on a project-by-project basis in Configuration Properties/Platform Toolset. After doing so, you may be able to fix some schizophrenic behavior by updating the MSBuild information too. Take a look at the following registry entries. (Thanks to the tip from http://geekswithblogs.net/rob/archive/2010/09/17/integrate-the-windows-sdk-v7.1-with-vs2010.aspx)

HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\MSBuild\ToolsVersions\4.0
  • FrameworkSDKRoot (REG_SZ)
    • $(Registry:HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Microsoft SDKs\Windows\v7.0A@InstallationFolder)
  • SDK35ToolsPath (REG_SZ)
    • $(Registry:HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Microsoft SDKs\Windows\v7.0A\WinSDK-NetFx35Tools-x86@InstallationFolder)
  • SDK40ToolsPath (REG_SZ)
    • $(Registry:HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Microsoft SDKs\Windows\v7.0A\WinSDK-NetFx40Tools-x86@InstallationFolder)

Friday, July 24, 2009

Dangers of Buying Laptop Parts on eBay

I've bought several laptops on eBay. All of them were Thinkpads coming off of lease, and all of them have worked well. My most recent purchase was a Thinkpad T42/p to run Windows 7, which, as I mentioned in an earlier post, has been very successful.

However, a five-year old Thinkpad T40 died on me recently. I bought some parts on eBay to try and fix the problem, and it's been an eye-opening experience.

Part #1 - New motherboard. Included a one-week warranty. The motherboard worked, but intermittently. By the time I figured out what was actually wrong, the warranty had expired. If I were savvier at fixing Thinkpad motherboards, I would have caught the problem earlier, but I ended up getting stuck with a $130 piece of junk.

Part #2 - New display. Billed as being in "Excellent Condition." Actual condition - the display worked, but had significant damage to the bottom of the panel that was visible once you turned the display on. However, these displays are relatively large (not like a DIMM module, for example) and paying the postage to send it back would have cost more than the display was worth. It turns out that PayPal doesn't care about fraudulent sellers - it's always the buyer's fault. Apparently, you can send somebody a 50 pound box of bricks instead of the laptop, and the buyer will still be responsible for paying the return postage in order to make a claim. PayPal refuses to talk to you unless you have the tracking number for the returned item.

Part #3 - Laptop, bought for parts, advertised as not booting. Result: seller lied about just about everything (except the part of the laptop not booting.) The battery wasn't new, it was five years old. The DVD was bad. The hard drive and the memory were both half of the advertised sizes.

All in all, I've had a 100% failure rate on all of the parts I ordered, even though all of these sellers had positive feedback of 99% or better.

A sample size of three isn't very large, but given my 100% success rate buying complete laptops and 100% failure rate buying parts, I'd place a strong warning on buying laptop parts via online auction.

Sunday, May 17, 2009

Why I Custom Build My Computers

For the last fifteen years I've built all of my own computers. The recipe has been straightforward:
  • Intel CPU at the "elbow" of the curve for price/performance.
  • ASUS motherboard with good fan control.
  • Hard drive recommended by StorageReview.com on their Leader Board.
  • Zalman CPU fan.
  • RAM with good reviews on NewEgg.
  • Namebrand power supply. For the last five years it's been a SeaSonic.
  • A fanless video card, usually an XFX nVidia card. (Nothing against ATI, I just know the nVidia quirks inside and out.)
The most important features to me are reliability, fan control, and BIOS control. Reliability because I make my living on these systems, and BIOS control so that I can set the system up without compromise. I don't overclock, but BIOS control can be the critical difference when faced with non-Windows operating environments (which are often used by disaster recovery tools.)

Fan control is a biggy. I really, really hate fan noise. Only in the last few years have manufacturers started to work on this, as computers have become home theater accessories. Fans have been made quieter and have been put under the control of the motherboard to slow them down as the cooling requirements drop.

I've ordered the last couple of new systems from EndPcNoise.com. They follow my "recipe" for everything except the hard drive. They also assemble it, test it, route and tie all of the cables, and add additional sound damping devices as requested. The systems I've bought from them have been the best I've ever owned. (I have no financial interest in this company, I'm just a happy customer.)

Last week I bought a used computer system for our test lab. Most of the lab computers are cast-offs from developers, so most of the lab systems follow my recipe. However, we needed a modern system for 64-bit testing, so I bought a used Gateway FX7026, a mid-range consumer system.

My expectations were appropriate for this system. I expected comparatively louder fan noise, poor documentation and shovelware installed with the operating system. I'm pleased to say that I was right on all of these. Unfortunately, things went downhill from there. I was reminded in no uncertain terms why I don't buy systems from vendors like Dell, HP and Gateway.

The first problem is figuring out what's in the computer. The documentation doesn't really discuss it. As an example of someone who does this right, if you type a system's serial number into the IBM/Lenovo site, you'll be given all of the relevant build information for that particular system. Gateway doesn't give you any of that, so you have to find the components by searching the web or by tearing the system apart and trying to read part numbers.

The second problem is updating drivers. My definition of "simple reinstall" also comes from the IBM/Lenovo Thinkpad. Boot the "Restore CD." Walk away for 90 minutes. Come back, install the IBM System Update utility, let it install the latest required drivers. Install updates for Windows. Done.

Gateway used to have such a utility. They still recommend using it on all driver download pages. But the utility is no longer supported and does not work with any system built since 2004. So you have to manually go through the Downloads page, download each update, extract it, install it, reboot, and move on to the next update. Elapsed time - several hours. And you'd better have a second computer to help you with this because the drivers for the network chip are not built into XP.

I avoided most of these problems by installing Windows 7 RC, which has a remarkable inventory of drivers built in. There were only two red X's in the Device Manager after installation.

After Windows 7 booted, I was surprised to look in Task Manager and see that the network card maxed out at 100Mbps. This was a surprise because I had researched the G33 motherboard before purchasing the system and all models of the G33 include gigabit networking. I'd find out the cause shortly.

Next I tried to install the Windows 7 update that enabled Virtual PC and Windows XP. Except that the update refused to install, complaining that Virtualization extensions weren't supported. I knew that the processor supported them. This was a critical issue and I had read Intel's spec sheets before buying the computer.

I learned that Virtualization extensions required BIOS support. No problem, I'd get the latest BIOS from Intel. I had specifically bought this computer because it had a standard Intel motherboard. Gateway refers to the motherboard model as "Shroedoer Town" but doesn't list the exact model. I eventually found out from a handy article that it's a DG33SXG2. Shrewd readers will note that there is no such motherboard on Intel's site.

And here's the final insult. Gateway took Intel's bottom-of-the-line G33 motherboard - and detuned it with cheaper hardware! Gateway removed the GigE ethernet. They removed the ability to use standard Intel BIOS upgrades. They removed the support for virtualization extensions. And Gateway had the temerity to call it a "mid-level" system. I call it cheap.

So Gateway has now made sure that they will never, ever get more business from me. Poor documentation, poor hardware, poor driver updating - there really isn't a lot more to get wrong.

For me, this is a sad thing to see. The first computer I bought after I graduated college was a Gateway 486. I used the famous cow box as a coffee table in my apartment. The monitor was the very first 15" monitor on the market that was "affordable." There was a lot of innovation in that system and it lasted me for years.

P.S. I never did get the fan controls on the FX7026 to work. That's a standard feature of the G33 motherboard, so I can't tell if Gateway broke that too or if there were other factors at play. I also couldn't find any third party software, including SpeedFan, that supported the G33 fan and temperature controllers. Even Intel's software didn't work because I was running a 64-bit operating system.

Wednesday, April 15, 2009

AppCrash in StackHash_1703

I've wasted an hour solving this one. Here's the symptom. Your application crashes on Windows Vista almost immediately after it starts. If you look at the "Details", you see something like:

Problem signature
Problem Event Name: APPCRASH
Application Name: xxxxx
...
Fault Module Name: StackHash_1703
Fault Module Version: 0.0.0.0

The problem is that there's no such module as "StackHash_1703", so this appears to be some special case in Windows. I tried turning off antivirus and enabling compatibility mode, but the application still would not work.

The problem was that I had DEP (Data Execution Prevention) enabled. For whatever reason, the error message gave me the AppCrash error above instead of the standard message about "DEP has closed the program."

To solve the problem, I added the application to the DEP exclusion list and everything worked again.

If the application that's crashing is video related (such as MovieMaker or Media Player), then your problem is probably caused by an old version of Nero or an old video codec.

Monday, March 9, 2009

Dell 1815dn Printer Review: Stay Far Far Away

I bought a Dell 1815dn multifunction laser printer a while back. I've absolutely hated the thing. Everything it does, it does the hard way. I would write more about how bad this printer is, but I've found someone else who beat me to it:

http://www.yelsew.com/dell-1815dn-review.html

I agree 110% with every single thing in this article. I could go on and on, except none of the language is printable in a family blog.

Sunday, March 1, 2009

Linux vs Windows: Uptime

If you haven't figured out from this blog, I'm a Windows Guy. But I have a dirty little secret: our corporate Internet presence is a Linux server. Recently I was talking to a Windows IIS developer at a medium-sized company and he was amazed that I would entrust our public face to such an "anarchy of developers." I asked him if he had ever administered a LAMP (Linux/Apache/MySQL/PHP) server and he admitted that he hadn't. I said to him that there were two key reasons that I use Linux: stability and maintainability.

Our corporate Linux server (Redhat Enterprise 3) has been rebooted once in the last five years and that was only because the power grid in the hosting facility was being upgraded. The current uptime count is 1,242 days, or almost three and a half years.

The pedants among you will point me to Netcraft's uptime list to show that Windows Servers can be kept up that long too. However, in my experience that's more than a rarity - it's almost a unique exception that requires Herculean efforts. In contrast, our corporate server was simply automatically updated as patches became available. All available patches have been installed (except kernel patches.) No special efforts were required to keep the Linux server running because of updates.

Much of this is possible because Linux does not have the concept of "in use" files. You'll never get an "Access denied" error because someone else has a file open. In practice, this means that shared libraries can be upgraded on the fly without affecting versions of those libraries that are in use by applications. Unix has allowed the replacement of open files for as long as I've used it, over 25 years.

So as you are sitting there thinking yourself smugly superior for your IIS server, take it from another Windows Guy: when your system absolutely, unquestionably has to stay running, Windows should probably not be your first choice.

You'll notice that the NetCraft list I cited earlier includes no LAMP servers. However, until fairly recently, most variants of Unix/Linux could not report more than 497 days of uptime. Thus the presence on the list of FreeBSD but not other variants. See the NetCraft uptime FAQ.

Postscript: Up until March 2007, the Uptime Project tracked who could keep a computer running the longest without rebooting. The winner was an OpenVMS system that had been running for nearly 12 years. Irish Rail allegedly had an OpenVMS machine up for 18 years. OpenVMS is also used by both the US Postal Service and by Amazon.com. Probably completely uninteresting for most of you, but I used VMS for about five years and it has a warm place in my heart.

Friday, February 27, 2009

XBOX Technical Support

I spend part of every day working with customers, sort of "the buck stops here" technical support. So I have more than a little compassion for people who, day after day, must suffer through clueless customers. But Microsoft's XBOX Technical Support has set a new low.

It all started when my brand new XBOX 360 made grinding noises and ate the demo DVD that came with the system. The problem has been thoroughly documented by the press:
http://www.llamma.com/xbox360/news/Xbox-360-Game-Disc-Scratched.htm

A little one in the house is very attached to one of the games on the DVD and failure was not an option.

Scratched DVD. Easy problem, right? No, not in Microsoft land.

First I try to ask the question online in their support system at support.xbox.com. The only choices are for console hardware failures. There's not even a choice for "Other." So I call the 800 number.

After wading through menus for at least ten minutes, I finally reach a human. It takes half an hour to explain that my DVD is scratched. Her grasp of English is shaky at best. She says I have to send my console back for repair. Arguing is futile because she doesn't speak English well enough. So I get a service repair order.

After I hang up, I go to the web site to print the shipping label. I try to go to the support registration system, but it won't let me because the "serial number is already registered." Well duh, she just did that on the phone. But she didn't associate the registration with my Live account.

I fill out a support form on support.xbox.com asking what to do. I use the category "Console will not power on" for lack of a better choice. I get a long form letter, starting with "I know how disappointing it is that you're unable to process a repair online. Please accept our apologies for any inconvenience you have felt regarding this unfortunate matter." The email takes eight paragraphs to say "call the 800 number." I tabled the registration problem and just printed the pre-paid shipping label from the web site.

I package the system as requested. I tape the DVD to the console with a yellow sticky. With my fat red Sharpie pen, I write, "BAD DVD."

A few days later, I get back a new console and a second package with a DVD. Microsoft shipped back the same DVD to me, except they took off the label that said "BAD DVD." There are so many incompetent people involved it's hard to know who to be upset at.

I call back the 800 number again. Every time I ask a question, the support person spends five to ten minutes talking to her supervisor. She wants the serial number for the bad DVD. DVDs don't have serial numbers. She wants some information from the DVD's book. The XBOX Arcade DVD is a demo disc and has no book. It becomes clear that she's apparently never even seen an XBOX 360, much less used one.

After 45 minutes on the phone, she hasn't been able to figure out how to get me a new DVD or how to fix the registration with my Live account. She doesn't even know what a Live account is and tries to get me to use the License Transfer option on the web site. I point out that there's no console associated with my Live account, so I can't have any licenses. She insists that I try anyway, which immediately gives an error message.

She finally gives me an incident number and tells me to call back in an hour. I think that's what's called a "brush off."

I call back two hours later. The new person speaks English quite well and understands it at least better than the last two people. I tell her the problems. She says that she can fix my registration problem by just updating my record. At least she understands the problem and the fix. Her fix doesn't work, but I feel that I've at least obtained validation that I'm asking the right questions.

However, she also gives me the brush off about the DVD. She insists that she can do that for me, but says that "I have to call back tomorrow." At which point she will have gone home and I'll have to start this whole process over with someone else.

So this whole thing should have been one phone call that lasted less than five minutes. Instead, Microsoft's collateral damage is as follows:
  • Support reps on the phone for two hours.
  • Paid to ship console to and from customer.
  • Paid to replace customer's console (which probably had nothing wrong with it.)
  • Paid to ship bad DVD back to customer, in a separate envelope.
  • Public display of complete incompetence.

And the sad thing is, I still don't have a disc that works :-(

Monday, February 23, 2009

Windows 7 Beta on a Thinkpad T42/p

This week I bought an old Thinkpad T42/p off of eBay. IMHO, this system was the pinnacle of the Thinkpad T series, with a great keyboard, a spacious 15" display running 1400x1050, Gigabit Ethernet, and most importantly an ATI Radeon Mobility 9600. This particular system has 1GB of RAM, which I'm told is the minimum required to run Aero. The only real downside to this laptop is that it does not support SATA drives, so you can't put in a 7200rpm SATA drive.

I did a clean install of Windows 7 and I have to say that it runs really well - far better than Windows XP ever did with a similar configuration. Superfetch makes a HUGE difference. I haven't tried to add ReadyBoost yet.

Although the system initially came up with the standard VGA display driver, the Radeon 9600 driver was automatically installed the first time I connected to Windows Update.

The Aero interface did not work at first. I'm not sure exactly what I did to get it working - maybe generating the Windows Experience Index under Control Panel / Performance Information and Tools. I also rebooted a few times. In any case, Aero is now working perfectly.

Here are the numbers reported by the Windows Experience Index:
Processor: 2.0
Memory (RAM): 4.1
Graphics: 2.0
Gaming graphics: 3.6
Hard drive transfer: 2.0


As near as I can tell, all features of the laptop are running properly. There are no missing drivers in the device list. Speedstep works properly. The wireless card works (although I had to choose WPA instead of WPA2 to connect properly to my WRT54G router running DD-WRT.) Display brightness works, volume works, even the ThinkLight. The only thing I installed from Lenovo is the Active Protection for the hard drive.

All in all, I'm quite happy with the T42/p and with Windows 7 experience running on this system.

[Update 5/21/2009] I wiped the system and installed Windows 7 RC today. I confirmed that Aero kicks in AFTER you generate the Windows Experience Index and reboot.

Here are the numbers reported by the Windows Experience Index for the RC. Compared to the Beta, Processor and Hard drive transfer went up and Gaming graphics went down.
Processor: 3.2
Memory (RAM): 4.1
Graphics: 2.0
Gaming graphics: 2.0
Hard drive transfer: 4.3


[Update 10/28/2009] Lenovo has published System Update version 4, which provides a few updates for Windows 7, including the HotKey manager for onscreen feedback of volume, brightness, etc. ActiveProtection is also supported. However, most of the other Thinkpad utilities are still not available for Windows 7.

Monday, January 26, 2009

Downloading symbols for .dmp files

We have an automated system for processing .dmp files received from the field, either our own collection or Microsoft's crash report system. We use cdb (the command line version of WinDbg) to automatically create a text file that contains the stack dump.

The batch file looks like this:

@set _NT_SYMBOL_PATH=SRV*c:\cache*http://msdl.microsoft.com/download/symbols
Cdb -lines -c "!analyze -v;q" -z %1

(Thanks to John Robbins for showing me how to use cdb in this manner.)

In theory, setting _NT_SYMBOL_PATH should provide cdb with enough information to automatically download symbols as needed. However, I wasn't seeing that happen. Without symbols, the debugger can't properly process callstacks that use FPO (Frame Pointer Omission), which means that the callstacks were often missing a lot of information.

Today I found a workaround. The symchk utility will examine a dmp file and verify that all of the required pdb and dbg files have been downloaded. Here is an example:

symchk /id Demo_000000.dmp /s SRV*c:\cache*http://msdl.microsoft.com/download/symbols

Friday, January 16, 2009

List Control (CListCtrl) beeping

For several months now I've been having problems with an MFC application where it would beep every time I changed selections in a CListView. I figured I was doing something wrong, but today I finally decided to track it down.

Thanks to VistaHeads for the answer.

Run Regedit and then delete the default value for
HKEY_CURRENT_USER\AppEvents\Schemes\Apps\.Default\CCSelect\.current

Note that the offending value is blank, so it looks like you aren't doing anything by deleting the value, but go ahead and delete it anyway. (Delete the value, not the key.)

The other amusing thing I learned about this is that a worker thread is created to play the beep sound and the worker thread is created with the priority Time Critical. There are a lot of things that might be time critical, but beeping on a List View change is certainly not one of them. (Yeah, I know, interrupts, blah blah, buffer management, blah, blah. It's still silly.)

Monday, January 12, 2009

Unicode BOM Handling in the C Run-time Library

Visual Studio 2005 and later include Unicode BOM (Byte-Order Mark) support, but I found the documentation somewhat lacking. Here are a few hints.

One of the primary points of confusion for me was what you were actually defining by setting the encoding. The answer is that you are providing a hint as to the encoding of the file (but only a hint: if the file has a BOM, the BOM takes precedence).

All calls you make to read or write the file must be with Unicode APIs. If you try to use an ANSI API, the C-Runtime library (CRT) will assert. This means that the CRT will do character set conversion between Unicode and the file's encoding, but won't do character set conversion between the local code page and the file's encoding. For example, you'll get an assertion if you open the file with ccs=utf-8 and then try to use fgets to read the data.
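For example, here's a sketch of reading such a file with the wide-character APIs (data.txt is a made-up name); fgetws works where fgets would trip the assertion:

#include <stdio.h>
#include <wchar.h>

int main()
{
    // The ccs flag is the encoding hint; a BOM in the file overrides it.
    FILE* f = fopen("data.txt", "rt, ccs=UTF-8");
    if (f)
    {
        wchar_t line[256];
        while (fgetws(line, 256, f))   // Unicode API - fgets would assert
        {
            // ...process the wide-character line...
        }
        fclose(f);
    }
    return 0;
}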

Other points:
  • The CRT will not perform any BOM handling if you do not specify a ccs= encoding. This means that backwards compatibility is retained because the CRT does not perform any behind-the-scenes processing if you don't ask it to.
  • Most BOM formats are not supported. For example, UTF-7, UTF-32, and especially UTF-16 big-endian are not handled.
  • If you specify a ccs= encoding, then the BOM will be automatically removed from the data stream. However, you need to be careful with file positioning calls such as fseek and rewind, because the BOM is only skipped when the file is first opened. For example, if you do fopen, fread, rewind, fread, then the second fread will read the BOM and the first fread will not (see the sketch after this list).
  • The file encoding is respected when writing, so the number of bytes actually written may be smaller or larger than the size of the buffer you wrote.
  • If you open the file in binary mode, then any ccs= specification will be ignored and no BOM handling will be performed.
  • Apparently the CRT does not provide a documented way to determine the encoding of the file.
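Here's a sketch of that rewind gotcha, assuming test.txt is a UTF-8 file that starts with a BOM:

#include <stdio.h>
#include <wchar.h>

int main()
{
    FILE* f = fopen("test.txt", "rt, ccs=UTF-8");
    if (!f)
        return 1;
    wint_t first = fgetwc(f);   // BOM skipped at open: first real character
    rewind(f);                  // repositions to byte 0, before the BOM
    wint_t again = fgetwc(f);   // now reads the BOM (0xFEFF) instead
    fwprintf(stdout, L"%x then %x\n", first, again);
    fclose(f);
    return 0;
}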

Thursday, January 1, 2009

How Not To Run A Web Site

Last night, New Years Eve, I received the following error from Evite:

Sorry! Due to planned maintenance, certain pages may be unavailable. We should be back up within 30 minutes, so please check back shortly.

Thanks for your patience!

As it turned out, "certain pages" included both the home page and the party I was attending.

What kind of complete idiot schedules "maintenance" on the biggest party night of the year for a web site that manages party invitations?

I could understand system overload, but scheduled maintenance?!?

So, with just six hours left in the year, Evite easily won my award for Best Example of How Not To Run A Web Site.