Friday, February 23, 2007

Virtual Server 2007 versus VMware Server

With the advent of the Core 2 Duo processors, it's finally possible to build a server for virtual machines without breaking the bank. I have three computers in my office that run various versions of Exchange Server for testing. Most of the time, these boxes are little more than space heaters, so I was looking forward to consolidating everything onto a single box.

I started by installing Windows Server 2003 x64, which is required for Exchange 2007. My plan was to run Exchange 2007 on the host x64 operating system and the older versions of Exchange in virtual machines.

The Microsoft solution for creating virtual servers is Microsoft Virtual Server 2005 R2. I had tried installing the pre-R2 version and found it difficult to install and even more difficult to use. The documentation, while extensive, had no Getting Started section that made any sense. My experience installing the R2 version was only slightly better, and I spent several hours figuring out how to do things that should have been obvious.

The next step was to convert my physical servers into virtual servers (P2V). I spent almost a day researching how to do this and finally gave up. Microsoft's tools are immature and very difficult to use. The P2V conversion process relies on having a Domain Controller, Microsoft Operations Manager (the 32-bit version), and other requirements, none of which I had the time or inclination to set up. I uninstalled Virtual Server 2005 R2.

The alternative was VMware's product, VMware Server, which just recently became free. I found this software to be very easy to use. I installed it, opened the desktop icon labeled "VMware Server Console," and was able to immediately get started. This was a stark contrast to my experience with Microsoft's product. (Note that I've used Virtual PC since before Microsoft purchased it, so I have no shortage of experience with the technology that Microsoft Virtual Server is based on.)

Converting my physical servers to virtual servers was equally easy. I downloaded VMware Converter and followed the simple wizard. Once the conversion process was finished, I went back to the VMware Server Console, added the new virtual machine, booted it, and it worked. The only caveat is that you have to power down the old physical computer before booting the virtual machine, or you end up with duplicate computer names on the network. Also, if you've set your DHCP server to hand out a particular IP address to the MAC address of your old server, you need to update your DHCP server, since the virtual machine gets a new MAC address.

It's also possible to run the VMware console on other computers by downloading the client installer from the VMware downloads page. This works well when you are managing multiple servers, but the sluggish screen updates over the LAN can be annoying. I use Remote Desktop when I need to work with one of the virtual servers for any length of time.

I'm really impressed with VMware Server. It works well, and there's an impressive upgrade path if I ever need better manageability than I have now.

Sunday, February 18, 2007

ReadyBoost Redux

If you've followed my blog, you'll know that my experience with ReadyBoost has been painful. The most recent problem is that the Apacer HT203 thumb drive isn't recognized as a USB 2.0 drive. Whenever I plug it in, Vista (or Windows XP on another system) warns that the drive "would perform better if plugged into a high speed port." Since Windows thought the device was USB 1.1, ReadyBoost wouldn't work at all. In a sense this fixed the hanging problem (with ReadyBoost disabled, the system never hung), but not in the way I intended.

Previously, I had written to Apacer to get help with the system hangs. Apacer technical support did finally respond. They hadn't heard of my problem before (predictably), but they gave me a utility to low-level format the drive. I tried it, but it didn't help. (Note: it costs about $15 in postage, round trip, to send a thumb drive to Apacer for repairs. My recommendation: if the drive is 1GB or less, throw it away and buy a new one.)

Tonight I was tinkering with the drive again and, just for chuckles, tried to format it in Windows. At that point I realized that Windows had previously formatted the drive as FAT instead of FAT32. I reformatted the device as FAT32, unplugged it, plugged it back in, and suddenly it worked again as a USB 2.0 device!

Unfortunately the device failed again after a few hours, so I'm assuming it's a bad thumb drive and I'll replace it.

Thursday, February 15, 2007

Fixing "Minimize CRT Use in ATL"

One of our products uses an ATL-based DLL as part of its installation. Under Visual C++ 6, the component was 39KB. When I rebuilt the DLL under VC++ 2005, the size jumped to 108KB, a rather large increase for no additional functionality.

The "General" property sheet contains an item, "Minimize CRT Use in ATL", which seemed to be exactly what I needed to solve the problem. However, getting it to work turned out to be a significant effort.

This problem had been discussed in several other places with no solution. Examples include the following sites. The last site concerned me the most, because Microsoft closed a bug filed against the problem saying "Won't Fix."

The problem can be reproduced very easily:

1. Create an ATL DLL project using the VC++ 2005 wizard.
2. Switch the project to the Release configuration.
3. Set "Use of ATL" to "Static Link to ATL" and set "Minimize CRT Use in ATL" to "Yes".
4. Build the project.

You'll get the following errors:
LIBCMT.lib(tidtable.obj) : error LNK2005: __encode_pointer already defined in atlmincrt.lib(atlinit.obj)
LIBCMT.lib(tidtable.obj) : error LNK2005: __encoded_null already defined in atlmincrt.lib(atlinit.obj)
LIBCMT.lib(tidtable.obj) : error LNK2005: __decode_pointer already defined in atlmincrt.lib(atlinit.obj)
LIBCMT.lib(crt0dat.obj) : error LNK2005: __get_osplatform already defined in atlmincrt.lib(atlinit.obj)
LIBCMT.lib(crt0dat.obj) : error LNK2005: __osplatform already defined in atlmincrt.lib(atlinit.obj)

I first tried ignoring LIBCMT by adding it to "Ignore Specific Library" in the property sheet. This produced another set of errors, including:
atlmincrt.lib(atlloadcfg.obj) : error LNK2001: unresolved external symbol ___security_cookie
atls.lib(atlbase.obj) : error LNK2001: unresolved external symbol __except_handler4

I next tried excluding both ATLS and LIBCMT. This didn't work either. Some of the posts mentioned above tried excluding atlmincrt.lib, but since this appears to be the key library that makes everything work, I didn't consider excluding it a viable solution.

After several hours of trial and error, I determined that disabling C++ exception handling allows the DLL to link successfully. In the project properties, go to C/C++ | Code Generation | Enable C++ Exceptions and set it to "No".
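With exceptions disabled, errors can no longer propagate by throwing, so they have to travel through return values, which is how classic COM/ATL code reports failure anyway. As a rough illustration (my own sketch, not code from the original DLL; ParsePort is a hypothetical name, and the HRESULT typedef is a portable stand-in for the one in windows.h):

```cpp
// Portable stand-ins for the <windows.h> definitions, so the sketch
// compiles anywhere. (Hypothetical example, not from the original DLL.)
typedef long HRESULT;
const HRESULT S_OK         = 0L;
const HRESULT E_INVALIDARG = 0x80070057L;

// With "Enable C++ Exceptions" set to No, failure is reported through
// the return value instead of a throw.
HRESULT ParsePort(const char* s, int* result)
{
    if (s == 0 || *s == '\0' || result == 0)
        return E_INVALIDARG;
    int value = 0;
    for (; *s != '\0'; ++s) {
        if (*s < '0' || *s > '9')
            return E_INVALIDARG;        // no throw; just an error code
        value = value * 10 + (*s - '0');
    }
    *result = value;
    return S_OK;
}
```

A caller checks the result instead of wrapping the call in try/catch: `if (ParsePort("8080", &port) == S_OK) ...`.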

At this point I was successful: my component was now 50KB. Still not ideal, but much better than the 108KB I started with.

There is one important caveat to this approach: if you don't let ATL use the C run-time, you can't use it either. Any call to a function such as atoi or printf, or any other C run-time function, will bring back all of the original link errors, such as "error LNK2005: __encode_pointer already defined in atlmincrt.lib(atlinit.obj)". Many C run-time functions have Windows equivalents, such as lstrcpy instead of strcpy, so it's possible to avoid the C run-time entirely, but it requires discipline.
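For example (a sketch of my own, not code from the original project; CopyStringNoCrt is a hypothetical name), a strcpy call can be replaced either with kernel32's lstrcpyn or with a few lines of hand-rolled copying that make no CRT calls:

```cpp
// Hand-rolled, CRT-free replacement for strcpy/strncpy. On Windows the
// kernel32 export lstrcpyn does the same job without touching LIBCMT.
// (Hypothetical helper shown for illustration; not from the original DLL.)
char* CopyStringNoCrt(char* dst, const char* src, int dstSize)
{
    int i = 0;
    for (; i < dstSize - 1 && src[i] != '\0'; ++i)
        dst[i] = src[i];
    dst[i] = '\0';  // always terminate, even when truncating
    return dst;
}
```

The same pattern covers most of the small CRT functions; the discipline is in remembering to write (or find) a replacement every time instead of reaching for the familiar call.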

If your project is generating the "__encode_pointer" link errors and you can't figure out which function call is causing the problem, temporarily add LIBCMT to the "Ignore Specific Library" list and build the project. Look for unresolved symbols in your own object files, and ignore unresolved symbols in atls.lib, stdafx.obj, and atlmincrt.lib.

Monday, February 12, 2007

Ms. Dewey Search Engine

Craziest twist on a search engine I've ever seen. Just try it.
(Requires Flash)

Update: Sadly, Ms. Dewey was shut down in early 2009. The Internet shall be a more somber place without her.