Thursday, October 29, 2009

Router for 50Mbps Broadband Service

In my last blog entry I described how my WRT54G was the bottleneck in my new 50Mbps broadband service. Finding a replacement has ended up being substantially more difficult than I expected.

A point of confusion is that there are numerous different speeds. There's the speed of the wired ports, which is generally 10/100/1000. Then there's the speed of the wireless connection, generally 11/54/108/???. Then you have the WAN to LAN speed, which is the speed at which the router can actually route packets. This has *nothing* to do with the numbers above. Most older routers can't route more than 20 to 25Mbps, at which point they max out and become unresponsive. The routers that seem to be designed to have the best WAN to LAN performance are the "gaming" routers.

Very few routers include specifications for WAN to LAN performance. Of those that do, the specification is usually measured with important features like SPI (Stateful Packet Inspection) turned off. One Cisco router advertises 800Mbps WAN to LAN performance. What they don't tell you is that performance drops to 20Mbps (a 97% drop!) if you turn on IPS (Intrusion Prevention System).

The only reference I've found is the chart at smallnetbuilder.com. Make sure you select WAN to LAN Throughput. Less than 15% of the routers listed break the 150Mbps barrier. I chose this number because 100Mbps broadband is coming and I want some horsepower left over for features like SPI.

So if we look at the top routers, we learn that almost all of them are over $100 and many are over $150. This is a pretty big jump over the $40 routers that litter the bottom end of the list. A careful review of the routers on Amazon and NewEgg shows that many of them have serious problems. For example, the D-Link DIR-825 has been out for over a year, yet only 35% of its NewEgg reviewers give it five stars. The somewhat better rated Linksys WRT600N is no longer for sale, and its replacement, the WRT610N, took a 20% performance hit and also sits at just 35% five-star ratings.

The D-Link DGL-4500, one of the "gaming" routers I mentioned earlier, has a more respectable 55% of five star ratings, has been out for two years, and costs $150. Personally, I've had several bad experiences with poor D-Link firmware in the past, so this isn't my first choice.

You might think I'm just being picky by looking at the percentage of five-star ratings, but the Linksys WRT54GL router is a prime example of doing things right. After 2500 reviews, it has 84% five-star ratings, and this router is prized by the hard-to-please hardcore techies.

At this point I'm leaning towards the Netgear WNDR3700. It's only been on the market for a couple of months, but has garnered 72% five-star ratings from the early adopters - impressive in an industry that usually requires a year of firmware updates to get things right. I've had troubles with Netgear in the past, but this whole situation seems to be a matter of choosing the best of a dubious lot.

One router that hasn't shipped yet is the WNR3500L. It's based on open source and is generating a lot of buzz, but you can't get one yet. It's worth watching.

If you look at the cheaper units, the ratings become even more disparate. There are many more one-star ratings for the Linksys WRT120N than there are five-star ratings. The Belkin N1 Vision F5D8232-4 suffers from similar ratings. The D-Link DGL-4300 is highly rated, but it runs hot. That makes sense - it's four years old, which is absolutely ancient technologically.

Lastly, if you have Macs in-house, the AirPort Extreme seems to generate universal admiration. The MB053LL/A model scores well in the chart mentioned above, but there are several other models that aren't listed. The AirPort can be used with Windows, but it's much easier to configure with a Mac.

Tuesday, October 27, 2009

Cox Ultimate Broadband - 50Mbps

I dumped my old ISP and signed up for Cox's "Ultimate" broadband package, with speeds up to 55Mbps downstream and 5Mbps upstream. This is the bleeding edge, and getting it working at full speed is tricky.

Amazingly, Cox was able to get the DOCSIS 3.0 modem installed and running on the first try. Given that this technology is new for Cox, I was quite surprised.

So I fired up Speedtest.net and obtained:
22 Mbps downstream
9 Mbps upstream

Astute readers will notice that this is half of what I was expecting downstream and almost twice what I was expecting upstream. Strange.

I'll save you the details of four hours of sleuthing and simply present my results:
  • The Router. The problem is almost entirely caused by my WRT54G router running DD-WRT v24-sp1. The router maxes out at 22 Mbps. If one IP connection is running at that speed, the WRT54G won't even accept wired HTTP connections to the status page. I tried installing TOMATO, and that maxed out at 28.5Mbps. A respectable improvement, but far short of what I needed. Discussions on dslreports.com indicate that my results are typical and that the WRT54G is simply too slow to meet my needs.
  • Buffer management. You might think a fast broadband connection would be just like a 100Mbps local Ethernet connection, but it's not. Local Ethernet connections typically have submillisecond latencies. Broadband connections can easily have 200ms latencies, which means that over a megabyte of data can be transmitted before an ACK is received (see the quick calculation after this list). This requires a major change in how buffers are managed in the network stack. I found that tuning for 50Mbps broadband required parameters very similar to those used for maximum throughput on Gigabit Ethernet.
  • RWIN and MTU. Windows Vista and Windows 7 automatically tune IP parameters, so I didn't need to adjust the RWIN, MTU, or other parameters. However, Windows XP and earlier users will almost certainly need to hand tune parameters to make things work properly. Verizon has a Speed Optimizer tool that does this automatically, but Cox does not. Don't use Verizon's tool - Verizon uses an MTU of 1492 and Cox uses 1500.
  • Wireless Connections. If you are using 802.11g (as most people do), your wireless connection maxes out at about 20Mbps. (Maybe 25Mbps if your signal is perfect.) Buying 28Mbps or 50Mbps service and then connecting over 802.11g is simply a waste of money. You need to upgrade to 802.11n to run at full speed.
  • Testing. 50 Mbps is fairly slow for a LAN, but it's the bleeding edge for consumer WAN technology. Most web sites simply can't serve data that fast. Speedtest.net won't go that fast if it's busy. I can only download at about 10Mbps from my corporate server, which has a 100Mbps connection to a backbone. This is probably a TCP tuning problem on the RedHat server, but it demonstrates that both Windows and Linux have default TCP configurations that are not well suited to this kind of connection.
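
To put a rough number on the buffer management point above, here is the back-of-the-envelope calculation, using the same round figures (a 50Mbps link with 200ms of latency):

    #include <iostream>

    int main()
    {
        // Bandwidth-delay product: how much data is "in flight" on the
        // wire before the first ACK makes it back to the sender.
        const double bandwidth_bps = 50e6;   // 50 Mbps broadband link
        const double round_trip_s  = 0.200;  // 200 ms round-trip latency

        const double in_flight_bytes = bandwidth_bps * round_trip_s / 8.0;
        std::cout << in_flight_bytes / 1e6 << " MB in flight\n";             // ~1.25 MB

        // The same link speed on a LAN with ~1 ms latency:
        std::cout << bandwidth_bps * 0.001 / 8.0 / 1e3 << " KB in flight\n"; // ~6.25 KB
        return 0;
    }

The receive window has to cover that in-flight data, which is why settings that are perfectly adequate on a sub-millisecond LAN fall over on a fast, high-latency WAN link.
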
[Update 2/26/2010] Here is the speed test result from the iMac in the office. This was performed without any tuning of OS X: Speedtest.Net Result

Saturday, October 24, 2009

Porting C++ Applications to Visual Studio 2010

[This blog entry was updated on 4/12/2010 to reflect the final release of Visual Studio 2010.]

I've now spent a couple of days porting my C++ applications to VS2010. Here are the top things I've learned.

1. STL
If your app makes significant use of STL, prepare to spend some time making it work in VS2010. The new version of STL has been significantly reworked to take advantage of rvalue references and static asserts. I spent several hours reworking some of my derived classes to update the supported operators.
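
To give a flavor of the kind of rework involved - the class below is hypothetical, for illustration only, not one of mine - the reworked STL makes it worthwhile to add move operations alongside the copy operations a class already had:

    #include <string>
    #include <utility>
    #include <vector>

    // Hypothetical wrapper around an STL container. The copy operations
    // are what the VS2005/VS2008 version already had; the move operations
    // use the new rvalue references and avoid copying the vector.
    class NameList
    {
    public:
        NameList() {}

        NameList(const NameList& other) : m_names(other.m_names) {}
        NameList& operator=(const NameList& other)
        {
            m_names = other.m_names;
            return *this;
        }

        // New for the VS2010 port: move constructor and move assignment.
        NameList(NameList&& other) : m_names(std::move(other.m_names)) {}
        NameList& operator=(NameList&& other)
        {
            if (this != &other)
                m_names = std::move(other.m_names);
            return *this;
        }

        void Add(const std::string& name) { m_names.push_back(name); }

    private:
        std::vector<std::string> m_names;
    };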

2. Boost
Make sure you get version 1.42 of Boost, which includes support for VS2010. Thankfully, Boost 1.42 has not yet been updated with C++0x features, so porting was relatively straightforward.

3. Property Manager
Earlier versions of Visual Studio used the Property Manager to manage certain settings like optimization and DLL usage. Visual Studio 2010 dramatically expands the use of the Property Manager. If you created your own pages in the property manager in an existing project, you'll find that VS2010 migrates the information to .props files instead of .vsprops. The files use different formats, although they are both XML. Don't forget to check these new files into source control. I had trouble with parameters that were lost when the project was migrated, including Additional Include Directories, Additional Libraries, and Delay Load Libraries. Make sure you compare your VS2005 or VS2008 property pages with the equivalent pages in VS2010.

4. Target Name
VS2010 introduces a new "Target Name" property in the General page of Configuration Properties. I still haven't figured out the rationale, but you need to make sure your Target Name matches the Output File in the Linker or Librarian pages. (Note that Target Name should not include the extension.) One symptom of a mismatch is that the build succeeds, but when you try to run your application, the executable is not found.

5. Default Directories
For the last several releases, the default Include and Library search paths were found under Tools/Options. Now they are on the General page of the project Properties, under "VC++ Directories." This is all managed from the Property Manager, so you can override the default directories for a single solution by replacing Microsoft.Cpp.Win32.User (or Microsoft.Cpp.x64.User) with your own custom page. This is a big win over earlier versions of Visual Studio, where the default directories were a global setting that couldn't be overridden.

6. Platform Toolset
You can use the Visual Studio 2008 compiler/linker under the VS2010 IDE, which allows you to get the improved user experience without having to port your code. However, if you are still stuck on VS2003 or VS2005, this won't help you.

7. Target CLR Version
It's possible to target versions of the CLR other than 4.0, but you can't do it from the IDE. See my related blog post for instructions.

8. Show Inactive Code Blocks
The editor is now much more successful at detecting Inactive Code Blocks. Historically this has been a problem because commands such as "Go to declaration" don't work in an Inactive Code Block.
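
For anyone who hasn't run into the term, an inactive code block is just a region the preprocessor excludes under the current configuration. A trivial example (ENABLE_TRACING is a made-up macro):

    #include <windows.h>

    void LogStartup()
    {
    #ifdef ENABLE_TRACING
        // If ENABLE_TRACING isn't defined in the active configuration, the
        // editor grays this block out as "inactive", and in older versions
        // commands like "Go to declaration" didn't work on anything inside it.
        OutputDebugStringA("Application starting\n");
    #endif
    }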

9. MFC Application Size
The size of all of my MFC-based applications (statically linked) has grown by about 1.5MB. I'm not very happy about this. I double-checked all of my build settings and they are correct. A similar question on the Visual C++ Team Blog was given the answer that "The increased capabilities of MFC have introduced a number of additional interdependencies that end up causing more of MFC to be pulled in when linking a minimal application that links with the static MFC libraries." So this problem may not be solvable.

10. Intellitrace
IntelliTrace is the "historical debugging" tool that records the guts of your application as it runs. Unfortunately, IntelliTrace doesn't work for C++. Bummer. It also requires Visual Studio Ultimate, which isn't available to Microsoft Partners. Double bummer.

11. Range-based "for" loops
One of the features I was particularly looking forward to was support for a compiler-based "foreach" syntax, which would make it much simpler to write loops over STL containers. Unfortunately, Visual Studio 2010 doesn't include this feature because the committee standardized it too late in the Visual Studio beta process. More information can be found in the comments section of this blog entry on the Visual C++ Team Blog.
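
For reference, here is a sketch of the loop syntax that didn't make it in, next to the closest thing VS2010 does ship: std::for_each with a lambda.

    #include <algorithm>
    #include <iostream>
    #include <vector>

    int main()
    {
        std::vector<int> values;
        values.push_back(1);
        values.push_back(2);
        values.push_back(3);

        // The C++0x range-based loop that did NOT make it into VS2010:
        //     for (int v : values) { std::cout << v << '\n'; }

        // The closest VS2010 equivalent: std::for_each with a lambda.
        std::for_each(values.begin(), values.end(), [](int v)
        {
            std::cout << v << '\n';
        });

        return 0;
    }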

Wednesday, October 21, 2009

Visual Studio 2010: Review/First Impressions for C++

I downloaded Visual Studio 2010 Ultimate Beta 2 and installed it on a system with Core 2 Duo 3GHz, Windows 7 RC, and 4GB RAM. Here are my first impressions.

First, the scope of this product is huge. There are so many features, it's like counting the stars in the sky. (If you live in the city and can't see any stars, take a look at the Hubble Deep Field to see what I mean.) Once upon a time, you could classify a Visual Studio user as C#/VB/C++, with maybe some database work thrown in. Now we have XAML, Azure, SOAP, SharePoint, IIS, HTML, XML, and much more. The audience that this product caters to is diverse and far-reaching.

I'll limit my discussion to the C++ features, since I primarily do development in C++, MFC and ATL. The "gold standard" for developing in those technologies was Visual C++ 6.0 (VC6), which ran far better on a Pentium II 350 than Visual Studio 2008 ever did on a Core 2 Duo running ten times faster. VC6 was built by and for C++ development, and it was a pleasure to use. I didn't drop VC6 until I was finally forced to use VS2005 to support Vista.

So here are my first impressions:
  • VS2010 is much, much faster than VS2008. Opening the Server pane is now instant, instead of ten seconds or more of thrashing. Everyone was worried about the performance of the UI with the WPF rewrite. Everything I see related to performance is thumbs up.
  • Multiple core parallel compiles are now the default, instead of a hidden teaser like they were in VS2008.
  • The C++ editor is finally able to parse TODO and HACK tags in comments. The C# editor has been able to do this since VS2005.
  • The Help system works. Finally! The Help system last worked properly in VC6. Since VS2003, the help system has been bloated, insanely slow, and almost random in results that it returned. "Help 3.0" in VS2010 returns answers almost instantly. The help system is implemented as an http server running on the local system. When I search on Windows SDK calls, I actually get the result I want instead of useless Sharepoint, .Net and DDK results. This alone makes VS2010 worth the price of admission.
  • Help pages have been reformatted and are far more legible. However, it's not clear what will happen to the Community Content from VS2008.
  • The UI has some cosmetic changes, but the toolbars are basically unchanged. This is a relief after prior releases rearranged everything.
  • Many user interface stupidities have been fixed, like the resource browser closing every time you opened a resource for editing.
  • The resource editor allows you to put in an alpha-blended mockup for reference when editing dialogs. Cool.
  • The resource editor is still lame. Still no in-place editing of text.
  • The dialog editor has problems with locking up for ten seconds at a time. Hopefully this is just a Beta problem. Also, there is no context menu when right-clicking design elements.
  • Trying to search Help for #pragma still doesn't work.
  • New language constructs, like lambda functions and auto declarations, make functional programming much easier.
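
A small sketch of that last point: auto takes the drudgery out of iterator declarations, and a lambda replaces the one-off functor class you used to have to write (the container and names here are made up for illustration):

    #include <algorithm>
    #include <map>
    #include <string>

    int main()
    {
        std::map<std::string, int> scores;
        scores["alpha"] = 10;
        scores["beta"]  = 42;

        // "auto" stands in for std::map<std::string, int>::iterator.
        auto it = scores.find("beta");

        // A lambda with a capture replaces a hand-written functor class.
        const int threshold = 20;
        auto count = std::count_if(scores.begin(), scores.end(),
            [threshold](const std::pair<const std::string, int>& entry)
            {
                return entry.second > threshold;
            });

        return (it != scores.end() && count == 1) ? 0 : 1;
    }
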
I haven't been able to test the size of generated code yet. I'm still trying to get my main projects to compile. There are breaking changes in STL, attributed ATL, and the build system that are causing me some rework.

I'm excited to try:

  • The Parallel Patterns Library (PPL). This will be a huge step forward in making use of multiple cores in C++ (see the sketch after this list). I've used the .Net Task Parallel Library in C# and was very impressed - it has some fantastic ideas behind its development.
  • The unit testing features, which appear to have been expanded since VS2008.
  • The "Basic" install of Team Foundation Server, which should let mere mortals use TFS without having the overhead of specialized servers.
  • Numerous other goodies that I'm still discovering.
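
Here's the PPL sketch I mentioned above. I'm assuming the Beta 2 API matches the documentation - parallel_for lives in the Concurrency namespace in <ppl.h> - and the arithmetic in the lambda is just a stand-in for real work:

    #include <ppl.h>
    #include <vector>

    int main()
    {
        const int count = 1000000;
        std::vector<double> results(count);

        // Concurrency::parallel_for splits the index range across the
        // available cores and runs the lambda for each index.
        Concurrency::parallel_for(0, count, [&results](int i)
        {
            results[i] = static_cast<double>(i) * 0.5;  // stand-in for real work
        });

        return 0;
    }
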
The other piece of good news is that VS2010 Beta 2 is supposed to have a Go Live license, so you can ship code that it produces. Since the final release won't happen until at least March, this makes it easier to start using the new features.

Monday, October 12, 2009

NUnit Unit Testing with C++

I switch back and forth between C++ and C#. When doing C# development, NUnit rules the day for unit testing. Whether I'm doing automated tests from the command line or using the GUI to run selective tests (shown below in a screenshot from SourceForge), NUnit is a pleasure to use.



If you've ever tried to run unit tests for C++, the landscape is much less appealing. C++ does not have a reflection API, nor does it have attributes that are embedded in the executable code, so it's much more tedious in C++ to do all of the housekeeping to initialize the framework, and it's much more difficult to integrate external GUI tools. In short, unit testing in C++ is a sub-par experience compared to more modern languages.

But I have good news for you - it's possible to use NUnit with C++. There are two minor caveats:
  1. You must be using Microsoft Visual Studio 2005 or later (sorry, MinGW and Cygwin users).
  2. The code to be tested must either be a DLL or a static library. If you need to test code in an .EXE, you should factor that code out into its own static library.
The secret is to place your tests in a separate DLL compiled as C++/CLI. You enable this on the General page of the Visual Studio C++ project properties, immediately under Configuration Properties: under Project Defaults, set Common Language Runtime support to "/clr". Don't use any of the other variants - they won't work for this task.

If you haven't built this kind of project before, a C++/CLI project is a curious hybrid that contains all of the power of C++ as well as much of the power of .Net. (Access to certain features, like LINQ, is not available in C++/CLI.)

With a C++/CLI project, you can use NUnit attributes for classes and member functions, just like in C#. When you run NUnit, your target is your test DLL, which implicitly loads either your static library code or your DLL code.
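
Here's a minimal sketch of what such a test DLL can look like. The class, function, and test names are made up for illustration; the attributes and Assert calls are standard NUnit, just written with C++/CLI syntax:

    // Compiled with /clr; the project references nunit.framework.dll.
    using namespace NUnit::Framework;

    // The native code under test - normally pulled in from your static
    // library or DLL via its header, shown inline here to keep the
    // example self-contained.
    static int Add(int a, int b) { return a + b; }

    [TestFixture]
    public ref class MathTests
    {
    public:
        [Test]
        void AddShouldSumTwoIntegers()
        {
            Assert::AreEqual(4, Add(2, 2));
        }

        [Test]
        void AddShouldHandleNegativeValues()
        {
            Assert::AreEqual(-1, Add(2, -3));
        }
    };

Point the NUnit GUI (or nunit-console) at the compiled test DLL and the fixtures show up just as they would for a C# assembly.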

This strategy can also be used with the test environment built into Visual Studio if you have the Professional edition or better.