Monday, April 30, 2007

Blogger is bad, mmkay?

Blogger is bad. I chose it because I already had a Google account and I'm quite satisfied with Google's other services, especially Gmail and Google Maps, so I hoped that their purchase of Blogger/Blogspot would mean that this stuff was equally good. I was wrong.

First of all, Blogger is slow. This I knew before I joined up, by just looking at other Blogger blogs. I hoped that they'd move stuff over to Google's servers so it'd become faster, but it seems this hasn't happened yet.

Second, it's buggy. I used to have a label called “software”, but after some sequence of actions it showed the wrong post count, and I couldn't get it right again. I'm not the only one with this problem. I reported it; it took them twelve days to respond and seven weeks to acknowledge the issue on the “Known Issues” list, and it still hasn't been fixed as far as I know. And there are more bugs and annoyances besides this one.

Third, and most importantly, the administration panel is terrible.

  • I have to type this post into a tiny box of 17 lines high and 37 characters wide. What did I buy a 24" monitor for again?
  • I have to type HTML tags because the HTML editing mode screws things up – I daren't even click it to find out anymore.
  • The preview uses a different style sheet than the actual blog, so I have no idea what a definition list or a blockquote will look like unless I publish an unfinished post.
  • You can't get a preview in a separate window without actually publishing the post. This means you cannot edit and preview side-by-side.
  • I cannot save a post without navigating away from the editing page. Yet, given the fleetingness of a browser's edit box, I like to save regularly.
  • The settings panel contains settings with incomplete or unclear descriptions. Getting from there to the help page (which isn't all that helpful) feels like the old Windows 3.1 days all over again. This is the web – what about a hyperlink directly to the relevant page?
  • Because the headings h1 through h3 are used by the Blogger interface, internal post headings should start at h4. However, these are way too small, and h5 is almost unreadable. I had to modify the template to fix this.
  • Uploading an image inserts a thumbnail into the post, which is in JPEG format even if the image is a GIF or PNG, even if the image is small enough not to need any resizing at all.
  • The image thumbnail is placed at the top of the post, not where the cursor was when I clicked the “Add Image” button. (Okay, may be a browser issue, but Opera 9.20 is not really an obscure browser.)

Oh, and whatever I try (cookies, cache, Javascript, …), I cannot log in using Opera 9.20 under Linux.

I got so fed up after writing this post today that I gave WordPress a try. It feels a lot smoother and less clunky than Blogger, and it can import an existing Blogger blog.

However, the import from Blogger to WordPress does not copy the uploaded images, instead hotlinking them and giving funny preview popups. I'd have to copy all images by hand. I am too lazy for that. Making the switch would also mean that my readers have to update their feed readers. I will assume that you're too lazy for that. Finally, links from other sites to my blog would stop working. People would have to try and find the new location of, especially, this post, and they are probably too lazy for that. That post is linked from a number of locations where I couldn't even edit the link if I weren't too lazy for that.

Recommendation for new bloggers: click here. In the meantime, I'll go pray that somebody at Google will wake up and fix this mess.

Update: I tried to be helpful. I tried to contact the Blogger team and give them the URL to this post. But the closest I could find would be posting in the Blogger Help Group, and as you may understand I'm a bit reluctant to do that. There seems to be no way to contact the developers or even the support people directly …

Review: Free C# code analysis tools

Over the upcoming days, I'll be reviewing the C# code I'm working on for my Bachelor's thesis. It consists of nearly 9,000 lines of code (over 300 kB), so I felt somewhat reluctant to read it all. I decided to first identify the most obvious problems using automated code analysis tools. Because I use Visual Studio 2005 under Windows for C# development, the reviews are mainly targeted at that environment.

I tried the following programs:

Potential problem detection
FxCop, Code Analyzer, Gendarme, devAdvantage
Quality metrics
devMetrics, NDepend, SourceMonitor, Vil
Code coverage
NCover
Similarity detection
Simian

None of the following reviews is very comprehensive, because I only played around with each tool for a little while, but they should give you a good indication of what to use and what to ignore.


FxCop

The FxCop program, by Microsoft themselves, checks assemblies for compliance with the .NET design guidelines and identifies potential problems within the code.


FxCop checks for a lot of deviations from the Microsoft guidelines. It gives a certainty percentage for each issue found, clearly explains what the problem is, and tells you how to fix it. Some of the more interesting issues that are checked for:

  • Are exceptions raised that should not be raised by user code? For example, System.Exception should not be thrown directly.
  • Is an IFormatProvider supplied when converting strings to numbers? Very important if your code is to behave correctly under other locales.
  • Are reference parameters of methods checked to be non-null before use?
  • Are fields initialized to default values that are already assigned by the runtime, like null for reference fields? This would result in an unnecessary extra assignment.
  • Does the string argument to ArgumentOutOfRangeException contain the name of the argument?
  • Are there any unused local variables?
  • Are you using public nested classes? These are considered harmful by the guidelines.
  • Do abstract types have a public constructor? This should be made protected.
  • Are there any unused methods?
  • Are variables like fileName capitalized correctly? “filename” is wrong since “file” and “name” are (apparently) separate words.
  • Are you using a derived type as a method parameter where a base type would suffice?

This list goes on and on. FxCop found 557 issues in my code from dozens of different rules.
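To make a few of these rules concrete, here is a contrived sketch of my own (not code from my thesis) that would trip several of them at once:

```csharp
using System;
using System.Globalization;

public class Parser
{
    // Flagged: initializing a field to its default value ("= 0") causes an
    // unnecessary extra assignment; the runtime already zeroes fields.
    // (An unused private field would be flagged as well.)
    private int count = 0;

    public double ParseBad(string text)
    {
        // Flagged twice: System.Exception should not be thrown directly,
        // and the message does not name the offending argument.
        if (text == null)
            throw new Exception("null input");

        // Flagged: no IFormatProvider supplied. Under a Dutch locale,
        // "1.5" parses as 15, because "." is the group separator there.
        return double.Parse(text);
    }

    public double ParseGood(string text)
    {
        if (text == null)
            throw new ArgumentNullException("text");
        // Culture-invariant parsing behaves identically under every locale.
        return double.Parse(text, CultureInfo.InvariantCulture);
    }
}
```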

You can jump directly from an issue to the corresponding source line(s) in Visual Studio or another application of your choice.


I'm very impressed by the comprehensive list of flaws that this program detects. It is definitely very useful for anything larger than a toy application.

Code Analyzer

At first glance, the Code Analyzer tool seems to do similar things to FxCop. And indeed the website provides us with a useful list of the differences, broken English included:

FxCop advantages (comparing to Code Analyzer):
  • Extensive set of rules available out of box. Code Analyzer provides just limited set of sample rules.
  • Since it works with assembly metadata works with code created in any .NET language. Code Analyzer works now just with C# sources.
Code Analyzer advantages (comparing to FxCop):
  • FxCop is limited to assembly metadata, Code Analyzer works with source code and provides more functionality like comments, position in source code and more.
  • FxCop has flat rules structure, which makes orientation in policy more difficult for larger policies. Code analyzer has hierarchical structure, based on logical rules categories.
  • FxCop provides only one type of report, Code Analyzer is flexible and provides more report types and users can create their own report types.

Especially the first advantage of Code Analyzer, source code inspection (as opposed to assembly inspection), seems worthwhile. Unfortunately, the program crashed on startup, so I was unable to test it.


Gendarme

Powered by the Cecil code inspection library, Gendarme tries to identify points of improvement in your code, based on a certain set of rules. There is no binary release yet; you'll have to build it yourself from an SVN checkout.


There is no GUI or IDE plugin, so you're stuck with the command line. Gendarme is run on assemblies, so it does not inspect the actual source code. On my program, it identified the following problems at multiple points in my code:

  • You should use String.Empty instead of the literal "", because it gives better performance.
  • A static field is written to by an instance method. (This was intentional: each object gets a unique ID, and I increment the “next ID” field in the constructor.)
  • Newline literals (\r\n or \n) in strings are not portable; use Environment.NewLine instead.

All in all, useful, but nothing spectacular.
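The second and third findings are easy to illustrate with a little sketch of my own (not Gendarme's output; the class is hypothetical):

```csharp
using System;

public class Node
{
    // Gendarme flags this: a static field written to by an instance
    // constructor. Here it is intentional: each Node gets a unique ID.
    private static int nextId;

    private readonly int id;

    public Node()
    {
        id = nextId++;   // the write that triggers the warning
    }

    public int Id
    {
        get { return id; }
    }

    public string Describe()
    {
        // Environment.NewLine instead of a literal "\n" keeps the
        // output portable across platforms.
        return "Node " + id + Environment.NewLine;
    }
}
```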


This could become a very useful tool, if the rule set is expanded. At the moment it will not identify very much. The lack of a decent user interface limits its practical use.

devAdvantage and devMetrics

devAdvantage is a Visual Studio add-in that helps you identify areas that might use refactoring. devMetrics is an add-in to compute code complexity metrics. The Community Editions can be downloaded for free. The programs look interesting, but do not work on Visual Studio 2005. Bummer.


NDepend

NDepend is a very feature-rich quality measurement tool, also powered by Cecil. It operates on .NET assemblies, but because it also extracts debug information it can link this back to the original source code. A free one-month version can be downloaded for trial, academic and open-source use. You can view the getting-started animation to get an idea of the possibilities.


NDepend uses CQL, the Code Query Language, to extract information about the code. It allows you to construct your own queries if you're willing to invest the time to learn this. CQL is similar to SQL; take a look at this demo (3 minutes 30 seconds, Flash). For example, you can find all methods with over 200 intermediate language instructions, and sort them by descending number of instructions, using the following query:

SELECT METHODS WHERE NbILInstructions > 200 ORDER BY NbILInstructions DESC

NDepend comes with a few dozen built-in CQL queries that measure certain aspects of your code and can be used to quickly spot potential problems.

NDepend will spit out an HTML file like this one with humongous amounts of information on your project, most of which is just detailed factual information that is almost entirely useless. In my situation, NDepend failed to include the CQL results in the HTML file for some unknown reason.

The HTML file does, however, contain some useful information. It provides you with a table where the worst statistics are highlighted per method. It also lists warnings that, as far as I could tell, are not produced with CQL queries. In my case these were mainly “method so-and-so is protected and could be made private” warnings.

Other interesting features are TypeRank and MethodRank, computed like Google's proven PageRank algorithm. They show which types and methods are the most important in your program. On my code, this did indeed give a very good indication.

The main part of the program is VisualNDepend. This produces a two-dimensional chart much like the disk-space charts from SequoiaView (among others). The area of each rectangle indicates the value of some metric; by default this is the number of lines of code of the respective class or method, but you can also select metrics like MethodRank or cyclomatic complexity.

Unfortunately, there is no easy way to ignore certain source files or methods, e.g. designer-generated code. You'll want to ignore these while scanning the results, because generated code usually makes for terrible metrics. You can use CQL to do this, but you'll have to modify each of the predefined CQL quality metrics.


NDepend is a difficult tool to work with at first. It can give you a wealth of useful information once you get the hang of it, but for a quick inspection it is less practical.


SourceMonitor

SourceMonitor is a simple free program to compute quality metrics on your code. Apart from C#, it can also be used for C, C++, Java, Visual Basic, VB.NET, Delphi and (strangely) HTML.


SourceMonitor produces a table view of some quality metrics of your code, organized per source file. In the table view, you can double-click on any source file to get more detailed information about this file. This produces, among others, a chart showing how many statements are at a particular “block depth”, the number of brace pairs surrounding it.

The program creates a checkpoint for each measurement, so you can easily track the (hopefully) downward slope of your program's complexity while you are refactoring.


A simple, yet useful tool. Very easy to use and understand.


Vil

Everything that Vil does, according to its web site, is done better by NDepend. Also, Vil has no GUI yet and gives the impression of being abandoned. I won't bother.


NCover

NCover is a code coverage tool. Its main purpose is to determine how much of your code is covered by your unit tests. It does this by simply running the program or tests and checking which lines are actually executed.


NCover is a simple command line tool without many bells and whistles. Simply tell it which program to run. It generates a large XML file with the output data (445 kB already in my relatively small program). The XML can be viewed with an accompanying XSLT style sheet, which you have to copy over to the right directory yourself.

The resulting view gives you a percentage bar for each class, showing the amount of code executed in that class. Clicking the class name expands it, breaking it down into methods. Clicking a method name breaks it down further into its individual lines.

There are ways to run NCover periodically and monitor the coverage of your tests. I haven't tried this.


A simple tool, but more useful than I thought at first glance. It can give you a good indication of which parts (especially, which if branches) your unit tests have missed. (Then again, this makes unit testing more of a white-box test when it was intended to be black-box.)
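The point about missed if branches is easy to picture. In a hypothetical helper like the one below (my own example), a test suite that only feeds in-range values would leave the two clamping lines unexecuted, and NCover's per-line view would show exactly that:

```csharp
public static class Clamp
{
    // Tests that only pass values between min and max never execute
    // the two early returns; a coverage report makes this visible.
    public static int ToRange(int value, int min, int max)
    {
        if (value < min) return min;
        if (value > max) return max;
        return value;
    }
}
```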


Simian

Simian identifies regions of code that are similar. It is a little Java-oriented, but also supports many other popular languages. Simian is free for non-commercial projects and for trial purposes.


There is no GUI or Visual Studio plugin; you'll have to work from the command line. This hugely diminishes the ease of use, especially because the Windows command prompt is so clunky. Simian produces a list with entries like the following:

Found 11 duplicate lines in the following files:
  Between lines 30 and 63 in maths\MatrixAlgebra.cs
  Between lines 28 and 61 in maths\Vector.cs

Correct: MatrixAlgebra.cs was split up into Matrix.cs and Vector.cs, and should be removed entirely.


A duplicate code finder sounds very useful, but its use is quite limited: it found nothing useful on my project. Your results may vary. In any case, the lack of IDE integration for .NET analysis makes using this tool more effort than it's worth.


Conclusion

If you care about the details, don't look any further than FxCop. It's very comprehensive and easy to use. Code Analyzer may complement FxCop nicely, if you can get it to run.

For a more general view on things, NDepend can be very useful, if you're willing to invest some hours to get acquainted with it. For a quick overview, SourceMonitor can be a better alternative.

If there's any free program that I've overlooked, please let me know so I can include it!

Monday, April 23, 2007

The importance of teeth brushing

I am not a dentist. I'm not going to tell you how brushing your teeth is good for you because it removes plaque and is healthy for your gums. For me, teeth brushing has another, entirely different use.

My morning ritual looks more or less like this: get up, go to the bathroom, have a shower, get dressed, eat breakfast, brush teeth. After that, I used to head off to university; currently I'm mostly working and studying from home. Between getting dressed and brushing my teeth, there is often a lot of e-mail reading, blog reading, forum reading, etcetera. Being a blog reader yourself, you know how time-consuming these things can be if you don't put a stop to them. Sometimes half the morning has passed before I close the web browser and start working.

It turns out that these are also the times that I neglect to brush my teeth. Teeth brushing signifies the end of my morning ritual and thereby the start of work time; the fresh taste in my mouth is the physical reminder of that. When I notice that I'm wasting time reading blogs, I just have to get up and brush my teeth, and the sense of urgency to start working increases significantly, often to the point that I start right away.

This conditioning may have been with me from the time I started to go to school. Even back then, brushing my teeth was one of the last things I did before I left. No wonder that the association is ingrained so deeply. A hack though it may be, it's very useful and I must take care to keep it.

Brushing my teeth is also the last thing I do before I go to sleep. It would be interesting to see whether it also makes me sleep better. Perhaps I can combine this experiment with some future power napping experiments.

Sunday, April 15, 2007

Panasonic NV-GS320 review

Last Saturday I bought the NV-GS320 digital video camera from Panasonic. Since this camera is pretty new, I couldn't find many decent reviews on the web, so I decided to write one of my own. (The NV-GS320 seems to be the same camera as the PV-GS320 but with some different names for the features. The spelling of “colour” on the Panasonic web page suggests that the NV was made for the European market.)

This camera sells for 500–600 euros, which places it in the medium- to high-end consumer range. I bought mine at Media Markt for € 578 (prices as of April 2007).

The combination of 3CCD and MiniDV makes this camera almost unique in its price range. Most other 3CCD cameras start around € 1000. How did Panasonic do this? Probably a tape deck is cheaper than a hard disk or a DVD writer. But what else did they leave out? Let's find out.

Contents of the package

Apart from the camera itself and a battery, the package includes a remote control which, for a nice change, includes the required button cell battery. There is a manual (Dutch in my case, no English version included) which is comprehensive and relatively decent, though not excellent. Also included in the package are a USB connector cable (large to small mini A plug), an adaptor and the necessary cables, and an A/V cable to output to S-Video and three phono connectors. A MiniDV tape and a FireWire cable are not included.

Picture quality

The NV-GS320 is one of the few cameras in its price range sporting three CCD sensors (3CCD). This is supposed to give a clearer picture with more vibrant colours. The sensor allows for hardware widescreen (16:9) ratio, without losing quality compared to standard 4:3. The camera uses a Leica Dicomar lens with a maximum zoom factor of 10×.

My first impressions of the picture quality were excellent. The images are very sharp and colourful in daylight:
(Click to enlarge.)
In the enlarged version it looks a bit pixelated, but this is a result of the deinterlacing. Of course, I cannot compare the image quality to that of other cameras, but in the absolute sense these pictures are very good.

The camera can either be set to automatic or manual mode. When switching to manual, the current settings of the automatic mode appear to be retained, which is very handy.

The automatic white balancing can take a few seconds to kick in, but usually finds the right balance. The same holds for the aperture and shutter speed – sudden changes in lighting are not picked up immediately. Whether that is good or bad depends on the situation. Autofocus works just fine and I haven't noticed any unexpected hiccups. Filming through a dirty window, however, is not recommended.

In manual mode, you can configure the aperture, gain (only when the aperture is fully open), shutter speed, and white balance. I have not used the manual mode much, as automatic seemed to work just fine in all conditions.

There is an option called “backlight compensation” which brightens the input at the cost of saturating a light background. This works fairly well and can be very handy when shooting, e.g., a portrait against a bright sky.

Panasonic's O.I.S. (optical image stabilizer), done in hardware by wiggling the lens, promises excellent correction for shaking. Many other cameras do this in software, slightly degrading picture quality along the way. My finding is that the image stabilizer manages very well to correct for small vibrations; if you hold the camera properly, it is possible to compose a fairly stable shot at the full 10× zoom. Larger shaking is not compensated for, but even these motions seem a bit smoother than usual.

Low-light performance

One of the most important factors in a camera is how it performs under bad lighting conditions, like lamp light, candle-light or worse.

Performance under indoor lamp light seems alright:

The picture does tend to get a bit blurry when moving, so shooting from a tripod whenever possible is recommended. However, as you can see, the level of noise is very acceptable. The above picture was taken with the maximum aperture and gain settings (18 dB); apparently the black areas were too difficult even then.

Additionally, there is a feature called “Colour night view” for shooting really dark scenes. The catch is that the framerate drops; I've observed factors between 4 (which may be acceptable sometimes) and 18 (which isn't). The other catch is that anything that moves becomes a big blur. The third catch is that light areas bleed a lot into dark areas.

If you can live with all of that, the night shot is pretty impressive for what I've seen. Here's a shot in the dark, lit only by a TFT monitor:

Admittedly, I tried to hold the camera very still while taking this shot.

Of course, TFT light is a little extreme, so I took the camera out to film by street light:
Left: without colour night view — Right: with colour night view
This seems to be one of the few situations where the white balancing screws up, resulting in a very reddish picture. I could not correct this by setting it to lamp light manually – street light is a different beast altogether. Manual white balancing would probably have fixed it, but I forgot to bring something white along. Also, you can clearly see the light bleeding into the dark areas.


Still photographs

Like many digital video cameras, the Panasonic NV-GS320 is capable of taking still photographs. The maximum resolution is 2048×1512.

Unfortunately, this resolution is quite pointless. Even in bright light, when the aperture can be nearly closed, the photos taken are not very sharp:

When looking at them up close, it even seems that software sharpening has taken place, judging from the halos:
I suspect the picture is taken at a lower resolution and then scaled up in software.


Sound

One of the biggest weaknesses of this camera is the lack of an input for an external microphone, as well as of a headphone output. If you don't like the sound of the internal stereo microphone, you're out of luck.

That being said, the internal mic is quite decent. Any noise, from the tape motor or otherwise, got drowned out by the environment noise in places where I filmed. Handling of buttons (especially zoom) goes nearly unnoticed as well.

There is a setting to “zoom” the microphone. This, however, means applying gain to the signal, not altering the area over which sound is picked up. The microphone also picks up a lot of sound from the environment, which can be a good or a bad thing depending on circumstance.

The camera includes a wind noise filter. It's hard for me to judge how well this works; when shooting straight into the wind, noise is certainly there, but that may be normal. During 15 minutes of shooting outside on a moderately windy day, wind noise occurred only a few times, so it's not too bad overall.

Recording medium

This camera is one of the few that still record to MiniDV tape; most cameras nowadays record to either a hard disk or some mini-DVD format. MiniDV still has some advantages: it compresses less heavily, supposedly resulting in better image quality, and the tapes are quite cheap and widely available. Its prime disadvantages are the linearity of tape and the limited capacity (just over one hour). But tapes can be swapped, while hard disks cannot.

Still photographs are recorded to an SD card up to 2 GB or an SDHC card up to 4 GB.

LCD and viewfinder

The NV-GS320, unlike many of its colleagues, still has a viewfinder; many other cameras nowadays rely only on the LCD display. Not only does an LCD drain your battery, but its picture can also be hard to see in bright light.

However, LCD technology has come a long way, and even in broad daylight with the sun right behind me, I could still see the picture on the LCD quite well. But if you don't look under just the right angle, the LCD tends to show clipped whites where they aren't, suggesting over-exposure that isn't there. Looking straight at the screen, the problem disappears, but this is something to keep in mind especially when fine-tuning in manual mode.


Controls

The controls of the camera take some getting used to, because nearly everything is operated with a little 4-way joystick that also functions as a push button. Once you get the hang of it, it's really quite easy and intuitive. The joystick controls an on-screen pie menu with options relating to the current mode (filming, playback, etc.). The joystick is also used in the configuration menu.

The pie menu contains a tiny help feature, explaining the meaning of the little icons. This is convenient, because the text labels of the options cannot be seen before you activate or deactivate them. On the other hand, toggling an option to find out what it does is faster than calling up the help menu.

My overall impression of the menu structure is okay, though not perfect. But the menu is not deep, and you'll quickly learn where to find every feature.

Manual focus has to be done with the joystick, which is not half as convenient as having a proper focus ring, and on a small LCD it's hard to see whether you have focused properly. The LCD does not zoom in to assist you, nor does it show to what distance the focus is currently set.

Another annoyance is that you cannot hold down the button to increment or decrement values in manual mode; you have to keep wiggling the joystick to make large adjustments.

Some of the buttons cannot be reached when filming with one hand, most notably the menu button and the auto/manual switch. But you won't use these buttons while recording anyway.

The camera comes with a remote control, which duplicates most of the buttons on the camera, allowing for nearly complete control. There are also dedicated buttons for playback mode. One feature that is only accessible through the remote is “audio dub”, allowing you to create a voiceover right there on the camera. If you recorded the audio in 12 bits instead of 16, you'll be able to record the voiceover on a separate track without losing the original audio of the filmed material.

Battery life and power supply

According to the manual, the packaged battery lasts for 30 minutes of active use and requires 1 hour and 40 minutes to recharge. Batteries with an effective lifetime of up to 1 hour and 45 minutes can be bought separately. However, 30 minutes is not as bad as it sounds: I've been out filming for over an hour, shooting about 15 minutes of film, and the battery was still nearly full.

The accompanying adaptor can be used to power the camera directly, or to charge the battery while it's not in the camera, but not both at the same time. Unfortunately the FireWire and USB connections on the camera are located below the battery, so you'll have to switch to the adaptor while capturing to the computer, which means you can't recharge the battery at the same time. This could be problematic in some situations.

Capturing and editing

The camera can be connected to a computer using either USB 2.0 (cable included) or FireWire (cable not included). Adobe Premiere fans will want the FireWire, because Premiere Pro 2.0 is not really suitable for USB capturing. The accompanying software does a better job at USB capturing, because device control works. However, to use scene detection (placing each captured clip into its own file), the software rewinds the tape a bit at every splitting point. I can't imagine that this is good for the tape or the tape mechanism, and it's also completely unnecessary, because Premiere has no problem capturing and splitting it all in one go.

Two editing programs are supplied: SweetMovieLife for basic editing and MotionDV STUDIO for (slightly) more advanced work. Because Premiere is my preferred piece of editing software, I only used MotionDV STUDIO for the USB capturing. First (and last) impression: I've seen worse.


Conclusion

If image quality is your primary concern, this is the camera for you. In low light, too, it remains very usable. The picture stabilizer works pretty well. But don't buy this camera to take still photographs.

Audio quality is decent, but the mic picks up sound from all around the camera. The lack of a microphone input is a severe shortcoming.

The rest of the feature set is excellent, and the backlight compensation and night shot are nice additions. Remember to buy a FireWire cable if you intend to use anything but the accompanying software.

On the usability front, this camera is decent, but not excellent. If you're afraid of buttons and menus I'd recommend looking elsewhere, but anyone with a little bit of technical experience will have no problem controlling this camera.

Personally, I'll be returning this beast because I know I'll want to plug in an external microphone at some point. But if it weren't for that … I'd definitely go for it.