I'd love to see an answer to that too. If you don't get a solid one, it may be that nobody has done an empirical test yet, so if you started one, that'd be a great thing. Meaning: take your video, encode it many, many different ways, and compare the results.
One thing that makes a difference is what you're using the video for - specifically, whether you do a lot of rapid, frame-accurate seeks, switch between forward and reverse playback, etc., or whether you play long clips straight through, maybe switching between clips occasionally or blending them. Most codecs use temporal compression: most frames are stored as differences from previous frames. The more sophisticated the codec, the more sophisticated this temporal compression is, which gets you higher visual quality at smaller file sizes. The tradeoff is that to jump to a specific frame, you often have to first jump back to a keyframe near where you want to go, then decode forward, accumulating the frame differences until you reach the target frame. So for an application involving random access, the keyframe setting may matter more than the choice of codec itself.
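To make the seek cost concrete, here's a toy model of temporal compression - not any real codec's format, just made-up numeric "frames" where keyframes store full values and everything else stores a delta from the previous frame:

```python
# Toy model of temporal compression: keyframes hold a full frame,
# other frames hold only the difference from the previous frame.
# All data here is hypothetical, purely to illustrate the seek cost.

KEYFRAME_INTERVAL = 30  # one keyframe every 30 frames (an example setting)

def store(frames):
    """Encode a list of frame 'values' as keyframes plus deltas."""
    encoded = []
    for i, f in enumerate(frames):
        if i % KEYFRAME_INTERVAL == 0:
            encoded.append(("key", f))
        else:
            encoded.append(("delta", f - frames[i - 1]))
    return encoded

def seek(encoded, target):
    """Random access: jump back to the nearest keyframe, then roll
    forward accumulating deltas. Returns (value, frames_decoded)."""
    start = (target // KEYFRAME_INTERVAL) * KEYFRAME_INTERVAL
    value = encoded[start][1]
    for i in range(start + 1, target + 1):
        value += encoded[i][1]
    return value, target - start + 1

frames = [n * n for n in range(120)]  # fake frame data
enc = store(frames)
val, work = seek(enc, 59)
print(val, work)  # recovering frame 59 forced 30 frames of decoding
```

Shrink `KEYFRAME_INTERVAL` to 1 and every seek costs one decode - which is exactly why intra-only settings feel so much snappier for scrubbing, at the price of bigger files.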
Other factors include things like your hard drive. Say you have some nice high-res video. If you compress it heavily, each frame doesn't carry much data, but it takes a lot of processing power to decode; so if the rest of your patch already has a high CPU load, playback can slow down. You can use less compression, which means less computation, but if your hard drive isn't that fast, at some point it can't pull the data off the disk fast enough to display it either.
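A quick back-of-envelope makes the disk side of that tradeoff visible. The numbers below are assumptions for illustration (1080p, 30 fps, 8-bit RGB, made-up compression ratios), not measurements of any particular codec:

```python
# Back-of-envelope: the data rate the drive must sustain for a given
# resolution and compression level. All figures are illustrative
# assumptions, not measurements.

def data_rate_mb_per_s(width, height, fps, bytes_per_pixel, compression_ratio):
    raw = width * height * bytes_per_pixel * fps  # uncompressed bytes/sec
    return raw / compression_ratio / 1e6          # MB/s coming off the disk

# 1080p at 30 fps, 3 bytes/pixel (8-bit RGB):
raw   = data_rate_mb_per_s(1920, 1080, 30, 3, 1)   # uncompressed
light = data_rate_mb_per_s(1920, 1080, 30, 3, 4)   # light compression, cheap to decode
heavy = data_rate_mb_per_s(1920, 1080, 30, 3, 50)  # heavy compression, cpu-hungry

print(round(raw, 1), round(light, 1), round(heavy, 1))
```

So uncompressed 1080p wants on the order of 186 MB/s sustained, which many drives can't deliver, while heavy compression drops that to a few MB/s but shifts the cost onto the CPU.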
So there are a lot of variables, which is why an empirical test suite might be a good thing.
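If someone wanted to script such a test, one way to start is just generating the full matrix of encode commands. The ffmpeg flags here are real (`-c:v` picks the encoder, `-g` sets the keyframe interval), but the codec list, file names, and the idea of what to measure afterward are only an example sketch - nothing is actually executed here:

```python
# Sketch of an empirical test matrix: every codec x keyframe-interval
# combination to compare. The ffmpeg flags (-c:v, -g) are real; the
# codec list and file names are illustrative assumptions. Nothing runs
# here -- you'd execute each command, then measure file size, decode
# cpu load, and seek responsiveness in your player.

import itertools

codecs = ["mjpeg", "libx264", "prores"]  # example encoders to compare
keyframe_intervals = [1, 15, 30, 150]    # 1 = every frame is a keyframe

commands = []
for codec, g in itertools.product(codecs, keyframe_intervals):
    out = f"test_{codec}_g{g}.mov"
    commands.append(
        ["ffmpeg", "-i", "source.mov", "-c:v", codec, "-g", str(g), out]
    )

print(len(commands))  # 12 encodes to compare
for cmd in commands[:2]:
    print(" ".join(cmd))
```

Note that for intra-only encoders like Motion JPEG, `-g` is a no-op, which itself is part of what the test would show.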
btw what do you mean about free stuff online? Do you mean free codecs, like ogg vorbis/theora, or free encoding software, like ffmpeg and mencoder?