Hi Dapper Dan,
Let me try to answer your questions. I don't pretend to be an expert in this field, so others can chime in as they like, but this is how I was given to understand it all works.
Changing the vertical resolution from 480 to 720 is primarily a matter of scaling and de-interlacing. The upconversion always takes place in the digital domain (hence the importance of good-quality ADCs). So the first step is to digitize the input in the case of an analog video stream, which can be CVBS {= composite}, Y/C {S-Video}, RGB {Scart}, or YPbPr {also called component or YCbCr}. You might be tempted to say the video stream is transcoded at this point, but that is not correct.

The stream is then de-interlaced and scaled. A 480i input, for example, consists of 240-line fields; in the simplest case each field is treated as a 240p frame whose lines are tripled to reach 720p. Smarter algorithms work in the temporal (time) domain, accounting for the slight time difference between the odd and even fields, and in the spatial domain, accounting for the added number of lines. Many, many methods have been developed, ranging from very simple line repetition to motion-compensated de-interlacing. You're quite right in saying that the picture quality is changed: after all, picture information is being added, either based on educated guesses or, cheap and cheerful, by displaying the same line two or three times. In my opinion most TVs do a better job of this than receivers do.
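To make the "cheap and cheerful" end of that spectrum a bit more concrete, here is a minimal sketch of the two naive approaches; this is my own toy code, not what any real receiver runs, and the 240×704 field size is just an illustrative standard-definition number:

```python
# Toy de-interlacing sketches (illustrative only, not real receiver firmware).
import numpy as np

def bob_field_to_720(field: np.ndarray) -> np.ndarray:
    """Simple line repetition ("bob"): treat one 240-line field as a frame
    and show every source line three times, giving 720 lines."""
    assert field.shape[0] == 240
    return np.repeat(field, 3, axis=0)  # 240 lines -> 720 lines

def weave_fields(odd: np.ndarray, even: np.ndarray) -> np.ndarray:
    """Re-interleave two 240-line fields into a 480p frame ("weave").
    Looks perfect on static images but "combs" on motion, because the
    two fields were captured at slightly different moments in time."""
    frame = np.empty((480, odd.shape[1]), dtype=odd.dtype)
    frame[0::2] = odd   # in this sketch the odd field holds lines 0, 2, 4, ...
    frame[1::2] = even  # and the even field holds lines 1, 3, 5, ...
    return frame

# Example: a flat grey test field, 240 lines of 704 pixels
field = np.full((240, 704), 128, dtype=np.uint8)
print(bob_field_to_720(field).shape)  # (720, 704)
```

Motion-compensated de-interlacers sit at the other end of the spectrum: they estimate how objects move between fields and shift the lines accordingly before combining them, which is why they need far more processing power.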
Technically speaking, transcoding is re-formatting (the video) into another "language". The resolution does not necessarily change; what changes is the compression. An example is a DVD: in MPEG-2, video at a resolution of 720×480i can occupy 4.7 GB. The same movie at the same resolution in MPEG-4 (H.264, or whatever you want to call it) can take only 800 MB. The conversion from one to the other is called transcoding.
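As a practical illustration, this kind of transcode is exactly what a tool like ffmpeg does. A minimal sketch, assuming an ffmpeg build with libx264 is installed (the file names are made up for the example):

```python
# Transcode MPEG-2 DVD video to H.264 at the same resolution via ffmpeg.
import subprocess

subprocess.run([
    "ffmpeg",
    "-i", "movie_mpeg2.vob",  # source: MPEG-2 stream as found on a DVD
    "-c:v", "libx264",        # transcode the video to H.264 / MPEG-4 AVC
    "-crf", "23",             # quality-based rate control (lower = better)
    "-c:a", "aac",            # re-encode the audio to something MP4-friendly
    "movie_h264.mp4",
], check=True)
```

The picture that comes out has the same resolution as the one that went in; only the "language" (and with it the file size) has changed.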
My guess is that what is being referred to here as transcoding is in fact converting the video from analog to digital. The output can then be presented over the HDMI interface. Note that HDMI is only a delivery system and does not place any requirements on resolution; it is a digital interface, like DVI, while all the other connections mentioned above are analog.
The 100 MHz figure for component video is actually the sample rate at which the video is digitized. In theory, the higher the sample rate, the more faithfully the analog signal is captured. But whether the picture quality really is better depends on the quality of the source material; no amount of processing will make real HD out of a VCR. So the benefit of 100 MHz will only be apparent when the content can do it justice.
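The rule behind this is the Nyquist criterion: the sample rate must be more than twice the bandwidth of the source, or detail is lost. A toy calculation (the bandwidth figures are my own ballpark assumptions, purely for illustration):

```python
# Toy Nyquist check: does a given sample rate capture a given source?
def nyquist_ok(source_bw_mhz: float, sample_rate_mhz: float) -> bool:
    """The sample rate must exceed twice the source bandwidth for the
    digitized signal to keep all the detail in the source."""
    return sample_rate_mhz > 2 * source_bw_mhz

# Rough, illustrative luma bandwidths (my own ballpark figures):
sources = {"VHS (~3 MHz)": 3.0, "HD (~30 MHz)": 30.0}

for rate_mhz in (13.5, 100.0):  # 13.5 MHz is the classic SD sampling rate
    for name, bw_mhz in sources.items():
        verdict = "captures all the detail" if nyquist_ok(bw_mhz, rate_mhz) else "loses detail"
        print(f"{rate_mhz:>5} MHz sampling of {name}: {verdict}")
```

Run it and you'll see that 100 MHz pays off for an HD-bandwidth source, while for a VHS tape the plain 13.5 MHz rate already captures everything there is; the extra samples cannot add information that was never in the source.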
Confused yet? I'll try to be more to the point in the future...