The reason is really quite simple: in the early days of the technology (think back to CRT computer monitors), screen refresh rates were slower and bitrates were lower. Consequently, the interlaced scheme was created to let the monitor display every other horizontal line on each refresh. That required two full scans to display a complete screen of information, one refresh for the odd lines and one for the even lines. Although this worked, it annoyed a lot of people because it caused a noticeable flicker. It was most noticeable when you were looking away from the screen, out of the corner of your eye. It was also notorious for causing eye strain. And it was really a stop-gap solution to rapidly rising computer resolutions, which were outpacing monitor technology at the time. Believe me, those damn interlaced monitors were nasty, especially at refresh rates below 90 Hz.
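If it helps to see the mechanics, here's a minimal sketch of how two interlaced fields weave back together into one full frame. The 1920x1080 dimensions and the random pixel data are just illustrative placeholders:

```python
# Minimal sketch of interlacing: two half-height "fields" (even lines and
# odd lines) are woven back together into one full frame.
import numpy as np

HEIGHT, WIDTH = 1080, 1920

# Pretend these are two successive fields captured by an interlaced scan:
# one holds the even-numbered lines, the other the odd-numbered lines.
even_field = np.random.randint(0, 256, (HEIGHT // 2, WIDTH), dtype=np.uint8)
odd_field = np.random.randint(0, 256, (HEIGHT // 2, WIDTH), dtype=np.uint8)

# Weave them: the even field fills lines 0, 2, 4, ...; the odd field fills 1, 3, 5, ...
frame = np.empty((HEIGHT, WIDTH), dtype=np.uint8)
frame[0::2] = even_field
frame[1::2] = odd_field

# A progressive scan transmits the whole frame in one pass;
# interlacing needs two passes (fields) to cover the same lines.
print(frame.shape)  # (1080, 1920)
```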
Today, the technology has improved and bitrates have increased, so it is now possible to address and refresh every pixel on the screen with every refresh, and at much higher resolutions. The images are smoother, and moving objects appear more natural and clearer, with fewer artifacts. Interlaced images often look like they're jumping from one frame to the next, but full-frame progressive at these really high resolutions looks much better.
Unfortunately, it takes a higher bitrate (more bandwidth) to transmit a full image, so most cable companies transmit in 1080i (half the lines per pass) or 720p (full frames, but lower resolution). Most new TVs can upconvert either signal pretty cleanly, though, so you still get a good image.
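To put rough numbers on the bandwidth difference, here's a quick back-of-the-envelope comparison. The 60 Hz field/frame rate is an assumption (typical for North American broadcasts); actual compressed bitrates depend on the codec, but the raw pixel rate shows why broadcasters avoid full 1080p:

```python
# Rough pixel-rate comparison for the broadcast formats mentioned above,
# assuming a 60 Hz field/frame rate.
def pixels_per_second(width, height, rate, interlaced=False):
    # An interlaced signal only carries half the lines in each field.
    lines_per_pass = height // 2 if interlaced else height
    return width * lines_per_pass * rate

formats = {
    "1080i (1920x1080, 60 fields/s)": pixels_per_second(1920, 1080, 60, interlaced=True),
    "720p  (1280x720,  60 frames/s)": pixels_per_second(1280, 720, 60),
    "1080p (1920x1080, 60 frames/s)": pixels_per_second(1920, 1080, 60),
}

for name, rate in formats.items():
    print(f"{name}: {rate / 1e6:.1f} million pixels per second")

# 1080i and 720p land in the same rough ballpark (~62 and ~55 Mpixels/s),
# while full 1080p roughly doubles that -- which is why broadcasters pick
# one of the first two to save bandwidth.
```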