The MPEG-2 video on a DVD is interlaced, i.e. 480i. If you have the player in interlaced mode and send that signal to the TV, the TV will first deinterlace it (convert it to 480p) and then scale it to its native 768p. If the player is in progressive mode, the player does the deinterlacing and sends 480p, which the TV then scales to 768p. It's the same old argument as with DACs: which does the better job, the player or the TV? My money is on the TV.
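To make the deinterlacing step concrete, here's a minimal sketch in Python of the crudest possible approach ("weave", just interleaving the two fields of a 480i frame). Real players and TVs use far fancier motion-adaptive methods; the function and variable names here are my own illustration, not anybody's actual code.

```python
def weave_deinterlace(top_field, bottom_field):
    """Interleave two 240-line fields into one 480-line progressive frame."""
    frame = []
    for top_line, bottom_line in zip(top_field, bottom_field):
        frame.append(top_line)     # odd scanlines come from the top field
        frame.append(bottom_line)  # even scanlines come from the bottom field
    return frame

# Two dummy fields of 240 scanlines each -> one 480p frame
top = ["top %d" % i for i in range(240)]
bottom = ["bottom %d" % i for i in range(240)]
print(len(weave_deinterlace(top, bottom)))  # 480
```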
Same deal with 'upconverting' DVD players. If you have the player upconvert to 720p, it first deinterlaces the 480i to 480p, then scales that to 720p, and the TV then scales it yet again to 768p. If the TV in question were the A10, whose native resolution is already 720p, it wouldn't touch the signal it receives.
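Just to spell out the two signal chains, here's a tiny sketch (my own arithmetic, not any product's code) that tracks only the vertical resolution at each stage:

```python
def describe_chain(stages):
    """stages: list of (lines, progressive?) tuples for each step in the chain."""
    return " -> ".join("%d%s" % (lines, "p" if prog else "i") for lines, prog in stages)

# Upconverting player feeding a 768-line panel: the picture is resampled twice.
upconverting_player = [(480, False), (480, True), (720, True), (768, True)]
# Player only deinterlaces; the TV does the one and only scaling pass.
straight_to_tv = [(480, False), (480, True), (768, True)]

print("upconverting player:", describe_chain(upconverting_player))
print("single scaling pass:", describe_chain(straight_to_tv))
# On a native-720p set like the A10, the last step of the first chain simply disappears.
```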
You'll have to try it and see if you can detect any difference, but as far as I'm concerned upconverting DVD players are a gimmick with no value. The simple act of scaling 480p to 720p (every 2 lines become 3) will not turn a low-resolution image into high-def, despite what the marketers may claim.
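Here's what that "every 2 lines become 3" scaling amounts to in the simplest case, linear interpolation. It's only a sketch under my own assumptions (real scalers use better filters), but the point stands: the extra lines are just blends of their neighbours, so no new detail is created.

```python
def upscale_2_to_3(lines):
    """lines: numeric scanline values; every pair of source lines becomes three."""
    out = []
    for i in range(0, len(lines) - 1, 2):
        a, b = lines[i], lines[i + 1]
        out.append(a)
        out.append((a + b) / 2)  # the "new" line is just an average of its neighbours
        out.append(b)
    return out

frame_480 = list(range(480))           # stand-in for 480 scanlines of picture data
frame_720 = upscale_2_to_3(frame_480)
print(len(frame_720))                  # 720 lines, but no added information
```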