OttoMatic is on the money, but this kind of stuff is hard to visualize if you aren't familiar with digital systems or how a 'digital' signal is modulated.
In a nutshell, the timing affects the receiving end's notion of how to interpret the data as either a zero or a one. If the timing is way off (milliseconds, as Otto mentioned), the interpretation of the data is also going to be way off.
My super-simplistic example of jitter, which I have described a few times, is this:
You are communicating with a friend using a flashlight and Morse code, and you agree beforehand that a .5-second pulse of light is a zero and a 1-second pulse is a one (that, after all, is the kind of thing specifications define).
Now if your friend is signaling S.O.S (short, long, short), the pulses should be .5, 1, .5 seconds. But what if the timing is way off and you think the second pulse was .75 seconds? Is that a zero that ran too long, or a one that was cut too short?

Now, I agree that jitter in most digital systems (especially audio) is a total non-issue, but in some circumstances it can be problematic, and Otto is just trying to say that a large amount of jitter is a big problem. With audio equipment you don't get a large amount of jitter, so the debate is much ado about nothing.
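If it helps, here's a toy Python sketch of that exact flashlight scheme (the names and the midpoint cutoff are mine, not from any real spec): each pulse is classified by its length, and a .75-second pulse lands dead on the ambiguous boundary.

```python
# Toy model of the flashlight example: .5 s = zero, 1 s = one,
# with the midpoint between them as the decision threshold.
ZERO_LEN = 0.5   # agreed pulse length for a zero, in seconds
ONE_LEN = 1.0    # agreed pulse length for a one
THRESHOLD = (ZERO_LEN + ONE_LEN) / 2  # 0.75 s: the ambiguous middle

def classify(pulse_seconds: float) -> str:
    """Decide whether a measured pulse is a zero or a one."""
    if pulse_seconds < THRESHOLD:
        return "0"
    if pulse_seconds > THRESHOLD:
        return "1"
    return "?"  # dead on the boundary: a long zero or a short one?

# A clean short-long-short signal, then the same signal with enough
# jitter to push the middle pulse to exactly .75 seconds.
for pulses in ([0.5, 1.0, 0.5], [0.5, 0.75, 0.5]):
    bits = "".join(classify(p) for p in pulses)
    print(pulses, "->", bits)  # -> 010, then 0?0
```

With clean timing you read 010; with the jittered middle pulse you get a bit the receiver simply can't call.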
Latency is a different beast. For example, your hard drive exhibits a small amount of latency (on the order of milliseconds). When the device driver requests a piece of data, it takes the drive a few milliseconds to locate that data and start sending it back. That is simply a delay in fetching the data, not something that affects the interpretation of the data itself.
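In the same spirit, here's a rough Python sketch that times a read request (illustrative only: the OS cache hides most real drive latency, so don't treat the number as a benchmark). The point is that the bytes come back intact either way; latency is just the wait before they arrive.

```python
# Latency is the delay before requested data arrives, not a
# corruption of the data itself.
import time

def timed_read(path: str, nbytes: int = 4096) -> tuple[bytes, float]:
    """Read up to nbytes from path and report how long the request took."""
    start = time.perf_counter()
    with open(path, "rb") as f:
        data = f.read(nbytes)              # the bytes arrive intact...
    return data, time.perf_counter() - start  # ...after some delay

if __name__ == "__main__":
    data, delay = timed_read(__file__)  # read this script; any file works
    print(f"{len(data)} bytes served after {delay * 1000:.3f} ms")
```

Run it twice and the second read is usually faster (the cache answers instead of the disk), but the data is byte-for-byte the same: only the delay changed.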