latency vs. packet loss

How well does latency correlate with packet loss on the Internet? For
example, if one were to pick one of several randomly placed sites on
the net based on lowest latency to/from point x, how often would that
also yield the site with the lowest packet loss to/from point x? My
guess is that the correlation is high (due to typical buffer sizes: a
router queue long enough to add noticeable delay is also close to
overflowing and dropping packets).
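One way to get a feel for the question is a toy simulation: draw per-site latency and loss from a made-up model in which loss rises with latency (a crude stand-in for the buffer-size intuition), then count how often the lowest-latency site is also the lowest-loss site. All the numbers below (latency range, loss coefficients, site count) are illustrative assumptions, not measurements.

```python
import random

random.seed(1)

def agreement_rate(n_sites=8, n_trials=1000):
    """Fraction of trials in which the site with the lowest latency
    is also the site with the lowest packet loss, under a toy model
    where loss has a latency-linked component plus random noise."""
    agree = 0
    for _ in range(n_trials):
        sites = []
        for _ in range(n_sites):
            latency = random.uniform(10, 300)  # RTT in ms (hypothetical)
            # toy model: part of the loss tracks latency, part is noise
            loss = 0.0002 * latency + random.uniform(0, 0.05)
            sites.append((latency, loss))
        best_latency = min(sites, key=lambda s: s[0])
        best_loss = min(sites, key=lambda s: s[1])
        if best_latency is best_loss:
            agree += 1
    return agree / n_trials

print(f"lowest-latency site is also lowest-loss in "
      f"{agreement_rate():.0%} of trials")
```

With 8 sites, random agreement would be about 1/8, so anything well above that reflects the latency-loss coupling built into the model; measuring the real answer would require probing actual sites (e.g. with ping) rather than simulating.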