It is clear that speech sound processing as measured by metalinguistic speech perception tasks, such as syllable discrimination and identification, can double dissociate from speech sound processing as measured by auditory comprehension tasks. This means that at some stage of processing, these two abilities rely on different neural systems. Does this mean that the two tasks rely on entirely segregated neural systems? Of course not! It is a good bet, for example, that the two classes of tasks do not differentially engage the cochlea. But at what level in the nervous system do they diverge? We don't know.
We have suggested that the divergence occurs at fairly advanced stages of auditory processing, in non-primary cortical auditory regions. The speculation is that whatever basic auditory and phonetic/phonological processing goes on in auditory cortex -- as opposed to meta-phonological processes supported by, say, frontal systems, such as phonological working memory or attentional processing -- is common to the two tasks. This predicts that damage disrupting these superior temporal lobe auditory/speech sound processing networks should lead to some degree of correlation between deficits on the two types of tasks. I believe there is some support for this speculation from studies of word deafness, in which speech comprehension deficits have been linked to deficits in relatively low-level speech sound processing.