Correct: nothing played back at its original sampling rate will alias.
It _won't_ alias; it may already _have_ aliased when sampled in the first place.
Aliasing occurs when sampling.
When you digitize (ADC), you are sampling. When you generate a waveform mathematically, you are also sampling: the moment you evaluate the mathematical function at discrete points, you have sampled it.
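For example (a minimal numpy sketch; the 8 kHz rate and 1234 Hz fundamental are arbitrary illustrative choices): an ideal sawtooth has harmonics at every multiple of the fundamental, so evaluating its closed-form expression at discrete points folds every harmonic above Nyquist back into the band, while summing only the harmonics below Nyquist leaves nothing to fold.

```python
import numpy as np

fs = 8000          # sample rate, Hz (illustrative)
f0 = 1234.0        # sawtooth fundamental, Hz (illustrative)
n = np.arange(fs)  # one second of sample indices

# "Mathematical" generation IS sampling: evaluating the ideal
# sawtooth at the discrete instants n/fs. The ideal waveform has
# harmonics at k*f0 without limit, so everything above fs/2
# folds back -- the aliasing happens here, at evaluation time.
naive = 2.0 * ((f0 * n / fs) % 1.0) - 1.0

# Band-limited version: additive synthesis using the sawtooth's
# Fourier series, keeping only harmonics below Nyquist.
bandlimited = np.zeros_like(naive)
k = 1
while k * f0 < fs / 2:
    bandlimited += (2 / np.pi) * (-1) ** (k + 1) \
        * np.sin(2 * np.pi * k * f0 * n / fs) / k
    k += 1

# The naive spectrum shows extra, inharmonic lines (the folded
# harmonics); the band-limited one only shows lines at k*f0.
spectrum_naive = np.abs(np.fft.rfft(naive))
spectrum_bl = np.abs(np.fft.rfft(bandlimited))
```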
When you play back a signal at a speed different from the original, you are _resampling_ it; in theory that means interpolating it and then sampling it again, and it is the sampling stage, not the interpolation, that produces the aliasing.
The interpolation, since it cannot be ideal, may introduce other kinds of noise or artifacts, but not aliasing as far as I can see.
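You can see which stage folds the spectrum with a small experiment (a minimal numpy/scipy sketch; the 8 kHz rate and 3 kHz tone are illustrative, and dropping every second sample stands in for the final sampling stage of a resampler):

```python
import numpy as np
from scipy.signal import decimate

fs = 8000
t = np.arange(fs) / fs
x = np.sin(2 * np.pi * 3000 * t)   # 3 kHz tone, fine at fs = 8 kHz

# Resample to 4 kHz by keeping every 2nd sample: this is the
# sampling stage alone, with no anti-alias lowpass in front.
# The new Nyquist is 2 kHz, so the 3 kHz tone folds to 1 kHz.
y_bad = x[::2]
freqs = np.fft.rfftfreq(len(y_bad), d=2 / fs)
spec = np.abs(np.fft.rfft(y_bad))
print(freqs[np.argmax(spec)])       # ~1000.0 Hz -- an alias

# Same rate change, but with a lowpass filter ahead of the
# sampling stage (scipy's decimate filters, then downsamples):
# the 3 kHz tone is simply removed instead of folding back.
y_good = decimate(x, 2)
print(np.abs(np.fft.rfft(y_good)).max())  # small residue, no 1 kHz peak
```

A real resampler's interpolator (linear, polynomial, windowed sinc) would add its own distortion on top of this, but the folding itself only ever happens at that final sampling step.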