XXHighEnd

Ultimate Audio Playback => Chatter and forum related stuff => Topic started by: soundcheck on June 28, 2007, 06:32:32 pm



Title: Latency of 1/10 of a sample ?
Post by: soundcheck on June 28, 2007, 06:32:32 pm
Hi Peter.

I read your remark on latency in your post, which I tend to disagree with, looking at my own research over the last 8 months! ;)

Perhaps you can shed some light on the subject:

1. First of all, could you tell me what kind of latency you're talking about?

2. 1/10 of a sample means about 2 microseconds, right? - No buffer bigger than 2 µs! Amazing. What clock rate is Vista running at? The clock rate defines the IRQ intervals - right?
    If it is 1000 Hz, that would be 1 ms. In that case you'd most probably catch XRUNs if you ran at a latency of 1/10 of a sample, because as soon as another process jumps in
    you'd face a break of 1 ms or more, which would cause an underrun (see the sketch below). Even in exclusive mode this wouldn't work.
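
For what it's worth, a minimal back-of-the-envelope sketch of the argument in point 2 (Python; the constants and names are mine, assuming 44.1 kHz audio and a 1000 Hz system tick):

```python
SAMPLE_RATE = 44_100   # Hz, Red Book audio
TIMER_RATE = 1_000     # Hz, the assumed Vista tick rate

buffer_samples = 0.1                         # "1/10 of a sample"
buffer_time = buffer_samples / SAMPLE_RATE   # seconds of audio the buffer holds
tick = 1.0 / TIMER_RATE                      # worst-case gap before a refill

print(f"buffer holds {buffer_time * 1e6:.2f} us of audio")   # ~2.27 us
print(f"one timer tick is {tick * 1e3:.1f} ms")              # 1.0 ms
print(f"underrun factor: {tick / buffer_time:.0f}x")         # ~441x
```

Any 1 ms gap drains such a buffer roughly 441 times over, hence the XRUNs.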
   
Looking forward to your answer.

Cheers
Klaus

 


Title: Re: Latency of 1/10 of a sample ?
Post by: PeterSt on June 28, 2007, 07:24:39 pm
Haha Klaus, sharp as always !

First off, the 1/10 of a sample is theoretical, simply because a buffer in practice cannot be that small, and it would obviously be pointless. However, it can be measured after a fashion, by knowing the headroom left and the variation of the free space in the buffer.
Another angle is using a (special) timer set far too fast to be of practical use at all; only when it's set to 0.05 ms does it reach the limit of the 2.4 GHz Core2Duo I use, that thread accessing one core only (but still inaudible to me). One 44K1 audio sample is 0.2 ms -> my presented (!) math was a little bit off, granted. BUT, in what I say here a. I used one core and b. I used 44K1. With 88K2 the results are nearly the same, implying another factor of 2 "better". So I'm on the safe side here. Okay?

Again, this is useless; besides the fact that the sampling frequency wouldn't be that low, it's already useless because no timer is stable at that high a rate. Also, the code to be executed in between is influenced relatively too much, so it really can't be measured. However, when we talk about the phenomenon of latency as such, it would certainly hold. Mind you, "latency" comes from DAW applications (better: it is very useful there), and there it would apply, although in practice I think you'd have to say that the latency varies from, say, 0.02 ms to 0.07 ms or whatever would come out exactly, because of a. the variance in the timer and b. the code itself, which stalls the timer with its own variance.
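
As an aside, a minimal sketch of that timer instability (Python; perf_counter and the busy-wait are my stand-ins, not the player's actual timer):

```python
import time

PERIOD = 50e-6   # 0.05 ms, the rate mentioned above
TICKS = 10_000

deadline = time.perf_counter() + PERIOD
late = []
for _ in range(TICKS):
    # busy-wait; sleep() cannot resolve 0.05 ms on a stock scheduler
    while time.perf_counter() < deadline:
        pass
    late.append(time.perf_counter() - deadline)  # overshoot of this tick
    deadline += PERIOD

print(f"mean overshoot:  {sum(late) / TICKS * 1e6:.2f} us")
print(f"worst overshoot: {max(late) * 1e6:.2f} us")
```

The spread of those overshoots is exactly the variance that makes a sub-sample latency unmeasurable.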

*ALL* of it is useless once you see that it is not about this at all, no matter how much we tend to think it is. We talked about this before, and very, very carefully I'd like to say that this is kind of proven by Linux, which would go (far) under the latency figures of the #1 and #2 Engines under XP, with XX there still showing off (without you being there, I know).

For whatever it is worth, I mentioned it in the original post only because people like to know this. So again, the fact that it operates in, say, the real-time domain *for me* only serves to put out the real message: there's no way this player will be influenced by its near environment (other services etc.). And remember, this was just my objective!
This objective was good for the sole reason that otherwise we tweak the hell out of ourselves to make the PC produce better sound. We all know it, and we all tend to listen to the (by themselves valid) suggestions, even up to switching off the PC's monitor. I just created something that lets us avoid these stupidities, and the real-time "figures" are just a means of proof against the dumb theories.

Remember, in the end all is about jitter.


I said it before: Vista is great. But it's a stupid shame that nobody is able to help out with the elementary things in it. That's why it took me over 4 months to get a real grasp of it, and I can tell you, there's another 25% left for me to catch. :secret:

I hope this was a useful answer for you. And don't forget (as I think I said in the original post already): those small buffers just do not exist, so there is no way to exploit this super latency. Okay, perhaps on a 32-voice polyphonic synthesizer, where 10 ms really is sufficient. So go figure.

Peter


Title: Re: Latency of 1/10 of a sample ?
Post by: soundcheck on June 28, 2007, 07:52:15 pm
Hi Peter.

A 44.1 kHz sample is 0.023 ms! 1/10 of it would be 0.0023 ms. Do I have a problem with the math? ;)
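
A quick check of the arithmetic (Python; nothing more than the division above):

```python
SAMPLE_RATE = 44_100  # Hz

sample_ms = 1_000 / SAMPLE_RATE                 # duration of one sample in ms
print(f"one sample:  {sample_ms:.4f} ms")       # 0.0227 ms, i.e. ~0.023
print(f"1/10 sample: {sample_ms / 10:.5f} ms")  # 0.00227 ms
```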

Regarding the timer: I am running a timer at 10000 Hz, which is as unstable as the 1000 Hz one (but relatively it generates less error and lower latencies).

I turned all the knobs in my system, and every time I reduced latency and non-linearities in whatever process, the sound improved.

This behaviour has by now even been proven on a TwinDac!! ;)

Cheers
Klaus



Title: Re: Latency of 1/10 of a sample ?
Post by: PeterSt on June 28, 2007, 08:41:24 pm
Oh my ... you're a bunch sharper than I am. This is from the time I worked on that (bd-design) :

Quote
... which was XX-XP ... Today it's XX-Vista, and the latency is 1 sample if I want. But I don't, because then I would be too late to fill the buffer again, which is related to the timer, which is 1 ms at least, and even that is very inaccurate. So there's somewhat :grazy: more to it.
The latency is determined by the speed at which the audio device swallows the samples, and the time it takes to refill the same amount of buffer just after you stuffed in the latest fill. Therefore your before-mentioned "latency" of less than 1 sample is ... well, nonsense. ;) It just can't work, even if you had the speed.

and from somewhere else over there (Edit Mar 3 2008: dead link by now, sorry):

Quote
Sideways is the time domain. Note the sideways steps of 0.02 millisecond.

So my timer data is still okay, but I had the duration of a sample wrong in my mind. So my 1/10 is in fact 1. :fool:
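
Reading the corrected numbers back (a quick sketch; the figures are the ones from the quotes above, the names are mine):

```python
SAMPLE_RATE = 44_100   # Hz
TIMER_STEP = 0.02e-3   # s, the 0.02 ms sideways steps in the plot
MIN_TICK = 1e-3        # s, the "1 ms at least" refill timer

sample_period = 1.0 / SAMPLE_RATE  # ~22.7 us

# the measured 0.02 ms timer steps, expressed in samples:
print(f"timer step   = {TIMER_STEP / sample_period:.2f} samples")  # ~0.88, i.e. ~1

# smallest buffer that survives one refill tick:
print(f"refill floor = {MIN_TICK / sample_period:.1f} samples")    # ~44
```

So indeed about 1 sample per timer step, and a practical floor of some 44 samples per refill.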

Thank you for pointing it out, Klaus!