
How to create a delay in microseconds?

Hi Guys

Is there an elegant way of creating a delay of, say, 20-50 us? I can use MS_TIMER for millisecond delays, but I wanted to see if there is anything better than a dummy loop spinning just to stall program execution.

I'm using an RCM6700 and Dynamic C 10.72E.

asked Dec 10, 2021 in Rabbit by 3lionsCT New to the Community (1 point)


2 Answers

It depends on what you need the delay for.

Does it need to be precise, or is it just a minimal delay before continuing with some action?

Do you want to be doing something else while waiting for the delay to expire?

The simplest solution is to calculate the duration of an assembly loop over some "nop" opcodes, ideally parameterized so you can adjust the iteration count for different microsecond delays.

None of the code in Dynamic C related to microsecond delays applies to the Rabbit 6000, so you'll be on your own to figure out the necessary timing to get the result you're looking for. I'd recommend toggling an output pin before and after the delay code and measuring the pulse with a scope or logic probe.

It's also possible that Timer B or Timer C could be useful. I know they can pulse output pins with durations in the microsecond range, and you might be able to attach an ISR that sets a global flag when the delay has elapsed.
answered Dec 17, 2021 by TomCollins Veteran of the Digi Community (2,311 points)
Thanks Tom. That's kind of what I figured. I don't need to be doing anything. I just need the program to halt for a little while (<1ms) to give some time for external hardware to respond.

I thought I was missing something obvious. I'll try what you suggest.

answered Dec 22, 2021 by 3lionsCT New to the Community (1 point)