
Timer interrupts



Hello all!


I'm running an interrupt-driven application on two OS-less
Device Server Platforms.
They pass data between two LANs over an RS-422 link.

I've got two events hanging on timer0.
I run the timer at 300 Hz with a divider of 3, i.e. 100 Hz.

On each tick, I check the RS-422 port for available data, and if
there is any, I force an end-of-packet interrupt on the DMA
channel.
I also decrement a counter which is initialised to 1000.
When the counter reaches 0, I create and send a monitoring
packet.  In other words, this should happen every 10 seconds.
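In pseudo-C, the tick handler does roughly the following (a sketch only --
the names and the port/DMA helper functions are hypothetical stand-ins,
not the platform's real API):

```c
#include <stdio.h>

#define TICK_HZ        100   /* 300 Hz timer, divider 3 */
#define MONITOR_TICKS  1000  /* 1000 ticks at 100 Hz = 10 seconds */

static int monitor_countdown = MONITOR_TICKS;

/* Hypothetical platform hooks, stubbed out so the sketch compiles. */
static int  rs422_data_available(void) { return 0; }
static void dma_force_end_of_packet(void) { }
static void send_monitoring_packet(void) { puts("monitoring packet"); }

/* Called from the timer0 interrupt, nominally at 100 Hz. */
void timer0_tick(void)
{
    /* Kick the DMA channel if the serial port has data waiting. */
    if (rs422_data_available())
        dma_force_end_of_packet();

    /* Every 1000th tick, emit the monitoring packet and rearm. */
    if (--monitor_countdown == 0) {
        monitor_countdown = MONITOR_TICKS;
        send_monitoring_packet();
    }
}
```

Note the monitoring interval is correct only if every timer tick is
actually serviced; the counter measures ticks, not wall-clock time.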

I have observed the following anomaly:
When running at fairly high loads (2.5 Mbps in either direction),
the monitoring packets are sent less frequently.  It's not that
every other packet is lost; the interval simply stretches, so
that they arrive every 12 to 15 seconds or more.
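To quantify it: if the stretch comes from missed timer ticks rather
than a slowed timer, the observed interval implies a lost-tick
fraction (my assumption, not a confirmed diagnosis):

```c
/* Back-of-envelope: fraction of 100 Hz ticks that must be lost to
   stretch a nominal 10 s interval to the observed interval.
   Assumes missed ticks are the only cause of the stretch. */
double lost_tick_fraction(double nominal_s, double observed_s)
{
    return 1.0 - nominal_s / observed_s;
}

/* e.g. lost_tick_fraction(10.0, 12.0) -> about 1/6 (about 17%)
        lost_tick_fraction(10.0, 15.0) -> about 1/3 (about 33%) */
```

So at the worst observed interval, roughly a third of the timer
interrupts would be going unserviced.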

Why?

The monitoring is done in only one of the units, and the packets
are sent on the RS-422 line before being passed to the LAN.

I handle the multiple-pending-interrupt case, so that shouldn't be
an issue.  However, my interrupt handlers are fairly bulky - is
there a recommendation for how much time may be spent in an
interrupt handler?

Thanks again,


/Uffe Sjöstedt
SaabTech Systems AB
ulf.sjostedt@xxxxxxx.se