vsmGuy
I have a program with two ISRs servicing a timer that is set to interrupt.
These ISRs handle signalling that is time sensitive, and hence I need to be able to know how many clock cycles my code (the ISRs) takes to execute.
Once I know that, I believe there will be only one thing left that can affect the timing - the interrupt latency.
This, I believe, would introduce a jitter of, at most, one instruction cycle? (a GOTO vs. a GOSUB?)
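To make that concrete, here is a stripped-down sketch of the kind of timer ISR I mean, with a pin toggle added as a timing marker so the execution window shows up on a scope or in a simulator stopwatch (the device, clock, and pin are placeholders, not my actual code):

#include <16F628A.h>
#use delay(clock=4000000)          // 4 MHz -> 1 us per instruction cycle

#INT_TIMER0
void timer0_isr(void)
{
   output_high(PIN_B0);            // entry marker: pin is high while the ISR runs
   // ... time-sensitive signalling work goes here ...
   output_low(PIN_B0);             // exit marker
}

void main(void)
{
   setup_timer_0(RTCC_INTERNAL | RTCC_DIV_8);   // Timer0 driven from the instruction clock
   enable_interrupts(INT_TIMER0);
   enable_interrupts(GLOBAL);
   while(TRUE);                    // all the work happens in the ISR
}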
The problem with counting clock cycles in my code (the ISRs) is that I am writing the whole application in C.
There is no reason to worry about C though - the CCS compiler produces a .LST file that shows a nice 1:1 correspondence between the C code I wrote and the ASM it generated.
I treat the ASM output as a human-friendly view of the hex code, so in effect it is as good as if I had coded the whole application in ASM?
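For example, counting cycles off the listing comes down to the standard mid-range PIC rules: one instruction cycle is four oscillator clocks, most instructions take one cycle, and anything that changes program flow takes two. A purely illustrative fragment (the instructions CCS actually emits will differ, and counter/ready are made-up names):

static int8 counter;    // hypothetical globals, just for the example
static int1 ready;

void mark(void)
{
   counter++;           // INCF  counter,F   -> 1 cycle
   if (ready)           // bit test and skip -> 1 cycle (2 when the skip is taken)
      output_b(0x01);   // MOVLW 0x01, MOVWF -> 2 cycles
}                       // RETURN            -> 2 cycles

Summing that column is all the "timing" I really need; at a 4 MHz clock (a 1 MHz instruction rate) each cycle is 1 us.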
My first question is how best to time the clock cycles:
1. Should I use the MPLAB Simulator or the Oshonsoft PIC Simulator?
The MPLAB Simulator is free, but the Oshonsoft PIC Simulator seems to be lightweight and small.
I have never used the Oshonsoft PIC Simulator myself, but a friend showed me how it displays the clock cycles and the elapsed execution time directly in its window.
He wrote his sample code in that simulator's Basic variant, though, and he is not sure whether I can simply load a hex file into it and get the same results.
I have also seen the Oshonsoft PIC Simulator get rave reviews on many sites for how good a simulator it is.
2. Is the Oshonsoft PIC Simulator really that much better than the MPLAB Simulator?
3. What kind of "extra" functionality makes it better?
4. Is it more a matter of user friendliness than any kind of "extra" functionality that makes it better?
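Whichever simulator I end up with, I suppose I could also cross-check by letting the chip time itself, along these lines (assuming CCS's setup_timer_1()/get_timer1() with Timer1 free-running at the instruction clock; isr_cycles is my own name):

unsigned int16 isr_cycles;            // last measured ISR body length, in instruction cycles

#INT_TIMER0
void timer0_isr(void)
{
   unsigned int16 t0 = get_timer1();  // Timer1 ticks once per instruction cycle
   // ... time-sensitive signalling work ...
   isr_cycles = get_timer1() - t0;    // 16-bit subtraction handles timer wrap-around
}

// in main(), before enabling interrupts:
// setup_timer_1(T1_INTERNAL | T1_DIV_BY_1);

That only measures the stretch between the two timer reads - not the interrupt latency or the compiler's context save/restore - but it would let me sanity-check whatever cycle count the simulator reports.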