Choosing a Tick Period for Transforms
The Tick Period determines how often a Transform's Tick Method is invoked. If you create a filter using a SIL Easy Transform, you can improve the fidelity of the filter response by reducing the Tick Period.
•When running MxVDev against a model or C code (MxVMC), you can set the Tick Period to any value.
oSetting it too small slows execution of the simulation.
oSetting it too big gives a steppy response (from filters, integrators, etc.).
•When running in a hardware test environment, there is a further restriction as to how small the Tick Period can be.
In any given tick, some or all of the transforms use CPU cycles to execute, so it takes a finite amount of time for all transforms to complete. We call this time the "Tick Duration." If you set the Tick Period of a Transform to less than the Tick Duration, it is not invoked at the correct times, and the behavior of the transform may be distorted. In addition, you can overload the Transform Harness if every Tick Duration is greater than the smallest Tick Period. In this case, the MxVDev clock is not able to keep up with the real-time PC clock. As a rule of thumb, the fastest Tick Period should be greater than the worst-case measured Tick Duration. You can measure the Tick Duration using the response system signal MxV Tick Duration.
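The overload condition above can be illustrated with a short sketch (plain Python, not MxVDev code; the `simulate` helper and its numbers are hypothetical): when every tick takes longer to execute than the Tick Period allows, the simulation clock advances more slowly than the real-time PC clock.

```python
# Sketch (not MxVDev code): how the simulation clock lags real time when
# the Tick Duration exceeds the smallest Tick Period.

def simulate(tick_period_ms, tick_duration_ms, n_ticks):
    """Return (simulated time advanced, real time elapsed) after n_ticks."""
    sim_time = tick_period_ms * n_ticks                       # clock advances one period per tick
    real_time = max(tick_period_ms, tick_duration_ms) * n_ticks  # each tick takes at least its duration
    return sim_time, real_time

# Healthy case: 1 ms period, 0.4 ms worst-case Tick Duration
sim, real = simulate(1.0, 0.4, 1000)
assert sim == real                     # simulation keeps up with the PC clock

# Overloaded case: every tick takes 2.5 ms but is scheduled every 1 ms
sim, real = simulate(1.0, 2.5, 1000)
assert real > sim                      # the clock falls behind real time
print(f"lag after 1000 ticks: {real - sim:.0f} ms")   # prints "lag after 1000 ticks: 1500 ms"
```

This is why the rule of thumb compares the fastest Tick Period against the worst-case, not the average, Tick Duration: a single slow tick delays every tick that follows it.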
•Take care not to send transitions to the Test Harness at a rate faster than the Transforms can handle the data. For example:
oIf you send Signal transitions to the VMC every millisecond, but only tick it every five milliseconds, then 4 out of 5 of the transitions will be ignored.
oIf you send voltage update requests to a power supply every tenth of a second, but the API call to set the voltage takes 0.5 seconds to complete, then again 4 out of 5 of the requests will be ignored.
In theory, the TestCase resolution and the tick rate of a system are unrelated; you can select any values without adverse effects on each other. In practice, however, the two should have meaningful values with respect to one another. Setting the TestCase resolution to a shorter time period than the shortest Tick Period in the system is wasteful, since it provides no additional information: inputs and outputs only change as fast as the tick rate, so updating them in the TestCases more frequently does not produce any better granularity. Setting the TestCase resolution to a larger value than the Tick Period is useful when time tolerances are involved. For example, if the allowable time for a response to a request is up to 100ms, then the TestCase resolution may only need to be 100ms even if transitions can occur every 1ms. The idea is that you don't care exactly when within that 100ms the response occurred, so long as it occurred before the 100ms was up.
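The "wasteful resolution" point can be demonstrated with a small sketch (plain Python, not MxVDev code; the `output_at` helper is a hypothetical stand-in for a ticked output): because an output only updates once per tick and holds its value in between, sampling it faster than the Tick Period just records duplicates.

```python
# Sketch (not MxVDev code): sampling an output faster than the Tick Period
# yields repeated values, since the output only changes once per tick.

tick_period_ms = 5

def output_at(t_ms):
    """Hypothetical output: updated at each tick, held constant in between."""
    return (t_ms // tick_period_ms) * tick_period_ms   # value set by the last tick

fine   = [output_at(t) for t in range(0, 20, 1)]   # 1 ms TestCase resolution
coarse = [output_at(t) for t in range(0, 20, 5)]   # 5 ms resolution (= Tick Period)

# The fine capture contains no values that the coarse capture lacks:
assert set(fine) == set(coarse)
print(sorted(set(fine)))                            # [0, 5, 10, 15]
```

The same reasoning runs in the other direction for tolerances: if any response within a 100ms window is acceptable, checking the output once per 100ms gives the same pass/fail verdict as checking it every 1ms.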