A question from the Arduino Forum: Why does the signal timing look different on a 24 MHz Teensy 3.2 vs. a 72 MHz Teensy 3.2?
Both signals are valid for the WS2812 - the LEDs light up.
(The code is unknown, so I don't actually know which data is being sent.)
On an oscilloscope it looks the same: http://s17.postimg.org/j5z9i2b0v/WS2812_signal2.png
http://s16.postimg.org/mroq2x7qd/WS2812_signal3.png
The code uses a loop that checks against a clock value to decide when to drop the line back low. This loop takes about (let's say) six cycles per pass, no matter the clock speed, which means that at 72 MHz one pass takes roughly 80 ns, but at 24 MHz it takes 250 ns.
If I want to hold the line high for 320 ns, at 72 MHz that's four iterations of the loop for 320 ns, or in the worst case five iterations for 400 ns. At 24 MHz, however, it takes two iterations of the loop, or 500 ns. And there you have the 24 MHz clock holding the line high for longer.
(The actual cycle counts and timings are different - I'm not in a spot to check them right now - but this should get the basic idea across.)
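To make the mechanism concrete, here is a rough sketch (not the actual library code, which is hand-tuned asm) of what such a wait loop could look like on a Teensy 3.2, using the ARM DWT cycle counter that Teensyduino exposes. The helper name hold_high_ns and the pin number are made up purely for illustration:

```cpp
#include <Arduino.h>

// Illustrative only: drive a pin high for roughly 'ns' nanoseconds by
// spinning on the DWT cycle counter. The key point is that the line only
// drops low when the *loop* notices the deadline has passed, so the actual
// high time is rounded up to a multiple of the loop body's own duration
// (a handful of cycles: ~80 ns per pass at 72 MHz, ~250 ns at 24 MHz).
static inline void hold_high_ns(uint8_t pin, uint32_t ns) {
  const uint32_t cycles = (ns * (F_CPU / 1000000UL)) / 1000UL;  // ns -> CPU cycles
  const uint32_t start = ARM_DWT_CYCCNT;
  digitalWriteFast(pin, HIGH);
  while ((ARM_DWT_CYCCNT - start) < cycles) {
    // busy-wait; granularity = cost of one pass through this loop
  }
  digitalWriteFast(pin, LOW);
}

void setup() {
  // Enable the cycle counter (already enabled by default in recent Teensyduino).
  ARM_DEMCR |= ARM_DEMCR_TRCENA;
  ARM_DWT_CTRL |= ARM_DWT_CTRL_CYCCNTENA;
  pinMode(2, OUTPUT);          // pin 2 is just an example
}

void loop() {
  hold_high_ns(2, 320);        // asks for 320 ns; actual width is quantized
  delayMicroseconds(100);
}
```

With a loop body of a few cycles, the while-condition is only re-checked every ~80 ns at 72 MHz but only every ~250 ns at 24 MHz, so the same requested 320 ns comes out noticeably longer on the slower chip, just as in the numbers above.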
To compound this, for the very low clock speeds (48 MHz and under) I use hand-chosen values for the high/low timings for these chips (to best fit the low cycle counts available to the variants that use hand-counted asm code). For higher clock speeds I convert the desired times in ns to clock-cycle counts, which may give slightly different cycle counts (more accurate for the higher-speed clocks, though).
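For reference, that ns-to-cycles conversion boils down to something like the following (a simplified sketch, not the library's exact macro):

```cpp
// Simplified sketch of converting a desired time in ns to CPU cycles.
// F_CPU is the clock frequency in Hz, set by the Arduino/Teensyduino build.
#define NS_TO_CYCLES(ns) (((ns) * (F_CPU / 1000000UL)) / 1000UL)

// At F_CPU = 72000000: NS_TO_CYCLES(320) == 23 cycles, i.e. ~319 ns
// At F_CPU = 24000000: NS_TO_CYCLES(320) ==  7 cycles, i.e. ~292 ns
// One cycle is ~14 ns at 72 MHz but ~42 ns at 24 MHz, so the rounding error
// per edge is much larger on the slow clocks - which is why those variants
// get hand-chosen values instead.
```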