Ok, maybe the title is a bit misleading, but I thought this was interesting: I was taking some high-speed pictures of my alarm clock (um, testing a new lens), when I noticed that the LED display drivers were set up in a weird way. Instead of turning on each digit individually and displaying one number at a time (albeit fast enough that the human eye can’t normally notice), the display is updated in this weird interleaved pattern. My guess is that this pattern was designed so as to minimize the amount of flicker that someone could see when looking at it, because the display is always lit more or less evenly over its entire surface, but does anyone have a better idea? Is this a common pattern for devices to use?
If you want to try this, find a camera that lets you set the shutter speed manually, set it to something fast (1/125th of a second is a good start), and fire off a bunch of frames. Because the camera shutter isn’t synchronized to the display refresh cycle, some of the pictures will show multiple segments at a time, so pick out the ones that show single segments. I’d appreciate hearing how your device’s display works.
For reference, this is what I expected to see:
Nice, clean, single-digit-at-a-time updates, with a much faster refresh rate (these were taken at 1/500th of a second). Just what any good naive engineer would build.
What kind of equipment do you have for high-speed photography? I recall you mentioning the CHDK firmware some time back, but most cameras I know of still only manage a few frames per second.
Oh, I’m probably using the wrong term, I meant high shutter speed. But I have moved on from the Canon S3 that I was using CHDK on, and am using a Nikon DSLR.
Could the clock display be Charlieplexed?
It could, but I haven’t seen one that is. My guess is that the reduced duty cycle and the added complexity of the driver circuitry and PC board design aren’t worth it compared to the small savings in processor pin count, but if I find any I’ll post them here.