CPU-less Bad Apple Animation on a 20x4 Character LCD From a 32K EEPROM

from hackster.io

Animations on a dot-matrix display are straightforward, even with the lowest-power microcontrollers. But what happens if you take the microcontroller out of the circuit? Can you still get a reasonable animation? Electronics enthusiast Ian Ward proves you can! This project plays the "Bad Apple" animation on a 20x4 character LCD from a 32K EEPROM. In other words, no CPU is required!



The original "Bad Apple" video is a stop-motion monochrome animation with 6,569 frames. The name comes from the Japanese dance-pop song, "Bad Apple!!" which is also typically the video's soundtrack. Over the years, many renditions have appeared on various platforms. Bad Apple is a popular demo on retro-computers. There is even a port available for the TI-84 graphing calculator.

Unlike those ports, this project has no CPU. Ward's main challenge was fitting the Bad Apple animation's frames into 32,768 bytes. He then had to figure out how to transfer those bytes to a 20x4 character LCD using only a 555-based clock running at 150 Hz.
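Those two numbers already hint at the playback length. A back-of-envelope sketch, assuming one EEPROM byte is consumed per 555 tick (the article doesn't state the exact bytes-per-tick ratio, so this is only an order-of-magnitude estimate):

```python
# Rough playback-time estimate from the figures quoted above.
# Assumption (not stated in the article): one byte per clock tick.
CLOCK_HZ = 150            # 555-based clock frequency
EEPROM_BYTES = 32 * 1024  # AT28C256 capacity: 32,768 bytes

playback_seconds = EEPROM_BYTES / CLOCK_HZ
print(f"~{playback_seconds:.0f} s total")  # ~218 s, a few minutes
```

That lands in the right ballpark for a song-length animation, which is part of why the numbers work out so neatly.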



The quick math of the AT28C256 EEPROM's bytes divided by the number of video frames comes out to about 5 bytes per frame. Fortunately, much of each frame is solid white (empty) or solid black (filled). Ward cleverly used a mixture of ASCII, empty, and solid characters from the controller's built-in ROM, plus eight custom characters in RAM, to reproduce visually recognizable frames.
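The byte budget is easy to verify from the figures given in the article:

```python
# Per-frame byte budget, using the numbers quoted in the article.
EEPROM_BYTES = 32 * 1024  # AT28C256: 32,768 bytes
FRAME_COUNT = 6569        # frames in the original Bad Apple video

bytes_per_frame = EEPROM_BYTES / FRAME_COUNT
print(f"{bytes_per_frame:.2f} bytes per frame")  # ~4.99
```

Roughly 5 bytes to describe an 80-character (20x4) frame is why the heavy reuse of solid, empty, and custom characters matters so much.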

Again, there is no CPU involved in the playback. The only caveat, if it even counts as one, is a 256-byte lookup table that maps commands and data to the HD44780 controller. With this arrangement, the programmer effectively has eight operations available.
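To see how a lookup table can stand in for a CPU here, consider a minimal sketch. The field layout below is purely hypothetical, for illustration only, and is not Ward's actual encoding: the top bit of each stored byte selects the HD44780's RS line (0 = command, 1 = character data) and the low 7 bits carry the payload.

```python
# Hypothetical sketch: a 256-entry table turning a raw EEPROM byte
# into an (RS, value) pair for the HD44780, with no CPU in the loop.
# The bit layout is an illustrative assumption, not the real design.

def build_table():
    """Each entry is (rs, value): rs from bit 7, value from bits 0-6."""
    return [((b >> 7) & 1, b & 0x7F) for b in range(256)]

TABLE = build_table()

rs, value = TABLE[0xA1]  # top bit set -> character data
print(rs, hex(value))    # 1 0x21
```

In hardware, the table itself can simply be another ROM wired between the animation EEPROM and the display, which is what lets the design stay CPU-free.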



As the video above shows, the result is strikingly good.

On the excess.org blog, you can find details on how Ward mapped the characters. Future posts will cover the video encoding. But for those who want to dive in right away, check out the code in this GitHub repo.

