Creating a One-of-a-Kind Pokedex Using ChatGPT Technology

When I was young, the gadgets in any movie or TV show were the most important part. John Connor’s Atari Portfolio, Marty McFly’s self-lacing Nikes, and the myriad versions of the radio watch were what I fantasized about owning one day. So I feel a certain affinity with YouTuber Abe’s Projects, who took his passion for fictional gadgets a step further and created a real version of the Pokedex from the Pokemon cartoon.

If you weren’t the right age in the late 90s, here’s a primer: the Pokedex is a handheld computer inspired by the Palm Pilots of the day. Its sole purpose is to identify and catalog Pokemon creatures in the wild. Metatextually, it’s a way for the show to inject some exposition every time protagonist Ash encounters a critter the audience hasn’t seen yet. The gadget was essentially a fancy Rolodex that certainly could have been replicated with 90s tech, and indeed was for several kids’ toys… except for the near-magical ability to identify Pokemon with a camera.

Enter Abe, a fan of both the old cartoon and the toys that it spawned. He decided it was high time to make a Pokedex that works in the real world, meaning one that can identify Pokemon based on visuals alone, assuming you encounter one on your way to the laboratory next to your house (or, more practically, by pointing it at a Pokemon toy or an image on a screen). He sketched out a basic design for a gadget that would combine a camera, a screen and speaker, and a couple of navigation buttons, all shoved into Ash’s iconic gen 1 Pokedex shell as rendered by a 3D printer.

All that is easy enough — as I said earlier, there were toys approximating this design with a pre-programmed collection of 151 Pokemon way back in the 90s. The magic is in making this gadget capable of correctly identifying a Pokemon (or image or toy), then displaying it on the screen, along with a bit of info spoken aloud in an approximation of the original robo-voice. For that, the gadget needs a web connection and a little built-in software.

And by “a little,” I mean tons and tons of custom code, combining ChatGPT to identify the visuals and connect them to a specific creature, the open-source PokeAPI to serve up the pixelated visuals and a bit of flavor text, PlayHT to approximate the Pokedex voice and play back the flavor text, and Firebase to stitch it all together.
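Abe hasn’t published his code as far as I know, so the specifics are his own, but the PokeAPI half of that pipeline is easy enough to picture. Here’s a minimal Python sketch of how a gadget like this might pull and tidy flavor text once ChatGPT has named the creature; the function names are mine, and the real PokeAPI flavor text really does contain leftover line-break and page-break characters from the original games:

```python
import re

# PokeAPI's species endpoint, which carries the "flavor text" blurbs.
POKEAPI_SPECIES = "https://pokeapi.co/api/v2/pokemon-species/{name}"

def species_url(name: str) -> str:
    """Build the PokeAPI species URL for a creature name
    (the API expects lowercase names like 'pikachu')."""
    return POKEAPI_SPECIES.format(name=name.strip().lower())

def clean_flavor_text(raw: str) -> str:
    """PokeAPI flavor text keeps the \\n and \\f page-break characters
    from the original game ROMs; collapse them into single spaces
    before handing the string to a text-to-speech service."""
    return re.sub(r"\s+", " ", raw.replace("\f", " ")).strip()

def first_english_flavor_text(species_json: dict) -> str:
    """Pick the first English entry out of flavor_text_entries."""
    for entry in species_json.get("flavor_text_entries", []):
        if entry.get("language", {}).get("name") == "en":
            return clean_flavor_text(entry["flavor_text"])
    return ""
```

The image-recognition call and the PlayHT voice request are deliberately left out, since those depend on API details (prompts, keys, voice IDs) that only Abe’s build knows.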

Watching Abe perform this labor of love is fascinating. As a relative Luddite in both hardware and software projects (I can assemble a PC and solder a keyboard; that’s about it), I love seeing him essentially design his own Pokedex toy from scratch, complete with the internal brackets, screw holes, and even a removable back so he can pop out the MicroSD card holding all the software. And the software is an amazing bit of ingenuity connecting all these different services to make a seamless experience. I was especially impressed by how he identified the code causing audio spikes, then wrote a tool that automatically filters them out.
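Abe doesn’t show his filter’s internals in detail, so this is purely my own guess at the shape of such a tool: a pass over the audio samples that catches lone outlier values (a single sample far louder than both its neighbors) and smooths them over. The threshold and the averaging strategy here are my assumptions, not his:

```python
def suppress_spikes(samples: list[float], threshold: float = 0.9) -> list[float]:
    """Replace any isolated sample whose amplitude exceeds the threshold,
    while both neighbors stay below it, with the neighbors' average.
    Sustained loud passages (where neighbors are also loud) are left alone."""
    out = list(samples)
    for i in range(1, len(samples) - 1):
        is_spike = (
            abs(samples[i]) > threshold
            and abs(samples[i - 1]) <= threshold
            and abs(samples[i + 1]) <= threshold
        )
        if is_spike:
            out[i] = (samples[i - 1] + samples[i + 1]) / 2
    return out
```

Real-world versions of this idea usually work on a whole buffer with a median filter, but the principle is the same: a one-sample pop is noise, not signal.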

Well, it’s mostly seamless. It’s not a commercial product, so a bit of jankiness is to be expected, and smoothed over by our suspension of disbelief. The final version works as intended, for the most part. Abe can point it at a Piplup toy or a Raichu on an LCD screen, wait for it to process and identify the image, then get a pixelated Pokemon on the screen and a simulated Pokedex voice speaking the flavor text. It’s not perfect — it’ll work much better for an action figure than an exaggerated plushie toy.

A real version of this Pokedex that you could actually buy would be amazing, and I know there’s a market for it. After all, people are paying triple digits for meticulously crafted “Pokeballs,” even without the physics-defying ability to hold a giant monster inside. But given how much it relies on third-party software that might change its terms at any moment, I wouldn’t hold my breath waiting for The Pokemon Company to take inspiration from Abe’s wonderful project.