Putting AI in the Driver’s Seat
Adafruit has been experimenting with using AI to write the Arduino drivers that make their sensors, LEDs, and everything else easy to use.
Writing software can be a very creative pursuit, requiring engineers to craft elegant solutions that seamlessly blend functionality and user experience. Software developers find themselves in a constant dance between logic and imagination, as they strive to create code that not only solves practical problems but also delights users with intuitive interfaces and engaging interactions. This fusion of art and science in software development often leads to innovation and the birth of entirely new technologies. Whether designing a user-friendly mobile app or optimizing complex algorithms, software engineers harness their creativity to bring digital ideas to life, shaping the ever-evolving landscape of technology we rely on today.
Well, it is like that in some cases, at least. But as anyone who has ever worked as a software engineer knows, reality does not always align with such idealized expectations. Consider the development of drivers for all of the devices and sensors that we like to hack away at, for example. The job is not so much about crafting an elegant solution to a problem as it is about poring over page after page of datasheets to find the register maps that show how to configure and interact with these devices. All of that drudgery is what lets those who come along after us simply make a function call like init_sensor() rather than setting a slew of sub-byte binary flags.
Ladyada of Adafruit knows this pain as well as anyone. Adafruit is famous not only for selling electronic components, but also for writing the software libraries and guides that make them super easy to use. With all of the sensors, actuators, LEDs, and everything else that they have available, that adds up to a lot of supporting software that needs to be developed. And no, that is not the fun kind of software development, but rather the boring, time-consuming slog kind.
In search of a better solution, the Adafruit team turned to AI for a helping hand. In particular, they wanted to see if they could teach OpenAI’s ChatGPT to crank out Arduino libraries for new components in the style of Ladyada. If they could get this to work, the job could be handed off from an experienced software engineer to a prompt engineer, freeing up valuable engineering time for more demanding work.
The key to this effort was a PDF-parsing plugin for ChatGPT. That allowed the team to point the chatbot at a PDF of a component’s datasheet, then ask it to build an Arduino library in the style of Ladyada. As the chat log shows, there is quite a lot more to it than just that, though, as you might expect. ChatGPT needs a lot of hand-holding to get all of the details right and to make sure it does not stray too far off track.
Adafruit has only been experimenting with this method for a few days, so there is surely still room for improvement, but they are already seeing some benefits from the new approach. While it still takes about the same amount of time to build a library with ChatGPT as it does manually, the process takes a lot of the strain off. It puts the developer in the position of a supervisor, watching the code come together and offering guidance along the way, rather than digging into all of the nitty-gritty details and making sure to dot every “i” and cross every “t”.
So, will your next project involving NeoPixels leverage a library written by Ladyada or LadyadaBot? It is too early to say, but if this method proves itself in the months to come, we might find ourselves living in a world where electronics are increasingly easy to work with, and that would be a win for everyone, not just Ladyada’s overused typing fingers.