On Monday, sheet music platform Soundslice said it developed a new feature after discovering that ChatGPT was incorrectly telling users the service could import ASCII tablature, a text-based guitar notation format the company had never supported. The incident reportedly marks what may be the first case of a business building functionality in direct response to an AI model's confabulation.
Normally, Soundslice digitizes sheet music from photos or PDFs and syncs the notation with audio or video recordings, letting musicians watch the music scroll by as they hear it played. The platform also includes tools for slowing down playback and practicing difficult passages.
Adrian Holovaty, co-founder of Soundslice, wrote in a recent blog post that the feature's development began as a complete mystery. A few months ago, Holovaty started noticing unusual activity in the company's error logs. Instead of typical sheet music uploads, users were submitting screenshots of ChatGPT conversations containing ASCII tablature: simple text representations of guitar music in which each line stands for a string and numbers indicate fret positions.
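For readers unfamiliar with the format, a fragment of ASCII tab for a standard-tuned guitar might look something like the following illustrative sketch (a hypothetical example, not taken from Soundslice's documentation or ChatGPT's output); each row is a string, and each number marks the fret to play:

e|----------0-----|
B|--------1---1---|
G|------0-------0-|
D|----2-----------|
A|-3--------------|
E|----------------|

Because the format is just plain text, it is easy for a chatbot to generate, but it carries none of the structured data Soundslice's scanning system expects from photographed or PDF sheet music.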
“Our scanning system wasn’t meant to support this style of notation,” Holovaty wrote in the blog post. “Why, then, were we being bombarded with so many ASCII tab ChatGPT screenshots? I was mystified for weeks—until I messed around with ChatGPT myself.”
When Holovaty tested ChatGPT himself, he discovered the source of the confusion: the AI model was instructing users to create Soundslice accounts and use the platform to import ASCII tabs for audio playback, a feature that did not exist. “We’ve never supported ASCII tab; ChatGPT was outright lying to people,” Holovaty wrote. “And making us look bad in the process, setting false expectations about our service.”
A screenshot of Soundslice’s new ASCII tab importer documentation, a feature hallucinated by ChatGPT and later made real. Credit: https://www.soundslice.com/help/en/creating/importing/331/ascii-tab/
When AI models like ChatGPT generate false information with apparent confidence, AI researchers call it a “hallucination” or “confabulation.” The tendency to confabulate has plagued AI models since ChatGPT’s public launch in November 2022, when people began erroneously using the chatbot as a replacement for a search engine.