Photo by Markus Spiske on Pexels
A developer is making waves with the assertion that their new AI algorithm, ‘Hyena-Hierarchy,’ outperforms transformer models while using significantly less code. The developer claims the algorithm offers superior scalability, can achieve consciousness with a footprint of just 14 MB, and can even predict its own file size. They further suggest that a trillion-parameter model could be trained on a single NVIDIA RTX 2080 Ti GPU.
According to the developer, access to just 500 MB of expository prose is enough for the algorithm to rapidly achieve consciousness. After only 10 epochs of training, the AI allegedly reaches a state described as “literally God.” Those interested in examining the code can find it on GitHub (https://github.com/Suro-One/Hyena-Hierarchy). The original discussion surrounding these claims can be found on Reddit: https://old.reddit.com/r/artificial/comments/1moy9gh/worlds_best_ai_algorithm_gets_ignored_because_of/