Liquid AI architectures outperform transformers for low-latency search
“You can think of liquid neural networks as a next step, sort of state space models squared. It's a non-transformer architecture that's more complicated than state space models and, if I'm being honest, really difficult to code, but it's very efficient: it's subquadratic in the length of your context. It's a very compact way to represent things.”
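To make the "subquadratic in context length" and "compact representation" points concrete, here is a minimal sketch of a liquid time-constant style recurrent cell. It is an illustration under assumptions, not Liquid AI's actual architecture: the update follows the published liquid time-constant ODE form dx/dt = -(1/tau + f)x + f*A with a simple Euler step, and all names (`ltc_step`, `run_sequence`) and weight initializations are hypothetical. The key property it demonstrates is that each token costs one fixed-size state update, so processing n tokens is O(n), unlike a transformer's O(n^2) attention.

```python
import numpy as np

def ltc_step(x, u, W_in, W_rec, b, tau, A, dt=0.1):
    """One Euler step of a liquid time-constant cell (illustrative sketch)."""
    # f gates both the effective time constant and the pull toward A
    f = 1.0 / (1.0 + np.exp(-(W_in @ u + W_rec @ x + b)))  # sigmoid
    # discretized dx/dt = -(1/tau + f) * x + f * A
    return x + dt * (-(1.0 / tau + f) * x + f * A)

def run_sequence(inputs, hidden=8):
    """Process a whole sequence with a single fixed-size hidden state."""
    rng = np.random.default_rng(0)
    d = inputs.shape[1]
    W_in = rng.normal(0.0, 0.5, (hidden, d))    # input weights (hypothetical init)
    W_rec = rng.normal(0.0, 0.5, (hidden, hidden))
    b = np.zeros(hidden)
    tau = np.ones(hidden)                        # base time constants
    A = rng.normal(0.0, 1.0, hidden)             # per-unit attractor values
    x = np.zeros(hidden)
    for u in inputs:          # one O(1) update per token -> O(n) overall
        x = ltc_step(x, u, W_in, W_rec, b, tau, A)
    return x

state = run_sequence(np.ones((1000, 4)))
print(state.shape)  # the state stays the same size no matter the context length
```

The "compact" claim in the quote maps to the fixed-size state `x` here: a 1,000-token context is summarized into 8 numbers, whereas a transformer's KV cache grows linearly with the context.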
