Everyone’s getting really excited about the newest, largest models, seeing how many parameters they can cram into training, but I feel like this is the real use case: small, highly specialised models that can run locally.
Though Firefox was the whipping-boy for RAM-hogging back in the day, and including a local model might just catapult them back to the top of that particular chart 😅
My main concern at the moment is Mozilla’s money issues, and I think AI services could well make those worse. However, if they can get local large language models running client-side, it could be an amazing feature.