Ever wondered why some AI can handle video and ...
Context length absolutely matters in AI, and context windows will keep getting longer. Context length is the number of tokens an AI model can hold in memory at once: the more tokens it can fit, the longer a conversation can go on, and the more files, or larger files, an AI system can take in. Right now we're in the millions-of-tokens zone with Google Gemini, and Gemini is the only model that can take in video files, because video is so large. It's so many tokens to represent both the audio and the visual components.

I do think context length will grow, and I do think it's still a focus area for the labs. There are also a lot of research papers, including one from Microsoft showing you could have effectively infinite context length. This will still be a large piece of 2025 releases; it's just not going to be as talked about.
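To make the "number of tokens that fit" idea concrete, here is a minimal sketch of checking whether a piece of text fits a model's context window. The `fits_in_context` helper and the roughly-four-characters-per-token heuristic are assumptions for illustration; real models use learned tokenizers (e.g. BPE), and their actual token counts vary by language and content type.

```python
def fits_in_context(text: str, context_window_tokens: int,
                    chars_per_token: float = 4.0) -> tuple[int, bool]:
    """Estimate token count for `text` and check it against a context window.

    Heuristic: roughly 4 characters per token for English prose (an
    assumption; a real tokenizer would give exact counts). Returns the
    estimated token count and whether it fits in the window.
    """
    estimated_tokens = int(len(text) / chars_per_token)
    return estimated_tokens, estimated_tokens <= context_window_tokens


# A short prompt easily fits a small 8K-token window...
tokens, ok = fits_in_context("Summarize this paragraph." * 10, 8_000)
print(tokens, ok)

# ...but a long document may need a million-token-class model.
big_doc = "x" * 2_000_000  # ~500K estimated tokens
tokens, ok = fits_in_context(big_doc, 8_000)
print(tokens, ok)          # does not fit an 8K window
tokens, ok = fits_in_context(big_doc, 1_000_000)
print(tokens, ok)          # fits a million-token window
```

This is also why video is such a heavy input: audio and visual frames each become streams of tokens, so even short clips can consume a large fraction of an ordinary context window.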