It sounds like a massive upgrade and could set Gemini apart.

What Is a Context Window?

The limit on how much text an AI model can consider at once is called its context window.

Gemini app running on an Android phone (image: Smartmockups)

Here’s another way to look at it.

Let’s say you head to the grocery store without your shopping list.

The number of items you can hold in your head while shopping is your context window.

The more items you can remember, the better your chances of coming home with everything you planned to buy.

For comparison, GPT-4 Turbo, one of the most capable models available today, has a 128k-token context window.

Google Gemini 1.5 is bringing a one-million-token context window, several times larger than anything else on the market.

This leads to the big question: what’s the big deal with a one-million-token context window?

One million tokens translates to roughly 700,000 words that Gemini 1.5 could digest in a single go!
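
As a rough sanity check on those numbers, here’s a back-of-the-envelope sketch in Python. The 0.7 words-per-token ratio is only the approximation implied by the “one million tokens ≈ 700,000 words” figure; real tokenizers vary from model to model.

```python
# Back-of-the-envelope conversion between tokens and words.
# Assumes ~0.7 words per token, the ratio implied by the
# "1 million tokens = ~700,000 words" figure; real tokenizers vary.

WORDS_PER_TOKEN = 0.7  # rough approximation, not an exact constant


def tokens_to_words(tokens: int) -> int:
    """Estimate how many English words fit in a given token budget."""
    return int(tokens * WORDS_PER_TOKEN)


for name, window in [("GPT-4 Turbo", 128_000), ("Gemini 1.5", 1_000_000)]:
    print(f"{name}: ~{tokens_to_words(window):,} words per prompt")

# GPT-4 Turbo: ~89,600 words per prompt
# Gemini 1.5: ~700,000 words per prompt
```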

Imagine watching only 20 minutes of an hour-long movie and then being asked to explain the entire film.

How good would your explanation be?

The context window covers more than just the text you feed an AI model in a single prompt.

Wondering why ChatGPT or Google’s Gemini keeps forgetting the things you’ve told it earlier in a conversation?

The conversation likely outgrew the context window, so the model started dropping your earliest messages.
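
To see why that happens, here’s a minimal, purely illustrative sketch of the kind of sliding-window truncation a chat app might apply before each request. The names (fit_to_window, estimate_tokens) are made up for the example, and this is a simplification, not how ChatGPT or Gemini actually manage memory.

```python
# Illustrative sketch: trimming chat history to fit a context window.
# Token counts are approximated by word counts here; real systems use
# the model's own tokenizer and more sophisticated strategies.

def estimate_tokens(text: str) -> int:
    """Crude stand-in for a real tokenizer."""
    return len(text.split())


def fit_to_window(messages: list[str], window_tokens: int) -> list[str]:
    """Keep only the most recent messages that fit inside the token budget."""
    kept, used = [], 0
    for message in reversed(messages):  # walk from newest to oldest
        cost = estimate_tokens(message)
        if used + cost > window_tokens:
            break  # everything older than this gets dropped
        kept.append(message)
        used += cost
    return list(reversed(kept))  # restore chronological order


history = ["My dog is named Biscuit.", "I live in Lagos.", "Plan my weekend."]
print(fit_to_window(history, window_tokens=8))
# ['I live in Lagos.', 'Plan my weekend.'] -- the oldest message no longer
# fits the budget, which is why the model appears to "forget" it.
```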

Want to write a 50k-word novel that has a consistent narrative throughout?

Want a model that can “watch” and answer questions on a one-hour video file?

You need a larger context window!

Will Gemini 1.5 Live Up to Expectations?

If everything goes as planned, Gemini 1.5 could outperform the best AI models on the market.

Bumping up a model’s context window alone doesn’t automatically make it better; the model also has to find and use the relevant details buried in all that extra context.

Will Google Gemini 1.5 give us a game-changer?

Social media is currently filled with glowing reviews of Gemini 1.5 from early-access users.

However, most of those glowing reviews stem from rushed or simplified use cases.