GPT-5 context size

Wanted to verify: we're all on GPT-5 now, which means a context window of 400k tokens in the API.

Previously, with GPT-4.1, the API context window was 1M tokens.

Note that the context window in the ChatGPT app is 128k tokens (unless it's changed since I last looked), so you still get more in either case because Jim is using the API.
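For anyone budgeting long prompts, the practical takeaway is a simple check: prompt tokens plus the tokens you reserve for the reply have to fit inside the window. A minimal sketch, using the limits quoted in this thread as assumptions (the model names and numbers here are illustrative, not official identifiers):

```python
# Context-window limits as discussed above (assumptions, not official specs).
CONTEXT_LIMITS = {
    "gpt-5-api": 400_000,    # GPT-5 via the API
    "gpt-4.1-api": 1_000_000,  # GPT-4.1 via the API
    "chatgpt-app": 128_000,  # ChatGPT app
}

def fits_in_context(prompt_tokens: int, reserved_for_output: int, model: str) -> bool:
    """Return True if the prompt plus the reserved reply budget fits the window."""
    return prompt_tokens + reserved_for_output <= CONTEXT_LIMITS[model]

# A 350k-token prompt with 40k reserved for output fits GPT-5's 400k window,
# but the same prompt would not have been a concern under the old 1M window.
print(fits_in_context(350_000, 40_000, "gpt-5-api"))
```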

Not a complaint, just wanted to alert folks to the shrinkage. That swimming pool is cold!

LOL - yes, thanks for that - though very few were actually hitting that 1M mark. :wink:

The constraint may only be temporary; from what I've seen, they had a larger window but chose to constrain it because of compute shortages.

This is also a good time to remind everyone that the context window doesn't matter if you're writing in the context of a Storyform. You could have a separate conversation for each of the 75 aspects within a Storyform, and they would all connect together even without knowledge of the others.

It's really important to acknowledge this, and the advantage it gives us over any other attempt at narrative intelligence. AI is all about context, and so is a Dramatica storyform. Many will say that AI unlocks the power of Dramatica; another way to think about it is that Dramatica unlocks the power of AI.