How to generate a Cypher query using an LLM?

I have a huge schema in my Neo4j database.

I'm using LangChain's GraphCypherQAChain to generate the Cypher query:

from langchain.chains import GraphCypherQAChain
from langchain.chat_models import ChatOpenAI

chain = GraphCypherQAChain.from_llm(ChatOpenAI(temperature=0), graph=graph, verbose=True)

chain.invoke(query)

This returns an error saying that the model's context limit is 16k tokens, but my request contains 15M+ tokens.

How can I limit what's being sent? I tried ChatOpenAI(temperature=0, max_tokens=1000), but it still fails with the same error (presumably because max_tokens only caps the completion, not the prompt).
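Here's roughly how I'm checking the size of the schema string on its own (just a sketch; I'm assuming graph is a LangChain Neo4jGraph and that cl100k_base is the right encoding for this model):

import tiktoken

# Count tokens in just the schema string that LangChain pulls from Neo4j.
# Assumes `graph` is a LangChain Neo4jGraph; the encoding name is my guess.
encoding = tiktoken.get_encoding("cl100k_base")
schema_text = graph.get_schema
print(f"schema tokens: {len(encoding.encode(schema_text))}")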

I think the chain is injecting the entire graph schema into the prompt at once. How can I limit what part of the schema gets sent?
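Ideally I'd restrict the schema to a few node labels and relationship types, something like the sketch below. I've seen include_types / exclude_types mentioned for GraphCypherQAChain, but I'm not sure that's the right mechanism or that my version supports it, and the label names here are made up:

chain = GraphCypherQAChain.from_llm(
    ChatOpenAI(temperature=0),
    graph=graph,
    verbose=True,
    # Only include a handful of node labels / relationship types in the
    # schema sent to the model -- these names are hypothetical.
    include_types=["Person", "Movie", "ACTED_IN"],
)
chain.invoke(query)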