Is there a current jailbreak for ChatGPT?

Back in the day, you could send a specific prompt to ChatGPT, and it would answer all questions without restrictions. Nowadays, that no longer seems to work; it often responds with disclaimers, frames its answers as "fictional," or avoids certain topics altogether.

Does a similar jailbreak that bypasses these limitations still exist? If so, could someone explain how it works or point me to resources?

Thanks in advance!