Shep, please use the existing Cloudflare solution for CSAM, brother.
You don't need to reinvent the wheel. The "detect and prevent all NSFW images" approach being proposed is not the industry standard for detecting and preventing the dissemination of CSAM; general NSFW classification and CSAM detection are different problems.
This is a known, solved problem, and there are multiple standard technologies in use: fuzzy hash matching against databases of known material (Microsoft's PhotoDNA, Google's CSAI Match), plus classifiers for previously unseen material (Google's Content Safety API). https://blog.google/technology/safety-security/how-we-detect-remove-and-report-child-sexual-abuse-material/ explains the technology and links to the source repository.
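To make the distinction concrete: the standard approach is not an NSFW classifier but fuzzy matching of perceptual hashes against vetted databases of known material. Here's a minimal sketch of that class of technique in Python using the open-source imagehash library; the hash set, threshold, and file name are hypothetical placeholders, and a real deployment matches against licensed databases (NCMEC, PhotoDNA, CSAI Match), not anything you build yourself.

```python
# Minimal sketch of perceptual-hash matching, the general technique behind
# PhotoDNA / CSAI Match-style CSAM detection. Illustrative only: the hash
# set, threshold, and file name below are hypothetical. Production systems
# match against vetted, licensed databases (e.g. NCMEC's), not a local set.
from PIL import Image
import imagehash

# Hypothetical known-bad perceptual hashes (16 hex chars = 64-bit pHash).
KNOWN_BAD_HASHES = {imagehash.hex_to_hash("d1d1b1a1c1e1f101")}

HAMMING_THRESHOLD = 5  # max bit distance still counted as a fuzzy match

def is_match(path: str) -> bool:
    """True if the image's perceptual hash is near any known-bad hash."""
    candidate = imagehash.phash(Image.open(path))
    # Subtracting two ImageHash objects yields their Hamming distance.
    return any(candidate - known <= HAMMING_THRESHOLD
               for known in KNOWN_BAD_HASHES)

if __name__ == "__main__":
    if is_match("upload.jpg"):  # hypothetical uploaded file
        print("match: block the upload and file the mandated report")
```

The point of the fuzzy threshold is that resized, recompressed, or lightly edited copies still land within a few bits of the original's hash, which a plain cryptographic hash would miss entirely.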
Cloudflare (which I think we all know the site uses) already provides a CSAM scanning tool; see https://developers.cloudflare.com/cache/reference/csam-scanning/ for the developer reference.
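Integrating it is configuration, not code: you enable the tool from the Cloudflare dashboard and it scans cached images with fuzzy hashing. If you want an automated sanity check that a flagged URL is actually blocked at the edge, a sketch like this would do, assuming (per my reading of the docs; verify against the reference above) that matched content is served with HTTP 451:

```python
# Hypothetical smoke test: confirm a flagged URL is blocked at the edge.
# Assumption (verify against the Cloudflare reference above): matched
# content is served with HTTP 451 Unavailable For Legal Reasons.
import requests

def is_edge_blocked(url: str) -> bool:
    resp = requests.get(url, timeout=10)
    return resp.status_code == 451

print(is_edge_blocked("https://example.com/flagged-image.jpg"))  # placeholder URL
```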
Those of us who have dealt with this problem in other contexts, or who have done content moderation at scale, hate to see you waste time and energy building a new solution to a solved problem rather than simply deploying the known one that meets your stated goal.