Why are Nazis making such a big comeback in America?

Title says it all. I understand that it serves as an outlet for hatred, with people blaming all their problems on minorities. But why Nazis specifically?