Non-American here!
I've visited America a bunch of times and I really like it as a place. There's amazing scenery pretty much everywhere you look, and just about every individual American I've met has been really nice.
BUT...
I'd never want to live there, though. Their healthcare system is insane (sorry, Americans, but it is), and politically as a nation they're pretty bonkers. Guns, religion, a general sort of global belligerence, etc.
As an aside, San Francisco is genuinely one of the strangest places I've ever been. I dunno if I was just there at a weird time, but it seemed like every single person was either a millionaire or homeless. Absolutely nothing in between.