I moved to Australia ten years ago (the best move I've ever made), and every other year I visit my family in California, but those visits are beginning to freak me out. Either people are becoming more delusional or American culture is becoming foreign to me. It seems as though everyone is constantly selling themselves or acting as if they're being filmed for a movie. I'll definitely be reading Anderson's book.
Also, The Jungle by Upton Sinclair should be mandatory reading. I don't think many Americans understand the importance of labor rights and regulations. It's the most depressing book I've ever read, but it's effective precisely because it reflects reality.