Answer
The book "Radical: Taking Back Your Faith from the American Dream" critically examines American Christianity. The book argues that American Christianity has become too focused on material wealth and success, and that this has led to a loss of focus on the teachings of Jesus Christ. The book calls for a return to a more radical Christianity that is focused on social justice and helping those in need.