The Christian faith, while wildly misrepresented in so much of American culture, is really about death and resurrection. It's about how God continues to reach into the graves we dig for ourselves and pull us out, giving us new life, in ways both dramatic and small.