This is not a troll question; I live in Australia and am just bombarded with evidence that the US is spiralling down the drain. It can't just be that though, can it? Tell what's worth living in America for - the truly great parts that don't make the newsbites. And please don't just say freedom; what's still great about America?

#1

The national parks are probably the only good thing in the United States

#2

The large plates of food and of course, free refills!

#3

We have some really cool museums, and incredibly well-preserved battlefields.

#4

The US is the best in space exploration by a long shot
