I always feel that crime films are about capitalism because it is a genre where it is perfectly acceptable for all the characters to be motivated by the desire for money. In some ways, the crime film is the most honest American film, because it portrays Americans as I experience many of them in Hollywood: very concerned with money.
To me, regardless of who's in office, the government is strangled by business. And the government's priorities are dictated by business. I mean, why does America, even after healthcare reform, still not have free universal healthcare? I'm sure it has something to do with the insurance lobby.