I wouldn't say none of the American presidents were Christian. In fact, most of them claimed to be. However, there is controversy surrounding many of them and their theology (depending on whom you ask, of course). For instance, a lot of people believe Lincoln was a Christian (perhaps he was, though he seems to have been at odds with it from time to time), but he also blamed the war on God and spoke out against Christianity, so I suspect he wasn't really.
Eisenhower, I believe, was. At least, nothing he did directly went against Christianity, and he spoke openly in favor of his Christian beliefs.
The Founding Fathers (Washington, Jefferson, Adams, etc.), however, never claimed to be Christians themselves, despite many people claiming it for them; they counted themselves as deists. Jefferson was a Unitarian, which is traditionally considered a Christian heresy.
JFK was a Catholic, but only nominally, and he did many things against the faith. In fact, the Kennedy family is notorious for pushing policies that go against it.
The notion that America is a Christian nation carries several different meanings. One is that it refers to the nation's Christian morality (which is itself iffy, since so much of that morality was fueled more by the Enlightenment era). Others say it's because the country was founded by Christian men, but that too is iffy.
It is still hotly debated whether America is a Christian nation or not. I side with the notion that it isn't. Many who also hold that view use it to argue that the US should reject Christianity altogether...I wouldn't go there either.