The only thing that lets you know about God and God's doings is the Bible, which would make the Bible the basis of Christianity. Or am I wrong?
There are two testaments in the Bible, the Old and the New. The Old Testament contains a lot of horrible things, but somewhere in the New Testament it says that the rules of the Old Testament no longer apply, which leaves us to live by the more humane part of the Bible.
Still, I assume you believe that God does nothing wrong and is omnipotent. That means you also believe that everything He commanded in the Old Testament was right for its time. In other words, to be a real Christian, you have to believe that slavery, misogyny, the slaughter of men, women and children, and the taking of women (which I think includes underage girls) as sex slaves were completely right, justifiable and morally correct for that time. It is not right to do such things in our time, because of the New Testament. But it was right back then, because those were God's commands, and He wouldn't command something that was morally wrong or evil.
If you don't believe it was right, doesn't that mean you don't believe God is an almighty, omnipotent, good God? Doesn't that make you a non-believer?