I openly, though not proudly, admit that until fairly recently I knew nothing about Israel beyond (roughly) where it's located; I didn't know the history at all.
Now that I've taken the time to do a little research, I have to say I don't understand why America seems to let Israel get away with far more open aggression than it would tolerate from any other country, without facing any real criticism. I understand it's done in the name of self-defence (and I can totally see why they'd be more than a little paranoid, given their history).
What I don't understand, and would be glad if anyone could explain - without launching into the whole 'The Left are terrible because...' or 'The Right are worse because...' routine - is why the US usually seems to defend whatever Israel does.
This isn't a dig at America; I just don't get why Israel seems to be 'forgiven' more readily by the US than other countries are. I could understand it if it were always the UK that stood up for them - after 'giving' them the land, they might feel some duty to defend their behaviour (like sticking up for your kids when they behave badly) - but why the US?
Or is that just the impression I get from the media? Does America historically dislike Israel? Is it, in reality, apathetic toward the place? I live outside the US, so I have no idea what the average person thinks.
Can anyone enlighten me?