I have followed with interest the American attitude toward France. I have even read a little American history, and I don't see how the two reconcile. Do I misread your history, or would you never have won your war for independence without the French? If it weren't for France, would you still be subjects of the Queen of England? Why is there such an attitude today?