Everyone speculates, yet none of us knows the truth. War is evil and unnecessary. It doesn't matter whether it's the war on terror, on drugs, or on socialism/communism; US wars are just as horrendous as Russia's.
The US dominates the global economy and global politics, yet the public refuses to consider the superpower's role in conflicts around the world. Is it really impossible to believe the US had anything to do with this war? And has any conflict since WWII actually improved people's lives?
The US invaded Afghanistan and Iraq to kill two men, and both nations are a mess because of it. Russia's invasion has devastated Ukraine, but ending the war as soon as possible would save lives; isn't that the objective?
My sister served and died, and the patriotic BS doesn't lessen the pain. I don't believe Ukrainians would rather keep fighting than pursue peace, because survival is worth more than death.
We sit on our couches cheering the devastation while people die, and I don't understand it. The war will end eventually; will the Ukrainians who lost loved ones feel it was justified?