What if the far left is intentionally attempting to ruin American principles and national strength in order to increase its own power and control over America and the American people? What if the border crisis was a planned event by the left to open up America to destructive forces that would destroy its culture and wealth? What if the precipitous withdrawal from Afghanistan was a planned event by the left to destroy American prestige and world standing? What if the vaccine mandates by the federal government were intended to undermine the labor market and the greater economy?
Can anyone say with assurance that these are not all likely scenarios?