During WWII there was a curious, and not widely publicised, episode of applied critical thinking. Large Allied bombers were being shot down in growing numbers, and the lucky ones that made it back to base returned riddled with bullet holes all over the fuselage.
To most people it seemed obvious that this meant the fuselage needed to be stronger, with more armour and protection. But heavier plates would not necessarily help: the extra weight would hurt the aircraft's performance.
Abraham Wald, a Hungarian-born Jewish mathematician who had fled Europe, was asked to look into the problem. I don't know exactly why him. The first thing he did was to sketch the distribution of the bullet holes in the returning planes. After doing this many times, he saw a pattern: the areas with more holes were the wings, the tail and the nose of the aircraft, while others, such as the cockpit and a section of the rear, had almost none. The obvious conclusion seemed simple: the areas with the holes were the weak spots of the fuselage, the ones that needed the extra plates, the reinforcement, the thicker armour.
Wald turned the problem and its logic upside down. The reframed question was not where the bullet holes were on the aircraft that returned, but where they would be on the ones that didn't. If any areas of the fuselage needed reinforcement and extra armour, it was not the ones with the holes – those aircraft returned, after all – but the ones with no holes at all, such as the cockpit and part of the rear. Presumably, that is why the other aircraft did not come back.
Wald reframed and inverted the problem, and it cost nothing. The sophisticated simulations that would be the order of the day now were certainly not available then.
Seeing a problem upside down, reframing it and finding 'the other side of the coin', is a tool within a good Critical Thinking approach.
As in the previous Daily Thought, another case of ‘Invert, always invert’. 
The Critical Thinking vignettes continue during this week. Pass it on.