Here’s a story about data that demonstrates how having it doesn’t equal having information. (Information is data plus interpretation.) The story happened to me 30 years ago, but it still informs my skepticism when I get presented with “facts” out of context.
Before I got into information technology (IT), I managed manufacturing and safety engineering for a packaging company. Among our products was food packaging, which we sold internationally. One fine day, we got a telegram (yes, a telegram! — it was a long time ago) from a U.S. government agency telling us that an Asian nation had rejected one of our shipments, which it found to be “radioactive.” Radioactive! OMG, we were making radioactive food packaging, and our little firm had caused an international incident!
As you might imagine, everything stopped — especially in my tiny engineering/safety department — as we scrambled to figure out what might have happened. You see, we did use a tiny bit of radioactive material, Polonium-210, in the “anti-static” devices suspended above our paper-converting production lines. (For those unfamiliar with Polonium-powered anti-static devices, here you go.) While we had bought the devices from a well-regarded supplier, had used them as directed, and had never seen any signs of problems, we wondered whether some radioactive material might be leaking out of them.
The government sent Nuclear Regulatory Commission techs to our factory. I also contacted the device manufacturer, who rushed in their specialists. And I hired our own nuclear specialists to investigate independently. Imagine a low-tech manufacturing plant with several hundred workers watching techs in “silver space suits” sweeping their workplace with Geiger counters and other exotic devices, plus government investigators poking around. Employees (yes, including me) were panicked. And, of course, we pulled lots of finished products and work in progress (WIP) from inventory and our distribution channels at a heavy cost.
None of the experts found anything amiss. The anti-static devices were intact, and no radiation was detected on any product — including product samples we paid a fortune to air express back (in shielded containers, no less) from Asia for testing. Everyone was confused, and the costs of this crisis continued to climb.
Then we started to wonder: Did we really understand the issue? Remember, we had been notified by a U.S. agency (in bureaucratic English, of course) about a communication they had received from a foreign government written in a foreign language. It never occurred to us, a small manufacturer, to doubt what the U.S. government told us. Since we had received a copy of the original notification from the foreign customs service, we contacted a local university and engaged a language professor to carefully re-translate the original communication.
Imagine our surprise when our expert told us the original government translation had a “slight” error: The critical word didn’t translate to “radioactive”; it was actually “phosphorescent” or “luminescent” (i.e., our packaging material glowed when illuminated with an ultraviolet, or “black,” lamp). Perhaps Boomers remember using liquid laundry detergent to “paint” invisible designs on clothes and faces, then turning on a black light to see the resultant glowing designs? (Today’s equivalent, I suppose, is the “Glow Party.”) That glow came from the “phosphates” used as brightening agents in the detergents — just like the phosphates used in making paper. (And you Boomers may also remember TV advertising campaigns about reducing phosphates due to their pollution dangers.)
Our shipment had been rejected because the phosphate levels in the paper we used to create packaging exceeded that nation’s stringent limits (even though the phosphate levels were acceptable in the U.S.). What should have been an “Oops, we need to buy different paper if we want to sell into that market” became a costly lesson in nuclear physics for my team, the entire firm, our customers, and the anti-static device maker.
As a manager today, you’re deluged with data. You receive reports, charts, and graphs loaded with data points (sales by division, productivity per hour, etc.), and it’s up to you to make sense of that data, i.e., to turn data into “information” (data plus context). If you don’t have the right context, data can lead you to make bad decisions. As an example, when I worked in financial services, we had a huge spike in expenses for a couple of weeks that got us into trouble with our European parent company. It had no idea what “hosting the Super Bowl” in one of our cities meant in terms of cash needs…and we didn’t realize that it didn’t know. Data without context.
The lesson for everyone involved in making sense of “noisy,” incomplete, conflicting data: Don’t take incoming data at face value and let it drive hasty and expensive decisions. And don’t let “authority figures” overwhelm you with their supposedly superior knowledge. Think about what you’re seeing and hearing, and try to get confirmation before jumping to fraught conclusions. I became quite a skeptic back then and remain one today; the lesson is as relevant now as it was three decades ago.
And by the way — we never got so much as an apology from the government.