Imagine this problem statement. 👇
An organization is grappling with high employee turnover, and they want you to predict the attrition so that planning becomes easier.
Distinguishing real challenges from stated challenges in a consulting discussion is difficult on its own, and adding analytics to the mix makes it harder still. Organizations embarking on a transformational journey with analytics often face a learning curve that makes it difficult to articulate clearly the expected outcomes of an effort.
Once the problem statement has been defined, it may be tempting to immediately analyze the available employee data, gather relevant information, and build a highly accurate black box model for predicting attrition. However, this approach will fail if the expected outcomes of the effort have not been clearly established.
If we look again at the problem statement laid out for the analytics consultant, “planning” is a loaded word that should be unpacked diligently during the initial stages of project setup. Does planning mean recruitment planning that factors in attrition? Or does it mean employee engagement planning to reduce attrition? Understanding this is vital because the considerations for a model whose endpoint is accuracy are very different from those for a model whose endpoint is explainability. The black box model we developed might be useful for planning recruitment but useless for preventing attrition. “How much attrition?” and “Why this attrition?” require entirely different approaches to analytics.
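To make the distinction concrete, here is a minimal sketch that contrasts the two endpoints; the data is synthetic and the feature names are hypothetical, not drawn from any real employee dataset. A black box classifier answers “how much attrition?” for recruitment planning, while an interpretable model’s coefficients support a conversation about “why this attrition?”

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)

# Hypothetical features; a real project would use variables vetted with HR SMEs
feature_names = ["tenure_years", "salary_band", "overtime_hours"]
X = rng.uniform(0, 10, size=(2000, 3))
y = (0.5 * X[:, 2] - 0.3 * X[:, 1] + rng.normal(0, 1, 2000) > 2).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=7)

# Endpoint 1: accuracy. A black box classifier is adequate for recruitment planning,
# where what matters is a reliable estimate of expected attrition volume.
black_box = GradientBoostingClassifier().fit(X_train, y_train)
expected_leavers = black_box.predict_proba(X_test)[:, 1].sum()
print(f"Expected leavers in this population: {expected_leavers:.0f}")

# Endpoint 2: explainability. An interpretable model (or an explained black box)
# is needed to discuss which levers might actually reduce attrition.
glass_box = LogisticRegression(max_iter=1000).fit(X_train, y_train)
for name, coef in zip(feature_names, glass_box.coef_[0]):
    print(f"{name}: {coef:+.2f}")
```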
Hence, it becomes paramount that most, if not all, of the expected outcomes of an analytics effort are discussed and either accepted or ruled out during the initial stages of project setup. Otherwise, the resources invested in executing the project may not yield the ROI the organization had hoped for.
Examining the concept of explainability more closely reveals its own set of challenges. Techniques such as Anchors can make black box models more transparent, but true explainability requires that all the factors which actually influence the outcome be included in the model. Including only variables that correlate with the actual drivers of the outcome may be sufficient for accuracy; it is not enough for explainability, where the drivers themselves must be present.
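As an illustration, the sketch below applies the Anchors technique, via the open-source alibi library, to a black box attrition classifier trained on synthetic data; the feature names and data are invented for illustration. The result is a human-readable rule that locally anchors the model’s prediction, which is useful for transparency but is not, by itself, a full account of the true drivers.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from alibi.explainers import AnchorTabular  # pip install alibi

rng = np.random.default_rng(0)

# Synthetic, illustrative data with hypothetical feature names
feature_names = ["tenure_years", "years_with_manager", "salary_band"]
X = rng.uniform(0, 10, size=(1000, 3))
y = ((X[:, 0] < 3) & (X[:, 2] < 4)).astype(int)  # attrition flag

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

# Anchors searches for a rule (a set of feature conditions) such that, whenever
# the rule holds, the model's prediction stays the same with high precision.
explainer = AnchorTabular(clf.predict, feature_names)
explainer.fit(X, disc_perc=(25, 50, 75))

explanation = explainer.explain(X[0], threshold=0.95)
print("Anchor rule:", " AND ".join(explanation.anchor))
print("Precision:", explanation.precision)
print("Coverage:", explanation.coverage)
```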
Expanding on the attrition example: total vintage (the number of years of experience with the organization) is, in most cases, highly correlated with the number of years an employee has worked with the same manager. If only one of them is included as a modeling variable, we risk reading the output wrong, because the included variable absorbs the effect of the omitted one. Hence it becomes crucial that a qualitative assessment of driver variables is carried out with the help of subject matter experts during the data collection and modeling phases, to avoid common, erroneous interpretations.
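The sketch below illustrates this risk with synthetic data; the feature names and effect sizes are invented. Years with the same manager is constructed as the true driver of attrition, yet a model that sees only tenure attributes the effect to tenure, while including both variables restores the correct attribution.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
n = 5000

tenure = rng.uniform(0, 20, n)  # total years with the organization
# Strongly correlated proxy: years spent with the current manager
years_with_manager = np.clip(0.7 * tenure + rng.normal(0, 2, n), 0, None)

# Ground truth (synthetic): attrition risk rises only with years under the same manager
logits = 0.4 * years_with_manager - 3
attrition = rng.binomial(1, 1 / (1 + np.exp(-logits)))

# Model A: tenure only -> tenure *appears* to drive attrition
model_a = LogisticRegression(C=1e6, max_iter=1000).fit(tenure.reshape(-1, 1), attrition)
print("tenure-only coefficient:", model_a.coef_[0][0])

# Model B: both variables -> the effect is attributed mainly to years_with_manager
X = np.column_stack([tenure, years_with_manager])
model_b = LogisticRegression(C=1e6, max_iter=1000).fit(X, attrition)
print("tenure and years_with_manager coefficients:", model_b.coef_[0])
```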
Analytics consulting involves many more intricate considerations than those covered here. However, by clearly defining the expected outcomes and ensuring transparency in the decision-making process, organizations can significantly enhance the efficacy of their analytics endeavors, foster greater adoption, and realize substantial returns on their investments.