Imagine this problem statement. 👇
An organization is grappling with high employee turnover, and they want you to predict the attrition so that planning becomes easier.
Distinguishing the real challenges from the stated ones in a consulting discussion is hard enough; adding analytics to the mix makes it harder still. Organizations embarking on a transformational journey with analytics face a learning curve that can make it difficult to articulate clearly the expected outcomes of an effort.
Once the problem statement has been defined, it may be tempting to immediately analyze the available data on employees, gather relevant information, and create a highly accurate black box model for predicting employee attrition. However, this approach will fail if what the organization actually needs is not a prediction of attrition but an understanding of what drives it.
If we look again at the problem statement laid out for the analytics consultant, “planning” is a loaded word that should be unpacked diligently during the initial stages of project setup. Does planning mean recruitment planning that factors in attrition? Or does it mean employee engagement planning to reduce attrition? Understanding this is vital because the considerations for accuracy as an endpoint are very different from those for explainability as an endpoint. The black box model we developed might be useful for recruitment planning but will be useless for preventing attrition. “How much attrition?” and “Why this attrition?” require entirely different approaches to analytics.
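To make the contrast concrete, here is a minimal sketch in Python. Everything in it is hypothetical: the features, the synthetic data, and the choice of models are illustrative assumptions, not a prescribed stack. It pairs an opaque gradient-boosting model, whose aggregated predictions could feed recruitment planning, with a simple logistic regression whose coefficients can at least start a conversation about why people leave.

```python
# Sketch: "how much attrition?" vs "why this attrition?" on synthetic HR data.
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 2_000
df = pd.DataFrame({
    "tenure_years": rng.gamma(2.0, 2.5, n),          # hypothetical features
    "salary_band": rng.integers(1, 6, n),
    "overtime_hours": rng.normal(8, 4, n).clip(0),
})
# Synthetic ground truth: overtime and low salary band drive attrition.
logit = 0.15 * df["overtime_hours"] - 0.6 * df["salary_band"] + 0.5
df["left"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    df.drop(columns="left"), df["left"], random_state=0
)

# Endpoint 1: "How much attrition?" -- an opaque but accurate model whose
# aggregated predictions feed recruitment planning.
black_box = GradientBoostingClassifier().fit(X_train, y_train)
expected_leavers = black_box.predict_proba(X_test)[:, 1].sum()
print(f"Expected leavers in the test population: ~{expected_leavers:.0f}")

# Endpoint 2: "Why this attrition?" -- a simpler model whose coefficients
# can be discussed with HR to shape engagement planning.
glass_box = LogisticRegression(max_iter=1000).fit(X_train, y_train)
for name, coef in zip(X_train.columns, glass_box.coef_[0]):
    print(f"{name:>15}: {coef:+.2f}")
```

The point is not that one model is better; it is that the two endpoints call for different modeling choices, and that choice has to be made before the work starts.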
Hence, it becomes paramount that all, or at least most, of the expected outcomes of an analytics effort are discussed and either accepted or dismissed during the initial stages of project setup. Otherwise, the resources invested in executing the project may not yield the ROI the organization hoped for.
Examining the concept of explainability more closely reveals its own set of challenges. While techniques such as Anchors can make black box models more transparent, true explainability requires that all the factors influencing the outcome be included in the model. Including only variables that correlate with the actual drivers may be sufficient for accuracy, but it is not enough for explainability; the drivers themselves must be in the model.
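For readers who want to see what an Anchors explanation looks like in practice, here is a minimal sketch using the open-source `alibi` package, one possible implementation of the technique. The dataset, feature names, and classifier are synthetic assumptions made purely for illustration.

```python
# Sketch: Anchors rules for a single prediction, via alibi's AnchorTabular.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from alibi.explainers import AnchorTabular

rng = np.random.default_rng(0)
feature_names = ["tenure_years", "salary_band", "overtime_hours"]
X = np.column_stack([
    rng.gamma(2.0, 2.5, 1_000),
    rng.integers(1, 6, 1_000),
    rng.normal(8, 4, 1_000).clip(0),
])
y = (0.15 * X[:, 2] - 0.6 * X[:, 1] + rng.normal(0, 0.5, 1_000) > 0).astype(int)

clf = RandomForestClassifier(random_state=0).fit(X, y)

# AnchorTabular learns IF-THEN rules that "anchor" the model's prediction for
# one employee, e.g. "overtime_hours > 12 AND salary_band <= 2".
explainer = AnchorTabular(clf.predict, feature_names)
explainer.fit(X, disc_perc=(25, 50, 75))
explanation = explainer.explain(X[0], threshold=0.95)

print("Anchor:   ", " AND ".join(explanation.anchor))
print("Precision:", explanation.precision)
print("Coverage: ", explanation.coverage)
```

Rules like these make the model's behavior easier to discuss, but they only explain the model, not the business reality, which is exactly why the right variables must be in the model in the first place.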
Expanding on the attrition example, total vintage (the number of years an employee has spent with the organization) is, in most cases, highly correlated with the number of years the employee has worked with the same manager. If only one of them is included as a modeling variable, we risk reading the output wrong. Hence, it is crucial that a qualitative assessment of driver variables is carried out with the help of subject matter experts during the data collection and modeling phases, to avoid erroneous interpretations.
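To make that risk concrete, here is a small, entirely synthetic simulation under assumed numbers: attrition is driven only by years spent with the same manager, yet a model that sees only total tenure attributes the effect to tenure.

```python
# Sketch: misattribution when only one of two correlated variables is modeled.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 5_000
tenure = rng.gamma(2.0, 3.0, n)
years_with_manager = tenure * rng.uniform(0.6, 1.0, n)          # correlated with tenure
p_leave = 1 / (1 + np.exp(-(0.4 * years_with_manager - 2.0)))   # the true driver
left = (rng.random(n) < p_leave).astype(int)

df = pd.DataFrame({"tenure": tenure, "years_with_manager": years_with_manager})
print("Correlation:", round(df.corr().loc["tenure", "years_with_manager"], 2))

# Model A: tenure only -- the tenure coefficient soaks up the manager effect,
# inviting the wrong conclusion that people leave "because of" tenure itself.
model_a = LogisticRegression(max_iter=1000).fit(df[["tenure"]], left)
print("Tenure-only coefficient:", round(model_a.coef_[0, 0], 2))

# Model B: both variables -- most of the effect should land on the true driver,
# which is the story a subject matter expert would recognize.
model_b = LogisticRegression(max_iter=1000).fit(df, left)
print("Both-variables coefficients:", dict(zip(df.columns, model_b.coef_[0].round(2))))
```

A subject matter expert would flag the tenure-versus-manager overlap before modeling starts, which is precisely the qualitative assessment the paragraph above argues for.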
Analytics consulting involves many more intricate considerations than can be covered here. However, by clearly defining the expected outcomes and ensuring transparency in the decision-making process, organizations can significantly enhance the efficacy of their analytics endeavors, foster greater adoption, and realize substantial returns on their investments.