Dear Experts, I am an Indian expatriate working in the Middle East. I work for a wonderful company for which I have a lot of respect, but there are certain things which are completely outrageous, at least from the perspective of an employee. Our company policy states that an employee must have three years' experience at the company to be given their vacation salary before going on vacation. If the employee has less than three years' experience, the company pays their vacation salary only after their return from vacation. Management says this is because of those employees who work for a year or two and do not return. They say they hold the vacation salary because it covers the expenses the company incurred for processing the visa, paying the manpower agency, the flight ticket, and other arrangements when they hired the employee from another country.
I want the company to know I don't feel good about this policy. How can I tell my management this policy makes me feel negative about the company and, more so, makes me feel the company doesn't trust me? I want to convey the message in the best way possible without spoiling my image and without being rude.
Here is how our CAREEREALISM-Approved Experts answered this question on Twitter:
- Q#449 I'd keep it quiet unless you have an inroad to someone who can make a change. Sorry. (@beneubanks)
- Q#449 This is a sticky situation. I might be tempted to stay quiet. Sometimes rocking the boat isn't the best option. (@gradversity)
- Q#449 Agree w/ Dawn & J.T. Keep quiet; in fact, try to find way to understand their reasoning helping to resolve your angst. (@ValueIntoWords)
- Q#449 Best to keep opinion to yourself on this. Policy in place for a reason. NOT personal, business. Their company. Their rules. (@DawnBugni)
- Q#449 I wouldn't say anything. This policy's in place because they got burned. No point in trying to change it. (@jtodonnell)
Our Twitter Advice Project (T.A.P.) is no longer an active campaign.
We get it. Looking for work can be scary, especially if you’ve been at it for a long time and haven’t gotten any results.
Understanding which fears are getting in the way and how to overcome them will make all the difference. Sometimes you might not be aware of which obstacle is getting in the way of your goals. If you want to overcome these fears once and for all, we invite you to join us!
In this training, you’ll learn how to:
- Utilize strategies for coping with your job search fears
- Be confident in your job search—from writing your resume to networking
- Face your fears and move forward
Join our CEO, J.T. O'Donnell, and Director of Training Development & Coaching, Christina Burgio, for this live event on Wednesday, October 5th at 12 pm ET.
CAN'T ATTEND LIVE? That's okay. You'll have access to the recording and the workbook after the session!
During the pandemic, organizations had to interact with their customers digitally. Contact centers provided the company’s “human face.”
Without face-to-face interactions, it is a lot harder to understand how your customers feel, since you cannot experience customer behavior directly.
Running a contact center is like steering a submarine: you need a periscope to see what is going on.
What Does A Contact Center Manager Use For A Periscope?
Contact center managers have two tools—post-call customer satisfaction (CSAT) surveys and sentiment analysis.
CSAT surveys ask customers to react after their encounters with the company, prompting them to give a numerical score.
Sentiment analysis uses speech analytics to take customers’ “emotional temperature” during the conversation.
I believe that sentiment analysis is a better “periscope” than post-call surveys.
Post-Call CSAT Measurement
How It Works
When the interaction ends, the automated survey asks the customer to give a numerical score. This measures how they feel about the interaction. Customers may also be asked to say why they gave this score.
Survey Wording Issues
One popular CSAT measurement is the net promoter score (NPS). Customers are asked how likely they are to recommend the company to their friends and relatives.
NPS’s strongest advocates believe asking how likely customers are to recommend the company is better than asking how happy they feel. It’s not clear, though, how carefully respondents think about the question. They are asked to respond without warning, and they rarely have the time or the interest to consider the question carefully, so their response will most likely reflect their emotional state in the moment.
NPS’s scoring system may not match up with how customers think. NPS classifies anyone giving a score of 6 out of 10 or below as a “detractor,” someone who will complain about the company. Customers giving 9 or 10 out of 10 are classified as “promoters,” people who will tell everyone how good the company is. Those giving 7 or 8 are classified as “passive.” Respondents are unlikely to think in such depth: if their problem has been solved, they will give a high score; if it hasn’t, they will give a low score. Some respondents have even given a 7 or 8 because “they never give 10 on principle.”
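This classification maps to the standard NPS formula: the percentage of promoters minus the percentage of detractors, giving a score between -100 and +100. A minimal sketch in Python (the sample scores below are invented for illustration):

```python
def nps(scores):
    """Net promoter score from 0-10 survey responses.

    Promoters score 9-10, detractors 0-6, passives 7-8;
    NPS = %promoters - %detractors, ranging from -100 to +100.
    """
    if not scores:
        raise ValueError("no responses")
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / len(scores)

# 4 promoters, 3 passives, 3 detractors out of 10 responses
print(nps([10, 9, 9, 10, 7, 8, 7, 6, 3, 5]))  # -> 10.0
```

Note how the passives drop out entirely: a customer who "never gives 10 on principle" and answers 8 contributes nothing to the score, even though they may be perfectly satisfied.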
About 3% of customers respond to post-call surveys. This is too small to be considered a representative sample. Where results show poor CSAT, this may reflect angry customers’ motivation to show their feelings or get “revenge” on the agent. It does not necessarily indicate how all customers feel.
Inconsistent customer reactions and low sample sizes make aggregating CSAT data a frustrating task. Inaccuracies potentially baked into each result are then compounded by the volume of results.
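The effect of a 3% response rate can be quantified with the standard margin-of-error formula for a sample proportion. The call volume below is hypothetical, and note that the formula assumes a random sample, which self-selected survey respondents are not, so the true uncertainty is even larger:

```python
import math

def margin_of_error(sample_size, proportion=0.5, z=1.96):
    """95% margin of error for an observed proportion (worst case p=0.5)."""
    return z * math.sqrt(proportion * (1 - proportion) / sample_size)

# A queue handling 10,000 calls a month with a 3% survey response rate
responses = int(10_000 * 0.03)                  # 300 completed surveys
print(f"+/- {margin_of_error(responses):.1%}")  # about +/- 5.7%
```

So even before accounting for self-selection bias, a reported "82% satisfied" could plausibly sit anywhere from roughly 76% to 88%.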
At a high level, ranking agents’ average CSAT or NPS scores can raise some red flags if an agent has a lower score than the team average. The same can apply to team or queue averages.
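A league table like this is easy to sketch; the agents, their scores, and the one-point flagging threshold below are all invented for illustration:

```python
from statistics import mean

# Hypothetical per-agent average CSAT scores on a 1-10 scale
agent_csat = {"agent_a": 8.4, "agent_b": 7.9, "agent_c": 6.1, "agent_d": 8.0}

team_avg = mean(agent_csat.values())
# Flag anyone more than a full point below the team average
flagged = sorted(a for a, s in agent_csat.items() if s < team_avg - 1.0)

print(f"team average: {team_avg:.2f}")  # team average: 7.60
print("review:", flagged)               # review: ['agent_c']
```

The same two lines of aggregation work unchanged for team or queue averages.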
Sentiment Analysis
How It Works
This is a much newer technology than post-call surveys. Speech analytics software can be programmed to identify and indicate whether customers are expressing positive, neutral, or negative feelings.
It is trained to recognize such feelings based on samples where the speaker’s feelings are known. The system uses artificial intelligence (AI) to construct a picture of which combinations of phrases, pitch, pace, and volume match feelings that have been identified in a recording by the AI’s trainer. Where mismatches are discovered, the system can be further trained.
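The training idea can be illustrated with a toy nearest-centroid classifier. Real systems use far richer acoustic and lexical models, and every feature value and label below is invented for illustration:

```python
import math

# Hypothetical labeled samples: (average pitch in Hz, words per second,
# volume in dB) for calls whose sentiment was identified by a trainer.
labeled = {
    "negative": [(220, 3.8, 72), (240, 4.1, 75)],
    "neutral":  [(180, 2.9, 63), (175, 3.0, 61)],
    "positive": [(200, 3.2, 66), (210, 3.1, 68)],
}

def centroid(samples):
    """Average each feature across a label's training samples."""
    return tuple(sum(xs) / len(xs) for xs in zip(*samples))

centroids = {label: centroid(s) for label, s in labeled.items()}

def classify(features):
    """Assign a new call to the nearest labeled centroid."""
    return min(centroids, key=lambda lbl: math.dist(features, centroids[lbl]))

print(classify((235, 4.0, 74)))  # fast, loud, high-pitched -> "negative"
```

Misclassifications found in review become new labeled samples, which is the "further training" described above.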
For sampling purposes, the sounds on a voice call can be split into each party on the call and analyzed separately. Sentiments can be identified even when both parties are speaking at once.
This is the major differentiator between sentiment analysis and post-call surveys: sentiment analysis is usually applied to all calls. It can also be applied to all parts of a call, showing users how customers’ feelings change throughout the conversation. The sample size is likely to equal the entire population being studied, so sampling error is essentially eliminated.
Sentiment analysis reflects how the customer feels without having to process and respond to a question.
One limitation is that, because there is no question, you cannot tell why the customer is angry. The root cause might be the agent’s behavior, an issue with the product, or something unrelated to the call entirely.
Since the sample size is so much larger, there is more scope for aggregation and analysis. You need to build a set of benchmarks to establish what is “normal” for your population. If a water-utilities contact center handles issues relating to wastewater disposal, customer sentiment will be fairly negative as a matter of course.
League tables showing average sentiment by agent, team, or queue can quickly identify where improvements can be made. Comparing or correlating this with other data such as call length or first contact resolution, you can see how contact center operations affect customer perception. You can see what makes customers angry or happy, and then tune your offerings as a company accordingly.
Sentiment analysis clearly produces more data than a post-call survey, but it's usually more expensive to collect. Cloud computing is making speech analytics and sentiment analysis more affordable for smaller contact centers.
What do you use as a “periscope” on your contact center? How useful are the results? Do they match your expectations or are they surprising? I’d love to hear more!
In this article, we are going to review the elements of a good analytics planning framework and how analytics planning is part of data product ownership in the data mesh.
What Is Analytics Planning?
Any CDO or CDAO role includes both data and analytics governance and a process for ensuring that analytics and insights are generated from the right data to solve a variety of business problems.
To make sure that data products (i.e., dashboards, insights, commercialized analyses, etc.) in the data mesh are fit for purpose, the business problem and the analytics problem must be framed up front; only then can the solutions be workable and high impact.
Analytics planning and next-generation analytics are helpful to a variety of stakeholders—chief data analytics officers, chief data scientists, heads of marketing analytics, and heads of digital analytics.
Data analytics is often organized as a center of excellence (COE), so it is vital for the professionals in the COE to have a seat at the table, whether that is with a data product owner, a tribe lead, or a business person. This linkage matters not only from a relationship-management standpoint but also because it enables the right data mesh design by helping to identify the right analytics and data products. The goal is to get the data needed to improve business decision-making and monetization.
What Type Of Meeting Or Committee Does Analytics Planning Require?
Analytics liaisons and data stewards from the COE should meet with data product owners and business people in what I call data analytics governance meetings where the types of analytics and data products are discussed. This is a “seat at the table” meeting among business partners to discuss the appropriate types of proactive analytics that would drive problem solutions and business impact.
Data analytics topics to be discussed include:
- Data requirements
- Descriptive analytics
- Predictive and prescriptive analytics
- Data products and monetization tactics
These leadership meetings should occur at least quarterly. Monthly (or more frequent) reviews should occur at the project team level. Typically, data analytics functions can have hundreds or thousands of projects depending on the number of business partners.
What Is The Business Purpose Of These Planning Meetings?
For analytics or data products to be fit for purpose, you will want to review the partner's business strategy as well as any P&L drivers where analytics might have an impact:
- Frame the business problems and opportunities.
- Determine if the data mesh/data fabric can support these efforts.
- Then decide what the deliverables/solutions are and the path to deploy. Don’t lead with models, analyses, or research outputs. Ensure that if you build a solution there is a commitment from the client to deploy it with an understanding of the potential business benefit.
Data analytics governance creates a prioritization process.
The prioritization process could include business ROIs, GCOs (good customer outcomes), or other metrics to determine what gets worked on first. Are these projects high priority, medium, low, strategic, or even non-negotiable? (Non-negotiable might mean compliance projects, in which case the data analytics team must carve out bandwidth for them while still creating new data and analytics pilots. Pilots could include identification of new segments or new scoring systems based on transaction data and more.)
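One way to make such a prioritization concrete is a simple sort key; the projects, tiers, and ROI figures below are invented for illustration:

```python
# Non-negotiable (e.g., compliance) work always sorts first;
# within a tier, higher projected ROI wins.
TIER_RANK = {"non-negotiable": 0, "strategic": 1, "high": 2,
             "medium": 3, "low": 4}

projects = [
    {"name": "churn model",       "tier": "high",           "roi": 1.8},
    {"name": "KYC reporting",     "tier": "non-negotiable", "roi": 0.0},
    {"name": "lead scoring",      "tier": "high",           "roi": 2.5},
    {"name": "dashboard refresh", "tier": "low",            "roi": 0.4},
]

queue = sorted(projects, key=lambda p: (TIER_RANK[p["tier"]], -p["roi"]))
print([p["name"] for p in queue])
# -> ['KYC reporting', 'lead scoring', 'churn model', 'dashboard refresh']
```

In practice the key would blend ROI with GCOs and strategic weightings, but the principle is the same: an explicit ranking function that everyone at the table has agreed to.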
Data Analytics Planning — It All Goes Back To Business Problem Framing.
What is the number one reason analytics fail? We hopefully all know this, but it is worth mentioning again: the number one reason analytics fail is due to a failure to frame the business problem correctly.
What type of problems may clients mention to the data analytics team during the quarterly check-ins?
- How are we improving against customer expectations?
- Are we connecting with prospective customers?
- How do we qualify sales leads for better cross-sell/upsell?
Analytics Problem Framing: Choosing The Type Of Analytics Method To Solve The Problem.
Let’s review the categories of analytics that may be part of the discussion during the analytics planning meeting with the business and product owners.
- Metrics and measurement. How does the business person or product owner run their business line? That which is measured is actioned. Setting KPIs becomes a focal point for understanding key drivers of any problem and provides the jump-off point for additional analytics. KPIs and metrics are considered more of a BAU type of analytics and answer questions such as:
  - How many customers do we have in which segments?
  - How many and what channels are they using?
- Describing and profiling: often helps define customer behaviors.
  - Which customers are profitable? Helps understand the 80/20 rule.
  - What prospects are similar to our customers? Look-alike profiles, etc.
  - What is the financial situation of our customers—are they wealthy, what life stage are they in, etc.?
- Knowledge discovery: surfacing unknown patterns in customer behavior. For example, if you're in a bank, are certain checking customers diminishing their balances, which may mean they're taking their money out and potentially putting it elsewhere? Intervention strategies can be designed from this type of knowledge discovery.
- Segmentation and clustering: grouping customers into homogeneous groups, for example, based on their value, life stage, potential, etc.
- Algorithms and prediction. Many data science and statistical methods can help to predict the customer's responsiveness, next best action, right channel to engage, risk level, and more.
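The knowledge-discovery example above (checking customers quietly draining their balances) can be sketched as a simple scan; the account names and balances are invented for illustration:

```python
# Hypothetical month-end checking balances, oldest to newest
balances = {
    "cust_1": [5200, 5100, 4100, 2900],  # steadily declining
    "cust_2": [3100, 3300, 3250, 3400],
    "cust_3": [900, 700, 650, 400],      # steadily declining
}

def declining(series, months=3):
    """True if the balance fell in each of the last `months` months."""
    recent = series[-(months + 1):]
    return all(b < a for a, b in zip(recent, recent[1:]))

at_risk = sorted(c for c, series in balances.items() if declining(series))
print(at_risk)  # -> ['cust_1', 'cust_3']
```

Each flagged customer becomes a candidate for an intervention strategy, and the same pattern-scan approach generalizes to the segmentation and prediction categories above.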
So that's a little bit about how to match the business problem to the type of analytics. The next step would be for the analytics leader or analytics liaison to work with the data product owner or business lead to provide an endorsed quarterly data analytics plan which would also identify data needs in order to perform the agreed-upon analytics.
What are the elements of the analytics plan?
- A list of prioritized BAU initiatives that have been agreed upon from the meeting with the product owners along with business goals and projected returns generated from insights.
- Agreement on the type of analytics deliverables and the path to deploy. For example, will this model be scored on an ongoing basis to provide targeted leads to salespeople? If the business person or the product owner declines to leverage learnings, then these analytics should be prioritized as low or even canceled.
- Agreement to proactively serve up new analytics. Some level of innovative pilots should be part of any analytics planning framework. This approach takes the data analytics team out of defensive mode and puts them in an offensive, proactive, and prescriptive position.
- Analytics planning includes an agreement to do an ongoing blueprint and roadmap for analytics which includes an assessment of the maturity level of the firm’s data analytics. Unfortunately, many of the maturity models that exist only focus on data governance and don’t connect the dots between data maturity and data analytics maturity. A data analytics maturity assessment and blueprint must include looking at the level of next-generation analytics that the firm is developing and testing including RPA, generative AI, machine learning, and more. One view in the plan should assess the level of defensive data analytics the team is involved in versus offensive analytics. (Get in contact with me if you need more information about this maturity model.)
Given that the data mesh places higher quantitative demands on business partners, it is imperative for all stakeholders to have a better understanding of data, analytic methodologies, and execution. Training and knowledge maturity are critical.
I hope this post helps fill in some of the planning gaps in the data mesh concept and shows how analytics planning can inform what the data product owners can work on and how an ongoing engagement and governance model can be established to benefit both the analytics team as well as the business as a whole.
What has your experience been with data analytics planning in the data mesh? We look forward to hearing your thoughts.