Many new data concepts have emerged in the last few years, such as data mesh and data fabric (the subject of a future post), which seek to solve a real problem: data needs to be distributed to the entire organization, and users want to access it faster. The idea that we need a more integrated and distributed data environment is well accepted and makes sense in data analytics circles.
Let's start with the definition from Wikipedia, which for data mesh is as good a source as any: a data mesh is a sociotechnical approach to building a decentralized data architecture by leveraging a domain-oriented, self-serve design. With data mesh, the responsibility for analytical data shifts from the central data team to the domain teams, supported by a data platform team that provides a domain-agnostic data platform.
Achieving the promise of distributing data to drive insight presumes that the data is of high quality and that business domains have the readiness, maturity, and skills to harness it and to "self-serve" their way to insights that drive business impact. The vision of distributing data and insights to increase business impact is one that most CDAOs, CAOs, and CDOs embrace. In fact, most of us advocate for centralization first, to stabilize and quality-assure the data and to create platforms with gold-standard data, and only then for a hybrid model where the platforms remain well maintained but access and teams are decentralized and linked throughout the business lines.
One observation is that data mesh appears to have been put forward as a conceptual or theoretical idea without being well defined and without its strengths and weaknesses spelled out. Technology circles have a history of failed adoptions, CRM platforms among them, so as we journey into this, we don't want to "build it, and they will come" or go from data mesh to data mess. Ok, so let's define the data mesh from what is known so far. Hopefully, we can debunk some of the nebulous shiny object syndrome around the mesh so we can go forward with our eyes wide open, asking good questions and adopting the best parts of the mesh wherever possible.
First and foremost, and I will say this throughout this piece: someone will need to put forward a tested, commercially viable data mesh solution, which does not exist to date. Well, that's the spirit of test and learn, I suppose. Ok, so here we go. Are you ready to fasten your seat belts? If I could bring back Janice (OMG lady) from the series Friends right now, I would.
The Data Mesh Is A Theoretical Concept Or Construct Which Says The Following...
- Data mesh is a philosophy or a theory to drive architectures. I have not yet seen how this architecture manifests in a transparent way.
- Data is a strategic asset. Ok, no issue with that premise.
- There is no technological solution prescribed for the data mesh as of yet. This could be problematic as data mesh is not a tested construct, especially across industries.
- Data can be self-describing. The idea that data can be discovered and understood in the product sense can be problematic in some industries, as it presumes that business users know and understand the data and can back up the data engineers and analysts on a centralized platform team. I can buy this one if you are in a Silicon Valley software company, but not if you are in banking or financial services, where some product managers don't even have advanced Excel skills. End-user maturity is still evolving. Data mesh advocates should define these dependencies.
- Provisioned for access. Ok, I can buy this part, but just because you can supply data doesn't mean that the end users understand the data and know how to use it.
- FAIR data: findable, accessible, interoperable, and reusable. Ok, it certainly sounds good if it works smoothly. However, if it results in tons of duplication and the data isn't as well defined as promised, what we were trying to solve with data mesh may cause the "Wild West Data Effect (WWDE)," with data replicated and flying around the organization. It is easy to say, "oh, go ahead and duplicate the data," but shouldn't it be planned duplication? Does duplicated data exist in the mesh-o-verse forever?
- Some experts use the term knowledge graph interchangeably with data mesh. No issue with this, but I prefer a well-defined technology solution.
- Whether or not data mesh (DM) is an authentic architecture remains to be seen.
- DM assumes centralized database structures/teams don't work. Not sure I agree that centralized teams and platforms don't work; I think it is more about how CDAOs and CDOs link the team through the operating model and governance through partnerships.
- Data pipelines are fragile. I agree they are fragile and difficult to manage. Many new tools should be discussed in the context of data mesh that most vendors don't mention. Where is the discussion of RPA, Pega, Immuta, Matillion, and more?
- Data engineers in the COE for data don't know the data well as they aren't using it. My POV is that it depends on the talent architecture and if it considers experiences and industry. The vendor's statement is an over-generalization that needs to be revisited.
- Analytical data is different from operational data. This point I agree with. But not all data needs to be returned to the data warehouse or data lake; it depends on what you want to do and where you want to do it. Many source systems have operational reporting for operational data, and many also have dashboards. So, this goes back to defining use cases and having a blueprint/strategy for what you want to do and where. Some of the vendor commentary around this point needs to be analyzed, and firms need to get back to basics and probe data mesh vendor roadmaps and completeness of vision. What parts of the DM actually exist in any ecosystem?
- There are many monolithic and centralized data repositories. I don't think many firms have even gotten to ETL, especially not globally, let alone ELT and data mesh; much of the dialogue deals with Fortune 50 companies, not even Fortune 1000 companies.
- Data mesh seems to downplay the fact that data analytics is a professional competency. It is believed DA is a bottleneck and is not connected to execution, which in most cases is far from the truth. If the skill sets genuinely existed in the business lines, this would have happened by now. So we need to examine all of the connected roles in IT and operations to really understand the full picture of bottlenecks and of centralization versus decentralization.
- Domain-driven data ownership architecture: I agree with this point if the domains, via data stewards, can drive their architecture, but I have not seen this often. Domains often know their data but have no idea how to create data products or do analytics, let alone data modeling. I chuckle when I hear simple comments like "let's change the paradigm." I wish we could have a world where everyone knew analytics and engineering. That would genuinely be nirvana.
- Data as a product (data domains are the product). This is a great idea, but how do we connect these products across all the data, given that we still want to be customer-centric? As long as this doesn't create data product silos, then fine. Most vendors who talk about data products don't think about enterprise or customer centricity. It would be good for data mesh advocates and researchers to explain how to connect customer data to product data across multiple domains (LOB data areas). The term "data product" could be very confusing to business users, as we have been talking about customer views for a long time. This needs to be better defined than anything I have seen in the business press.
- Data should be served and useable at the source. It sounds great. I would like to see how this will work without recreating the processes/tools in DA COEs. I would love to see how vendors push these capabilities upstream to the source. I agree that this would be a significant step change when and if this is technically possible and domains/product owners have the skills to manage this.
- Data moves around, and we can't get to one source of truth. I agree that it has been an elusive goal and only partially achieved (it's more mature in the marketing domain). I would love to understand how vendors who are commenting on data pipelines are coming up with an architecture to make the internal implementation of domains and domain-oriented distribution a reality.
- We don't need the data catalog to have usable data. What are the alternatives?
- Too many misunderstood terminologies, such as metadata. In the data mesh, the metadata layer still exists. However, DM advocates suggest using simple English and less jargon to describe terms like metadata, master data, catalog, etc. Amen to this one; I agree, but you still need to meet the parameters of what metadata provides.
- The data engineering team still sets up the infrastructure. Yes, they will need to, but data mesh seems to accuse data engineers of holding the business back from using the data, and I disagree with this idea. This depends on the org and engagement models and governance.
- Domain teams in the business can put their data into the lake themselves. I look forward to this day.
- Decentralized storage with centralized infrastructure. How will data governance, policies, and controls work in this environment?
- From specialists to generalists. This will require a massive push in training and education, and it will work better in tech companies. I would love business and domain users to have the statistical and technical skills to create data products. This change will require new job families, education, and training, with significant investment. Also, academic institutions are not currently up to speed on these bleeding-edge ideas to provide a training source and talent pool, so vendors and firms will need to develop their own curriculum and training.
- Responsibility for quality and security shifts back to the business lines under the data mesh. It will be interesting to see how the data mesh assures standards and defines security and quality aspects going forward. I agree with this trend as an extension of the data steward concept already in progress under data governance.
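To make the "data as a product" and "self-describing data" ideas above concrete, here is a minimal sketch in Python. Everything in it (the `DataProduct` class, its field names, and the "payments" domain) is an illustrative assumption of mine, not part of any data mesh standard or vendor product; the point is simply that a domain team publishes its dataset bundled with the metadata that makes it findable and understandable without a ticket to the central data team.

```python
from dataclasses import dataclass, field

@dataclass
class DataProduct:
    """Illustrative sketch: a domain-owned dataset bundled with
    self-describing metadata (hypothetical, not a standard API)."""
    name: str                           # findable: a stable, discoverable identifier
    owner: str                          # accountable domain team, not a central COE
    schema: dict                        # self-describing: column name -> type/meaning
    records: list = field(default_factory=list)

    def describe(self) -> dict:
        """Expose metadata so consumers can discover and understand the data."""
        return {
            "name": self.name,
            "owner": self.owner,
            "schema": self.schema,
            "row_count": len(self.records),
        }

    def serve(self) -> list:
        """Provisioned for access: return a copy so consumers can read
        the data without mutating the domain team's source of truth."""
        return list(self.records)

# Hypothetical usage: a 'payments' domain publishing its own product.
payments = DataProduct(
    name="payments.transactions.v1",
    owner="payments-domain-team",
    schema={
        "txn_id": "string, unique transaction key",
        "amount": "decimal, settled amount in USD",
    },
    records=[{"txn_id": "t-001", "amount": 42.50}],
)

print(payments.describe()["row_count"])  # 1
```

The design choice worth noting is that the metadata travels with the data: a consumer calls `describe()` before `serve()`, which is the self-serve discovery step the mesh literature promises, and exactly the step that presumes end users can interpret a schema on their own.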
In summary, if we are serious about the data mesh, we need to build an entirely new business case and rationalize all of the global concerns that the data mesh presents. For me, data mesh is currently a theory that could turn into an official architecture or a guiding principle. As of now, it has raised more questions than answers. The data mesh does not clearly spell out the differences and use cases between operational and analytical data, which, in my mind, still have different fit-for-purpose use cases. Changing everyone's mind will take more than one vendor coining a term to flip the current paradigms on their head; it will take significantly more research and testing. Said differently, we need data about the data mesh (case studies, success stories, and more).
I look forward to your thoughts and comments. What has your experience with the data mesh been to date, and how far away do you think you are from adopting this concept?
When companies go through hard times, they must do whatever it takes to stay afloat. This looks different in every industry, but the solution is always to cut costs. Executives and other business leaders within a company need to decide what expenses they should target.
We recently asked our leading executives what cost-cutting measures their industry takes when hard times hit.
Here are their responses...
Andrea Markowski, Marketing Executive
Marketing budgets are usually cut first, no matter the industry. There are a few reasons for this, but one of the most probable is that marketing can be difficult to track and justify.
This is the nature of marketing—it can be intangible and thus hard to show that the work being done is benefiting the bottom line. For example, how much value does a new branding campaign bring to a company? There are ways to approximate this, but showing concrete results in dollars and cents isn’t easy.
The best way for marketing departments to hold on to their budgets and prove their effectiveness is through detailed (and time-consuming) tracking in a CRM system. If it’s possible to draw a clear line from marketing efforts to new customer acquisition, then it’s worth it.
However, many beneficial marketing activities simply cannot be tracked or measured. For example, determining the effectiveness of a billboard in a highly trafficked area is a challenge. No one can deny, though, that it is likely seen by the hundreds of thousands of people who pass by. Tracking results is ideal, but it's nothing to obsess over.
Andrea Markowski is a marketing director with specializations in strategy development, digital tactics, design thinking, and creative direction. She has superpowers in presentations and public speaking.
Lisa Perry, Global Marketing Executive
Cost-cutting has become synonymous with corporate survival, and the marketing budget is typically the first to go. Unfortunately, most leaders see marketing spending as an expense, not an investment. This is a shortsighted approach as a strategically developed and executed marketing strategy is a source of revenue.
That said, there comes a time when we all need to figure out how to creatively do more with less. Here are five tips to consider when looking to reduce your marketing budget.
1. Define Measurement Strategies: Identify KPIs (i.e. conversions, cost per acquisition, ROI) across key marketing strategies to ensure you are focused on driving the bottom line.
2. Templatize Your Content: Create design templates that can help reduce design costs, enable a more agile testing process, and improve efficiency.
3. Repurpose Existing Content: Take existing content, make it relevant, and reuse it across multiple channels.
4. Go Digital: Going online with your marketing collateral can save time and money.
5. Vendors That Focus on ROI: Make sure the vendors you are working with are focused on giving you the most value for your investment and delivering a positive ROI.
Making marketing cuts is never easy. Whatever cuts you decide to make, ensure that you are still investing strategically in your business.
Lisa Perry helps companies build leadership brands, driving loyal customers & delivering profitability. She does this through a process that builds brands consumers love. Her goal is to help companies develop, monetize, and grow their brands.
John Schembari, Senior Education Executive
In the wake of COVID-19, the federal government has provided ESSER (Elementary and Secondary School Emergency Relief) funding to schools, which will continue through 2022. Next year, however, all bets are off when it comes to how much money schools will have in their coffers.
When public schools face a financial downturn, non-discretionary costs are covered first. The biggest non-discretionary cost is usually salaries, often followed by the special education services that districts are mandated to provide to students with individualized education plans. If schools cannot provide these services in-house, they must pay to send students to outside programs that can support their needs.
Transportation is also usually a large expenditure for some school districts, as are electricity, power, and heating (in colder climates; this is why some districts, like NYC, have traditionally had winter recess to cut down on expenses). Far from fixed, many of these inescapable costs have risen sharply during our current times of inflation.
Some private schools have endowments to weather financial storms. However, for public district schools, financial insecurity usually means that extracurricular programming and extra academic supports are on the chopping block.
This sometimes even includes professional development services for teachers like the services that I provide as a learning coach and consultant. While many school systems have used ESSER funding to provide post-COVID-19 catch-up tutoring that is still gravely needed, tutoring supports may also fall by the wayside in turbulent financial times.
In addition to a reduction in extracurricular programming, we also usually see a reduction in course offerings in non-“academic core” subject areas—like arts and music education—and field trips/excursions. Although controversial and perhaps not in line with healthy life choices, under duress, we may see some schools allowing certain companies (such as food/beverage companies) to brand/sell their merchandise in school, in exchange for financial compensation, so that extracurricular programming can continue.
John Schembari is a current K-12 teacher/school leader academic improvement coach and former school building and district administrator. He loves to draw, travel, swing dance, and read nonfiction.
Sarita Kincaid, Tech Media Executive
During economically uncertain times, many tech companies enact deep cost-cutting measures as an alternative to a reduction in force (RIF). Budget “downsizing” is a frequent topic of conversation these days among corporate communications leaders—what to cut and for how long are common discussions.
Aside from obvious OPEX cuts like freezing incremental headcount requisitions, reducing business travel, and canceling sponsorships, the following reductions should be considered with care:
AR and PR Agencies: While agencies consume a big part of most marketing budgets, reducing hours/scope of work should be well thought out. The impact that effective AR and PR have on a business is significant to supporting business and revenue goals. And, in smaller companies, an agency often serves as an entire department.
Influencer Programs: Co-marketing activities with influencers may seem like an easy place to take budget cuts, but these programs are high profile, touch a lot of buyers, and should be generating positive PR. It's difficult to establish and grow brand awareness and preference, and taking your "foot off the gas" after establishing that momentum with influencers isn't a great strategy. And remember, influencers need to make a living; if they aren't working with your company, they may start working with a competitor.
Sarita Kincaid is a tech media executive with a demonstrated ability to build and grow award-winning programs. She brings a data-driven approach to influencer relations with a focus on developing strong brand advocates and aligning them with sales programs.
What cost-cutting measures are usually taken in your industry? Join the conversation inside Work It Daily's Executive Program.