The Workplace
Women in Japan were told not to wear glasses to work. Their response has been fiery.
Glasses, say some companies in Japan, are just not right for women to wear to work.
In recent reports by Japan’s Nippon TV and Business Insider Japan, women from a range of industries described being instructed by their employers not to wear glasses.
One receptionist recalled being told she was not allowed to wear glasses, while a male receptionist was permitted to don corrective eyewear, Business Insider reported. A nurse at a beauty clinic developed dry eye from long hours in contact lenses but still was not allowed to wear glasses. Her employer imposed other requirements: makeup was a must, as was making sure she didn't gain too much weight. A domestic airline reportedly imposes its no-glasses rule for safety reasons. Some restaurants said glasses on female employees didn't go well with their traditional attire.
More...
https://www.msn.com/en-ca/news/world/wo ... ailsignout
How Performance Reviews Can Kill Your Culture
Performance reviews are designed to motivate and bring the best out of our teams, but they often do the opposite. Here’s how to bring out the best in your people.
If you ask people what's wrong with corporate workplaces, it won't take long before you hear someone mention being put into a performance bucket. The A bucket is for the best, and the C bucket is for the underperformers. The middle and most common bucket is B, as it spares the supervisor from having to justify why an individual is exceptional or on the verge of getting fired. The problem is that ranking someone against their peers is not the comparison that matters, and it is counterproductive to building an exceptional corporate culture.
People hate performance reviews. And why wouldn't they? You either come up short against the superstars, walk away being told to keep doing what you're doing, or leave feeling like your days are numbered. In this common construct, no one gets the information they need to properly grow, and a toxic competitive dynamic is created within the organization. Forced comparisons against others don't accomplish what we want from them. We think they inspire people; they often make people dislike each other.
The problem is the system.
The goal of performance reviews is ostensibly to help people become better, but forced ranking has two serious flaws. First, it doesn't account for individual rates of improvement. We're all starting from different places, and we're all improving at different rates. If you always come up short, no matter how hard you try, eventually you can't be bothered putting in the effort to get better.
The second, more important, flaw is that forced rankings create a toxic environment that rewards poor behavior. When you're pitted against your coworkers, you start to game the system. You don't need to improve at all to get into the A bucket; you just need to make the others look bad. The success of one person means the failure of another. How likeable are you? How good are you at whispering and gossip? How big is your Christmas present to your boss? You can end up cutting others down to stand out as a star performer. But undermining the success of your coworkers ultimately means undermining the success of the entire organization.
Margaret Heffernan, author and former CEO, explained on The Knowledge Project how the relationship between coworkers is fundamental to the function of an organization:
“…the whole premise of organizational life is that together you can do more than you can do in isolation, but that only works if people are connected to each other. It only really works if they trust each other and help each other. That isn’t automatic. … You’re only really going to get the value out of organizational life to the degree that people begin to feel safe with each other, to trust each other, to want to help each other…What impedes the flow is distrust, rivalry, or not knowing what other people need.”
Most of us inevitably compare ourselves to others at some point. Chronic comparing, though, leads to misery. What matters is not what we do compared to what someone else does; it's what we do compared to what we're capable of doing. Both as individuals and in organizations, we need to pay attention to this gap—the gap between where we are right now and what we're capable of.
Internal motivation is easier to sustain. We produce and push ourselves because we get this immense satisfaction from what we are doing, which motivates us to keep doing it. It doesn’t work the same way when your motivation comes in the form of external comparisons.
So what do we do instead?
If you must grade performances, do it against the past. Is she learning? Is he improving? How can we increase the rate of progress and development? Empower people to help and learn from each other. The range of skills in an organization is often an untapped resource.
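As a rough, hypothetical sketch of what grading against the past can look like, the snippet below tracks each person's score history and reports their own trajectory instead of a rank against peers (all names and numbers are invented):

```python
# Minimal sketch: review performance against a person's own history rather
# than against peers. All names and scores below are invented.
from statistics import mean

history = {
    "Asha":  [62, 68, 74, 79],   # quarterly scores, oldest to newest
    "Bilal": [88, 87, 89, 90],
    "Chen":  [45, 52, 60, 71],
}

def trajectory(scores):
    """Return the latest score, the average of earlier scores, and the change."""
    baseline = mean(scores[:-1])
    latest = scores[-1]
    return latest, baseline, latest - baseline

for name, scores in history.items():
    latest, baseline, delta = trajectory(scores)
    print(f"{name}: latest {latest}, past average {baseline:.1f}, change {delta:+.1f}")

# Chen ranks last against peers in every quarter but shows the fastest
# improvement -- exactly the signal a forced ranking hides.
```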
Organizations today are often grappling with significant culture issues, yet culture can be the one thing that differentiates you from your competitors. Comparing people against their past selves instead of each other is one of the most effective ways to build a culture in which everyone wants to give their best.
https://fs.blog/2019/11/performance-rev ... l-culture/
Diversity and Leadership in the Workplace
“Women-owned companies are responsible for creating 2.2 million jobs across the nation. They are more and more present in leadership positions as founders, executives, and corporate board members," says Farzana Nayani. Despite these achievements, she believes more needs to be done through “workplace infrastructure and policies that engage and promote belonging for women, as opposed to a culture of traditionally excluding and holding back women.”
Everyday reality is bittersweet when it comes to gender equality in the workplace. In recent times, we have made great strides as more women than men graduate from college and graduate school, and nearly equal numbers of American women and men now go into medicine and law – fields traditionally considered a man’s domain.
While there have been improvements, it is still a man's world when it comes to leadership positions and high-paying jobs in cutting-edge companies. According to a 2015 Dow Jones study, only 4 percent of Fortune 500 chief executive officers (CEOs) are women. The US Census Bureau reports that women earn 80 percent of what men are paid on average across all jobs, though other studies indicate the gap is far smaller when comparing the same jobs, a marked improvement over years past.
Why Diversity and Inclusion?
Diversity and inclusion have a direct effect on an organization's bottom line. A more diverse workforce fosters greater creativity and is critical for innovation. Companies with higher levels of gender diversity, and with policies and practices that focus on gender diversity, are linked to lower levels of employee turnover. Organizations ranked highly on Fortune's "World's Most Admired Companies" list have twice as many women in senior management as companies with lower rankings. Mixed-gender boards also have fewer instances of fraud.
“The idea of ‘diversity for the sake of it,’ is something that is not motivating - or we would have had more progress by now,” says Farzana. “There are clear impacts of inequitable practices and systemic oppression that each of us needs to actively work towards overcoming, and the benefit is success by companies and institutions that can recognize the power, talent, and expertise that women can bring. This is the value proposition,” she adds.
Farzana has also designed and delivered Unconscious Bias workshop sessions for institutional leaders on behalf of the Ismaili Council for USA, in partnership with ITREB. These workshops addressed how we can create more inclusive environments for our children and fellow teachers.
Coming from an interfaith family, Farzana credits guidance from the Ismaili faith as the foundation for how she approaches diversity today, saying, "It is the teachings about pluralism and the ethical values that I uphold and that drives my perspectives on diversity and inclusion.” She adds, “Also, my early experiences teaching women sports at the Aga Khan University Sport and Rehabilitation Centre when it first opened in Karachi, reinforced my commitment to global understanding and the empowerment of women, in all aspects of life."
Farzana was one of the speakers at the Diamond Jubilee Alliances Conference in 2018, joining others in a TED Talk-style session on "The Power of Possibility: Journeys of Women in Leadership." She shared her origin story, described how she left the workforce to found her own consulting company, and spoke about how she has balanced life as an entrepreneur.
What Can Companies Do?
Photos and more...
https://the.ismaili/usa/diversity-and-l ... -workplace
How modern workers are at the mercy of ratings
I am Number 0.6
A close friend of Bartleby’s just got the news that their department was shedding 2.6 workers. At first sight, the concept of 0.6 of a worker sounds pretty odd. But workers who are freelance, on temporary contracts, or in part-time employment register in the headcount as less than a whole number.
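For readers unfamiliar with the accounting, here is a small, hypothetical sketch of how full-time-equivalent (FTE) headcount produces fractions like 0.6 of a worker (the roles and hours below are illustrative, not from the column):

```python
# Minimal sketch of full-time-equivalent (FTE) headcount, the accounting that
# turns a part-timer or contractor into "0.6 of a worker". Hours are invented.
FULL_TIME_HOURS = 37.5  # assumed standard working week

staff_hours = {
    "full-time analyst": 37.5,
    "three-day-a-week designer": 22.5,   # 22.5 / 37.5 = 0.6 FTE
    "freelance editor": 15.0,
}

fte = {role: hours / FULL_TIME_HOURS for role, hours in staff_hours.items()}

for role, value in fte.items():
    print(f"{role}: {value:.1f} FTE")
print(f"department headcount: {sum(fte.values()):.1f} FTE")
# A department "shedding 2.6 workers" is shedding this kind of fractional sum.
```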
Being classed as 0.6 of a worker seems dehumanising. Few people want to be thought of as just a number, let alone a fraction. In “The Prisoner”, a cult British television series from the 1960s, the hero, played by Patrick McGoohan, resigns from his job as a secret agent only to be abducted and taken to a village. He is only referred to as “Number 6” and his frequent escape attempts are frustrated.
Although he insists that “I am not a number, I am a free man”, the audience never learns his name. The programme has a very 1960s vibe—it focuses on the individual’s efforts to assert himself in the face of a repressive, conformist society. At one point, the title character declares: “I will not be pushed, filed, stamped, indexed, briefed, debriefed or numbered. My life is my own.”
These days many workers would sympathise. They feel pushed, filed, indexed and numbered. When they apply for a job, they may be assessed by artificial intelligence, which parses résumés for key words without which an applicant’s odds of an interview lengthen. Based on works like “Evidence-Based Recruiting” by Atta Tarki, who claims that scores in general-mental-ability tests have a strong 65% correlation with job performance, firms may ask candidates to take an intelligence test.
When they get a job, employees find the indexing and numbering continues. Workers at warehouses have to pick a certain number of items per hour; those at call-centres are assessed by software that monitors their hourly number of calls, and the amount of time spent on each one. Fall behind the target and you may feel unable to take a break. When their task is completed, employees are often rated again, this time by the customers.
Manufacturing workers have long faced these kinds of numerical targets, as well as the need to clock in and out of work. The big change is that similar metrics and rating systems are spreading to more and more parts of the economy. Academics get rated by students; nurses may be judged on a "behaviourally anchored rating scale" which assesses how much empathy they showed to patients.
Ratings are at the heart of the gig economy, where workers are connected with employers and customers via the internet. Just as TripAdvisor ratings allow holidaymakers to assess hotels, Uber drivers get a score out of five. The same goes for ratings on services like TaskRabbit (for odd jobs) and Etsy (for arts-and-crafts sellers).
Such systems are understandable in parts of the economy where output is difficult to measure precisely. But they can be arbitrary. People might give an Uber driver a poor rating because they are in a bad mood or because they encountered unexpected traffic disruption (the drivers themselves also rate customers, which is meant to discourage abuse).
The result can be increased insecurity for gig-economy workers. Their income is uncertain when they are at the mercy of the assessment system. Even a tiny fall in their rating—of, say, 0.6—can harm their job prospects. A detailed study* of 65 gig-economy workers found that they relished their independence but it came with a host of personal, social and economic anxieties.
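To make that sensitivity concrete, here is a small, hypothetical calculation (the review counts and scores are invented) showing how much a single one-star review moves an average out of five:

```python
# Hypothetical illustration of how one arbitrary bad review swings a rating,
# and why workers with short review histories are most exposed.
def rating_after_bad_review(current_average, review_count, bad_review=1.0):
    """Average out of five after one additional one-star review."""
    return (current_average * review_count + bad_review) / (review_count + 1)

for count in (10, 50, 500):
    new_avg = rating_after_bad_review(4.9, count)
    print(f"{count} prior reviews: 4.90 -> {new_avg:.2f} ({new_avg - 4.9:+.2f})")

# With 10 prior reviews, one bad rating knocks roughly 0.35 off the score;
# with 500 it barely registers. A worker near a platform's quality threshold
# has little room for someone else's bad mood.
```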
Even full-time workers may find themselves dependent on their score in one category or another. Businesses want to avoid accusations of hiring biases on grounds of gender or ethnicity; using “objective” rating systems can protect them from discrimination lawsuits. And employees need to be concerned about how they are rated.
https://www.economist.com/business/2020 ... of-ratings
Skills for the future: Part one
Data and analytics will only become more prevalent in the future, as more of our actions are tracked. Learning how to use data to power your decisions will be essential for most roles in the future.
The World Economic Forum predicts that millions of jobs will be lost in the coming years as artificial intelligence, robotics, nanotechnology, and other socio-economic factors replace the need for human workers. How can we begin to prepare for a future that will no doubt be more mobile, autonomous, and machine-driven than today?
At the Peterson Lecture to the International Baccalaureate in Atlanta in 2008, Mawlana Hazar Imam outlined a number of key attributes required to be adaptable, saying, “In a world of rapid change, an agile and adaptable mind, a pragmatic and cooperative temperament, a strong ethical orientation - these are increasingly the keys to effective leadership. And I would add to this list a capacity for intellectual humility which keeps one’s mind constantly open to a variety of viewpoints and which welcomes pluralistic exchange.”
That question is prodding workers to think about lifetime commitments to retraining and upgrading their skills, and it is seeping into society's consciousness about where these constantly evolving skills should be learned. The following are a few skills and attitudes identified as critical for success in the future:
Grit
"Grit" has become one of the biggest buzzwords in talent-acquisition, education-policy, and young-professional circles. Largely credited to scholar and psychologist Angela Duckworth of the University of Pennsylvania, grit comprises characteristics like guts, resilience, initiative, and tenacity, and it centers on the idea that grit is a better indicator of success than talent or IQ. It can also be taught and found in anyone regardless of age, race, or gender.
Radhika Aggarwal, who is CBO and co-founder of ShopClues, can attest to that. When leading a team as a strategy manager at Nordstrom, she said she rejected individual smartness and intelligence as the hallmarks of success. Instead, she began valuing grit and mindset as "fundamentally more critical to individual and organizational success."
“Running a start-up or doing any other job for that matter, requires extended periods of commitment and hard work,” she said. “The fact that you are there every day, especially on those days when every part of your body and mind is screaming, ‘I can't do this anymore.’ The one who picks herself up and is determined to grow is the one who wins in the long run.”
Social intelligence and intellectual humility
According to the World Economic Forum, empathy, creativity, leadership, intuition, and social intelligence make up the perfect mélange of skills to prepare for the future. Additionally, it advises paying attention to how machines function and think. The Pew Research Center confirms that the general public also sees a mix of technical skills, soft skills, and attitudes as fundamental to success in today's economy.
Social intelligence includes empathy, but goes beyond to consider more of a global orientation. As Mawlana Hazar Imam has cautioned in multiple speeches: “Knowledge gaps so often run the risk of becoming empathy gaps." In a time when one’s various roles and projects will be changing frequently, transferring relevant information and displaying empathy in the transition will only make for smoother cross-disciplinary, even cross-cultural collaboration.
Intellectual humility is about recognizing and accepting the limits of one’s knowledge. It is especially important considering we subconsciously run the risk of filtering and consuming information that results in confirmation bias and perpetuates even more stubbornness in our beliefs. To this, Hazar Imam has urged listeners to exhibit intellectual humility and combat the custom by “launching an ardent, lifelong search for the knowledge they will need.”
Oftentimes, that knowledge comes in unexpected ways. In fact, research has shown that intellectually humble adults are more likely to learn from people with whom they disagree.
“When we’re more engaged and listening to the other side, the disagreements tend to be more constructive,” said Tenelle Porter, a postdoctoral researcher in psychology at the University of California, Davis. But to get there, she added, “We have to be willing to expose ourselves to opposing perspectives in the first place.” Once we do, the results can lead to civil discourse and a cosmopolitan outlook.
Finally, this matters in the job market, too. Laszlo Bock, Google's senior vice president of people operations, said he searches for candidates who exhibit intellectual humility because without it, he contends, one is unable to learn.
Zahir Ladhani, marketing professional and life coach, echoes this sentiment: "In order to empower high performing teams, leaders must ask more questions than they provide answers. Asking questions displays the self-awareness that one person, even if he or she has authority, cannot know everything. Humility manifests in listening and responding, so as to inspire everyone to contribute. Consultation and consensus generally provide the best outcome."
Sensemaking or analytics
Consider sensemaking (the process by which people give meaning to their collective experiences) to be the new wave of critical thinking. The most common demonstration of this is data analysis. A vast amount of information about us is being collected, but data can be overwhelming; it is only useful when you can extract insights and then act on them. At that point, it becomes powerful. Data and analytics will only become more prevalent in the future, as more of our actions are tracked. Learning how to use data to power your decisions will be essential for most roles in the future.
This could be one reason Bill Gates, a self-described futurist, has recently determined that workers savvy in science, engineering, and economics will be “the agents of change for all institutions.”
Coding is another means by which sensemaking can occur. While we're not prompting everyone to become the next Y Combinator hackathon winner, learning how to code can change your mindset, shaping how you view the world, see what is possible, and solve problems.
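As a small, hypothetical illustration of both points, the few lines below turn raw records into an insight that can drive a decision (the data and the 25 percent priority threshold are invented):

```python
# Minimal "sensemaking" sketch: summarise raw records into an actionable
# insight. The ticket data and the 25% priority threshold are invented.
from collections import Counter

support_tickets = [
    {"product": "app", "issue": "login"},
    {"product": "app", "issue": "login"},
    {"product": "app", "issue": "crash"},
    {"product": "web", "issue": "billing"},
    {"product": "web", "issue": "login"},
    {"product": "app", "issue": "login"},
]

counts = Counter(ticket["issue"] for ticket in support_tickets)
total = len(support_tickets)

for issue, count in counts.most_common():
    share = count / total
    flag = "  <- prioritise" if share > 0.25 else ""
    print(f"{issue}: {count} tickets ({share:.0%}){flag}")
# Raw rows become a decision: login problems dominate, so fix those first.
```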
https://the.ismaili/our-stories/skills-future-part-one
Skills for the future: Part two
Even in the age of automation, it's important to remember that machines can't do everything. Technical capability must be balanced by human understanding and by social and emotional skills.
Although we now live in an age of automation, it's important to remember that machines can't do everything. Technical efforts must be balanced with social and emotional skills. Part two of our Future Skills article highlights the importance of technical, cognitive, and soft skills in preparing for the future.
In a speech at the Opening Ceremony of the Aga Khan School in Osh, in the Kyrgyz Republic, in 2002, Mawlana Hazar Imam said, "The ability to make judgments that are grounded in solid information, and employ careful analysis should be one of the most important goals for any educational endeavor. As students develop this capacity, they can begin to grapple with the most important and difficult step: to learn to place such judgements in an ethical framework. Therein lies the formation of the kind of social consciousness that our world so desperately needs."
Growth mindset, adaptive thinking, and agile methodology
Twenty years ago, if someone was said to be pursuing a career in social media management or “user experience,” it would result in confused staring. Today, it is not certain what the latest job trends will be even a decade from now.
Yet there is a way to train to become ready to see – and seize – the opportunity when it arises. It starts with having a growth mindset. Popularized by Carol Dweck, the Stanford University psychologist and author of Mindset: The New Psychology of Success, this is the notion that intelligence is as malleable as a muscle, as opposed to fixed and limited from birth. Whether the "muscle" is being strengthened or atrophied is up to each individual, but those who recognize that metaphoric "strength training" is possible will ultimately be more inclined to "train," leading to greater learning and tenacity.
Being mentally nimble matters to employers.
"For Google, the number one thing we look for is general cognitive ability, and it's not IQ, it's learning ability. It's the ability to process on the fly. It's the ability to pull together disparate bits of information," said Bock.
These values are being hailed by other elite companies as well. IBM urges its workforce to "restlessly reinvent." According to its CEO, Ginni Rometty, that's the only way a company so old can stay relevant today, and it requires a paradigm shift: to be known for the inquisitive people who drive world-changing innovation, rather than for products that may come and go.
Agile thinking takes these first two components one step further. Typically used in the context of the design-based thinking sweeping the globe, mental agility is best understood as a methodology for practical and creative problem solving by adjusting as you go. No more putting all your eggs into one basket, or into one attempt, and hoping it doesn't fail. The goal is to explore, prototype, evaluate, and modify along the way.
Considering we ourselves have become products and brands that are constantly valued and bought, this iterative method makes sense to harness, as we too must continuously develop new skills and competencies. In doing so, Hazar Imam says, we "nurture the spirit of anticipation and agility, adaptability and adventure."
Ethical literacy, moral reasoning, and decision making
The Pew Research Center analysis of government jobs data finds that for the past several decades, employment has been rising faster in jobs requiring higher levels of preparation – that is, more education, training, and experience. The number of workers in occupations requiring average to above-average education, training, and experience increased by 67% between 1980 and 2015.
What good is additional training without the right purpose? Whether it's for personal needs or business strategy, decision-making skills are in high demand.
"It's feeling the sense of responsibility, the sense of ownership, to step in, to try to solve any problem, and the humility to step back and embrace the better ideas of others," Bock said. "Your end goal is what can we do together to problem-solve."
This extends beyond teamwork and "playing nice with others." Hazar Imam told the University of Alberta in 2009: "It seems to me to be the responsibility of educators everywhere to help develop 'ethically literate' people who can reason morally whenever they analyze and resolve problems, who see the world through the lens of ethics, who can articulate their moral reasoning clearly — even in a world of cultural and religious diversity — and have the courage to make tough choices."
Selling, communication skills and people skills
What good is a vision if one cannot articulate it, or an idea if one cannot sell it? Both are needed to convince people of the value you, your idea, or both would bring. Despite the machine takeover, the vast majority of deals still require face-to-face selling and negotiation. This will only prove more challenging as the volume of ideas being sold increases. Those who can sell ideas, products, and themselves will therefore have a strategic advantage and greater potential financial gains.
This is particularly true because, even in the age of automation, machines can't do everything; technical work must be balanced by human understanding and by social and emotional skills. For example, breaking away from the automation craze, Toyota replaced some of the robots in its factories with people "because human workers can, unlike their machine counterparts, propose ideas for improvement," according to Bloomberg.
The accelerating pace of change in all fields of endeavor is evident; the successful will be those who can keep up with it and use their technical, collaborative, and people skills to their advantage. As Prince Rahim said in a speech: "Like the great Muslim artists, philosophers, and scientists of centuries past, we must enthusiastically pursue knowledge on every hand, always ready to embrace a better understanding of Allah's creation, and always ready to harness this knowledge in improving the quality of life of all peoples."
https://the.ismaili/our-stories/skills-future-part-two
The Pandemic May Mean the End of the Open-Floor Office
As businesses contemplate the return of workers to their desks, many are considering large and small changes to the modern workplace culture and trappings.
SAN FRANCISCO — The modern corporate office is renowned for open, collaborative work spaces, in-house coffee bars and standing desks with room for two giant computer monitors.
Soon, there may be a new must-have perk: the sneeze guard.
This plexiglass barrier that can be mounted on a desk is one of many ideas being mulled by employers as they contemplate a return to the workplace after coronavirus lockdowns. Their post-pandemic makeovers may include hand sanitizers built into desks that are positioned at 90-degree angles or that are enclosed by translucent plastic partitions; air filters that push air down and not up; outdoor gathering space to allow collaboration without viral transmission; and windows that actually open, for freer air flow.
The conversation about how to reconfigure the American workplace is taking place throughout the business world, from small start-ups to giant Wall Street firms. The design and furniture companies that have been hired for the makeovers say the virus may even be tilting workplaces back toward a concept they had been moving away from since the Mad Men era: privacy.
The question is whether any of the changes being contemplated will actually result in safer workplaces.
More...
https://www.nytimes.com/2020/05/04/heal ... 778d3e6de3
The future of work: In the office, at home
The future is here. Once a remote possibility, working from home has become commonplace and more accepted, much sooner than anticipated. While most employers have in the past resisted the idea of their workforces performing their duties from home, necessity is the mother of invention, and the current coronavirus crisis has left many employers with no other option.
How does this sudden change affect organisations and employees? Remote working was increasing gradually, but the workplace has now transformed about a decade earlier than expected. Of course, this option is only available to those employees who are in professions that can be carried out in home environments, an option not available to those working in manufacturing, retail and many other sectors.
Workers who have often been invisible and unrecognised are now the ones classified as “essential,” such as delivery drivers and grocery store employees — the very ones who cannot work remotely. Many others, not deemed essential, will simply be unable to return to work as their businesses, and especially restaurants, may not re-open.
The US Bureau of Labor Statistics estimates that, of the top 25 percent of income-earners, 60 percent could work from home, while the figure for the lowest 25 percent of earners is 10 percent. Those able to work remotely would be far fewer in developing countries, where equipment and Internet access may be greater impediments.
More...
https://the.ismaili/global/news/feature ... ffice-home
Who Gets Left Behind in the Work-From-Home Revolution?
An increase in remote workers won’t automatically usher in a gender-equal utopia. If we want it, we have to make it so.
Ever since the coronavirus pandemic began keeping most of us sheltered at home, work has rapidly shifted from the cubicle to the kitchen table. A number of surveys indicate that about half of the American work force is now doing their work at home. Companies that may have once been resistant to letting employees off the in-person leash are finding that yes, work can still get done outside the confines of an office building.
That realization may last long after stay-at-home orders are lifted, leading to a permanent change in how we work. Silicon Valley is leading the way, with Twitter, Square and Facebook announcing that employees will be able to work remotely after the pandemic subsides. Companies in other white-collar industries are certain to follow. Nearly two-thirds of surveyed hiring managers say that their workforces will be more remote moving forward.
But offices are already starting to reopen, and it’s likely to be up to individual workers to decide whether to return. We may end up, then, in a world of haves and have-nots — those who have more ability to start commuting again and those who can’t, because they have increased health risks or they have children at home and no child-care options. And among heterosexual couples, it’s not hard to guess which parent will almost certainly be stuck at home longer until child-care options are open again. Will these employees be treated differently, even inadvertently?
It’s hard to predict just how these shifts will play out — but as things stand, women are in a poor position to benefit.
More...
https://www.nytimes.com/2020/06/25/opin ... 778d3e6de3
Is the Five-Day Office Week Over?
The pandemic has shown employees and employers alike that there’s value in working from home — at least, some of the time.
Most American office workers are in no hurry to return to the office full time, even after the coronavirus is under control. But that doesn’t mean they want to work from home forever. The future for them, a variety of new data shows, is likely to be workweeks split between office and home.
Recent surveys show that both employees and employers support this arrangement. And research suggests that a couple of days a week at each location is the magic number to cancel out the negatives of each arrangement while reaping the benefits of both.
“You should never be thinking about full time or zero time,” said Nicholas Bloom, an economics professor at Stanford whose research has identified causal links between remote work and employee performance. “I’m a firm believer in post-Covid half time in the office.”
According to a new survey by Morning Consult, 47 percent of those working remotely say that once it’s safe to return to work, their ideal arrangement would be to continue working from home one to four days a week. Forty percent would work from home every day, and just 14 percent would return to the office every day.
More...
https://www.nytimes.com/2020/07/02/upsh ... 778d3e6de3
God Is Dead. So Is the Office. These People Want to Save Both
Divinity consultants are designing sacred rituals for corporations and their spiritually depleted employees.
In the beginning there was Covid-19, and the tribe of the white collars rent their garments, for their workdays were a formless void, and all their rituals were gone. New routines came to replace the old, but the routines were scattered, and there was chaos around how best to exit a Zoom, onboard an intern, end a workweek.
The adrift may yet find purpose, for a new corporate clergy has arisen to formalize the remote work life. They go by different names: ritual consultants, sacred designers, soul-centered advertisers. They have degrees from divinity schools. Their business is borrowing from religious tradition to bring spiritual richness to corporate America.
In simpler times, divinity schools sent their graduates out to lead congregations or conduct academic research. Now there is a more office-bound calling: the spiritual consultant. Those who have chosen this path have founded agencies — some for-profit, some not — with similar-sounding names: Sacred Design Lab, Ritual Design Lab, Ritualist. They blend the obscure language of the sacred with the also obscure language of management consulting to provide clients with a range of spiritually inflected services, from architecture to employee training to ritual design.
Their larger goal is to soften cruel capitalism, making space for the soul, and to encourage employees to ask if what they are doing is good in a higher sense. Having watched social justice get readily absorbed into corporate culture, they want to see if more American businesses are ready for faith.
More...
https://www.nytimes.com/2020/08/28/busi ... 778d3e6de3
BOOK
Conscious Leadership: Elevating Humanity Through Business
From Whole Foods CEO John Mackey and his coauthors, a follow-up to groundbreaking bestseller Conscious Capitalism—revealing what it takes to lead a purpose-driven, sustainable business.
John Mackey started a movement when he founded Whole Foods, bringing natural, organic food to the masses and not only changing the market, but breaking the mold. Now, for the first time, Conscious Leadership closely explores the vision, virtues, and mindset that have informed Mackey’s own leadership journey, providing a roadmap for innovative, value-based leadership—in business and in society.
Conscious Leadership demystifies strategies that have helped Mackey shepherd Whole Foods through four decades of incredible growth and innovation, including its recent sale to Amazon. Each chapter will challenge you to rethink conventional business wisdom through anecdotes, case studies, profiles of conscious leaders, and innovative techniques for self-development, culminating in an empowering call to action for entrepreneurs and trailblazers—to step up as leaders who see beyond the bottom line.
https://www.amazon.com/Conscious-Leader ... 593083628/
After the Pandemic, a Revolution in Education and Work Awaits
Providing more Americans with portable health care, portable pensions and opportunities for lifelong learning is what politics needs to be about post-Nov. 3.
The good Lord works in mysterious ways. He (She?) threw a pandemic at us at the exact same time as a tectonic shift in the way we will learn, work and employ. Fasten your seatbelt. When we emerge from this corona crisis, we’re going to be greeted with one of the most profound eras of Schumpeterian creative destruction ever — which this pandemic is both accelerating and disguising.
No job, no K-12 school, no university, no factory, no office will be spared. And it will touch both white-collar and blue-collar workers, which is why this election matters so much. How we provide more Americans with portable health care, portable pensions and opportunities for lifelong learning to get the most out of this moment and cushion the worst is what politics needs to be about after Nov. 3 — or we’re really headed for instability.
The reason the post-pandemic era will be so destructive and creative is that never have more people had access to so many cheap tools of innovation, never have more people had access to high-powered, inexpensive computing, never have more people had access to such cheap credit — virtually free money — to invent new products and services, all as so many big health, social, environmental and economic problems need solving.
Put all of that together and KABOOM!
You’re going to see some amazing stuff emerge, some long-established institutions, like universities, disappear — and the nature of work, workplaces and the workforce be transformed.
I’ve been discussing this moment with Ravi Kumar, the president of the Indian tech services company Infosys, whose headquarters is in Bangalore. Because Infosys helps companies prepare for a digital world, I’ve always found it a source of great insight on global employment/education trends. I started my book “The World Is Flat” there in 2004. Back then, Infosys’ main business was doing work that American companies would outsource to India. Today, Kumar operates from New York City, where he’s creating thousands of jobs in America. How could that be?
It starts with the fact, explained Kumar, that the Industrial Revolution produced a world in which there were sharp distinctions between employers and employees, between educators and employers and between governments and employers and educators, “but now you’re going to see a blurring of all these lines.”
Because the pace of technological change, digitization and globalization just keeps accelerating, two things are happening at once: the world is being knit together more tightly than ever — sure, the globalization of goods and people has been slowed by the pandemic and politics, but the globalization of services has soared — and “the half-life of skills is steadily shrinking,” said Kumar, meaning that whatever skill you possess today is being made obsolete faster and faster.
Your children can expect to change jobs and professions multiple times in their lifetimes, which means their career path will no longer follow a simple “learn-to-work” trajectory, as Heather E. McGowan, co-author of “The Adaptation Advantage,” likes to say, but rather a path of “work-learn-work-learn-work-learn.”
More...
https://www.nytimes.com/2020/10/20/opin ... 778d3e6de3
What Makes a Good Entrepreneur?
Entrepreneurs have a unique opportunity to make a positive impact on the planet and its people.
Many stereotypes exist for the archetypal entrepreneur: that they are charismatic and talented risk-takers, focused only on increasing profits. Yet a new wave of business leaders is emerging, passionate about doing good and improving the world.
Careers of the Future is an original series airing exclusively on The Ismaili TV, where students and young professionals can hear directly from members of the Jamat at the leading edge of their fields about how to most effectively prepare for the future of work.
In the fourth and fifth episodes, we heard from entrepreneurs Aly Madhavji and Aiaze Mitha. Aly is the Managing Partner at Blockchain Founders Fund, a venture capital firm based in Singapore, which invests in startup companies; Aiaze is a Fintech entrepreneur and investor based in France. They shared some interesting insights on how to become a successful entrepreneur in today’s rapidly evolving world.
Entrepreneurship is the process of building a new business venture and seeking innovative, sustainable solutions to the problems that society faces. It is often associated with technological advancement, that is, making efficient use of technology to develop a firm. Fintech, for instance, combines finance and technology, applying technology across every area of finance, such as mobile and online banking. One of an entrepreneur’s many possible roles in Fintech is to ensure that technology is used appropriately in delivering financial services.
Fast-growing small enterprises also have plenty of potential to contribute to the development of society via global organisations such as the United Nations (UN), whose work both Aly and Aiaze have contributed to.
“We help to look at how emerging technologies such as BlockChain and Artificial Intelligence can actually help to achieve the UN’s sustainable development goals and help transform countries or regions by using these technologies to solve some of the biggest challenges that exist,” said Aly.
Similarly, financial technology is beginning to make a positive impact on the planet and its people, promoting climate responsibility and financial inclusion for all.
“In many ways Fintech is being more and more harnessed to actually create efficiencies, to reduce waste in some ways, and to help us really invest in a better future, in a better world,” said Aiaze. “This is the new direction that Fintech is taking, which is ‘Fintech for good.’”
Being an entrepreneur is also not age-dependent: one can come up with innovative ideas at any age. Such ideas are sometimes backed by venture capitalists. A venture capital firm invests in start-ups and other early-stage companies, looking for entrepreneurs with innovative ideas and analysing their potential, starting with the idea itself, understanding the problem, and evaluating how effective the solution is. It also looks for experience, resilience, and dedication to solving the problem.
Lastly, an entrepreneur must have the curiosity and drive to pursue their ultimate goal, as well as the ability to adapt, because this is a very dynamic field. Fintech, for instance, is very broad, spanning banking, financial markets, investments, and more, so there is a wide range of topics for an entrepreneur to consider. Knowing one’s skills, strengths, and weaknesses gives a clearer idea of which area to focus on.
Both episodes are now available to watch in full on The Ismaili TV On Demand https://tv.ismaili/browse/careers-of-the-future.
https://the.ismaili/global/news/communi ... trepreneur
Take control of your financial destiny
The skills needed for our future financial security are likely to be centred on what makes us human, and differentiates us from technology.
With the rapidly changing world we find ourselves in, individuals and families are having to face an ever-fluctuating economy and changing patterns of employment. Yasmin Jetha, a non-executive director of the NatWest Group Plc, the Nation Media Group in East Africa, and the Guardian Media Group in the UK, shares her thoughts on navigating these changes and working towards long-term economic empowerment.
The way I view the world, my values and beliefs, and hence the way I relate to people, have all been shaped by Mawlana Hazar Imam’s guidance. Take, for instance, our Imam’s emphasis on the importance of education and lifelong learning. Why this emphasis? I recall his explanation decades ago about the flexibility this gives to an individual, well ahead of the era of globalisation.
Obtaining a degree and then working for a lifetime in a single company is no longer the norm. This means that in addition to budgeting carefully, spending thoughtfully, and living within our means, constantly acquiring knowledge is essential in order to navigate the economic and social challenges of today and tomorrow.
Learning throughout our lives can help us to prepare for uncertainty, for societal changes, and for global shifts. The ones we are currently experiencing will not be the last.
Our Imam has often spoken about working together in order to build the knowledge societies of the future. In Cairo in 2006, he said that, “From the very beginnings of Islam, the search for knowledge has been central to our cultures... In his teachings, Hazrat Ali emphasized that ‘No honour is like knowledge.’ And then he added that ‘No belief is like modesty and patience, no attainment is like humility, no power is like forbearance, and no support is more reliable than consultation.’”
“What he thus is telling us is that we find knowledge best by admitting first what it is we do not know, and by opening our minds to what others can teach us.”
Learning from others
Mawlana Hazar Imam reminds us that nobody succeeds all on their own. After all, we cannot know everything. So while success rarely comes without real courage, sacrifice, and a lot of discipline, it also requires working with, and learning from, others. You need to possess not only self-belief but also belief in the power of building trust by working with others.
As technology plays an ever greater role in our lives and our work, the skills needed for the future are likely to be centred on what makes us human. In other words, we need to develop attributes that computers and robots do not have. Skills such as creativity, emotional intelligence, critical thinking, people management, leadership, and working with others are crucial in order to ‘future-proof’ ourselves and our financial security.
While collaboration takes effort and can sometimes be difficult since we are each wonderfully unique, the advantage is that you may draw on diverse opinions, knowledge, and experiences. Effective collaboration requires all parties to be inclusive and listen actively to differing views before making decisions. This has helped me in my career to develop strong teamwork skills, which in turn helped with more meticulous planning and sound execution.
Helping others
Colleagues are important — but so are your family, friends, and the community. Alongside your faith, your support network and the wider community can give you strength when you are tired, offer healing when you are hurt, and provide comfort when things go wrong.
And by the same token, there will be times when they may need you. Hazar Imam explained on 21 October 1986 in Nairobi, “We have all seen examples of God's most wonderful creature, the person - whether in a government bureau, a business, or a private development agency - who is inspired to give generously of himself, to go beyond the mechanical requirements of a task.”
“Such men and women, paid or unpaid, express the spirit of the volunteer — literally the will to make a product better, a school the very best, a clinic more compassionate and effective. Their spirit, generating new ideas, resisting discouragement, and demanding results, animates the heart of every effective society."
So when you're called on to give support, give it readily and without expectation of anything in return. Good work is never wasted, and offering service on a voluntary basis can help you to feel more content. “Enlightened self-fulfilment” is a phrase that Hazar Imam used in February 2014 in Ottawa.
During her career, Yasmin Jetha has held senior roles in healthcare, financial services, and media organisations.
Building your own capabilities
Finally, during more difficult times, I find that it can help to keep the same structure as on “normal” days, as this makes it easier to overcome setbacks. That includes maintaining a positive attitude as often as I can, remembering my faith, and allocating part of the day to further developing my knowledge and skills, as well as to doing some things I enjoy - what I like to call R&D, or Relaxation and Development.
Many organisations have been thinking about what the future of work will look like. The World Economic Forum's recent Future of Jobs report states that by 2025, 50% of all today’s employees will need reskilling as the adoption of technology increases. Interestingly, a significant proportion of these skills can be learned in between two and six months – meaning they can be acquired reasonably quickly by those who are committed.
The more you develop your skills, I find, the more control you take over your destiny. Skills empower you. They give you more freedom, and they give you somewhere to stand your ground and have your say.
Make sure you know what you want; make sure you have what it takes to get what you want; then go for it. And keep going; and going; and for good measure keep going again.
https://the.ismaili/global/news/feature ... al-destiny
Solving problems and adding value
Organisations hire management consultants to aid in solving complex issues linked with marketing, sales, logistics, administration, and more.
In order to improve their performance, organisations often hire the services of management consultants. But what exactly is a management consultant, and what do they do?
Careers of the Future is an original series airing exclusively on The Ismaili TV, where students and young professionals can hear directly from Ismailis at the leading edge of their fields about how to most effectively prepare for the future of work.
In Episode 11, we heard from Amyn Merchant, a managing director and a senior partner at Boston Consulting Group, based in New York City. His work focuses on global plant networks, the launching of new products, and ways to get products to the market, both on the supply chain and production sides.
Perhaps one of the most popular fields in the business world at present is management consulting. Many major companies and institutions hire consultants to aid in solving complex issues linked with marketing, sales, logistics, administration, and more. These consultants are professional experts who aim to enhance an organisation’s financial and organisational performance by providing ideas and strategies backed by large amounts of research, data, and analysis.
Amyn shared his experience of working at a consulting firm and provided insights on the future of management consulting. He also shared the essential skill sets required to succeed in the field. On a daily basis, the job entails analysing company data, interviewing client employees, preparing presentations and business proposals, and managing the team in charge of putting these ideas into action.
Amyn began by describing the role of a management consultant, emphasising how it acts as a “doctor” for firms by providing solutions. He also discussed the impact of pandemic-like situations on this field and suggested that students be dynamic in their approach to their careers. He also advised youth to work on their communication and creativity skills to provide a better client experience.
The constant travel, exposure to senior company leaders, and opportunity to work on complex business issues have contributed to the glamour commonly associated with the field. While these are all attractive aspects of the profession, a career as a management consultant does require one to be a critical thinker, hardworking, effective, and passionate about achieving the goal.
Towards the end of the episode, Amyn shared some ways for students to prepare for and explore a career in this field. He suggested that they gain as much experience as possible through projects and volunteer work in order to explore and develop the essential skills for management consulting.
Amyn also emphasised that networking plays a pivotal role in preparing for a career in this field. Making connections can allow an individual to find a mentor who will help them learn about the different options available to them, and make the right choices in order to achieve their career goals.
The episode is available to watch now on The Ismaili TV On Demand https://tv.ismaili/watch/careers-of-the ... ting-s2-e5.
https://the.ismaili/global/news/communi ... ding-value
How the World Ran Out of Everything
Global shortages of many goods reflect the disruption of the pandemic combined with decades of companies limiting their inventories.
In the story of how the modern world was constructed, Toyota stands out as the mastermind of a monumental advance in industrial efficiency. The Japanese automaker pioneered so-called Just In Time manufacturing, in which parts are delivered to factories right as they are required, minimizing the need to stockpile them.
Over the last half-century, this approach has captivated global business in industries far beyond autos. From fashion to food processing to pharmaceuticals, companies have embraced Just In Time to stay nimble, allowing them to adapt to changing market demands, while cutting costs.
But the tumultuous events of the past year have challenged the merits of paring inventories, while reinvigorating concerns that some industries have gone too far, leaving them vulnerable to disruption. As the pandemic has hampered factory operations and sown chaos in global shipping, many economies around the world have been bedeviled by shortages of a vast range of goods — from electronics to lumber to clothing.
In a time of extraordinary upheaval in the global economy, Just In Time is running late.
“It’s sort of like supply chain run amok,” said Willy C. Shih, an international trade expert at Harvard Business School. “In a race to get to the lowest cost, I have concentrated my risk. We are at the logical conclusion of all that.”
More...
https://www.nytimes.com/2021/06/01/busi ... 778d3e6de3
8 Hours a Day, 5 Days a Week Is Not Working for Us
With more than half of American adults fully vaccinated against Covid, employers and employees alike have turned their eyes back to the office. They’re locked in a conflict over when they’ll return and, when they do, what the return will look like. But we shouldn’t just be talking about the parameters of how we get work done in a postpandemic world. We should be pushing to do less of it.
In truth, the debate over the return to the office is fraught. Employers are used to being able to dictate when and where employees work, but we have now discovered that a lot of work can be done at odd hours between remote school lessons and from home offices or even the comfort of one’s bed.
So now there’s a tense push and pull over when and how much people should start commuting and how much power over the question employees can exert. Everyone is focused on how we will make work work after such a severe shock to the system for how things used to get done. But the ultimate answer won’t be found in hybrid remote and in-person offices or even in letting employees shift their hours around. The way to make work work is to cut it back.
More...
https://www.nytimes.com/2021/07/20/opin ... 778d3e6de3
Are Workplace Diversity Programs Doing More Harm Than Good?
Employers everywhere are deploying D.E.I. programs to be less racist. But do they even work?
It’s time to rethink what’s working in the modern workplace and what’s failing. Amid a pandemic that overturned how so many work, increased calls for racial and social justice put a new pressure on companies to ensure — or at least to seem as if they ensure — equality among their employees. Diversity, equity and inclusion (D.E.I.) programs are an increasingly popular solution deployed by management. But do these initiatives do marginalized employees any good? And who are the true beneficiaries of diversity programs, anyway?
Jane Coaston has spent years on the receiving end of diversity initiatives, and for that reason, she’s skeptical. To debate D.E.I. programs’ efficacy, she brought together Dr. Sonia Kang, an associate professor of organizational behavior and human resource management who studies identity, diversity and inclusion at the University of Toronto, and Lily Zheng, a D.E.I. strategy consultant and public speaker, to argue what works and doesn’t when it comes to making workplaces fair for all.
Listen to the podcast at:
https://www.nytimes.com/2021/08/11/opin ... 778d3e6de3
Few Women Ascend Japan’s Corporate Ladder. Is Change Finally Coming?
Only 6 percent of board seats at Japanese companies are held by women. After years of unkept promises, these businesses are now facing pressure both at home and abroad to diversify.
TOKYO — When Naomi Koshi was elected in June to the board of one of Japan’s largest telecommunications companies, she became one of the few women in the country to reach the top of the corporate ladder.
Now that she’s there, she wants to pull others up with her.
“In Japan now, in most companies, only old men make decisions,” Ms. Koshi, a partner at the law firm Miura & Partners, said during a recent interview. “If we have more female votes on boards, we can change companies,” she said, adding that “if more people join the decision-making process, that will change the culture and create innovation.”
Japanese companies are under growing pressure both at home and abroad to elevate more women to positions of authority. Next year, the Tokyo Stock Exchange will adopt new rules that push companies listed in its top tier to take steps to ensure diversity, including the promotion of women, a move that aligns it with other major stock markets. This month, Nasdaq received U.S. approval for a similar, albeit more far-reaching, policy.
The efforts in Japan are intended to overcome decades of unkept promises from political and business leaders to increase opportunities for Japanese women, who face some of the starkest inequality in the developed world. They remain less likely to be hired as full-time employees and on average earn almost 44 percent less than men. Many leave their jobs after having a child, and making up the lost time is almost impossible under Japan’s seniority-based system.
More...
https://www.nytimes.com/2021/08/25/busi ... 778d3e6de3
Frigoken Ltd | Investing in Sustainability and Workplace Wellness
Video:
https://www.youtube.com/watch?v=Ic_84mBFJoc
Aga Khan Development Network (AKDN)
Frigoken Limited – a project company of the Aga Khan Fund for Economic Development – is the largest vegetable processor in East and Central Africa. In this video, produced by Global Compact Network Kenya as part of its Sustainable Development Goals in Action series, Frigoken’s CEO Mr. Karim Dostmohamed discusses the company's investment in all-round workplace wellness to advance corporate sustainability. He details why this move was a business imperative for Frigoken and why other Kenyan companies should follow suit.
College Degrees Are Overrated
Recruiters are insisting on college degrees for jobs that don’t need them. Why? Risk aversion. If recruiters recommend a nongraduate who doesn’t work out, they’ll get blamed. Whereas, if they reject a nongraduate who would have been a huge success — well, no one will ever know, will they? It’s a costly but undetectable mistake.
Byron Auguste has co-founded a nonprofit organization, Opportunity@Work, whose purpose is to give a leg up to people he calls STARs, short for “skilled through alternative routes.” I interviewed him recently.
He told me that he’s haunted by the invisible tragedy of successful careers that never happen because applicants without college degrees aren’t given a chance. It affects first-time job-seekers, those stuck in dead-end careers, and older victims of layoffs who no longer qualify for the jobs they landed at a more forgiving time.
“It’s a pretty dysfunctional market in a lot of ways,” he says. “You’re not just giving extra weight to a bachelor’s. You’re insisting on it. And there’s no way to even learn what you’re missing. That’s why you can keep making this mistake over and over.”
In 1971, Auguste’s father left a job on a shipping dock to study computer programming. Despite lacking a college degree, he was hired by Detroit Edison. “That was where our family’s trajectory into the American middle class began,” Auguste says. Indeed: Auguste got a bachelor’s degree from Yale and a doctorate in economics from the University of Oxford. He spent 20 years at the consulting firm McKinsey & Co., rising to senior partner, then worked for President Barack Obama as deputy assistant to the president for economic policy and deputy director of the National Economic Council before cofounding Opportunity@Work in 2015.
It’s a classic American success story. Yet today, an employer might not take a chance on someone like Auguste’s father.
More...
https://www.nytimes.com/2021/10/18/opin ... 778d3e6de3
Book
New publication: Futureproof Your Career – How to Lead and Succeed in a Changing World
A new book titled Futureproof Your Career: How to Lead and Succeed in a Changing World by Professor Shaheena Janjuha-Jivraj and her colleague Neema Pasha has just been published.
The book is an essential guide to improving your career and taking full advantage of opportunities for progression. With a major focus on the changing business, economic, and technological landscape, it explores the new challenges of job retention and career progression.
Available at https://www.amazon.co.uk/Futureproof-Yo ... B091Y37WZC
Reviews and more...
https://ismailimail.blog/2021/10/25/new ... ing-world/
‘Great Attrition’ or ‘Great Attraction’? The choice is yours
A record number of employees are quitting or thinking about doing so. Organizations that take the time to learn why—and act thoughtfully—will have an edge in attracting and retaining talent.
More than 19 million US workers—and counting—have quit their jobs since April 2021, a record pace disrupting businesses everywhere. Companies are struggling to address the problem, and many will continue to struggle for one simple reason: they don’t really understand why their employees are leaving in the first place. Rather than take the time to investigate the true causes of attrition, many companies are jumping to well-intentioned quick fixes that fall flat: for example, they’re bumping up pay or financial perks, like offering “thank you” bonuses without making any effort to strengthen the relational ties people have with their colleagues and their employers. The result? Rather than sensing appreciation, employees sense a transaction. This transactional relationship reminds them that their real needs aren’t being met.
If the past 18 months have taught us anything, it’s that employees crave investment in the human aspects of work. Employees are tired, and many are grieving. They want a renewed and revised sense of purpose in their work. They want social and interpersonal connections with their colleagues and managers. They want to feel a sense of shared identity. Yes, they want pay, benefits, and perks, but more than that they want to feel valued by their organizations and managers. They want meaningful—though not necessarily in-person—interactions, not just transactions.
By not understanding what their employees are running from, and what they might gravitate to, company leaders are putting their very businesses at risk. Moreover, because many employers are handling the situation similarly—failing to invest in a more fulfilling employee experience and failing to meet new demands for autonomy and flexibility at work—some employees are deliberately choosing to withdraw entirely from traditional forms of full-time employment.
In this article, we highlight new McKinsey research into the nature and characteristics of the Great Attrition and what’s driving it (see sidebar, “About the research”). The bottom line: the Great Attrition is happening, it’s widespread and likely to persist—if not accelerate—and many companies don’t understand what’s really going on, despite their best efforts. These companies are making ineffective moves based on faulty assumptions.
Full article: https://www.mckinsey.com/business-funct ... ational-pe...
https://the.ismaili/portugal/%E2%80%98g ... oice-yours
Remote Work Is Failing Young Employees
Kiersten graduated from college straight into the middle of a pandemic and a precarious job market. She managed to find an entry-level job with a government contractor that allowed her to work from the safety of her home. There was no fanfare on her first day; she simply opened her laptop and began an endless series of training sessions conducted over Zoom. The sessions were helpful, Kiersten recalls, but very formal, with little room for socializing. Even among her fellow new hires, Kiersten felt at a remove. “I just stared at their Zoom boxes and willed us to be friends,” she told us. “But we never had the opportunity to interact.”
With time, she grew accustomed to the daily cadences of her job. But she still felt like a stranger at her own company, whose remote policies were haphazard at best. To chat, employees used an outdated version of Skype; in Zoom meetings, almost all co-workers left their cameras off. Months into her job, she could identify people only by their chat avatars and voices. At one point, she says, she began “obsessively stalking” her company’s Glassdoor reviews, just to try to get a sense of the company culture. She was, by her own admission, unmoored, totally unmentored and insecure, with no way to learn from her colleagues. It’s one thing to start a new job remotely. It’s another to start your entire career that way.
“I was shocked at how all the skills I had learned on how to navigate this type of environment in person evaporated remotely,” Kiersten said. “They feel entirely inaccessible to me now.” She’s not alone. While reporting “Out of Office,” a book we’re writing on remote work, we heard similar stories from early career workers who’ve felt adrift during the Covid-19 pandemic. (The participants, concerned about retaliation from their employers, agreed to speak with us about their experiences on the condition that we withhold their last names.) All were grateful to be employed, but many felt left behind, invisible and, in some cases, unsure about how to actually do their jobs. While their companies adapted their workflows to function outside the office, few spent the time to craft policies to mentor young professionals, many of whom found themselves stuck on their couches, attempting to decipher cryptic emails and emojis sent over Slack.
Most newcomers are terrified of messing up and hesitant to ask questions that might make them sound naïve. Which, of course, means that they’re also scared that they’re already failing. “I think I’m missing out on a lot of the soft skills that one picks up in the first few years of working,” Haziq, a 22-year-old living in Ireland, told us. He’s found it nearly impossible to socialize with colleagues and lacks the confidence to casually ask a question of his manager or teammates. “If I was sitting next to my manager, I could just have a quick chat and move on,” he said. “But I’m much less likely to Slack my manager and ask something because I don’t know what they’re up to at the moment. The amount of on-the-job learning has reduced dramatically.”
More...
https://www.nytimes.com/2021/11/22/opin ... 778d3e6de3
Daniel Kahneman’s Favorite Approach For Making Better Decisions
Bob Sutton’s book, Scaling Up Excellence: Getting to More Without Settling for Less https://www.amazon.ca/dp/0385347022/ref ... ative=9325 , contains an interesting section towards the end on looking back from the future, which talks about “a mind trick that goads and guides people to act on what they know and, in turn, amplifies their odds of success.”
We build on Nobel winner Daniel Kahneman’s favorite approach for making better decisions. This may sound weird, but it’s a form of imaginary time travel.
It’s called the premortem. And, while it may be Kahneman’s favorite, he didn’t come up with it. A fellow by the name of Gary Klein invented the premortem technique.
A premortem works something like this. When you’re on the verge of making a decision, not just any decision but a big decision, you call a meeting. At the meeting, you ask each member of your team to imagine that it’s a year later.
Split them into two groups. Have one group imagine that the effort was an unmitigated disaster. Have the other pretend it was a roaring success. Ask each member to work independently and generate reasons, or better yet, write a story, about why the success or failure occurred. Instruct them to be as detailed as possible, and, as Klein emphasizes, to identify causes that they wouldn’t usually mention “for fear of being impolite.” Next, have each person in the “failure” group read their list or story aloud, and record and collate the reasons. Repeat this process with the “success” group. Finally, use the reasons from both groups to strengthen your … plan. If you uncover overwhelming and impassable roadblocks, then go back to the drawing board.
Premortems encourage people to use “prospective hindsight,” or, more accurately, to talk in “future perfect tense.” Instead of thinking, “we will devote the next six months to implementing a new HR software initiative,” for example, we travel to the future and think, “we have devoted six months to implementing a new HR software package.”
You imagine that a concrete success or failure has occurred and look “back from the future” to tell a story about the causes.
[…]
Pretending that a success or failure has already occurred—and looking back and inventing the details of why it happened—seems almost absurdly simple. Yet renowned scholars including Kahneman, Klein, and Karl Weick supply compelling logic and evidence that this approach generates better decisions, predictions, and plans. Their work suggests several reasons why. …
1. This approach helps people overcome blind spots
As … upcoming events become more distant, people develop more grandiose and vague plans and overlook the nitty-gritty daily details required to achieve their long-term goals.
2. This approach helps people bridge short-term and long-term thinking
Weick argues that this shift is effective, in part, because it is far easier to imagine the detailed causes of a single outcome than to imagine multiple outcomes and try to explain why each may have occurred. Beyond that, analyzing a single event as if it has already occurred rather than pretending it might occur makes it seem more concrete and likely to actually happen, which motivates people to devote more attention to explaining it.
3. Looking back dampens excessive optimism
As Kahneman and other researchers show, most people overestimate the chances that good things will happen to them and underestimate the odds that they will face failures, delays, and setbacks. Kahneman adds that “in general, organizations really don’t like pessimists” and that when naysayers raise risks and drawbacks, they are viewed as “almost disloyal.”
Max Bazerman, a Harvard professor, believes that we’re less prone to irrational optimism when we predict the fate of projects that are not our own. For example, when it comes to friends’ home renovation projects, most people estimate the costs will run 25 to 50 percent over budget. When it comes to our own projects, however, they will be “completed on time and near the project costs.”
4. A premortem challenges the illusion of consensus
Most times, not everyone on a team agrees with the course of action. Even when you have enough cognitive diversity in the room, people still keep their mouths shut because people in power tend to reward people who agree with them while punishing those who dare to speak up with a dissenting view.
The resulting corrosive conformity is evident when people don’t raise private doubts, known risks, and inconvenient facts. In contrast, as Klein explains, a premortem can create a competition where members feel accountable for raising obstacles that others haven’t. “The whole dynamic changes from trying to avoid anything that might disrupt harmony to trying to surface potential problems.”
https://fs.blog/kahneman-better-decisions/
You Quit. I Quit. We All Quit. And It’s Not a Coincidence.
Why the decision to leave a job can become contagious.
Something infectious is spreading through the work force. Its symptoms present in a spate of two-week notices. Its transmission is visible in real time. And few bosses seem to know how to inoculate their staff against this quitagion.
It catches quickly. “There’s a shock when you see multiple people leaving — it’s like, oh, is there something I’m not seeing?” said Tiff Cheng, 27, who left her job in digital marketing in July, along with five of her close friends at the 40-person agency. “Is it my time to leave as well?”
Quitting rates were high in August, September and October. Then, according to Labor Department data, they climbed even further: More than 4.5 million people left their jobs voluntarily in November, a record high in two decades of tracking.
Economists explained the numbers by noting that competition for workers led to better pay and benefits, driving some to seek out new opportunities. Psychologists have an additional explanation: Quitting is contagious.
When workers weigh whether to jump jobs, they don’t just assess their own pay, benefits and career development. They look around and take note of how friends feel about the team culture. When one employee leaves, the departure signals to others that it might be time to take stock of their options, what researchers call “turnover contagion.”
So quitting begets more quitting, a challenge that employers can’t always solve with raises or perks. Even a single resignation notice can breed a “hot spot,” said Will Felps, who teaches management at the University of New South Wales and was an author of a study of turnover contagion.
Mr. Felps and his team studied staffing at a hospitality company and a selection of bank branches, all in the United States, and found that one worker’s decision to leave is especially likely to inspire others who don’t feel strongly embedded at the company. In a recent poll of more than 21,000 LinkedIn members, 59 percent said a colleague’s departure had led them to consider quitting as well.
The office has long been a petri dish for infectious behavior. Lying, cheating and job satisfaction all tend to spread from desk to desk. Financial advisers, for example, are 37 percent more likely to commit misconduct if they encounter teammates who have done so, what researchers refer to as “peer effects,” noting that one case of misconduct results on average in an additional 0.59 cases. Employees also mimic the nutritional patterns of people they sit with in the cafeteria. Teammates are suggestible to one another in far subtler ways than they realize.
But when it comes to heading for the exit, peer effects are particularly potent.
“When you walk by a restaurant and it’s full of people, it’s a clue this restaurant is pretty good,” Mr. Felps said. “Similarly, when the people you know, like and respect are leaving a job, you think maybe the grass is greener somewhere else.”
Ms. Cheng saw her inbox begin to fill with resignation notes last summer. Every other week she got an email from a colleague who was quitting her company, where hours were long and career advancement options seemed limited. She decided to turn full time to her own coaching business, which she now runs from Vancouver, British Columbia.
“It’s always really scary to make a decision to leave your job, and it was nice to be able to see other people were doing it,” Ms. Cheng said. “It didn’t feel as lonely, or like I was an outsider.”
Image: Nikissa Granados, inspired by seeing two of her colleagues resign, decided to leave her job at a school to do freelance social media marketing. Credit: Maggie Shannon for The New York Times
A sense of workplace disaffection and restlessness started growing for many Americans in the early stages of the pandemic. For some, social media became a therapy couch, a space to vent those employment frustrations.
Back in March 2020, Erika Cruz, 31, was working at a Silicon Valley start-up, where she had grown disgruntled with the hallmarks of work life: “meetings that could have been an email” and lack of control over her schedule.
She got the motivation she needed to leave that summer when she watched a friend she had met on Instagram ditch a cushy tech job to open a coaching firm. Then Ms. Cruz, who had about six months of living expenses saved up, moved back to her parents’ home in the Bay Area and put in her one month’s notice at work. She sought advice from social media about how to start a business. Ms. Cruz realized, though, that there was no one-size-fits-all approach to upending a career.
“If you Google banana breads, there’s over a million recipes online, and they’re all going to be good but they’re all slightly different,” she said. “You have to choose your own recipe.”
It’s the story of the pandemic: When people posted their banana bread photos, they influenced their friends to start baking as well. But like quitting, it was something no two people did the same way.
The friend who inspired Ms. Cruz’s resignation, Cat Del Carmen, 34, agreed that it was important to develop her own quitting strategy. Ms. Del Carmen was able to leave a job at Adobe by cutting back spending on restaurant meals, vacations and TJ Maxx splurges. The six months after she left her job were high pressure financially. Ms. Del Carmen drew comfort from her correspondence with friends on social media who were also navigating the post-paycheck territory.
That bond forged by resignation, as people look to one another for inspiration and affirmation, is a phenomenon that predates the pandemic.
“It’s a huge decision,” said Anthony Klotz, an organizational psychologist at Texas A&M University. “If you Google how to resign from your job, there’s lots of conflicting guidance. Those answers are not in a company handbook. It makes sense people reach out for sounding boards from trusted others.”
Aimee Wells, 53, who works in public relations, had her own quitagion experience years ago. She had been working at a global marketing firm in San Francisco, where she bristled at the time constraints of corporate life. She was never able to drop off her son at kindergarten. She remembered watching the 1996 movie “One Fine Day,” in which Michelle Pfeiffer plays an architect who decides to make her family a priority over high-powered work. It left Ms. Wells grappling with how to reset the balance between her own corporate job and personal life (far as it was from the realities of Ms. Pfeiffer and George Clooney’s).
One evening, on the train coming home from work at 6, she ran into a neighbor carrying shopping bags full of files, office supplies and photographs. The neighbor told Ms. Wells that she had just quit the role that was burning her out.
“I went home and started thinking about it a lot more seriously,” Ms. Wells said. One month later, she put in her own resignation notice, catalyzed by the run-in with her neighbor. “She was like my hero.”
The payoffs for some pandemic quitters have been significant. Nikissa Granados, 26, was weighing whether to leave her job at an Orange County, Calif., school in 2020 to do freelance social media marketing. She made the leap after seeing two of her teammates resign.
Ms. Granados went from making $2,100 a month, spending days on her feet setting up cots for nap time and begging children to wear their masks, to making as much as $8,000 monthly while dictating her own schedule, she said. She realized something now viscerally clear to many child care providers: In her work at the school, the mismatch between strain and pay had been stark.
For employers, replacing just one quitter is a straightforward task. But replacing several, or even dozens, is far more challenging, and the interim period tends to leave existing staff with a heavier load, while recruiters field awkward questions about what’s fueling all the departures. With quitting rates soaring, some executives are wondering how to lift morale.
Seth Besmertnik, chief executive of the marketing software company Conductor, had seen his company’s turnover rates hover in the low single digits for years. He even worried that his retention was too strong, making it hard to scout new talent.
Over the last two years, though, turnover rose into the double digits. Mr. Besmertnik had to get creative in his tactics to keep workers content, including adding new holidays and bringing Broadway actors from “Hamilton” and “Dear Evan Hansen” to sing “Burn” and “Waving Through a Window” (respectively) for staff during all-company video meetings.
Career coaches, meanwhile, worry that some people are being too easily influenced by the behaviors of their roaming colleagues. Kathryn Minshew, chief executive of the Muse, a job search site, warns clients that a single employee’s desire to leave a company shouldn’t have too much bearing on the decisions that friends make.
“When one person announces their resignation, there are usually some questions from their colleagues and workplace friends,” she said. “‘Where are you going? Why are you leaving?’”
That Pied Piper trail won’t always lead people to better options, and Ms. Minshew advises workers to assess their companies with the hyper-individualized approach they might take to building relationships.
“The idea that somebody would publish a list of the 50 best people to marry in New York City is silly,” she continued. “Similarly, I think the best companies to work for is a bit of a silly idea.”
But logical career advice can’t always prevent the contagion from catching.
“There’s a little bit of a ‘take this job and shove it’ feeling,” Ms. Wells said. “If you’re in a company where people all start leaving, you’re like, ‘Why am I the last one sitting here?’”
https://www.nytimes.com/2022/01/21/busi ... 778d3e6de3
The Cruel Lesson of a Single Medical Mistake
We all carry the memory of our mistakes. For health care workers like me, these memories surface in the early morning when we cannot sleep or at a bedside where, in some way, we are reminded of a patient who came before. Most were errors in judgment or near misses: a procedure we thought could wait, a subtle abnormality in vital signs that didn’t register as a harbinger of serious illness, an X-ray finding missed, a central line nearly placed in the wrong blood vessel. Even the best of us have stories of missteps, close calls that are caught before they ever cause patient harm.
But some are more devastating. RaDonda Vaught, a former Tennessee nurse, is awaiting sentencing for one particularly catastrophic case that took place in 2017. She administered a paralyzing medication to a patient before a scan instead of the sedative she intended to give to quell anxiety. The patient stopped breathing and ultimately died.
Precisely where all the blame for this tragedy lies remains debated. Ms. Vaught’s attorney argued his client made an honest mistake and faulted the mechanized medication dispensing system at the hospital where she worked. The prosecution maintained, however, that she “overlooked many obvious signs that she’d withdrawn the wrong drug” and failed to monitor her patient after the injection.
Criminal prosecutions for medical errors are rare, but Ms. Vaught was convicted in criminal court of two felonies and now faces up to eight years in prison. This outcome has been met with outrage by doctors and nurses across the country. Many worry that her case creates a dangerous precedent, a chilling effect that will discourage health care workers from reporting errors or close calls. Some nurses are even leaving the profession and citing this case as the final straw after years of caring for patients with Covid-19.
From my vantage point, it is not useful to speculate about where malpractice ends and criminal liability begins. But what I do know as an intensive care unit doctor is this: The pandemic has brought the health care system to the brink, and the Vaught case is not unimaginable, especially with current staffing shortages. That is, perhaps, the most troubling fact of all.
It has been more than 20 years since the Institute of Medicine released a groundbreaking report on preventable medical errors, arguing that errors are due not solely to individual health care providers but also to systems that need to be made safer. The authors called for a 50 percent reduction in errors over five years. Even so, there is still no mandatory, nationwide system for reporting adverse events from medical errors.
When patient safety experts talk about medical errors in the abstract, in lecture halls and classrooms, they talk about a culture of patient safety, which means an openness to discussing mistakes and safety concerns without shifting to individual blame. In reality, however, conversations around errors often have a different tone. Early in my intern year, a senior cardiologist gathered our team one morning, after one of my fellow interns failed to start antibiotics on a septic patient overnight. The intern had been busy with a sick new admission and had missed subtle changes in the now septic patient, who had spiraled into shock by the morning.
“You must never stop being terrified,” the attending doctor told us. Even after decades of practice, she remained in a constant state of high alert. When you allow yourself to neglect your usual compulsiveness, she said, that’s when mistakes happen. Not because of imperfect systems, overwork and divided attention but because an intern was not appropriately terrified.
I carried her words with me for years. I have repeated them to my own residents. And there is a truth here: The cost of distraction on our job can be life or death, and we cannot forget that. But I realize now that no one should have to maintain constant terror. Mistakes happen, even to the most vigilant, particularly when we are juggling multiple high-stress tasks. And that is why we need robust systems, to make sure that the inevitable human errors and missteps are caught before they result in patient harm.
The electronic health records we use now prompt doctors and nurses when patients’ combinations of vital signs and lab results suggest that they might be septic. This can be frustrating when we are fatigued by alarms and alerts, but it helps us recognize and react to patterns that a busy medical team might otherwise miss. Medications, likewise, must generally be approved by a pharmacist before they become available for a nurse to administer. Some hospitals create a no-talk zone where nurses withdraw these medications, because that process requires a focus that is often impossible in the frenzy of today’s hospitals.
Once the medication is in hand, nurses use a system to scan the drug along with the patient’s wristband to help ensure that the correct medication is given to the correct patient. None of these systems are perfect. But each serves to acknowledge that no individual can hold full responsibility for every step that leads to a patient outcome. Just being vigilant is not enough.
What’s needed alongside these systems is a culture in which doctors and nurses are empowered to speak up and ask questions when they are uncertain or when they suspect that one of their colleagues is making a mistake. This could mean that a nurse questions a doctor’s medication order and discovers it was intended for a different patient. Or that a junior doctor admits she is out of her depth when faced with a procedure that she should know how to do.
Stories in medicine so often celebrate an individual hero. We valorize the surgeon who performs the groundbreaking surgery but rarely acknowledge the layers of teamwork and checklists that made that win possible. Similarly, when a patient is harmed, it is natural to look for a person to blame, a bad apple who can be punished so that everything will feel safe again. It is far easier and more palatable to tell a story about a flawed doctor or a nurse than a flawed system of medication delivery and vital sign management.
But when it comes to medical errors, that is rarely the reality. Health care workers and the public must acknowledge that catastrophic outcomes can happen even to well-intentioned but overworked doctors and nurses who are practicing medicine in an imperfect system. Punishing one nurse does not ensure that a similar tragedy won’t occur in a different hospital on a different day. And regardless of the sentence that Ms. Vaught receives in May and whether it is fair, her case must be viewed as a story not just about individual responsibility but also about the failure of multiple systems and safeguards. That is a harder narrative to accept, but it is a necessary one, without which medicine will never change. And that, too, would be a tragic error but one that is still in our power to prevent.
https://www.nytimes.com/2022/04/15/opin ... 778d3e6de3
But some are more devastating. RaDonda Vaught, a former Tennessee nurse, is awaiting sentencing for one particularly catastrophic case that took place in 2017. She administered a paralyzing medication to a patient before a scan instead of the sedative she intended to give to quell anxiety. The patient stopped breathing and ultimately died.
Precisely where all the blame for this tragedy lies remains debated. Ms. Vaught’s attorney argued his client made an honest mistake and faulted the mechanized medication dispensing system at the hospital where she worked. The prosecution maintained, however, that she “overlooked many obvious signs that she’d withdrawn the wrong drug” and failed to monitor her patient after the injection.
Criminal prosecutions for medical errors are rare, but Ms. Vaught was convicted in criminal court of two felonies and now faces up to eight years in prison. This outcome has been met with outrage by doctors and nurses across the country. Many worry that her case creates a dangerous precedent, a chilling effect that will discourage health care workers from reporting errors or close calls. Some nurses are even leaving the profession and citing this case as the final straw after years of caring for patients with Covid-19.
From my vantage point, it is not useful to speculate about where malpractice ends and criminal liability begins. But what I do know as an intensive care unit doctor is this: The pandemic has brought the health care system to the brink, and the Vaught case is not unimaginable, especially with current staffing shortages. That is, perhaps, the most troubling fact of all.
It has been more than 20 years since the Institute of Medicine released a groundbreaking report on preventable medical errors, arguing that errors are due not solely to individual health care providers but also to systems that need to be made safer. The authors called for a 50 percent reduction in errors over five years. Even so, there is still no mandatory, nationwide system for reporting adverse events from medical errors.
When patient safety experts talk about medical errors in the abstract, in lecture halls and classrooms, they talk about a culture of patient safety, which means an openness to discussing mistakes and safety concerns without shifting to individual blame. In reality, however, conversations around errors often have a different tone. Early in my intern year, a senior cardiologist gathered our team one morning, after one of my fellow interns failed to start antibiotics on a septic patient overnight. The intern had been busy with a sick new admission and had missed subtle changes in the now septic patient, who had spiraled into shock by the morning.
“You must never stop being terrified,” the attending doctor told us. Even after decades of practice, she remained in a constant state of high alert. When you allow yourself to neglect your usual compulsiveness, she said, that’s when mistakes happen. Not because of imperfect systems, overwork and divided attention but because an intern was not appropriately terrified.
I carried her words with me for years. I have repeated them to my own residents. And there is a truth here: The cost of distraction in our work can be life or death, and we cannot forget that. But I realize now that no one should have to maintain constant terror. Mistakes happen, even to the most vigilant, particularly when we are juggling multiple high-stress tasks. And that is why we need robust systems, to make sure that the inevitable human errors and missteps are caught before they result in patient harm.
The electronic health records we use now prompt doctors and nurses when a patient’s combination of vital signs and lab results suggests sepsis. This can be frustrating when we are fatigued by alarms and alerts, but it helps us recognize and react to patterns that a busy medical team might otherwise miss. Medications, in turn, must generally be approved by a pharmacist before they become available for a nurse to administer. Some hospitals create a no-talk zone where nurses withdraw these medications, because that step requires a focus that is often impossible in the frenzy of today’s hospitals.
Once the medication is in hand, nurses use a system to scan the drug along with the patient’s wristband to help ensure that the correct medication is given to the correct patient. None of these systems are perfect. But each serves to acknowledge that no individual can hold full responsibility for every step that leads to a patient outcome. Just being vigilant is not enough.
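To make the double-check concrete, here is a minimal, hypothetical sketch of the kind of verification such a scanning system performs: the drug and the patient’s wristband are both scanned, and the pair is checked against a pharmacist-approved order before the dose can be given. The names, codes and data structures below are illustrative assumptions, not any hospital’s actual software.

from dataclasses import dataclass

@dataclass
class Order:
    patient_id: str        # expected value from the wristband barcode
    medication_code: str   # expected value from the drug package barcode
    approved_by_pharmacist: bool

def safe_to_administer(order: Order, scanned_patient: str, scanned_med: str) -> bool:
    """Allow administration only when both scans match an approved order."""
    return (
        order.approved_by_pharmacist
        and scanned_patient == order.patient_id
        and scanned_med == order.medication_code
    )

# Example with made-up identifiers: a mismatch between the ordered sedative
# and the scanned drug is flagged before it reaches the patient.
order = Order(patient_id="P-1047", medication_code="SEDATIVE-2MG", approved_by_pharmacist=True)
print(safe_to_administer(order, "P-1047", "SEDATIVE-2MG"))    # True: proceed
print(safe_to_administer(order, "P-1047", "PARALYTIC-10MG"))  # False: stop and question

The point of such a check is not that software replaces vigilance, but that it catches the predictable mismatch before a tired human has to.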
What’s needed alongside these systems is a culture in which doctors and nurses are empowered to speak up and ask questions when they are uncertain or when they suspect that one of their colleagues is making a mistake. This could mean that a nurse questions a doctor’s medication order and discovers it was intended for a different patient. Or that a junior doctor admits she is out of her depth when faced with a procedure that she should know how to do.
Stories in medicine so often celebrate an individual hero. We valorize the surgeon who performs the groundbreaking surgery but rarely acknowledge the layers of teamwork and checklists that made that win possible. Similarly, when a patient is harmed, it is natural to look for a person to blame, a bad apple who can be punished so that everything will feel safe again. It is far easier and more palatable to tell a story about a flawed doctor or a nurse than a flawed system of medication delivery and vital sign management.
But when it comes to medical errors, that is rarely the reality. Health care workers and the public must acknowledge that catastrophic outcomes can happen even to well-intentioned but overworked doctors and nurses who are practicing medicine in an imperfect system. Punishing one nurse does not ensure that a similar tragedy won’t occur in a different hospital on a different day. And regardless of the sentence that Ms. Vaught receives in May and whether it is fair, her case must be viewed as a story not just about individual responsibility but also about the failure of multiple systems and safeguards. That is a harder narrative to accept, but it is a necessary one, without which medicine will never change. And that, too, would be a tragic error but one that is still in our power to prevent.
https://www.nytimes.com/2022/04/15/opin ... 778d3e6de3
Can Business Schools Really Help Us ‘Reimagine Capitalism’?
This Is Not Your Grandfather’s M.B.A.
If you want to be a leader confident in your deepest values and your role in the universe, go to business school. At least, that’s what business schools say. In recent years, they have branded themselves as places where students learn to stay “true to your mission” and undertake a “truly life-changing experience” that values “health, happiness, and purpose” as well as “authenticity and renewed passion.”
Marketing teams across higher education are fond of quasi-spiritual tag lines, so it might be unfair to pick on business schools. But in the M.B.A. world, the latest, breathless versions of these slogans signal more than the generic American vocation to make money and live your best life now. What is remarkable is this: After decades of emphasis on financial markets and shareholder returns, business schools are trying to take on deeper philosophical problems — including, maybe, tentative questions about the means and ends of capitalism itself.
Over the last few years, student interest in the social impact of business has soared. Even before the pandemic, business schools were offering initiatives and program concentrations with names like “Conscientious Capitalism” and “Sustainable Business,” in line with investors’ growing interest in “environmental, social, governance” considerations.
“There’s been a little tempering of the fervor for laissez-faire capitalism. There’s healthy conversation about that,” said Brian Lowery, a professor at the Stanford Graduate School of Business, where he recently taught a course on “Reimagining Work Post-Covid.”
Such conversations reflect a longstanding ambivalence about what, exactly, business schools are for. Is their purpose to train general managers as a professional class with a shared body of knowledge, like lawyers or doctors? Or should they provide targeted programs that offer technical skills? Are they a kind of divinity school for secular capitalists, where students discern their true vocation? Today’s business schools try to fulfill all these aims at once — but it is hard to teach narrow, applied skills and also encourage students to wrestle with giant, ambiguous questions about ultimate values and hierarchies of power.
The current surge of interest in deeper questions is not new, but rather a return to the original aims of the first modern business schools. The goal of the Tuck School of Business, founded in 1900 at Dartmouth College, was to educate “the man first and the businessman afterwards.” At the dedication of Harvard Business School’s new campus in 1927, one speaker declared “that the ministers of our business, like the ministers of our churches, should appreciate their responsibility.” He stressed the need for businessmen to have a wide-ranging education, to become “men who have not only a broad outlook in history, politics, and economics — but men who have also that moral and religious training which tends to develop character.”
Then, as now, these grand declarations reflected a mix of sincere conviction and a desire to persuade skeptics that training students to make more money can also be a genuine intellectual enterprise.
Historians of business education have traced the rise and fall of this ideal of “the C.E.O. as enlightened corporate statesman,” as the Harvard sociologist Rakesh Khurana put it in his book “From Higher Aims to Hired Hands.” Faith that managers could — and should — have long-range vision and a sense of public responsibility crumbled in the economic crises of the 1970s. The corporate models that emerged from the wreckage recast executives — and aspiring managers at business schools — primarily as agents of shareholders, indentured to serve the stock market price or valuation of private shares before all else.
This mind-set has pushed business schools to train managers to maximize shareholder value on quarterly returns, in the same way a NASCAR crew chief trains to manage a pit crew to get the car back on the track as quickly and efficiently as possible. This has left little room for that older ambition to cultivate character or wide-ranging intellectual curiosity — although business schools papered over the void by embracing the language of positive psychology and an amorphous idea of “leadership.”
Plenty of critics inside business schools have noted this reluctance to ask big-picture questions, despite the fad for genuflecting to environmental, social and governance concerns. Some note that schools are adept at defanging detractors, cordoning them off in their own professional journals and conferences and keeping them on payroll.
“I’ve been rewarded for being as cheeky as possible,” Martin Parker, who teaches at the University of Bristol’s School of Management, told me. When his current employers hired him, they knew he was about to publish a book called “Shut Down the Business School,” but they didn’t mind. “That doesn’t say they were particularly brave, but that my critique doesn’t matter very much,” Dr. Parker told me. “It’s not particularly threatening. I’m being petted by the emperor.”
Diversity initiatives and attention to environmental and social impact, he said, “amount to a green-washing, or ethics-washing, and conceal the major epistemological and structural issues that business schools assume, and glosses them with a particular kind of website fluff. It’s liberal fairy dust. Others don’t see it that way. They think capitalism just needs to become quite a bit nicer, that we need to orient corporations toward more benign investment strategies and less toxic relations with workers. That would be good — I’m not against small steps — but that diagnosis doesn’t reflect the nature of the problem we have.”
Even professors who push the envelope in their research pull back from challenging the instrumentalism of business schools, the focus on supposedly neutral tools and skills. Professor Lowery of Stanford, whom I mentioned earlier, is a social psychologist who studies the intersection of race and class. But he keeps normative questions out of the classroom. “Most of what I teach is designed to be as neutral as possible in terms of the explicit morality of what you should do,” he said. “I say this explicitly to students: The content is amoral. You can use it to achieve any sort of goal. It just helps you understand how people operate in social environments. It’s a set of tools.”
Dr. Lowery has taught Stanford’s most popular elective, a decades-old course called “Interpersonal Dynamics” (nicknamed the “Touchy Feely” course) in which students exchange candid feedback in intense sessions that many compare to group therapy. Students rave about the experience, which is based on the psychologist Kurt Lewin’s “training group” sessions in the 1940s, a precursor of modern workplace sensitivity programs. This sounds like a welcome break from a curriculum full of financial instruments and quantitative modeling, although the course is perhaps not all that different: Students are simply studying the efficient management and transfer of emotions.
Kelsey Aijala, a student at Stanford who is graduating this spring, told me that the leadership courses she has taken “are not values-driven — not forcing you to think about your values as a leader. There are courses about a purpose-driven life, and I’m taking a course now about strategic pivoting, but these are not asking you to think about your role in society. It’s more like ‘designing your life,’ and the curriculum still sits within a traditional business skill set.” It is hard to see how students can “find their purpose” in a curriculum too focused on sharpening tools to ask what those tools are for.
This may sound like the critique of a fuzzy-headed humanist who has no idea how the real world works, but I’m only echoing the conclusions of insiders. The businessman “needs breadth of knowledge, a sense of historical perspective, and flexibility of mind,” wrote the authors of “Higher Education for Business,” a 1959 study commissioned by the Ford Foundation. “He needs also to have a sensitive and sophisticated appreciation of the role which business does and can play in our kind of society. All this implies some familiarity with the more relevant branches of history and perhaps philosophy, and some knowledge of the social sciences, particularly economics, political science, and sociology.”
The Ford report — and a similar one sponsored the same year by the Carnegie Corporation — warned against ignoring the humanities or allowing faculty members and students to specialize too narrowly. Yet the funding that followed pushed schools in the opposite direction, consistent with the 1960s vogue for number-crunching wonkishness. Business schools embraced the hyper-specialization that pervades the rest of academia, falling especially under the thrall of economics and other heavily quantitative disciplines.
This fragmentation has accelerated in recent years as the more expansive M.B.A. degree has ceded ground to shorter, narrower master’s degrees in topics like marketing and operations, often tailored to specific occupational contexts like health care or technology. Many programs permit students to sample electives in other parts of the university, but offer little structure for pulling this hodgepodge together. Business schools now pump “out over half a million narrow specialists per year” into an economic culture that prizes quick returns and efficiency, Roger Martin, the former dean of the University of Toronto’s Rotman School of Management, wrote in his recent book “When More Is Not Better.”
“Business schools have long promised, ‘We’ll make you this general, leaderly kind of person,’ but they don’t,” Mr. Martin told me. “You come and get taught a bunch of narrow disciplines, and the assumption is, oh, the students will figure out how to fit those together. They will integrate across those fields and become general managers. But most don’t.” He lamented the absence of the humanities, qualitative disciplines that “teach someone how to think in a complex adaptive system. We treat that system like something else — we silo-ize it, break it into chunks, put it back together and think it will be fine. The humanities are the only hope for thinking about things in holistic, non-quantifiable ways.”
Here is the central tension of modern business education: At a time when society needs managers who can grapple with uncertainty and operate in a culture divided over basic questions of justice and human flourishing, most business schools still emphasize specialized skills and quantitative methods, the seductive simplicity of economic and social scientific models. They often reduce the weirdness of human organizations to the tidy pedagogy of the case method, in which students discuss 15- to 20-page accounts of how an individual or a corporation handled some task or crisis.
“The case method is theater,” Mr. Martin said. “There’s a case, and then there’s a teaching note that says what the point of the case is. Some notes will be as specific as to say: ‘ask the following question, wait ’til you get this answer, then write that out on the board.’ It’s no different from Shakespeare — people have lines, there’s three acts, everyone plays their role, and you know the answer ahead of time.”
The case method does not dominate every business school, but Harvard Business School, where the method originated, sold more than 15 million cases to other schools and organizations in 2020. Mr. Martin estimated that 30 percent of North American business education is “aided and abetted by an H.B.S. case.” Ms. Aijala, the student at Stanford, said that the case method “can be helpful to grapple with some of the dilemmas that business leaders have faced, but we’re usually doing it in a rapid-fire way that I don’t think promotes critical reflective thinking on deeper issues. Because participation is something you’re evaluated on in class, it promotes saying something for the sake of saying something, and doesn’t create space for deeper questioning.”
Yet the crafting and teaching of cases has become more nuanced in recent years, partly in response to the focus on environmental and social impact. “We do this nice thing where in each case, you map out all the stakeholders in the process,” Cynthia Madu, who is about to graduate from the Tuck School at Dartmouth, told me. “It lets us identify all the people actually there, so we don’t think it’s the C.E.O. doing everything. Also, if students know there’s not one single narrative, they’re more willing to debate in class whether it’s the correct narrative.”
Business school professors are also broadening the kinds of questions deemed relevant to modern business. “I’ve been buoyed by the diversity of what’s now considered economic research,” Ethan Rouen, a professor at Harvard Business School who teaches a course called “Reimagining Capitalism,” told me. “At H.B.S. we have people doing research on gun control, on the Rohingya genocide. This is new, and every year it’s going more in that direction.”
More than a half-century ago, the Ford Foundation report noted that “business itself is pulled in two directions,” needing managers with “breadth, perspective and flexibility of mind” as well as “better trained specialists.” Back then, business schools gave in to the technocratic tide. It’s time to revisit that other, harder direction — the one that admits that measuring and modeling are not the same as understanding, and sees “Environmental, Social, and Governance” not as politically fashionable hand-waving but as a call to center the M.B.A. on big, hard questions.
Students themselves are pushing for this change. “When I was doing my M.B.A., a large number of students came into business school thinking it was a respite from whatever they were doing. They’d leave banking or consulting, do the M.B.A., then go back for a higher salary,” Dr. Rouen said. “Now so few students come in with that mind-set. Most come in thinking this is an opportunity to figure things out.”
Molly Worthen is the author, most recently, of the audio course “Charismatic Leaders Who Remade America” and an associate professor of history at the University of North Carolina at Chapel Hill.
https://www.nytimes.com/2022/05/05/opin ... 778d3e6de3
In the Battle With Robots, Human Workers Are Winning
Why do I still have a job?
It’s a question readers ask me often, but I mean it more universally: Why do so many of us still have jobs?
It’s 2022, and computers keep stunning us with their achievements. Artificial intelligence systems are writing, drawing, creating videos, diagnosing diseases, dreaming up new molecules for medicine and doing much else to make their parents very proud. Yet somehow we sacks of meat — though prone to exhaustion, distraction, injury and sometimes spectacular error — remain in high demand. How did this happen? Weren’t humans supposed to have been replaced by now — or at least severely undermined by the indefatigable go-getter robots who were said to be gunning for our jobs?
I’ve been thinking about this a lot recently. In part it’s because I was among the worriers — I started warning about the coming robotic threat to human employment in 2011. As the decade progressed and artificial intelligence systems began to surpass even their inventors’ expectations, evidence for the danger seemed to pile up. In 2013, a study by an Oxford economist and an A.I. scientist estimated that 47 percent of jobs are “at risk” of being replaced by computers. In 2017, the McKinsey Global Institute estimated that automation could displace hundreds of millions of workers by 2030, and global economic leaders were discussing what to do about the “robocalypse.” In the 2020 campaign, A.I.’s threat to employment became a topic of presidential debates.
Even then, predictions of robot dominance were not quite panning out, but the pandemic and its aftermath ought to radically shift our thinking. Now, as central bankers around the world are rushing to cool labor markets and tame inflation — a lot of policymakers are hoping that this week’s employment report shows declining demand for new workers — a few economic and technological truths have become evident.
First, humans have been underestimated. It turns out that we (well, many of us) are really amazing at what we do, and for the foreseeable future we are likely to prove indispensable across a range of industries, especially column-writing. Computers, meanwhile, have been overestimated. Though machines can look indomitable in demonstrations, in the real world A.I. has turned out to be a poorer replacement for humans than its boosters have prophesied.
What’s more, the entire project of pitting A.I. against people is beginning to look pretty silly, because the likeliest outcome is what has pretty much always happened when humans acquire new technologies — the technology augments our capabilities rather than replaces us. Is “this time different,” as many Cassandras took to warning over the past few years? It’s looking like not. Decades from now I suspect we’ll have seen that artificial intelligence and people are like peanut butter and jelly: better together.
It was a recent paper by Michael Handel, a sociologist at the Bureau of Labor Statistics, that helped me clarify the picture. Handel has been studying the relationship between technology and jobs for decades, and he’s been skeptical of the claim that technology is advancing faster than human workers can adapt to the changes. In the recent analysis, he examined long-term employment trends across more than two dozen job categories that technologists have warned were particularly vulnerable to automation. Among these were financial advisers, translators, lawyers, doctors, fast-food workers, retail workers, truck drivers, journalists and, poetically, computer programmers.
His upshot: Humans are pretty handily winning the job market. Job categories that a few years ago were said to be doomed by A.I. are doing just fine. The data show “little support” for “the idea of a general acceleration of job loss or a structural break with trends pre-dating the A.I. revolution,” Handel writes.
Consider radiologists, high-paid medical doctors who undergo years of specialty training to diagnose diseases through imaging procedures like X-rays and MRIs. As a matter of technology, what radiologists do looks highly susceptible to automation. Machine learning systems have made computers very good at this sort of task; if you feed a computer enough chest X-rays showing diseases, for instance, it can learn to diagnose those conditions — often faster and with accuracy rivaling or exceeding that of human doctors.
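As a toy illustration of the supervised-learning loop that paragraph describes, the sketch below fits a model on many labeled examples and then predicts labels for examples it has never seen. It uses synthetic feature vectors and a simple scikit-learn classifier purely to show the fit-then-predict idea; real chest X-ray systems train deep neural networks on large collections of actual images, and none of the numbers here correspond to real data.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Pretend each "image" is a 64-dimensional feature vector; label 1 means disease present.
X = rng.normal(size=(1000, 64))
true_weights = rng.normal(size=64)
y = (X @ true_weights + rng.normal(scale=0.5, size=1000) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

# "Feed it enough labeled examples" and it learns the pattern well enough
# to label held-out cases it was never shown during training.
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"held-out accuracy: {model.score(X_test, y_test):.2f}")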
Such developments once provoked alarm in the field. In 2016, an article in The Journal of the American College of Radiology warned that machine learning “could end radiology as a thriving speciality.” The same year, Geoffrey Hinton, one of the originators of machine learning, said that “people should stop training radiologists now” because it was “completely obvious that within five years deep learning is going to be better than radiologists.”
Hinton later added that it could take 10 years, so he may still prove correct — but Handel points out that the numbers aren’t looking good for him. Rather than dying as an occupation, radiology has seen steady growth; between 2000 and 2019, the number of radiologists whose main activity was patient care grew by an average of about 15 percent per decade, Handel found. Some in the field are even worried about a looming shortage of radiologists that will result in longer turnaround times for imaging diagnoses.
How did radiologists survive the A.I. invasion? In a 2019 paper in the journal Radiology: Artificial Intelligence, Curtis Langlotz, a radiologist at Stanford, offered a few reasons. One is that humans still routinely outperform machines — even if computers can get very good at spotting certain kinds of diseases, they may lack data to diagnose rarer conditions that an experienced human expert can easily spot. Radiologists are also adaptable; technological advances (like CT scans and MRIs) have been common in the field, and one of the primary jobs of a human radiologist is to understand and protect patients against the shortcomings of technologies used in the practice. Other experts have pointed to the complications of the health care industry — questions about insurance, liability, patient comfort, ethics and business consolidation may be just as important to the rollout of a new technology as its technical performance.
Langlotz concluded that “Will A.I. replace radiologists?” is “the wrong question.” Instead, he wrote, “The right answer is: Radiologists who use A.I. will replace radiologists who don’t.”
Similar trends have played out in lots of other jobs thought to be vulnerable to A.I. Will truck drivers be outmoded by self-driving trucks? Perhaps someday, but as The Times’s A.I. reporter Cade Metz recently pointed out, the technology is perpetually just a few years away from being ready and is “a long way from the moment trucks can drive anywhere on their own.” No wonder, then, that the end of the road for truck drivers is nowhere near — the government projects that the number of truck-driving jobs will grow over the next decade.
How about fast-food workers, who were said to be replaceable by robotic food-prep machines and self-ordering kiosks? They’re safe too, Chris Kempczinski, the C.E.O. of McDonald’s, said in an earnings call this summer. Even with a shortage of fast-food workers, robots “may be great for garnering headlines” but are simply “not practical for the vast majority of restaurants,” he said.
It’s possible, even likely, that all of these systems will improve. But there’s no evidence it will happen overnight, or quickly enough to result in catastrophic job losses in the short term.
“I don’t want to minimize the pain and adjustment costs for people who are impacted by technological change,” Handel told me. “But when you look at it, you just don’t see a lot — you just don’t see anything as much as being claimed.”
https://www.nytimes.com/2022/10/07/opin ... 778d3e6de3