Under Employers’ Gaze, Gen Z Is Biting Its Tongue On Social Media
April 13, 2019, 5:00 AM ET
The oldest members of Generation Z are around 22 years old — now entering the workforce and adjusting their social media accordingly. They are holding back from posting political opinions and personal information in favor of posting about professional accomplishments.
Only about 1 in 10 teenagers say they share personal, religious or political beliefs on social media, according to a recent survey from Pew Research Center.
70 percent of employers and recruiters say they check social media during the hiring process, according to a survey conducted by CareerBuilder.
Generation Z, nicknamed “iGen,” is the post-millennial generation responsible for “killing” Facebook and for the rise of TikTok.
Curricula like Common Sense Education’s digital citizenship program are working to educate the younger generation on how to use social media, something the older generations were never taught.
Some users are regularly cleaning up — “re-curating” — their online profiles. Cleanup apps, like TweetDelete,
Gen Zers also use social media in more ephemeral ways than older generations — Snapchat stories that disappear after 24 hours, or Instagram posts that they archive a couple of months later.
Gen Zers already use a multitude of strategies to make sure their online presence is visible only to who they want: They set their account to private, change their profile name or handle, even make completely separate “fake” accounts.
more on social media in this IMS blog
Gen Z is coming to your office. Get ready to adapt
Janet Adamy, Sept 6, 2018
Early signs suggest Gen Z workers are more competitive and pragmatic, but also more anxious and reserved, than millennials, the generation of 72 million born from 1981 to 1996, according to executives, managers, generational consultants and multidecade studies of young people. Gen Zers are also the most racially diverse generation in American history.
With the generation of baby boomers retiring and unemployment at historic lows, Gen Z is filling immense gaps in the workforce. Employers, plagued by worker shortages, are trying to adapt.
LinkedIn Corp. and Intuit Inc. have eased requirements that certain hires hold bachelor’s degrees to reach young adults who couldn’t afford college. At campus recruiting events, EY is raffling off computer tablets because competition for top talent is intense.
Companies are reworking training so it replicates YouTube-style videos that appeal to Gen Z workers reared on smartphones.
“They learn new information much more quickly than their predecessors.”
A few years ago Mr. Stewart noticed that Gen Z hires behaved differently than their predecessors. When the company launched a project to support branch managers, millennials excitedly teamed up and worked together. Gen Z workers wanted individual recognition and extra pay.
Much of Gen Z’s socializing takes place via text messages and social media platforms—a shift that has eroded natural interactions and allowed bullying to play out in front of wider audiences.
The flip side of being digital natives is that Gen Z is even more adept with technology than millennials. Natasha Stough, Americas campus recruiting director at EY in Chicago, was wowed by a young hire who created a bot to answer questions on the company’s Facebook careers page.
To lure more Gen Z workers, EY rolled out video technology that allows job candidates to record answers to interview questions and submit them electronically.
LinkedIn, which used to recruit from about a dozen colleges, broadened its efforts to include hundreds of schools and computer coding boot camps to capture a diverse applicant pool that mirrors the changing population.
more on Gen Z in this IMS blog
Can We Please Stop Talking About Generations as if They Are a Thing?
Millennials are not all narcissists and boomers are not inherently selfish. The research on generations is flawed.
APRIL 13, 2018 9:00 AM
Siva Vaidhyanathan, “Generational Myth,” 2008. https://www.chronicle.com/article/Generational-Myth/32491
My note: Siva raised this issue from a sociologist's point of view as early as 2008. Even before him, Prensky's "digital natives" idea had been criticized.
Howe and Strauss's Millennials books contributed to the overgeneralizations. https://en.wikipedia.org/wiki/Strauss%E2%80%93Howe_generational_theory
We spend a lot of time debating the characteristics of generations—are baby boomers really selfish and entitled, are millennials really narcissists, and the latest, has the next generation (whatever it is going to be called) already been ruined by cellphones? Many academics—and many consultants—argue that generations are distinct and that organizations, educators, and even parents need to accommodate them. These classifications are often met with resistance from those they supposedly represent, as most people dislike being represented by overgeneralizations, and these disputes only fuel the debate around this contentious topic.
In short, the science shows that generations are not a thing.
It is important to be clear what not a thing means. It does not mean that people today are the same as people 80 years ago or that anything else is static. Times change and so do people. However, the idea that distinct generations capture and represent these changes is unsupported.
What is a generation? Those who promote the concept define it as a group of people who are roughly the same age and who were influenced by a set of significant events. These experiences supposedly create commonalities, making those in the group more similar to each other and more different from other groups now and from groups of the same age in the past.
In line with the definition, there is a commonly held perception that people growing up around the same time and in the same place must have some sort of universally shared set of experiences and characteristics. It helps that the idea of generations intuitively makes sense. But the science does not support it. In fact, most of the research findings showing distinct generations are explained by other causes, have serious scientific flaws, or both.
For example, millennials score lower on job satisfaction than Gen Xers, but are millennials really a less satisfied generation? Early in their careers, Xers were also less satisfied than baby boomers. That is an age effect, not a generational one.
Numerous books, articles, and pundits have claimed that millennials are much more narcissistic than young people in the past.
on average, millennials are no more narcissistic now than Xers or boomers were when they were in their 20s, and one study has even found they might be less so than generations past. While millennials today may be more narcissistic than Xers or boomers are today, that is because young people are pretty narcissistic regardless of when they are young. This too is an age effect.
Final example. Research shows that millennials joining the Army now show more pride in their service than boomers or Xers did when they joined 20-plus years ago. Is this a generational effect? Nope. Everyone in the military now shows more pride on average than 20 years ago because of 9/11. The terrorist attack increased military pride across the board. This is known as a period effect and it doesn’t have anything to do with generations.
Another problem—identifying true generational effects is methodologically very hard. The only way to do it would be to collect data from multiple longitudinal panels. Individuals in the first panel would be measured at the start of the study and then in subsequent years with new panels added every year thereafter, allowing assessment of whether people were changing because they were getting older (age effects), because of what was happening around them (period effects), or because of their generation (cohort effects). Unfortunately, such data sets pretty much do not exist. Thus, we’re never really able to determine why a change occurred.
According to one national-culture model, people from the United States are, on average, relatively individualistic, indulgent, and uncomfortable with hierarchical order.
My note: Richard Nisbett sides with Hofstede and Minkov: http://blog.stcloudstate.edu/ims/2016/06/14/cultural-differences/
Conversely, people from China are generally group-oriented, restrained, and comfortable with hierarchy. However, these countries are so large and diverse that they each have millions of individuals who are more similar to the “averages” of the other country than to their own.
Given these design and data issues, it is not surprising that researchers have tried a variety of different statistical techniques to massage (aka torture) the data in an attempt to find generational differences. Studies showing generational differences have used statistical techniques like analysis of variance (ANOVA) and cross-temporal meta-analysis (CTMA), neither of which is capable of actually attributing the differences to generations.
The statistical challenge derives from the problem we have already raised—generations (i.e., cohorts) are defined by age and period. As such, mathematically separating age, period, and cohort effects is very difficult because they are inherently confounded with one another. Their linear dependency creates what is known as an identification problem, and unless one has access to multiple longitudinal panels like I described above, it is impossible to statistically isolate the unique effect of any one factor.
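The linear dependency can be seen directly in a small numerical sketch (the survey years and ages below are made up for illustration): because cohort = period − age, a design matrix containing all three predictors is rank-deficient, so no regression on a single data set can estimate the three effects separately.

```python
import numpy as np

# Hypothetical survey: each row records the year of measurement
# (period) and the respondent's age at that time.
period = np.array([2000, 2000, 2010, 2010, 2020, 2020])
age    = np.array([  25,   45,   25,   45,   25,   45])
cohort = period - age  # birth year is fully determined by the other two

# Design matrix with an intercept plus all three predictors.
X = np.column_stack([np.ones_like(age), age, period, cohort])

# Because cohort = period - age, the four columns are linearly
# dependent: the matrix has rank 3, not 4, so age, period, and
# cohort effects cannot be separately identified.
print(np.linalg.matrix_rank(X))  # 3
```

Any one of the three predictors can be written exactly as a combination of the other two, which is why the multiple longitudinal panels described above are needed to break the confound.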
First, relying on flawed generational science leads to poor advice and bad decisions. An analogy: Women live longer than men, on average. Why? They engage in fewer risky behaviors, take better care of themselves, and have two X chromosomes, giving them backups in case of mutations. But if you are a man and you go to the doctor and ask how to live longer, she doesn’t tell you, “Be a woman.” She says eat better, exercise, and don’t do stupid stuff. Knowing the why guides the recommendation.
Now imagine you are a manager trying to retain your supposedly job-hopping, commitment-averse millennial employees and you know that Xers and boomers are less likely to leave their jobs. If you are that manager, you wouldn’t tell your millennial employees to “be a boomer” or “grow older” (nor would you decide to hire boomers or Xers rather than millennials—remember that individuals vary within populations). Instead, you should focus on addressing benefits, work conditions, and other factors that are reasons for leaving.
Second, this focus on generational distinctions wastes resources. Take the millennials-as-commitment-averse-job-hoppers stereotype. Based on this belief, consultants sell businesses on how to recruit and retain this mercurial generation. But are all (or even most) millennials job-hopping commitment avoiders? Survey research shows that millennials and Xers at the same point in their careers are equally likely to stay with their current employer for five or more years (22 percent v. 21.8 percent). It makes no sense for organizations to spend time and money changing HR policies when employees are just as likely to stick around today as they were 15 years ago.
Third, generations perpetuate stereotyping. Ask millennials if they are narcissistic job-hoppers and most of them will rightly be offended. Treat boomers like materialistic achievement seekers and see how it affects their work quality and commitment. We finally are starting to recognize that those within any specific group of people are varied individuals, and we should remember those same principles in this context too. We are (mostly) past it being acceptable to stereotype and discriminate against women, minorities, and the disabled. Why is it OK to do so to millennials or boomers?
The solutions are fairly straightforward, albeit challenging, to implement. To start, we need to focus on the why when talking about whether groups of people differ. The reasons why any generation should be different have only been generally discussed, and the theoretical mechanism that supposedly creates generations has not been fully fleshed out.
Next, we need to quit using these nonsensical generations labels, because they don’t mean anything. The start and end years are somewhat arbitrary anyway. The original conceptualization of social generations started with a biological generational interval of about 20 years, which historians, sociologists and demographers (for one example, see Strauss and Howe, 1991) then retrofitted with various significant historical events that defined the period.
The problem with this is twofold. First, such events do not occur in nice, neat 20-year intervals. Second, not everyone agrees on what the key events were for each generation, so the start and end dates also move around depending on what people think they were. One review found that start and end dates for boomers, Xers, and millennials varied by as many as nine years, and often four to five, depending on the study and the researcher. As with the statistical problem, how can distinct generations be a thing if simply defining when they start and when they end varies so much from study to study?
In the end, the core scientific problem is that the pop press, consultants, and even some academics who are committed to generations don’t focus on the whys. They have a vested interest in selling the whats (Generation Me has reportedly sold more than 115,000 copies, and Google “generations consultants” and see how many firms are dedicated to promulgating these distinctions), but without the science behind them, any prescriptions are worthless or even harmful.
David Costanza is an associate professor of organizational sciences at George Washington University and a senior consortium fellow for the U.S. Army Research Institute. He researches, teaches, and consults in the areas of generations, leadership, culture, and organizational performance.
more on the topic in this IMS blog
Survey: What Gen Z Thinks About Ed Tech in College
A report on digital natives sheds light on their learning preferences.
more on Gen Z in this blog:
Generation Z bibliography
How Genrefication Makes School Libraries More Like Bookstores
Gail Cornwall Jul 22, 2018 https://www.kqed.org/mindshift/51336/how-genrefication-makes-school-libraries-more-like-bookstores
Under the Dewey Decimal System that revolutionized and standardized book shelving starting in 1876, nonfiction essentially already gets the genrefication treatment with, for example, Music located in the 780s and Paleontology in the 560s. Yet most fiction is shelved in one big clump alphabetized by author’s last name.
Many librarians say the “search hurdle” imposed by Dewey classification (a system originally designed for adults) significantly reduces the odds of a child finding something new they’re likely to enjoy. In a genrefied library, on the other hand, a young reader standing near a favorite book need only stick out a hand to find more like it. (It’s a bit like the analog version of Amazon’s recommendation feature: “Customers who bought this item also bought.”)
The Dewey-loyal also oppose genrefication in principle for, interestingly enough, the same reason others support it: self-sufficiency. Sure, they argue, kids might be better able to find a book independently in their school library, but what happens when they go to the public one? When they get to high school?
The debate has led to compromise positions. Some leave books for older students in the Dewey arrangement while genrefying for younger ones. Other librarians rearrange middle readers and young adult books but leave picture books shelved by author since it can be unclear how to categorize a story about a duck driving a tractor.
Law is Code: Making Policy for Artificial Intelligence
Jules Polonetsky and Omer Tene January 16, 2019
Twenty years have passed since renowned Harvard Professor Larry Lessig coined the phrase “Code is Law”, suggesting that in the digital age, computer code regulates behavior much like legislative code traditionally did. These days, the computer code that powers artificial intelligence (AI) is a salient example of Lessig’s statement.
- Good AI requires sound data. One of the principles, some would say the organizing principle, of privacy and data protection frameworks is data minimization. Data protection laws require organizations to limit data collection to the extent strictly necessary and retain data only so long as it is needed for its stated goal.
- Preventing discrimination – intentional or not.
When is a distinction between groups permissible or even merited and when is it untoward? How should organizations address historically entrenched inequalities that are embedded in data? New mathematical theories such as “fairness through awareness” enable sophisticated modeling to guarantee statistical parity between groups.
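As a rough illustration of what statistical parity between groups means (the function name and the toy decision data below are hypothetical, not drawn from the fairness-through-awareness literature), it compares the rate of positive outcomes a model grants to each group:

```python
def statistical_parity_gap(decisions, groups, group_a, group_b):
    """Difference in positive-decision rates between two groups.

    decisions: list of 0/1 model outcomes (1 = favorable).
    groups: group label for each decision.
    A gap of 0 means both groups receive favorable outcomes at
    the same rate, i.e., statistical parity holds.
    """
    def rate(g):
        outcomes = [d for d, grp in zip(decisions, groups) if grp == g]
        return sum(outcomes) / len(outcomes)
    return rate(group_a) - rate(group_b)

# Toy example: 1 = loan approved, 0 = denied.
decisions = [1, 0, 1, 1, 0, 1, 0, 0]
groups    = ["A", "A", "A", "A", "B", "B", "B", "B"]
print(statistical_parity_gap(decisions, groups, "A", "B"))  # 0.5
```

Here group A is approved 75 percent of the time and group B only 25 percent, a gap the fairness techniques mentioned above aim to drive toward zero while preserving model accuracy.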
- Assuring explainability – technological due process. In privacy and freedom of information frameworks alike, transparency has traditionally been a bulwark against unfairness and discrimination. As Justice Brandeis once wrote, “Sunlight is the best of disinfectants.”
- Deep learning means that iterative computer programs derive conclusions for reasons that may not be evident even after forensic inquiry.
Yet even with code as law and a rising need for law in code, policymakers do not need to become mathematicians, engineers and coders. Instead, institutions must develop and enhance their technical toolbox by hiring experts and consulting with top academics, industry researchers and civil society voices. Responsible AI requires access to not only lawyers, ethicists and philosophers but also to technical leaders and subject matter experts to ensure an appropriate balance between economic and scientific benefits to society on the one hand and individual rights and freedoms on the other hand.
more on AI in this IMS blog