Wang, Q., Quek, C., & Hu, H. (2017). Designing and improving a blended synchronous learning environment: An educational design research. International Review of Research in Open and Distributed Learning, 18(3), 99-118.
Definition: blended synchronous learning has attracted much attention and is often labelled as synchronous hybrid learning (Cain & Henriksen, 2013), synchronous blended learning (Okita, 2013), multi-access learning (Irvine, Code, & Richards, 2013), or simultaneous delivery of courses to on-campus and off-campus students (White et al., 2010). Adapted from the definition given by Bower, Dalgarno, Kennedy, Lee, and Kenney (2015), blended synchronous learning in this paper is defined as a learning method that enables online students to participate in classroom learning activities simultaneously via computer-mediated communication technologies such as video conferencing. In this approach, on-campus students attend F2F lessons in the physical classroom, while online students situated at multiple sites participate in the identical classroom learning activities via two-way video conferencing in real time.
With regard to educational benefits, blended synchronous learning can help to establish rich teaching presence, social presence, and cognitive presence (Garrison, Anderson, & Archer, 2000; Szeto, 2015). A BSLE simulates a classroom environment (White et al., 2010) in which teachers' direct instruction and facilitation can be carried out easily, so teaching presence is naturally established.
Report: Americans Value College Degrees But Say Higher Ed Falls Short on Delivering Promises
By Sri Ravipati 05/12/17
A new survey, “Varying Degrees: New America’s Annual Survey on Higher Education,” polled more than 1,600 individuals in the United States ages 18 and older.
Readers can explore the data using an interactive tool. The data reveal key differences across age, gender, generation, region, socioeconomic status, race and political ideology:
- Most respondents want to see changes made in higher ed, with just 25 percent answering the system is “just fine the way it is” and helps students succeed;
- Students want additional help crossing the finish line, with 57 percent of respondents answering that higher ed institutions should help students succeed;
- Just one in three respondents answered that the federal government is having a positive impact on higher ed;
- Two-year colleges and four-year public universities are seen as especially worth the cost compared to other institution types, with 83 percent and 79 percent of respondents respectively saying these institution types “contribute to a strong American workforce.”
“Although people believe in higher education generally, they are not satisfied with specific institutions and policies that we have in place right now — they are broadly dissatisfied,”
more on education and employment in this IMS blog
8 Tips for Lecture Capture on a Shoestring
By Dian Schaffhauser 05/17/17
Whether you’re flipping your courses, creating videos to help your students understand specific concepts or recording lectures for exam review, these tips can help you optimize your production setup on a tight budget.
1) Speak Into the Microphone
2) Reconsider Whether You Want to be a Talking Head
3) Keep Your Recording Device Steady
4) Avoid Using the Camera Built Into Your Laptop
5) Consider an “Online Video Platform”: options include TechSmith Relay, Panopto, Tegrity and Kaltura
6) Forget About Editing Your Videos
7) Remember Accessibility
Record your video and upload it to YouTube. YouTube will apply its machine transcription to the audio as a starting point. Then you can download the captions into your caption editor and improve on the captions from there. Afterward, you can delete the video from YouTube and add it to your institution’s platform.
lecture capture in this IMS blog
6 VR Trends to Watch in Education
By Sri Ravipati 05/16/17
The number of VR devices in use is expected to increase 85 percent by 2020, with gaming and educational applications driving most of that growth.
Maya Georgieva is an ed tech strategist, author and speaker with more than 15 years of experience in higher education and global education, and co-founder of Digital Bodies, a consulting group that provides news and analysis of VR, AR and wearables in education. Together with Emory Craig, currently the director of e-learning at the College of New Rochelle, she identified six areas with promising developments for educators.
1) More Affordable Headsets
With the Oculus Rift or HTC Vive, “which I really like,” you’re talking close to $2,000 per setup, one of the experts noted at the 2017 SXSWedu conference.
Microsoft has been collaborating with its partners, such as HP, Acer, Dell and Lenovo, to develop VR headsets that will work with lower-end desktops. Later this year, the companies will debut headsets for $299, “which is much more affordable compared to HoloLens.”
Many Kickstarter crowdfunding efforts are bound to make high-end headsets more accessible for teaching.
One example is the NOLO project. The NOLO system is meant for mobile VR headsets and gives users the “6 degrees of freedom” (6DoF) motion tracking that is currently found only in high-end headsets.
2) Hand Controllers That Will Bring Increased Interactivity
Google offers a hand controller for Daydream, and Samsung has also implemented its own hand controller for Gear VR.
Microsoft unveiled new motion controllers at Microsoft Build.
zSpace, with its stylus and AR glasses, continues to develop its immersive applications.
3) Easy-to-Use Content Creation Platforms
Game engines like Unity and Unreal are often a starting point for creating simulations.
Labster — which creates virtual chemistry labs — will become important in specialized subjects.
ThingLink, for example, recently introduced a school-specific editor for creating 360-degree and VR content. Lifeliqe, Aurasma and Adobe are also working on more interactive tools.
5) 360-Degree Cameras
6) Social VR Spaces
AltspaceVR uses avatars and supports multiplayer sessions that allow for socialization and user interaction.
Facebook has been continuing to develop its own VR platform, Facebook Spaces, which is in beta and will be out later this year. LectureVR is a similar platform on the horizon.
more on augmented reality in this IMS blog
Don’t Call Me a Millennial — I’m an Old Millennial
Common definitions of the millennial generation include the Census Bureau’s (born 1982–2000) and Pew’s (about 1981–1997).
In 2015, for example, Juliet Lapidos — born the same year I was — may have put it best in a column for the New York Times headlined “Wait, What, I’m a Millennial?” “I don’t identify with the kids that Time magazine described as technology-addled narcissists, the Justin Bieber fans who ‘boomerang’ back home instead of growing up,” she writes.
Old Millennials, as I’ll call them, who were born around 1988 or earlier (meaning they’re 29 and older today), really have lived substantively different lives than Young Millennials, who were born around 1989 or later, as a result of two epochal events that occurred around the time when members of the older group were mostly young adults and when members of the younger were mostly early adolescents: the financial crisis and smartphones’ profound takeover of society. And according to Jean Twenge, a social psychologist at San Diego State University and the author of Generation Me: Why Today’s Young Americans Are More Confident, Assertive, Entitled—and More Miserable Than Ever Before, there’s some early, emerging evidence that, in certain ways, these two groups act like different, self-contained generations.
Millennials, we hear over and over again, are absolutely obsessed with social media, and live their entire social lives through their smartphones. I tweet too much, sure, but I’ve never blasted a ’gram (did I say that right?); even thinking about learning how to Snapchat makes me want to take a long, peaceful nap.
Millennials are “The Job-Hopping Generation,” says Gallup — and are much more likely, relative to previous generations when they were in their 20s, to live at home and to put off family formation for a long time.
Last week, however, Pew released some numbers suggesting millennials aren’t any job-hoppier than Generation X was at the same age.
Intelligence: a history
Intelligence has always been used as a fig-leaf to justify domination and destruction. No wonder we fear super-smart robots.
To say that someone is or is not intelligent has never been merely a comment on their mental faculties. It is always also a judgment on what they are permitted to do. Intelligence, in other words, is political.
The problem has taken an interesting 21st-century twist with the rise of Artificial Intelligence (AI).
The term ‘intelligence’ itself has never been popular with English-language philosophers. Nor does it have a direct translation into German or ancient Greek, two of the other great languages in the Western philosophical tradition. But that doesn’t mean philosophers weren’t interested in it. Indeed, they were obsessed with it, or more precisely a part of it: reason or rationality. The term ‘intelligence’ managed to eclipse its more old-fashioned relative in popular and political discourse only with the rise of the relatively new-fangled discipline of psychology, which claimed intelligence for itself.
Plato concluded, in The Republic, that the ideal ruler is ‘the philosopher king’, as only a philosopher can work out the proper order of things. This idea was revolutionary at the time. Athens had already experimented with democracy, the rule of the people – but to count as one of those ‘people’ you just had to be a male citizen, not necessarily intelligent. Elsewhere, the governing classes were made up of inherited elites (aristocracy), or by those who believed they had received divine instruction (theocracy), or simply by the strongest (tyranny).
Plato’s novel idea fell on the eager ears of the intellectuals, including those of his pupil Aristotle. Aristotle was always the more practical, taxonomic kind of thinker. He took the notion of the primacy of reason and used it to establish what he believed was a natural social hierarchy.
So at the dawn of Western philosophy, we have intelligence identified with the European, educated, male human. It becomes an argument for his right to dominate women, the lower classes, uncivilised peoples and non-human animals. While Plato argued for the supremacy of reason and placed it within a rather ungainly utopia, only one generation later, Aristotle presents the rule of the thinking man as obvious and natural.
The late Australian philosopher and conservationist Val Plumwood has argued that the giants of Greek philosophy set up a series of linked dualisms that continue to inform our thought. Opposing categories such as intelligent/stupid, rational/emotional and mind/body are linked, implicitly or explicitly, to others such as male/female, civilised/primitive, and human/animal. These dualisms aren’t value-neutral, but fall within a broader dualism, as Aristotle makes clear: that of dominant/subordinate or master/slave. Together, they make relationships of domination, such as patriarchy or slavery, appear to be part of the natural order of things.
Descartes rendered nature literally mindless, and so devoid of intrinsic value – which thereby legitimated the guilt-free oppression of other species.
For Kant, only reasoning creatures had moral standing. Rational beings were to be called ‘persons’ and were ‘ends in themselves’. Beings that were not rational, on the other hand, had ‘only a relative value as means, and are therefore called things’. We could do with them what we liked.
This line of thinking was extended to become a core part of the logic of colonialism. The argument ran like this: non-white peoples were less intelligent; they were therefore unqualified to rule over themselves and their lands. It was therefore perfectly legitimate – even a duty, ‘the white man’s burden’ – to destroy their cultures and take their territory.
The same logic was applied to women, who were considered too flighty and sentimental to enjoy the privileges afforded to the ‘rational man’.
Galton believed that intellectual ability was hereditary and could be enhanced through selective breeding. He decided to find a way to scientifically identify the most able members of society and encourage them to breed – prolifically, and with each other. The less intellectually capable should be discouraged from reproducing, or indeed prevented, for the sake of the species. Thus eugenics and the intelligence test were born together.
From David Hume to Friedrich Nietzsche, and Sigmund Freud through to postmodernism, there are plenty of philosophical traditions that challenge the notion that we’re as intelligent as we’d like to believe, and that intelligence is the highest virtue.
From 2001: A Space Odyssey to the Terminator films, writers have fantasised about machines rising up against us. Now we can see why. If we’re used to believing that the top spots in society should go to the brainiest, then of course we should expect to be made redundant by bigger-brained robots and sent to the bottom of the heap.
Natural stupidity, rather than artificial intelligence, remains the greatest risk.
more on intelligence in this IMS blog