Archive of ‘learning analytics’ category

Edtech Trends 2022

7 Edtech Trends to Watch in 2022: a Startup Guide for Entrepreneurs

https://www.edsurge.com/news/2022-04-18-7-edtech-trends-to-watch-in-2022-a-startup-guide-for-entrepreneurs

1. Data is abundant and the key to today’s edtech solutions

2. Artificial intelligence (AI) and machine learning (ML) are powering the latest generation of edtechs

3. Game-based learning is transforming how students learn

4. Edtechs are at the forefront of digital transformation in the classroom

5. Workforce upskilling is being supplemented by edtech solutions

6. Edtechs are being called upon to help with student wellbeing

7. Augmented reality (AR) and virtual reality are top of mind

Sex differences in adolescents’ occupational aspirations

In each country and region, more boys than girls aspired to a things-oriented or STEM occupation and more girls than boys to a people-oriented occupation. These sex differences were larger in countries with a higher level of women’s empowerment.

Sex differences in adolescents’ occupational aspirations: Variations across time and place

https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0261438

LAK20

LAK20 – “Celebrating 10 years of LAK: Shaping the future of the field”

23-27 March 2020, Frankfurt, Germany, https://lak20.solaresearch.org

We have the pleasure of inviting you to the 10th International Conference on Learning Analytics & Knowledge (LAK20), which will be held in Frankfurt, Germany, on 23-27 March 2020. This year, LAK20 will feature 80 research and 12 practitioner presentations, over 60 poster presentations, and best-paper presentations from the EDM and ACL EDU conferences.

We also have a great lineup of world-renowned keynote speakers:

Professor Shane Dawson, University of South Australia, Australia
Professor Milena Tsvetkova, London School of Economics and Political Science, The United Kingdom
Professor Allyson Hadwin, The University of Victoria, Canada

As it is the tenth anniversary of the LAK conference, LAK20 celebrates the past successes of the learning analytics community and poses new questions and challenges for the field. This year’s theme, “Shaping the future of the field,” focuses on how we can advance learning analytics and drive its development over the next ten years and beyond.

The LAK conference is intended for both researchers and practitioners, and we invite both groups to join a proactive dialogue around the future of learning analytics and its practical adoption. We further extend our invitation to educators, leaders, administrators, and government and industry professionals interested in the field of learning analytics and related disciplines.

For the details of the conference schedule, see https://lak20.solaresearch.org/schedule-overview

Register at https://lak20.solaresearch.org/registration

About the Conference

====================

The International Conference on Learning Analytics & Knowledge is the premier research forum in the field of learning analytics and educational technology, providing common ground for all stakeholders in the design of analytics systems to debate the state of the art at the intersection of Learning and Analytics – including researchers, educators, instructional designers, data scientists, software developers, institutional leaders and governmental policymakers. The conference is organised by the Society for Learning Analytics Research (SoLAR) and held in cooperation with ACM in association with ACM SIGCHI and SIGWEB, with the double-blind, peer-reviewed proceedings archived in the ACM Digital Library.

++++++++++++
more on learning analytics in this IMS blog
https://blog.stcloudstate.edu/ims?s=learning+analytics

Microcredentials and Digital Badges in Higher Ed

Microcredentials and Digital Badges in Higher Education

November 27 – 29, 2018  Savannah, GA

https://www.academicimpressions.com/microcredentials-and-digital-badges-in-higher-education

Badging programs are rapidly gaining momentum in higher education – join us to learn how to get your badging efforts off the ground.

Key Considerations: Assessment of Competencies

During this session, you will learn how to ask the right questions and evaluate if badges are a good fit within your unique institutional context, including determining ROI on badging efforts. You’ll also learn how to assess the competencies behind digital badges.


 

Key Technology Considerations

This session will allow for greater understanding of Open Badges standards, the variety of technology software and platforms, and the portability of badges. We will also explore emerging trends in the digital badging space and discuss campus considerations.
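
To make the portability point concrete: the Open Badges specification represents each awarded badge as a small, self-describing data object that any compliant platform can verify and display. Below is a minimal, hypothetical sketch of such an assertion written as a Python dictionary; the issuer, badge, and recipient values are placeholders rather than a real institutional badge, and a production badge would follow the full specification of whichever Open Badges version your platform supports.

```python
# Minimal, hypothetical Open Badges-style assertion (all values are placeholders).
# Field names loosely mirror the Open Badges 2.0 vocabulary; consult the current
# specification before implementing anything for production.
import json

badge_assertion = {
    "@context": "https://w3id.org/openbadges/v2",
    "type": "Assertion",
    "id": "https://badges.example.edu/assertions/1001",          # hypothetical URL
    "recipient": {"type": "email", "hashed": False,
                  "identity": "student@example.edu"},
    "badge": "https://badges.example.edu/badges/data-literacy",  # BadgeClass URL
    "issuedOn": "2018-11-29T00:00:00Z",
    "verification": {"type": "hosted"},
}

# Because the assertion is plain JSON, it can travel with the learner to any
# backpack or e-portfolio service that understands the standard.
print(json.dumps(badge_assertion, indent=2))
```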

Key Financial Considerations

During this hour, we will take a closer look at answering key financial questions surrounding badges:

  • What does the business model look like behind existing institutional badging initiatives?
  • Are these money-makers for an institution? Is there revenue potential?
  • Where does funding for these efforts come from?

Partnering with Industry

Badging can be a catalyst for partnerships between higher education and industry. In this session, you will have the opportunity to learn more about strategies for collaborating with industry in the development of badges and how badges align with employer expectations.

Branding and Marketing Badges

Now that we have a better idea of the “why” and “what” of badges, how do we market their value to external and internal stakeholders? You’ll see examples of how other institutions are designing and marketing their badges.

Consultation Time

Alongside your peers and our expert instructors, you will have the opportunity to brainstorm ideas, get feedback, ask questions, and get answers.

Next Steps and the Road Ahead: Where Badging in Higher Ed is Going

Most institutions are getting into the badging game, and we’ll talk about the far-reaching considerations in the world of badging. We’ll use this time to think ahead and discuss where badging in higher education is going and what trends might emerge.

+++++++++++++
more on microcredentialing in this IMS blog
https://blog.stcloudstate.edu/ims?s=microcredentialing

Large-scale visualization

The future of collaboration: Large-scale visualization

 http://usblogs.pwc.com/emerging-technology/the-future-of-collaboration-large-scale-visualization/

More data doesn’t automatically lead to better decisions. A shortage of skilled data scientists has hindered progress in translating information into actionable business insights. In addition, traditionally dense spreadsheets and linear slideshows are ineffective for presenting discoveries when dealing with Big Data’s dynamic nature. We need to evolve how we capture, analyze and communicate data.

Large-scale visualization platforms have several advantages over traditional presentation methods. They blur the line between the presenter and audience to increase the level of interactivity and collaboration. They also offer simultaneous views of both macro and micro perspectives, multi-user collaboration and real-time data interaction, and a limitless number of visualization possibilities – critical capabilities for rapidly understanding today’s large data sets.

Visualization walls enable presenters to target people’s preferred learning methods, thus creating a more effective communication tool. The human brain has an amazing ability to quickly glean insights from patterns – and great visualizations make for more efficient storytellers.

Grant: Visualizing Digital Scholarship in Libraries and Learning Spaces
Award amount: $40,000
Funder: Andrew W. Mellon Foundation
Lead institution: North Carolina State University Libraries
Due date: 13 August 2017
Notification date: 15 September 2017
Website: https://immersivescholar.org
Contact: immersivescholar@ncsu.edu

Project Description

NC State University, funded by the Andrew W. Mellon Foundation, invites proposals from institutions interested in participating in a new project for Visualizing Digital Scholarship in Libraries and Learning Spaces. The grant aims to 1) build a community of practice of scholars and librarians who work in large-scale multimedia to help visually immersive scholarly work enter the research lifecycle; and 2) overcome technical and resource barriers that limit the number of scholars and libraries who may produce digital scholarship for visualization environments and the impact of generated knowledge. Libraries and museums have made significant strides in pioneering the use of large-scale visualization technologies for research and learning. However, the utilization, scale, and impact of visualization environments and the scholarship created within them have not reached their fullest potential. A logical next step in the provision of technology-rich, visual academic spaces is to develop best practices and collaborative frameworks that can benefit individual institutions by building economies of scale among collaborators.

The project contains four major elements:

  1. An initial meeting and priority setting workshop that brings together librarians, scholars, and technologists working in large-scale, library and museum-based visualization environments.
  2. Scholars-in-residence at NC State over a multi-year period who pursue open source creative projects, working in collaboration with our librarians and faculty, with the potential to address the articulated limitations.
  3. Funding for modest, competitive block grants to other institutions working on similar challenges for creating, disseminating, validating, and preserving digital scholarship created in and for large-scale visual environments.
  4. A culminating symposium that brings together representatives from the scholars-in-residence and block grant recipient institutions to share and assess results, organize ways of preserving and disseminating digital products produced, and build on the methods, templates, and tools developed for future projects.

Work Summary
This call solicits proposals for block grants from library or museum systems that have visualization installations. Block grant recipients can utilize funds for ideas ranging from creating open source scholarly content for visualization environments to developing tools and templates to enhance sharing of visualization work. An advisory panel will select four institutions to receive awards of up to $40,000. Block grant recipients will also participate in the initial priority setting workshop and the culminating symposium. Participating in a block grant proposal does not disqualify an individual from later applying for one of the grant-supported scholar-in-residence appointments.
Applicants will provide a statement of work that describes the contributions that their organization will make toward the goals of the grant. Applicants will also provide a budget and budget justification.
Activities that can be funded through block grants include, but are not limited to:

  • Commissioning work by a visualization expert
  • Hosting a visiting scholar, artist, or technologist residency
  • Software development or adaptation
  • Development of templates and methodologies for sharing and scaling content utilizing open source software
  • Student or staff labor for content or software development or adaptation
  • Curricula and reusable learning objects for digital scholarship and visualization courses
  • Travel (if necessary) to the initial project meeting and culminating workshop
  • User research on universal design for visualization spaces

Funding for operational expenditures, such as equipment, is not allowed for any grant participant.

Application
Send an application to immersivescholar@ncsu.edu by the end of the day on 13 August 2017 that includes the following:

  • Statement of work (no more than 1000 words) of the project idea your organization plans to develop, its relationship to the overall goals of the grant, and the challenges to be addressed.
  • List of the names and contact information for each of the participants in the funded project, including a brief description of their current role, background, expertise, interests, and what they can contribute.
  • Project timeline.
  • Budget table with projected expenditures.
  • Budget narrative detailing the proposed expenditures.

Selection and Notification Process
An advisory panel made up of scholars, librarians, and technologists with experience and expertise in large-scale visualization and/or visual scholarship will review and rank proposals. The project leaders are especially keen to receive proposals that develop best practices and collaborative frameworks that can benefit individual institutions by building a community of practice and economies of scale among collaborators.

Awardees will be selected based on:

  • the ability of their proposal to successfully address one or both of the identified problems;
  • the creativity of the proposed activities;
  • relevant demonstrated experience partnering with scholars or students on visualization projects;
  • whether the proposal is extensible;
  • feasibility of the work within the proposed time-frame and budget;
  • whether the project work improves or expands access to large-scale visual environments for users; and
  • the participant’s ability to expand content development and sharing among the network of institutions with large-scale visual environments.

Awardees will be required to send a representative to an initial meeting of the project cohort in Fall 2017.

Awardees will be notified by 15 September 2017.

If you have any questions, please contact immersivescholar@ncsu.edu.

–Mike Nutt Director of Visualization Services Digital Library Initiatives, NCSU Libraries
919.513.0651 http://www.lib.ncsu.edu/do/visualization

 

next gen digital learning environment

Updating the Next Generation Digital Learning Environment for Better Student Learning Outcomes

A learning management system (LMS) is never the solution to every problem in education. Edtech is just one part of the whole learning ecosystem and student experience.

Therefore, the next generation digital learning environment (NGDLE), as envisioned by EDUCAUSE in 2015 …  Looking at the NGDLE requirements from an LMS perspective, I view the NGDLE as being about five areas: interoperability; personalization; analytics, advising, and learning assessment; collaboration; accessibility and universal design.

Interoperability

  • Content can easily be exchanged between systems.
  • Users are able to leverage the tools they love, including discipline-specific apps.
  • Learning data is available to trusted systems and people who need it.
  • The learning environment is “future proof” so that it can adapt and extend as the ecosystem evolves.

Personalization

  • The learning environment reflects individual preferences.
  • Departments, divisions, and institutions can be autonomous.
  • Instructors teach the way they want and are not constrained by the software design.
  • There are clear, individual learning paths.
  • Students have choice in activity, expression, and engagement.

Analytics, Advising, and Learning Assessment

  • Learning analytics helps to identify at-risk students, course progress, and adaptive learning pathways (see the sketch after this list).
  • The learning environment enables integrated planning and assessment of student performance.
  • More data is made available, with greater context around the data.
  • The learning environment supports platform and data standards.
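
As a deliberately simplified illustration of the first bullet above, the sketch below flags potentially at-risk students from hypothetical LMS activity data using pandas. The column names, thresholds, and numbers are all assumptions made for illustration; a real early-alert system would rest on validated models and careful attention to bias.

```python
# Minimal sketch: flag potentially at-risk students from hypothetical LMS data.
# Column names, thresholds, and values are illustrative assumptions only.
import pandas as pd

activity = pd.DataFrame({
    "student": ["A", "B", "C", "D"],
    "logins_last_14_days": [12, 1, 7, 0],
    "assignments_submitted": [5, 2, 5, 1],
    "assignments_due": [5, 5, 5, 5],
    "current_grade_pct": [88, 61, 74, 55],
})

activity["submission_rate"] = (
    activity["assignments_submitted"] / activity["assignments_due"]
)

# Naive rule-of-thumb flag; not a substitute for a validated predictive model.
activity["at_risk"] = (
    (activity["logins_last_14_days"] < 3)
    | (activity["submission_rate"] < 0.6)
    | (activity["current_grade_pct"] < 65)
)

print(activity[["student", "at_risk"]])
```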

Collaboration

  • Individual spaces persist after courses and after graduation.
  • Learners are encouraged as creators and consumers.
  • Courses include public and private spaces.

Accessibility and Universal Design

  • Accessibility is part of the design of the learning experience.
  • The learning environment enables adaptive learning and supports different types of materials.
  • Learning design includes measurement rubrics and quality control.

The core analogy used in the NGDLE paper is that each component of the learning environment is a Lego brick:

  • The days of the LMS as a “walled garden” app that does everything are over.
  • Today many kinds of amazing learning and collaboration tools (Lego bricks) should be accessible to educators.
  • We have standards that let these tools (including an LMS) talk to each other. That is, all bricks share some properties that let them fit together.
  • Students and teachers sign in once to this “ecosystem of bricks.”
  • The bricks share results and data (a sketch of one such data-exchange standard follows this list).
  • These bricks fit together; they can be interchanged and swapped at will, with confidence that the learning experience will continue uninterrupted.
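
One widely used family of standards for letting the bricks share results and data is the Experience API (xAPI), which records learning events as actor-verb-object statements. The sketch below assembles one such statement as a plain Python dictionary; the learner, activity, and score are hypothetical, and sending statements to a real Learning Record Store would additionally require that store’s endpoint and credentials.

```python
# Minimal, hypothetical xAPI-style statement (actor-verb-object).
# All identifiers and values are placeholders; a real statement would be
# POSTed to a Learning Record Store with proper authentication.
import json
from datetime import datetime, timezone

statement = {
    "actor": {
        "objectType": "Agent",
        "name": "Example Student",
        "mbox": "mailto:student@example.edu",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "id": "https://lms.example.edu/activities/quiz-1",  # hypothetical activity
        "definition": {"name": {"en-US": "Module 1 Quiz"}},
    },
    "result": {"score": {"scaled": 0.85}, "completion": True},
    "timestamp": datetime.now(timezone.utc).isoformat(),
}

print(json.dumps(statement, indent=2))
```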

Any “next-gen” attempt to completely rework the pedagogical model and introduce a “mash-up of whatever” to fulfil this model would fall victim to the same criticisms levied at the LMS today: there is too little time and training to expect faculty to figure out the nuances of implementation on their own.

The Lego metaphor works only if we’re talking about “old school” Lego design — bricks of two, three, and four-post pieces that neatly fit together. Modern edtech is a lot more like the modern Lego. There are wheels and rocket launchers and belts and all kinds of amazing pieces that work well with each other, but only when they are configured properly. A user cannot simply stick together different pieces and assume they will work harmoniously in creating an environment through which each student can be successful.

As the NGDLE paper states: “Despite the high percentages of LMS adoption, relatively few instructors use its more advanced features — just 41% of faculty surveyed report using the LMS ‘to promote interaction outside the classroom.'”

But this is what the next generation LMS is good at: being a central nervous system — or learning hub — through which a variety of learning activities and tools are used. This is also where the LMS needs to go: bringing together and making sense of all the amazing innovations happening around it. This is much harder to do, perhaps even impossible, if all the pieces involved are just bricks without anything to orchestrate them or to weave them together into a meaningful, personal experience for achieving well-defined learning outcomes.

  • Making a commitment to build easy, flexible, and smart technology
  • Working with colleges and universities to remove barriers to adopting new tools in the ecosystem
  • Standardizing the vetting of accessibility compliance (the Strategic Nonvisual Access Partner Program from the National Federation of the Blind is a great start)
  • Advancing standards for data exchange while protecting individual privacy
  • Building integrated components that work with the institutions using them — learning quickly about what is and is not working well and applying those lessons to the next generation of interoperability standards
  • Letting people use the tools they love [SIC] and providing more ways for nontechnical individuals (including students) to easily integrate new features into learning activities

My note: something just refused to be accepted at SCSU
Technologists are often very focused on the technology, but the reality is that the more deeply and closely we understand the pedagogy and the people in the institutions — students, faculty, instructional support staff, administrators — the better suited we are to actually making the tech work for them.

++++++++++++++++++++++

Under the Hood of a Next Generation Digital Learning Environment in Progress

The challenge is that although 85 percent of faculty use a campus learning management system (LMS) [1], a recent Blackboard report found that, out of 70,000 courses across 927 North American institutions, 53 percent of LMS usage was classified as supplemental (content-heavy, low interaction) and 24 percent as complementary (one-way communication via content/announcements/gradebook) [2]. Only 11 percent were characterized as social, 10 percent as evaluative (heavy use of assessment), and 2 percent as holistic (balanced use of all previous). Our FYE course required innovating beyond the supplemental course-level LMS to create a more holistic cohort-wide NGDLE in order to fully support the teaching, learning, and student success missions of the program. The key design goals for our NGDLE were to:

  • Create a common platform that could deliver a standard curriculum and achieve parity in all course sections using existing systems and tools and readily available content
  • Capture, store, and analyze any generated learner data to support learning assessment, continuous program improvement, and research
  • Develop reports and actionable analytics for administrators, advisors, instructors, and students (a minimal sketch follows this list)
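
To make the reporting goal a bit more tangible, here is a minimal sketch that rolls hypothetical per-student activity up into a section-level summary an advisor or program administrator might scan. The data, column names, and metrics are assumptions for illustration only, not a description of the system discussed above.

```python
# Minimal sketch: aggregate hypothetical learner data into a section-level report.
# All data and column names are illustrative assumptions.
import pandas as pd

events = pd.DataFrame({
    "section": ["FYE-01", "FYE-01", "FYE-02", "FYE-02", "FYE-02"],
    "student": ["A", "B", "C", "D", "E"],
    "logins": [14, 3, 9, 11, 2],
    "quiz_avg": [0.91, 0.58, 0.77, 0.83, 0.49],
})

report = events.groupby("section").agg(
    students=("student", "nunique"),
    median_logins=("logins", "median"),
    mean_quiz_score=("quiz_avg", "mean"),
    low_quiz_count=("quiz_avg", lambda s: int((s < 0.6).sum())),
)

print(report)
```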

++++++++++++
more on LMS in this blog
https://blog.stcloudstate.edu/ims?s=LMS

more on learning outcomes in this IMS blog
https://blog.stcloudstate.edu/ims?s=learning+outcomes

bibliometrics altmetrics

International Benchmarks for Academic Library Use of Bibliometrics & Altmetrics, 2016-17

ID: 3807768 Report August 2016 115 pages Primary Research Group

http://www.researchandmarkets.com/publication/min3qqb/3807768

The report gives detailed data on the use of various bibliometric and altmetric tools such as Google Scholar, Web of Science, Scimago, and Plum Analytics.

The data comes from 20 predominantly research universities in the USA, continental Europe, the UK, Canada and Australia/New Zealand. Among the survey participants are Carnegie Mellon, Cambridge University, Universitat Politècnica de Catalunya, the University at Albany, the University of Melbourne, Florida State University, the University of Alberta and Victoria University of Wellington.

– 50% of the institutions sampled help their researchers to obtain a Thomson Reuters ResearcherID.

ResearcherID provides a solution to the author ambiguity problem within the scholarly research community. Each member is assigned a unique identifier to enable researchers to manage their publication lists, track their times-cited counts and h-index, identify potential collaborators and avoid author misidentification. In addition, your ResearcherID information integrates with the Web of Science and is ORCID compliant, allowing you to claim and showcase your publications from a single account. Search the registry to find collaborators, review publication lists and explore how research is used around the world!
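
Since the passage above mentions times-cited counts and the h-index, a quick worked example may help: an author’s h-index is the largest number h such that h of their papers each have at least h citations. The sketch below computes it for a made-up list of citation counts.

```python
# Worked example: compute an h-index from a made-up list of citation counts.
# Definition: the largest h such that h papers each have at least h citations.
def h_index(citations):
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

papers = [25, 17, 12, 8, 8, 5, 3, 1, 0]  # hypothetical citation counts
print(h_index(papers))                   # -> 5 (five papers with at least 5 citations)
```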

– Just 5% of those surveyed use Facebook Insights in their altmetrics efforts.

 

 

++++++++++++++
more on altmetrics in this IMS blog
https://blog.stcloudstate.edu/ims?s=altmetrics

instagram best social media

Shipley, K. (2016, December 19). Why Instagram is the Best Social Media App of 2016 and Possibly 2017. Retrieved December 20, 2016, from https://www.linkedin.com/pulse/why-instagram-best-social-media-app-2016-possibly-2017-shipley

Instagram positioned itself as the third most popular social media app and the best social media app of 2016.

Twitter saw a decrease in users over the past year and even the death of its beloved six-second video-clip-sharing app, Vine.

In an article entitled “Why Vine Died,” Casey Newton reported the following: “Former executives say that a major competitive challenge emerged in the form of Instagram, which introduced 15-second video clips in June 2013.”

Instagram remained stable with the introduction of new features like stories and video channels, the resources of its parent company, Facebook, and the introduction of ads to the platform that look very similar to the posts in a user’s feed.

In addition to a total logo redesign, Instagram shifted its focus from just pictures to longer video (from 15 seconds to one minute) and direct messaging features, such as group posts and disappearing video. Explore Channels in Discover let people discover new photo and video content based on their interests. Instagram Stories added a new element to the Instagram experience, showing highlights from friends, celebrities and businesses one follows without interfering with their feed. Instagram also caters to business needs through its Instagram for Business platform, which allows for instant contact, detailed analytics and easy-to-follow linked content.

Most recently, Instagram released live video in their stories feature. Users can start a live stream in their Instagram story and view comments and feedback from their viewers in real time! This feature is similar to apps like musical.ly and live.ly, which have over 80 million users, 62% of whom are under 21.

#StudentVoices #MillennialMondays #WhatToWatch

#MillennialMondays is a new series that aims to discuss relevant topics on careers and business from a millennial perspective.

+++++++++++++++++++++++++
more on instagram in this IMS blog
https://blog.stcloudstate.edu/ims?s=instagram

text and data mining

38 great resources for learning data mining concepts and techniques

http://www.rubedo.com.br/2016/08/38-great-resources-for-learning-data.html

Learn data mining languages: R, Python and SQL

W3Schools – Fantastic set of interactive tutorials for learning different languages. Their SQL tutorial is second to none. You’ll learn how to manipulate data in MySQL, SQL Server, Access, Oracle, Sybase, DB2 and other database systems.
Treasure Data – The best way to learn is to work towards a goal. That’s what this helpful blog series is all about. You’ll learn SQL from scratch by following along with a simple, but common, data analysis scenario.
10 Queries – This course is recommended for the intermediate SQL-er who wants to brush up on his/her skills. It’s a series of 10 challenges coupled with forums and external videos to help you improve your SQL knowledge and understanding of the underlying principles.
TryR – Created by Code School, this interactive online tutorial system is designed to step you through R for statistics and data modeling. As you work through their seven modules, you’ll earn badges that mark your progress and help you stay on track.
Leada – If you’re a complete R novice, try Leada’s introduction to R. In their 1 hour 30 min course, they’ll cover installation, basic usage, common functions, data structures, and data types. They’ll even set you up with your own development environment in RStudio.
Advanced R – Once you’ve mastered the basics of R, bookmark this page. It’s a fantastically comprehensive style guide to using R. We should all strive to write beautiful code, and this resource (based on Google’s R style guide) is your key to that ideal.
Swirl – Learn R in R – a radical idea certainly. But that’s exactly what Swirl does. They’ll interactively teach you how to program in R and do some basic data science at your own pace. Right in the R console.
Python for beginners – The Python website actually has a pretty comprehensive and easy-to-follow set of tutorials. You can learn everything from installation to complex analyses. It also gives you access to the Python community, who will be happy to answer your questions. (A minimal Python example follows this list.)
PythonSpot – A complete list of Python tutorials to take you from zero to Python hero. There are tutorials for beginners, intermediate and advanced learners.
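
If you want a feel for what a first exercise in one of these languages might look like, here is a minimal Python sketch that clusters a tiny, made-up dataset with scikit-learn’s k-means. The data and the choice of two clusters are assumptions purely for illustration; the tutorials listed above cover the same ground in far more depth.

```python
# Minimal sketch: k-means clustering on a tiny, made-up dataset.
# Requires scikit-learn (pip install scikit-learn).
from sklearn.cluster import KMeans

# Hypothetical (x, y) points forming two loose groups.
points = [
    [1.0, 1.1], [0.9, 1.3], [1.2, 0.8],
    [8.0, 8.2], [7.8, 7.9], [8.3, 8.1],
]

model = KMeans(n_clusters=2, n_init=10, random_state=0).fit(points)
print(model.labels_)           # cluster assignment for each point
print(model.cluster_centers_)  # coordinates of the two cluster centers
```
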
Read all about it: data mining books
Data Jujitsu: The Art of Turning Data into Product – This free book by DJ Patil gives you a brief introduction to the complexity of data problems and how to approach them. He gives nice, understandable examples that cover the most important thought processes of data mining. It’s a great book for beginners but still interesting to the data mining expert. Plus, it’s free!
Data Mining: Concepts and Techniques – The third (and most recent) edition will give you an understanding of the theory and practice of discovering patterns in large data sets. Each chapter is a stand-alone guide to a particular topic, making it a good resource if you’re not into reading in sequence or you want to know about a particular topic.
Mining of Massive Datasets – Based on the Stanford Computer Science course, this book is often cited by data scientists as one of the most helpful resources around. It’s designed at the undergraduate level with no formal prerequisites. It’s the next best thing to actually going to Stanford!
Big Data, Data Mining, and Machine Learning: Value Creation for Business Leaders and Practitioners – This book is a must read for anyone who needs to do applied data mining in a business setting (ie practically everyone). It’s a complete resource for anyone looking to cut through the Big Data hype and understand the real value of data mining. Pay particular attention to the section on how modeling can be applied to business decision making.
Data Smart: Using Data Science to Transform Information into Insight – The talented (and funny) John Foreman from MailChimp teaches you the “dark arts” of data science. He makes modern statistical methods and algorithms accessible and easy to implement.
Hadoop: The Definitive Guide – As a data scientist, you will undoubtedly be asked about Hadoop. So you’d better know how it works. This comprehensive guide will teach you how to build and maintain reliable, scalable, distributed systems with Apache Hadoop. Make sure you get the most recent edition to keep up with this fast-changing technology.
 Online learning: data mining webinars and courses
DataCamp – Learn data mining from the comfort of your home with DataCamp’s online courses. They have free courses on R, Statistics, Data Manipulation, Dynamic Reporting, Large Data Sets and much more.
Coursera – Coursera brings you all the best university courses straight to your computer. Their online classes will teach you the fundamentals of interpreting data, performing analyses and communicating insights. They have topics for beginners and advanced learners in Data Analysis, Machine Learning, Probability and Statistics and more.
Udemy – With a range of free and paid data mining courses, you’re sure to find something you like on Udemy no matter your level. There are 395 courses in the area of data mining! All their courses are uploaded by other Udemy users, meaning quality can fluctuate, so make sure you read the reviews.
CodeSchool – These courses are handily organized into “Paths” based on the technology you want to learn. You can do everything from build a foundation in Git to take control of a data layer in SQL. Their engaging online videos will take you step-by-step through each lesson and their challenges will let you practice what you’ve learned in a controlled environment.
Udacity – Master a new skill or programming language with Udacity’s unique series of online courses and projects. Each class is developed by a Silicon Valley tech giant, so you know that what you’re learning will be directly applicable to the real world.
Treehouse – Learn from experts in web design, coding, business and more. The video tutorials from Treehouse will teach you the basics and their quizzes and coding challenges will ensure the information sticks. And their UI is pretty easy on the eyes.
Learn from the best: top data miners to follow
John Foreman – Chief Data Scientist at MailChimp and author of Data Smart, John is worth a follow for his witty yet poignant tweets on data science.
DJ Patil – Author and Chief Data Scientist at The White House OSTP, DJ tweets everything you’ve ever wanted to know about data in politics.
Nate Silver – He’s Editor-in-Chief of FiveThirtyEight, a blog that uses data to analyze news stories in Politics, Sports, and Current Events.
Andrew Ng – As Chief Scientist at Baidu, Andrew is responsible for some of the most groundbreaking developments in Machine Learning and Data Science.
Bernard Marr – He might know pretty much everything there is to know about Big Data.
Gregory Piatetsky – He’s the author of popular data science blog KDNuggets, the leading newsletter on data mining and knowledge discovery.
Christian Rudder – As the co-founder of OkCupid, Christian has access to one of the most unique datasets on the planet, and he uses it to give fascinating insight into human nature, love, and relationships.
Dean Abbott – He’s contributed to a number of data blogs and authored his own book on Applied Predictive Analytics. At the moment, Dean is Chief Data Scientist at SmarterHQ.
Practice what you’ve learned: data mining competitions
Kaggle – This is the ultimate data mining competition. The world’s biggest corporations offer big prizes for solving their toughest data problems.
Stack Overflow – The best way to learn is to teach. Stack Overflow offers the perfect forum for you to prove your data mining know-how by answering fellow enthusiasts’ questions.
TunedIT – With a live leaderboard and interactive participation, TunedIT offers a great platform to flex your data mining muscles.
DrivenData – You can find a number of nonprofit data mining challenges on DrivenData. All of your mining efforts will go towards a good cause.
Quora – Another great site to answer questions on just about everything. There are plenty of curious data lovers on there asking for help with data mining and data science.
Meet your fellow data miner: social networks, groups and meetups
Reddit – Reddit is a forum for finding the latest articles on data mining and connecting with fellow data scientists. We recommend subscribing to r/datamining, r/dataisbeautiful, r/datascience, r/machinelearning and r/bigdata.
Facebook – As with many social media platforms, Facebook is a great place to meet and interact with people who have similar interests. There are a number of very active data mining groups you can join.
LinkedIn – If you’re looking for data mining experts in a particular field, look no further than LinkedIn. There are hundreds of data mining groups ranging from the generic to the hyper-specific. In short, there’s sure to be something for everyone.
Meetup – Want to meet your fellow data miners in person? Attend a meetup! Just search for data mining in your city and you’re sure to find an awesome group near you.
——————————

8 fantastic examples of data storytelling

https://www.import.io/post/8-fantastic-examples-of-data-storytelling/

Data storytelling is the realization of great data visualization. We’re seeing data that’s been analyzed well and presented in a way that someone who’s never even heard of data science can get it.

Google’s Cole Nussbaumer provides a friendly reminder of what data storytelling actually is: it’s straightforward, strategic, elegant, and simple.
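
As a small illustration of that “straightforward, strategic, elegant, and simple” idea, the sketch below plots a made-up monthly series and puts the takeaway in the chart title instead of a generic label, so the reader is handed the story rather than left to hunt for it. The data and the wording of the insight are assumptions for demonstration only.

```python
# Minimal sketch: let the chart title state the story, not just label the data.
# The data is made up for illustration. Requires matplotlib (pip install matplotlib).
import matplotlib.pyplot as plt

months = ["Jan", "Feb", "Mar", "Apr", "May", "Jun"]
signups = [120, 135, 150, 310, 340, 365]

fig, ax = plt.subplots()
ax.plot(months, signups, marker="o", color="gray")
ax.plot(months[3:], signups[3:], marker="o", color="crimson")  # highlight the change

# A storytelling title states the insight instead of "Signups by month".
ax.set_title("Signups more than doubled after the April redesign")
ax.set_ylabel("New signups")
plt.show()
```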

 

++++++++++++++++++++++

more on text and data mining in this IMS blog
https://blog.stcloudstate.edu/ims?s=data+mining
