
qualitative method research

Cohort 7


Qualitative Method Research

quote

Data treatment and analysis

Because the questionnaire data comprised both Likert scales and open questions, they were analyzed quantitatively and qualitatively. Textual data (open responses) were qualitatively analyzed by coding: each segment (e.g. a group of words) was assigned to a semantic reference category, as systematically and rigorously as possible. For example, “Using an iPad in class really motivates me to learn” was assigned to the category “positive impact on motivation.” The qualitative analysis was performed using an adapted version of the approaches developed by L’Écuyer (1990) and Huberman and Miles (1991, 1994). Thus, we adopted a content analysis approach using QDAMiner software, which is widely used in qualitative research (see Fielding, 2012; Karsenti, Komis, Depover, & Collin, 2011). For the quantitative analysis, we used SPSS 22.0 software to conduct descriptive and inferential statistics. We also conducted inferential statistics to further explore the iPad’s role in teaching and learning, along with its motivational effect. The results will be presented in a subsequent report (Fievez, & Karsenti, 2013)

Fievez, A., & Karsenti, T. (2013). The iPad in Education: uses, benefits and challenges. A survey of 6057 students and 302 teachers in Quebec, Canada (p. 51). Canada Research Chair in Technologies in Education. Retrieved from https://www.academia.edu/5366978/The_iPad_in_Education_uses_benefits_and_challenges._A_survey_of_6057_students_and_302_teachers_in_Quebec_Canada

unquote
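To make the coding step described in the quote more concrete, here is a minimal, purely illustrative Python sketch of how textual segments can be assigned to semantic reference categories. The category names and keyword rules below are hypothetical, and the original study coded its data manually with QDAMiner rather than with a script; the sketch only shows the mechanics of segment-to-category assignment that a CAQDAS tool helps automate.

```python
# Toy sketch (not the authors' procedure): assign open-ended survey responses
# to semantic reference categories with simple keyword rules.
# Categories and keywords are hypothetical examples.
CATEGORIES = {
    "positive impact on motivation": ["motivates", "motivating", "engaged"],
    "distraction": ["distract", "off-task", "games instead"],
    "technical problems": ["crashes", "wifi", "battery", "freezes"],
}

def code_segment(segment):
    """Return every category whose keywords appear in the text segment."""
    text = segment.lower()
    hits = [cat for cat, keywords in CATEGORIES.items()
            if any(kw in text for kw in keywords)]
    return hits or ["uncoded"]  # unmatched segments still need human review

responses = [
    "Using an iPad in class really motivates me to learn",
    "The app crashes when the wifi is slow",
]
for r in responses:
    print(code_segment(r), "<-", r)
```

In practice, rule-based coding of this kind only supplements the researcher's judgment; ambiguous segments still have to be read and categorized by a person.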

The 20th-century notion of conducting qualitative research through an oral interview and then processing the results manually triggered, in the second half of the 20th century, sometimes condescending attitudes from researchers in the exact sciences.
The reason was the advent of computing power in the second half of the 20th century, which allowed the exact sciences to claim “scientific” and “data-based” results.
One statistical package, SPSS, is today widely known and considered a magnificent tool for building solid, statistically based argumentation, which further perpetuates the perceived superiority of quantitative over qualitative methods.
At the same time, qualitative researchers continue to lag behind, mostly due to the inertia of their approach to qualitative analysis. Qualitative analysis continues to be performed in the olden ways. While there is nothing wrong with the “olden” ways, harnessing computational power can streamline that process and even surface options which the “human eye” sometimes misses.
Below are some suggestions you may consider when you embark on the path of qualitative research.
The Use of Qualitative Content Analysis in Case Study Research
Florian Kohlbacher
http://www.qualitative-research.net/index.php/fqs/article/view/75/153

an excellent guide to the structure of qualitative research

Palys, T., & Atchison, C. (2012). Qualitative Research in the Digital Era: Obstacles and Opportunities. International Journal Of Qualitative Methods, 11(4), 352-367.
http://login.libproxy.stcloudstate.edu/login?qurl=http%3a%2f%2fsearch.ebscohost.com%2flogin.aspx%3fdirect%3dtrue%26db%3dkeh%26AN%3d89171709%26site%3dehost-live%26scope%3dsite
Palys and Atchison (2012) present a compelling case for bringing your qualitative research to the level of quantitative research by using modern tools for qualitative analysis.
1. The authors correctly promote NVivo as the “Jaguar” of the qualitative research method tools. Be aware, however, of the existence of other “Geo Metro” tools, which, for your research, might achieve the same result (see the bottom of this blog entry).
2. The authors promote a new approach to Chapter 2 of the doctoral dissertation, namely OCR-ing PDF articles (as of 2017, most of your literature is in PDF or another electronic text format) through applications such as
Abbyy Fine Reader, https://www.abbyy.com/en-us/finereader/
OmniPage,  http://www.nuance.com/for-individuals/by-product/omnipage/index.htm
Readiris, http://www.irislink.com/EN-US/c1462/Readiris-16-for-Windows—OCR-Software.aspx
The text from the articles is then processed through NVivo or related programs (see the bottom of this blog entry). As the authors propose: “This is immediately useful for literature review and proposal writing, and continues through the research design, data gathering, and analysis stages— where NVivo’s flexibility for many different sources of data (including audio, video, graphic, and text) are well known—of writing for publication” (p. 353).
In other words, you can try to wrap your head around a huge amount of textual information on your own, but you can also approach the task in parallel by processing the same text with a tool.
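If you want to experiment with this parallel, tool-driven processing before committing to NVivo or a commercial OCR suite, here is a minimal sketch, assuming a born-digital (text-based) PDF and the pdfminer.six Python library; the file name is a placeholder. Scanned articles have no text layer and would still need OCR through tools such as those listed above first.

```python
# Minimal sketch: pull the text layer out of a born-digital PDF so it can be
# loaded into NVivo, QDA Miner, RQDA, or a text-mining script.
# Assumes: pip install pdfminer.six ; "article.pdf" is a placeholder file name.
from pdfminer.high_level import extract_text

text = extract_text("article.pdf")          # returns the embedded text layer
paragraphs = [p.strip() for p in text.split("\n\n") if p.strip()]

print(f"{len(paragraphs)} paragraphs extracted")
print(paragraphs[0][:200])                  # peek at the beginning
# Scanned (image-only) PDFs have no text layer; run them through an OCR tool
# (ABBYY FineReader, OmniPage, Readiris) before this step.
```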
 +++++++++++++++++++++++++++++
Here are some suggestions for Computer Assisted/Aided Qualitative Data Analysis Software (CAQDAS), one with a small user community and one with a large one:

– RQDA (the small one): http://rqda.r-forge.r-project.org/ (see Metin Caliskan’s tutorials on YouTube); one active developer.
– GATE (the large one): http://gate.ac.uk/ | https://gate.ac.uk/download/

text mining: https://en.wikipedia.org/wiki/Text_mining
Text mining, also referred to as text data mining, roughly equivalent to text analytics, is the process of deriving high-quality information from text. High-quality information is typically derived through the devising of patterns and trends through means such as statistical pattern learning. Text mining usually involves the process of structuring the input text (usually parsing, along with the addition of some derived linguistic features and the removal of others, and subsequent insertion into a database), deriving patterns within the structured data, and finally evaluation and interpretation of the output.
https://ischool.syr.edu/infospace/2013/04/23/what-is-text-mining/
Qualitative data is descriptive data that cannot be measured in numbers and often includes qualities of appearance like color, texture, and textual description. Quantitative data is numerical, structured data that can be measured. However, there is often slippage between qualitative and quantitative categories. For example, a photograph might traditionally be considered “qualitative data,” but when you break it down to the level of pixels, it can be measured.
A word of caution: text mining doesn’t generate new facts and is not an end in and of itself. The process is most useful when the data it generates can be further analyzed by a domain expert, who can bring additional knowledge for a more complete picture. Still, text mining creates new relationships and hypotheses for experts to explore further.
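As a small illustration of the “structure the text, then derive patterns” sequence described above, here is a hedged sketch using scikit-learn’s TfidfVectorizer to surface the most characteristic terms in a handful of documents. The example documents are invented, and tf-idf weighting is only one of many pattern-finding techniques text mining draws on.

```python
# Sketch of a tiny text-mining pass: structure the input (tokenize, weight terms
# with tf-idf), then derive a simple pattern (top terms per document).
# Requires: pip install scikit-learn numpy ; the documents are invented examples.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer

docs = [
    "The iPad motivates students but can distract them during class.",
    "Library data visualization helps librarians assess the collection.",
    "Big data analytics in higher education raises privacy questions.",
]

vectorizer = TfidfVectorizer(stop_words="english")
tfidf = vectorizer.fit_transform(docs)            # rows = documents, columns = terms
terms = vectorizer.get_feature_names_out()

for i, doc in enumerate(docs):
    row = tfidf[i].toarray().ravel()
    top = [terms[j] for j in np.argsort(row)[::-1][:3]]
    print(f"doc {i}: top terms -> {top}")
```

As the caution above notes, output like this is only a starting point: a domain expert still has to interpret why those terms matter.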

quick and easy:

intermediate:

advanced:

http://tidytextmining.com/

Introduction to GATE Developer  https://youtu.be/o5uhMF15vsA


 

use of RapidMiner:

https://rapidminer.com/pricing/

– Coding Analysis Toolkit (CAT) from University of Pittsburgh and University of Massachusetts
– Raven’s Eye: an online natural language analysis tool
– ATLAS.ti

– QDA Miner: http://provalisresearch.com/products/qualitative-data-analysis-software/

There is also a free version called QDA Miner Lite with limited functionalities: http://provalisresearch.com/products/qualitative-data-analysis-software/freeware/

– MAXQDA

–  NVivo

– SPSS Text Analytics

– Kwalitan

– Transana (includes video transcription capability)

– XSight

– NUD*IST

Cited from: https://www.researchgate.net/post/Are_there_any_open-source_alternatives_to_Nvivo [accessed Apr 1, 2017].

– OdinText
– IBM Watson Conversation
– IBM Watson Text to Speech
– Google Translate API
– MeTA
– LingPipe
– NLP4J
– Timbl
– Colibri Core
– CRF++
– Frog
– Ucto
– CRFsuite
– FoLiA
– PyNLPl
– openNLP
– NLP Compromise
– MALLET

Cited from: https://www.g2crowd.com/products/nvivo/competitors/alternatives [accessed April 1, 2017].
+++++++++++++++++++++++++
more on quantitative research:

Asamoah, D. A., Sharda, R., Hassan Zadeh, A., & Kalgotra, P. (2017). Preparing a Data Scientist: A Pedagogic Experience in Designing a Big Data Analytics Course. Decision Sciences Journal of Innovative Education, 15(2), 161–190. https://doi.org/10.1111/dsji.12125
++++++++++++++++++++++++
literature on quantitative research:
Borgman, C. L. (2015). Big Data, Little Data, No Data: Scholarship in the Networked World. MIT Press. https://mplus.mnpals.net/vufind/Record/ebr4_1006438
St. Cloud State University MC Main Collection – 2nd floor AZ195 .B66 2015
p. 161 Data scholarship in the Humanities
p. 166 When Are Data?
Philip Chen, C. L., & Zhang, C.-Y. (2014). Data-intensive applications, challenges, techniques and technologies: A survey on Big Data. Information Sciences, 275(Supplement C), 314–347. https://doi.org/10.1016/j.ins.2014.01.015

Cohort 8 research and write dissertation

When writing your dissertation…

Please find an FAQ-style list of the Google Group postings regarding resources and information on researching and writing Chapter 2

digital resource sets available through MnPALS Plus

http://blog.stcloudstate.edu/ims/2017/10/21/digital-resource-sets-available-through-mnpals-plus/ 

+++++++++++++++++++++++++

[how to] write chapter 2

You were reminded to look at dissertations of your peers from previous cohorts and use their dissertations as a “template”: http://repository.stcloudstate.edu/do/discipline_browser/articles?discipline_key=1230

You also were reminded to use the documents in Google Drive: e.g. https://drive.google.com/open?id=0B7IvS0UYhpxFVTNyRUFtNl93blE

Please also find materials which might help you organize your thoughts and expedite your Chapter 2 writing….

Do you agree with (did you use) the following observations:

The purpose of the review of the literature is to prove that no one has studied the gap in the knowledge outlined in Chapter 1. The subjects in the Review of Literature should have been introduced in the Background of the Problem in Chapter 1. Chapter 2 is not a textbook of subject matter loosely related to the subject of the study.  Every research study that is mentioned should in some way bear upon the gap in the knowledge, and each study that is mentioned should end with the comment that the study did not collect data about the specific gap in the knowledge of the study as outlined in Chapter 1.

The review should be laid out in major sections introduced by organizational generalizations. An organizational generalization can be a subheading so long as the last sentence of the previous section introduces the reader to what the next section will contain.  The purpose of this chapter is to cite major conclusions, findings, and methodological issues related to the gap in the knowledge from Chapter 1. It is written for knowledgeable peers from easily retrievable sources of the most recent issue possible.

Empirical literature published within the previous 5 years or less is reviewed to prove no mention of the specific gap in the knowledge that is the subject of the dissertation is in the body of knowledge. Common sense should prevail. Often, to provide a history of the research, it is necessary to cite studies older than 5 years. The object is to acquaint the reader with existing studies relative to the gap in the knowledge and describe who has done the work, when and where the research was completed, and what approaches were used for the methodology, instrumentation, statistical analyses, or all of these subjects.

If very little literature exists, the wise student will write, in effect, a several-paragraph book report by citing the purpose of the study, the methodology, the findings, and the conclusions.  If there is an abundance of studies, cite only the most recent studies.  Firmly establish the need for the study.  Defend the methods and procedures by pointing out other relevant studies that implemented similar methodologies. It should be frequently pointed out to the reader why a particular study did not match the exact purpose of the dissertation.

The Review of Literature ends with a Conclusion that clearly states that, based on the review of the literature, the gap in the knowledge that is the subject of the study has not been studied.  Remember that a “summary” is different from a “conclusion.”  A Summary, the final main section, introduces the next chapter.

from http://dissertationwriting.com/wp/writing-literature-review/

Here is the template from a different school (other than SCSU):

http://semo.edu/education/images/EduLead_DissertGuide_2007.pdf 

+++++++++++++++++

When conducting qualitative research, how many people should be interviewed? Is there a minimum or a maximum?

Here is my take on it:

Simple question, not so simple answer.

It depends.

Generally, the number of respondents depends on the type of qualitative inquiry: case study methodology, phenomenological study, ethnographic study, or ethnomethodology. However, a rule of thumb is for scholars to achieve saturation point–that is the point in which no fresh information is uncovered in response to an issue that is of interest to the researcher.

If your qualitative method is designed to meet rigor and trustworthiness, thick, rich data is important. To achieve these principles you would need at least 12 interviews, ensuring your participants are the holders of knowledge in the area you intend to investigate. In grounded theory you could start with 12 and interview more if your data is not rich enough.

In IPA (interpretative phenomenological analysis), the norm tends to be 6 interviews.

You may check the sample size in peer-reviewed qualitative publications in your field to find out about popular practice. It all depends on the research problem, the choice of a specific qualitative approach, and the theoretical framework, so the answer to your question will vary from a few to a few dozen.

How many interviews are needed in a qualitative research?

There are different views in the literature, and there is no agreement on an exact number. Here I review some of the most frequently cited references. Based on Creswell (2014), it is estimated that 16 participants will provide rich and detailed data. A couple of researchers agree that 10–15 in-depth interviews are sufficient (Guest, Bunce & Johnson, 2006; Baker & Edwards, 2012).

Your methodological choices need to reflect your ontological position and understanding of knowledge production, and that’s also where you can argue a strong case for smaller qualitative studies, as you say. This is not only a problem for certain subjects; I think it’s a problem in certain departments or journals across the board of social science research, as it’s a question of academic culture.

Here is more serious literature and research (in case you need citations for Chapter 3):

Sample Size and Saturation in PhD Studies Using Qualitative Interviews

http://www.qualitative-research.net/index.php/fqs/article/view/1428/3027

https://researcholic.wordpress.com/2015/03/20/sample_size_interviews/

Gaskell, George (2000). Individual and Group Interviewing. In Martin W. Bauer & George Gaskell (Eds.), Qualitative Researching With Text, Image and Sound. A Practical Handbook (pp. 38-56). London: SAGE Publications.

Lieberson, Stanley 1991: “Small N’s and Big Conclusions.” Social Forces 70:307-20. (http://www.jstor.org/pss/2580241)

Savolainen, Jukka 1994: “The Rationality of Drawing Big Conclusions Based on Small Samples.” Social Forces 72:1217-24. (http://www.jstor.org/pss/2580299).

Small, M.(2009) ‘How many cases do I need ? On science and the logic of case selection in field-based research’ Ethnography 10(1) 5-38

Williams,M. (2000) ‘Interpretivism and generalisation ‘ Sociology 34(2) 209-224

http://james-ramsden.com/semi-structured-interviews-how-many-interviews-is-enough/

+++++++++++++++++

how to start your writing process

If you are a Pinterest user, you are welcome to just subscribe to the board:

https://www.pinterest.com/aidedza/doctoral-cohort/

otherwise, I am mirroring the information also in the IMS blog:

http://blog.stcloudstate.edu/ims/2017/08/13/analytical-essay/ 

+++++++++++++++++++++++++++

APA citing of “unusual” resources

http://blog.stcloudstate.edu/ims/2017/08/06/apa-citation/

+++++++++++++++++++++++

statistical modeling: your guide to Chapter 3

While working on your dissertation, namely Chapter 3, you are probably consulting the materials in this shared folder:

https://drive.google.com/drive/folders/0B7IvS0UYhpxFVTNyRUFtNl93blE?usp=sharing

In it, there is a subfolder, called “stats related materials”
https://drive.google.com/open?id=0B7IvS0UYhpxFcVg3aWxCX0RVams

where you have several documents from the Graduate school and myself to start building your understanding and vocabulary regarding your quantitative, qualitative or mixed method research.

It has been agreed that before you go to the Statistical Center (Randy Kolb), it is wise to be prepared and understand the terminology as well as the basics of the research methods.

Please find below an additional list of materials available through the SCSU library and the Internet. They can help you further in building a robust foundation for your research:

http://blog.stcloudstate.edu/ims/2017/07/10/intro-to-stat-modeling/

In this blog entry, I shared with you:

  1. Books on intro to stat modeling available at the library. I understand what a pain borrowing books from the SCSU library can be, but you can use the titles and the authors and see if you can borrow them from your local public library.
  2. I also sought out and shared with you “visual” explanations of the basic terms and concepts. Once you start looking at those, you should be able to research further (e.g., on YouTube) and find suitable sources for your learning style.

I (and the future cohorts) will deeply appreciate it if you remember to share those “suitable sources for your learning style,” either in this Google Group thread and/or in the comments section of the blog entry: http://blog.stcloudstate.edu/ims/2017/07/10/intro-to-stat-modeling. Your Facebook group page is also a good place to discuss among ourselves best practices for learning and using research methods for your Chapter 3.

++++++++++++++++
search for sources

Google just posted on their Facebook profile a nifty short video on Google Search
http://blog.stcloudstate.edu/ims/2017/06/26/google-search/

Watching the video, you may remember the same #BooleanSearch techniques from our BI (bibliography instruction) session of last semester.

Considering the preponderance of information in 2017: your Chapter 2 is NOT ONLY about finding information regarding your topic.
Your Chapter 2 is about proving your extensive research of the existing literature.

The techniques presented in the short video will arm you with methods to dig deeper and look further.

If you would like to do a decent job exploring all corners of the vast area called Internet, please consider other search engines similar to Google Scholar:

Semantic Scholar; Microsoft Academic Search; Academicindex.net; Proquest Dialog; Quetzal; arXiv;

https://www.google.com/; https://scholar.google.com/ (3 min); http://academic.research.microsoft.com/; http://www.dialog.com/; http://www.quetzal-search.info; http://www.arXiv.org; http://www.journalogy.com/
More about such search engines in the following blog entries:

http://blog.stcloudstate.edu/ims/2017/01/19/digital-literacy-for-glst-495/

and

http://blog.stcloudstate.edu/ims/2017/05/01/history-becker/

Let me know if more info is needed and/or you need help embarking on the “deep” search.

+++++++++++++++++

tips for writing and proofreading

Please find several infographics to help you with your writing habits (organization) and proofreading, posted in the IMS blog:

https://blog.stcloudstate.edu/ims/2017/06/11/writing-first-draft/
https://blog.stcloudstate.edu/ims/2017/06/11/prewriting-strategies/ 

https://blog.stcloudstate.edu/ims/2017/06/11/essay-checklist/

++++++++++++++

letter – request copyright permission

Here are several samples on mastering such letter:

https://registrar.stanford.edu/students/dissertation-and-thesis-submission/preparing-engineer-theses-paper-submission/sample-3

http://www.iup.edu/graduatestudies/resources-for-current-students/research/thesis-dissertation-information/before-starting-your-research/copyright-permission-instructions-and-sample-letter/

https://brocku.ca/webfm_send/25032

 

+++++++++++++++++

 

 

 

Research and Ethics: If Facebook can tweak our emotions and make us vote, what else can it Do?

If Facebook can tweak our emotions and make us vote, what else can it do?

http://www.businessinsider.com/facebook-calls-experiment-innovative-2014-7#ixzz36PtsxVfL

Google’s chief executive has expressed concern that we don’t trust big companies with our data – but may be dismayed at Facebook’s latest venture into manipulation

Please consider the information on Power, Privacy, and the Internet and details on ethics and big data in this IMS blog entry: http://blog.stcloudstate.edu/ims/2014/07/01/privacy-and-surveillance-obama-advisor-john-podesta-every-country-has-a-history-of-going-over-the-line/

important information:
Please consider the SCSU Research Ethics and the IRB (Institutional Review Board) document:
http://www.stcloudstate.edu/graduatestudies/current/culmProject/documents/ResearchEthicsandQualitative–IRBPresentationforGradStudentsv2.2011.pdf
For more information, please contact the SCSU Institutional Review Board : http://www.stcloudstate.edu/irb/default.asp

The Facebook Conundrum: Where Ethics and Science Collide

http://blogs.kqed.org/mindshift/2014/07/the-facebook-conundrum-where-ethics-and-science-collide

The field of learning analytics isn’t just about advancing the understanding of learning. It’s also being applied in efforts to try to influence and predict student behavior.

Learning analytics has yet to demonstrate its big beneficial breakthrough, its “penicillin,” in the words of Reich. Nor has there been a big ethical failure to creep lots of people out.

“There’s a difference,” Pistilli says, “between what we can do and what we should do.”

IRDL proposal

Applications for the 2018 Institute will be accepted between December 1, 2017 and January 27, 2018. Scholars accepted to the program will be notified in early March 2018.

Title:

Learning to Harness Big Data in an Academic Library

Abstract (200 words)

Research on Big Data per se, as well as on the importance and organization of the process of Big Data collection and analysis, is well underway. The complexity of the process comprising “Big Data,” however, deprives organizations of a ubiquitous “blueprint.” The planning, structuring, administration, and execution of the process of adopting Big Data in an organization, be it a corporate or an educational one, remain elusive. No less elusive is the adoption of Big Data practices among libraries themselves. Seeking the commonalities and differences in the adoption of Big Data practices among libraries may be a suitable start to help libraries transition to the adoption of Big Data and restructure organizational and daily activities based on Big Data decisions.
Introduction to the problem. Limitations

The redefinition of humanities scholarship has received major attention in higher education. The advent of digital humanities challenges aspects of academic librarianship. Data literacy is a critical need for digital humanities in academia. The March 2016 Library Juice Academy Webinar led by John Russel exemplifies the efforts to help librarians become versed in obtaining programming skills, and respectively, handling data. Those are first steps on a rather long path of building a robust infrastructure to collect, analyze, and interpret data intelligently, so it can be utilized to restructure daily and strategic activities. Since the phenomenon of Big Data is young, there is a lack of blueprints on the organization of such infrastructure. A collection and sharing of best practices is an efficient approach to establishing a feasible plan for setting a library infrastructure for collection, analysis, and implementation of Big Data.
Limitations. This research can only organize the results from the responses of librarians and research into how libraries present themselves to the world in this arena. It may be able to make some rudimentary recommendations. However, based on each library’s specific goals and tasks, further research and work will be needed.

 

 

Research Literature

“Big data is like teenage sex: everyone talks about it, nobody really knows how to do it, everyone thinks everyone else is doing it, so everyone claims they are doing it…”
– Dan Ariely, 2013  https://www.asist.org/publications/bulletin/aprilmay-2017/big-datas-impact-on-privacy-for-librarians-and-information-professionals/

Big Data is becoming an omnipresent term. It is widespread among different disciplines in academia (De Mauro, Greco, & Grimaldi, 2016). This leads to “inconsistency in meanings and necessity for formal definitions” (De Mauro et al., 2016, p. 122). Similarly to De Mauro et al. (2016), Hashem, Yaqoob, Anuar, Mokhtar, Gani and Ullah Khan (2015) seek standardization of definitions. The main connected “themes” of this phenomenon must be identified and the connections to Library Science must be sought. A prerequisite for a comprehensive definition is the identification of Big Data methods. Bughin, Chui, and Manyika (2010), Chen et al. (2012) and De Mauro et al. (2015) single out the methods needed to complete the process of building a comprehensive definition.

In conjunction with identifying the methods, volume, velocity, and variety, as defined by Laney (2001), are the three properties of Big Data accepted across the literature. Daniel (2015) defines three stages of Big Data: collection, analysis, and visualization. According to Daniel (2015), Big Data in higher education “connotes the interpretation of a wide range of administrative and operational data” (p. 910), and according to Hilbert (2013), as cited in Daniel (2015), Big Data “delivers a cost-effective prospect to improve decision making” (p. 911).

The importance of understanding the process of Big Data analytics is well understood in academic libraries. An example of such “administrative and operational” use for cost-effective improvement of decision making are the Finch & Flenner (2016) and Eaton (2017) case studies of the use of data visualization to assess an academic library collection and restructure the acquisition process. Sugimoto, Ding & Thelwall (2012) call for the discussion of Big Data for libraries. According to the 2017 NMC Horizon Report “Big Data has become a major focus of academic and research libraries due to the rapid evolution of data mining technologies and the proliferation of data sources like mobile devices and social media” (Adams, Becker, et al., 2017, p. 38).

Power (2014) elaborates on the complexity of Big Data in regard to decision-making and offers ideas for organizations on building a system to deal with Big Data. As explained by Boyd and Crawford (2012) and cited in De Mauro et al (2016), there is a danger of a new digital divide among organizations with different access and ability to process data. Moreover, Big Data impacts current organizational entities in their ability to reconsider their structure and organization. The complexity of institutions’ performance under the impact of Big Data is further complicated by the change of human behavior, because, arguably, Big Data affects human behavior itself (Schroeder, 2014).

De Mauro et al. (2015) touch on the impact of Big Data on libraries. The reorganization of academic libraries in light of Big Data, and the handling of Big Data by libraries, is in close conjunction with the reorganization of the entire campus and the handling of Big Data by the educational institution. In addition to the disruption posed by the Big Data phenomenon, higher education is facing global changes of an economic, technological, social, and educational character. Daniel (2015) uses a chart to illustrate the complexity of these global trends. Parallel to the Big Data developments in America and Asia, the European Union is offering access to an EU open data portal (https://data.europa.eu/euodp/home ). Moreover, the Association of European Research Libraries expects under the H2020 program to increase “the digitization of cultural heritage, digital preservation, research data sharing, open access policies and the interoperability of research infrastructures” (Reilly, 2013).

The challenges posed by Big Data to human and social behavior (Schroeder, 2014) are no less significant than the impact of Big Data on learning. Cohen, Dolan, Dunlap, Hellerstein, & Welton (2009) propose a road map for “more conservative organizations” (p. 1492) to overcome their reservations and/or inability to handle Big Data and adopt a practical approach to the complexity of Big Data. Two Chinese researchers describe deep learning as the “set of machine learning techniques that learn multiple levels of representation in deep architectures” (Chen & Lin, 2014, p. 515). Deep learning requires “new ways of thinking and transformative solutions” (Chen & Lin, 2014, p. 523). Another pair of researchers from China present a broad overview of the various societal, business and administrative applications of Big Data, including a detailed account and definitions of the processes and tools accompanying Big Data analytics (Philip Chen & Zhang, 2014). The American counterparts of these Chinese researchers are of the same opinion when it comes to “think about the core principles and concepts that underline the techniques, and also the systematic thinking” (Provost and Fawcett, 2013, p. 58). De Mauro, Greco, and Grimaldi (2016), similarly to Provost and Fawcett (2013), draw attention to the urgent necessity to train new types of specialists to work with such data. As early as 2012, Davenport and Patil (2012), as cited in De Mauro et al. (2016), envisioned hybrid specialists able to manage both technological knowledge and academic research. Similarly, Provost and Fawcett (2013) mention the efforts of “academic institutions scrambling to put together programs to train data scientists” (p. 51). Further, Asamoah, Sharda, Zadeh & Kalgotra (2017) share a specific plan for the design and delivery of a big data analytics course. At the same time, librarians working with data acknowledge the shortcomings in the profession, since librarians “are practitioners first and generally do not view usability as a primary job responsibility, usually lack the depth of research skills needed to carry out a fully valid” data-based research (Emanuel, 2013, p. 207).

Borgman (2015) devotes an entire book to data and scholarly research and goes beyond the already well-established facts regarding the importance of Big Data, the implications of Big Data, and the technical, societal, and educational impact and complications posed by Big Data. Borgman elucidates the importance of knowledge infrastructure and the necessity to understand the importance and complexity of building such infrastructure, in order to be able to take advantage of Big Data. In a similar fashion, a team of Chinese scholars draws attention to the complexity of data mining and Big Data and the necessity to approach the issue in an organized fashion (Wu, Zhu, Wu, & Ding, 2014).

Bruns (2013) shifts the conversation from the “macro” architecture of Big Data, as focused by Borgman (2015) and Wu et al (2014) and ponders over the influx and unprecedented opportunities for humanities in academia with the advent of Big Data. Does the seemingly ubiquitous omnipresence of Big Data mean for humanities a “railroading” into “scientificity”? How will research and publishing change with the advent of Big Data across academic disciplines?

Reyes (2015) shares her “skinny” approach to Big Data in education. She presents a comprehensive structure for educational institutions to shift “traditional” analytics to “learner-centered” analytics (p. 75) and identifies the participants in the Big Data process in the organization. The model is applicable for library use.

Being a new and uncharted territory, Big Data and Big Data analytics can pose ethical issues. Willis (2013) focuses on Big Data application in education, namely the ethical questions for higher education administrators and the expectations of Big Data analytics to predict students’ success. Daries, Reich, Waldo, Young, and Whittinghill (2014) discuss rather similar issues regarding the balance between data and student privacy regulations. The privacy issues accompanying data are also discussed by Tene and Polonetsky (2012).

Privacy issues are habitually connected to security and surveillance issues. Andrejevic and Gates (2014) point out that in decision making “generated by data mining, the focus is not on particular individuals but on aggregate outcomes” (p. 195). Van Dijck (2014) goes into further detail regarding the perils posed by metadata and data to society, in particular to the privacy of citizens. Bail (2014) addresses the same issue regarding the impact of Big Data on societal issues, but underlines the leading role of cultural sociologists and their theories for the correct application of Big Data.

Library organizations have been traditional proponents of core democratic values such as protection of privacy and elucidation of related ethical questions (Miltenoff & Hauptman, 2005). In recent books about Big Data and libraries, ethical issues are an important part of the discussion (Weiss, 2018). Library blogs also discuss these issues (Harper & Oltmann, 2017). An academic library’s role is to educate its patrons about those values. Sugimoto et al. (2012) reflect on the need for discussion about Big Data in Library and Information Science. They clearly draw attention to the library “tradition of organizing, managing, retrieving, collecting, describing, and preserving information” (p. 1) as well as library and information science being “a historically interdisciplinary and collaborative field, absorbing the knowledge of multiple domains and bringing the tools, techniques, and theories” (p. 1). Sugimoto et al. (2012) sought a wide discussion among the library profession regarding the implications of Big Data on the profession, no differently from the activities in other fields (e.g., Wixom, Ariyachandra, Douglas, Goul, Gupta, Iyer, Kulkarni, Mooney, Phillips-Wren, Turetken, 2014). A current Andrew Mellon Foundation grant for Visualizing Digital Scholarship in Libraries seeks an opportunity to view “both macro and micro perspectives, multi-user collaboration and real-time data interaction, and a limitless number of visualization possibilities – critical capabilities for rapidly understanding today’s large data sets” (Hwangbo, 2014).

The importance of the library with its traditional roles, as described by Sugimoto et al (2012) may continue, considering the Big Data platform proposed by Wu, Wu, Khabsa, Williams, Chen, Huang, Tuarob, Choudhury, Ororbia, Mitra, & Giles (2014). Such platforms will continue to emerge and be improved, with librarians as the ultimate drivers of such platforms and as the mediators between the patrons and the data generated by such platforms.

Every library needs to find its place in the large organization and in society in regard to this very new and very powerful phenomenon called Big Data. Libraries might not have the trained staff to become a leader in the process of organizing and building the complex mechanism of this new knowledge architecture, but librarians must educate and train themselves to be worthy participants in this new establishment.

 

Method

 

The study will be cleared by the SCSU IRB.
The survey will collect responses from the library population regarding its readiness to use Big Data and its actual use of Big Data. The survey URL will be sent to (academic?) libraries around the world.

Data will be processed through SPSS. Open-ended results will be processed manually. The preliminary research design presupposes a mixed-method approach.

The study will include the use of closed-ended survey response questions and open-ended questions. The first part of the study (closed-ended, quantitative questions) will be completed through an online survey. Participants will be asked to complete the survey using a link they receive through e-mail.

Mixed methods research was defined by Johnson and Onwuegbuzie (2004) as “the class of research where the researcher mixes or combines quantitative and qualitative research techniques, methods, approaches, concepts, or language into a single study” (Johnson & Onwuegbuzie, 2004, p. 17). Quantitative and qualitative methods can be combined, if used to complement each other, because the methods can measure different aspects of the research questions (Sale, Lohfeld, & Brazil, 2002).

 

Sampling design

 

  • Online survey of 10-15 questions, with 3-5 demographic questions and the rest regarding the use of tools.
  • 1-2 open-ended questions at the end of the survey to probe for a follow-up mixed-method approach (an opportunity for a qualitative study).
  • Data analysis techniques: survey results will be exported to SPSS and analyzed accordingly; the final survey design will determine the appropriate statistical approach (see the illustrative sketch after this list).
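As a placeholder for the “analyzed accordingly” step, here is a minimal sketch of the kind of descriptive statistics planned for the closed-ended items. The real analysis is planned in SPSS; Python with pandas is used here only for illustration, and the column names and response values are invented.

```python
# Sketch: descriptive statistics and a crosstab for invented Likert-style
# survey responses. Illustrative only; the study plans to use SPSS.
# Requires: pip install pandas
import pandas as pd

survey = pd.DataFrame({
    "library_type": ["academic", "academic", "public", "special", "academic"],
    "uses_big_data": [4, 2, 1, 5, 3],      # 1 = not at all ... 5 = extensively
    "staff_trained": [3, 2, 1, 4, 2],
})

# Descriptive statistics for the Likert items
print(survey[["uses_big_data", "staff_trained"]].describe())

# Mean reported use by library type
print(survey.groupby("library_type")["uses_big_data"].mean())

# Simple crosstab of library type against reported use
print(pd.crosstab(survey["library_type"], survey["uses_big_data"]))
```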

 

Project Schedule

 

Complete literature review and identify areas of interest – two months

Prepare and test instrument (survey) – month

IRB and other details – month

Generate a list of potential libraries to distribute survey – month

Contact libraries. Follow up and contact again, if necessary (low turnaround) – month

Collect, analyze data – two months

Write out data findings – month

Complete manuscript – month

Proofreading and other details – month

 

Significance of the work 

While it has been widely acknowledged that Big Data (and its handling) is changing higher education (http://blog.stcloudstate.edu/ims?s=big+data) as well as academic libraries (http://blog.stcloudstate.edu/ims/2016/03/29/analytics-in-education/), it remains nebulous how Big Data is handled in the academic library and, respectively, how it is related to the handling of Big Data on campus. Moreover, the visualization of Big Data between units on campus remains in progress, along with any policymaking based on the analysis of such data (hence the need for comprehensive visualization).

 

This research will aim to gain an understanding of: a. how librarians are handling Big Data; b. how they are relating their Big Data output to the campus output of Big Data; and c. how librarians in particular and campus administration in general are tuning their practices based on the analysis.

Based on the survey returns (if there is a statistically significant return), this research might consider juxtaposing the practices of academic libraries with practices of special libraries (especially corporate libraries), public libraries, and school libraries.

 

 

References:

 

Adams Becker, S., Cummins, M., Davis, A., Freeman, A., Giesinger Hall, C., Ananthanarayanan, V., … Wolfson, N. (2017). NMC Horizon Report: 2017 Library Edition.

Andrejevic, M., & Gates, K. (2014). Big Data Surveillance: Introduction. Surveillance & Society, 12(2), 185–196.

Asamoah, D. A., Sharda, R., Hassan Zadeh, A., & Kalgotra, P. (2017). Preparing a Data Scientist: A Pedagogic Experience in Designing a Big Data Analytics Course. Decision Sciences Journal of Innovative Education, 15(2), 161–190. https://doi.org/10.1111/dsji.12125

Bail, C. A. (2014). The cultural environment: measuring culture with big data. Theory and Society, 43(3–4), 465–482. https://doi.org/10.1007/s11186-014-9216-5

Borgman, C. L. (2015). Big Data, Little Data, No Data: Scholarship in the Networked World. MIT Press.

Bruns, A. (2013). Faster than the speed of print: Reconciling ‘big data’ social media analysis and academic scholarship. First Monday, 18(10). Retrieved from http://firstmonday.org/ojs/index.php/fm/article/view/4879

Bughin, J., Chui, M., & Manyika, J. (2010). Clouds, big data, and smart assets: Ten tech-enabled business trends to watch. McKinsey Quarterly, 56(1), 75–86.

Chen, X. W., & Lin, X. (2014). Big Data Deep Learning: Challenges and Perspectives. IEEE Access, 2, 514–525. https://doi.org/10.1109/ACCESS.2014.2325029

Cohen, J., Dolan, B., Dunlap, M., Hellerstein, J. M., & Welton, C. (2009). MAD Skills: New Analysis Practices for Big Data. Proc. VLDB Endow., 2(2), 1481–1492. https://doi.org/10.14778/1687553.1687576

Daniel, B. (2015). Big Data and analytics in higher education: Opportunities and challenges. British Journal of Educational Technology, 46(5), 904–920. https://doi.org/10.1111/bjet.12230

Daries, J. P., Reich, J., Waldo, J., Young, E. M., Whittinghill, J., Ho, A. D., … Chuang, I. (2014). Privacy, Anonymity, and Big Data in the Social Sciences. Commun. ACM, 57(9), 56–63. https://doi.org/10.1145/2643132

De Mauro, A. D., Greco, M., & Grimaldi, M. (2016). A formal definition of Big Data based on its essential features. Library Review, 65(3), 122–135. https://doi.org/10.1108/LR-06-2015-0061

De Mauro, A., Greco, M., & Grimaldi, M. (2015). What is big data? A consensual definition and a review of key research topics. AIP Conference Proceedings, 1644(1), 97–104. https://doi.org/10.1063/1.4907823

Dumbill, E. (2012). Making Sense of Big Data. Big Data, 1(1), 1–2. https://doi.org/10.1089/big.2012.1503

Eaton, M. (2017). Seeing Library Data: A Prototype Data Visualization Application for Librarians. Publications and Research. Retrieved from http://academicworks.cuny.edu/kb_pubs/115

Emanuel, J. (2013). Usability testing in libraries: methods, limitations, and implications. OCLC Systems & Services: International Digital Library Perspectives, 29(4), 204–217. https://doi.org/10.1108/OCLC-02-2013-0009

Graham, M., & Shelton, T. (2013). Geography and the future of big data, big data and the future of geography. Dialogues in Human Geography, 3(3), 255–261. https://doi.org/10.1177/2043820613513121

Harper, L., & Oltmann, S. (2017, April 2). Big Data’s Impact on Privacy for Librarians and Information Professionals. Retrieved November 7, 2017, from https://www.asist.org/publications/bulletin/aprilmay-2017/big-datas-impact-on-privacy-for-librarians-and-information-professionals/

Hashem, I. A. T., Yaqoob, I., Anuar, N. B., Mokhtar, S., Gani, A., & Ullah Khan, S. (2015). The rise of “big data” on cloud computing: Review and open research issues. Information Systems, 47(Supplement C), 98–115. https://doi.org/10.1016/j.is.2014.07.006

Hwangbo, H. (2014, October 22). The future of collaboration: Large-scale visualization. Retrieved November 7, 2017, from http://usblogs.pwc.com/emerging-technology/the-future-of-collaboration-large-scale-visualization/

Laney, D. (2001, February 6). 3D Data Management: Controlling Data Volume, Velocity, and Variety.

Miltenoff, P., & Hauptman, R. (2005). Ethical dilemmas in libraries: an international perspective. The Electronic Library, 23(6), 664–670. https://doi.org/10.1108/02640470510635746

Philip Chen, C. L., & Zhang, C.-Y. (2014). Data-intensive applications, challenges, techniques and technologies: A survey on Big Data. Information Sciences, 275(Supplement C), 314–347. https://doi.org/10.1016/j.ins.2014.01.015

Power, D. J. (2014). Using ‘Big Data’ for analytics and decision support. Journal of Decision Systems, 23(2), 222–228. https://doi.org/10.1080/12460125.2014.888848

Provost, F., & Fawcett, T. (2013). Data Science and its Relationship to Big Data and Data-Driven Decision Making. Big Data, 1(1), 51–59. https://doi.org/10.1089/big.2013.1508

Reilly, S. (2013, December 12). What does Horizon 2020 mean for research libraries? Retrieved November 7, 2017, from http://libereurope.eu/blog/2013/12/12/what-does-horizon-2020-mean-for-research-libraries/

Reyes, J. (2015). The skinny on big data in education: Learning analytics simplified. TechTrends: Linking Research & Practice to Improve Learning, 59(2), 75–80. https://doi.org/10.1007/s11528-015-0842-1

Schroeder, R. (2014). Big Data and the brave new world of social media research. Big Data & Society, 1(2), 2053951714563194. https://doi.org/10.1177/2053951714563194

Sugimoto, C. R., Ding, Y., & Thelwall, M. (2012). Library and information science in the big data era: Funding, projects, and future [a panel proposal]. Proceedings of the American Society for Information Science and Technology, 49(1), 1–3. https://doi.org/10.1002/meet.14504901187

Tene, O., & Polonetsky, J. (2012). Big Data for All: Privacy and User Control in the Age of Analytics. Northwestern Journal of Technology and Intellectual Property, 11, [xxvii]-274.

van Dijck, J. (2014). Datafication, dataism and dataveillance: Big Data between scientific paradigm and ideology. Surveillance & Society; Newcastle upon Tyne, 12(2), 197–208.

Waller, M. A., & Fawcett, S. E. (2013). Data Science, Predictive Analytics, and Big Data: A Revolution That Will Transform Supply Chain Design and Management. Journal of Business Logistics, 34(2), 77–84. https://doi.org/10.1111/jbl.12010

Weiss, A. (2018). Big-Data-Shocks-An-Introduction-to-Big-Data-for-Librarians-and-Information-Professionals. Rowman & Littlefield Publishers. Retrieved from https://rowman.com/ISBN/9781538103227/Big-Data-Shocks-An-Introduction-to-Big-Data-for-Librarians-and-Information-Professionals

West, D. M. (2012). Big data for education: Data mining, data analytics, and web dashboards. Governance Studies at Brookings, 4, 1–0.

Willis, J. (2013). Ethics, Big Data, and Analytics: A Model for Application. Educause Review Online. Retrieved from https://docs.lib.purdue.edu/idcpubs/1

Wixom, B., Ariyachandra, T., Douglas, D. E., Goul, M., Gupta, B., Iyer, L. S., … Turetken, O. (2014). The current state of business intelligence in academia: The arrival of big data. CAIS, 34, 1.

Wu, X., Zhu, X., Wu, G. Q., & Ding, W. (2014). Data mining with big data. IEEE Transactions on Knowledge and Data Engineering, 26(1), 97–107. https://doi.org/10.1109/TKDE.2013.109

Wu, Z., Wu, J., Khabsa, M., Williams, K., Chen, H. H., Huang, W., … Giles, C. L. (2014). Towards building a scholarly big data platform: Challenges, lessons and opportunities. In IEEE/ACM Joint Conference on Digital Libraries (pp. 117–126). https://doi.org/10.1109/JCDL.2014.6970157

 

+++++++++++++++++
more on big data in this IMS blog:
http://blog.stcloudstate.edu/ims?s=big+data
case study

Feagin, J. R., Orum, A. M., & Sjoberg, G. (1991). A Case for the case study. Chapel Hill: University of North Carolina Press.

https://books.google.com/books/about/A_Case_for_the_Case_Study.html?id=7A39B6ZLyJQC

or ILL from MSU, Mankato Memorial Library – General Collection HM48 .C37 1991

p. 2: a case study is defined as an in-depth, multi-faceted investigation, using qualitative research methods, of a single social phenomenon; it makes use of several data sources.

Some case studies have made use of both qualitative and quantitative methods.

Comparative framework.

The social phenomenon can vary: it can be an organization, it can be a role, or role-occupants.

p. 3: Quantitative methods: a standardized set of questions

intro to stat modeling

Introduction to Statistical Modelling (bibliography)

These are the books available at the SCSU library with their call #s:

Graybill, F. A. (1961). An introduction to linear statistical models. New York: McGraw-Hill. HA29 .G75

Dobson, A. J. (1983). Introduction to statistical modelling. London ; New York: Chapman and Hall. QA276 .D59 1983

Janke, S. J., & Tinsley, F. (2005). Introduction to linear models and statistical inference. Hoboken, NJ: Wiley. QA279 .J36 2005

++++++++++++++++++
resources from the Internet:

visuals (quick reference to terms and issues)

consider this short video:
http://blog.stcloudstate.edu/ims/2017/07/06/misleading-graphs/

++++++++++++++
more on quantitative and qualitative research in this IMS blog
http://blog.stcloudstate.edu/ims?s=quantitative
http://blog.stcloudstate.edu/ims?s=qualitative+research

document analysis methodology

document analysis – literature on the methodology

  • Bowen, G. A. (2009). Document Analysis as a Qualitative Research Method. Qualitative Research Journal, 9(2), 27–40.
    https://www.academia.edu/8434566/Document_Analysis_as_a_Qualitative_Research_Method
    Document analysis is a systematic procedure for reviewing or evaluating documents—both printed and electronic (computer-based and Internet-transmitted) material. Like other analytical methods in qualitative research, document analysis requires that data be examined and interpreted in order to elicit meaning, gain understanding, and develop empirical knowledge (Corbin & Strauss, 2008; see also Rapley, 2007).
    Document analysis is often used in combination with other qualitative research methods as a means of triangulation—‘the combination of methodologies in the study of the same phenomenon’ (Denzin, 1970, p. 291)
    The qualitative researcher is expected to draw upon multiple (at least two) sources of evidence; that is, to seek convergence and corroboration through the use of different data sources and methods. Apart from documents, such sources include interviews, participant or non-participant observation, and physical artifacts (Yin, 1994). By triangulating data, the researcher attempts to provide ‘a confluence of evidence that breeds credibility’ (Eisner, 1991, p. 110). By examining information collected through different methods, the researcher can corroborate findings across data sets and thus reduce the impact of potential biases that can exist in a single study. According to Patton (1990), triangulation helps the researcher guard against the accusation that a study’s findings are simply an artifact of a single method, a single source, or a single investigator’s bias. Mixed-method studies (which combine quantitative and qualitative research techniques) sometimes include document analysis. Here is an example: In their large-scale, three-year evaluation of regional educational service agencies (RESAs), Rossman and Wilson (1985) combined quantitative and qualitative methods—surveys (to collect quantitative data) and open-ended, semi-structured interviews with reviews of documents (as the primary sources of qualitative data). The document reviews were designed to identify the agencies that played a role in supporting school improvement programs.
  • Glenn A. Bowen, (2009) “Document Analysis as a Qualitative Research Method”, Qualitative Research Journal, Vol. 9 Issue: 2, pp.27-40, doi: 10.3316/QRJ0902027
    http://www.emeraldinsight.com/action/showCitFormats?doi=10.3316%2FQRJ0902027
  • Document Review and Analysis
    https://www.bcps.org/offices/lis/researchcourse/develop_docreview.html

Qualitative

  • Semiotics (studies the life of signs in society; seeks to understand the underlying messages in visual texts; forms the basis for interpretive analysis)
  • Discourse Analysis (concerned with production of meaning through talk and texts; how people use language)
  • Interpretative Analysis (captures hidden meaning and ambiguity; looks at how messages are encoded or hidden; acutely aware of who the audience is)
  • Conversation Analysis (concerned with structures of talk in interaction and achievement of interaction)
  • Grounded Theory (inductive and interpretative; developing novel theoretical ideas based on the data)

Document Analysis
Document analysis is a form of qualitative research in which documents are interpreted by the researcher to give voice and meaning around an assessment topic. Analyzing documents incorporates coding content into themes similar to how focus group or interview transcripts are analyzed. A rubric can also be used to grade or score a document. There are three primary types of documents:

• Public Records: The official, ongoing records of an organization’s activities. Examples include student transcripts, mission statements, annual reports, policy manuals, student handbooks, strategic plans, and syllabi.

• Personal Documents: First-person accounts of an individual’s actions, experiences, and beliefs. Examples include calendars, e-mails, scrapbooks, blogs, Facebook posts, duty logs, incident reports, reflections/journals, and newspapers.

• Physical Evidence: Physical objects found within the study setting (often called artifacts). Examples include flyers, posters, agendas, handbooks, and training materials.
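Since the passage above notes that a rubric can be used to grade or score a document, here is a minimal, hypothetical sketch of rubric-style scoring applied to documents of the kinds just listed. The criteria, keyword checks, and sample texts are invented; real document analysis relies on the researcher's interpretive judgment rather than simple keyword matching.

```python
# Illustrative sketch only: scoring documents against a simple rubric, the way
# a reviewer might grade public records or personal documents during document
# analysis. Rubric criteria and sample texts are hypothetical.
RUBRIC = {
    "mentions mission":       lambda t: "mission" in t,
    "mentions assessment":    lambda t: "assessment" in t or "outcome" in t,
    "mentions student voice": lambda t: "student" in t,
}

documents = {
    "strategic_plan.txt": "Our mission is to improve student learning outcomes.",
    "duty_log.txt": "Shift change at 3 pm; incident report filed.",
}

for name, text in documents.items():
    lower = text.lower()
    score = {criterion: check(lower) for criterion, check in RUBRIC.items()}
    print(name, "->", score, f"({sum(score.values())}/{len(RUBRIC)})")
```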

As with all research, how you collect and analyse the data should depend on what you want to find out. Since you haven’t told us that, it is difficult to give you any precise advice. However, one really important matter in using documents as sources, whatever the overall aim of your research, is that data from documents are very different from data from speech events such as interviews or overheard conversations. So the first analytic question you need to ask with regard to documents is ‘how are these data shaped by documentary production?’ Something which differentiates nearly all data from documents from speech data is that those who compose documents know what comes at the end while still being able to alter the beginning, which gives far more opportunity for consideration of how the recipient of the utterances will view the provider; i.e., for more artful self-presentation. Apart from this, however, analysing the way documentary practice shapes your data will depend on what these documents are: for example, your question might turn out to be ‘How are news stories produced?’ if you are using news reports, or ‘What does this bureaucracy consider relevant information (and what not relevant and what unmentionable)?’ if you are using completed proformas or internal reports from some organisation.

An analysis technique is just like a hardware tool. It depends on where and with what you are working to choose the right one. For a nail you should use a hammer, and there are lots of types of hammers to choose from, depending on the type of nail.

So, in order to tell you the better technique, it is important to know the objectives you intend to reach and the theoretical framework you are using. Perhaps, after that, we could tell you if you should use content analysis, discourse analysis, or grounded theory (and which type of it, as, like the hammer, there are several types of GT).

written after Bowen (2009), but well chewed and digested.

See again Kohlbacher’s article above (its opening section, “1. Introduction: Qualitative vs. Quantitative Research?”), an excellent guide to the structure of qualitative research.

++++++++++++++++
more on qualitative research in this IMS blog
http://blog.stcloudstate.edu/ims?s=qualitative+research

digitorium 2017

Digitorium 2017

The conference welcomes proposals for papers and interactive presentations about research or teaching approaches using digital methods. For the first time in 2017, Digitorium also seeks to provide training opportunities for scholars of all levels keen to learn new digital techniques to advance their work, whether by learning a new digital mapping tool, discovering simple ways of visualizing research findings, using computers to conduct large-scale qualitative research, or experimenting with big data approaches at your desktop. There will be a stream of hands-on workshops running throughout the conference enabling participants both to share their own work, and also to expand their portfolio.

Digitorium 2017 will take place from Thursday 2nd to Saturday 4th March, and again, our primary focus is on digital methods, as this has provided fertile ground for interdisciplinary conversations to grow. There will be “tracks” through the conference based on: methods; early modern studies; American studies; and digital pedagogy. We welcome presentations on any topics engaging digital methods for scholarly purposes, whether for research, teaching, or community projects.

In 2017, the conference is expanding once more to offer not only multiple plenary sessions, panels, papers, and roundtables, but also a concerted series of workshops offering training for delegates in a variety of Digital Humanities techniques for research and teaching, from mapping to text encoding, digital data analysis, and more, to support enhanced professional development opportunities at the conference for faculty, staff, and graduate students.

This year, we are proud to present two plenary sessions and our first-ever plenary hackathon! Professor Scott Gwara (Univ. of South Carolina) will be presenting on MS-Link, a database that he created reunifying scattered manuscripts into full digital codices. Additionally, joint principal investigators of the Isabella D’Este Archive (IDEA) Project, Professor Anne MacNeil (Univ. of North Carolina at Chapel Hill) and Professor Deanna Shemek (Univ. of California Santa Cruz) will be presenting their work on a digital archive uniting music, letters, and ceramics, and will lead our first live hackathon, engaging participants in the new virtual reality component of their project.

There will once again be a discounted “group rate” for registration to enable participants to bring their team with them, as collaboration is such a hallmark of digital scholarship, and it would be great to be able to hear about projects from multiple perspectives from the people working together on them. There are also discounted rates available for graduate student presenters and UA faculty. I do not mean to impose, but if this is an event which would be of interest to colleagues and collaborators, I would be enormously grateful if you could circulate our CFP or a link to our website with them; we really want to let as many people as possible know about the conference to ensure it will be a real success.

Here is a link to the website which includes the full-length CFP:

https://apps.lib.ua.edu/blogs/digitorium/

Methods provide the focus for our conference, both in a pragmatic sense in terms of the use of different techniques to achieve particular DH projects, but also the ways in which sharing digital methods can create new links between disciplines in the humanities and social sciences. The idea powering Digitorium is to build on the community which has emerged in the course of the previous two years’ events in order to create a space for conversations to take place between scholars, graduate students, and practitioners from many different disciplines about their shared methods and techniques which unite them in their digital work.

++++++++++++++++++

more on digital humanities in this IMS blog:
http://blog.stcloudstate.edu/ims?s=digital+humanities

bibliography on open access

bibliography on “open access”
permanent link to the search: http://scsu.mn/2dtGtUg

Tomlin, P. (2009). A Matter of Discipline: Open Access, the Humanities, and Art History. Canadian Journal Of Higher Education, 39(3), 49-69.

Recent events suggest that open access has gained new momentum in the humanities, but the slow and uneven development of open-access initiatives in humanist fields continues to hinder the consolidation of efforts across the university. Although various studies have traced the general origins of the humanities’ reticence to embrace open access, few have actually considered the scholarly practices and disciplinary priorities that shape a discipline’s adoption of its principles. This article examines the emergence, potential and actualized, of open access in art history. Part case study, part conceptual mapping, the discussion is framed within the context of three interlocking dynamics: the present state of academic publishing in art history; the dominance of the journal and self-archiving repository within open-access models of scholarly production; and the unique roles played by copyright and permissions in art historical scholarship. It is hoped that tracing the discipline-specific configuration of research provides a first step toward both investigating the identity that open access might assume within the humanities, from discipline to discipline, and explaining how and why it might allow scholars to better serve themselves and their audiences.

Solomon, D. J., & Björk, B. (2012). A study of open access journals using article processing charges. Journal Of The American Society For Information Science & Technology, 63(8), 1485-1495. doi:10.1002/asi.22673

Article processing charges (APCs) are a central mechanism for funding open access (OA) scholarly publishing. We studied the APCs charged and article volumes of journals that were listed in the Directory of Open Access Journals as charging APCs. These included 1,370 journals that published 100,697 articles in 2010. The average APC was $906 U.S. dollars (USD) calculated over journals and $904 USD calculated over articles. The price range varied between $8 and $3,900 USD, with the lowest prices charged by journals published in developing countries and the highest by journals with high-impact factors from major international publishers. Journals in biomedicine represent 59% of the sample and 58% of the total article volume. They also had the highest APCs of any discipline. Professionally published journals, both for profit and nonprofit, had substantially higher APCs than journals published by societies, universities, or scholars/researchers. These price estimates are lower than some previous studies of OA publishing and much lower than is generally charged by subscription publishers making individual articles OA in what are termed hybrid journals.

Beaubien, S., & Eckard, M. (2014). Addressing Faculty Publishing Concerns with Open Access Journal Quality Indicators. Journal Of Librarianship & Scholarly Communication, 2(2), 1-11. doi:10.7710/2162-3309.1133

BACKGROUND The scholarly publishing paradigm is evolving to embrace innovative open access publication models. While this environment fosters the creation of high-quality, peer-reviewed open access publications, it also provides opportunities for journals or publishers to engage in unprofessional or unethical practices. LITERATURE REVIEW Faculty take into account a number of factors in deciding where to publish, including whether or not a journal engages in ethical publishing practices. Librarians and scholars have attempted to address this issue in a number of ways, such as generating lists of ethical/unethical publishers and general guides. DESCRIPTION OF PROJECT In response to growing faculty concern in this area, the Grand Valley State University Libraries developed and evaluated a set of Open Access Journal Quality Indicators that support faculty in their effort to identify the characteristics of ethical and unethical open access publications. NEXT STEPS Liaison librarians have already begun using the Indicators as a catalyst in sparking conversation around open access publishing and scholarship. Going forward, the Libraries will continue to evaluate and gather feedback on the Indicators, taking into account emerging trends and practices.

Husain, S., & Nazim, M. (2013). Analysis of Open Access Scholarly Journals in Media & Communication. DESIDOC Journal Of Library & Information Technology, 33(5), 405-411.

The paper gives an account of the origin and development of the Open Access Initiative and explains the concept of open access publishing. It also highlights various facets of open access scholarly publishing in the field of Media & Communication on the basis of data collected from the most authoritative online directory of open access journals, i.e., the Directory of Open Access Journals (DOAJ). The DOAJ covers 8492 open access journals of which 106 journals are listed under the subject heading ‘Media & Communication’. Most of the open access journals in Media & Communication were started during the late 1990s and are being published from 34 different countries on 6 continents in 13 different languages. More than 80% of open access journals are being published by the not-for-profit sector such as academic institutions and universities.

Reed, K. (2014). Awareness of Open Access Issues Differs among Faculty at Institutions of Different Sizes. Evidence Based Library & Information Practice, 9(4), 76-77.

Objective — This study surveyed faculty awareness of open access (OA) issues and the institutional repository (IR) at the University of Wisconsin. The authors hoped to use findings to inform future IR marketing strategies to faculty. Design — Survey. Setting — University of Wisconsin-Eau Claire, a small, regional public university (approximately 10,000 students). Subjects — 105 faculty members. Methods — The authors contacted 397 faculty members inviting them to participate in an 11 question online survey. Due to anonymity issues on a small campus, respondents were not asked about rank and discipline, and were asked to not provide identifying information. A definition of OA was not provided by the authors, as survey participants were queried about their own definition. Main Results — Approximately 30% of the faculty were aware of OA issues. Of all the definitions of OA given by survey respondents, “none … came close” to the definition favoured by the authors (p. 145). More than 30% of the faculty were unable to define OA at a level deemed basic by the authors. A total of 51 (48.57%) of the survey respondents indicated that there are OA journals in their disciplines. Another 6 (5.71%) of the faculty members claimed that there are no OA journals in their disciplines, although most provided a definition of OA and several considered OA publishing to be “very important.” The remaining 48 participants (46%) were unsure if there are OA journals in their disciplines. Of these survey respondents, 38 answered that they have not published in an OA journal, 10 were unsure, and 21 believed that their field benefits or would benefit from OA journals. Survey respondents cited quality of the journal, prestige, and peer review as extremely important in selecting a journal in which to publish. Conclusion — The authors conclude that the level of awareness related to OA issues must be raised before IRs can flourish. They ponder how university and college administrators regard OA publishing, and the influence this has on the tenure and promotion process

Kelty, C. (2014). Beyond Copyright and Technology: What Open Access Can Tell Us about Precarity, Authority, Innovation, and Automation in the University Today. Cultural Anthropology (Society For Cultural Anthropology), 29(2), 203-215. doi:10.14506/ca29.2.02

In this interview, we discuss what open access can teach us about the state of the university, as well as practices in scholarly publishing. In particular the focus is on issues of labor and precarity, the question of how open access enables or blocks other innovations in scholarship, the way open access might be changing practices of scholarship, and the role of technology and automation in the creation, evaluation, and circulation of scholarly work

Armbruster, C. (2008). Cyberscience and the Knowledge-Based Economy. Open Access and Trade Publishing: From Contradiction to Compatibility with Non-Exclusive Copyright Licensing. Policy Futures In Education, 6(4), 439-452.

Open source, open content and open access are set to fundamentally alter the conditions of knowledge production and distribution. Open source, open content and open access are also the most tangible result of the shift towards e-science and digital networking. Yet, widespread misperceptions exist about the impact of this shift on knowledge distribution and scientific publishing. It is argued, on the one hand, that for the academy there principally is no digital dilemma surrounding copyright and there is no contradiction between open science and the knowledge-based economy if profits are made from non-exclusive rights. On the other hand, pressure for the “digital doubling” of research articles in open access repositories (the “green road”) is misguided and the current model of open access publishing (the “gold road”) has not much future outside biomedicine. Commercial publishers must understand that business models based on the transfer of copyright have not much future either. Digital technology and its economics favour the severance of distribution from certification. What is required of universities and governments, scholars and publishers, is to clear the way for digital innovations in knowledge distribution and scholarly publishing by enabling the emergence of a competitive market that is based on non-exclusive rights. This requires no change in the law but merely an end to the praxis of copyright transfer and exclusive licensing. The best way forward for research organisations, universities and scientists is the adoption of standard copyright licences that reserve some rights, namely Attribution and No Derivative Works, but otherwise will allow for the unlimited reproduction, dissemination and re-use of the research article, commercial uses included.

Kuth, M. (2012). ‘Deswegen wird kein Buch weniger verkauft!’ Hybride Publikation von MALIS Praxisprojekten an der Fachhochschule Köln. (German). Bibliothek Forschung Und Praxis, 36(1), 103-109.

The article reports on a library and information science project at the Fachhochschule Köln (University of Applied Sciences, Cologne), Germany, to produce a hybrid, print and online research publication, “MALIS Praxisprojekte 2011,” which is available at http://www.b-i-t-online.de/daten/bitinnovativ.php#band35. It discusses the publishing process from writing to distribution and the implications of combining open access and for-fee publishing models for value chains in the publishing industry.

Riedel, S. (2012). Distanz zu Wissenschaftlern und Studenten verringern. (German). Bub: Forum Bibliothek Und Information, 64(7/8), 491-492.

A report from the International Bielefeld Conference on April 24-26, 2012 in Bielefeld, Germany is presented. Presentations discussed include the role of information storage and retrieval in libraries, Open Access publishing and content licenses, and the increased automation of the Bielefeld University library.

Ramirez, M., Dalton, J., McMillan, G., Read, M., & Seamans, N. (2013). Do Open Access Electronic Theses and Dissertations Diminish Publishing Opportunities in the Social Sciences and Humanities? Findings from a 2011 Survey of Academic Publishers. College & Research Libraries, 74(4), 368-380.

An increasing number of higher education institutions worldwide are requiring submission of electronic theses and dissertations (ETDs) by graduate students and are subsequently providing open access to these works in online repositories. Faculty advisors and graduate students are concerned that such unfettered access to their work could diminish future publishing opportunities. This study investigated social sciences, arts, and humanities journal editors’ and university press directors’ attitudes toward ETDs. The findings indicate that manuscripts that are revisions of openly accessible ETDs are always welcome for submission or considered on a case-by-case basis by 82.8 percent of journal editors and 53.7 percent of university press directors polled.

Schuurman, N. (2013). Editorial /Éditorial. Canadian Geographer, 57(2), 117-118. doi:10.1111/cag.12027

The author reflects on the use of the Open Access (OA) publishing for publications. She states that in OA publishing, an un-blinded peer review format is used wherein the authors’ names are known to the reviewer. She mentions that the countries such as Great Britain and Canada passed legislations which mandates the use of OA journals in university publications and health research. She also relates the impact of the changes in publishing to the print versions of journals.

Bazeley, J. W., Waller, J., & Resnis, E. (2014). Engaging Faculty in Scholarly Communication Change: A Learning Community Approach. Journal Of Librarianship & Scholarly Communication, 2(3), 1-13. doi:10.7710/2162-3309.1129

As the landscape of scholarly communication and open access continues to shift, it remains important for academic librarians to continue educating campus stakeholders about these issues, as well as to create faculty advocates on campus. DESCRIPTION OF PROGRAM Three librarians at Miami University created a Faculty Learning Community (FLC) on Scholarly Communication to accomplish this. The FLC, composed of faculty, graduate students, staff, and librarians, met throughout the academic year to read and discuss topics such as open access, journal economics, predatory publishing, alternative metrics (altmetrics), open data, open peer review, etc. NEXT STEPS The members of the FLC provided positive evaluations about the community and the topics about which they learned, leading the co-facilitators to run the FLC for a second year. The library’s Scholarly Communication Committee is creating and implementing a scholarly communication website utilizing the structure and content identified by the 2012-2013 FLC

Deutsche Forschungsgemeinschaft, (2010). Freier Zugang zu Forschungsergebnissen. Bub: Forum Bibliothek Und Information, 62(1), 7.

The article reports that the research society Deutsche Forschungsgemeinschaft (DFG) has expanded their support of open access publishing so that universities can now request that the DFG finance publication of their scientific works in open access journals.

Ottina, D. (2013). From Sustainable Publishing To Resilient Communications. Triplec (Cognition, Communication, Co-Operation): Open Access Journal For A Global Sustainable Information Society, 11(2), 604-613.

In their opening reflection on Open Access (OA) in this special section, Fuchs and Sandoval (2013) argue the current policy debate on Open Access publishing is limited by a for-profit bias which blinds it to much of the most innovative activity in Open Access. They further argue for a refocusing of the policy debate within a public service, commons-based perspective of academic knowledge production. I pick up these themes by looking at another key term, sustainable publishing, in an effort to contextualize the policy debate on OA within the broader context of the privatization of the university. From this perspective, the policy debate reveals an essential tension between top-down and bottom-up cultures in legitimizing knowledge. This is a tension that has profound implications for scholarly practices mediated through digital networked communications. Explicitly acknowledging this fundamental tension gives additional insight into formulating strategies for maintaining an academic culture of free and open inquiry. I suggest that the frame of resilient communications expresses the dynamic nature of scholarly communications better than that of sustainable publishing, and that empowering scholars through practice-based OA initiatives is essential in broadening grassroots support for equitable Open Access amongst scholars.

Stevens, L. M. (2013). From the Editor: Getting What You Pay For? Open Access and the Future of Humanities Publishing. Tulsa Studies In Women’s Literature, 32(1), 7-21.

The article discusses the potential impact of the open access publishing movement on humanities scholarship and publishing. It is suggested that although the free circulation of knowledge is a positive goal, scholars and activists must be careful not to undermine the value of the scholarly and editorial labor which makes quality humanities publications possible. The author also suggests that authors who post their articles for open access or on university commons should pay journals a fee.

Thatcher, S. (2009). From the University Presses–Open Access and the Future of Scholarly Communication. Against the Grain, 21(5), 78-81.

The article presents a speech by the author, delivered on September 23, 2009 as part of the Andrew Neilly Lecture Series at the University of Rochester, in which he discussed open access publishing in terms of university presses and scholarly communication. He presented an overview of the history of such issues, and a forecast of likely future developments.

Dunham, G., & Walters, C. (2014). From University Press to the University’s Press: Building a One-Stop Campus Resource for Scholarly Publishing. Against The Grain, 26(6), 28-30.

The article examines the Office of Scholarly Publishing (OSP) at Indiana University (IU) in Bloomington, Indiana. Topics discussed include the role played in the OSP by Indiana University Press (IU Press), the role played by IUScholarWorks (IUSW), which is an open access publishing initiative administered by IU Libraries, and the location of the university’s publishing activities, which is the Herman B. Wells Library at IU.

Abadal, E. (2013). Gold or green: the debate on Open Access policies. International Microbiology, 16(3), 199-203. doi:10.2436/20.1501.01.194

The movement for open access to science seeks to achieve unrestricted and free access to academic publications on the Internet. To this end, two mechanisms have been established: the gold road, in which scientific journals are openly accessible, and the green road, in which publications are self-archived in repositories. The publication of the Finch Report in 2012, advocating exclusively the adoption of the gold road, generated a debate as to whether either of the two options should be prioritized. The recommendations of the Finch Report stirred controversy among academicians specialized in open access issues, who felt that the role played by repositories was not adequately considered and because the green road places the burden of publishing costs basically on authors. The Finch Report’s conclusions are compatible with the characteristics of science communication in the UK and they could surely also be applied to the (few) countries with a powerful publishing industry and substantial research funding. In Spain, both the current national legislation and the existing rules at universities largely advocate the green road. This is directly related to the structure of scientific communication in Spain, where many journals have little commercial significance, the system of charging a fee to authors has not been adopted, and there is a good repository infrastructure. As for open access policies, the performance of the scientific communication system in each country should be carefully analyzed to determine the most suitable open access strategy.

Bargheer, M., & Schmidt, B. (2008). Göttingen University Press: Publishing services in an Open Access environment. Information Services & Use, 28(2), 133-139.

The article presents a round table discussion that focuses on publishing services in an open access environment that are offered by Göttingen University Press. Begun as an additional service for the Göttingen State and University Library repository, it offers a publication consulting service on behalf of the university. It covers diverse topics such as sciences, life sciences, and humanities.

Jubb, M. (2011). Heading for the Open Road: Costs and Benefits of Transitions in Scholarly Communications. Liber Quarterly: The Journal Of European Research Libraries, 21(1), 102-124.

This paper reports on a study — overseen by representatives of the publishing, library and research funder communities in the UK — investigating the drivers, costs and benefits of potential ways to increase access to scholarly journals. It identifies five different but realistic scenarios for moving towards that end over the next five years, including gold and green open access, moves towards national licensing, publisher-led delayed open access, and transactional models. It then compares and evaluates the benefits as well as the costs and risks for the UK. The scenarios, the comparisons between them, and the modelling on which they are based, amount to a benefit-cost analysis to help in appraising policy options over the next five years. Our conclusion is that policymakers who are seeking to promote increases in access should encourage the use of existing subject and institutional repositories, but avoid pushing for reductions in embargo periods, which might put at risk the sustainability of the underlying scholarly publishing system. They should also promote and facilitate a transition to gold open access, while seeking to ensure that the average level of charges for publication does not exceed circa £2,000; that the rate in the UK of open access publication is broadly in step with the rate in the rest of the world; and that total payments to journal publishers from UK universities and their funders do not rise as a consequence.

Tickell, A. (2013). Implementing Open Access in the United Kingdom. Information Services & Use, 33(1), 19-26. doi:10.3233/ISU-130688

Since July 2012, the UK has been undergoing an organized transition to open access. As of 01 April 2013, revised open access policies are coming into effect. Open access implementation requires new infrastructures for funding publishing. Universities as institutions increasingly will be central to managing article-processing charges, monitoring compliance and organizing deposit. This article reviews the implementation praxis between July 2012 and April 2013, including ongoing controversy and review, which has mainly focussed on embargo length

Hawkins, K. K. (2014). How We Pay for Publishing. Against The Grain, 26(6), 35-36.

The article examines the financial aspects of scholarly publishing. Topics discussed include the impact of these financial aspects on academic libraries and university presses, the concept of open access publishing and the financial considerations related to it, and the use of article processing charges (APC) in open access publishing.

Butler, D. (2013). Investigating journals: The dark side of publishing. Nature, 495(7442), 433-435. doi:10.1038/495433a

The article focuses on the investigation of Jeffrey Beall, academic librarian and university researcher at the University of Colorado in Denver regarding the practices of open-access publishing. It says that Beall who became a watchdog for open-access publishers criticizes them on his blog Scholarly Open Access. Beall adds that he was not prepared for the exponential growth of the occurrence of questionable publishers. The insights of publishers on the approach of Beall are also discussed.

“2012 was basically the year of the predatory publisher; that was when they really exploded,” says Beall. He estimates that such outfits publish 5–10% of all open-access articles. Beall’s list and blog are widely read by librarians, researchers and open-access advocates, many of whom applaud his efforts to reveal shady publishing practices.
Wilson, K. (2013). Librarian vs. (Open Access) Predator: An Interview with Jeffrey Beall. Serials Review, 39(2), 125-128.
In February 2013, Kristen Wilson interviewed Jeffrey Beall, scholarly initiatives librarian at the University of Colorado Denver. Beall discusses “predatory” open access and its implications for scholarly publishing.

Richard, J., Koufogiannakis, D., & Ryan, P. (2009). Librarians and Libraries Supporting Open Access Publishing. Canadian Journal Of Higher Education, 39(3), 33-48

As new models of scholarly communication emerge, librarians and libraries have responded by developing and supporting new methods of storing and providing access to information and by creating new publishing support services. This article will examine the roles of libraries and librarians in developing and supporting open access publishing initiatives and services in higher education. Canadian university libraries have been key players in the development of these services and have been bolstered by support from librarians working through and within their professional associations on advocacy and advancement initiatives, and by significant funding from the Canadian Foundation for Innovation for the Synergies initiative–a project designed to allow Canadian social science and humanities journals to publish online. The article also reflects on the experiences of three librarians involved in the open access movement at their libraries, within Canadian library associations, and as creators, managers, and editors in two new open access journals in the field of library and information studies: Evidence-based Library and Information Practice published out of the University of Alberta; and Partnership: the Canadian Journal of Library and Information Practice and Research hosted by the University of Guelph. As active participants in the creation of open access content within their own field, the authors are able to lend their experience to faculty in other disciplines and provide meaningful and responsive library service development.
Hansson, J., & Johannesson, K. (2013). Librarians’ Views of Academic Library Support for Scholarly Publishing: An Every-day Perspective. Journal Of Academic Librarianship, 39(3), 232-240. doi:10.1016/j.acalib.2013.02.002
This article reports on a study of academic librarians’ views of their work and possibilities regarding support for researchers’ publishing. Institutional repositories and Open Access are areas being dealt with in particular. Methods used are highly qualitative; data was gathered at two Swedish university libraries over a six month period through focus group interview sessions and personal logs by informants. Findings indicate that attitudes are often in collision with practicalities in the daily work in libraries. Even though they have a high degree of knowledge and awareness of scholarly publication patterns, librarians often feel insecure in the approach of researchers. There is a felt redirection in the focus of academic librarianship, from pedagogical information seeking tasks towards a more active publication support, a change which also includes a regained prominence for new forms of bibliographical work. Although there are some challenges, proactive attitudes among librarians are felt as being important in developing further support for researchers’ publishing.
Pinter, F. (2012). Open Access for Scholarly Books?. Publishing Research Quarterly, 28(3), 183-191. doi:10.1007/s12109-012-9285-0
Over the past two decades, sales of monographs have shrunk by 90%, causing prices to rise dramatically as fewer copies are sold. University libraries struggle to assemble adequate collections, and students and scholars are deprived of access, especially in the developing world. Open access can play an important role in ensuring both access to knowledge and encouraging the growth of new markets for scholarly books. This article argues that by facilitating a truly global approach to funding the up-front costs of publishing and open access, there is a sustainable future for the specialist academic ‘long form publication’. Knowledge Unlatched is a new initiative that is creating an international library consortium through which publishers will be able to recover their fixed costs while at the same time reducing prices for libraries.
Bauer, B., & Stieg, K. (2010). Open Access Publishing in Austria: Development and Future Perspectives. Bulletin Of The Transilvania University Of Brasov, Series IV: Philology & Cultural Studies, 3(52), 271-278.
The following article provides an overview of Open Access Publishing in Austria in 2010. First of all, the participation of Austrian institutions in signing Open Access declarations and Open Access events in Austria are presented. Secondly, the article shows the development of both the Green Road to Open Access (repositories) as well as the Golden Road (Open Access Journals) in Austria. The article also describes the Open Access policies of the most important funding agency in Austria, the biggest university of the country as well as Universities Austria, the association of the 21 public universities in Austria. Finally, the paper raises the question of how Open Access is to be financed and explains the legal framework conditions for Open Access in Austria.
Nariani, R., & Fernandez, L. (2012). Open Access Publishing: What Authors Want. College & Research Libraries, 73(2), 182-195.
 Campus-based open access author funds are being considered by many academic libraries as a way to support authors publishing in open access journals. Article processing fees for open access have been introduced recently by publishers and have not yet been widely accepted by authors. Few studies have surveyed authors on their reasons for publishing open access and their perceptions of open access journals. The present study was designed to gauge the uptake of library support for author funding and author satisfaction with open access publishing. Results indicate that York University authors are increasingly publishing in open access journals and are appreciative of library funding initiatives. The wider implications of open access are discussed along with specific recommendations for publishers.
Stanton, K. V., & Liew, C. L. (2011). Open Access Theses in Institutional Repositories: An Exploratory Study of the Perceptions of Doctoral Students. Information Research: An International Electronic Journal, 16(4),
We examine doctoral students’ awareness of and attitudes to open access forms of publication. Levels of awareness of open access and the concept of institutional repositories, publishing behaviour and perceptions of benefits and risks of open access publishing were explored. Method: Qualitative and quantitative data were collected through interviews with eight doctoral students enrolled in a range of disciplines in a New Zealand university and a self-completion Web survey of 251 students. Analysis: Interview data were analysed thematically, then evaluated against a theoretical framework. The interview data were then used to inform the design of the survey tool. Survey responses were analysed as a single set, then by discipline using SurveyMonkey’s online toolkit and Excel. Results: While awareness of open access and repository archiving is still low, the majority of interview and survey respondents were found to be supportive of the concept of open access. The perceived benefits of enhanced exposure and potential for sharing outweigh the perceived risks. The majority of respondents were supportive of an existing mandatory thesis submission policy. Conclusions: Low levels of awareness of the university repository remains an issue, and could be addressed by further investigating the effectiveness of different communication channels for promotion.
Mussell, J. (2013). Open Access. Journal Of Victorian Culture (Routledge), 18(4), 526-527. doi:10.1080/13555502.2013.865980

An introduction is presented to the articles within the issue on the theme of open access publishing in Great Britain during the early 2010s, including topics on the economic aspects of and the British government’s policy on open access publishing and its impact on university libraries.

Open access is not new: there is a thriving culture of open access in the sciences, and scholars in the digital humanities have been advocating open publication of research for some time to share methods, results and data. However, the British Government’s recent endorsement of the Finch Report (officially titled ‘Accessibility, sustainability, excellence: how to expand access to research publications: Report of the Working Group on Expanding Access to Published Research Findings’) has made open access a central concern for all researchers in UK higher education. The underlying economics and politics of journal publication are now under scrutiny as never before.
An author-pays version of ‘gold’ open access publishing, where costs of publishing were shifted from the customer (university libraries) onto the producer (scholars), was seen by many as a way of implementing open access without disturbing the status quo. Instead of purchasing research once it has been published, universities will pay for research to be published.
While this model ensures an income stream for publishers (and it always costs something to publish), it reconfigures the relationship between scholars, their research and their institution.
The so-called ‘green’ route to publishing, where articles are made open access after their initial publication in a traditional, subscription-based journal, usually by means of deposit in an institutional repository, has focused attention on the embargo periods demanded by publishers.
Leptin, M. (2012, March 16). Open Access–Pass the Buck. Science. p. 1279.
The author reflects on open access as a model for scientific publishing. She notes that most scientists support open access despite continued controversy about the economics and political consequences of open access among various groups, including researchers, publishers, and universities. Also discussed are the financial implications of open access from the author’s point of view as an editor of the non-profit publishing group the European Molecular Biology Organization
Peters, M. A. (2009). Open Education and the Open Science Economy. Yearbook Of The National Society For The Study Of Education, 108(2), 203-225.
Openness as a complex code word for a variety of digital trends and movements has emerged as an alternative mode of “social production” based on the growing and overlapping complexities of open source, open access, open archiving, open publishing, and open science. This paper argues that the openness movement with its reinforcing structure of overlapping networks of production, access, publishing, archiving, and distribution provide an emerging architecture of alterative educational globalization not wedded to existing neoliberal forms. The open education movement and paradigm has arrived: it emerges from a complex historical background and its futures are intimately tied not only to open source, open access and open publishing movements but also to the concept of the “open society” itself which has multiple, contradictory, and contested meanings. This paper first theorizes the development and significance of “open education” by reference to the Open University, OpenCourseWare (OCW) and open access movements. The paper takes this line of argument further, arguing for a conception of “open science economy” which involves strategic international research collaborations and provides an empirical and conceptual link between university science and the global knowledge economy.
Adam, M. (2013). Open-Access-Publizieren in der Medizin – im Fokus der Bibliometrie an der SLUB Dresden. GMS Medizin-Bibliothek-Information, 13(3), 1-11. doi:10.3205/mbi000291
Since 2012, the Team Bibliometrics in the Electronic Publishing Group at the SLUB Dresden has been supporting scientists but also institutes at the Technical University Dresden in bibliometric issues. Open access (OA) publishing is one of the main topics. The recent analysis identified OA journals in the field of medicine indexed in the Web of Science (WoS) database on the basis of the Directory of Open Access Journals. Subsequently, the journal titles were examined according to their importance in the selected subject categories and the geographical distribution of editorial countries in the first part. The second part dealt with the articles in these journals and the citations contained therein. The results show an amount of 9.7 per cent of OA journals in relation to the total amount of all journals in the selected WoS subject categories. 14 per cent could be assigned to the upper quartile Q1 (Top 25 per cent). For most of the OA journals Great Britain was determined as the publishing country. The analysis of articles with German participation reveals interesting methods to obtain information in the participating authors, institutions, networks and their specific subjects. The result of citation analysis of these articles shows, that articles from traditional journals are the most cited ones.
Kersting, A., & Pappenberger, K. (2009). Promoting open access in Germany as illustrated by a recent project at the Library of the University of Konstanz. OCLC Systems & Services, 25(2), 105-113. doi:10.1108/10650750910961901
With the illustration of a best practice example for an implementation of open access in a scientific institution, the paper will be useful in fostering future open access projects. Design/methodology/approach – The paper starts with a brief overview of the existing situation of open access in Germany. The following report describes the results of a best practice example, added by the analysis of a survey on the position about open access by the scientists at the University of Konstanz. Findings – The dissemination of the advantages of open access publishing is fundamental for the success of implementing open access in a scientific institution. For the University of Konstanz, it is shown that elementary factors of success are an intensive cooperation with the head of the university and a vigorous approach to inform scholars about open access. Also, some more conditions are essential to present a persuasive service: The Library of the University of Konstanz offers an institutional repository as an open access publication platform and hosts open journal systems for open access journals. High-level support and consultation for open access publishing at all administrative levels is provided. The integration of the local activities into national and international initiatives and projects is pursued for example by the joint operation of the information platform open-access.net. Originality/value – The paper offers insights in one of the most innovative open access projects in Germany. The University of Konstanz belongs to the pioneers of the open access movement in Germany and is currently running a successful open access project.
Beals, M. H. (2013). Rapunzel and the Ivory Tower: How Open Access Will Save the Humanities (from Themselves). Journal Of Victorian Culture (Routledge), 18(4), 543-550. doi:10.1080/13555502.2013.865977
The author argues in favor of open access publishing, contending that it will bridge university academics and academic scholarship’s relationship with the public sphere. An overview of open access publishing’s impact on academic journals, including in regard to periodical subscriptions, membership fees and the discourse on history within society, is provided. An overview of digital access to open access publishing is also provided.
crisis of authorship has centred on the charging of Article Processing Charges (APCs) and how best to accommodate the shift from pay-to-read to pay-to-publish models.
Pochoda, P. (2008). Scholarly Publication at the Digital Tipping Point. Journal Of Electronic Publishing, 11(2), 8.

The article presents information on a joint publishing project “Digitalculturebooks” between the University of Michigan Press and the Scholarly Publishing Office of Michigan University Library in Michigan. The aim of the project was to publish books about new media in a printed version and an open access (OA) online version. It is mentioned that the project not only intended to publish innovative and accessible work about the social, cultural, and political impact of new and to collect data about the variation in reading habits and preferences across different scholarly reading communities, but also to explore the opportunities and the obstacles involved in a press working in a partnership with a technologically abled library unit with a business model.

Scientific Publishing: the Dilemma of Research Funding Organisations. (2009). European Review, 17(1), 23-31.

Present changes in scientific publishing, especially those summarised by the term ‘Open Access’ (OA), may ultimately lead to the complete replacement of a reader-paid publication system by an author-paid, or funding-paid, one. This transformation would shift the financial burden for scientific publishing from the Research Performing Organisations (RPOs), particularly from scientific libraries, universities, etc., to the Research Funding Organisations (RFOs). The transition phase is difficult; it leads to double funding of OA publications (by subscriptions and author-sponsored OA) and may thus increase the overall costs of scientific publishing. This may explain why, with a few exceptions, RFOs have not been at the forefront of the OA paradigm in the past. In 2008, the General Assembly of EUROHORCs, the European organisation of the heads of research councils, agreed to recommend to its member organisations at least a minimal standard of Open Access based on the Berlin Declaration of 2003 (green way of OA). In the long run, the publishing system needs some fundamental changes to reduce the present costs and to keep up its potential. In order to design a new system, all players have to cooperate and be ready to throw overboard some old traditions, lovable as they may be.

Kennan, M. A. (2010). The economic implications of alternative publishing models: views from a non-economist. Prometheus, 28(1), 85-89. doi:10.1080/08109021003676391

In this article the author discusses economic aspects of alternative economic models for scholarly publishing with reference to a report by J. Houghton and C. Oppenheim. The author presents information on the economic models discussed in the Houghton and Oppenheim report to Great Britain’s Joint Information Systems Committee (JISC). He discusses open access (OA) publishing and suggests that universities should mandate OA.

I cannot respond to their paper in either of these roles. Instead, I propose to respond both as an academic who conducts research, writes about it and tries to get it published, and as a researcher interested in scholarly communication, publishing and open access.
To continue with a system (of scholarly publishing or anything else) without regularly investigating and analyzing the alternatives, is neither common sense nor scholarly.
Hawkins, K. S. (2014). The Evolution of Publishing Agreements at the University of Michigan Library. Journal Of Librarianship & Scholarly Communication, 2(4), 90-94. doi:10.7710/2162-3309.1175
Taking as an example an open-access journal with a single editor, this article discusses the various configurations of rights agreements used by the University of Michigan Library throughout the evolution of its publishing operation, the advantages of the various models, and the reasons for moving from one to another.
Bankier, J., & Perciali, I. (2008). The Institutional Repository Rediscovered: What Can a University Do for Open Access Publishing?. Serials Review, 34(1), 21-26. doi:10.1016/j.serrev.2007.12.003
Universities have always been one of the key players in open access publishing and have encountered the particular obstacle that faces this Green model of open access, namely, disappointing author uptake. Today, the university has a unique opportunity to reinvent and to reinvigorate the model of the institutional repository. This article explores what is not working about the way we talk about repositories to authors today and how can we better meet faculty needs. More than an archive, a repository can be a showcase that allows scholars to build attractive scholarly profiles, and a platform to publish original content in emerging open-access journals. Serials Review 2008; 34:21-26.
Collister, L. B., Deliyannides, T. S., & Dyas-Correia, S. (2014). The Library as Publisher. Serials Librarian, 66(1-4), 20-29.
This article describes a half-day preconference that focused on the library as publisher. It examined how the movement from print to online publication has impacted the roles of libraries and their ability to take on new roles as publishers. The session explored the benefits of libraries becoming publishers, and discussed Open Access, what it is and is not and its importance to libraries and scholarly communication. A detailed case study of the publishing operations of the University Library System at the University of Pittsburgh was presented as an example of a successful library publishing program. The session provided an opportunity for participants to discover ways that libraries can be involved in publishing
OA literature is digital, online, free of charge, and free of most copyright and licensing restrictions. OA works are still covered by copyright law, but special license terms such as Creative Commons licenses are applied to allow sharing and reuse. All major OA initiatives for scientific and scholarly literature insist on the importance of peer review. OA is therefore compatible with copyright, peer review, revenue (even profit), print, preservation, prestige, quality, career advancement, indexing, and supportive services associated with conventional scholarly literature. OA is not Open Source, which applies to computer software, nor Open Content, which applies to non-scholarly content, nor Open Data, which is a movement to support sharing of research data, nor free access, which carries no monetary charges for access, yet all rights may be reserved.
Changing laws, like the Digital Millennium Copyright Act (DMCA) and the Research Works Act, as well as the Google Books copyright settlement and its aftermath, have also had an important impact on scholarly communication.
The changing scholarly communication environment has led to changing economic models, including the advent of the “Big Deal” for the purchase of journals and e-books, the creation of the pay-per-view model and other alternative purchasing models. It has also led to the creation of OA publishing models, the Hybrid OA publishing model, and self-publishing. Today, over 150 universities around the world mandate OA deposits of faculty works and the Directory of Open Access Journals (DOAJ) lists 9,437 OA journals in 119 countries. The Directory of Open Access Repositories (OpenDOAR) lists 2,284 open archives in 103 countries.
Potvin, S. (2013). The Principle and the Pragmatist: On Conflict and Coalescence for Librarian Engagement with Open Access Initiatives. Journal Of Academic Librarianship, 39(1), 67-75. [The title draws on David Lewis’s comment: “Open access journals claim two advantages: the first is pragmatic and the second is principled.” See David W. Lewis, “The Inevitability of Open Access,” College & Research Libraries 73:5 (September 2012): 493–506.]
This article considers Open Access (OA) training and the supports and structures in place in academic libraries in the United States from the perspective of a new librarian. OA programming is contextualized by the larger project of Scholarly Communication in academic libraries, and the two share a historical focus on journal literature and a continued emphasis on public access and the economics of scholarly publishing. Challenges in preparing academic librarians for involvement with OA efforts include the evolving and potentially divergent nature of the international OA movement and the inherent tensions of a role with both principled and pragmatic components that serves a particular university community as well as a larger movement.
Bastos, F., Vidotti, S., & Oddone, N. (2011). The University and its libraries: Reactions and resistance to scientific publishers. Information Services & Use, 31(3/4), 121-129.
 This paper addresses the relationship of copyright and the right of universities on scientific production. Information and Communication Technologies (ICTs) are causing many changes in the system of scientific communication, such as the creation of Institutional Repositories that aim to gather scientific production in digital format. The University needs quicker ways of spreading academic production and many questions are emerging due to contexts such as the Open Access movement. Thus, this paper questions the positioning of Universities, especially Public Universities, which despite having policies related to intellectual property to protect the transferring forms of research results to society; many times do not have a positioning or a mechanism that regulates the self-deposit of scientific production in these Institutional Repositories. In order to develop this paper, the following issues are addressed: lack of interest of the University in storing scientific production; reports on the relationship of the library with scientific publishing houses; the participation of faculty members and students in supporting the Free Access movement; and initiatives aimed at greater flexibility of copyright to the context of scientific production. In order to follow the development of these issues at international level, it was opted for qualitative research with non-participating direct observation to carry out the identification and description of copyright policy of important publishers from the ROMEO SHERPA site; therefore, it can be observed that there are changes regarding the publishers’ flexibility before self-archiving of authors in open access institutional repositories in their universities. Given this scenario, we present reflections and considerations that involve the progress and mainly the integration of the University and its faculty members; the institution should recommend and guide its faculty members not to transfer their copyrights, but to defend their right of copy to Institutional Repositories along with Publishing Houses
Jagodzinski, C. M. (2008). The University Press in North America: A Brief History. Journal Of Scholarly Publishing, 40(1), 1-20. doi:10.3138/jsp.40.1.1
Simon-Ritz, F. (2012). Warten auf die Wissenschaftsschranke. Bub: Forum Bibliothek Und Information, 64(9), 562-564.
An article on the debate over copyright law and Open Access publishing in Germany is presented. The author describes the demands for noncommercial secondary usage rights by schools, libraries, and universities, as well as detailing the sections of the copyright laws which he considers most damaging to the larger research community
O’Donnell, M. P. (2014). What is the future of scholarly journals in an open access environment?. American Journal Of Health Promotion, 29(1), v-vi. doi:10.4278/ajhp.29.1.v
This editorial provides an overview of the journey of the journal American Journal of Health Promotion. The journal would continue to be allowed to publish these articles, but the university wanted the editor to understand that the public would also have free access to them online. This university was following the lead of the Harvard Law School Open Access Policy, which was adopted by faculty at Harvard and Stanford in 2008, at MIT in 2009, and at many other prestigious universities and colleges since then. The traditional publishers want to maximize subscriber satisfaction so they can sell more subscriptions and minimize the number of accepted manuscripts to reduce the cost of printing, whereas the fee-based online publishers want to increase the number of accepted manuscripts to maximize publishing fees. The cost of this subscription is $895/y. The subscription must be in place before the article is typeset.
Armato, D. (2012). What Was a University Press?. Against The Grain, 24(6), 58-62.
Hall, R. (2014). You Say You Want a Publishing Revolution. Progressive Librarian, (43), 35-46.
A recent study published in PLoS ONE estimated that 27 million, or 24%, of the 114 million English-language scholarly documents available through Google Scholar and Microsoft Academic Search are freely available on the web (Khabsa & Giles, 2014). While this is not nearly as much as open access advocates would like, it shows a significant step in the right direction. Though the authors of this study fail to acknowledge the sources of this free information, it can be surmised that library publishing initiatives (including open access journals and institutional repositories) have contributed greatly.

TurnitIn

We know that many of you have been interested in exploring Turnitin in the past, so we are excited to bring you an exclusive standardized price and more information on the rollout of Feedback Studio, replacing the Turnitin you have previously seen. We would like to share some exciting accessibility updates and how Feedback Studio can help faculty deliver formative feedback to students and help students become writers. Starting today through December 31st, non-integrated Feedback Studio will be $2.50 and integrated Feedback Studio will be $3 for new customers! Confused by the name? Don’t be! Turnitin is new and improved! Check out this video to learn about Feedback Studio!

Meet your exclusive Turnitin Team!

Ariel Ream – Account Executive, Indianapolis aream@turnitin.com – 317.650.2795
Juliessa Rivera – Relationship Manager, Oakland jrivera@iparadigms.com – 510.764.7698

Juan Valladares – Account Representative, Oakland
jvalladares@turnitin.com – 510.764.7552
To learn more, please join us for a WebEx on September 21st. We will be offering free 30-day pilots to anyone who attends!
Turnitin Webinar
Wednesday, September 21, 2016
11:00 am | Central Daylight Time (Chicago) | 1 hr
Meeting number (access code): 632 474 162
https://mnscu.webex.com/mnscu/j.php?MTID=mebaec2ae9d1d25e6774d16717719008d

+++++++++++++++++++

my notes from the webinar

I am prejudiced against TI and I am not hiding it; that does not mean that I am wrong.
For me, TurnitIn (TI) is an anti-pedagogical “surfer,” using the hype of “technology” to ride the wave of overworked faculty, who hope to streamline an increasing workload with technology instead of working on pedagogical resolutions to issues that are not that new.

Lo and behold, Juan, the TI presenter, is trying to dazzle me with stuff that has not dazzled me for a long time.
WCAG 2.0 AA standards of the W3C and section 508 of the rehabilitation act.
The sales pitch: 79% of students believe in feedback, but only 50%+ receive it. His source is TurnitIn surveys from 2012 to 2016 (in a very, very small font size; ashamed of it?).
It seems to me very much like “massaged” data.
Testimonials: one professor and one student. Ha, the apex of qualitative research…

Next sales pitch: TurnitIn Feedback Studio, no longer the old Classic. It assesses originality. Drag-and-drop macro-style notes. Pushing rubrics, but we still fight for rubrics in D2L. If we have a large number of adjuncts. Ha, another gem: “I know that you are, guys, IT folks.” So the IT folks are the Trojan horse to get the faculty on board. Put comments on
This presentation is structured dangerously askew: IT people but no faculty. If faculty were present, they would object that they ARE capable of doing the same things that are proposed to be automated.
More: why do I have to pay for another expensive piece of software if we have already paid Microsoft? MS Word can do everything that has been presented so far. Between MS Word and D2L, it becomes redundant.
Why the heck am I interested in middle school and high school?

TI was sued for illegal collection of papers; papers are stored in their database without the consent of the students who wrote them. TI goes to “great lengths to protect the identity of the students,” but still collects their work [illegally?]

November 10 – 30 day free trial

Otherwise, $3 per student, which prompts the question: between Google, MS Word, and D2L (which we already pay for heftily), why pay another exorbitant price?

D2L integration: version, which does not work. LTI.
“Small price to pay for such a beauty”: it does not matter how quick and easy the integration is; it is a redundancy that can already be covered by existing tools, for part of which we are already paying a hefty price.

https://d2l.custhelp.com/app/answers/detail/a_id/1668/

Play recording (1 hr 4 min 19 sec)
https://mnscu.webex.com/mnscu/ldr.php?RCID=a9b182b4ca8c4d74060f0fd29d6a5b5c
