Learning analytics

Learning analytics is the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs. [1] The growth of online learning since the 1990s, particularly in higher education, has contributed to the advancement of Learning Analytics as student data can be captured and made available for analysis. [2] [3] [4] When learners use a learning management system (LMS), social media, or similar online tools, their clicks, navigation patterns, time on task, social networks, information flow, and concept development through discussions can be tracked. The rapid development of massive open online courses (MOOCs) offers additional data for researchers to evaluate teaching and learning in online environments. [5]

Definition

Although much of the Learning Analytics literature has adopted the definition above, the definition and aims of Learning Analytics are still contested.

George Siemens is a writer, theorist, speaker, and researcher on learning, networks, technology, analytics and visualization, openness, and organizational effectiveness in digital environments. He is the originator of Connectivism theory and author of the article Connectivism: A Learning Theory for the Digital Age and the book Knowing Knowledge – an exploration of the impact of the changed context and characteristics of knowledge. He is the founding President of the Society for Learning Analytics Research (SoLAR).

Learning Analytics as a prediction model

One earlier definition discussed by the community suggested that Learning Analytics is the use of intelligent data, learner-produced data, and analysis models to discover information and social connections for predicting and advising people's learning. [8] However, this definition has been criticised by George Siemens [9] [ non-primary source needed ] and Mike Sharkey. [10] [ non-primary source needed ]

Learning Analytics as a generic design framework

Dr. Wolfgang Greller and Dr. Hendrik Drachsler defined learning analytics holistically as a generic design framework that can act as a useful guide for setting up analytics services in support of educational practice and learner guidance, in quality assurance, curriculum development, and in improving teacher effectiveness and efficiency. The framework uses general morphological analysis (GMA) to divide the domain into six "critical dimensions". [11]

Learning Analytics as data-driven decision making

The broader term "Analytics" has been defined as the science of examining data to draw conclusions and, when used in decision-making, to present paths or courses of action. [12] From this perspective, Learning Analytics has been defined as a particular case of Analytics, in which decision-making aims to improve learning and education. [13] During the 2010s, this definition of analytics went further, incorporating elements of operations research such as decision trees and strategy maps to establish predictive models and to determine probabilities for certain courses of action. [12]

Learning Analytics as an application of analytics

Another approach to defining Learning Analytics is based on the concept of Analytics interpreted as the process of developing actionable insights through problem definition and the application of statistical models and analysis against existing and/or simulated future data. [14] [15] From this point of view, Learning Analytics emerges as a type of Analytics (as a process), in which the data, the problem definition and the insights are learning-related.

In 2016, a study jointly conducted by the New Media Consortium (NMC) and the EDUCAUSE Learning Initiative (ELI), an EDUCAUSE program, described six areas of emerging technology expected to have a significant impact on higher education and creative expression by the end of 2020. In this study, Learning Analytics was defined as an educational application of web analytics aimed at learner profiling, a process of gathering and analyzing details of individual student interactions in online learning activities. [16]

Dragan Gašević is a pioneer and leading researcher in learning analytics. He is a founder and past President (2015-2017) of the Society for Learning Analytics Research (SoLAR).

Learning analytics as an application of data science

In 2017, Gašević, Kovanović, and Joksimović proposed a consolidated model of learning analytics. [17] The model posits that learning analytics is defined at the intersection of three disciplines: data science, theory, and design. Data science offers computational methods and techniques for data collection, pre-processing, analysis, and presentation. Theory is typically drawn from the literature in the learning sciences, education, psychology, sociology, and philosophy. The design dimension of the model includes learning design, interaction design, and study design. In 2015, Gašević, Dawson, and Siemens argued that computational aspects of learning analytics need to be linked with the existing educational research in order for Learning Analytics to deliver its promise to understand and optimize learning. [18]

Learning analytics versus educational data mining

Differentiating the fields of educational data mining (EDM) and learning analytics (LA) has been a concern of several researchers. George Siemens takes the position that educational data mining encompasses both learning analytics and academic analytics, [19] the latter of which is aimed at governments, funding agencies, and administrators rather than at learners and faculty. Baepler and Murdoch define academic analytics as an area that "...combines select institutional data, statistical analysis, and predictive modeling to create intelligence upon which learners, instructors, or administrators can change academic behavior". [20] They go on to attempt to disambiguate educational data mining from academic analytics based on whether the process is hypothesis driven or not, though Brooks [21] questions whether this distinction exists in the literature. Brooks [21] instead proposes that a better distinction between the EDM and LA communities lies in their origins: authorship in the EDM community is dominated by researchers coming from intelligent tutoring paradigms, while learning analytics researchers have focused more on enterprise learning systems (e.g. learning content management systems).

Regardless of the differences between the LA and EDM communities, the two areas have significant overlap both in the objectives of investigators and in the methods and techniques used in the investigation. In the MS program in learning analytics at Teachers College, Columbia University, students are taught both EDM and LA methods. [22]

Historical contributions

Learning Analytics, as a field, has multiple disciplinary roots. While the fields of artificial intelligence (AI), statistical analysis, machine learning, and business intelligence offer an additional narrative, the main historical roots of analytics are those directly related to human interaction and the education system. [5] More particularly, the history of Learning Analytics is tightly linked to the development of four social science fields that have converged over time. These fields have pursued, and still pursue, four goals:

  1. Definition of the learner, addressing the need to define and understand a learner.
  2. Knowledge tracing, addressing how to trace or map the knowledge built during the learning process.
  3. Learning efficiency and personalization, which refers to how to make learning more efficient and personal by means of technology.
  4. Learner–content comparison, in order to improve learning by comparing the learner's level of knowledge with the content that needs to be mastered. [5] (Siemens, George (2013-03-17). Intro to Learning Analytics. LAK13 open online course for University of Texas at Austin & Edx. 11 minutes in. Retrieved 2018-11-01.)

A diversity of disciplines and research activities have influenced these four aspects over the last decades, contributing to the gradual development of learning analytics. Some of the most influential disciplines are Social Network Analysis, User Modelling, Cognitive Modelling, Data Mining and E-Learning. The history of Learning Analytics can be understood through the rise and development of these fields. [5]

Social Network Analysis

Social network analysis (SNA) is the process of investigating social structures through the use of networks and graph theory. [23] It characterizes networked structures in terms of nodes (individual actors, people, or things within the network) and the ties, edges, or links (relationships or interactions) that connect them.[ citation needed ] Social network analysis is prominent in sociology, and its development has had a key role in the emergence of Learning Analytics. One of the first attempts to provide a deeper understanding of interactions came from the Austrian-American sociologist Paul Lazarsfeld. In 1944, Lazarsfeld framed the question of "who talks to whom about what and to what effect". [24] That question still defines the area of interest of social network analysis, which tries to understand how people are connected and what insights can be derived from their interactions, a core idea of Learning Analytics. [5]
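As a minimal, illustrative sketch of this idea (the reply data and participant names below are invented, and the networkx library is used only as one convenient option), forum interactions can be represented as a graph and summarized with a standard metric such as degree centrality:

```python
# Minimal, illustrative sketch (hypothetical data): modelling forum replies
# as a graph and asking which participants are most connected.
import networkx as nx

# Each pair (a, b) means "student a replied to student b" in a discussion forum.
replies = [("ana", "ben"), ("ana", "chen"), ("ben", "chen"),
           ("dia", "ana"), ("dia", "ben"), ("eli", "dia")]

G = nx.Graph()                     # nodes = learners, edges = interactions
G.add_edges_from(replies)

# Degree centrality: the fraction of other learners each learner is connected to.
centrality = nx.degree_centrality(G)
for learner, score in sorted(centrality.items(), key=lambda kv: -kv[1]):
    print(f"{learner}: {score:.2f}")
```

Tools such as SNAPP (discussed below) apply the same basic representation, nodes for learners and edges for observed interactions, to data captured from learning management systems.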

Citation analysis

American linguist Eugene Garfield was an early pioneer of analytics in science. In 1955, Garfield led the first attempt to analyse the structure of science by tracking the associations (citations) between articles: how they reference one another, the importance of the resources they include, citation frequency, and so on. By tracking citations, scientists can observe how research is disseminated and validated. This was the basic idea behind what eventually became "page rank", which in the early days of Google (at the beginning of the 21st century) was one of the key ways of understanding the structure of a field by looking at page connections and the importance of those connections. The PageRank algorithm, the first search algorithm used by Google, was based on this principle. [25] [26] American computer scientist Larry Page, Google's co-founder, defined PageRank as "an approximation of the importance" of a particular resource. [27] Educationally, citation or link analysis is important for mapping knowledge domains. [5]
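The underlying recurrence can be illustrated with a short, self-contained power-iteration sketch over a hypothetical citation graph; this follows the published PageRank idea in simplified form and is not Google's production implementation:

```python
# Illustrative power iteration for the PageRank principle on a hypothetical
# citation graph: each paper's score is (1 - d)/N plus d times the scores
# of the papers citing it, divided by how many references those papers make.
links = {                      # paper -> papers it cites
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
    "D": ["C", "A"],
}
d = 0.85                       # damping factor
papers = list(links)
rank = {p: 1.0 / len(papers) for p in papers}

for _ in range(50):            # iterate until the scores stabilise
    new_rank = {}
    for p in papers:
        incoming = sum(rank[q] / len(links[q]) for q in papers if p in links[q])
        new_rank[p] = (1 - d) / len(papers) + d * incoming
    rank = new_rank

print(rank)                    # frequently cited papers accumulate higher rank
```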

The essential idea behind these efforts is the realization that, as data grows, individuals, researchers and business analysts need to understand how to track the underlying patterns in the data and how to gain insight from them. This is also a core idea in Learning Analytics. [5]

Digitalization of Social network analysis

During the early 1970s, pushed by the rapid evolution of technology, social network analysis transitioned into the analysis of networks in digital settings. [5] Key contributions to this development include:

  1. Milgram's six degrees experiment. In 1967, American social psychologist Stanley Milgram and other researchers examined the average path length for social networks of people in the United States, suggesting that human society is a small-world-type network characterized by short path lengths. [28]
  2. Weak ties. American sociologist Mark Granovetter's work on the strength of weak ties; his 1973 article "The Strength of Weak Ties" is one of the most influential and most cited articles in the social sciences. [29]
  3. Networked individualism. Towards the end of the 20th century, sociologist Barry Wellman's research contributed extensively to the theory of social network analysis. In particular, Wellman observed and described the rise of "networked individualism", the transformation from group-based networks to individualized networks. [30] [31] [32]


During the first decade of the 21st century, Professor Caroline Haythornthwaite explored the impact of media type on the development of social ties, observing that human interactions can be analyzed to gain novel insight not from strong ties (i.e. people who are strongly connected to the subject) but, rather, from weak ties. This provides Learning Analytics with a central idea: apparently unrelated data may hide crucial information. As an example of this phenomenon, an individual looking for a job has a better chance of finding new information through weak connections than through strong ones. [33] (Siemens, George (2013-03-17). Intro to Learning Analytics. LAK13 open online course for University of Texas at Austin & Edx. 11 minutes in. Retrieved 2018-11-01.)

Her research also focused on the way that different types of media can affect the formation of networks, and it contributed greatly to the development of social network analysis as a field. Learning Analytics inherited important ideas from this work, such as the use of a range of metrics and approaches to define the importance of a particular node, the value of information exchange, the way clusters are connected to one another, and the structural gaps that might exist within those networks. [5]

The application of social network analysis in digital learning settings was pioneered by Professor Shane P. Dawson. He has developed a number of software tools, such as Social Networks Adapting Pedagogical Practice (SNAPP), for evaluating the networks that form in learning management systems when students engage in forum discussions. [34]

User modelling

The main goal of user modelling is the customization and adaptation of systems to the user's specific needs, especially in their interaction with computing systems. The importance of computers being able to respond to people as individuals began to be understood in the 1970s. In 1979, Elaine Rich predicted that "computers are going to treat their users as individuals with distinct personalities, goals, and so forth". [35] This is a central idea not only in education but also in general web use, in which personalization is an important goal. [5]

User modelling has become important in research on human-computer interaction, as it helps researchers to design better systems by understanding how users interact with software. [36] Recognizing the unique traits, goals, and motivations of individuals remains an important activity in learning analytics. [5]

Personalization and adaptation of learning content is an important present and future direction of the learning sciences, and its history within education has contributed to the development of learning analytics. [5] Hypermedia is a nonlinear medium of information that includes graphics, audio, video, plain text and hyperlinks. The term was first used in a 1965 article by the American sociologist Ted Nelson. [37] Adaptive hypermedia builds on user modelling by increasing the personalization of content and interaction. In particular, adaptive hypermedia systems build a model of the goals, preferences and knowledge of each user, in order to adapt to the needs of that user. From the end of the 20th century onwards, the field grew rapidly, mainly because the internet boosted research into adaptivity and because of the accumulation and consolidation of research experience in the field. In turn, Learning Analytics has been influenced by this strong development. [38]
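As a hedged sketch of how a user model can drive adaptation (the topics, thresholds and content variants below are invented for illustration and do not come from any particular adaptive hypermedia system), the model can be as simple as an estimate of mastery per topic that the system consults when choosing which content variant and links to present:

```python
# Hypothetical sketch of the adaptive-hypermedia idea: a per-topic user model
# drives which version of a page and which follow-up links a learner sees.
from dataclasses import dataclass, field

@dataclass
class UserModel:
    # Estimated mastery per topic, in [0.0, 1.0]; updated as the learner works.
    knowledge: dict = field(default_factory=dict)

    def mastery(self, topic: str) -> float:
        return self.knowledge.get(topic, 0.0)

def select_content(user: UserModel, topic: str) -> dict:
    """Pick a content variant and next links based on the user model."""
    m = user.mastery(topic)
    if m < 0.4:
        variant, links = "introductory explanation", ["prerequisites", "worked example"]
    elif m < 0.8:
        variant, links = "standard explanation", ["practice problems"]
    else:
        variant, links = "advanced material", ["extension topics"]
    return {"topic": topic, "variant": variant, "suggested_links": links}

learner = UserModel(knowledge={"fractions": 0.65})
print(select_content(learner, "fractions"))
# -> {'topic': 'fractions', 'variant': 'standard explanation', 'suggested_links': ['practice problems']}
```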

Education/cognitive modelling

Education/cognitive modelling has been applied to tracing how learners develop knowledge. Computers have been used as learning tools in education for decades, particularly since the end of the 1980s and the early 1990s. In 1989, Hugh Burns argued for the adoption and development of intelligent tutoring systems that would ultimately pass three levels of "intelligence": domain knowledge, learner knowledge evaluation, and pedagogical intervention. During the 21st century, these three levels have remained relevant for researchers and educators. [39]

In the 1990s, academic activity around cognitive models focused on developing systems that possess a computational model capable of solving the problems given to students in the ways students are expected to solve them. [40] Cognitive modelling has contributed to the rise in popularity of intelligent or cognitive tutors: once cognitive processes can be modelled, software (tutors) can be developed to support learners in the learning process. The research base in this field eventually became significantly relevant to learning analytics during the 21st century. [5] [41] [42]
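One common technique from this intelligent tutoring tradition for tracing learner knowledge is Bayesian knowledge tracing; the sketch below is illustrative only, with invented parameter values, and is not taken from the systems cited above:

```python
# Illustrative Bayesian knowledge tracing update (parameter values invented):
# after each observed answer, revise the probability that the skill is mastered.
P_LEARN = 0.15   # chance of learning the skill on each practice opportunity
P_SLIP  = 0.10   # chance a student who knows the skill answers incorrectly
P_GUESS = 0.20   # chance a student who does not know the skill answers correctly

def update_mastery(p_mastery: float, correct: bool) -> float:
    """Return P(skill mastered) after observing one answer."""
    if correct:
        evidence = p_mastery * (1 - P_SLIP)
        posterior = evidence / (evidence + (1 - p_mastery) * P_GUESS)
    else:
        evidence = p_mastery * P_SLIP
        posterior = evidence / (evidence + (1 - p_mastery) * (1 - P_GUESS))
    # The student may also have learned the skill during this opportunity.
    return posterior + (1 - posterior) * P_LEARN

p = 0.3  # prior probability of mastery
for answer in [True, False, True, True]:
    p = update_mastery(p, answer)
    print(f"answer correct={answer}, estimated mastery={p:.2f}")
```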


Epistemic Frame Theory

As big data analytics has become more widely applied in education, Wise and Shaffer [43] have addressed the importance of theory-based approaches to the analysis. Epistemic Frame Theory conceptualizes the "ways of thinking, acting, and being in the world" in a collaborative learning environment. Specifically, the framework is based on the context of a Community of Practice (CoP): a group of learners with common goals, standards, and prior knowledge and skills, working together to solve a complex problem. Given this framing, it is important to study the connections between elements (learners, knowledge, concepts, skills and so on). To identify these connections, the co-occurrences of elements in learners' data are identified and analyzed.
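A minimal sketch of the co-occurrence step is shown below; the coded utterances are invented, and full Epistemic Network Analysis involves additional choices (segmentation windows, weighting, and dimensional reduction) beyond this simple counting:

```python
# Illustrative co-occurrence counting over coded learner utterances
# (invented data); each utterance is tagged with the elements it exhibits.
from itertools import combinations
from collections import Counter

coded_utterances = [
    {"data", "justification"},
    {"data", "design", "justification"},
    {"design", "collaboration"},
    {"data", "design"},
]

cooccurrence = Counter()
for codes in coded_utterances:
    for pair in combinations(sorted(codes), 2):   # unordered pairs of codes
        cooccurrence[pair] += 1

for (a, b), count in cooccurrence.most_common():
    print(f"{a} & {b}: {count}")
```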

Shaffer and Ruis [44] introduced the concept of closing the interpretive loop, emphasizing the transparency and validation of the model, the interpretation, and the original data. The loop can be closed by a theoretically sound analytics approach such as Epistemic Network Analysis.

Other contributions

In a discussion of the history of analytics, Adam Cooper highlights a number of communities from which learning analytics has drawn techniques, mainly during the first decades of the 21st century, including: [45]

  1. Statistics, a well-established means of addressing hypothesis testing.
  2. Business intelligence, which has similarities with learning analytics, although it has historically been targeted at making the production of reports more efficient through enabling data access and summarising performance indicators.
  3. Web analytics: tools such as Google Analytics report on web page visits and references to websites, brands and other key terms across the internet. The finer-grained of these techniques can be adopted in learning analytics to explore student trajectories through learning resources (courses, materials, etc.).
  4. Operational research, which aims at design optimisation for maximising objectives through the use of mathematical models and statistical methods. Such techniques are implicated in learning analytics efforts that seek to create models of real-world behaviour for practical application.
  5. Artificial intelligence methods (combined with machine learning techniques built on data mining), which are capable of detecting patterns in data. In learning analytics such techniques can be used for intelligent tutoring systems, classification of students in more dynamic ways than simple demographic factors, and resources such as "suggested course" systems modelled on collaborative filtering techniques (see the sketch after this list).
  6. Information visualization, which is an important step in many analytics for sensemaking around the data provided, and is used across most techniques (including those above). [45]
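As a minimal sketch of the collaborative-filtering idea mentioned in point 5 (the learners, courses and enrolment data are hypothetical), a "suggested course" system can rank courses taken by learners whose enrolment history overlaps with the target learner's:

```python
# Illustrative user-based collaborative filtering for course suggestions
# (hypothetical enrolment data): recommend courses taken by similar learners.
enrolments = {
    "lee":   {"stats101", "python101", "ml201"},
    "mira":  {"stats101", "python101", "viz110"},
    "sam":   {"python101", "viz110"},
    "tanya": {"stats101", "ml201", "ethics120"},
}

def jaccard(a: set, b: set) -> float:
    return len(a & b) / len(a | b) if a | b else 0.0

def suggest(target: str, k: int = 2) -> list:
    taken = enrolments[target]
    # Score candidate courses by the similarity of the learners who took them.
    scores = {}
    for other, courses in enrolments.items():
        if other == target:
            continue
        sim = jaccard(taken, courses)
        for course in courses - taken:
            scores[course] = scores.get(course, 0.0) + sim
    return sorted(scores, key=scores.get, reverse=True)[:k]

print(suggest("sam"))   # e.g. ['stats101', 'ml201'], depending on the overlaps
```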


Learning analytics programs

The first graduate program focused specifically on learning analytics was created by Ryan S. Baker and launched in the Fall 2015 semester at Teachers College, Columbia University. The program description states that

"(...)data about learning and learners are being generated today on an unprecedented scale. The fields of learning analytics (LA) and educational data mining (EDM) have emerged with the aim of transforming this data into new insights that can benefit students, teachers, and administrators. As one of world's leading teaching and research institutions in education, psychology, and health, we are proud to offer an innovative graduate curriculum dedicated to improving education through technology and data analysis." [46]


Master's programs are now offered at several other universities as well, including the University of Texas at Arlington, the University of Wisconsin, and the University of Pennsylvania.

Analytic methods

Methods for learning analytics include:

Applications

Learning analytics can be, and has been, applied in a wide range of contexts.

General purposes

Analytics have been used for:

Benefits for stakeholders

There is broad awareness of analytics across educational institutions and their various stakeholders, [12] but the way learning analytics is defined and implemented may vary, including: [15]

  1. for individual learners to reflect on their achievements and patterns of behaviour in relation to others. In particular, the following areas can be measured, monitored, analyzed and changed to optimize student performance: [50]
    1. Monitoring individual student performance
    2. Disaggregating student performance by selected characteristics such as major, year of study, ethnicity, etc.
    3. Identifying outliers for early intervention
    4. Predicting potential so that all students achieve optimally
    5. Preventing attrition from a course or program
    6. Identifying and developing effective instructional techniques
    7. Analyzing standard assessment techniques and instruments (i.e. departmental and licensing exams)
    8. Testing and evaluation of curricula. [50]
  2. as predictors of students requiring extra support and attention;
  3. to help teachers and support staff plan supporting interventions with individuals and groups;
  4. for functional groups such as course teams seeking to improve current courses or develop new curriculum offerings; and
  5. for institutional administrators taking decisions on matters such as marketing and recruitment or efficiency and effectiveness measures. [15]

Some motivations for and implementations of analytics may come into conflict with one another, for example analytics for individual learners versus analytics for organisational stakeholders. [15]

Software

Much of the software that is currently used for learning analytics duplicates functionality of web analytics software, but applies it to learner interactions with content. Social network analysis tools are commonly used to map social connections and discussions. Some examples of learning analytics software tools include:

Ethics and privacy

The ethics of data collection, analytics, reporting and accountability has been raised as a potential concern for learning analytics, [11] [59] [60] with particular concerns regarding:

As Kay, Korn and Oppenheim point out, the range of data is wide, potentially derived from: [62]

The legal and ethical situation is therefore challenging and differs from country to country, raising implications for: [62]

In some prominent cases, such as the inBloom disaster, [63] even fully functional systems have been shut down due to a lack of trust in the data collection by governments, stakeholders and civil rights groups. Since then, the learning analytics community has extensively studied the legal conditions in a series of expert workshops on "Ethics & Privacy 4 Learning Analytics" that underpin the use of trusted learning analytics. [64] [ non-primary source needed ] Drachsler and Greller released an eight-point checklist named DELICATE, based on the intensive studies in this area, to demystify the ethics and privacy discussions around learning analytics. [65]

  1. D-etermination: Decide on the purpose of learning analytics for your institution.
  2. E-xplain: Define the scope of data collection and usage.
  3. L-egitimate: Explain how you operate within the legal frameworks, refer to the essential legislation.
  4. I-nvolve: Talk to stakeholders and give assurances about the data distribution and use.
  5. C-onsent: Seek consent through clear consent questions.
  6. A-nonymise: De-identify individuals as much as possible.
  7. T-echnical aspects: Monitor who has access to data, especially in areas with high staff turnover.
  8. E-xternal partners: Make sure external partners provide the highest data security standards.

The checklist shows ways to design and provide privacy-compliant learning analytics that can benefit all stakeholders. The full DELICATE checklist is publicly available. [66]

Students' privacy management practices show discrepancies between their privacy beliefs and their privacy-related actions. [67] Learning analytics systems can have default settings that allow the collection of student data unless students choose to opt out. [67] Some online education systems, such as edX or Coursera, do not offer a choice to opt out of data collection. [67] In order for certain learning analytics to function properly, these systems use cookies to collect data. [67]

Open learning analytics

In 2012, a systematic overview of learning analytics and its key concepts was provided by Professor Mohamed Chatti and colleagues through a reference model based on four dimensions, namely:

  1. What? What data does the system gather, manage, and use for the analysis?
  2. Who? Who is targeted by the analysis?
  3. Why? Why does the system analyse the collected data?
  4. How? How does the system perform the analysis of the collected data?

Chatti, Muslim and Schroeder [70] note that the aim of open learning analytics (OLA) is to improve learning effectiveness in lifelong learning environments. The authors refer to OLA as an ongoing analytics process that encompasses diversity at all four dimensions of the learning analytics reference model. [68]

See also

Further reading

For general audience introductions, see:



References

  1. "Call for Papers of the 1st International Conference on Learning Analytics & Knowledge (LAK 2011)" . Retrieved 12 February 2014.
  2. Andrews, R.; Haythornthwaite, Caroline (2007). Handbook of e-learning research. London, UK: Sage.
  3. Anderson, T. (2008). The theory and practice of online learning. Athabasca, Canada: Athabasca University Press.
  4. Haythornthwaite, Caroline; Andrews, R. (2011). E-learning theory and practice . London, UK: Sage.
  5. Siemens, George (2013-08-20). "Learning Analytics: The Emergence of a Discipline". American Behavioral Scientist. 57 (10): 1380–1400. doi:10.1177/0002764213498851. ISSN 0002-7642. S2CID 145692984.
  6. Siemens, G., Connectivism: A learning theory for the digital age, International Journal of Instructional Technology and Distance Learning 2 (10), 2005.
  7. George., Siemens (2006). Knowing knowledge. [Place of publication not identified]: [publisher not identified]. ISBN   978-1-4303-0230-8. OCLC   123536429.
  8. Siemens, George. "What Are Learning Analytics?" Elearnspace, August 25, 2010. Archived 2018-06-28 at the Wayback Machine
  9. "I somewhat disagree with this definition—it serves well as an introductory concept if we use analytics as a support structure for existing education models. I think learning analytics—at an advanced and integrated implementation—can do away with pre-fab curriculum models." George Siemens in the Learning Analytics Google Group discussion, August 2010 Archived 2020-05-17 at the Wayback Machine
  10. "In the descriptions of learning analytics we talk about using data to "predict success". I've struggled with that as I pore over our databases. I've come to realize there are different views/levels of success." Mike Sharkey, Director of Academic Analytics, University of Phoenix, in the Learning Analytics Google Group discussion, August 2010 [ permanent dead link ]
  11. Greller, Wolfgang; Drachsler, Hendrik (2012). "Translating Learning into Numbers: Toward a Generic Framework for Learning Analytics" (PDF). Educational Technology and Society. 15 (3): 42–57. S2CID 1152401. Archived from the original (PDF) on 2019-01-11. Retrieved 2018-11-01.
  12. Picciano, Anthony G. (2012). "The Evolution of Big Data and Learning Analytics in American Higher Education" (PDF). Journal of Asynchronous Learning Networks. 16 (3): 9–20. doi:10.24059/olj.v16i3.267. S2CID 60700161.
  13. Elias, Tanya (January 2011). "Learning Analytics: Definitions, Processes and Potential" (PDF). Unpublished Paper: 19. S2CID   16906479. Archived from the original (PDF) on 2019-01-11. Retrieved 2018-11-02.
  14. Cooper, Adam (November 2012). "What is Analytics? Definition and Essential Characteristics" (PDF). The University of Bolton. ISSN   2051-9214. S2CID   14382238. Archived from the original (PDF) on 2019-01-11. Retrieved 2018-11-01.
  15. Powell, Stephen, and Sheila MacNeill. Institutional Readiness for Analytics: A Briefing Paper. CETIS Analytics Series. JISC CETIS, December 2012. Archived from the original (PDF) on 2013-05-02. Retrieved 2018-11-01.
  16. Johnson, Larry; Adams Becker, Samantha; Cummins, Michele (2016). NMC Horizon Report: 2016 Higher Education Edition (PDF). Austin, Texas, USA. p. 38. ISBN 978-0-9968527-5-3. Retrieved 2018-10-30.
  17. Gašević, D.; Kovanović, V.; Joksimović, S. (2017). "Piecing the learning analytics puzzle: a consolidated model of a field of research and practice". Learning: Research and Practice. 3 (1): 63–78. doi:10.1080/23735082.2017.1286142. hdl: 20.500.11820/66801038-daf6-4065-b4a0-54848ad373ab . S2CID   115009983.
  18. Gašević, D.; Dawson, S.; Siemens, G. (2015). "Let's not forget: Learning analytics are about learning" (PDF). TechTrends. 59 (1): 64–71. doi:10.1007/s11528-014-0822-x. hdl:20.500.11820/037bd57b-858f-4d21-bd29-2c6ad4788b42. S2CID   60547215.
  19. G. Siemens, D. Gasevic, C. Haythornthwaite, S. Dawson, S. B. Shum, R. Ferguson, E. Duval, K. Verbert, and R. S. J. D. Baker. Open Learning Analytics: an integrated & modularized platform. 2011.
  20. Baepler, P.; Murdoch, C. J. (2010). "Academic Analytics and Data Mining in Higher Education". International Journal for the Scholarship of Teaching and Learning. 4 (2). doi: 10.20429/ijsotl.2010.040217 .
  21. C. Brooks. A Data-Assisted Approach to Supporting Instructional Interventions in Technology Enhanced Learning Environments. PhD Dissertation. University of Saskatchewan, Saskatoon, Canada, 2012.
  22. "Learning Analytics | Teachers College Columbia University". www.tc.columbia.edu. Retrieved 2015-10-13.
  23. Otte, Evelien; Rousseau, Ronald (2002). "Social network analysis: a powerful strategy, also for the information sciences". Journal of Information Science. 28 (6): 441–453. doi:10.1177/016555150202800601. S2CID   17454166.
  24. Lazarsfeld, Paul F. (January 1944). "The Election Is Over". Public Opinion Quarterly. 8 (3): 317. doi:10.1086/265692.
  25. Sullivan, Danny (2007-04-26). "What Is Google PageRank? A Guide For Searchers & Webmasters". Search Engine Land. Archived from the original on 2016-07-03.
  26. Cutts, Matt. "Algorithms Rank Relevant Results Higher". Archived from the original on July 2, 2013. Retrieved 19 October 2015.
  27. Page, Lawrence; Brin, Sergey; Motwani, Rajeev; Winograd, Terry (1999). "The PageRank Citation Ranking: Bringing Order to the Web" (PDF). Stanford InfoLab. Archived from the original (PDF) on 2020-03-10. Retrieved 2019-01-11.
  28. Milgram, Stanley (May 1967). "The Small World Problem". Psychology Today.
  29. Granovetter, Mark S. (May 1973). "The Strength of Weak Ties" (PDF). The American Journal of Sociology. 78 (6): 1360–1380. doi:10.1086/225469. JSTOR   2776392. S2CID   59578641.
  30. Wellman, Barry, ed. (1999). Networks in the global village: life in contemporary communities. Boulder, Colo: Westview Press. ISBN   978-0-8133-1150-0. OCLC   39498470.
  31. Wellman, Barry; Hampton, Keith (November 1999). "Living Networked in a Wired World" (PDF). Contemporary Sociology. 28 (6). doi:10.2307/2655535. JSTOR   2655535. S2CID   147025574 . Retrieved 2018-11-02.
  32. Barry Wellman, "Physical Place and Cyber Place: The Rise of Networked Individualism." International Journal of Urban and Regional Research 25,2 (June, 2001): 227-52.
  33. Haythornthwaite, Caroline; Andrews, Richard (2011). E-learning theory and practice . London, UK: Sage. doi:10.4135/9781446288566. ISBN   978-1-84920-471-2.
  34. Dawson, Shane. (2010). "'Seeing' the learning community: An exploration of the development of a resource for monitoring online student networking" (pdf). British Journal of Educational Technology. 41 (5): 736–752. doi:10.1111/j.1467-8535.2009.00970.x.
  35. Rich, Elaine (1979). "User modeling via stereotypes" (PDF). Cognitive Science. 3 (4): 329–354. doi: 10.1207/s15516709cog0304_3 .
  36. Fischer, Gerhard (2001). "User Modeling in Human^Computer Interaction". User Modeling and User-Adapted Interaction. 11: 65–86. doi: 10.1023/A:1011145532042 .
  37. Nelson, T. H. (1965-08-24). "Complex information processing: A file structure for the complex, the changing and the indeterminate". Proceedings of the 1965 20th national conference. ACM. pp. 84–100. doi:10.1145/800197.806036. ISBN   9781450374958. S2CID   2556127.
  38. Brusilovsky, Peter (2001). "Adaptive Hypermedia". User Modeling and User-Adapted Interaction. 11 (1/2): 87–110. doi: 10.1023/a:1011143116306 . ISSN   0924-1868.
  39. Burns, Hugh (1989). Richardson, J. Jeffrey; Polson, Martha C. (eds.). "Foundations of intelligent tutoring systems: An introduction" (PDF). Proceedings of the Air Force Forum for Intelligent Tutoring Systems.
  40. Anderson, John R.; Corbett, Albert T.; Koedinger, Kenneth R.; Pelletier, Ray (1995). "Cognitive tutors: Lessons learned" (PDF). Journal of the Learning Sciences. 4 (2): 167–207. doi:10.1207/s15327809jls0402_2. S2CID   22377178.
  41. Koedinger, Kenneth; Osborne, David; Gaebler, Ted (2018). Forbus, K.D.; Feltovich, P.J. (eds.). "Intelligent Cognitive Tutors as Modeling Tool and Instructional Model" (PDF). Smart Machines in Education: The Coming Revolution in Educational Technology: 145–168.
  42. Koedinger, Kenneth (2003). "Toward a Rapid Development Environment for Cognitive Tutors Interactive Event during AIED-03" (PDF). Artificial Intelligent in Education.
  43. Shaffer, David Williamson; Collier, Wesley; Ruis, A. R. (2016). "A Tutorial on Epistemic Network Analysis: Analyzing the Structure of Connections in Cognitive, Social, and Interaction Data". Journal of Learning Analytics. 3 (3): 9–45. doi: 10.18608/jla.2016.33.3 . ISSN   1929-7750.
  44. Shaffer, David Williamson; Ruis, A. R. (2017), "Epistemic Network Analysis: A Worked Example of Theory-Based Learning Analytics", Handbook of Learning Analytics, Society for Learning Analytics Research (SoLAR), pp. 175–187, doi: 10.18608/hla17.015 , ISBN   9780995240803
  45. Cooper, Adam. A Brief History of Analytics: A Briefing Paper. CETIS Analytics Series. JISC CETIS, November 2012. http://publications.cetis.ac.uk/wp-content/uploads/2012/12/Analytics-Brief-History-Vol-1-No9.pdf.
  46. "Learning Analytics". www.tc.columbia.edu. Retrieved 2015-11-03.
  47. Buckingham Shum, S. and Ferguson, R., Social Learning Analytics. Educational Technology & Society (Special Issue on Learning & Knowledge Analytics, Eds. G. Siemens & D. Gašević), 15, 3, (2012), 3-26. Open Access Eprint: http://oro.open.ac.uk/34092
  48. Brown, M., Learning Analytics: Moving from Concept to Practice. EDUCAUSE Learning Initiative Briefing, 2012. http://www.educause.edu/library/resources/learning-analytics-moving-concept-practice
  49. Buckingham Shum, S. and Deakin Crick, R., Learning Dispositions and Transferable Competencies: Pedagogy, Modelling and Learning Analytics. In: Proc. 2nd International Conference on Learning Analytics & Knowledge (Vancouver, 29 Apr-2 May 2012). ACM: New York. pp.92-101. doi : 10.1145/2330601.2330629 Eprint: http://oro.open.ac.uk/32823
  50. 1 2 "Analytics for Achievement" (PDF). Ibm, S.a.: 4. February 2011. Retrieved 2018-11-01.
  51. "Archived copy". Archived from the original on 2013-11-10. Retrieved 2013-11-19.{{cite web}}: CS1 maint: archived copy as title (link)
  52. Ali, L.; Hatala, M.; Gaševic, D.; Jovanovic, J. (2012). "A qualitative evaluation of evolution of a learning analytics tool". Computers & Education. 58 (1): 470–489. CiteSeerX   10.1.1.462.4375 . doi:10.1016/j.compedu.2011.08.030.
  53. Ali, L.; Asadi, M.; Gaševic, D.; Jovanovic, J.; Hatala, M. (2013). "Factors influencing beliefs for adoption of a learning analytics tool: An empirical study" (PDF). Computers & Education. 62: 130–148. doi:10.1016/j.compedu.2012.10.023.
  54. "Billets pour le parc d'attraction disneyland Paris". Archived from the original on 2017-04-15. Retrieved 2011-11-27.
  55. "Social Networks in Action – Learning Networks @ UOW". Archived from the original on 2012-03-21.
  56. "Homepage".
  57. "Brightspace Performance Plus for Higher Education | Learning Analytics Features | Brightspace by D2L".
  58. Arastoopour, Golnaz; Chesler, Naomi; Shaffer, David; Swiecki, Zachari (2015). "Epistemic Network Analysis as a Tool for Engineering Design Assessment". 2015 ASEE Annual Conference & Exposition Proceedings. ASEE Conferences: 26.679.1–26.679.19. doi: 10.18260/p.24016 .
  59. Slade, Sharon and Prinsloo, Paul "Learning analytics: ethical issues and dilemmas" in American Behavioral Scientist (2013), 57(10), pp. 1509–1528. http://oro.open.ac.uk/36594
  60. Siemens, G. "Learning Analytics: Envisioning a Research Discipline and a Domain of Practice." In Proceedings of the 2nd International Conference on Learning Analytics and Knowledge, 4–8, 2012. http://dl.acm.org/citation.cfm?id=2330605.
  61. Kristy Kitto, Towards a Manifesto for Data Ownership http://www.laceproject.eu/blog/towards-a-manifesto-for-data-ownership/
  62. "Privacy Fears Over Student Data Tracking Lead to InBloom's Shutdown". Bloomberg.com. 2014-05-01. Retrieved 2020-10-05.
  63. "Ethics and Privacy in Learning Analytics (#EP4LA)".
  64. Drachsler, H. & Greller, W. (2016). Privacy and Analytics – it's a DELICATE issue. A Checklist to establish trusted Learning Analytics. 6th Learning Analytics and Knowledge Conference 2016, April 25–29, 2016, Edinburgh, UK.
  65. "DELICATE checklist – to establish trusted Learning Analytics". 2016-01-25.
  66. Prinsloo, Paul; Slade, Sharon (16 March 2015). "Student privacy self-management: Implications for learning analytics". Proceedings of the Fifth International Conference on Learning Analytics and Knowledge. pp. 83–92. doi:10.1145/2723576.2723585. ISBN 9781450334174. S2CID 1802559. Retrieved 2020-07-05.
  67. Mohamed Amine Chatti, Anna Lea Dyckhoff, Ulrik Schroeder and Hendrik Thüs (2012). A reference model for learning analytics. International Journal of Technology Enhanced Learning (IJTEL), 4(5/6), pp. 318-331.
  68. Chatti, M. A., Lukarov, V., Thüs, H., Muslim, A., Yousef, A. M. F., Wahid, U., Greven, C., Chakrabarti, A., Schroeder, U. (2014). Learning Analytics: Challenges and Future Research Directions. eleed, Iss. 10. http://eleed.campussource.de/archive/10/4035
  69. Mohamed Amine Chatti, Arham Muslim, and Ulrik Schroeder (2017). Toward an Open Learning Analytics Ecosystem. In Big Data and Learning Analytics in Higher Education (pp. 195-219). Springer International Publishing.
  70. Eli (2011). "Seven Things You Should Know About First Generation Learning Analytics". EDUCAUSE Learning Initiative Briefing.
  71. Long, P.; Siemens, G. (2011). "Penetrating the fog: analytics in learning and education". Educause Review Online. 46 (5): 31–40.
  72. Buckingham Shum, Simon (2012). Learning Analytics Policy Brief (PDF). UNESCO.
  73. Johnson, Larry; Adams Becker, Samantha; Cummins, Michele (2016). NMC Horizon Report: 2016 Higher Education Edition (PDF). Austin, Texas, USA. ISBN 978-0-9968527-5-3. Retrieved 2018-10-28.