Talk:Learning analytics



-- widged 14:15, 6 April 2012 (CEST)

Hi Daniel,

I will be offline visiting family this weekend, so just a short reply for now.

Goals "One is LA overview (edutechwiki). My deep interest is improving write-to-learn/project-oriented edu environments."

Can you elaborate? I am familiar with project-oriented edu environments. But how do you know whether such an environment is successful or not?

What are known markers of success? What are all the types of data that you have available or that you could collect?

What do you want to know? For instance:
* Be warned as early as possible that some students may be in difficulty (for instance, haven't connected for 2 weeks)?
* Quantity of contributions?
* Quantity of interactions?
* Spread of interactions (talking to all members of the team)?
* Richness (whatever meaning you give to it) of interactions?
* Reveal patterns (are the students who get the best grades the ones who get it right on the first go and do few re-edits, or the other way around)?
* Provide ways to establish the superior efficiency of a new approach?
A rough sketch of how a few of these indicators could be computed follows this list.
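Not something from this discussion, just an illustration: a minimal Python sketch of how a few of the indicators above could be computed from a hypothetical activity log. The event fields, the two-week threshold and the toy data are all assumptions, not part of any actual system.

```python
# Hypothetical sketch only: the event layout (user, timestamp, action, peer),
# the two-week threshold and the toy data are assumptions, not an actual system.
from collections import defaultdict
from datetime import datetime, timedelta

# (user, timestamp, action, peer) -- peer is set only for interaction events
events = [
    ("alice", datetime(2012, 4, 5, 10, 0), "edit", None),
    ("alice", datetime(2012, 4, 5, 10, 5), "comment", "bob"),
    ("bob",   datetime(2012, 3, 20, 9, 0), "edit", None),
    ("carol", datetime(2012, 4, 1, 14, 0), "comment", "alice"),
]
team = {"alice", "bob", "carol"}
now = datetime(2012, 4, 6)

last_seen = {}
contributions = defaultdict(int)   # quantity of contributions per student
interactions = defaultdict(set)    # who talks to whom

for user, ts, action, peer in events:
    last_seen[user] = max(last_seen.get(user, ts), ts)
    if action == "edit":
        contributions[user] += 1
    if peer is not None:
        interactions[user].add(peer)

# Early warning: no activity for two weeks (threshold is arbitrary here).
at_risk = [u for u in sorted(team)
           if u not in last_seen or now - last_seen[u] > timedelta(weeks=2)]

# Spread of interactions: fraction of teammates each student has interacted with.
spread = {u: len(interactions[u]) / (len(team) - 1) for u in sorted(team)}

print("at risk:", at_risk)
print("contributions:", dict(contributions))
print("interaction spread:", spread)
```

Counts like these say nothing about the richness of the interactions, which is the harder part.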

Some resources on my hard disk that you may or may not have come across yet:


Not edu, but possibly useful for orientation purposes. They also help you realize that the performance indicators typically collected by web analytics tools are very basic... and of a different order than the ones you would need to evaluate the "learning" that takes place when using web tools.

* Web metrics demystified -- http://www.kaushik.net/avinash/2007/12/web-metrics-demystified.html
* Web analytics maturity model -- http://blog.immeria.net/2008/11/web-analytics-maturity-model-and.html
* How to Choose a Web Analytics Tool: A Radical Alternative -- http://www.kaushik.net/avinash/2006/09/how-to-choose-a-web-analytics-tool-a-radical-alternative.html
* Web analytics demystified -- http://www.kaushik.net/avinash/2007/12/web-analytics-demystified.html
* Best Web Metrics / KPIs for a Small, Medium or Large Sized Business -- http://www.kaushik.net/avinash/best-web-metrics-kpis-small-medium-large-business/
* Six Web Metrics / Key Performance Indicators To Die For -- http://www.kaushik.net/avinash/rules-choosing-web-analytics-key-performance-indicators/

You mentioned LSA. What paper? LSA is great for exploring semantic similarity, but in the context of learning analytics that is just about replacing one black box with another (where black box = a system too complex to understand).
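To make the LSA point concrete, here is a minimal sketch of what LSA-style similarity computes: a TF-IDF term-document matrix, reduced by truncated SVD, with cosine similarity taken in the reduced "concept" space. It uses scikit-learn as a stand-in; the toy documents and the component count are arbitrary assumptions, not anything from this discussion.

```python
# Minimal sketch of LSA-style semantic similarity, using scikit-learn as a stand-in.
# The toy documents and n_components=2 are arbitrary assumptions for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

docs = [
    "students revise their wiki project after peer feedback",
    "learners edit the shared wiki, commenting on each other's drafts",
    "page views, bounce rate, average session duration per visitor",
]

tfidf = TfidfVectorizer().fit_transform(docs)        # term-document weight matrix
lsa = TruncatedSVD(n_components=2, random_state=0)   # the "latent semantic" reduction
reduced = lsa.fit_transform(tfidf)                   # documents in concept space

# Docs 0 and 1 should come out far more similar to each other than to doc 2.
print(cosine_similarity(reduced))
```

Whether that reduced concept space helps you understand student writing any better than the raw counts do is exactly the black-box question raised above.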