-- widged 14:15, 6 April 2012 (CEST)
I will be offline visiting family this weekend, so a short reply for now.
Goals "One is LA overview (edutechwiki). My deep interest is improving write-to-learn/project-oriented edu environments."
Can you elaborate? I am familiar with project-oriented edu environments, but how do you know when such an environment is successful or not?
What are known markers of success? What types of data do you have available, or could you collect?
What do you want to know? For example:
- Be warned as early as possible that some students may be in difficulty (for instance, a student who hasn't connected for two weeks)?
- Quantity of contributions? Quantity of interactions?
- Spread of interactions (talking to all members of the team)?
- Richness (whatever meaning you give to it) of interactions?
- Reveal patterns (are the students who get the best grades the ones who get it right on the first go and do few re-edits, or the other way around)?
- Provide ways to establish the superior efficiency of a new approach?
Some resources from my hard disk that you may or may not have come across yet:
- Academic Analytics -- http://www.educause.edu/ers0508 and http://www.educause.edu/Resources/Browse/AcademicAnalytics/16930
- The Sharing of Wonderful Ideas: Influence and Interaction in Online Communities of Creators (PhD thesis) -- http://www.media.mit.edu/~sylvan/SylvanDissertation2007.pdf
- Learning Analytics: Definitions, Processes and Potential (Master's thesis?) -- http://learninganalytics.net/LearningAnalyticsDefinitionsProcessesPotential.pdf
- Learner Analytics blog -- http://www.learninganalytics.net/
- Learner Analytics -- http://net.educause.edu/ir/library/pdf/NGLC003.pdf
- Performance Analytics in Education -- http://www.educause.edu/EDUCAUSE+Review/EDUCAUSEReviewMagazineVolume43/ActionAnalyticsMeasuringandImp/162422
Not edu-specific, but possibly useful for orientation purposes. They also help you realize that the performance indicators typically collected by web analytics tools are very basic... and of a different order than the ones you would need to evaluate the "learning" that takes place when using web tools.
- Web metrics demystified -- http://www.kaushik.net/avinash/2007/12/web-metrics-demystified.html
- Web analytics maturity model -- http://blog.immeria.net/2008/11/web-analytics-maturity-model-and.html
- How to Choose a Web Analytics Tool: A Radical Alternative -- http://www.kaushik.net/avinash/2006/09/how-to-choose-a-web-analytics-tool-a-radical-alternative.html
- Web analytics demystified -- http://www.kaushik.net/avinash/2007/12/web-analytics-demystified.html
- Best Web Metrics / KPIs for a Small, Medium or Large Sized Business -- http://www.kaushik.net/avinash/best-web-metrics-kpis-small-medium-large-business/
- Six Web Metrics / Key Performance Indicators To Die For -- http://www.kaushik.net/avinash/rules-choosing-web-analytics-key-performance-indicators/
You mentioned LSA. Which paper? LSA is great for exploring semantic similarity. But in the context of learning analytics, it just replaces one black box with another (where a black box is a system too complex to understand).
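To give an idea of the kind of score involved: below is a toy sketch of plain cosine similarity over raw term counts. This is not LSA itself -- LSA would first project these count vectors into a lower-dimensional "latent" space via SVD -- just the comparison step underneath it, with names of my own choosing:

```python
import math
from collections import Counter

def cosine_similarity(text_a, text_b):
    """Cosine of the angle between two raw term-count vectors.
    LSA would first project these vectors into a lower-dimensional
    semantic space via SVD; this sketch skips that step."""
    a = Counter(text_a.lower().split())
    b = Counter(text_b.lower().split())
    shared = set(a) & set(b)
    dot = sum(a[w] * b[w] for w in shared)
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

print(cosine_similarity("students edit the wiki page",
                        "students revise the wiki page"))
```

Because it works on surface words only, "edit" and "revise" contribute nothing to the score here; the SVD step is precisely what lets LSA pick up such semantic relatedness (at the price of interpretability).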
Re: -- Daniel K. Schneider 14:54, 8 April 2012 (CEST)
Thanx a lot :) I am taking some time off too btw.
I mentioned LSA with respect to my aim of understanding what is going on in a wiki, e.g. comparing the differences between various stages of a document. Using simple word lists (with a large stop list) would probably be easier and do a better job.
Re: --widged 21:32, 12 April 2012 (CEST)
Yes, stop words and word lists would give you most of what you want. I have done that before to analyse content from a mailing list, and it worked quite well. You also want to run your words through the Porter stemming algorithm -- http://tartarus.org/~martin/PorterStemmer/. That will reduce similar words to their common morphological root. I made an implementation available on Google Code: http://code.google.com/p/flex-porter-stemmer/
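To make the idea concrete, here is a minimal sketch (in Python) of comparing two stages of a document via stop-word-filtered word counts. The stop list is a toy one, and the crude suffix stripper only stands in for a real Porter stemmer, which handles many more cases; all names are my own:

```python
from collections import Counter

# Toy stop list -- a real one would be much larger.
STOP_WORDS = {"the", "a", "an", "of", "to", "and", "in", "is", "it", "that"}

def crude_stem(word):
    """Very rough suffix stripping; a stand-in for the full
    Porter stemmer, which handles many more cases."""
    for suffix in ("ing", "es", "ed", "e", "s"):
        if word.endswith(suffix) and len(word) - len(suffix) >= 3:
            return word[: -len(suffix)]
    return word

def word_profile(text):
    """Lowercase, strip punctuation, drop stop words, stem, and count."""
    words = [w.strip(".,;:!?()\"'") for w in text.lower().split()]
    return Counter(crude_stem(w) for w in words if w and w not in STOP_WORDS)

def compare_stages(earlier, later):
    """Words whose counts changed between two stages of a document."""
    before, after = word_profile(earlier), word_profile(later)
    return {w: after[w] - before[w]
            for w in set(before) | set(after)
            if after[w] != before[w]}

v1 = "The students edit the page."
v2 = "The students edit and revise the page, revising it again."
print(compare_stages(v1, v2))
```

Note how stemming maps "revise" and "revising" to the same root, so they are counted together; without it they would show up as unrelated words, which is exactly why you want the stemming pass before comparing stages.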
I have limited internet access during the day (secure environment imposed by a client, blah, blah). I will take another look over the weekend.