What did I do?
- I skimmed and watched all assigned readings for this week. I found them very diverse, touching on deep technical topics as well as raising ethical dilemmas, and offering existing, inspiring samples and research. This week's readings were also more diverse in format (videos and sites, not just papers), and I found that mix worked better for me.
- I enjoyed the long video on Hadoop, an open-source technology for answering complex questions, not asked up front, against vast, unstructured data sets. It took me back to my days as an IT professional, when clustering (having more than one computer work jointly and act as one big entity) was still in its infancy. Big data indeed needs a different approach. This video also made me stumble upon BigSheets (from my own company IBM, but hey, we do so much, I can't know everything up front).
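The core idea behind Hadoop is the map/reduce pattern: split the work over many machines, then combine the partial results. A toy single-process sketch (the classic word-count example, not any actual Hadoop API) looks like this:

```python
from collections import defaultdict

# Toy illustration of the map/reduce pattern that Hadoop scales out
# across a cluster. Here everything runs in one process; Hadoop's point
# is that map and reduce run in parallel on many machines.

def map_phase(documents):
    # map: emit a (word, 1) pair for every word in every document
    for doc in documents:
        for word in doc.lower().split():
            yield word, 1

def reduce_phase(pairs):
    # shuffle + reduce: group pairs by key and sum the counts
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

docs = ["big data needs a different approach",
        "big clusters crunch big data"]
counts = reduce_phase(map_phase(docs))
print(counts["big"])   # "big" occurs three times across the documents
```

The point of the pattern is that both phases are embarrassingly parallel, which is why it copes with data sets no single machine could hold.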
- Some papers and articles fall into the 'are you scared yet?' category. As the article in the Economist puts it: analytics brings new benefits and new headaches. (In the same article I marked the phrase that data are becoming a new raw material for business.) I remember someone at a reception telling me (one glass in hand) that since governments have kept voting results by district for all elections since the 1950s, analytics could now make a spookily accurate guess at an individual household's political preference. (Really?) The few preview pictures of selected friends on your public Facebook profile page might already give away that you're gay. Etc.
- The WSJ article on what insurance firms are doing may be business as usual in the US, but it scares the hell out of me. Besides that, I noted that we have to fight our gut feeling to draw conclusions and take action on individuals when all we have are statistics on groups of people. With learning analytics and workforce analytics you can make accurate statements about groups, but not about individuals. And that is fine; we are developing and managing teams anyway.
- The Anderson article "The End of Theory" is very contested; you can judge that just by reading some comments under the article. It questions the need for the traditional scientific method now that we can cope with the data itself. While reading it, some alarm bells went off in my head too. I'm not totally opposed to his views, but he limits the use of theoretical models to prediction. Any theoretical model is a reduction and simplification of reality to its relevant essence, and it serves two purposes. The first is that models help us understand reality. The second is that they help us predict what will happen once we figure out the underlying mechanisms (simplified as they are). Throwing lots of data at an algorithm can indeed replace models for prediction (like the example of the wine quality prediction in the Authors@Google talk). But it will not help you understand anything; you still need models for that.
- The same video had interesting examples on matchmaking, and did its best to show that statistical predictions do way better than humans. It also suggests using analytics to find out what works best: just test it out. Not sure which learning intervention works best? Test them both and see. It seems, by the way, that you can't avoid Google in the big data topic; they come back in various articles and videos. Some more quotes I marked: "All models are wrong, and increasingly you can succeed without them." "With enough data, the numbers speak for themselves."
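The 'test them both and see' idea boils down to an A/B test. A minimal sketch of comparing completion rates of two hypothetical interventions with a two-proportion z-test (all learner numbers are invented for illustration):

```python
import math

# Hypothetical A/B test of two learning interventions: compare their
# completion rates with a two-proportion z-test. All figures invented.
def two_proportion_z(success_a, n_a, success_b, n_b):
    p_a, p_b = success_a / n_a, success_b / n_b
    p_pool = (success_a + success_b) / (n_a + n_b)   # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # two-sided p-value from the standard normal distribution
    p_value = math.erfc(abs(z) / math.sqrt(2))
    return z, p_value

# Intervention A: 130 of 200 learners finished; intervention B: 105 of 200
z, p = two_proportion_z(130, 200, 105, 200)
print(f"z = {z:.2f}, p = {p:.4f}")
```

With these made-up numbers the difference comes out statistically significant, which is exactly the kind of evidence a training department rarely collects today.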
- Laughed at the Telegraph article: "Data from the dating site OKCupid.com shows that its users over-report their height by 2 inches and over-report their salary by 20%". Another reason to look at the data instead of surveying people?
- I also participated in the Moodle forum. I'm a bit disappointed no one has entered new ideas for how to apply analytics in corporations. The theory and the definitions are all good and well, but what are we going to do, or dream of doing, with it? Maybe we'll get more in the coming days or weeks as our understanding grows. For me, a key take-away from this course is a list of dreams for the application of learning analytics in the corporate world. While going over some forum posts, I made a few ramblings on Kirkpatrick as well. Normally, I don't post or write down thinking in progress or on stuff that I don't master enough, not on a public internet space anyway. But that's against the purpose of the MOOC format. Just hope it doesn't come to bite me back in the butt some day.
- Another concern I added to the forums was the self-fulfilling-prophecy effect that actionable analytics might have.
- Then I went over to my own company's sites. I'm already familiar with IBM's Smarter Planet strategy, and it has a specific one for education too. Some of the most interesting reading is the report "The New Path To Value" from the IBM Institute for Business Value. Analytically driven corporations outperform the others: top performers are 5.4 times more likely to use analytics over intuition. I wonder if there has ever been a comparable study for education. Do the best educational institutions use more analytics too?
- I tried out the SNAPP tool too, and ran it on the introduction forum of the course. It goes over all forum posts (should be good for the site's hit count) and draws a visual map. Nice visualisation tool, and thanks to George Siemens for pointing me to more tools like it (I found a similar one in my own company). But how do you use the visual graph of the social interaction? You would get the idea that the outsiders are the bad learners, but hey, lurking is learning too.
- Finally I also attended the session on educational data mining and smart tutor systems and their fight against people 'gaming the system'. (It took me a while to get the concept, as for the last years I've been trying to set up learning by gaming, embedding the gaming aspect naturally into the learning process, and these guys are fighting it...) Very interesting talk and samples, except for the definition discussion. I'm just not interested in sharp definitions; I like my definitions intentionally blurry.
- Just a few hours ago, by coincidence, I attended a Centra session of my company's (IBM) Global Learning Community on learning analytics. First there was a nice introduction on how business intelligence and predictive analysis differ (again, the definitions). Business intelligence is about 'what has happened' and 'why', whereas predictive analysis is about 'what will happen' and 'what to do'. Typical learning measurement in the form of Kirkpatrick is business intelligence, about 'what has happened'. Now this group in IBM has built a tool for predictive analysis for learning. It helps out and objectifies the instructional design of learning projects at the very beginning: for several potential learning activities or blends, it predicts what the costs, benefits, ROI and associated risk would be. I found this an inspiring use of predictive analysis to predict the success of a particular training option.
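The kind of comparison such a tool makes can be sketched with a toy ROI calculation. The blend names and every figure below are entirely made up; the real tool is of course far more sophisticated:

```python
# Toy sketch of ranking training blends on predicted ROI.
# All blend names and figures are invented for illustration.
def roi(benefit, cost):
    # classic ROI: net gain relative to cost
    return (benefit - cost) / cost

blends = {
    "classroom only":       {"cost": 100_000, "benefit": 130_000},
    "blended (e-learning)": {"cost": 60_000,  "benefit": 110_000},
    "self-paced online":    {"cost": 30_000,  "benefit": 50_000},
}

for name, b in blends.items():
    print(f"{name}: ROI = {roi(b['benefit'], b['cost']):.0%}")

best = max(blends, key=lambda n: roi(blends[n]["benefit"], blends[n]["cost"]))
print("highest predicted ROI:", best)
```

Even this trivial version shows the shift in mindset: the numbers are estimated before the training runs, instead of measured (Kirkpatrick-style) after the fact.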
What sense did I make of it all?
- Big data requires different technologies and methods, and different languages to query the data. Looks like I'm not only going to brush up my statistical knowledge, but also my database and SQL query knowledge.
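The good news is that big-data query languages such as Hive stay deliberately close to SQL, so the brushing up transfers. A minimal aggregation over a hypothetical learner-activity table, using Python's built-in SQLite as a stand-in:

```python
import sqlite3

# Hypothetical learner-activity data; the GROUP BY aggregation below is
# the kind of query that carries over almost unchanged to big-data SQL
# dialects such as Hive.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE activity (learner TEXT, course TEXT, minutes INT)")
con.executemany("INSERT INTO activity VALUES (?, ?, ?)", [
    ("ann", "stats", 90), ("ann", "sql", 45),
    ("bob", "stats", 30), ("bob", "sql", 120),
])
rows = con.execute(
    "SELECT course, SUM(minutes) FROM activity "
    "GROUP BY course ORDER BY SUM(minutes) DESC"
).fetchall()
print(rows)   # [('sql', 165), ('stats', 120)]
```

The difference in big data land is not the query syntax so much as the engine executing it across a cluster.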
- If you can replace surveying with looking at the real data, you should. (Obviously you need to have legal and moral access to sufficient and good enough data; otherwise surveys are the plan B to go for.) As Dr. House would say: everybody lies. People on dating sites lie, and so do our learners, maybe not even intentionally. It is one of the three 'rules' of analytics that I made up based on three movies; you can find them on walkthe.net. My point is that analytics will start to challenge our baselines, as the 'truth' revealed by the data might conflict with our cherished beliefs, world views, values and rules of thumb.
- Learning analytics on big data only works... well... if you have big data. So, could smaller corporations apply it? Probably only the massive training programs in corporations would qualify. How big does big data need to be?
- The course obviously has a central topic, and its associated definition is already very broad. Yet I find myself constantly at its edges. A lot of discussion and focus goes to applying analytics to do better at the operational core of a training department or educational institute. That's all great. But I see unique potential in applying analytics to move up the food chain: linking the learning black box itself with workforce analytics and business trends to better align training programs with demand, and linking it with performance KPIs and business value. Traditionally, corporate training departments face criticism for not aligning enough with the business and not providing sufficient hard evidence of their impact. Analytics could well be applied to address these shortcomings. I feel that focusing analytics on those areas will bring more value than only looking at the operational side of the learning intervention.
This post is way too long. No way I'm going to spell check this. It's time to watch "De Allerslimste Mens Ter Wereld".
Bert, I enjoyed your comprehensive take on week 2 of LAK11, but your comment on connectivist learning is the one that provoked the most thought.
I too used to believe public posts needed some sort of crafted finality about them. Now I feel my learning improves and becomes more powerful if the messy, transparent thought process becomes just another artefact.
Your comment resonates strongly:
"Normally, I don't post or write down thinking in progress or on stuff that I don't master enough, not on a public internet space anyway. But that's against the purpose of the MOOC format. Just hope it doesn't come to bite me back in the butt some day"
I feel autonomous academia, or maybe just the 'owners of knowing', are equally hesitant in socially sharing emergent thoughts. Smart groups often sound dumb.
Sure, as you say, it may come back to bite, but should we not socially share learning, as ourselves, if those were our then beliefs?
Put it out there to garner comment and encourage debate, not to use it against somebody later in a predictable denouncement or a contested 'first idea ownership' fight. The competing divide between reflective, thoughtful, autonomous analysis (often by defenders of the status quo in monoliths) and agile, responsive, experimental, innovative but unproven or even dangerous ideas (often in corporations) will need to find more common ground.
Maybe analytics can enable that.
Hi Bert! Enjoyed skimming through your reflection and sense-making.
Coincidentally, I attended a talk today by a local university here on "Predictive Analytics using Data Mining". It was quite clear on the difference between data mining and the other 'analytics' concepts we are hearing nowadays, and judging by the clues of predictive analytics, I guess it is applicable to any industry as long as you know what you want from the data!
Thanks for the share.
- Shazz, Kuala Lumpur