How do we ensure that the interpretive, critical, historical, and theoretical perspectives of the humanities are included in qualitative and quantitative research on digital learning, which is predominantly carried out in the social sciences and information sciences?



I've been interested in "bridge building" between qualitative and quantitative methods, and one avenue I think is available is through highly visual models that offer interfaces at both the visual and mathematico-operational levels. Today these are nascent forms, but I think they may someday develop into more powerful cross-disciplinary tools. One example of what I'm talking about is STELLA, a systems thinking toolset (it also has siblings, similar toolsets) from the system dynamics literature. The problem with these at the outset is that they are designed for systems engineers rather than humanistic systems thinkers, but I think this will be overcome eventually.
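To make the "mathematico-operational level" concrete: under a STELLA-style visual model sits a simple stock-and-flow computation. The sketch below is an illustrative toy, not any published model; the stock (population), rates, and step count are assumptions chosen for clarity.

```python
# A minimal stock-and-flow sketch in the spirit of system dynamics:
# one stock (population) with an inflow (births) and an outflow
# (deaths), integrated with Euler's method. All names and rates here
# are illustrative assumptions.

def simulate(stock=100.0, birth_rate=0.03, death_rate=0.01,
             dt=1.0, steps=50):
    """Return the stock's trajectory over `steps` time steps."""
    trajectory = [stock]
    for _ in range(steps):
        inflow = birth_rate * stock    # flow into the stock per step
        outflow = death_rate * stock   # flow out of the stock per step
        stock += (inflow - outflow) * dt
        trajectory.append(stock)
    return trajectory

history = simulate()
```

In a visual toolset, the same three elements (stock, two flows) would be drawn as boxes and pipes; the diagram and the equations are two views of one model, which is what makes these tools candidates for bridging work.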

Qualitative software such as NVivo is designed to capture and work with texts and allows some system design, but without a rigorous connection to an underlying computational model. The leverage point for development starting from these tools, I think, is automating some of the textual analysis using methods and tools such as those developed by the ALG group at Illinois.

I'm not a proponent of getting humans out of the analysis business, but of getting computers and the global computational network into a working partnership with us. The interpretive approach in quantitative methods has recently been boosted by Stephen Wolfram through the Wolfram|Alpha site and the Wolfram Demonstrations Project.

Now the odd, or perhaps interesting, thing is that I think all of this is connected to the future of assessing digital media inquiry and expression.