…or, am I analyzing when I should be probing?

I’ve recently been reading about the Cynefin framework (watch a video in which Cynefin creator Dave Snowden describes it) and pondering what it might mean for my work as an e-learning administrator.
For the last several years, my institution has participated in a grant-funded program. One of the program’s key precepts is that institutions must be data-driven. We have dutifully collected data, designed interventions, piloted them, and seen improvements in our target indicators. Thinking about Cynefin leads me to consider two possibilities:

1) We are working in a complicated environment. In designing the intervention, our experts correctly identified the cause-and-effect relationship. This provides a good model for future program design.

2) The system we work in is complex. We had a probe that worked well, and we are now trying to amplify it. However, since the system is complex, our next probe may need to be dampened instead.

This matters because if we have a complex system and treat it as complicated, then when a probe doesn’t go well, we conclude that our analysis was wrong or that we followed the advice of the wrong experts. That conclusion makes us less likely to continue experimenting.

Andrew Cerniglia also writes about Cynefin in the classroom.

Of course, picking the right quadrant is the key decision, and it seems quite possible to get it wrong. My tendency is to think that if one were to err here, it would be better to misidentify something as complex when it was merely complicated than to go the other way round.
