Creating the Conditions for Justice Innovation: How (NOT) to Solve Complex Problems

“…now is not a good time for control freaks” – Eric Young

In my last post for Slaw I wrote about the importance of creating the conditions for justice innovation by building the skills needed to work in multidisciplinary teams and collaborate rather than “consult” with justice system users. In this post I want to focus on another important part of creating the conditions for justice innovation, in particular how we might support innovators by rethinking our problem solving approaches and the methods we use to evaluate justice innovation initiatives.

How we evaluate the success (or failure) of a project is intimately tied to our understanding of how problems should be solved.

In his recent book, The Social Labs Revolution, Zaid Hassan argues that our current models of problem solving and evaluation support a linear, expert-planning approach. This approach typically looks something like this: 1) identify the problem, 2) develop a plan to deal with the problem, 3) implement the plan exactly as conceived, and 4) evaluate the results. Hassan refers to this as the “business as usual” approach, and it is guided by three key assumptions (here I am paraphrasing Hassan):

  • That the problem solvers (i.e. experts) don’t have or don’t need “skin in the game”. Experts solve problems; they don’t experience them.
  • That a plan to solve the problem can be put together in advance. Once execution has begun, the plan does. not. change.
  • That only those elements of the problem that can be measured and expressed on paper can be taken into consideration.

This model also makes particular assumptions about the problem itself, namely that:

  • The problem is well understood.
  • The problem is bounded (either conceptually, geographically – or both).
  • There are limited solutions to the problem and one solution will be better than the others.
  • The problem is static and will remain unchanged as the solution is implemented.

In short, this linear problem solving method is neat, tidy and logical. But it is also inflexible, unresponsive and unsuitable for tackling complex or “wicked” social problems – like those plaguing the justice system.

Complex problems laugh in the face of all of these assumptions because of one simple fact: they are unpredictable (yes, I said it!). Complex problems are not linear or bounded; they are dynamic, emergent, slippery and constantly in flux. They are the result of the continuous interactions between multiple independent actors and factors that are repeatedly pushing and pulling on the system in different ways.

The problems we face in the justice system – namely access to justice problems – are complex. This was made clear by both the Final Report of the Action Committee on Access to Justice in Civil and Family Matters, and the Equal Justice report by the Canadian Bar Association. As both reports highlight, access to justice is a complex problem that involves multiple actors, is interwoven with personal, social and political issues, and cannot be solved without working collaboratively. People’s legal issues are shaped by social and economic factors like poverty, marginalization, mental health issues, and familial relationships, among others, all of which contribute to the complexity of the problem. This means that business as usual problem-solving approaches that are linear and inflexible simply will not work.

Enter innovation. To tackle complex problems we need to rethink our approach to problem solving. Methods like design thinking, social innovation and change labs, and emergent strategy planning are all tools that are increasingly being experimented with and used by justice innovators – whether they are individuals or organizations (e.g. HiiL, Stanford, the Winkler Institute, etc.). Innovation approaches don’t fit neatly into our current problem-solving paradigm. For example, many of these approaches rely on “strategic learning”, which means that as a project unfolds, the “plan” is constantly adapted in response to the real-time feedback the innovators are receiving; innovators frequently change their plans as they learn “what works”. Innovation processes also rely heavily on creativity, ideation, problem definition and prototyping, all of which sit uncomfortably within more traditional planning models that ask project leaders to predict in advance what both the problem and the solution are.

Yet, even as we recognize that the work of justice innovation is typically very different from what we’ve done before, our evaluation methods – which often determine the “success” or “failure” of a project, and the potential for securing future funding – continue to align with business as usual problem solving approaches. We often ask innovators to predict, from the outset, what the outcome of their innovation process will be. We also ask them to provide a detailed plan of what they are going to do and how they are going to do it.

This can be frustrating for those working with innovation methods and techniques that don’t lend themselves to linear approaches to problem solving or to typical methods of evaluation. Often, to receive funding, innovators spend too much time trying to fit their “round” projects into the “square” holes of funding applications that use a planning-based approach.

That said, this mismatch can be equally frustrating for funders who want to fund innovative projects but also need reliable ways to make sure that their limited funding resources are supporting projects that make a difference.

My apologies if you are hoping that I am going to end this post with an answer – I don’t have one. What I do know is that, put simply, our heavy focus on the linear, outcomes-based, business-as-usual approaches of the past has not produced meaningful change. A shift in focus from outcomes to process – innovative process (including how we fund, execute and evaluate) – is required.

The good news: there are evaluation methods and strategies out there that are well suited to innovation, and they have helped me start thinking differently about it.

The better news: many of these methods not only produce reliable data – they can actually improve the results of the innovation process. Enjoy!

  • “Strategic Philanthropy for a Complex World”. In this article, published in the Stanford Social Innovation Review, the authors discuss why the predictive model of strategy planning traditionally used by funders needs to give way to a model of emergent strategy that “better aligns with the complex nature of social progress”.
  • “A Developmental Evaluation Primer”. This great resource by Jamie A.A. Gamble not only provides a guide on how to carry out developmental evaluation – an evaluation model that helps innovators continually improve their process – it also addresses many of the myths around developmental evaluation, including the myth that developmental evaluation is a “soft method”.
  • “Evaluating Collective Impact: Five Simple Rules”. Aside from providing “five simple rules” that practitioners, funders, and evaluators should consider in their own evaluation efforts, this article provides a great comparison of the principles that guide traditional evaluation models and complexity-based models.
  • Finally, for those who may have missed this gem when it was published in 2007 – your next read should be “Getting to Maybe”, an extremely readable and inspiring account of how to tackle complex problems in an increasingly complex world. You will have no trouble seeing how it applies to justice innovation! For a specific look at evaluation, see pages 46–54.

 — Nicole Aylwin


  1. Nicole – I think the innovative approach to system design suggested by the Hassan book may be extremely valuable – and in fact, it is forming the basis of the current A2J review of family law in Alberta.

    Here’s the problem, however.

    As the saying goes, “When you are a hammer, every problem looks like a nail.”

    My frustration with the concept of the “Social Lab” process is how to deal with the shortcomings in knowledge that are brought to bear in the process.

    In other words, if we gather together a hundred extremely well-intentioned “hammers” and they are asked to brainstorm on how to deal with a problem – because of their own limitations, they are likely to create solutions that result in hammers doing what hammers do. Solutions? “Hammer things.”

    This is what I’m seeing currently in Alberta.

    So – to be more specific – we, in the legal field, have done a remarkably bad job in looking at what we do in a scientific manner – collecting verifiable data regarding aspects of the Justice paradigm – instead, generally relying upon anecdotal or experiential data to form the foundation of system advancement or change.

    In other words, most people in the Justice system, from lawyers, to law societies, to judges, to academics, and even the public at large – view “justice” from a paradigm of “what has been” as opposed to “what could be.”

    Worse, we often engage in talk about “innovation” when what we’re really doing is engaging in somewhat hackneyed “group speak”. Go ahead, sit down with an A2J advocate and see if someone uses the word “silo” or “triage” and you’ll get my drift.

    I think the Social Lab model has great promise – but only if the participants are helped to “understand what they don’t understand”. Or more precisely, before talking about brainstorming solutions – we need to first brainstorm what the problems are. And to understand those problems, we need information and data, not supposition and anecdotal experience.

    For example:

    a) How much does a trial cost?
    b) What are the constituent elements of that cost?
    c) Are those elements “necessary” for justice?
    d) How much “cost” in legal representation relates to “risk management” for lawyer liability?
    e) How much “cost” in legal representation relates to regulatory compliance – fees, reporting obligations, etc.?
    f) What are the costs to deliver legal service for a lawyer (staffing, technology, rent, etc.)?
    g) How could the costs of being a lawyer be managed more efficiently?
    h) What is the business “cost/benefit” of hiring an articling student – and would firms be better served by hiring highly trained paralegal assistants?
    i) What is a law degree “worth” in high A2J concern areas (criminal, poverty, family) – in other words, what is the expected income for a family or criminal lawyer, and how reasonable is it that we expect them to engage in greater pro bono effort?
    j) Do clients have unrealistic expectations of the justice system that contribute to unnecessary litigation expense?
    k) When clients engage in questionable litigation – is that most likely in an atmosphere of a lawyer encouraging their effort or is it in spite of a lawyer discouraging that effort?
    l) To what extent are cost awards reflective of actual costs?
    m) What is the average cost award in an interlocutory matter, in a trial?
    n) Are cost awards more or less likely in certain types of litigation (custody/access/support)?
    o) How do potential cost awards influence client litigation decisions?
    p) How much do non-lawyer legal services (paralegal service) cost, on average?
    q) What are outcome differentials between litigation, mediation, collaboration based upon similar fact patterns (children, income, length of marriage)?
    r) What is the probability of obtaining accurate disclosure of financial information given a specified protocol (voluntary response, examination under oath, examination before a Judge)?
    s) Etc., etc.

    This is just off the top of my head – but the point is, I don’t think we’ve really spent any time asking some very fundamental questions that go to the root of our problems – but we’re trying to engage in a Social Lab to brainstorm solutions – which, I fear, are going to end up looking pretty much like what we’ve always done.

    Hammers and nails.

  2. First, thank you so much Nicole for an inspiring and comprehensive piece about new ways to address complex problems in the justice system. We intend to get this out to as broad an audience as possible!

    Second, I share many of Rob’s concerns about how we traditionally attempt “justice reform”. Three points:
    1. Most well-intentioned justice reform initiatives have involved experienced and respected lawyers, judges and legal academics studying the “problems” and then telling other people what to do in a report format. As wonderful as these recommendations are, we know that, for the most part, they are not implemented. The reports sit until the next round of reform efforts, which takes the same approach. We need to break this cycle and try something new!
    2. We need to have participation beyond the insiders. In the social lab, for example, the first step is to figure out what the broader system is. It is NOT limited just to the formal justice system and those who work within it. It must also include those really affected by the justice processes (users, clients) and other associated systems (health, education, labour) and then seek participation from them as well.
    3. Rob has pointed out the key need for baseline empirical data (a very long list indeed). Should we wait until we have all of this information pinned down before starting meaningful prototyping? I don’t think we can afford that luxury. However, effective prototyping will “try stuff out” AND serve people AND collect key data. Robust developmental evaluation is a critical piece of any social innovation effort. Further, in other disciplines (the social sciences, for example) it is very common to look for “bright spots” (situations in which things work well) and study what underlying conditions are necessary to make this happen instead of always starting with “what is the problem”.
    Keep up the great work Nicole!

  3. I agree with much of what Kari and Rob have said, and also with Nicole’s observations about the unsuitability of the usual formative and summative approaches for the evaluation of novel programs. (These methods are indeed “inflexible, unresponsive and unsuitable,” as Nicole put it.) The social lab / developmental evaluation approach is likely the smartest way to go about attempting to address complex social problems, with the baseline empirical data Rob mentioned and the inclusion of system users suggested by Kari.

    However, I am concerned that the social lab approach may inherently circumscribe our ability to be genuinely creative in developing new approaches to systems that are complex, multidisciplinary, massive in size and scope, expensive in execution and involve multiple stakeholders with varying commitments to reform. The implementation / evaluation / feedback / adjust / re-implementation loop required by the prototyping methodology appears limited to taking the bits and pieces of the existing system and rearranging them. It seems to me that this approach may be incapable of contemplating a fundamental redesign of existing complex systems; rather than a renovation taking the house back to the studs and starting over, the prototyping approach seems to be limited to reconfiguring the furniture to see if that works better.

    I raise this point as it is not at all clear to me that the way we manage disputes within the present family justice system has any necessary or intrinsic merit. While I suspect there will always be a need for authoritarian and perhaps adversarial court processes to address truly intractable individuals and problems of genuine urgency, I worry that a reconceptualization of the system may call for more than triage processes and the co-location of social services, both of which are reconfigurations of existing services, but may demand a fundamental reconsideration of our basic assumptions and a critical examination of alternatives such as inquisitorial processes, abridged trial procedures, non-adversarial judging techniques, the embedding of mental health professionals in decision-making processes and so forth.

    A social lab / developmental evaluation approach may be the most effective way to pilot new ideas and new procedures, but I suggest that it may not be suitable for addressing the more fundamental systemic issues that underpin family justice processes, assuming of course that those issues need to be addressed at all.

  4. A friend and colleague of mine emailed me privately to ask the obvious question begged by my comment: if not the social lab approach, then what? This is, more or less, my reply…

    “I think the social lab prototyping methodology would be an excellent way of implementing reform after the fundamental redesign of the justice system has been conceptualized, because I really am proposing a completely new way of doing family law.

    “I think that first there needs to be some big-picture brainstorming. Hand-pick a dozen of the best and the brightest thinkers on family justice, focussing on people with practice in the trenches who are genuinely creative, out-of-the-box thinkers with a deep understanding of justice issues and family law. Send them off to a secluded place outside of Banff with a box of notepads, whiteboards, markers and a case or two of good wine, and tell them to keep at it until they reach a consensus about a completely new model of family justice.

    “Bring them back, and get them to write up their model in two statements, one for legal professionals and government and another in plain language for the public and the media. Spend a year criss-crossing the province talking to the bench, bar, community groups, court staff, social workers and so on, like what Professors Rollie Thompson and Carol Rogerson did when they were pitching the Spousal Support Advisory Guidelines, and issue a final version reflecting the insights gained from touring the province.

    “The final version is then sent to a larger, more comprehensive group of stakeholders in one justice centre for implementation and adaptation through social lab style prototyping and an iterative developmental evaluation process. When the processes in that centre seem to be more or less ironed out, roll out the model to other centres for the same implementation process, starting with the established model and certain unalterable core principles from that model, with room to adapt the model to the specific needs of each community.”

  5. A couple of quick comments.

    1. Somewhere in my archived print library are two copies of a book I still treasure. Last year saw the publication of the sixth edition of Douglas Comer’s “Internetworking with TCP/IP Volume 1”. No doubt there are other sources that recount the process by which the Internet was designed. It was incremental. An initially very small team grew as they experimented with countless ideas. What didn’t work was discarded. The architecture that evolved was successful where individual enterprises like IBM and Digital Equipment Corporation (DEC) had tried and failed. Maybe the legal profession should consider consulting a few Internet gurus.

    2. Yesterday I submitted a complaint to the Canadian Judicial Council. This is my third. It was triggered in part by a recent development in a judicial review case to which the Law Society of BC is the respondent. The LSBC has secured a court order “sealing” the record. They also responded to the petition with a claim for “special costs” against me and my associate. Thus the record so far is one of overt recklessness.

    Mr. Harvie’s list of questions focusses on “costs”. I concede that it’s an important factor. But I’ve been contending with the legal system since a judicial review (that was initially determined in my favour) in 2003 and I have a list of very serious concerns that in my view are more consequential than the costs issues.

  6. I have become increasingly dissatisfied with my two previous comments. Although I’m sure there’s a way to edit them, correct typos and clarify my thinking, I have simply decided to rewrite my comments as a somewhat better thought-out post on my Access to Justice blog:

  7. The Austin Center for Design (AC4D) has published “Wicked Problems: Problems Worth Solving”. It’s a handbook for developing social entrepreneurs, but contains detailed information on applying the design approach to wicked problems. What it adds to the social lab process is truly participatory decision making, made possible by building empathy and establishing trust with those affected by the problem – in our case, litigants, witnesses, and others affected by court proceedings or with problems currently dealt with in courts. The book is available free at
    See also