Preventing system crashes

Talent development systems - Part one
For the past week, I've been designing a prototype talent development system. It will take the next several posts on this blog to sketch out the design adequately. In the process of presenting it, you'll see several principles of system design at work.

A system design responds to the context in which it will function. The system needs as much complexity as its context to respond to varied changes in, and extended ranges of, input. When a system lacks sufficient complexity (requisite variety), it malfunctions, overloads itself or crashes. The system cannot handle what it is asked to do. The design over-simplified the problem, made gross assumptions or overlooked significant issues.
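The idea of requisite variety can be sketched in code. The toy below (all state names and responses are hypothetical, not part of the prototype) contrasts a system that anticipated only a narrow range of inputs with one that carries enough variety, plus a safety net, to avoid crashing:

```python
def simple_system(state):
    # This design anticipated only two input states.
    responses = {"on_track": "continue", "slightly_behind": "nudge"}
    return responses[state]  # any other state raises KeyError: a "crash"

def robust_system(state):
    # Added variety: every anticipated state has a handler,
    # plus a fallback for anything unforeseen.
    responses = {
        "on_track": "continue",
        "slightly_behind": "nudge",
        "disoriented": "orient",
        "defeated": "review",
    }
    return responses.get(state, "escalate to a human")

try:
    simple_system("disoriented")
except KeyError:
    print("simple system crashed")    # insufficient variety

print(robust_system("disoriented"))   # handled by design
print(robust_system("meteor strike")) # unforeseen, but still no crash
```

The point is not the code itself but the design move: matching the system's internal variety to the variety of what it will be asked to handle.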

This prototype design anticipates four potential system crashes spawned by dysfunctional experiences of the system users:
  • Disoriented system users: feeling lost, overwhelmed by too many options, confused by what is expected of them, unfamiliar with the procedural steps, expecting something very different than this, stressed out by this imposition on them, conflicted about seeing this through, becoming suspicious of the value proposition, regressing into childish behaviors
  • Inhibited system users: trapped in the idea stage, incapable of follow through, all talk and no show, self-sabotaging their success, missing the milestones, stuck in the starting blocks, in no shape to shape up, going nowhere quickly
  • Defeated system users: derailed by the adversity, shot down by friendly fire, disheartened by too many obstacles, turning opportunities into unwelcome threats, voicing their "customer complaints", reduced to "fight or flight" mode, put on the defensive
  • Over-zealous system users: gone overboard, addicted to the system, compulsively doing more immediately, losing sight of the mission, fallen for excessive devotion, tactically fixated, possessed by the urge to try harder, going to a reckless extreme
Robust system designs incorporate added functionality to safeguard against system crashes. The incoming problem is "no problem" because a solution is already designed into the system. The negative experience of the user is transformed into feeling respected, included, understood and well-served. This often yields welcome fallout: more buy-in, deeper commitment and better buzz. This prototype design handles the negative user experiences as follows:
  • Disoriented system users enter a "help subsystem" to become more oriented. Their disorientation is captured by menus, questionnaires or interviews. A response is generated that addresses their concern and prompts the user to refine the search, question or problem definition. This cycle repeats until the user feels capable of making informed choices.
  • Inhibited system users enter a mentoring session to transform their emotional baggage, hot buttons, toxic introject or chronic childishness. Some version of cognitive-behavioral therapy will reframe the presenting problem and resolve the underlying issues.
  • Defeated system users go through an after-action review with a coach. The situations and user responses will be rehashed to consider other useful perceptions, interpretations and interactions. Help will be provided for troubleshooting breakdowns, solving problems and reformulating strategies.
  • Over-zealous system users take a break to explore a "big-picture" process. The development of their talent is put in perspective with other valid goals. Balance is restored as other objectives are brought into the game plan. Tempo and timing issues are reconsidered in light of the overall mission.
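The four safeguards above amount to a dispatch table: each dysfunctional state routes to its designed-in response. Here is a minimal sketch of that pattern; the function names and return strings are my illustrations, not part of the prototype:

```python
def help_subsystem(user):
    return f"{user}: orient via menus, questionnaires, interviews"

def mentoring_session(user):
    return f"{user}: reframe the presenting problem, resolve underlying issues"

def after_action_review(user):
    return f"{user}: rehash situations and responses with a coach"

def big_picture_process(user):
    return f"{user}: rebalance goals, tempo and timing against the mission"

# Each anticipated negative experience already has a solution designed in.
SAFEGUARDS = {
    "disoriented": help_subsystem,
    "inhibited": mentoring_session,
    "defeated": after_action_review,
    "over_zealous": big_picture_process,
}

def respond(state, user):
    handler = SAFEGUARDS.get(state)
    # Unanticipated states are logged rather than crashing the system.
    return handler(user) if handler else f"{user}: unanticipated state, log and escalate"
```

The design choice worth noticing is the fallback branch: even a robust table of safeguards needs a graceful path for the experiences it did not anticipate.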
As a system learns from what happens to its users, it discovers what additional functionality is needed to avoid dysfunctional user experiences. A talent development system may experience contextual pressure to become multi-lingual or offer its responses 24/7. The system may get diagnosed as over-responsive to particular constituencies and inaccessible to others. The system itself may create or feed the problems that appear as disoriented, inhibited, defeated or over-zealous users. There may be other dysfunctional user experiences that crash the system (violent behavior, erotic misconduct, medical crises, weather-related disruptions, etc.). The design may need modification if the time it takes to prevent system crashes proves too costly. The quality of the responses during the "time outs" may be widely varied, inadequate or excessive.
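One way a system "learns from what happens to its users" is by tallying incidents and flagging the states it cannot yet handle. The sketch below assumes a simple incident log; the state names are hypothetical:

```python
from collections import Counter

# States the current design already safeguards against.
HANDLED = {"disoriented", "inhibited", "defeated", "over_zealous"}

def review_incidents(incidents):
    """Return unhandled states and their frequencies.

    Frequent unhandled states are candidates for added functionality.
    """
    tally = Counter(incidents)
    return {state: n for state, n in tally.items() if state not in HANDLED}

gaps = review_incidents(
    ["disoriented", "medical_crisis", "defeated", "medical_crisis"]
)
# "medical_crisis" appears twice and has no safeguard: a design gap.
```

A real system would weigh the cost of each gap before modifying the design, since, as noted above, crash prevention itself can prove too costly.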

As you replicate or modify the line of reasoning I've used here, you could also refine your designs for other systems that generate functional and dysfunctional user experiences.
