
Statistical Process Control Applied To Design Mentorship, Part 1

Data Visualization, Statistical Process Control, UX Design, UX Research | 5 min read

Since September 2020, I've continually worked on improving the quality of my off-hours design mentorship with my UX students. Given that this was never my top day-to-day priority, though, my improvements over time were only incremental.

In May - eight months into my mentoring - I was intrigued by a concept I'd learned about called statistical process control. The next month, I decided to start applying it to my process to see what I could learn.

One immediate discovery got me hooked.


Disparity Between Call Booking Rates of Women and Men

Discovering the following disparity between the number of calls my male and female students were booking with me felt like the spreadsheet reached out of the computer screen and gave me a hard, slap-in-the-face wake-up call.




(These are the running average total number of calls for all my students, including graduated and new students.)

How did this not occur to me before looking at the data?... Here are my three theories behind the disparity:

  1. Something about me repels women, but not men. (...But to that extent? 😅)
  2. Women perform more naturally during a self-paced and -directed course. Men need extra support.
  3. Women are more hesitant to take other people’s time. Men more naturally increase their presence.

More on this later...


So what is statistical process control?

Wikipedia defines SPC as "a method of quality control which employs statistical methods to monitor and control a process. This helps to ensure that the process operates efficiently, producing more specification-conforming products with less waste (rework or scrap)."
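In practice, the core of SPC is a control chart: plot a metric over time, compute a center line and control limits from the data, and flag points that fall outside those limits as signals worth investigating. Here's a minimal sketch of that calculation in Python, using made-up monthly call counts rather than my real numbers:

```python
# Minimal control-chart sketch: compute the center line and control
# limits (mean +/- 3 standard deviations), then flag any month that
# falls outside them. The call counts below are hypothetical.
from statistics import mean, stdev

calls_per_month = [14, 12, 15, 11, 13, 22, 12, 14]  # hypothetical data

center = mean(calls_per_month)
sigma = stdev(calls_per_month)
upper = center + 3 * sigma
lower = center - 3 * sigma

for month, calls in enumerate(calls_per_month, start=1):
    flag = "signal" if calls > upper or calls < lower else "in control"
    print(f"Month {month}: {calls} calls ({flag})")
```

The mean ± 3σ limits are the textbook convention; the point is to separate routine variation from changes that suggest the process itself has shifted.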


My Desired Outcomes

Ideally, over time, my graduating students will continually achieve:

  1. Higher course completion rates
  2. Higher hiring rates
  3. Shorter times until their first job offer
  4. Higher starting salaries

The challenge in this context is that it takes a fully-engaged student between six and ten months to complete the course - and some students take more than a year. Then, after graduating, the job search usually takes up to six months. So I won't get good data on the results of my mentorship until January 2022 at the earliest. Regardless, I won't reliably improve my results until I start measuring them.


My Key Performance Indicator

The number of portfolio reviews during the course is constant. The pace at which students submit their work is something I have little control over. The number of calls students choose to book with me, however, is the main variable during their course completion. That makes calls per student the key performance indicator I track.
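As a sketch of how this KPI could be computed from a simple export of booking records (the records and roster sizes below are hypothetical examples, not my real data):

```python
# Sketch: average calls per student per month from booking records.
# Both the bookings and the roster sizes are hypothetical.
from collections import Counter

bookings = [  # (month, student_id) pairs from a call-log export
    ("2021-06", "s1"), ("2021-06", "s2"), ("2021-06", "s1"),
    ("2021-07", "s1"), ("2021-07", "s3"), ("2021-07", "s3"),
    ("2021-07", "s2"), ("2021-07", "s1"),
]
active_students = {"2021-06": 10, "2021-07": 10}  # hypothetical rosters

calls_by_month = Counter(month for month, _student in bookings)
for month in sorted(calls_by_month):
    avg = calls_by_month[month] / active_students[month]
    print(f"{month}: {avg:.2f} calls per student")
```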


So have I been increasing the number of calls per student?

Here is the month-to-month average number of calls per student:



When I first saw this, I was disappointed that the months of improvements I'd made to my mentorship weren't increasing the number of calls per student as much as I'd thought.

Regardless, there are valuable takeaways from this data:


1. I Need to Message My Students More

Apparently it was the increased engagement with my students during my less-busy months that mattered most. The number of calls students book with me correlates with my availability in any given month more than with anything else. November through May were my busiest months, both in my full-time job and in my personal life. Unfortunately, it takes time to tend to my students - and that's time off the books. I don't get paid for sending my students messages, so I'll have to figure out an efficient approach for that.


2. My Process Improvements in June Did Move the Needle

In June I spent as much of my off-hours CareerFoundry time working on my spreadsheet and visualizing the data as I did actually mentoring students. As a result, July saw the greatest percentage increase in calls of any month.




May also looks like a month where I made a significant process improvement, but that's not the case. I was so slammed all of April that I had very little time for my students, so demand was simply higher in May.


Process Improvements I've Made

1. (June) I Matched My Availability To My Students' Scheduling Patterns

After a tedious, hour-long inventory of CareerFoundry email notifications in my inbox, I gathered the data on when students were booking calls with me during the week. Then I totaled the bookings and calls in my spreadsheet for each day of the week across the previous ten months of data. This is what I found:




Note: this reflects student time preferences more than it does my availability. For example, I've always had a couple of Saturday morning slots open, but they're very rarely reserved.


So, moving forward, I shifted my available hours for CF calls from Mondays to Wednesday and Thursday evenings. Now my availability better matches the times students prefer to schedule calls.
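For anyone wanting to replicate this, the tally itself is just a group-by on the weekday of each booking. Here's a sketch, assuming booking dates extracted from the email notifications as ISO date strings (the sample dates are hypothetical):

```python
# Sketch: tally call bookings by day of the week.
# The booking dates below are hypothetical examples.
from collections import Counter
from datetime import date

booking_dates = [
    "2021-05-03", "2021-05-05", "2021-05-06",
    "2021-05-12", "2021-05-13", "2021-05-20",
]

weekdays = ["Mon", "Tue", "Wed", "Thu", "Fri", "Sat", "Sun"]
counts = Counter(date.fromisoformat(d).weekday() for d in booking_dates)

for i, name in enumerate(weekdays):
    print(f"{name}: {counts.get(i, 0)} bookings")
```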


2. (June) I Developed a Spreadsheet and Dashboard

In June, I spent more time on this ugly spreadsheet than on actual mentorship. I did so in the belief that it would set me and my students up for higher returns in the months to come. The metrics from July gave me a taste of that reality.



3. (July) I Encouraged The Female Students to Be Assertive

July saw the greatest increase in the percentage of women booking calls with me. It also saw the lowest disparity of any month I've been mentoring my students.




Interpreting This Data

Let's revisit those three theories I had behind the disparity:

  1. Something about me repels women, but not men. Based on September's data, it might be more accurate to say that people prefer mentors of the same sex.

    The only month where women were more likely to book calls than men was the month when a woman mentor needed me to substitute for her. She likely had the opposite disparity between her male and female students' calls from the one I see - and that carried over into my month with her students.

    Do I think we should segregate students based on demographics and pair them with mentors who match? No. A better alternative might be to let students choose their mentors, while still expecting people to make an effort to connect with people of different demographics.

  2. Women perform more naturally during a self-paced and -directed course. Men need extra support. I am seeing this to some degree, and multiple other studies report the same.

  3. Women are more hesitant to take other people’s time. Men more naturally increase their presence. My students and I believe this is true, and addressing it during July produced positive results, so it holds to some degree.

Bottom Line: The reason July's results with my students were the best is that I made it a point to bring up this data with all my women students during our calls, and I encouraged them to claim the time I've made available to them.


4. (July) I Distributed a Recommended Calls List

Tracking key metrics implicitly asks the question, "How might I improve the process?" Focusing on that question added urgency to a task that had been floating on my to-do list for months: giving my students a recommended calls list to clarify how they can more fully utilize me as their mentor. Of the four process improvements I made, this is the only one that received a positive verbal response from my students. It's also the single most impactful process adjustment I've made so far (other than actually implementing SPC).

See it on my CareerFoundry portfolio case study.

© 2022 by Jordan Clive. All rights reserved.