I’m on the faculty at UVA Darden, a graduate business school which offers an MBA and an MSBA (MS in Business Analytics). In these programs, I teach classes on digital design and development (and also online courses in agile and product management).
That’s what I do, but the reason I do it is that I’m super interested in how we create the general manager of the future: the person who knows how to create customer-centric focus and then translate it into high-functioning, collaborative practice across design, development, analytics, and delivery. Basically, the ideal product manager. From a skills perspective, I think it looks something like the diagram you see here.
But does that actually produce good work in practice? And, if so, how do individuals get there? For the last couple of years, I’ve been collaborating with a colleague who teaches data science (Casey Lichtendahl, currently on leave at Google) to offer independent studies for students interested in applied practice integrating these skills as a kind of capstone experience at the end of their second year at Darden. We’ve seen a lot of great work, and this year we had five students do a joint project that I thought was particularly interesting.
Without a substantial problem of interest, including and especially engagement with human subjects, these projects are just an exercise in isolated technical learning. That’s OK in general, but it’s not consistent with what a general manager does in a traction-focused role like product management, where the job is to drive validated learning about the user and use it to focus subsequent work across domains.
This team was interested in the problem of how students going into their second year of the MBA program choose their courses. How do different segments of students think about this? How important are career-related learning vs. subjects of interest vs. schedule vs. recommendations from peers vs. studying with a particular faculty member? They started out by conducting subject interviews to learn about this through open-ended questions. They then used a concierge MVP to better understand how real students would engage with a more structured and facilitated selection process. Finally, they created interactive prototypes in Balsamiq and ran user testing to decide on the right approach to a user interface.
This built on their foundation courses in Product Management and Software Design, and I would say it maps to the Continuous Design part of an agile development process: they were focused on driving to validated learning about their user and what propositions they might deliver that would beat the alternatives.
Like any good product manager, they then took their validated learnings and began to structure and automate the process. The tool they built stepped users through a series of questions and generated a suggested schedule. You can see the solution at work in this demo:
The version you see above was the result of iterative testing with users, during which the team refined the interface. It leverages a predictive model based on the last few years of student registration data, as well as a linear optimization program that handles the more deterministic aspects of the recommended course list, like scheduling.
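To make the optimization side of this concrete, here is a minimal sketch of the kind of problem such a program solves: pick the conflict-free set of courses that maximizes a student's preference score subject to a credit cap. The course names, scores, and slots below are invented for illustration, and a toy brute-force search stands in for a real optimization solver.

```python
from itertools import combinations

# Hypothetical toy data: course -> (preference score, time slot, credits).
COURSES = {
    "Product Mgmt": (9, "Mon AM", 3),
    "Data Science": (8, "Mon AM", 3),  # conflicts with Product Mgmt
    "Negotiations": (7, "Tue PM", 3),
    "Valuation":    (5, "Wed AM", 3),
}

def best_schedule(courses, max_credits=9):
    """Return the conflict-free course set with the highest total preference."""
    best, best_score = (), -1
    names = list(courses)
    for r in range(len(names) + 1):
        for combo in combinations(names, r):
            slots = [courses[c][1] for c in combo]
            credits = sum(courses[c][2] for c in combo)
            # Skip schedules with a time-slot clash or too many credits.
            if len(set(slots)) < len(slots) or credits > max_credits:
                continue
            score = sum(courses[c][0] for c in combo)
            if score > best_score:
                best, best_score = set(combo), score
    return best, best_score

schedule, score = best_schedule(COURSES)
# "Data Science" loses out because it clashes with the higher-scored
# "Product Mgmt" in the same Monday-morning slot.
```

A production version would express the same constraints declaratively (binary decision variables, conflict and credit constraints) and hand them to an integer-programming solver rather than enumerating combinations.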
This built on their foundation courses in Software Development and Data Science, and I would say it maps to the Agile Development part of an agile development process: they were focused on iteratively figuring out how to deliver a great solution to the problem they identified, based on their validated learning.
While the team led with validated learnings on their user, they also did all their own modeling and coding, a great experience in what it is to iteratively transition from idea to design to code to analytics and back again. As team member Michael Osborne put it: “Conceptually, we understood the models we wanted to run, but we wanted practice integrating them into a product.”
Ultimately, this is a story about a crew of creative individuals who got interested in a problem and used it as an opportunity to both test a solution as well as sharpen their ability to execute. They are:
| Name | Background |
| --- | --- |
| Ab Boxley | Finishing a dual degree (MBA + MS in Data Science), Ab previously worked in research and started a technology services firm. After graduation, he’s headed to Deloitte Consulting in the DC area. |
| Eric Franklin | Prior to Darden, Eric worked in strategy and analytics at Angie’s List, LRAP Association, and Cars.com. This fall he’s headed to McKinsey in Cleveland. |
| Alfonso Orozco | Prior to Darden, Alfonso worked in technology program management, enterprise solutions, and digital analytics. After Darden, he’s headed to JPMorgan Chase in Philadelphia. |
| Michael Osborne | Pre-Darden, Michael worked at Capital One in analytics with a focus on data visualization. Post-Darden, he’s headed to Wayfair in Boston. |
| Thomas Regan | Before Darden, Tom worked in accounting, audit, and valuation both in industry and for EY. After Darden, he’s headed to Accenture in Atlanta. |
My prediction is that they’re going to kill it at their jobs, post-graduation!
The team learned a lot about integrating what they had studied across the program into a single digital innovation project. No big items emerged from their retrospective. That said, the individuals on the team are always looking to refine their practice.
One small but notable item from their retrospective was the challenge of working out the details of the API between the web application and the supporting services (the predictive model and the linear optimization). They used Postman to test the API, but on similar projects they’d consider starting with a test-driven approach, using contract tests as a working specification for how the web application interfaces with the supporting services.
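A contract test of this kind can be very small: a shared check that both sides run against the agreed response shape, so the web-app team and the services team can work in parallel before the real endpoint exists. The field names below are illustrative assumptions, not the team's actual API.

```python
# Hypothetical contract for the recommendation service's response:
# each required field name mapped to its expected Python type.
REQUIRED_FIELDS = {
    "student_id": str,     # who the recommendation is for
    "courses": list,       # recommended course identifiers
    "total_credits": int,  # credits in the suggested schedule
}

def satisfies_contract(payload: dict) -> bool:
    """True if the payload has every required field with the right type."""
    return all(
        name in payload and isinstance(payload[name], expected)
        for name, expected in REQUIRED_FIELDS.items()
    )

# The consumer (web app) runs this against a canned example during
# development; the provider (service) runs it against live responses.
example = {"student_id": "s123", "courses": ["GBUS 8410"], "total_credits": 3}
assert satisfies_contract(example)
assert not satisfies_contract({"student_id": "s123"})  # missing fields fail
```

Frameworks like Pact formalize this pattern (consumer-driven contracts), but even a hand-rolled check like this gives both teams an executable specification instead of a prose one.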
In this, they have a lot in common with the larger community of practice. Figuring out how to substitute working software (including tests) for complex but uncertain up-front requirements is something I see many agile teams grappling with as they move to more integrated and continuous delivery.