The main sales floor at Clearlink took thousands of calls and orders a day through a legacy system that was a patchwork of third-party apps, homegrown UIs, and API skins. The problems were numerous:
Reps needed multiple logins to access their brand’s order entry flow, and the apps regularly timed them out, often in the middle of a call.
Reps were completely blind to the marketing department’s campaigns and what the caller might be viewing on the website.
Reps skilled in several brands could barely get the disparate parts to work together when switching brands to offer customers a variety of products.
Sales floor attrition was significantly higher than industry average.
Workarounds were frequent on the user side -- the development team had no insight into how reps made sales go through, and rep support tickets often went unanswered.
There were no analytics to speak of. Daily sales reports were available, but any kind of system failure or technical glitch was reported verbally or by email and often shut sections of the sales floor down for hours.
A backend development team was asked to build a new, custom order entry system that could scale across multiple sales teams and multiple brands, with analytics baked in for continuous monitoring and improvement.
The developers were asked to start small, with one brand and one sales team to test.
I was asked to join the project after early attempts to begin development kept getting tangled up in information logistics. I was partnered with a brand new product manager who had been told to simply “own the product.” We had a few awkward encounters until I asked her to sit with me and work through a Venn diagram of our responsibilities. This is what we ended up with:
We agreed that we were working toward the same vision, but that I needed to handle user-oriented activities, and she needed to handle the development side goals and timelines. From there, we were able to collaboratively bring the developers and sales SMEs together to outline business and user requirements for the system, an expectation for MVP, and a phased approach to features and releases.
I spent half my time on the floor with sales reps, performing contextual observation, interviewing reps to understand workarounds and assumptions, and providing therapy -- the reps were so utterly frustrated with the old system, they were grateful to have a listening ear.
I spent the other half of my time with the development teams. They were working in dev mobs -- three or four folks talking through lines of code, one person typing in the final agreed-upon constructs. From there, I worked to understand our technical constraints and capabilities, mapped brand requirements and business logic, surfaced any questions I could clear up from the sales floor, and QA'd the software as it reached milestones.
Together, the product manager and I compared what we were planning against industry gold standards; built sprint boards, backlogs, and business logic maps; paired sales rep input with experience I had from the Insurance Order Entry project; and started wireframing flows.
Validate & Iterate
Just as I was getting ready to show test flows to users, we were assigned a designer (huzzah!). I quickly brought him up to speed on our research and the structure behind the wireframes, and he took over the branding and interface design work. We prototyped pieces of the system, and I ran usability tests with the sales reps. Their excitement over a few stitched-together wireframes was infectious -- things started moving smoothly.
We ended up working out a single sign-on (SSO) system with role-based permissions. The logic was based on training skills and a tree of roles that recognized the sales rep and moved them into a branched flow: if a rep was up-to-date on training, they would go to the live order entry form and start taking calls. If there were any outstanding training sessions, they were directed to the test side of the system and automatically into the training flow.
We built roles for admins to observe live analytics and sales reporting, roles for Operations to check on live individual rep performance and team stats, and roles for executives who just wanted high level reporting dashboards.
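The training-gated, role-based routing described above can be sketched roughly as follows. This is a minimal illustration, not the production logic -- the role names, fields, and destination labels are hypothetical stand-ins.

```python
from dataclasses import dataclass, field

@dataclass
class Rep:
    """Hypothetical user record -- field names are illustrative only."""
    name: str
    role: str                                          # "sales", "admin", "ops", "executive"
    skilled_brands: list = field(default_factory=list)
    outstanding_training: list = field(default_factory=list)

def route_after_sso(rep: Rep) -> str:
    """Decide where a user lands immediately after single sign-on."""
    if rep.role == "admin":
        return "live-analytics"
    if rep.role == "ops":
        return "team-performance"
    if rep.role == "executive":
        return "reporting-dashboard"
    # Sales reps: if any training is outstanding, divert to the test side
    # and the training flow; otherwise go straight to live order entry.
    if rep.outstanding_training:
        return "training-flow"
    return "live-order-entry"
```

A rep who is current on training lands on the live form; one with an outstanding session is diverted automatically, with no manual gatekeeping.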
The sales reps were able to seamlessly log in, easily access their call histories and stats, cross-sell to all the brands they were skilled for without ever leaving the main screen, and hand off calls they weren’t skilled for to a queue for the appropriate brand. Reps really loved that they could view real-time stats for their daily performance (and we built in a little friendly competition by letting them see where they stood in team and brand standings for the week, too).
The system communicated with reps when there was an error (and simultaneously flagged it with development) -- giving users advice on a fix or assuring them the bug had been reported. All visual system cues were cross-checked to be accessible for color-blind and visually impaired users.
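The dual-channel error handling described above -- one failure producing both a rep-facing message and a development-facing report -- might look something like this sketch. The function name, error codes, and messages are invented for illustration; the real system integrated with a bug tracker rather than an in-memory list.

```python
# Illustrative stand-in for the real bug-tracking integration.
dev_queue = []

# Hypothetical advice messages keyed by error code.
ADVICE = {
    "ORDER_TIMEOUT": "Your session is safe -- retry the submit button.",
}

def handle_error(error_code: str, rep_id: str) -> str:
    """Notify the rep with advice and simultaneously flag development."""
    # Flag development first, so the bug is recorded even when the
    # rep-facing message is only a generic acknowledgement.
    dev_queue.append({"code": error_code, "rep": rep_id})
    return ADVICE.get(
        error_code,
        "This issue has been reported to the development team.",
    )
```

The key design point is that the rep never has to file a ticket: the same event that produces their on-screen message also lands in the development queue.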
We also built out a phased rollout of features:
Flagging a returning caller and prepping the sales rep with any information we had about them in our system.
Providing a screenshot of the website page a caller had dialed in from so the rep was seeing what the caller was seeing (and therefore aware of Marketing's campaigns and offers).
Offering predictive sales recommendations based on the caller's location, demographics, and promotions we had available that week.
"Smart" comparison tables that did quick math for the reps to show which package would save the caller money, meet all their channel and entertainment requirements, be most quickly installed -- all based on common criteria we collected from call recordings and past sales data.
A real-time dashboard of rep stats to help Operations and team leads keep everyone working smoothly, taking breaks, and getting an equal share of the calls coming in.
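The quick math behind the "smart" comparison tables can be sketched as below. The package data and criteria are made up for this example -- the real tables drew their criteria from call recordings and past sales data.

```python
# Invented package data for illustration -- not real Clearlink offerings.
packages = [
    {"name": "Basic",   "monthly_cost": 45, "channels": 60,  "install_days": 7},
    {"name": "Plus",    "monthly_cost": 70, "channels": 150, "install_days": 3},
    {"name": "Premium", "monthly_cost": 95, "channels": 250, "install_days": 3},
]

def best_package(packages, min_channels, max_install_days):
    """Return the cheapest package meeting the caller's stated needs,
    or None if nothing qualifies."""
    eligible = [
        p for p in packages
        if p["channels"] >= min_channels and p["install_days"] <= max_install_days
    ]
    return min(eligible, key=lambda p: p["monthly_cost"]) if eligible else None
```

For a caller who needs at least 100 channels installed within five days, the table would surface "Plus" as the money-saving pick -- the point being that the rep sees the answer instantly instead of doing arithmetic mid-call.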
We launched an alpha test with one team, transitioning the system to taking live calls, and I sat with the reps for the first week. Somehow, news spread fast that the test software was live, and within days, I had reps from other teams watching over my test team’s shoulders, asking when they could have the new system and whether it would do X, Y, and Z. We moved to beta within two weeks, and scaled to testing with three teams within the first month.
Training times decreased, and training logs and rep skilling were error-free. Bug fix times dropped to almost zero -- we caught most bugs before they went live to the floor, or soon enough to warn the reps about them. Sales call times went down, and sales rep moods on the test teams were high -- all of a sudden, our chats were no longer therapy, they were brainstorms; and the reps were overflowing with ideas to keep the improvements coming.