In the U.S. Army, readiness is the term used to quantify a force's preparedness for mission success. A key component of gauging readiness is the ability to measure performance accurately and efficiently across warfighter training and task proficiency.
While the Army has moved in recent years to standardize mission essential tasks (METs), which are measured using a Training and Evaluation Outline (TE&O), modernizing the process beyond pen and paper has been a challenge. Until now.
Peraton’s Soldier Performance Application for Readiness and Talent Assessment (SPARTA) is an elegant, scalable, and secure digital mobile platform that will change the way the armed forces conduct training exercises and gauge readiness for years to come.
Brandon Mitchell, senior manager, Digital Experiences and Strategy, discusses how Peraton used design thinking to conceptualize, prototype, and develop an innovative solution that addressed the Army’s complex training and evaluation needs.
What challenges did the Army face when trying to modernize their training and evaluation process?
Most readiness training takes place in austere field environments that lack connectivity infrastructure, so evaluators have always used paper-based TE&Os—manually recording, collecting, and transporting grade sheets to a central tactical operations center and entering that data into a spreadsheet. As you can imagine, those grade sheets were subject to the elements in the field or in transit.
And that’s just the collection method. Historically, the level of measurement collected from readiness training events has been insufficient to determine gaps in training proficiencies; often the data was reduced to a simple Go/No-Go rating. Between those delays and a lack of transparency, Army leadership was limited in its ability to make timely and accurate decisions.
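To illustrate the difference in measurement granularity, here is a hypothetical sketch; the names and structure are illustrative, not SPARTA's actual data model. It shows how per-step performance measures can still roll up into a binary rating while also exposing exactly where training gaps lie:

```python
from dataclasses import dataclass

@dataclass
class PerformanceMeasure:
    """One observable step within an evaluated task (illustrative)."""
    description: str
    passed: bool
    critical: bool = False  # a failed critical step fails the whole task

def go_no_go(measures: list[PerformanceMeasure]) -> str:
    """Classic binary rating: pass if every critical step passed."""
    return "GO" if all(m.passed for m in measures if m.critical) else "NO-GO"

def training_gaps(measures: list[PerformanceMeasure]) -> list[str]:
    """Granular view: every missed step, critical or not."""
    return [m.description for m in measures if not m.passed]

measures = [
    PerformanceMeasure("Establish security", True, critical=True),
    PerformanceMeasure("Report status to higher", False),
    PerformanceMeasure("Consolidate and reorganize", True),
]

print(go_no_go(measures))       # GO — the binary view hides the missed step
print(training_gaps(measures))  # ['Report status to higher'] — the gap surfaces
```

The point of the sketch: both views come from the same collected data, so capturing at the step level costs the evaluator nothing extra while giving leadership far more to act on.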
The Army has dealt with this challenge for as long as they’ve conducted training. Talk about how SPARTA came into being.
Well, this project didn’t start out as SPARTA as we know it today. Our customer asked us to develop web applications for use at training events. Initially, they were looking for a way to enter information into a laptop-based server.
Before we started, we used design thinking to get down to the user level. We started asking questions, including: Where do you need these applications? Who is going to have this app in their hands? What’s their workflow? How many of these types of events do you hold?
We didn’t really want to talk to a stakeholder about what the users needed—we wanted to see who they were, where they were, and how they’d use a system. With a limited budget, we wanted to get that right before we started development. We all know the cost of changing things later can be exponential.
How did that design thinking approach lead to better outcomes?
Design thinking isn’t just about aesthetics or delivering a beautiful product. It’s about the approach to finding a solution, working with the customer to understand their needs—engaging with not only what’s required, but what’s possible.
We knew we didn’t want to spend three months on user research and analytics. Instead, we immersed ourselves. We went to different Army installations like Fort Hood and Fort Benning and watched the events and exercises where soldiers are trained and evaluated. We saw the tools they used, identified some gaps and areas for improvement, and understood how they’d use our solution.
Our team identified that we could develop a single application that was dynamically adaptive and could support multiple types of tests—individual or collective—and multiple types of data collection methods. We followed the data, from the time the soldier or team arrived to when they left, mirroring the evaluation workflow to make it as discoverable and intuitive as possible.
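One way to picture a single, dynamically adaptive application is a template-driven design, where the evaluation definition, not the application code, determines what the evaluator sees. The sketch below is a simplified assumption of that idea (all names are hypothetical, not SPARTA's implementation):

```python
from dataclasses import dataclass, field

@dataclass
class EvaluationTemplate:
    """Describes one type of assessment; the app renders whatever it defines."""
    name: str
    scope: str                     # "individual" or "collective" (assumed values)
    collection_methods: list[str]  # e.g., checklist, timed drill, free-text notes
    steps: list[str] = field(default_factory=list)

def build_workflow(template: EvaluationTemplate) -> list[str]:
    """Mirror the evaluation flow: arrival, each graded step, then departure."""
    check_in = "check in unit" if template.scope == "collective" else "check in soldier"
    return [check_in] + [f"record: {s}" for s in template.steps] + ["review and sign off"]

rifle_qual = EvaluationTemplate(
    name="Rifle qualification",
    scope="individual",
    collection_methods=["checklist"],
    steps=["Zero weapon", "Engage targets"],
)

for stage in build_workflow(rifle_qual):
    print(stage)
```

Because a new assessment type is just a new template rather than new code, this style of design is one plausible reading of how a single app could support both individual and collective tests with different collection methods.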
We got a good idea of what we thought the customers needed and quickly moved into ideating a prototype on an iPad and putting a demo in their hands with visual click/tap-throughs. They needed to see our ideas quickly, and we wanted to get it right before we started building the actual solution.
The design thinking approach here meant a lot of rapid prototyping. Sometimes we’d iterate as soon as the next day. We’d go back and change the prototype and show them again quickly. We were able to move forward very quickly, even if we weren’t in the same location. We could show our changes on a virtual iPad, an online representation that allowed us to share changes almost immediately.
Can you talk about how developing for mobile devices fits in with digital transformation efforts, especially with military customers?
Mobility is a key part of digital transformation—not just the technology, but in how organizations are thinking about culture and customer experience. When you think about how we do business these days, it’s a natural progression. We have to be able to transform with them, to keep current and relevant.
In today’s military, there is a large user base of digital natives, accustomed to using their own devices and expecting a certain ‘polished’ user experience. More and more of today’s leaders grew up with devices in their hands, so they have different expectations. They’re not going to want to switch to just any digital product. It has to feel like a step forward. If it’s not intuitive, it’s less likely to be accepted.
In this case, we knew our solution was going to be on an iPad, so we closely followed iOS design guidelines to ensure user adoption and understanding. We wanted a solution fully compliant with design standards and user experience expectations, which ensures that it’s intuitive, discoverable, and accessible anywhere.
But SPARTA is a platform, not just an application. It was developed with DevSecOps principles, and is cloud-agnostic, so it can be adapted to any customer’s requirements. We can create new versions without reinventing the entire wheel. In fact, we learned after a number of demonstrations with other military leaders that nearly any organization that needs to evaluate personnel could have SPARTA configured to fit their needs.
What are the biggest takeaways from your experience with SPARTA?
If we had built toward the initial requirements, the result would have been a small but valuable step forward; instead, our work accomplished much more than digitizing an analog process. We’ve learned that it’s okay to say to customers, “Are you sure that’s how you want to do this? There might be a better way.”
We could make a one-generation jump—and risk suppressing adoption and user satisfaction—or we could use design thinking and stand up a field-tested solution that is able to meet customer needs and end-user expectations for the next decade.
We’re moving the military forward on the user experience front—and we got to a better solution fast. That’s what people want to see today.