You have questions, we have answers.
Ahead of our recent webinar, “Ask Jama: Tips and Tricks for More Effective Reviews,” we asked those who registered to quiz us with their toughest queries around reviewing requirements.
In response, we received hundreds of questions ranging from dealing with unresponsive reviewers to figuring out how requirements can best be reviewed collaboratively to getting the most out of Jama Connect™ Review Center.
Our experts who led the webinar, Erin Wilson, Senior Consultant, and Joel Hutchinson, Senior Product Manager, tried to answer as many questions as possible following their presentation.
Even with all the answers Erin and Joel provided during the webinar, they weren't able to get to every submission. So we tracked them down afterward, locked them in a room, and grilled them about some of the questions that were still outstanding. You'll find those questions and answers below, and be sure to download our guide for more best practices around reviews.
Q: How can I review a large number of requirements effectively without consuming a lot of stakeholders’ time?
Erin: Choose the right participants and be thoughtful about the number of people you will invite into a review. Too many, and you risk not having enough time to incorporate the right feedback. Too few, and you risk not receiving enough feedback or missing critical stakeholder input. As far as the number of requirements, just performance-wise, we usually recommend 250 items or fewer per review, and I would like to see far fewer. I prefer to do things in more of an iterative fashion, where you're sending things in and out of review as they become ready. I do understand that a lot of customers have to send something that resembles a document for review, and that might entail having many more requirements in there. I would hope that would be the exception rather than the norm. What do you think, Joel?
Joel: It’s a chicken and egg question. What’s the balance of too much content and not enough content? What’s the balance of too many people and not enough people? There’s no right or wrong answer.
I think there’s another consideration as well, in terms of electronic signatures, which is: how do you want to aggregate your signatures? Every review has the ability to configure electronic signatures. Let’s say I’ve got 250 requirements that I want approved. Do I need to show all their signatures in the same spot? Do I need to make sure that the same people reviewed all 250? That could influence whether you set up multiple reviews or one review. We have the ability to export the signatures that are made on the review, so if that’s enough, and you could stitch those documents together, great. If it’s not, and you need it all to be shown as the same general time frame that everything was approved, then you may want to lump them together.
Learn how much time and resources our customers save by using Jama Connect Review Center in this infographic.
Q: What are a few things I can do right now to conduct more effective reviews?
Erin: You can read our best practices for moderators and participants. Set clear goals: tell people the type of feedback you're looking for so they don't go down a rabbit hole. Are you looking for feasibility? Grammatical correctness? What is it that you're after? Again, picking the right participants is huge, and then making sure the team is prepared, so that they have been trained and they understand why you're moving to the Review Center. Why are we doing this?
Joel: I think using comment tags is an instant improvement. When you come into a review and you have a question on something, ask a question. If you've got an issue, raise an issue. That's true in meetings as well: if you have an issue, come out and say it. It's the same thing in an online medium.
Q: How are we most likely to catch critical errors in requirements reviews?
Joel: Make sure you bring in the right people first off.
Erin: Making sure that people are focused on what they're supposed to be reviewing for. If they are supposed to be reviewing as a mechanical engineer, make sure they're looking at it through the lens of a mechanical engineer. Collaboration is huge, too. And again, if you see something that seems a little odd, don't be afraid to say something. Your team should feel empowered to speak up.
Joel: I think this is one of those questions where, just from a data perspective, smaller reviews are better. There are studies that have shown that, if you have to go through multiple pages of content, your attention span’s going to wane. Try to keep things down to a level that’s manageable, and then bring in the right people to actually unearth those critical errors. Smaller, more frequent reviews, I think would lend towards the ability to look at things more critically.
I think with a lot of things it's a balance. When I was doing this in industry, we would mix approaches: there were certain things we had to do by certain milestones. That's forced, right? Those are bigger reviews. You have to do smaller, more collaborative reviews along the way; otherwise, you're never going to be ready for those big reviews. The same thing happens in a virtual environment. Really, we rely on the organization to know their product development process and what the right pace is. Those are the types of things you should be thinking about: when do we actually have to sign this stuff and move it, and what do we need to do to get ready for that?
Get an overview of the recent updates to Jama Connect Review Center on our blog.
Q: How do we get reviewers to move their requirements forward once approved?
Erin: It’s important to have the team understand what the process is: What now? What if? What happens here? And what happens when this review is done, and the requirements need to be transferred to some kind of end state. And then, after approval or acceptance of the requirements, you would want a way for participants and even non-participants to go back into the project and say, “Hey, now we’ve got all of these approved user needs. These are all ready for me to take action on.” We can set up filters, where we say, “Here’s all of our approved user needs, but show me just those approved user needs that are missing validation.” Maybe the validation team would know, “Hey, this is our bucket of requirements to go work on.” Stuff like that.
Q: How do you set a realistic review date, so you don’t have to keep opening the review?
Joel: There’s a balance of, when do I want people to actually look at this stuff? You set a date two months in advance. When is somebody going to look at it? Probably a week before it’s due.
That said, the reason we suggest a week is that usually that's the timeframe in which somebody will actually think about needing to do something.
Erin: The expectations have to be clear. You have to be very clear, and this is where a Jama champion can come into play as well. You’ve got these Jama cheerleaders, so to speak, who are helping to coach team members along, and making sure that you’re monitoring the status, and monitoring the progress that people are making.
Whether you set the review for tomorrow or a week from now, you have to make sure that the expectation is understood, and that there’s something that’s going to happen at the end of that day.
If you keep updating, and updating, and updating, because this one person is a holdout, then you're going to keep doing that forever.
Joel: That's also where the stats page comes in. As a moderator, you're trying to drive the conversation and give people the environment to be successful, and you can tell if somebody's not participating. Go take a look: is there an item, or something we're talking about, that nobody understands and nobody's reviewing? That's something to ask questions about. Is there a person who just won't play the part? That's something else to ask questions about. Ultimately, what you do with that information is up to the team.
See how Jama Connect Review Center improves collaboration and increases efficiency in the approval process by watching our demo video.
Q: How can teams manage reviews with team members spread out across the globe, such as in the US, Europe, and China?
Erin: I recommend setting some kind of a cadence for reviewing items and giving feedback. When a moderator is providing feedback and making changes to the reviewed content, I would recommend that they get into the habit of publishing the revisions of the review at the same time, each day or every other day. That way, people can prioritize their own work and they can expect when a new version of the review is going to be published.
Jama Connect is inherently a really great collaboration tool. We just have to take a human approach, where we help people understand what’s going to come next and how to set those expectations.
Q: Can you talk a little bit about exporting review data from Jama Connect Review Center?
Joel: This is something we've been focusing on a lot recently, and for a particular reason. We want to be able to show that Review Center could be where you put your electronic signatures, and as a result of that, we've committed to certain things.
One is that we need to be able to get the information about what took place in the review out of the system. A review is a special place. It's where all the collaboration happens. Depending on how much of that collaboration you want to share out, we have different types of exports for you.
We have an activity history export. That’s the audit trail. It’s just the facts: everything that’s happened in the course of all the versions of a review gets exported. That’s something that you could put in your document repository, you can send it to somebody. You can send that to Word or create a PDF.
In addition to that, we have an export that looks at a particular version of a review. It's like, "Give me the content. Give me the people that signed and when they signed. Give me the roles those people took as part of signing that thing, and then all of the comments that happened." And then we put those comments out in a threaded style, almost like a blog post, so that you can follow along quickly and understand what took place in that review.
Q: What’s the one thing that people will take away from this webinar if they watch the full thing?
Joel: When a customer moves to Jama Connect in the first place, they need to have a heart-to-heart and say, "What are we trying to do? How often should we talk?" That's really where moving some of this content to an online medium becomes important. You can't have all the conversations in Jama Connect, but you can have many of them within the platform. How do you want to have conversations? How do you want to approve things? When do you want to approve them?
No tool is going to tell you the best way to manage your company. You have to think about that before you move in. There are a lot of choices, but once you make those choices, the idea is that the conversation gets easier.
Learn how to get the most from Jama Connect Review Center by downloading our best practices guide.