Transforming Requirements Engineering with AI to Enhance Clarity, Consistency, and Scalability
As systems grow more complex, traditional processes struggle to keep up, ultimately impacting requirements quality. AI can assist in processing the sheer volume of data, enhancing clarity, consistency, and scalability across workflows.
Join Katie Huckett, Product Line Manager for Advisor/AI at Jama Software, for an exclusive webinar exploring how AI is becoming an essential cognitive amplifier in requirements engineering. Discover how AI is redefining the way teams detect ambiguity, surface hidden conflicts, and maintain alignment at scale.
What You’ll Learn:
Understand why requirements quality is declining under modern system complexity.
Learn the hidden costs of poor requirements and why traditional practices fall short.
Discover how AI amplifies cognitive processing and improves requirements quality.
Explore practical steps for adopting AI in your engineering workflows.
Gain insights into the future of requirements engineering with AI.
The video below is a preview of this webinar. Click HERE to watch it in its entirety.
WEBINAR TRANSCRIPT PREVIEW
The Collapse of Requirements Quality Under System Complexity – How AI Can Help
Katie Huckett: Welcome, and thanks for joining. Today we’re going to talk about something many engineering organizations are experiencing, but rarely say out loud. Requirements quality is collapsing under the weight of modern system complexity. This session isn’t about tools, features, or automation for automation’s sake. It’s about why this problem exists, why traditional fixes are no longer sufficient, and why AI is becoming a necessity rather than a nice-to-have in requirements engineering.
My name is Katie, and I lead product strategy focused on AI-driven capabilities in requirements management. I spend most of my time working with engineering teams in highly regulated, complex industries: aerospace and defense, automotive, medical devices, and other domains where requirements quality is not optional. What I’m sharing today is based on what those teams are actually struggling with in practice, not theory.
Here’s how we’ll spend our time together. We’ll start by looking at why requirements quality is breaking down despite increased process maturity. We’ll talk about the hidden costs of complexity and why traditional approaches no longer scale. Then we’ll look at how AI changes what’s possible, not as a replacement for engineers, but as a cognitive amplifier. And finally, we’ll discuss what this shift means for engineering organizations moving forward. We’ll have a brief Q&A portion before we conclude today. Let’s dive in.
Here’s the paradox we’re living in. Requirements practices are more mature than they’ve ever been. Teams have invested heavily in process, tooling, standards, and governance, and yet many organizations are seeing more rework, more late stage surprises, and more friction between teams than before. What’s important here is that this isn’t happening because teams stopped caring about quality. It’s happening because the nature of the systems we’re building has changed faster than the way we manage requirements. In other words, the rules of the game changed, but most practices did not.
Modern products are no longer confined to a single domain. A single system now routinely spans software behavior, physical components, data flows, safety constraints, regulatory requirements, and operational considerations. All of these elements evolve together, often on different timelines and often with different teams responsible for each part. As systems scale and change in parallel, the number of relationships between requirements increases dramatically, not linearly. And yet, many traditional approaches still assume that these relationships can be reasoned through manually during periodic reviews or checkpoints. The challenge isn’t capability or commitment. It’s that the structure of the work itself has fundamentally changed.
Huckett: Before we go further, I want to ground this discussion in your experience. We’re going to launch a poll. Please take a moment to answer honestly. What is the biggest contributor to requirements quality issues in your organization?
Looks like we have the results in. In nearly every organization I work with, the answer is rarely just one of these. These challenges stack on top of each other, and that compounding effect is exactly what overwhelms traditional requirements practices.
Traditional requirements practices were built for a world where change was slower, and systems were more predictable. Reviews happened at defined milestones. Documents were relatively stable. Dependencies were fewer and easier to reason about. Today, however, requirements are changing continuously, often across teams working in parallel. When you apply periodic document-centric review models to this environment, gaps are almost inevitable. The process itself isn’t wrong. It’s just being asked to operate outside the conditions it was designed for.
It’s important to say this clearly. This is not a lack of skill problem. It’s not a lack of effort problem. It’s not a lack of accountability problem. It’s a structural mismatch between human cognitive limits and the complexity of modern systems.
One of the most dangerous things about requirements quality issues is that they rarely fail loudly. A single ambiguous requirement doesn’t stop a project. It quietly creates multiple interpretations. Those interpretations propagate into design decisions, test cases, and validation activities. By the time the issue is discovered, multiple teams have already invested time and effort based on different assumptions. And at that point, the cost isn’t just fixing the requirement. It’s undoing everything that was built on top of it.
Huckett: Let’s do another quick poll. Where do requirements quality issues most often surface too late in your lifecycle?
Some interesting results here. Wherever this shows up in your lifecycle, the pattern is consistent. Humans don’t see the issue until it’s already costly. That’s not a vigilance problem, that’s a visibility problem. When quality issues surface, the instinctive response is to add more safeguards. That means more reviews, more sign-offs, more documentation. The problem is that these measures increase effort without increasing visibility. Teams end up spending more time checking artifacts, but not necessarily improving quality or alignment. In highly complex systems, quality doesn’t improve by adding friction. It improves by improving signal.
This is where AI fundamentally changes the equation. AI doesn’t get tired. It doesn’t lose focus. It doesn’t skip over sections because a document is long or familiar. It can continuously scan requirements, compare them, and look for patterns or anomalies across the entire system. That doesn’t replace human expertise. It supports it by ensuring that engineers are spending their time where judgment actually matters. In that sense, AI becomes part of the engineering infrastructure rather than a separate tool.
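To make the idea of continuous scanning concrete, here is a minimal rule-based sketch of an ambiguity check over requirements text. Real AI-assisted tools use trained language models rather than a fixed word list; the weak-term list and function name below are illustrative assumptions, not a description of any specific product.

```python
import re

# Illustrative weak/ambiguous phrases; production checkers use far richer
# linguistic rules or statistical language models.
WEAK_TERMS = [
    "as appropriate", "user-friendly", "fast", "easy", "etc",
    "approximately", "if possible", "and/or", "tbd",
]

def flag_ambiguity(requirement: str) -> list[str]:
    """Return the weak or ambiguous phrases found in one requirement."""
    text = requirement.lower()
    return [t for t in WEAK_TERMS
            if re.search(r"\b" + re.escape(t) + r"\b", text)]

reqs = [
    "The system shall respond to operator input within 200 ms.",
    "The UI should be fast and user-friendly, if possible.",
]
for r in reqs:
    print(flag_ambiguity(r))
# []
# ['user-friendly', 'fast', 'if possible']
```

Unlike a periodic review, a check like this runs on every requirement, every time it changes, which is the "continuous scanning" property the talk describes.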
2026 Predictions for AECO: AI, Digital Twins, and the Path to Sustainable Transformation
As we step into 2026, the Architecture, Engineering, Construction, and Operations (AECO) industry is poised for a transformative leap. From the integration of AI and digital twins to the adoption of robotics and advanced materials, the sector is embracing innovation to tackle its most pressing challenges: sustainability, efficiency, and collaboration in a hybrid world.
This year’s predictions explore how emerging technologies like generative design, predictive analytics, and automation are reshaping the project lifecycle. We’ll dive into the role of advanced digital tools in achieving net-zero goals, the growing importance of cybersecurity in a connected ecosystem, and the long-term trends that will define the industry for years to come.
In part six of this year’s predictions series, we bring these insights to life with perspectives from Jama Software’s own AECO experts: Joe Gould – Senior Account Executive, and Michelle Solis – Associate Solutions Architect, who share their vision for the future. From AI-driven decision-making to the rise of modular construction and lifecycle optimization, this piece highlights the innovations and strategies that will shape 2026 and beyond.
Curious to read leading thought leaders’ predictions for their industries in 2026 and beyond? Dive into each blog below and stay tuned for part 6, the finale of this year’s series:
What specific emerging technologies (e.g., AI, digital twins, generative design, robotics) do you believe will have the most transformative impact on the AECO industry in the next five years? How can firms prepare to adopt and integrate these technologies effectively?
Joe Gould: AI and Machine Learning will become foundational across the entire project lifecycle.
Design & Planning: AI accelerates generative design by evaluating thousands of options against constraints like cost, performance, and sustainability—helping teams reach optimized solutions faster.
Predictive Insights: By analyzing large datasets, AI can forecast risks, schedule impacts, cost overruns, and potential failures, enabling earlier and more informed decisions.
Workflow Automation: Routine tasks such as data entry, document review, and quantity takeoffs are increasingly automated, allowing teams to focus on higher-value, strategic work.
Digital Twins extend these capabilities into operations.
Operational Optimization: Real-time digital replicas of assets enable continuous monitoring and simulation, improving energy performance, asset utilization, and long-term operating costs.
Predictive Maintenance: Simulating asset behavior under different conditions helps identify issues before failure, reducing downtime and extending asset life.
Collaboration: A shared, real-time data environment ensures all stakeholders are aligned on the most current information throughout the asset lifecycle.
Robotics and Automation have been moving from experimentation to real jobsite adoption.
On-Site Execution: AI-enabled robotics handle repetitive and high-risk tasks with greater precision and safety.
Autonomous Equipment: Drones and self-operating machinery are increasingly used for surveying, inspections, and material movement, improving efficiency while reducing labor constraints.
Sustainability and Net-Zero Goals
With the AECO industry under increasing pressure to meet sustainability and net-zero targets, what role do you see advanced software, materials innovation, and digital tools playing in achieving these goals? Are there specific technologies or strategies you think will lead the way?
Gould: Important question! Advanced digital tools allow teams to understand and manage environmental impact early in the process, long before construction begins.
At the core is Building Information Modeling (BIM), which provides a data-rich model that supports ongoing analysis of energy performance, material use, and constructability as designs evolve. Energy modeling and simulation extend this by forecasting real-world performance early, allowing teams to optimize efficiency and integrate renewables before decisions are locked in.
AI and machine learning add another layer by analyzing large datasets to improve decision-making, optimize resources, and surface risks earlier. Generative design helps teams evaluate thousands of design options that balance sustainability, cost, and performance. Digital twins, fed by real-time sensor data, carry this forward into operations—enabling predictive maintenance, smarter energy management, and continuous performance optimization over the life of the asset.
Life-cycle assessment tools tie it all together by informing material choices based on embodied carbon and long-term environmental impact, not just upfront cost.
Materials innovation focuses on reducing embodied carbon and supporting a more circular approach to construction.
This includes a shift toward low-carbon materials such as mass timber, green steel, and advanced concrete alternatives, along with greater use of recycled and reusable content. High-performance insulation and composites further improve operational efficiency by reducing long-term energy demand while maintaining durability and performance.
The real impact comes from integrating these tools into a single, data-driven approach—connecting design, construction, and operations.
Key strategies:
Data-driven decarbonization, using reliable project data for transparent reporting and continuous optimization
Prefabrication and modular construction, reducing waste, emissions, and schedule risk
Circular design principles, enabling reuse and recovery at end of life
Predictive maintenance, extending asset life and reducing long-term operational waste
By aligning digital tools, materials innovation, and lifecycle thinking, the industry can move beyond incremental gains and make measurable progress toward net-zero and long-term sustainability goals.
As hybrid and remote work models continue to evolve, how do you see these changes impacting collaboration, innovation, and project delivery in the AECO industry? What tools or processes will be critical for maintaining efficiency and creativity?
Gould: Hybrid and remote work are reshaping AECO, driving efficiency, expanding access to talent, and accelerating digital adoption—but they require more discipline around how teams collaborate and deliver work.
Collaboration has shifted from informal to intentional. Cloud-based platforms, shared models, and virtual design reviews are now standard, enabling distributed teams to stay aligned without being co-located. Innovation hasn’t slowed—it’s evolved. Access to broader talent pools and increased automation of routine tasks allow teams to spend more time on higher-value problem-solving.
From a delivery standpoint, hybrid models often reduce cycle times and costs. Work continues across time zones, travel is minimized, and documentation improves because communication has to be clearer by default.
Success in this environment depends less on tools alone and more on how they’re used. Cloud BIM, collaboration platforms, and project management systems form the backbone, but clear communication norms, standardized workflows, and outcome-based accountability are what keep teams productive.
To me, the shift isn’t about where people work—it’s about building repeatable, digital-first processes that support speed, clarity, and consistent project outcomes.
AI and Automation
How do you foresee AI and machine learning shaping decision-making, risk management, and project optimization in AECO? What are the biggest challenges or limitations the industry might face in scaling these technologies to automate processes?
Michelle Solis: While AI itself will make an impact on AECO companies, one additional area where we will see impact is in building the infrastructure to handle the increase of AI usage across all industries. This will mean more jobs, job sites, data centers, and projects.
Gould: AI and machine learning are shifting AECO from reactive to proactive. When applied well, they improve decision-making, surface risk earlier, and optimize how projects are planned, built, and operated.
AI helps teams make better decisions by analyzing large volumes of historical and real-time data—highlighting patterns and risks humans typically miss. Generative design accelerates this by evaluating thousands of options against constraints like cost, performance, and sustainability. On the risk side, predictive analytics and real-time monitoring help identify schedule, cost, and safety issues before they escalate. AI also drives operational gains through task automation, smarter maintenance planning, and more resilient supply chains.
The challenge isn’t the technology—it’s scaling it. Most AECO firms struggle with fragmented data, limited system integration, and inconsistent standards. There is also a real skills gap and natural resistance to changing long-standing workflows. Add in high upfront costs, unclear use cases, unclear ROI, and legitimate concerns around data privacy and accountability, and adoption slows quickly.
The opportunity is real, but success depends on getting the fundamentals right: clean data, integrated systems, clear ownership, and practical use cases that tie directly to project and business outcomes.
Responsible AI Adoption
As AI and machine learning become more integrated into AECO workflows, what challenges or considerations should companies be mindful of to ensure successful implementation? How can firms address these challenges while maximizing the benefits of these technologies?
Gould: AI adoption in AECO isn’t a technology problem—it’s a fundamentals problem. Success depends on data, people, and how firms manage change.
Most organizations struggle with fragmented data, legacy systems, and limited AI-ready skills. Add natural resistance to new workflows, unclear ROI, and concerns around data security and accountability, and progress stalls quickly.
The path forward is straightforward:
Get the data right: standardize, govern it, and make it accessible
Upskill teams: treat AI as a productivity multiplier, not a replacement
Start small: focus on high-impact pilots that prove value fast
Modernize platforms: move toward cloud-based, integrated systems
Keep humans in the loop: clear ownership, transparency, and oversight matter
Firms that focus on these basics will scale AI effectively—and turn experimentation into measurable business outcomes.
Data-Driven Project Management
With the growing emphasis on predictive analytics, real-time monitoring, and data-driven decision-making, what strategies would you recommend for AECO firms to better harness data for optimizing project outcomes and resource allocation?
Gould: To use data effectively, AECO firms need to focus less on dashboards and more on fundamentals: integrated systems, clean data, and teams that actually trust and use it.
That starts with moving off siloed tools and spreadsheets and into cloud-based, integrated platforms that create a single source of truth across design, delivery, and operations. Strong data governance—clear ownership, standards, and quality controls—is non-negotiable. Without clean, consistent data, analytics don’t matter.
From there, predictive analytics should be embedded directly into project workflows, not buried in reports. Tracking the right KPIs and using data to flag schedule, cost, safety, and resource risks early shifts teams from reactive to proactive.
Finally, this only works if people are brought along. Start small with high-impact use cases, involve field teams early, and invest in basic data literacy, so insights drive decisions—not just meetings.
What upcoming regulatory changes or compliance requirements do you anticipate having the biggest impact on the AECO industry in 2026? How can companies stay ahead of these changes?
Gould: The biggest regulatory shifts hitting AECO in 2026 will center on ESG (Environmental, Social, and Governance), energy performance, and digital risk. ESG reporting is moving from “nice to have” to mandatory, with climate disclosure requirements cascading through supply chains. Energy codes will continue tightening, pushing firms toward higher-performance, low-carbon, and “zero-ready” buildings. At the same time, increased use of AI and cloud platforms is driving new expectations around transparency, governance, and cybersecurity.
The firms that stay ahead won’t treat this as a compliance exercise. They’ll lean on digital platforms to track energy, carbon, and materials from design through operations, put clear AI and data governance in place, and strengthen cybersecurity practices as reporting requirements tighten. Just as important, they’ll build regulatory awareness into project planning early—before requirements show up as cost, schedule, or risk surprises.
Cybersecurity in AECO
As digital tools and connected systems become more prevalent in AECO, what role do you see cybersecurity playing in protecting sensitive project data and ensuring operational continuity? Are there specific threats or solutions companies should prioritize?
Solis: As digital tools, connected platforms, and AI become more embedded in AECO workflows, cybersecurity will play a critical role in protecting sensitive project data and maintaining operational continuity. With the growing use of AI, firms must clearly define what data can and cannot be shared with AI models, particularly when working with proprietary designs, client information, or critical infrastructure data.
Beyond data leakage, organizations also need to address risks such as AI hallucinations, bias, and model misuse, which can directly impact design decisions, safety, and compliance if left unchecked. To mitigate these risks, companies should prioritize strong access controls, data governance policies, employee training, and secure AI deployments. Establishing clear guidelines around AI use, along with continuous monitoring and validation of outputs, will be essential to ensuring both cybersecurity and trust in digital systems as adoption accelerates.
Future of Innovation
What is the most innovative trend, tool, or process you’ve seen in the AECO industry recently? How do you anticipate it influencing the industry in the coming years?
Solis: One of the most impactful trends I’ve seen recently is the increased focus on Requirements Management across rail and broader AECO organizations. While this shift is often driven by hard lessons such as losing a contract or discovering unmet requirements late in a project, it signals a growing recognition that informal or disconnected requirement processes are no longer sustainable for complex, regulated projects.
Gould: The most meaningful innovation in AECO is the convergence of AI, digital twins, and integrated platforms. Together, they’re turning projects into connected, data-driven systems that move teams from static modeling to prediction, automation, and lifecycle optimization.
At the center is the digital thread. Requirements are no longer buried in PDFs and spreadsheets—they’re connected directly to BIM, schedules, costs, and real-time performance data. AI continuously validates designs against requirements, flags deviations early, and maintains traceability from concept through operations. That shift alone reduces rework, misalignment, and late-stage surprises.
AI-powered digital twins then extend this into delivery and operations, keeping stakeholders aligned and enabling smarter, faster decisions. The result is leaner execution, better compliance, and assets that actually perform as intended—not just on day one, but over their full lifecycle.
Long-Term Trends
What trends or technologies do you think will still be shaping the AECO industry five years from now? Ten years? How can companies position themselves to remain competitive in the long term?
Solis: I don’t think there’s one specific technology that will shape the AECO industry. Companies that make an effort to welcome new technologies rather than resist them will see success. This industry doesn’t want to evolve, but it will.
Gould: Over the next 5–10 years, AECO will be defined by digital maturity and industrialization. AI, BIM, and digital twins will move from tools to core infrastructure, while sustainability and offsite construction become standard, not optional.
In the next five years, BIM becomes the project command center—fully cloud-based and connected to schedule, cost, and lifecycle data. AI is embedded in planning and design to surface risk early, optimize decisions, and improve predictability. Modular and offsite construction scale quickly as firms respond to labor constraints and schedule pressure. Sustainability shifts from “nice-to-have” to a requirement.
It’s hard to say, but looking ten years out, I would predict that digital twins manage assets end-to-end, robotics handle more field execution, and buildings operate as connected systems within smart cities. Design, construction, and operations blur into a continuous, data-driven lifecycle.
The firms that win will invest early in integrated platforms, clean data, and workforce upskilling. They’ll focus on collaboration, specialization, and strong technology partnerships—turning digital capability into real project outcomes, not just innovation theater.
Engineering for the Cyber Resilience Act: Navigating Compliance Across the Product Lifecycle
Preparing for the Cyber Resilience Act: What Engineering Teams Need to Know Now
The EU Cyber Resilience Act (CRA) is setting new expectations for digital product development. It introduces mandatory requirements for vulnerability management, secure-by-design engineering, traceability, and post-market monitoring. For manufacturers of connected or software-enabled products, this represents a critical shift in how you build, document, and maintain your technology.
In this webinar, Patrick Garman, Manager of Solutions & Consulting at Jama Software, breaks down the complexities of the CRA, reviews enforcement timelines, and demonstrates how to integrate cybersecurity directly into your product lifecycle.
What You’ll Learn:
Deconstruct CRA Requirements: Gain a clear understanding of obligations for manufacturers, importers, and distributors, including secure development practices and vulnerability handling.
Operationalize Secure-by-Design: Learn practical strategies to embed security into your engineering workflows from day one.
Master Software Bill of Materials (SBOM) Transparency & Traceability: Discover how to maintain the rigorous documentation and traceability the new regulation demands.
Navigate the Enforcement Timeline: Get a clear view of upcoming deadlines to help you prepare your organization strategically.
Leverage Jama Connect® for Compliance: Explore how a modern requirements management tool helps track threats, link mitigations to requirements, integrate testing, and prove compliance.
Don’t wait until the deadline approaches to address these critical changes. Watch now to ensure your team has the knowledge and tools to navigate the CRA successfully.
The video above is a preview of this webinar – Click HERE to watch it in its entirety!
TRANSCRIPT PREVIEW
Patrick Garman: Hi, everyone, and thank you for joining today. My name’s Patrick Garman, and I am the Solutions Manager for Energy, Industrial, and Consumer Electronics sectors here at Jama Software. Today, I’m going to be talking about the EU’s Cyber Resilience Act, or the CRA. I’ll explain what the CRA actually is, what it means for product developers, and how you can show evidence of secure by design without creating unnecessary overhead. I’m also going to briefly show how Jama Connect supports your CRA compliance. At a high level, the Cyber Resilience Act is an EU regulation that applies to products with digital elements, so hardware with software, firmware, or connectivity, and standalone software products as well. It’s not a technical standard, and it does not tell you how to implement security; it focuses on outcomes. Did you consider cybersecurity risks? Did you define mitigations? Can you show how those were implemented and maintained? It’s also worth saying what it’s not. It’s not saying that products must be perfectly secure, and it’s not trying to turn product teams into security researchers. It’s really about making cybersecurity part of normal product engineering, just integrating it into your process.
And the motivation behind the CRA is pretty straightforward: products today rely heavily on software, but cybersecurity practices across manufacturers vary a lot. Some teams are very disciplined, and others rely more on informal knowledge and experience. From a regulatory point of view, that makes it hard to assess product risk and hard to respond when vulnerabilities show up later, so the CRA is really about creating a consistent baseline, so cybersecurity is treated more like safety, reliability, or quality, something you design for, document, and revisit throughout the product lifecycle. And the penalties can be pretty stiff. Fines for non-compliance can reach up to 15 million euros or 2.5% of your global annual turnover. Products can be barred from the EU market for non-compliance. It does include mandatory incident reporting, and it also establishes liability for manufacturers for unsafe or insecure products, so it is something that is very important to prepare for and be ready for. If you strip away the legal language, the CRA requirements really fall into a few practical buckets. First, you’re expected to identify cybersecurity risks that are relevant to your product and how it’s used.
Garman: Second, those risks should lead to actual security requirements, design constraints, controls, or behaviors that mitigate the risks. Third, there needs to be evidence, not just that you thought about security, but that the requirements were implemented and verified. And finally, the CRA expects manufacturers to manage vulnerabilities after release, things like intake, assessment, updates, and communication. And the challenge is doing it consistently and in a way that you can explain later, especially if this information is spread across different repositories. Before I jump into a demo in Jama Connect, I want to set up how to think about CRA compliance in Jama Connect. The CRA is ultimately asking for something pretty specific, can you prove a clean line from the cybersecurity risk to mitigation to verification, and then keep that story intact as the product changes? And Jama Connect’s a great tool for this because it’s designed for exactly this kind of lifecycle traceability with definable traceability information models that provide guardrails for your process. And the model I’m showing here, threats must link to one or more security requirements, and security requirements must link to verification evidence like test cases or analysis.
And if we want to go deeper, we can link into design and implementation artifacts as well. And the reason that this matters is that once these rules are in place, you’re not relying on memory or tribal knowledge. Jama Connect can guide teams towards consistent linking, and it becomes much easier to answer the questions that come up in audits and reviews, such as which risks are unmitigated, which mitigations aren’t verified, and what changed since the last release? And the other big benefit is change impact analysis. When a new vulnerability pops up or a design decision shifts, Jama makes it practical to see what requirements, tests, and releases are affected without manually stitching it together across documents and spreadsheets. With that framing, what I’ll show next is a simple example. We’ll take a threat and author a requirement against it, and then see the verification evidence, so you’ll see how the relationship rule set keeps the traceability clean and reviewable. For this demo, I’m going to keep the model intentionally simple. We’re going to start with a cybersecurity threat analysis, trace that to a security requirement, and then to a validation.
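The audit questions above can be answered mechanically once the trace links are data. Here is a minimal sketch of that idea in plain Python; the item IDs and dictionaries are hypothetical stand-ins for a traceability model, not Jama Connect’s actual API.

```python
# Downstream links in a simple trace model: each threat should trace to
# at least one security requirement, and each security requirement to at
# least one piece of verification evidence. All IDs are made up.
threat_to_reqs = {
    "THREAT-1": ["SECREQ-10"],
    "THREAT-2": [],               # no mitigation linked yet
}
req_to_evidence = {
    "SECREQ-10": ["TEST-101"],
    "SECREQ-11": [],              # mitigation defined but never verified
}

# Two of the audit questions, answered by walking the links:
unmitigated_threats = [t for t, reqs in threat_to_reqs.items() if not reqs]
unverified_reqs = [r for r, ev in req_to_evidence.items() if not ev]

print(unmitigated_threats)  # ['THREAT-2']
print(unverified_reqs)      # ['SECREQ-11']
```

The point of a relationship rule set is that gaps like `THREAT-2` and `SECREQ-11` surface as queryable facts rather than things someone has to remember to check.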
Garman: And in this scenario, I’m going to use the CVSS, which stands for the Common Vulnerability Scoring System, the 3.1 model, to score severity consistently. CVSS is traditionally used for vulnerabilities, but teams often use that same scoring structure for threat scenarios because it is familiar and repeatable. And I have a pre-created threat analysis item so that we can focus on the traceability aspects. But here you can see I have a place where I can provide a name, a description of the threat or vulnerability, and also select all of the appropriate vectors within the CVSS scoring model. And I’m also using Jama Connect Interchange™‘s Excel functions to calculate the base score and assign a severity rating, along with the temporal score and environmental score. Again, these are all calculated automatically on the backend as you define your threat vectors. And the reason I like capturing all of these attributes here in Jama Connect is it makes the assumptions explicit. Stakeholders can review the score, disagree with it, and adjust it, but we’re not hand-waving severity. And because it’s all on the same system as our requirements and validations, the cybersecurity story stays connected.
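For readers unfamiliar with the scoring model mentioned above, here is a sketch of the CVSS v3.1 base-score arithmetic. The weights and formulas come from the public CVSS v3.1 specification; the function names are our own, and this is an illustration of the math, not the calculation engine used in the demo.

```python
# Base-metric weights from the CVSS v3.1 specification.
AV = {"N": 0.85, "A": 0.62, "L": 0.55, "P": 0.20}   # Attack Vector
AC = {"L": 0.77, "H": 0.44}                          # Attack Complexity
PR_U = {"N": 0.85, "L": 0.62, "H": 0.27}             # Privileges Required (Scope Unchanged)
PR_C = {"N": 0.85, "L": 0.68, "H": 0.50}             # Privileges Required (Scope Changed)
UI = {"N": 0.85, "R": 0.62}                          # User Interaction
CIA = {"H": 0.56, "L": 0.22, "N": 0.0}               # Confidentiality/Integrity/Availability

def roundup(x: float) -> float:
    """CVSS 3.1 'Roundup': ceiling to one decimal, avoiding float error."""
    i = int(round(x * 100000))
    return i / 100000.0 if i % 10000 == 0 else (i // 10000 + 1) / 10.0

def base_score(av, ac, pr, ui, scope, c, i, a) -> float:
    iss = 1 - (1 - CIA[c]) * (1 - CIA[i]) * (1 - CIA[a])
    if scope == "U":
        impact = 6.42 * iss
    else:  # Scope Changed
        impact = 7.52 * (iss - 0.029) - 3.25 * (iss - 0.02) ** 15
    pr_weight = (PR_U if scope == "U" else PR_C)[pr]
    exploitability = 8.22 * AV[av] * AC[ac] * pr_weight * UI[ui]
    if impact <= 0:
        return 0.0
    raw = impact + exploitability if scope == "U" else 1.08 * (impact + exploitability)
    return roundup(min(raw, 10))

# Network-exploitable, low complexity, no privileges or interaction,
# high impact across C/I/A (vector AV:N/AC:L/PR:N/UI:N/S:U/C:H/I:H/A:H):
print(base_score("N", "L", "N", "N", "U", "H", "H", "H"))  # 9.8 (Critical)
```

Encoding the formula like this is what makes the severity assumptions explicit: every input vector is visible, so stakeholders can review and adjust a score rather than debating a number of unknown provenance.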
Requirements Elicitation: A Step-by-Step Approach to Defining the Right Requirements
The success of any new product or project hinges on a simple yet challenging task: collecting requirements. When done well, in a carefully controlled process that lives up to the more apt name of eliciting requirements, it leads to a product or project that meets everyone’s expectations. When done poorly, in a haphazard manner, it results in costly rework, missed deadlines, and a final delivery that fails to satisfy anyone.
The process of gathering input from a diverse group of stakeholders—each with their own priorities and perspectives—poses multiple risks. Time and costs can quickly spiral, and the danger of missing a critical requirement is ever-present. This article explores the basics and benefits of following a systematic process for requirements elicitation.
The High Cost of Unstructured Requirements Collection
Product and project leads are under pressure to get requirements complete before anything else begins. Without a systematic process designed to ensure intended outcomes, project or program success is exposed to these significant risks:
Wasted Time and Resources: Ad-hoc soliciting, eliciting, tracking, and organizing requirements in documents and spreadsheets is incredibly time-consuming and prone to error. This inefficiency directly translates to higher project costs and slower time-to-market.
The Risk of Missing Requirements: A disorganized process makes it easy for critical requirements to fall through the cracks. Discovering these gaps late in the development cycle leads to expensive changes and frustrating delays.
Incomplete Stakeholder Input: Failing to identify and engage all relevant stakeholders—from internal teams like Sales and Product Management to external parties like customers and partners—can result in a product that is misaligned with market needs or technical constraints.
The key takeaway: An ad-hoc approach to collecting requirements is not just inefficient; it’s a direct threat to your project’s success.
How to Systematically Elicit Requirements: A 5-Step Process
To mitigate these risks, adopt a structured approach. These steps will help you gather, organize, and track requirements with greater clarity and efficiency.
Step 1: Define the Product or Project Scope and Objectives
Before you elicit a single requirement, ensure everyone has a shared understanding of the goals. What problem are you trying to solve? Who are the users, and what are their priorities? What does success look like? What industry or corporate standards will require documentation to demonstrate compliance?
A clear project charter or vision document is essential for keeping all subsequent requirements aligned with the core objectives. This document should be a living resource, regularly revisited, and carefully updated in a controlled manner based on learning throughout the process.
Step 2: Identify and Map Your Stakeholders
A stakeholder is anyone with an interest in or influence on your product or project. Missing input from a key stakeholder is a common point of failure. The lists below cover common stakeholder groups but are not exhaustive.
Internal Stakeholders: Teams such as Sales, Product Management, and other internal functions.
External Stakeholders: Customers, end-users, suppliers, partners, and regulatory bodies.
Create a stakeholder map to categorize individuals and groups based on their level of influence and interest. This helps you prioritize engagement and tailor your communication strategy.
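One common form of stakeholder map is a 2x2 influence/interest grid. The sketch below is a hypothetical illustration: the stakeholder names, scores, and the 0.5 threshold are assumptions for demonstration, not prescribed values.

```python
# Hypothetical influence/interest stakeholder map (classic 2x2 engagement grid).
def quadrant(influence, interest, threshold=0.5):
    """Classify a stakeholder by influence and interest, each scored 0..1."""
    if influence >= threshold and interest >= threshold:
        return "Manage closely"
    if influence >= threshold:
        return "Keep satisfied"
    if interest >= threshold:
        return "Keep informed"
    return "Monitor"

# Illustrative stakeholders with assumed (influence, interest) scores.
stakeholders = {
    "Regulatory body": (0.9, 0.3),
    "Lead customer":   (0.8, 0.9),
    "Sales team":      (0.4, 0.8),
    "Supplier":        (0.3, 0.2),
}
for name, (influence, interest) in stakeholders.items():
    print(f"{name}: {quadrant(influence, interest)}")
```

The quadrant then drives the engagement strategy: high-influence, high-interest stakeholders get the most direct collaboration, while low/low stakeholders are simply monitored for changes.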
Step 3: Choose Your Elicitation Techniques
There is no one-size-fits-all method for collecting requirements. Use a mix of techniques to gather comprehensive information:
Interviews: One-on-one conversations are great for understanding individual needs and complex details.
Observation: Ethnographic studies and usability analysis can expose current problems or identify opportunities that a product might solve, but that users and other stakeholders might not be able to see or articulate.
Focus Groups: Facilitated group sessions are effective for brainstorming, resolving conflicts, and building consensus among stakeholders.
Surveys: Use questionnaires to gather input from many stakeholders efficiently, as long as the questions are worded to avoid injecting bias and the responses are interpreted carefully.
Document Analysis: Review existing business plans, market analysis, and technical specifications to extract relevant requirements.
All of these techniques are powerful but can be risky in the hands of inexperienced personnel.
Step 4: Document and Organize Requirements in a Centralized System
As you gather requirements, you must organize them in a way that is accessible, clear, and traceable. A scattered process makes it impossible to see dependencies, track changes, or ensure complete coverage.
The most important part of this step is moving away from manual methods and toward a single source of truth that applies a systematic approach and automation to maintain control and visibility.
Step 5: Review, Refine, and Validate
Collecting requirements is not a one-time event. It’s an iterative process, and work products can span generations of products and product lines. Once documented, requirements must be reviewed by stakeholders to ensure they are clear, accurate, and complete. This feedback loop is critical for refining the product or project definition and gaining formal sign-off before development begins.
Other Key Considerations
What is the difference between collecting, gathering, and eliciting requirements?
While often used interchangeably, “gathering” or “collecting” can imply simply accumulating information sitting around waiting to be picked up. “Eliciting” suggests a systematic and organized process of soliciting, documenting, and managing requirements from various sources to build a complete and validated set.
How can I ensure I haven’t missed any key stakeholders?
Start by brainstorming all possible groups and individuals affected by the project, both inside and outside your organization. Review past projects of a similar nature to see who was involved. A key practice is to ask the stakeholders you’ve already identified, “Who else should we talk to?”
What’s the biggest risk of a poor requirements collection process?
The biggest risk is building the wrong product. Missing or misunderstood requirements can lead to a final product that doesn’t meet customer needs or business goals, rendering the entire development effort a waste of time and money.
Can AI help speed up the process?
Yes, Generative AI can be useful in suggesting requirements and uncovering gaps in requirements already identified. Be prepared to store suggestions that are outside the scope of the current project for possible use in future ones.
To ensure that your process for eliciting requirements for complex products or projects goes smoothly, use a modern tool designed specifically for that purpose. Jama Connect® is designed to address the core pain points of requirements elicitation by providing a collaborative, single platform accessible to all your stakeholders from the start through the end of your project, as well as across product lines and product generations.
With Jama Connect, you can:
Centralize Everything: Create, review, validate, and verify all requirements in one place, eliminating the chaos of documents and spreadsheets.
Improve Stakeholder Collaboration: Bridge silos between teams and provide all stakeholders with real-time visibility into goals, progress, and interdependencies.
Enhance Requirement Quality: Use the Jama Connect Advisor™ add-on to author and analyze requirements for clarity and consistency against industry standards, including the EARS syntax. Natural language processing (NLP) helps you write better requirements from the start, avoiding ambiguity that leads to costly rework later.
Ensure Traceability: Easily track relationships between requirements, test cases, and risk analyses to understand the impact of any change.
Don’t let scattered documents and manual tracking derail your requirements elicitation activity. A systematic approach supported by the right tool is the key to developing complex products successfully.
Note: This article was drafted with the aid of AI. Additional content, edits for accuracy, and industry expertise by Mark Levitt and Sarah Crary Gregory.
2026 Predictions for Semiconductors: AI, Chiplets, and the Path to Sustainable Innovation
As we step into 2026, the semiconductor industry stands at the crossroads of unprecedented technological advancements and complex global challenges. From the rise of AI-driven chip design and heterogeneous integration to the growing emphasis on sustainability and geopolitical shifts, the sector is navigating a transformative era.
The next wave of innovation will be defined by breakthroughs in advanced lithography, chiplet architectures, and quantum computing, while sustainability efforts will reshape manufacturing processes to address energy efficiency, water usage, and materials recycling. At the same time, the industry faces critical hurdles, including talent shortages, supply chain realignments, and the need for robust cybersecurity measures.
In this year’s predictions series, we’ve gathered insights from leading semiconductor experts:
Together, they explore the trends and technologies shaping the future of semiconductors. From AI-driven automation and edge computing to the challenges of regulatory shifts and the promise of chiplet-based architectures, this piece highlights the innovations and strategies that will define 2026 and beyond.
1: Emerging Technologies
Q: What emerging technologies (e.g., advanced lithography, AI-driven chip design, quantum computing, heterogeneous integration) will have the most transformative impact on the semiconductor industry in the next five years?
Simon Bennett: In the next five years, the semiconductor industry will continue to grow, almost doubling in size from today to $1 trillion by 2030. But to sustain that growth, the industry will go through some extreme changes and challenges. The first trend to note is actually a declining one, as Moore’s Law continues to slow. [Editor’s note: Moore’s law is the observation that the number of transistors in an integrated circuit (IC) doubles about every two years.]
Moore’s Law has driven the growth of the semiconductor industry for many decades, but it is bumping up against the fundamental laws of physics. The economics of scaling to the next node are increasingly prohibitive and taking longer and longer to reach fruition.
Whilst keeping an eye on what is coming out of China, there will be some more mundane but equally challenging technology trends that are emerging and will become increasingly important in 2026 and beyond. These are AI-driven design, and both chiplet and wafer-scale designs (two opposite ends of the spectrum, but both an engineering reaction to the slowing of Moore’s Law).
Neil Stroud: Given the ever-increasing innovation around AI and its associated deployment, chip development is under continued pressure to keep up. This is applicable across all architectures, including Central Processing Units (CPUs), Graphics Processing Units (GPUs), and Neural Processing Units (NPUs). Naturally, continued optimization will happen around acceleration and emerging technologies like process node shrinks (advanced lithography), AI-driven chip design, and the chiplet approach (heterogeneous integration). Process node shrinks will contribute. However, the chiplet approach will also drive heterogeneity across architectures and nodes. All these factors will intimately impact the next generation of chip families for AI in the datacenter and at the edge.
2: Sustainability and Manufacturing Efficiency
Q: How do you see sustainability influencing semiconductor manufacturing, particularly in areas like energy efficiency, water usage, and materials recycling? What strategies will help the industry achieve greener fabrication processes?
Bennett: This is a great question, and right now, the elephant in the room. From Fabs to datacenters, the environmental impact is huge. Water consumption alone is a huge factor. Twenty years ago, visionary realtors quietly purchased acres of land close to a bountiful supply of water and close to a large data pipe. Those realtors are now wealthy, and the secret is out. Now the price of that land is at a premium. So, the investors behind the fabs and the datacenters are using government subsidies and their own funds to find alternative sources of energy and resources. Nuclear is making a comeback, driven in part by the energy demands of the datacenters. Municipal areas like Phoenix are making guarantees of plentiful water to companies to attract them to their region; that will put them in direct conflict with farmers in California.
Most of this is happening off the radar of the mainstream media, and the political arena is presented as a battle for the best jobs. The concern over the environmental impact is not yet front and center. Two events will likely happen to change this:
The AI bubble will inevitably burst. Just like in the early days of the internet, there will be a market correction as reality catches up to expectation. Just like the internet bubble, this doesn’t mean that AI is not going to be a societal change; it just means the market got too overheated.
Unfortunately, there will be some kind of accident related to the overbuild of the infrastructure around datacenters and fabs. A dam will burst (Phoenix – see Roosevelt Dam), or a multibillion-dollar fab will be damaged by a natural disaster (see fault lines in Taiwan). These two events will raise awareness of the environmental costs relating to sustainability and manufacturing efficiency.
In other words, in the next five years, we will be forced to take a pause, a breath, and truly measure the value vs the cost. This isn’t a bad thing. Our human history of technology transformations is punctuated with these pauses and resets. Usually for the better.
Steve Rush: Sustainability is hugely influential and important. Energy demand is forecast to accelerate with new data centers and the demand for AI. Semiconductor companies need a system to help manage their sustainability requirements and, very importantly, validate them. Implementation to hit targets and balance power, efficiency, and sustainability will be a series of trade-offs – semiconductor organizations will need a tool to trace all of this information and prove that they meet sustainability targets and goals.
Sarah Crary Gregory: While the semiconductor industry is obviously fiercely competitive, it can match that intensity with fierce collaboration on critical issues. Sustainability is probably the most prominent area where industry consortia such as the Semiconductor Climate Consortium bring companies together to tackle common problems. Initiatives to enable water reclamation, reduce emissions, and produce data quantifying the return on investment of sustainability practices will be more critical with the burden placed on these resources from the exponential expansion of AI. The semiconductor industry is highly interdependent, and nobody believes that there’s a way to get a competitive advantage by monopolizing natural resources. The way forward is through innovations that decrease resource consumption and minimize waste, and initiatives for water reclamation/”net zero” resource use will continue to be essential investments.
Stroud: I think there are two parts to this. Firstly, the environmental impact of actually building the chips in foundries. A huge amount of effort and investment has gone into sustainability in semiconductor manufacturing, including energy efficiency, water usage, and materials recycling. A great example of this is the massive recycling of water used in fab processes, as well as optimizing processes and the associated chemicals used, including minimizing atmospheric emissions.
Secondly, there is the environmental impact related to the deployment of the device itself, as it consumes power and emits heat. Of course, the extreme example of this is the data center where huge racks of GPUs or CPUs are deployed, collectively consuming Megawatts of power to both power them and cool them. Again, huge investment is going into driving data center efficiency. One way to contribute is through chip design optimization to improve ‘performance per Watt.’ That is simply a measure of how much computing can be done for a given Watt of power. This optimization can happen through design and architecture efficiencies as well as process node shrinks. Ensuring the software stack is also developed to drive efficient use of the underlying hardware platform also has a fundamental role to play. It’s easy to see that these steps can have a profound positive impact on the environment caused by the global electronics footprint.
3: AI-Driven Automation
Q: How is AI accelerating innovation in semiconductor design, verification, testing, and manufacturing? What challenges must companies overcome to fully leverage AI-driven automation?
Bennett: Natural language and agentic AI will continue to show up across the tool chain. But expect some resistance from SoC design engineers, who, ironically, given that they are at the epicenter of the AI revolution, are traditionally conservative and slow to adopt new methods. Verification is the area most in need of AI-driven automation, since there just aren’t enough engineers on the planet to meet the verification needs of an SoC (see salaries on Glassdoor). It’s been estimated that, with the use of AI, a team of three expert verification engineers can do the work of five traditional verification engineers with limited AI use, in three to five times less time. This is a compelling message to an open-minded engineering VP (see below for a caveat) struggling to find the resources to deliver a fully validated product on time. These engineers, and the tools they use, will be in high demand in the next five years.
Beyond design, AI will show up in yield and manufacturing analytics. The challenge of inventory and yield management is magnified in the era of disaggregated chiplet-based designs. It’s essential that all the chiplets deliver the yield and volume needed at the exact same time; the overall package is only as good as the weakest tile. This is an underserved opportunity within the big three EDA companies, and the packaging OEMs tend to jealously protect their homegrown investments in solving these challenges. Expect emerging startups to come forward as disruptors in this particular segment in 2026 and beyond.
Rush: Every company is looking for ways to utilize AI in their organization. AI can play an important role in managing traceability, especially from siloed systems that are isolated from one another. Agentic experiences that improve engineer productivity really are key. The main challenge that AI has in the semiconductor space, in particular, is adoption with the engineering team. AI experiences must improve engineering productivity; they must be accurate, and they cannot be an impediment to use. If AI-generated content is of questionable quality or if the AI experiences become too burdensome to use, AI initiatives risk dying on the vine.
4: Supply Chain and Geopolitical Shifts
Q: How are global supply chain realignments and geopolitical factors shaping semiconductor strategy? What can companies do to mitigate risk and ensure resilience in developing complex products on their own or with co-development partners?
Bennett: A global supply chain developed over the past thirty years has delivered $1T in cost savings. This $1T is now under serious threat, as the world is a very different place compared to when this globally interconnected environment was first conceived. In the next five years, expect China to become more self-sufficient as it replicates every aspect of what it previously relied on from overseas, from EDA to IP to fab equipment. Expect to see semiconductor-based products, from coffee machines to phones to servers to (even) EVs, sourced almost exclusively from China with little to no reliance on anything beyond its shores. This will trigger protectionist measures in the US and the EU as they work to protect homegrown industries from increasingly consumer-appealing products from Chinese factories.
A more optimistic view may be that the tensions ease as the US and EU recognize the need for open trade with China and continue to see their designs realized in Chinese factories (but I’m not holding my breath). In semiconductors, companies will be most susceptible to this shift in China as they move to homegrown alternatives. As the geopolitics ramp up, provenance in the West will command C-suite, US Senate, and EU Parliament levels of attention. Knowing where every component or piece of code originates (its genealogy) will become paramount. A counterforce will emerge where the information is “buried” as the realization hits that we can’t possibly trace the root of every bit of code, every nanometer of design. Companies will emerge with one of two unique value propositions: 1) we can audit your product and provide the provenance; 2) everything you use is contaminated; we are a new company, built cleanly from the ground up. Somehow, all three will survive – the traditional companies, the auditors, and the new “clean” companies. But there will be some very interesting mergers and acquisitions, mostly off the radar, as these three entities re-align and learn to co-exist.
Rush: These days, you can basically count on major geopolitical news covering the semiconductor industry week in, week out. At the end of the day, co-development and partnerships are key. The semiconductor supply chain is mind-bogglingly complex. Adopting modern, more collaborative tooling is on the rise. Historically, the semiconductor industry has even been hesitant to adopt cloud-based solutions, and I’ve definitely seen a change in the last few years around this.
Stroud: Like many other segments, the semiconductor market tends to be cyclic, which leads to times of undersupply and oversupply. This is a complex problem to manage, with many factors including global supply chain realignments and geopolitical shifts. Naturally, foundry capacity has a big role to play, and we seem to be in an investment phase right now with a number of fabs being built. This is a massive investment, with a modern fab costing tens of billions of dollars and taking multiple years from construction start to mass production. Communication and collaboration across the ecosystem also have a role to play, especially now that we are accelerating into the chiplet era, and can help mitigate risk and ensure resilience in developing complex products.
5: Chiplet and Heterogeneous Integration
Q: What role will chiplet architectures and heterogeneous integration play in addressing performance and scalability challenges? What technical and ecosystem hurdles must be overcome?
Bennett: Chiplets are essential to the continued growth of semiconductors. Without chiplets, the forecast CAGR ($1T by 2030) is unreachable (basic economics of Moore’s Law). The challenges are two-fold: 1) engineering challenges around interconnecting tiles from different suppliers running at high speed and with the thermal challenges of a modern chip; and 2) coherence – the coherence of the supply chain, compliance, and verification. More specifically, the emerging standards (e.g., Universal Chiplet Interconnect Express (UCIe) for interconnect) and system architectures need to be better governed if they aren’t going to become bottlenecks stymying growth.
6: Talent and Workforce Development
Q: With growing global demand for skilled engineers and manufacturing specialists, how can companies address the talent shortage in the semiconductor industry?
Bennett: This is where AI needs to step in and become more readily accepted within semiconductor engineering organizations. As stated above, studies show that a small team of AI-proficient verification engineers is 5x+ more productive than a traditional team. However, the resistance comes from within – engineers are conservative, and within a traditional engineering organization, managers, directors, and VPs still measure their worth by the number of engineers the corporation is willing to fund. This leads to destructive behaviors, such as a VP of Verification Engineering employing 100 RTL validation engineers to do the job that 10 functional verification engineers could do because “it’s too expensive to hire the functional verification engineers.” The companies that will thrive and succeed in the next five years are the ones that break down this cultural impasse.
Rush: There are a lot of talented people in the job market right now who can help fill the gap. Hopefully, semiconductor companies will look to hire talent from across industries – automotive, medical, and aerospace. There are certain challenges in getting enough skilled foreign workers to fill certain roles – I’m more concerned that there are many highly skilled, talented people out there looking for jobs!
7: Regulatory and Export Controls
Q: How do evolving export controls, trade policies, and security regulations impact semiconductor innovation and competitiveness? How can companies adapt strategically?
Bennett: They don’t impact semiconductor engineering innovation or competitiveness – in fact, they improve it. A case in point is China: as access to advanced GPUs and EDA tools was limited, they innovated, and actually improved on the technologies they didn’t have access to. Another example is the Russian engineers working for US companies prior to the war in Ukraine who were let go and went to work for Russian companies, helping boost the AI business in Russia. Where the question does apply is innovation at the corporate level. Engineering innovation can be stymied by a C-suite overly concerned about trade or political issues. The paradox is that smaller companies, with less of a global or political reach, could feel less compelled to avoid the risk associated with innovation.
Gregory: “Evolving” is an understatement! The volatility around export controls and trade policy in the United States right now is simply unprecedented, and 2026 looks like more of the same. Companies can strategically navigate these unsettled times by implementing systems – people, processes, and tools – that enable maximum response flexibility. Modular architectures, whether they’re chiplet-based, specific configurations of IP cores, highly modular software, or other building blocks, will enable the development and delivery of products whose configurations can be changed and modified as circumstances warrant. Variant management is a critical capability for swapping features in and out based on policy changes. Solid, well-governed data foundations will be critical to stay on top of the wildly shifting policy landscape.
8: Edge AI and High-Performance Computing
Q: As demand for edge AI and high-performance computing grows, what innovations are most critical to meet performance and power efficiency goals?
Bennett: There are many ways to answer this, but I’ll focus on the chip-level design aspect. First, the interconnect, as previously described – the clean adoption of UCIe and a strong governing body to oversee its evolution (think Universal Serial Bus, or USB). 3D packaging needs to keep up with the thermal demands of a heterogeneous package. This may lead back to the engineering talent pipeline previously discussed, since the engineers who have the combination of skills to analyze and design (and future-proof) these packages are unique (think warping of a substrate as it reacts to thermal pressures, leading to subtle interconnect issues manifesting as signal-integrity problems).
Rush: I’ll answer this more from a data-isolation perspective. Design and testing are really important, but more important is tracing all the way up to the highest level and to validation. I think responsible AI will help with efficiency here, but companies need a way to trace from the top down. In all honesty, this is a challenge for the semiconductor industry – having one single source of truth that can prove you’re hitting sustainability goals.
9: Cybersecurity and IP Protection
Q: With increasingly complex global supply chains, how can semiconductor companies protect intellectual property and secure their design-to-production ecosystems?
Bennett: Expect a lot more reference to initiatives such as the Software Bill of Materials (SBOM) and Engineering Bill of Materials (EBOM). Expect the concept of a Bill of Materials (BOM) to evolve and take on more significance in the next few years, and the term provenance to take on more importance. Traditional PLM companies will position themselves as the answer, but there will be significant pushback from the semiconductor industry, and rightly so – these PLM systems were never developed with semiconductors in mind. They are monolithic in nature, expecting the end user to move their data into their environments. The C-suite will sign on; the engineers won’t. This will lead to QMS and IT organizations emerging to manually clone the data inside the PLM systems. For a while, this will seem just fine, until one or more issues come to public light, and the C-suite exec realizes they have spent a lot of money on tools and resources and it didn’t solve the problem. Those companies that invested in a more lightweight, engineer-friendly solution, providing traceability, compliance, and coherence insights without the costly overhead of monolithic tools and the resources that go along with them, will grab the attention of those who lost out. And yes, AI will play a part. A well-managed digital thread with the ability to expose itself in a controlled manner to intelligent insights will win out.
Rush: I mentioned earlier that semiconductor companies are adopting more cloud-based tooling. But they are not slacking in terms of security needs. By selecting best-in-class tools with exceptional infosec track records (like Jama Connect), they are effectively balancing speed and agility with security, without sacrificing either. They are pushing their vendors to expand their tool sets to deliver best-in-class experiences with rational, scalable permission structures that are tightly governed. They’re looking for tools and vendors that put AI at the center of their vision – but need their vendors to offer closed, secure LLMs or integrations with in-house AI systems.
Stroud: This is not a new issue! The semiconductor industry has been wrestling with intellectual property protection and securing the design-to-production ecosystem for years. The challenge is how to build enough flexibility in the ‘fixed’ silicon that, when combined with software (across all layers), is able to guard against future exploits and vulnerabilities. It’s almost impossible to build a modern chip without multiple integrated security capabilities. Also, it’s worth noting that security has to be a multidimensional approach in this age of hyperconnectivity, spanning seamlessly from cloud to edge. This is why we see an ever increasing number of emerging security standards that apply to both implementation and development processes, impacting hardware, software, and system design and deployment.
10: Future Outlook
Q: What do you see as the most important technological and market shifts that will define the semiconductor industry five to ten years from now? How can companies position for sustained leadership?
Bennett: 1) Semiconductor technology: chiplets, and the packages that are needed to realize their promise to alleviate the decline of Moore’s Law. 2) Companies: a very different answer – the companies that will succeed in the future are those that completely obfuscate the hardware considerations from their customers: it’s all software; don’t worry about the hardware – we have taken care of that.
In summary, in some ways it’s the same old story – recognize and reward the unique engineering talent that helps differentiate your product, understand what the customer wants, and remove the barriers to growth. Sounds simple, right?
Rush: With AI, the amount of data that companies will manage is going to increase tremendously. Trying to manage that traceability is going to be extremely challenging. Jama Connect, with the new scaling improvements and AI vision, is at the forefront of the market and uniquely positioned to help semiconductor companies here.
Gregory: Agreed. AI is already reshaping the demand side of the market equation. The supply-side will evolve to support highly customized semiconductor design, even purpose-built and assembled solutions that are rapidly defined and fabricated. Edge AI and NPUs (neural processing units), along with open architectures such as RISC-V (and the RISC SW Ecosystem), will further broaden the horizons for semiconductor companies. How to be positioned for success? Again, it’s all about response flexibility. Sensing both strong and weak signals in the market and systematically building resilience into the company’s organizational practices will determine which companies emerge stronger from the challenges of the next five to ten years.
Transform Engineering Processes: Bridge Gaps Between Teams and Tools Effectively
Engineering organizations face challenges delivering complex products on time, within budget, and with high quality. Teams often work with different tools, creating data silos that slow the digital engineering process. These gaps lead to missed requirements, delays, and defects.
In this webinar, our Jama Software experts Preston Mitchell, Vice President of Solutions & Support; Mario Maldari, Director of Product & Solution Marketing; and Vincent Balgos, Director of Solutions & Consulting, discuss how Jama Connect®, and our Jama Connect Interchange™ add-on, address these challenges through key use cases.
What you’ll learn:
Traceable Agile: Integrate systems engineering and software teams using Jama Connect + Jira to drive quality and speed.
Scalable FMEA Process: Empower reliability and risk management teams with Jama Connect + Excel for efficient FMEA analysis.
Universal ReqIF Exchange: Seamlessly import, export, and round-trip ReqIF exchanges across requirements tools with Universal ReqIF, enabling teams to co-develop requirements with stakeholders and partners.
The video above is a preview of this webinar – Click HERE to watch it in its entirety!
VIDEO TRANSCRIPT
Preston Mitchell: We are here to talk about how to save precious engineering time, and each of us is going to cover a specific use case that we think will help your teams save a lot of time, utilizing both Jama Connect as well as Jama Connect Interchange. When you think about where most of the time is wasted in engineering teams, we typically find it’s something that visually looks like this: siloed teams and tools across the systems engineering V model. We really find that these things are the number one cause of negative product outcomes.
You know them, you’re probably intimately familiar with them. It’s a lack of identification of defects, missed requirements, or lack of coordination. A lot of manual steps to connect things, maybe requirements that live in one tool, and your system testing that lives in a different tool. And a lot of this can be highly manual, which is really a tough thing when you have to satisfy some of the industry regulations that a lot of our customers work with.
As we all know, late detection of issues leads to a huge cost to correct within a project. You can see in this bar graph the different phases of a typical product development, moving left to right. You start in requirements definition and design and move all the way to acceptance testing. Typically, most faults or problems are introduced very early, in the requirements definition and design phase, but the problem is they aren’t found until later in the project, like during integration or system testing. And if you get all the way to the acceptance testing level, you can see the exponential increase in the cost to fix these expensive errors. These are not Jama Connect’s numbers; they come from the International Council on Systems Engineering (INCOSE) and the National Institute of Standards and Technology (NIST). So the takeaway is this: the fewer errors we introduce early, and the sooner we identify the ones that slip through, the better off we’re going to be and the more engineering time we’re going to save.
How do we do this? Well, at Jama Software, we are the number one requirements management and Live Traceability™ product in the market. We bring a lot of resources and technology to bear to help you manage your product development, whether that’s complex or highly scaled products. We help you bring all the collaboration and reviews online. And, number one, we help you integrate the different states of the product across the many disparate tools that you might have in your engineering departments, which then allows you to measure and improve your traceability.
Mitchell: We work with a lot of the key industries that you see here at the bottom. In particular, Vincent, you work with medical devices, and I think the use case you’re going to cover is built off of that medical device industry. But really, a lot of the use cases we’re going to cover today are applicable to all of these industries.
We are the leader, and we’d like to be bold about it. We are number one according to G2 in terms of requirements management and traceability tools. So we encourage you to check out the different ratings and how we stack up against our competitors.
The ultimate goal we want to get you to is saving that time: moving from disparate, siloed teams and tools to an actual integrated system of Live Traceability. We have benchmark data from all of our cloud customers showing a correlation for customers that have a greater traceability score, meaning all the expected relationships have been built out. We find that they have 1.8x faster time to defect detection, nearly 2.5x lower test case failure rates, and typically 3.5x higher verification coverage. So it behooves you and your engineering teams to think about how you can integrate and save yourselves time, and that’s going to create a higher-quality product down the line.
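The notion of a traceability score — the share of expected relationships that actually exist — can be sketched as a simple calculation. This is an illustrative example only, not Jama Connect’s actual scoring formula; the item types, trace rules, and link data below are invented for demonstration.

```python
# Illustrative traceability coverage calculation (NOT Jama Connect's
# actual formula). We define which relationships are *expected*
# between item types, then score how many items satisfy those rules.

# Hypothetical trace rule: each requirement should link downstream
# to at least one test case.
expected_rules = {("requirement", "test_case")}

items = {
    "REQ-1": "requirement",
    "REQ-2": "requirement",
    "REQ-3": "requirement",
    "TC-1": "test_case",
    "TC-2": "test_case",
}

# Actual links present in the project (source -> target).
links = {("REQ-1", "TC-1"), ("REQ-2", "TC-2")}

def traceability_score(items, links, rules):
    """Fraction of source items that have at least one expected link."""
    covered = 0
    expected = 0
    for item_id, item_type in items.items():
        # Which target types does a rule expect for this item type?
        targets = {t for (s, t) in rules if s == item_type}
        if not targets:
            continue  # no expectations for this item type
        expected += 1
        if any(src == item_id and items.get(dst) in targets
               for (src, dst) in links):
            covered += 1
    return covered / expected if expected else 1.0

score = traceability_score(items, links, expected_rules)
print(f"Traceability score: {score:.0%}")  # REQ-3 has no test -> 67%
```

A real scoring model would weight multiple relationship types across the full V model, but the principle — coverage of expected links, not just link counts — is the same.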
I’d be curious to pause right here. We have a poll. I’d be interested in asking, if you take a step back and think about your R&D teams, all the different tools and teams that you have, what percentage would you say today in your organization is actually fully covered by Live Traceability? 100%, 50%, 0%? I’d be kind of interested in the scale on that. So we should see a poll pop up here, and I’ll give you a couple of seconds to answer that.
Now, we see some answers coming in. Thank you. Yeah, as to be expected, it’s not anywhere near 100%. Most of the companies that we work with are struggling with this, and so this is where we really want to help them out. And how do we do that? Well, our Jama Connect Interchange add-on to Jama Connect is a really powerful tool that we’re going to walk you through today, and it’s going to allow you to automate the connection between your data and process.
So we’re going to cover three use cases. I’m going to talk briefly first about Traceable Agile™, and this is how we integrate systems and software teams, using Jama Connect and a very popular tool that a lot of our software organizations use, which is Atlassian Jira. So we’ll talk about that Traceable Agile use case. Then Vincent is going to cover the Scalable FMEA Process, so how to utilize the power of the functions that are in Excel, and bringing those functions to bear inside of Jama Connect, so that you can do risk management and reliability management, but tied in with your requirements and testing. And then, finally, we’ll end on Mario covering Universal ReqIF Exchange, and this really enables you to co-develop with partners and suppliers across Jama Connect, but also maybe even different requirements management tools. So let’s dive in.
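Since ReqIF is an XML-based exchange format, a round-trip ultimately boils down to reading and writing structured XML. The sketch below parses a deliberately simplified, ReqIF-like document using Python’s standard library; real ReqIF files follow the full OMG schema (namespaces, SPEC-TYPES, typed attribute values), so the element names and sample requirements here are illustrative only.

```python
import xml.etree.ElementTree as ET

# A deliberately simplified, ReqIF-like document. Real ReqIF uses the
# OMG schema with namespaces and typed ATTRIBUTE-VALUE elements; this
# stripped-down structure exists only to show the round-trip idea.
sample = """\
<REQ-IF>
  <SPEC-OBJECTS>
    <SPEC-OBJECT IDENTIFIER="REQ-001">
      <VALUE>The system shall log all user actions.</VALUE>
    </SPEC-OBJECT>
    <SPEC-OBJECT IDENTIFIER="REQ-002">
      <VALUE>Logs shall be retained for 90 days.</VALUE>
    </SPEC-OBJECT>
  </SPEC-OBJECTS>
</REQ-IF>
"""

def read_requirements(xml_text):
    """Return {identifier: text} for each SPEC-OBJECT in the document."""
    root = ET.fromstring(xml_text)
    reqs = {}
    for obj in root.iter("SPEC-OBJECT"):
        ident = obj.get("IDENTIFIER")
        value = obj.findtext("VALUE", default="")
        reqs[ident] = value
    return reqs

reqs = read_requirements(sample)
print(reqs["REQ-001"])  # The system shall log all user actions.
```

Tools like Jama Connect handle the hard parts a sketch like this ignores — preserving identifiers, types, and links across repeated import/export cycles so partners on different tools stay in sync.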
Mitchell: So when you think about Traceable Agile, Agile software, it’s a methodology, as well as a philosophy. It’s been around software teams for a long time, and it works well. It’s been widely adopted, and widely successful. At the same time, a lot of complex products are not made up of solely software. They have to actually be integrated in with the hardware and perhaps other mechanical aspects of these products that you’re building. So there’s a balance, right? There’s a balance of being completely Agile, but also making sure that you follow some process.
And here’s where we find that Agile can sometimes break down. When we talk with software engineering leaders, they bring up these very common questions; it’s what keeps them up at night. How do I know which requirements have been missed? Am I actually covering everything? How do I know that I’m actually testing all of my requirements, and which of those have failed? The fourth bullet there: how do I identify rogue development? That is, how do I make sure my teams are not gold-plating the product, and that we’re actually meeting the stakeholder or user needs we’re trying to deliver? And then, finally, change. Change is a given in this fast-paced environment, so when changes are made in the software or in the hardware, how do I know what those impacts are across the system?
So the solution to this is Traceable Agile. It’s really no change to how your software teams may work today using Atlassian Jira. Really, what we are adding on is the ability to auto-detect gaps and measure and take action on those. And so I’m going to step into Jama Connect to give you a little bit of a demonstration here.
The Future of Requirements Management: Top 10 Trends to Watch in 2026
Requirements management keeps changing and evolving. With new technologies and project demands emerging every year, teams can’t rely on the same old playbook and expect great results. Instead, organizations are finding new ways to define project needs, work together, and use technology to their advantage. Adapting to these shifts isn’t optional; it’s a must for any business that wants to keep up and deliver real value.
Staying ahead of these changes is crucial for maintaining a competitive edge. This article will explore the ten most significant trends shaping the future of requirements management. From the integration of artificial intelligence to the growing importance of sustainability, we will provide actionable insights to help you prepare your team for the challenges and opportunities of 2026.
1. AI and Machine Learning Will Become Standard
Artificial intelligence (AI) and machine learning (ML) are moving from niche applications to core components of the requirements management toolkit. These technologies are revolutionizing how teams elicit, analyze, and validate requirements. AI-driven platforms can now automate the tedious work of sifting through customer feedback, technical documents, and interview transcripts to identify key needs and potential conflicts.
This automation frees up business analysts and product managers to concentrate on high-value strategic tasks. For instance, AI can generate initial drafts of user stories, acceptance criteria, and even test cases, significantly speeding up the development cycle and reducing the likelihood of human error. The result is a more efficient process that produces higher-quality, more consistent requirements.
2. Sustainability Goals Will Be Integrated into Requirements
Environmental, Social, and Governance (ESG) criteria have become a major focus for corporations worldwide. This shift is now directly impacting project development, as sustainability is no longer just a corporate goal but a tangible project requirement. Requirements management processes must now incorporate non-functional requirements that address a product’s environmental impact and ethical footprint.
This means teams will need to define and track metrics related to energy efficiency, material sourcing, accessibility, and data privacy. By embedding these ESG considerations directly into the project’s foundation, organizations can ensure that sustainability is a core design principle, not an afterthought.
3. Cloud-Native Platforms Will Dominate
The move toward remote and hybrid work models has accelerated the transition to cloud-based requirements management solutions. These platforms offer a single, centralized source of truth that is accessible to all stakeholders, regardless of their location. This real-time collaboration is essential for keeping distributed teams aligned and productive.
Cloud-native tools offer more than just accessibility; they provide the scalability needed to handle projects of any size and offer seamless integrations with a wide range of development and operations tools. This creates a connected digital ecosystem where information flows smoothly from initial idea to final deployment, enhancing transparency and overall project efficiency.
4. Security Will Shift Left
With the increasing frequency and sophistication of cyberattacks, security can no longer be addressed late in the development cycle. The practice of “shifting left” is becoming standard, meaning security considerations must be integrated into the requirements phase. A single vulnerability can compromise sensitive data, leading to severe financial and reputational damage.
Requirements management must now include the proactive definition of security protocols, data encryption standards, and strict access controls. Methodologies like threat modeling are becoming common practice during the initial project stages to identify and mitigate potential security risks before a single line of code is written.
5. Deeper Alignment with Agile and DevOps
The rapid iteration cycles of Agile and DevOps demand a fluid and responsive approach to requirements management. The era of the static, hundred-page requirements document is over. In its place is a dynamic, living backlog that evolves alongside the project. Achieving this requires deep, seamless integration between requirements management software and popular Agile platforms.
This tight alignment ensures that development work is always synchronized with the latest project requirements. It facilitates a continuous feedback loop, where learnings from sprints and testing can be used to refine the backlog instantly. This adaptive approach allows teams to respond quickly to changing market needs and deliver more valuable products.
6. Digital Twins Will Validate Requirements Virtually
Digital twin technology offers a groundbreaking way to test and validate requirements in a risk-free virtual environment. By creating a detailed digital replica of a product, system, or process, teams can simulate its behavior under countless scenarios. This allows stakeholders to see and interact with a virtual version of the final product long before physical production begins.
This is especially valuable for complex hardware, manufacturing, and infrastructure projects. Using a digital twin, teams can identify design flaws, optimize performance, and ensure that the documented requirements translate into the desired real-world outcome. This process minimizes costly late-stage changes and significantly improves product quality.
7. Collaboration Will Extend Across Business Networks
Projects today rarely happen in a silo. They involve a complex network of internal departments, external partners, suppliers, and customers. Effective collaboration across this entire ecosystem is critical for success. Enterprise communication platforms and business networks are becoming indispensable for sharing information and facilitating collective decision-making.
By integrating these collaborative tools directly into the requirements management workflow, organizations can create a transparent and inclusive environment. This ensures all stakeholders have an opportunity to provide input and that their feedback is captured, tracked, and addressed, reducing misunderstandings and preventing project delays.
8. User-Centric Design Will Be Essential
Ultimately, a project’s success is measured by how well it meets the needs of its end-users. This has led to a much stronger focus on user-centric design principles within requirements management. Techniques such as developing detailed user personas, mapping out customer journeys, and conducting usability testing are no longer optional extras; they are essential practices.
Adopting this user-first mindset ensures that every requirement is tied to a tangible user benefit. By building a deep understanding of the user experience, teams can prioritize features that deliver real value, resulting in products that are not only functional but also intuitive, engaging, and enjoyable to use.
9. Advanced Analytics Will Drive Decision-Making
Collecting project data is easy; turning it into actionable intelligence is the real challenge. Advanced analytics and business intelligence tools are empowering requirements managers to make smarter, data-driven decisions. These platforms can visualize complex data sets, identify emerging trends, and even predict potential project risks.
By analyzing both historical project data and real-time performance metrics, teams can gain a much clearer picture of project health. This allows them to proactively manage scope, optimize resource allocation, and improve the accuracy of future estimates, leading to more predictable and successful project outcomes.
10. Continuous Learning Will Be Non-Negotiable
The tools, technologies, and methodologies in requirements management are in a constant state of flux. To remain effective, practitioners must embrace a culture of continuous learning and professional development. This involves staying current with new software, mastering emerging best practices, and honing essential soft skills like facilitation and strategic communication.
Organizations that foster this culture by providing access to training, certifications, and other learning resources will empower their teams to navigate the evolving landscape successfully. A commitment to continuous improvement is the key to building a resilient and competitive organization.
The trends shaping requirements management point to a more collaborative, intelligent, and user-focused future. By embracing these changes, your organization can not only keep up but lead the way. Begin by assessing your current processes against these trends and identify the areas that offer the greatest potential for improvement. The future of your projects depends on it.
Note: This article was drafted with the aid of AI. Additional content, edits for accuracy, and industry expertise by Decoteau Wilkerson and Mario Maldari.
2026 Predictions for Aerospace & Defense: AI, Sustainability, and the Digital Transformation Frontier
As we approach 2026, the aerospace and defense (A&D) industry stands at the crossroads of innovation and transformation. With rising geopolitical tensions, increased defense spending, and technological advancements, the sector is navigating a complex landscape of opportunities and challenges.
From the integration of AI and digital twins to the push for sustainable aviation and the modernization of legacy systems, A&D organizations are embracing cutting-edge technologies to enhance efficiency, safety, and mission readiness. At the same time, they face critical hurdles, including supply chain disruptions, evolving regulatory frameworks, and the need to attract a future-ready workforce.
In this year’s predictions series, we’ve gathered insights from leading industry experts at Jama Software:
Together, they explore the trends and technologies shaping the future of aerospace and defense. From AI-driven design optimization and autonomous systems to the rise of sustainable aviation fuels and the challenges of digital engineering, this piece highlights the innovations and strategies that will define 2026 and beyond.
Please note: This blog features content from writers in the UK and the US. Spelling variations (e.g., ‘defense’ vs. ‘defence’) may appear due to regional differences.
Emerging Technologies
Q: What emerging technologies (e.g., digital twins, advanced materials, AI-driven design optimization, autonomous systems) do you believe will have the most transformative impact on the aerospace and defense industry in the next five years? How can organizations prepare to integrate these technologies effectively into existing programs?
Matt Macias: Dramatic product transformations are already underway, and we will see increased fielding of cyber-physical systems that combine software-based intelligence and features from the beginning, fully capitalizing on extensive use of sensors and electronic systems as well as the physical aspects of the system. I am very excited to see this next round of intelligent, cyber-physical systems in operation. Should processing capability and AI enable further breakthroughs in model performance, live or near-live digital twins of craft, used to monitor health or guide optimized operations and missions, are a tantalizing possibility with enormous potential to decrease costs and increase availability and mission success.
Karl Mulcahy: With increases in Defence spending occurring worldwide, I’m seeing a move towards Digital Transformation to help in all manners of A&D business. Whether this is for a larger Defence contractor or a new Space Innovator ‘Start Up,’ there’s much more of a focus on moving away from legacy methods and more towards adopting modern technology such as AI to help automate more in operations.
With larger organisations wanting to pivot to being more agile, competitive, and delivering innovation quicker, there’s more of a challenge to modernize legacy systems and to connect data sources, whereas I’m hearing that startups want to learn from time in industry to help define good processes now to aid scalability and drive efficiency.
The need to create digital twins to reduce risk, enable cheaper, continuous improvement, and help innovate faster is a big driver for the customers I’m working with. Also, strategically reusing items from previous projects for modernization programs, or even new variants and products, is a focus to help get to market faster and meet ever-changing market demands.
Cary Bryczek: One tangible example that nearly anyone who travels will benefit from is the modernization of the air traffic controller (ATC) to pilot communications system. Today, controllers unbelievably still use Very High Frequency (VHF) and Ultra High Frequency (UHF) radio technology developed in the 1940s to communicate with pilots. While new technology aids decision-making, human error remains a significant factor in ATC operations. Voice commands spoken at a rapid pace due to air traffic congestion, received by pilots who may not have English as their native language, over VHF/UHF signals that can be interfered with or stepped on, increase the number of mishaps in aircraft takeoffs and landings. Mishaps are on the rise. As of December 2025, there have been 1,097 aviation accidents or incidents in the United States in 2025, according to the National Transportation Safety Board – not including the most recent crash of the UPS cargo jet in Kentucky. Many point the finger at poor ATC technology, policies, and failure to act on the numerous alerts at this location over the past decade as significant contributing factors to the deadly collision of the Army Black Hawk helicopter with the Bombardier CRJ700 passenger airliner in Washington, DC.
My prediction is that AI-assisted technology will dramatically improve the safety in our airspace. Navigation signals will be intelligently generated by the AI based on data and presented to air traffic control operators to be sent as a text message directly to the pilot. Pilots receive it and can even have the navigation message tell the aircraft to change course.
Sustainability and Green Aviation
Q: As the industry pushes toward decarbonization, how do you see advancements in sustainable aviation fuels (SAF), electrified propulsion, and hydrogen-powered systems shaping the future of aerospace? What strategies will be key for scaling these solutions globally?
Macias: While we have not seen much focus on these technologies recently due to a series of financial headwinds, we are just waiting for the next breakthrough in affordable power density in batteries and alternative fuels. These alternatives could also become more practical as new craft emerge with more limited, focused missions that could benefit from them. In short, while this area may not be making the progress desired as of late, I am optimistic about surprises around the corner that might bring it back to the forefront.
Mulcahy: Despite challenges in this part of the industry, we’re starting to work more with companies retrofitting older aircraft with modern technology, e.g., SAF (Sustainable Aviation Fuels). While it is sustainable to reuse existing products and help make them greener, this is also arguably the fastest, lowest-risk route to immediate CO2 reductions, given compliance with regulations and the existing infrastructure around it.
Whilst we can all see innovation occurring within the eVTOL, UAV, and AAM markets, driven by market needs and the desire to develop new compelling product lines, I’m curious to see how regulations will continue to emerge in these fields alongside the new infrastructure being built, e.g., vertiports and charging bays.
But with more companies choosing not to develop everything in-house, there are emerging challenges of systems integration: ensuring that all parties are aligned and fit for purpose, that work aligns with higher-level requirements so risks are mitigated, and that, for example, range and weight calculations are verified correctly.
Bryczek: As much as I personally wish for technologies like hydrogen propulsion and battery propulsion to make our airspace cleaner, this is getting pushed farther out. Battery technology is not advancing rapidly enough to make this approach viable at a large scale. Many of the eVTOL startups have already changed their designs from pure electric to hybrid-electric aircraft. For the major manufacturers Airbus and Boeing, financial challenges are plaguing them in different ways. Boeing is still recovering from losses in sales and design/manufacturing problems with its jets and has less ability to focus on the necessary R&D for hydrogen propulsion. Airbus, too, has slowed its development in hydrogen, citing both infrastructure technology and regulatory difficulties. Interestingly, there have been press releases indicating Airbus shareholders are reaping sizable dividends, yet R&D budgets remain flat. Many in Europe argue that tax exemptions for delivery of aircraft using fossil fuels should be eliminated, which does sound like a healthy step in the right direction. So, my answer to this question is that the industry is going the route of evolution rather than innovation.
Digital Transformation
Q: How is digital engineering transforming design, verification, and lifecycle management in aerospace and defense? What are the biggest opportunities and challenges in achieving a fully integrated digital thread?
Macias: In product development transformation, we are now seeing the true impact of model-based product development fully realized, where all disciplines across the enterprise can benefit from their own dedicated models and, perhaps even more importantly, from the synergistic collaboration around holistic models that bring together all aspects of product, production, operation, and mission. This emerging success will be dramatically accelerated in the near future as Model-Based Systems Engineering (MBSE) and AI/ML concepts get more fully deployed, with special benefit coming from the democratization of these iterative and collaborative data/model constructs, helping everyone understand how their work fits into the whole and how they can optimize all aspects of the product.
Mulcahy: The need for a digital thread is emerging more than ever to ensure interconnectivity between systems, reduce siloed working, and provide an overall single source of truth. Whether companies are looking to deliver projects on time or reduce costs, there is a clear business case for establishing digital engineering practices. However, to get there, a large challenge companies face is embracing open technologies that can communicate with each other and allow data exchange. Furthermore, there’s a need to shift from a document-driven approach to model-based, data-centric workflows to connect teams and empower them with data to make better decisions.
Bryczek: The Department of War certainly is trying as hard as it can to get its workforce to change in step with newer digital engineering methods. It issued its new Digital Acquisition Strategy in November, which directly calls for leveraging digital engineering approaches and data over documents vs. traditional approaches. Requirements will be defined and validated in the context of a model and integrated with software and mechanical models. This vision is sound, but it is not happening across the board overnight. There are opportunities, but the biggest barrier remains the government personnel and their will to change the status quo and invest in the available technologies to make it happen.
We will continue to see increasing development converging around product families and feature-based development. Those who are smartly designing their products to follow Modular Open Systems Architecture (MOSA), which provides a higher degree of interoperability and vendor choice by the customer, will continue to have more success in the government market.
AI and Machine Learning
Q: What role will AI and machine learning play in enabling autonomous flight, predictive maintenance, and mission readiness? What impact will AI have on design and manufacturing processes? What challenges might arise in ensuring safety, reliability, and certification?
Macias: I would like to see AI applied in three areas: 1) easing, broadening, and accelerating multi-disciplinary optimization of the product development process; 2) assisting with and assuring the quality, comprehensiveness, and consistency of development team work, preventing surprises and moving engineering further and further up-front, opening up an order of magnitude more possibilities; 3) combined with digital twins, AI could greatly assist in ensuring that all operational products are safe, healthy, and operating effectively. All three of these effects would have a dramatic impact on safety, effectiveness, and cost/sustainability (not to be overlooked as a major driver of ecological concerns itself).
Bryczek: This question is endlessly broad, so I’d like to focus on the less glamorous segment of aircraft maintenance. I described already how there is a rise in air traffic control mishaps, some even leading to deaths. 2025 has been the most vivid year for aircraft accidents in my own personal memory. More aircraft are remaining in service, such as the aging MD-11 that crashed in Kentucky due to a maintenance problem, killing all aboard and many on the ground. Aging fleets are being sold from one airline to the next, often to younger international companies lacking the decades-old culture of safety that enables the processes and procedures for strict maintenance. As a result, we see aircraft slow to catch up to service bulletins and, in some cases, warning alerts being ignored, leading to crashes and mishaps. Machine learning will be able to use data to predict maintenance needs. It will analyze sensor data, as well as part requirements and testing data tracked even after part delivery, to predict part failures, preventing costly downtime and improving safety by alerting aircraft operators.
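At its simplest, the sensor-based prediction described here means flagging readings that drift outside a statistical band learned from historical data. The sketch below is a minimal illustration of that idea; real predictive-maintenance models are far more sophisticated, and the sensor values and thresholds here are invented for demonstration.

```python
from statistics import mean, stdev

# Hypothetical historical vibration readings (mm/s) for one component,
# assumed to represent normal operation. Real systems would train on
# large fleets of sensor data, not eight points.
history = [2.1, 2.0, 2.3, 1.9, 2.2, 2.1, 2.0, 2.2]

def maintenance_alert(history, reading, sigmas=3.0):
    """Flag a new reading that falls outside mean +/- sigmas * stdev."""
    mu = mean(history)
    sd = stdev(history)
    return abs(reading - mu) > sigmas * sd

print(maintenance_alert(history, 2.15))  # within normal band -> False
print(maintenance_alert(history, 3.8))   # drifted reading    -> True
```

The value of tying such alerts back into a requirements and test data platform is traceability: when a prediction fires, engineers can trace from the sensor reading to the part’s requirements, test history, and applicable service bulletins.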
Responsible AI Adoption
Q: As defense organizations expand their use of AI, how can they balance innovation with ethical and regulatory considerations? What frameworks should guide responsible AI adoption in mission-critical systems?
Mulcahy: There has to be a combination of human education/accountability and transparent governance, with security being a large part of this. With challenges like export control and data restrictions being a large consideration in defence projects, it's important to test AI's output and work before rolling it out on a wider scale.
It will be interesting to see if organizations like the DOD and NATO release any guidance and/or frameworks for responsible & secure AI use in projects and/or missions.
Bryczek: In my observation, the US Government has taken a more responsible posture to AI than the commercial world. The Department of Defense has already published its Responsible AI (RAI) Toolkit, which is both a practical and public resource providing guidance to align AI projects with best practices and ethical principles as well as concrete activities that need to be taken when implementing AI. One of the five principles that jumps out to me is the “Traceable Principle: AI capabilities should be developed with transparent, auditable methodologies and data sources so personnel understand the technology and its operational methods.”
Traceability is Jama Connect’s core competency spanning engineering disciplines, bringing together the collaboration of both traceable decision-making and data. I predict we will see more use of Jama Connect in AI projects.
Macias: Karl and Cary’s answers are excellent and capture this topic well.
Supply Chain Resilience
Q: How do you see aerospace and defense companies adapting to ongoing supply chain disruptions? What technologies or practices will strengthen resilience and reduce risk in global production networks?
Mulcahy: Having worked with both sides of the supply chain, from larger system integrators and consortia managing lots of parts and players to lower-tier suppliers changing their business model to diversify or enter new markets, it's clear how they want to adapt and streamline: by becoming digital.
By embracing technology to become more efficient, more collaborative, and more robust, companies are able to differentiate by identifying gaps earlier with connected datasets and making decisions to take action more quickly. With remote/international working still forming a large part of the Aerospace & Defence supply chain, it's important to utilize secure communication to ensure continuous alignment. Furthermore, we've seen supply chains strengthened by mutual transparency and predictability, leading to longer-term agreements and better forecasting for future projects.
Macias: We believe strongly that the Aerospace and Defense supply chain can greatly benefit from increased model and digital data-based collaboration and traceability. As this becomes more adopted, we should see opportunities arise for more resilience and also avoidance of surprises and other quality impacts. At Jama Software, we are working hard to enable this.
Cybersecurity and Data Protection
Q: As aircraft and defense systems become increasingly digital and connected, what are the top cybersecurity challenges facing the industry? How can organizations safeguard sensitive data and critical assets?
Bryczek: We will see continued security mandates for defense agencies, as well as for all contractors developing systems under contract, to be scrutinized heavily. Cybersecurity is no longer just an IT issue; it is a core element of national security. Threats have grown far beyond the days of simple malware and social engineering. Organizations will put more focus on software bill of materials (SBOM) programs, driven by Executive Order 14028. SBOMs provide full transparency into the software components used in defense systems, helping mitigate supply chain compromise, hidden dependencies, and embedded malware and backdoors. This is especially important for weapons systems, avionics, and mission-critical software.
For example, the U.S. Departments of Defense, Homeland Security, and Transportation have all launched cybersecurity initiatives affecting aviation. The Federal Aviation Administration mandated that airlines establish and maintain cybersecurity programs. The European Union Aviation Safety Agency developed a cybersecurity roadmap to address threats to the air traffic management system and operators. In addition, industry groups like the Aerospace Industries Association and National Business Aviation Association rank cybersecurity among the key issues facing the aerospace industry.
Workforce and Skills Transformation
Q: With new technologies reshaping engineering and manufacturing, what skills will be most in demand in the aerospace and defense workforce of the future? How can organizations attract and retain this talent?
Mulcahy: There's a growing need for skills around MBSE and Digital Engineering methods and, of course, knowledge about AI/ML, with more technology being developed and introduced into manufacturing today and, no doubt, in the near future. Further skills around cybersecurity and overall secure systems engineering are proving to be in demand. With more software now being embedded into products, both system safety and security are becoming more important to focus on, with companies looking to align with regulations such as DO-326.
Organisations can attract this talent by innovating quickly through the adoption of modern tools and workflows, but also by empowering employees to make decisions and get on with the task at hand. There are cultural and financial aspects too, which I'm sure are important, but I feel a big thing is to provide opportunities for continuous learning. This will prove important to employees as they work to understand new technologies and advance their skills, and it will, in turn, bring more benefits to their business as they apply their learning to continuously enhance workflows and inspire future generations.
Macias: I couldn’t agree more with Karl! The workforce of the future will need the ability to work both in their area of specialization as well as appreciate the total system’s effects, hence the rise in importance of systems/requirements engineering and optimization competencies.
Bryczek: Modern aerospace projects are massive in scale and complexity, involving interdisciplinary teams and subsystems. Systems engineering is the glue that holds everything together, ensuring that avionics, propulsion, structural components, and software work seamlessly. Proficiency in systems thinking, risk management, and integration processes used to be vital, but now the new systems engineer is an AI engineer. AI engineers blend systems engineering, software development, computer science, and user-focused design. This mix helps them build smart systems that can tackle specific tasks or achieve set goals. The typical skills of an AI engineer include building algorithms, model training, data preprocessing, and model deployment.
Q: How do you see evolving regulations and policies, including new cybersecurity frameworks, impacting innovation and program timelines? How can organizations stay ahead?
Macias: The industry is demanding agility and rapid innovation to react to new technologies and new mission needs. We see this coming from government defense organizations across the globe, where acquisition reforms and digital engineering strategies are coming to the forefront to acknowledge the need to accelerate product to market/field at cost and on schedule. We can expect this to dominate focus going forward, with all product development organizations needing to leave behind legacy tools and processes and move to highly agile, innovative digital model-based approaches to keep up.
Bryczek: There are many moving pieces to the evolving regulatory and policy landscape, which include everything from revamping and rebranding AS9100 to the IA9100 series quality standards, acquisition reform acts such as SPEED and FoRGED that are supposed to stimulate faster technology adoption, and significant cybersecurity rules for AI and Zero Trust, all driven by the FY2026 National Defense Authorization Act. These policy and regulatory changes will drive more open collaboration between government agencies to ensure that systems being built do not overlap and that systems are developed using interoperable technology. The FACE and MOSA standards will become more important than ever. Commercial organizations need to prepare for the new international quality requirements, embrace digital transformation (AI, cyber), and adapt to faster, more agile defense acquisition processes to remain compliant and competitive.
Long-Term Trends
Q: What trends or technologies will continue to shape aerospace and defense over the next decade? How can organizations ensure sustained innovation while managing cost, risk, and compliance?
Mulcahy: We've seen a big theme of reuse and sustainability in the industry recently: reusable satellites, rockets, and even technologies in use such as autophage engines. No doubt innovation will continue across the wider industry to help solve global challenges, aid defence efforts, and contribute to electronic warfare. I think AI will continue to be introduced into more areas of business and continue to aid moves towards Digital Engineering and overall efficiency. As research continues and more innovation comes from academia, for example, there may be closer links formed between industry, academia, and potentially even governments to co-invest and accelerate technology development.
Organisations should continue to invest in education on these new technologies to protect themselves, but also to introduce better workflows, attract new talent, and help deliver projects on time. An important factor will be using modern tools fit for today's project needs, tools that are open and that facilitate a digital engineering way of working.
Macias: Sustained/accelerated innovation with improved efficiency, quality, and compliance will be the goal over the next decade, and those who capitalize on current digital engineering practices will be best positioned to both capitalize on emerging AI/ML technologies and improvements in modeling/processing capabilities. The key to this will be the establishment of traceable, agile, model-based environments that bring everyone together in a common view of the total system, giving all the ability to contribute to the total success of the product, production, and mission. This can only be accomplished if organizations focus on democratization of the digital thread and common (MBSE & RM) models by avoiding deepening or perpetuating silos.
Bryczek: Long-term trends in the defense industry are driven by rising geopolitical tensions, increased defense spending—particularly in Europe—and rapid advances in emerging technologies. Global military expenditure continues to grow as nations respond to a worsening security environment and pursue modernization, with NATO members increasingly meeting higher spending targets. The industry is shifting toward autonomous and unmanned systems, including UAVs, USVs, and ground platforms, to reduce human risk, with swarm technology becoming a major focus. Investment is also accelerating in hypersonic missiles and directed-energy weapons to counter evolving threats. Additionally, space is emerging as a critical military domain, with growing emphasis on autonomous spacecraft, satellite-based surveillance and communications, and managing the risks of space militarization and debris.
2026 Predictions for Medical Device & Life Sciences: AI, Wearables, and Navigating Regulatory Change
With 2026 on the horizon, the medical device and life sciences industries are moving through a landscape defined by fast-paced innovation, changing regulations, and dynamic market shifts.
From the transformative potential of Artificial Intelligence (AI) in product development and diagnostics to the growing role of wearables and personalized medicine, the industry is embracing change while addressing critical challenges like cybersecurity, data privacy, and supply chain resilience.
In this year’s predictions series, we’ve gathered insights from leading experts across the field, including:
Tom Rish, Senior Business Development Manager, Medical Device & Life Sciences
Together, they explore the opportunities and hurdles that lie ahead, offering a glimpse into the future of medical devices and life sciences.
Join us as these experts share their perspectives on the technologies, strategies, and innovations that will define the next chapter of the industry. From AI’s growing influence to the challenges of regulatory harmonization and the rise of wearables and personalized medicine, this piece highlights the trends shaping 2026 and beyond.
Q: How do you see AI shaping the future of medical device design and manufacturing, diagnostics, and patient engagement in 2026 and beyond?
Richard Matt: I see AI organizing and mining information that predicts more effective use of medical devices. AI will be used in product development to predict more effective product design and in post-market assessments to confirm or refute assumptions about the treatment’s effectiveness.
Adam Smith: AI has become the connective layer across the device lifecycle, replacing manual research with automated analysis of predicates, guidances, standards, and historical evidence. This reduces ambiguity, improves consistency, and supports more adaptive systems that learn from real-world performance. It also drives more personalized, device-integrated insights, bringing engineering, clinical, and regulatory teams into tighter alignment.
Mike Celentano: AI is already shaping the MedTech development space and will continue to increase its influence in 2026 and beyond. For example, systems engineers I work with already use AI to summarize and affinitize voice of customer interview verbatims into stakeholder needs. Some are also using AI to help organize their requirement statements. Others are using various AI personas as independent reviewers of their deliverables. In 2026, these uses will become more common. But other AI uses will emerge including AI-based trade analysis based on MBSE models since there is now a strong textual component to SysML. AI will also emerge more in risk analysis and root cause analysis. In short, wherever AI can make developers more efficient and/or increase quality, it will emerge as such.
Dan Purvis: AI has amazing abilities when harnessed well. There are many places where an algorithm can do a much better job than a person. I think that you are going to see more therapies with an AI component that makes a suggestion that is then reviewed by a person.
Vincent Balgos: What we've seen in industry so far is continued strong interest in exploring how AI can contribute to developing safe and effective products, but with the limited ROI to date, industry seems to be taking a more methodical and deeper approach, discussing more of the how and why of AI. For example, there is an initiative to discuss data standardization of AI information following IEEE 2801, along with other best practices gleaned from Big Tech companies such as Microsoft, Amazon, and Google.
Carleda Wade: I’m seeing more customers looking to explore how they can incorporate AI into their development process. While many companies have yet to create full-blown policies on the use of AI at their organization, I can see this increasing in the coming years with the popularity of AI in everyday life. I think that people in our industry will be a bit conservative in their initial use of AI until FDA standards and guidelines are released. I could see it being very useful in processes like post-market surveillance.
Jakob Khazanovich: AI is becoming ubiquitous, but it will be a tool to work faster and smarter rather than a replacement for human engineers. In the future, initial draft requirements, test cases, or even entire trace matrices will be created by AI and then refined by engineers. Many companies will be slow to formally adopt the use of AI, but there is no question that engineers have a ChatGPT window open on the side to help them refine design artifacts quickly. In manufacturing generally, I could see AI being used to optimize part designs for strength, cost, and moldability.
Romer De Los Santos: AI has been growing fastest in imaging and genomic analysis for a while now. However, I've been seeing growing interest in using AI to accelerate the product development process by handling repetitive and tedious tasks. Jama Software is already moving towards automated test case generation, for example. I expect AI will help enable increased modularity of medical devices, manage complex product variants, and quickly identify and patch components with security issues.
Tom Rish: There is no escaping AI, and it is certainly poised to play a huge part in the evolution of the industry. I think most people thought it would revolutionize the products directly, and that will come with time. However, my takeaway from recent conferences is that companies are starting to take a more methodical approach to incorporating AI. After the initial surge in AI popularity, people are starting to realize how important it is to have a strong foundation of data. I believe companies will spend the immediate future organizing data and building good frameworks so that they can better incorporate AI into internal processes like product development and manufacturing.
Q: What ethical considerations should companies keep in mind as they integrate AI/ML into clinical decision-making and device functionality?
Matt: Companies need to rely on evidence of what AI can contribute and avoid rolling out product features based on speculation of what AI ‘should’ be able to accomplish.
Smith: I think companies need to be clear about how AI-driven decisions are made so clinicians can actually understand and trust what the system is doing. I also believe they need to watch for bias in the training data, because uneven performance across patient groups can create real clinical risk. And I think it’s important to stay accountable for how these models evolve over time, making sure updates are monitored so the systems remain safe and reliable in practice.
Celentano: ML in medical devices has been around for a couple of decades now. I worked on an ML fuzzy-logic blood glucose (bG) meter diagnostic algorithm in the early 2000s. Then, and now, human verification and validation is essential. Just like when we use ChatGPT for something, we always double-check the answer ourselves. Why? Because AI gets its knowledge from us, the internet, our databases, and our programming, and all of that is not perfect. The same applies to clinical decision-making. Healthcare providers must verify and validate the AI's conclusions themselves, and ultimately, humans must always take responsibility for the final answers.
Purvis: Keep a person “in the loop” as it allows for review, edit, and potential correction.
Balgos: Considering that the medical industry's ethos is to "do no harm," I was happy to hear talk about using standards such as ISO 42001 to ensure the responsible and ethical use of AI, including addressing known biases in medical decision-making in clinical settings.
Wade: They should think about the inherited bias of the AI tool that they use, since it could unfairly classify data about certain demographics.
Khazanovich: Intellectual property concerns will need to be addressed to ensure AI-suggested content is not putting companies in any sticky situations.
De Los Santos: Companies need to have clear rules and controls around when and how to use AI when dealing with private health information.
Rish: It is hard to put anything other than data privacy at the top of this list. Whether it is patient data, information about clients, or proprietary product details, companies need to train their employees to use AI responsibly. It is so easy to copy/paste information into AI tools in the name of efficiency, but people need to think twice about what they are sharing.
Q: What emerging technologies do you believe will have the biggest impact on life sciences innovation in the next 12–18 months?
Matt: AI is the hands-down favorite.
Celentano: AI is one for sure. mRNA is also going to be huge in life sciences, since it makes vaccines fast to develop, and mRNA vaccines appear to have next-level cancer-fighting benefits when combined with immunotherapy. One negative impact that will be felt over the next one to ten years is the 2025 US budget cuts to NIH, CDC, and other long-term research activities.
Purvis: There are big things happening in wearables: the purchase of Nalu, the Medicare reimbursement for Cala. The market is beginning to realize that wearable neurotech has a lot of growth potential to benefit patients' lives in a less invasive way.
Balgos: AI is the hottest tech right now to make the biggest impact, but the bigger impact is when these AI-enabled devices start talking to each other, with the common goal of supporting the patient and medical professionals. The Model Context Protocol (MCP) will be a key part of that impact.
De Los Santos: I expect that AI will be applied to product development processes to reduce bottlenecks.
Rish: Wearable devices have already had a profound impact on the industry, and I think their influence will only continue to grow. Companies are pushing the limits when it comes to providing excellent data, all from rather simple devices like rings, watches, etc. Details still need to be figured out on the regulatory side when it comes to indications, but patients want to know more about their health. My hope is that the trend of people taking a more proactive approach with their health continues with the continued rise of wearables.
Q: What regulatory shifts (e.g., EU MDR/IVDR enforcement, FDA changes, global harmonization) do you anticipate will most affect medical device and life sciences companies in 2026?
Matt: ISO 13485 brings with it a tremendous amount of explicit detail that was only present in regulations by ‘reading between the lines’. This increased detail about the behavior expected for compliance will affect medical device companies both broadly and deeply.
Smith: I think we’re about to see a wave of impact from AI systems that are purpose-built for regulated work, especially tools that can interpret standards, guidances, historical submissions, and clinical evidence in a structured way. I also believe digital twins and simulation platforms will start to play a bigger role in both device design and verification as companies look for faster ways to generate defensible evidence.
Celentano: There has been more regulatory focus on Interoperability and Cybersecurity lately. This will continue to intensify in terms of enforcement in 2026. More AI guidelines and perhaps regulations will also emerge.
Purvis: All agencies are continuing to focus on cybersecurity. Companies should make sure that they have a product cybersecurity (as opposed to general business/IT cyber) strategy right alongside development and manufacturing strategy.
Wade: The FDA’s harmonization of 21 CFR 820 with ISO 13485, which is slated to be effective in February 2026, will have a large impact on US-based companies. Many have known about this upcoming change for years, but will need to be fully compliant very soon.
De Los Santos: Of course, the FDA's harmonization effort will have a large impact on the development of US medical devices. Meanwhile, in the EU, I expect that bottlenecks around full compliance with MDR for legacy medical devices will continue as manufacturers struggle, not only with making legacy development documentation compliant with the MDR, but also with getting it reviewed in a timely manner due to the limited capacity of notified bodies.
Rish: Without a doubt, QMSR is the thing I hear the most about. For those of us who have been in the industry for a while, we have seen a lot of changes (ISO 13485 in 2016, ISO 14971 in 2019, EU MDR, and more). This is one change that feels like it is actually helping us out, as the FDA is harmonizing with ISO 13485. It seems like this will help the industry become a little more streamlined, which hopefully leads to more, and safer, products being launched.
Balgos: The QMSR transition will cause some immediate local impact on medical companies, especially those that are not compliant with ISO 13485. Even for those that are compliant, a revisit of their quality procedures will be needed. On a broader, global scale, the continual changes in general strategy and the reduction in force of experienced personnel at medical-related US federal agencies (FDA, NIH, CDC, etc.) will have longer-term impacts on the way industry and academia pursue new medical innovation, the path to bring products to market, and the overall medical welfare of the general population.
Q: How are companies adapting their software and systems to meet evolving cybersecurity and data privacy requirements across global markets?
Matt: Cybersecurity is greatly under-considered in medical device design, resulting in extensive and growing opportunities for medical cyberattacks.
Celentano: Well, most MedTech companies are finally getting serious about Cybersecurity and privacy as well as data integrity, now that regulators are enforcing the regulations and standards more. Years ago, MedTech companies used to hire one person to be responsible for Cybersecurity. Now most companies have cyber teams, privacy teams, and data integrity teams, all with standard operating procedures, which makes each employee responsible for compliance.
Purvis: The best way to answer this is “systemically.” Companies are setting a comprehensive product cybersecurity strategy that bakes cybersecurity into every aspect of the pre-market cycle. Also, companies are realizing that post-market cybersecurity (ongoing surveillance) must be budgeted and planned for.
De Los Santos: Companies are purchasing or repurposing tools to help them generate new cybersecurity deliverables and update their customer notification systems to be in compliance with the final guidance on Cybersecurity in Medical Devices released just this year.
Rish: I believe the best companies will take a step back and rethink their approach to risk management. A lot of organizations complete risk activities in separate buckets. Things like cybersecurity, human factors, process risk, and more are all done at separate times and then merged into a disjointed system. Since technology is rapidly evolving, I think people need to take a more holistic view of risk. Put the patient or end user first by thinking about everything that can go wrong and how you can mitigate those risks at a systemic level.
Balgos: Due to the FDA's final guidance on cybersecurity in mid-2025, organizations are taking a more proactive approach to cybersecurity, since it is now a required deliverable for device submissions. In addition, medical companies are seeking an integrated approach to both security and safety risk management in their processes and tools, since each can impact the other's associated risk level, especially in this early era of AI.
Market Forces & Strategy
Q: What macro trends (e.g., supply chain resilience, sustainability, workforce shifts) do you think will influence strategic decisions in the industry next year?
Matt: The macro trend to bring employees who worked remotely back to the office after the rapid and uncontrolled increase in remote workers during the COVID pandemic.
Celentano: 2025 tariff wars will still have residual supply chain impacts in 2026 for MedTech. Reduced funding for research and other economic factors will make MedTech jobs more precious and harder to get. Reduced emphasis on sustainability will continue to flood the employment market with those specialists who now need to become more multi-disciplined. Software-related MedTech jobs will likely grow in comparison to electrical and mechanical job opportunities. Systems Engineering and Program Management jobs will likely increase next year due to the need for more integration of existing technologies and less investment in new technologies.
Purvis: The industry is seeing some positive changes in reimbursement. Several firms are seeing their strategic plan around study data pay off with reimbursement.
Wade: A lot of companies are very conservative with their make or buy decisions due to current tariffs, which will impact how they design their products.
Rish: It seems like the economy has been the main question mark ever since 2020. There have been some major highs and major lows. While private investment seems to be down, there is no denying that large companies are making news lately with some big mergers and acquisitions. I believe the larger players will continue to identify promising technology and take steps to acquire or partner with the organizations developing that technology.
Balgos: With lessons learned from the COVID era and the current dynamics with the US federal government, companies are focused on strengthening their supply chains to prevent or lessen the impact of global market and trade changes. Whether sourcing more locally, identifying equivalent substitutes, or even manufacturing their own materials, flexibility will be key to mitigating any turbulence in the supply chain.
Q: What differentiates companies that are thriving in this rapidly evolving landscape from those that are struggling to keep up?
Matt: A laser focus on the patient. This drives everything in medical devices, but many companies get distracted by technology, profit margins, or timelines. A laser focus on the patient cures all of these ills, but many companies don’t see the connection.
Smith: I think the companies that are thriving are the ones treating regulatory and quality work as a strategic asset, not a bottleneck, and adopting tools that give them clearer evidence and faster decision cycles. I also believe they’re the ones breaking down silos between engineering, clinical, and regulatory teams, so requirements, risks, and documentation stay aligned from the start. And I think the organizations that struggle are usually the ones holding onto legacy systems and manual processes, which makes it much harder to keep pace with shifting standards, rising submission volume, and growing complexity.
Celentano: Adapting to the sometimes surprising demands of the public and governments. Being nimble enough to move resources toward new cash cows. For example, marketing tirzepatide and other GLP-1 and GIP drugs more for weight loss than for diabetes.
Purvis: There are four key stakeholders in every MedTech business: patients, caregivers, corporate (hospital, surgery center, payers), and investors (which includes employees, management, and financial backers). The thriving companies have found a way to satisfy all of them well.
De Los Santos: Companies that are slow to use AI/ML may start to feel like their competition is speeding ahead of them.
Rish: From my experience in the industry, the companies that thrive fully reject the idea that regulations slow you down. Instead, they use regulations to build business practices that create efficiency and excellence. Those that set up smart business processes as part of a QMS significantly increase their chance of hitting product deadlines. They get products to the market faster and are also typically producing much safer products. They increase their revenue and reduce their audit findings.
Balgos: With the constant dynamics in the regulatory landscape, having a solid regulatory strategy that includes sub-topics like cybersecurity, quality compliance, and an actual commercialization plan will help keep companies nimble in the face of change.
Q: What’s the most innovative thing you’ve seen in the industry this year that you believe others will adopt in 2026?
Matt: A novel method to assess whether the benefits of a treatment exceed its risks. This has the ability to both bring new products to market more quickly and relaunch existing products into new patient populations and indications for use.
Smith: I think the most innovative shift I have seen this year is the way AI is beginning to shape entire medical device roadmaps rather than just isolated tasks. The work we are doing with the University of California is a good example, where Agent Astro is being used from the earliest concept conversations all the way through regulatory planning, predicate selection, testing expectations, and submission strategy. I believe this end-to-end use of AI will accelerate a broader shift in the industry, where regulatory affairs is no longer treated as a process-driven function at the end of development, but as a strategic driver that informs design choices, materials decisions, and overall product direction. I think this approach will spread quickly in 2026 because it brings consistency, reduces rework, and gives teams a much clearer path from idea to approval.
Celentano: Weight loss drugs will continue to make record profits. mRNA treatments will emerge to fight cancers. The most innovative products next year will solve medical problems for all patients and doctors, perhaps related to common pain points like healthcare access, healthcare insurance, or prescription drug costs.
Purvis: Bioelectric therapies that directly target the patient’s condition. More firms are realizing that a device play is valuable (in addition to pharmaceutical-based solutions).
Rish: I probably can’t claim it is the most innovative thing I’ve seen, but one of the most surprising innovative ideas is the FDA committing to using AI in their review process. It is great to see that the FDA is willing to modernize a bit, and I hope that leads to more streamlined and effective reviews for all parties. The goal shouldn’t be to just catch random things, but to focus on important topics so that safer products will be launched. I know companies are starting to use AI to prep for things like submissions and audits, and I think that will ultimately help them launch better products and reduce audit findings.
Balgos: The extraordinary rise of continuous glucose monitoring (CGM) devices and at-home testing kits (à la COVID) in the market demonstrates that device manufacturers can effectively market directly to consumers. This may open the door to a wider range of wearables, at-home kits, and DIY applications that broaden adoption of the FDA’s initial “Healthcare at Home” initiative.
Q: What’s one mistake or blind spot you see companies making that could hinder their success in the coming years?
Matt: Focusing on compliance instead of the patient.
Celentano: Many MedTech companies do a terrible job of eliciting and analyzing their stakeholder needs. They often build what they think their stakeholders want instead of providing them solutions they actually need.
Purvis: For startups: stick with what you are uniquely gifted to do and outsource everything else to quality partners. Your IP, your clinical, and your science should stay with you – all other aspects can be handled more cheaply and effectively by others.
De Los Santos: One of the biggest mistakes I see is companies creating huge, complex product development and risk management processes in response to regulatory changes. Congress has directed the FDA to take the least burdensome approach to the evaluation of premarket medical devices. The amount of documentation and evidence should be commensurate with the security and safety risk of the device.
Rish: As discussed previously, I think rushing into AI without a foundation greatly increases the risk of a company falling far behind the competition. I highly recommend focusing on organizing data, building processes around usage, and training employees on how to use it. The longer you wait to do that, the deeper the hole gets before you can use AI effectively.
Balgos: Believing that technical prowess alone is enough for a successful device submission and market penetration. I like the colloquial phrase “it takes a village to raise a child,” adapted here to say it takes a “system of systems” approach to develop a safe, effective, and successful medical product.
Q: Are there any major disruptors on the horizon that you believe could reshape the industry in 2026?
Matt: I don’t believe any disruptors are on the horizon that are so powerful they could reshape the industry in just one year. AI will be the disruptor that will reshape the industry over the next decade.
Smith: I think one of the biggest disruptors will be the shift in how companies access regulatory expertise. For years, firms have charged tens or hundreds of thousands of dollars to help MedTech companies navigate predicates, draft documentation, and map out submission strategy, and there is still real value in working with consultants who bring human judgment and trusted relationships. But I believe the nature of that work is changing because AI is turning regulatory affairs into a strategic driver instead of a downstream, process-heavy function, and for only a few hundred dollars, any company can now access the equivalent of a team of regulatory veterans. I think this will make advanced regulatory support accessible to far more innovators than ever before and will reshape how new devices reach the market in 2026.
Celentano: The confluence of AI with other multipliers will be a dominating success factor in 2026. For instance, MBSE combined with AI will enable nearly automatic generation of system architecture options from requirements (or vice versa), saving enormous engineering effort and reducing time to market.
Purvis: Brain-computer interfaces (BCI) are hot, and lots of investment has been thrown at them. I think that “data from the brain” is going to start opening more and more MedTech opportunities in the years ahead. Also, personalized medicine with devices tailored to individual anatomy will continue to grow (think Invisalign for many more conditions).
Wade: The recent government shutdown caused a huge backlog at the FDA for submissions, which will inevitably take a while to sort out.
De Los Santos: The possibility of more federal layoffs or cuts in funding to the sciences will cause uncertainty and may stall development. Innovation often requires significant public investment for technology to develop.
Rish: It is hard to think of anything that can match the potential AI holds when it comes to reshaping the industry. Those that use it wisely and effectively will equip their employees to do amazing things. I truly believe it will help the best minds in the industry spend more time on innovation, which will ultimately improve the quality of life of people all throughout the world!
Balgos: The continued dynamics of the US Federal Government and its impact on global businesses/trade, regulatory, international affairs, and the scientific and medical community.
Jama Software Announces Jama Connect Solution for Semiconductors for Developing Complex Products and Systems Faster without Compromising Quality
Streamline and Accelerate Semiconductor Product Development with Jama Connect
Jama Software, the industry-leading requirements management and traceability solution provider, has released a semiconductor solution for fabless design companies, IDMs, and companies in other semiconductor industry sectors. Amid increasing product complexity and a rapidly changing industry landscape, semiconductor companies face competitive pressures on growth and profitability that demand both development speed and product quality.
Jama Connect for Semiconductors is a custom-built solution pre-configured for common use cases for rapid adoption, accompanied by a Procedure Guide that provides simple process descriptions from the initial stakeholder MRD and system-level PRDs through verification and validation. This framework enables semiconductor companies to create scalable, consistent, and repeatable processes for bringing innovative, high-quality products to market more quickly, navigating product variations, and better serving their customers.
“For semiconductor companies facing ever increasing complexity of silicon products plus the need to align software deliverables that must be available at launch, the traditional hardware-centric approach of product definition and development is no longer viable,” stated Neil Stroud, GM, Semiconductors, at Jama Software. “Without Live Traceability™ across tools and engineering disciplines and the controlled coordination it establishes, semiconductor companies will continue to experience significant rework and respins, quality impacts, increased costs, and product delays.”
With the effective requirements management and Live Traceability™ capabilities of Jama Connect, semiconductor companies can easily manage new product requirements from ideation through to implementation, enhancement, and revision — enabling them to maximize development efficiency, accelerate speed to market, and meet regulatory and audit requirements.
To learn more about how Jama Connect for Semiconductors can help accelerate product development throughout your ecosystem, download the datasheet, or click here to speak with one of our experts and book a free trial.
Media Contact:
Mario Maldari
Director, Product and Solution Marketing, Jama Software
Jama Software is focused on maximizing innovation success in multidisciplinary engineering organizations. Numerous firsts for humanity in fields such as fuel cells, electrification, space, software-defined vehicles, surgical robotics, and more all rely on Jama Connect requirements management software to minimize the risk of defects, rework, cost overruns, and recalls. Using Jama Connect, engineering organizations can now intelligently manage the development process by leveraging Live Traceability™ across best-of-breed tools to measurably improve outcomes. Our rapidly growing customer base spans the automotive, medical device, life sciences, semiconductor, aerospace & defense, industrial manufacturing, consumer electronics, financial services, and insurance industries. To learn more, visit us at jamasoftware.com.