Product Innovation Trends

PI Congress Recap: Time to Disrupt the PLM Software Monoliths

Speakers at innovation conference call for an ecosystem of solution-based PLM tools and the dismantling of inflexible, monolithic solutions

Photo of a "fear" sign that appeared in a GE presentation.

I attended this year’s Product Innovation Congress in Boston with a Jama team, and many of the conversations and presentations we saw there are still with me. One of the most prominent themes among speakers and delegates was that the sheer volume of data, combined with the complexity of the systems being designed and built today, requires new ways of working in order to remain innovative and ahead of the game in the marketplace.

One speaker cited this article in the Harvard Business Review, which says, “The changing nature of products is disrupting value chains, forcing companies to rethink and re-tool nearly everything they do internally.” Indeed, Big Data and the Internet of Things are areas companies are trying to leverage to gain a competitive edge. Yet with the current state of PLM software, this is difficult. Massive amounts of data tend to be spread across multiple tools and systems, which makes it very difficult to make decisions, achieve higher velocity in product innovation, or even stay ahead of the quality curve. Traditional PLM software strategies can no longer deliver on the promise of getting to market faster. Moreover, companies now find they need more customer-centric views of data and analytics so they can engage a broader range of stakeholders, many of whom are not engineers.

The more we talked with our fellow attendees, the more we heard about what’s wrong with current PLM solutions. People want PLM software to be about the ecosystem, tools and people, and not just the software. They want things to be simpler and unified, but not inflexible. Mark Halbish at TI Automotive said, “PLM fails at surfacing business insights, and it’s not viewed as a valid reference point to enable decision making for managers and executives.”

Companies need better PLM software and data strategies. There are many tool types related to product development (PLM, PPM, ALM, ERP), and the data in them sits in silos. There is a broad range of users with varying business drivers, technical needs, and abilities, who need access to all of the data but need to consume and visualize it in different ways. One wonders if it is a pipe dream for a company to choose a single PLM software vendor and completely immerse its culture and teams in that tool’s entire ecosphere. From what we heard from people at this conference, that approach causes a lot of chaos, takes too long, and costs too much.

Tim Gieske, from General Electric, said his organization wants toolsets and a data infrastructure where more users can casually use applications to get the information they need without being power users. Today, “pillar applications” such as PLM, Requirements Management, ERP, Manufacturing Execution System, and Manufacturing Quality System tools keep their data in structured repositories of their own. Sixty percent of users are forced to work in these pillar apps, which results in:

  • Poor user experience that slows down the work
  • High cost to scale and gain organizational value
  • License models that don’t match actual usage
  • Need to replicate security models within each application

Instead, GE is treating PLM software as an ecosystem and reinventing its data and user strategies to increase the consumption of data. For casual users, authoring is optimized for speed and simplicity, and they will have access to all types of data across all types of platforms. For internal consumers, GE is optimizing its data repositories for search and data discovery.
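
As a toy illustration of what optimizing a repository for search and discovery (rather than for authoring) can mean, here is a minimal inverted-index sketch in Python. The documents and tokenization are invented for the example and imply nothing about GE’s actual implementation.

    # Toy inverted index: make engineering records findable by keyword,
    # regardless of which tool authored them. Documents are invented.
    from collections import defaultdict

    docs = {
        "REQ-12": "pump housing shall withstand 250 psi",
        "ECO-7":  "update pump seal material to viton",
        "MES-3":  "pump line 4 first pass yield report",
    }

    index = defaultdict(set)
    for doc_id, text in docs.items():
        for token in text.lower().split():
            index[token].add(doc_id)

    # A casual user searches across record types with one query:
    print(sorted(index["pump"]))  # -> ['ECO-7', 'MES-3', 'REQ-12']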

Western Digital’s Dave Davison said his company recognizes that there is a road to PLM maturity, and that to realize value they aim to evolve their PLM capabilities. They feel the most critical elements PLM should provide in order for them to remain innovative as the PC market changes are:

  • Accessibility to product information
  • Traceability
  • Product information interlock
  • Product information analytics
  • End-to-end process cycle time and first pass yield
Graphic appeared in Western Digital presentation.

Gahl Berkooz of Ford pointed out that paradigm shifts in consumer behavior, product value propositions, product features, and product ecosystems are driving the greatest transformation in the PLM market. Because companies now treat data as a corporate asset and products are no longer islands, investment in Big Data and analytics is an order of magnitude greater than investment in PLM. Big Data will change how product development is done at its core, and how the product lifecycle is managed.

Take requirements and architectures, for instance. Traditionally these are driven by subject matter experts: requirements documents and designs specify a product’s operating conditions and architecture. In a Big Data PLM world, actual operating conditions, drawn from the time history of sensors embedded in products, drive the requirements, and data and analytics drive the exploration and optimization of architecture alternatives based on actual product performance.
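
To make that shift concrete, here is a minimal sketch in Python of a data-driven requirement: deriving an operating-temperature limit from fielded sensor telemetry rather than from an expert’s estimate. The values, margin, and function name are assumptions for illustration, not any vendor’s method.

    # Sketch: derive an operating-condition requirement from fielded sensor
    # data instead of an expert's estimate. All values below are invented.

    # Hypothetical time history of case temperatures (deg C) reported by
    # sensors embedded in shipped units.
    telemetry = [41.2, 44.8, 39.5, 52.1, 47.3, 55.6, 43.9, 50.2, 48.7, 53.4]

    def derive_operating_requirement(samples, margin=5.0):
        """Propose a maximum operating temperature from observed conditions.

        Uses the worst observed condition plus an engineering margin; a real
        analysis would use percentiles over millions of fielded samples.
        """
        return max(samples) + margin

    requirement = derive_operating_requirement(telemetry)
    print(f"Proposed requirement: sustain operation at {requirement:.1f} C")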

Graphic appeared in Ford presentation.

Customers need a PLM ecosystem that lets teams work in the tools that work for them, with an underlying data strategy that crosses the silos so they can leverage Big Data and IoT. That data strategy needs to work in harmony with the tools in order to streamline engineering data and provide actionable analytics from Big Data that both engineering and business leaders can use. And a collaborative decision-making framework needs to tie organizations together to increase efficiency and drive innovation.
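
As a thumbnail of what “crossing the silos” looks like in practice, the sketch below joins records from two hypothetical repositories, a requirements tool and an ERP system, on a shared part number, so a non-engineer can see requirement status next to unit cost in one view. The data shapes and field names are assumptions for illustration only.

    # Sketch: join data from two siloed tools on a shared key so business
    # and engineering stakeholders see one view. Shapes/fields are assumed.

    requirements = [  # e.g., exported from a requirements management tool
        {"part": "P-100", "requirement": "Operate at 60 C", "status": "Verified"},
        {"part": "P-200", "requirement": "IP67 sealing", "status": "Open"},
    ]
    erp_costs = [  # e.g., exported from an ERP system
        {"part": "P-100", "unit_cost": 12.40},
        {"part": "P-200", "unit_cost": 3.75},
    ]

    cost_by_part = {row["part"]: row["unit_cost"] for row in erp_costs}
    for req in requirements:
        cost = cost_by_part.get(req["part"], float("nan"))
        print(f'{req["part"]}: {req["status"]:8} ${cost:6.2f}  {req["requirement"]}')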