
Wednesday, January 27, 2016

IIoT, Small Data and PLM

Big Data and the Internet of Things are big (sorry about the pun). There is a lot of promise of golden opportunities and plenty of discussion on how to build smart connected products and open up new business opportunities. There is also a lot of anxiety among smaller companies wondering how to approach this. A pragmatic approach and an easier start can be small data within the Industrial Internet of Things (IIoT).

Big Data and full-blown IoT can be intimidating and too big a step for smaller companies. Yet it is possible for them to enter the race without a large investment.

I am assuming that a PLM backbone with all product data is in place. PLM in the context of IoT has been discussed by, for instance, Beyond PLM and the Virtual Dutchman. There is also an excellent paper from Harvard Business School.

The companies with the easiest path here are business-to-business companies that deliver physical products used by another company in a larger system of products, e.g. a conveyor belt that goes into a larger material transportation system. You have control of the conveyor belt while your customer manages the whole transportation system. If your conveyor belt can become a smart connected product, we have IIoT.

A large portion of Big Data is unstructured information whose use is not clearly known when you design your product. Small data is instead a logical additional data set that you define up front: you know how it relates to other information and you know what its purpose is.
“Small data connects people with timely, meaningful insights (derived from big data and/or “local” sources), organized and packaged – often visually – to be accessible, understandable, and actionable for everyday tasks”
Making a “dumb” product into a slightly smart connected product can give you very powerful information. Define up front what (small) data you want and how to manage it. You can limit the IT infrastructure investment since the amount of data is limited, and as a manufacturing company you probably already have an IT infrastructure that can handle the additional data. It is also quite easy to know how to relate the information to other information sources. Because the small data is well defined and structured, it can easily be combined with your existing well-defined and structured data. The typical information sources will be CRM, ERP and PLM. Combining your new (small) data with your existing product data sources can give you new insights.
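
To make this concrete, here is a minimal Python sketch of what combining small data with existing product master data could look like. The part numbers, fields and the "overtemperature events per revision" insight are made up for illustration; the point is simply that a small, predefined data set joins cleanly with data you already control in PLM/ERP.

from collections import defaultdict

# Product master data as it might be exported from PLM/ERP (hypothetical items)
plm_items = {
    "CB-1001": {"revision": "B", "customer": "ACME Logistics", "motor": "M-200"},
    "CB-1002": {"revision": "C", "customer": "Nordic Freight", "motor": "M-300"},
}

# Small data: a handful of predefined measurements reported per part number
telemetry = [
    {"part_no": "CB-1001", "running_hours": 4200, "overtemp_events": 7},
    {"part_no": "CB-1002", "running_hours": 3900, "overtemp_events": 1},
]

# Join the two sources on part number and aggregate per revision:
# the kind of simple insight (e.g. "revision B runs hotter") that needs no big-data stack.
events_per_revision = defaultdict(int)
for reading in telemetry:
    item = plm_items.get(reading["part_no"])
    if item:
        events_per_revision[item["revision"]] += reading["overtemp_events"]

print(dict(events_per_revision))  # e.g. {'B': 7, 'C': 1}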

Aspects that make IIoT and Small Data attractive
  • Sensors are smaller, cheaper and more flexible than just a few years back
  • As the data amount is limited it is manageable
  • You have a strong product information backbone in place in PLM and/or ERP
  • The small data and how it must be related to other enterprise data to give meaning is defined up front. You don’t need advanced analytics tools or experts.
Take the conveyor belt as an example. Look at your existing product and how it is delivered, operated and serviced. Think about what business benefit you would want if you could get any data from your product in use at your customer. Is it possible to get that data somehow? Then decide what you want to achieve, what data you need and how to get it.
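
Purely as an illustration (the fields and values are hypothetical), the "define it up front" step for the conveyor belt could be as simple as a fixed record per unit, where every field exists because it supports a concrete business goal like the ones in the next section.

from dataclasses import dataclass
from datetime import datetime

@dataclass
class ConveyorReading:
    part_no: str           # ties the reading to the item already managed in PLM/ERP
    serial_no: str         # the physical unit installed at the customer
    timestamp: datetime
    running_hours: float   # supports service planning and spare-part sales
    belt_speed_mps: float  # supports optimizing operation for the customer
    motor_temp_c: float    # supports predictive maintenance

reading = ConveyorReading("CB-1001", "SN-0042", datetime(2016, 1, 27, 12, 0), 4200.5, 1.8, 71.3)
print(reading)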

What do you want to achieve?
  • Improved service margin – you can do better and more accurate service since you have better insights. Perhaps selling more spare parts.
  • You can provide improved operational efficiency for the customer as you can optimize the usage based on the data feedback.
  • You can sell a service instead of a product. E.g. you promise a certain output in a certain time period. And you make it happen.
  • You can improve the customer experience with new automated functions, remote control or better interaction with other equipment.
Decide up front, design your product accordingly and extend your IT infrastructure to support it. It still does not happen by itself, but taking small steps instead of diving into Big Data at once is perhaps more appealing.

Engineering.com has an interesting article about using PLM and IoT in an old industry to create new business opportunities.

Summary

The key is understanding your product, how it is used by your customer and what additional data could give you an advantage. You have a lot of valuable data already, and it can become much more valuable if you add some smart small data on top of it. Use the data and the IT infrastructure you have, add some sensors and connectivity, and get started.

Tore Brathaug
www.infuseit.com

Thursday, January 21, 2016

PLM and PIM – what’s the difference?

Product Information Management (PIM) and Product Lifecycle Management (PLM) must surely be something similar. Yes and no. There are similarities, overlaps and they can complement each other. At the same time there are clear distinctions and you will probably need both.

Similarities and differences

From Wikipedia:

“PIM refers to processes and technologies focused on centrally managing information about products, with a focus on the data required to market and sell the products through one or more distribution channels”

“PLM is the process of managing the entire lifecycle of a product from inception, through engineering design and manufacture, to service and disposal of manufactured products”

Interestingly, they say similar things. At least the product information is in focus. It can also seem from these definitions that PIM covers a hole in the PLM processes: PLM jumps from manufacture to service, while PIM covers what is in between – sales and marketing. This should be a perfect match, as has been discussed by Tech-Clarity.

ERP is also highly relevant in this context, and in some industries so is CRM. The picture below explains some of the differences between PLM, ERP and PIM.


Positioning of PIM

When we look at the purpose of PIM we understand the difference better. PIM is focused on providing accurate, good-quality sales and marketing information to various sales and marketing channels. Today this is often done manually or semi-automatically for each channel. The objective is to gather all relevant sales and marketing material in one place and publish it in a controlled and efficient manner.


PLM/ERP feeds PIM

The best approach is to take product information that is controlled and released somewhere else (typically PLM or ERP), enrich the products with additional information (such as images, video etc.) and publish to different channels and markets. Which channels to publish to is controlled in PIM. Which markets to publish to might come from PLM or ERP, or can be defined in PIM. The best flow is when PIM can trust that the products and their information are valid at the point they enter PIM, so that PIM can focus on enrichment and publishing. One reason for this is that PIM is typically weak on revisioning and approval control.
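
As a simplified sketch of this flow (class names, fields and channels are hypothetical), released data from PLM/ERP is treated as trusted input, PIM adds the enrichment, and publishing is gated per channel and market:

from dataclasses import dataclass, field

@dataclass
class ReleasedProduct:          # what PIM should be able to trust from PLM/ERP
    product_no: str
    name: str
    markets: list

@dataclass
class PimRecord:                # enrichment happens in PIM, not in PLM
    product: ReleasedProduct
    images: list = field(default_factory=list)
    marketing_text: str = ""

def publish(record: PimRecord, channel: str, market: str) -> str:
    if market not in record.product.markets:
        return f"{record.product.product_no}: not released for {market}"
    return f"{record.product.product_no}: published to {channel}/{market}"

item = ReleasedProduct("P-1001", "Conveyor belt 800mm", markets=["NO", "SE"])
pim = PimRecord(item, images=["belt_front.png"], marketing_text="Rugged and quiet.")
print(publish(pim, channel="web-shop", market="SE"))
print(publish(pim, channel="print-catalogue", market="DE"))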


Change and status control in PIM is typically done by using catalogues to show status and control what you can do with the data. An example is shown above. Someone or something pushes the data over to another catalogue, and in that catalogue it is defined what you can and cannot do on a general level. The access is not product by product, but more often market by market. PIM typically works on the latest released product information.

Can PIM replace PLM or vice versa?

I have heard PLM people say that since PLM has most of the product information already, why not extend the data model and processes to also cover PIM? PLM is quite flexible and can for example be integrated with a web portal for product catalogues. Why not?

PLM and PIM are both focused on product information. At the same time they have very different strengths and capabilities. It is better to utilize those differences than trying to build missing functionality in PLM or PIM.

PLM is not good at the marketing side of product information, such as creating print material or publishing to various sales and marketing channels. PLM is focused on the detailed control needed by engineers.

PIM, on the other hand, has ready-made mechanisms for print materials and publishing to different markets and channels, and it has sufficient change control from a sales and marketing point of view. However, PIM is not good at detailed change control, revisioning and management of design data.

Focus from a PLM perspective

Look at the product information from start to end. Where is it born and where is it used? Focus on ensuring that the product information is structured in such a way that you CAN use it for other purposes than just design. E.g. if you need grouping of products in PIM, perhaps use the same grouping in PLM. Ensure that you have sufficient information. You might want to add more information early on to streamline processes and information flow, e.g. in which countries you can sell a product.

The success of PIM will be greatly enhanced if you tie the whole information flow together: from design to procurement and manufacturing, to marketing and sales, and to service. You will be able to re-use information from PLM and give it additional value.

Summary

PIM plugs a hole in the product lifecycle that PLM should not try to fill. From a true lifecycle perspective it makes perfect sense to integrate PLM and PIM. They complement each other well, and you get even more value out of your structured data in PLM.

Do not try to extend a PLM system to also cover PIM functionality. PLM systems are complex enough as they are.

Tore Brathaug
www.infuseit.com

Saturday, December 19, 2015

PLM and Document Management Systems (DMS)


PLM tools have document management functionality. Does that mean that your PLM tool can and should cover general document management needs in your company? Should it handle ALL product related documentation or just the CAD files?

If you already have or are about to implement a PLM tool you might consider using it for general document management, not only management of strictly product related documentation. Is that a wise way to go? Or should you use a common Document Management System (DMS) instead? I will shed some light on the areas you should consider in such situations.

I agree with the Virtual Dutchman that a data centric approach instead of documents is the right way to go. But for the time being there will still be a few documents around.

Document Management in PLM
Document Management functionality in PLM (or PDM) is focused on structure and control. In PLM you gather product related documentation in one place and put it in a product context under formal control. The primary documentation is that which defines the product from a design or manufacturing perspective.

PLM tools typically focus on advanced document management functionality for expert users. Some examples:
  • Formal review and approval processes
  • Rigid change management principles
  • Automatic conversion to PDF with stamping
  • Strictly defined document types with metadata and other behaviour
  • Strong mechanisms for access control
  • Integration to authoring tools to manage drawings
The documents are put into context and their behaviour depends on that context, e.g. you cannot change a document unless the part is open for change.
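
A minimal sketch of that context rule, with hypothetical status names, could look like this: the document itself has no independent edit permission; it follows the state of the part it belongs to.

from dataclasses import dataclass

@dataclass
class Part:
    part_no: str
    status: str          # e.g. "Released" or "In Change" (assumed status names)

@dataclass
class Document:
    doc_no: str
    parent: Part

def can_edit(doc: Document) -> bool:
    # In a PLM-style context, the document's behaviour depends on the part it belongs to.
    return doc.parent.status == "In Change"

part = Part("P-1001", status="Released")
drawing = Document("D-2001", parent=part)
print(can_edit(drawing))      # False: the part is released, so the drawing is locked
part.status = "In Change"     # e.g. an ECO opens the part for change
print(can_edit(drawing))      # True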

You end up with advanced and complex functionality to reach the level of control you need; an example is complying with FDA regulations in the medical industry. Engineers can live fine with such solutions, but it is harder for other people managing large volumes of general documentation.

What you typically don’t get is easy-to-use and flexible document management for general documentation. The processes and functionality are often too specific and far from user-friendly enough.

Too often we also see that the PLM solution has ended up as an archive and not the working tool it was supposed to be. The users create and work on their documents in file folders until they are approved and “must” be put into PLM for control and tracking purposes. The result is that a document can be found in several places: on the file server (probably in several different locations), in your mail system and in the PLM solution.

Document Management in DMS
DMS come in many different shapes and colours. Some are highly specialized for a specific purpose or industry, while others are more generic and focus on general document management. I focus on the latter group here. An example is SharePoint used for document management.

DMS typically focuses on large user groups with various needs for document management. Some examples:
  • Easy creation and update of a new document
  • Smooth collaboration around documents
  • Flexibility to handle any kind of document
  • Flexibility to define your own processes, document types and metadata
You can handle any kind of document as single documents that live their lives independent of context.

You get a solution that is generic and flexible. You can extend to support certain domains. The focus is on replacing file folders with something that is easy to use and yet gives more collaboration possibilities and more control.

What you typically don’t get is functionality that requires the document to be put into a certain context. It is not recommended to replicate PLM functionality in DMS, for example change processes with dependencies on the status of the product.

You can of course add advanced functionality, and many solutions are very feature-rich. If you already have a PLM solution, you should be careful about introducing a complex DMS.

Positioning PLM with DMS

If you go through the document management requirements one by one, you will see that on paper a PLM solution can probably cover all document management in your company. In reality this is not true. I have yet to see a PLM solution successfully used for all document management in a company with more than 50 users. It is not easy enough to use and lacks flexibility.

DMS cannot cover the PLM document management needs. It lacks the product context capability.

In my view PLM and DMS have different profiles, approaches, focus, content and audiences. If you have PLM needs, it is not recommended to try to manage them in DMS. And vice versa: it is not recommended to use PLM for the general audience and for documents that are not related to the product in some way. You will be better off having both. See also this blog at Beyond PLM.

The question is how to define the roles of PLM and DMS in such a way that the boundaries are clear. You do not want users to wonder where to put a document or where to look for it. You should have easy-to-understand guidelines for PLM and DMS. One simplified approach can be (a small decision sketch follows the list):
  • PLM for all product related documentation and for documentation that requires formal configuration management (change control) - Streamline PLM for this and do not attempt easy management of large volumes of general documents
  • DMS for all other documentation - Streamline DMS for easy management and collaboration and get rid of your file servers. Leave the complex context related functions to PLM
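
Expressed as a deliberately naive routing rule (the two flags are assumptions; real criteria will be richer), the guideline above boils down to:

def target_system(is_product_related: bool, needs_change_control: bool) -> str:
    # Product-related or change-controlled documents belong in PLM; everything else in DMS.
    if is_product_related or needs_change_control:
        return "PLM"
    return "DMS"

print(target_system(is_product_related=True, needs_change_control=True))    # "PLM", e.g. a test specification
print(target_system(is_product_related=False, needs_change_control=False))  # "DMS", e.g. meeting minutes
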
What you should have is a clear strategy for both. What is the role of PLM and of DMS? What do you do in each and, not least, what do you not do? Try to have as little overlap as possible, both in terms of which documents and users you target and which functionality and processes you cover.

Summary

PLM and DMS have complementary roles. They fill different purposes and in most cases one should not replace the other. Try to find clear boundaries and have an overall strategy for document management that covers both. The strategy must be easy to understand to avoid unclear roles and usage. PLM and DMS together should make it possible to turn off your file folders.

Tore Brathaug
www.infuseit.com

Saturday, August 22, 2015

2 years of PLM blogging – pick a future topic


Time flies. We have a two-year anniversary as bloggers. Here are some reflections on the previous posts. What is getting the most attention? What is a bit scary? You also get the opportunity to vote for a future blog topic.

Some statistics
  • 26 blog posts in two years.
  • Close to 15,000 reads. We aim higher for the next two years.
  • Most of the readers are in the USA, Sweden, Norway or France. Our main market is the Nordics. Hello Denmark and Finland! You are not even in the top 10…
What gets most attention?
As independent PLM advisors we have covered very different topics. Some are generic views of PLM challenges and how to overcome them. Others are deep dives into specific processes, industries or solutions. Others are on the outskirts of PLM.

The 3 with the most attention (percentage of total readings):
It is a bit scary that the top topic is PLM vs ERP. This has been top of mind in most PLM implementations and is clearly one of the selling points of PLM: get control with PLM and feed downstream processes automatically with quality data. This is still a struggle, 20 years after I started with PLM. It should not be so.

The other two topics are related to PLM challenges in general: why is PLM so hard, and what do you need to know in order to succeed? This is less of a surprise, as people who struggle will look for tips and tricks and learn from others’ failures. These are still on a very generic level.

Some of the other top ones:
These are more to the point and explain or discuss the pros and cons of a strategy or method. There is a hunger out there for inspiration and tips for PLM tasks. Maybe we should be even more concrete and deep dive into some of our experiences?

Some reflections
I have been working with PLM for two decades. It is a bit sad that many of the topics are the same today as when we started. And there is still no commonly agreed definition of what PLM is. Is it a tool, an approach, a process or something else? And many companies seem to have to make the same mistakes as many have made before them.

I don’t know for sure, but I guess that many of our readers are really interested in PLM and have some specific PLM responsibility. For them, some of the topics are like kicking in an open door. Sometimes it seems that the PLM community is a secluded group of men in their 40s and 50s, with white shirts and little hair, who have been working with PLM for at least 15 years – like myself and my colleagues. Where are the hipsters and our better halves?

It would be very nice if some of the posts were also read by senior managers or non-PLM people. Then we could have some interesting discussions. There is still a need for raising PLM awareness and understanding. It is fully up to us to attract a new audience.

Many are happy to share their experience and there are some really good sources for PLM inspiration. Here are some of the ones that I appreciate the most.
We want to focus on PLM as an approach covering many tools and processes and not that much on PLM solutions as such.

Vote for a future topic
We want to share our experience and would like your input on which topics are of the greatest interest. The topic with the most votes will be a future blog post. Please spend a minute and vote for a topic.

Summary
It seems that the interest spans strategic high-level topics, process- or industry-specific topics, and deep dives into solution areas. There are a lot of possible topics. Please help us select the most interesting ones.

Thanks for reading our blogs and interacting in discussions. We will continue as long as there is an interest in our posts.

Tore Brathaug
www.infuseit.com

Monday, May 25, 2015

Can a true PLM initiative be decentralized?

In a previous blog I was looking for the PLM expert. The conclusion was that implementing PLM is a team effort (which is perhaps not a surprise), as a PLM implementation includes process, tool and organizational aspects. The executing team should possess knowledge of the customer’s business, the targeted domain and the available technology, strengthened by a reference group working in the day-to-day business and by the support of management.

But we didn’t address the question of at which level this “body of expertise” should act. Where should this organization reside, and what mandate should it have?

In short: what would the impact be of having a centralized or decentralized ownership of PLM?

A Decentralized PLM Ownership

From a process and method point-of-view you could probably find reasons to actually move the responsibility from the corporate level and allow local flexibility:
  • Risk management through diversification, which would allow business units to try out things which could then benefit others in the organization
  • Ownership of your own working methods would potentially allow easier change management
  • Fewer compromises in the way you work and the tools you use in your daily work would probably generate less friction
  • Being closer to the immediate challenges allows the organization to be more agile in finding solutions to emerging threats, opportunities and trends
Okay, so there might be good things about decentralized processes and methods. But let’s have a look at what a decentralized PLM (both process and tool) does to some of the (old) core values of PLM:
  • Manage Information - Manage data and processes
  • Find Information - Search, where used/referenced, process status
  • Share Information - Real-time collaboration. Information attached to workflow
Doesn’t a “local” PLM system contradict those values? I assume it has to do with your definition of local. If it is a site with 5000 employees, does it have to be integrated with the rest of the company? It all depends on the business.

But let’s flip the coin completely and look at some more challenges you might find in an environment which has a decentralized PLM ownership:
  • No one will be responsible for thinking horizontally and aligning the company across departments and across functions/disciplines
  • Each department is measured and rewarded by its own results, blocking strategic change as this requires investment
  • Hard to have job rotation and to move staff around, as the processes, working methods and tools differ
  • More costly to provide effective tool support, as it is not consolidated
  • Less efficiency further down the chain, or even lost opportunities and an inability to execute on initiatives completely. In many cases the lack of a common way of doing things is painfully exposed once you need unified processes and information at one end while having a diversified way of working and/or of structuring and storing information at the other. It works as long as you don’t think of others … Typical initiatives and areas which will put demands on what information is delivered, and how, are business analytics, procurement, manufacturing, e-commerce and aftermarket.
  • In many cases (in my experience) decentralized ownership means more “let’s just make it work”, which means that the experienced employees will have a way of dealing with things but the new ones will have to “find their way”. Is that really making things smooth and efficient?

Decentralized Processes and Methods but Consolidated PLM Tool(s)

Provided that you believe in decentralized processes and methods, but also in the need for consolidated application support for effective distribution of information, you might avoid some of the bullets above. And looking at the core PLM values stated above, “only” the process part might not be accomplished.

The concept of self-organizing teams is not new; it is a big part of the agile software development movement. Isn’t this what we are talking about when we put PLM in a decentralized process environment but with consolidated tools and information, having common PLM tool(s) orchestrate these “teams” to deliver information and use tools in a unifying way?

In such an environment, process support might be hard to implement within the commonly used tools, which might not always be a bad thing. Having processes tightly implemented in the IT tools assumes a deterministic view of the world, but as we all know, the world and the way we work are ever changing. So this could actually trigger a healthy look at how you architect and build your tool support, as it will require modularization and flexibility.

The trap which you might fall into is that the consolidating system will become the “scapegoat” as it will have to unite and be a compromise from an end-user point of view. Also, many regulated businesses don’t have this option. To unite under one proven and compliant process is a matter of necessity and survival for these companies.

Decentralized PDM and Consolidated PLM

A trend we see at large companies is that they move away from one enterprise PLM solution covering everything. We see several examples of local PDM solutions, allowing local flexibility, feeding an enterprise PLM in which information and processes are harmonized, but at a higher and more generic level.
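
A simplified sketch of that pattern (field names, sites and the status mapping are all hypothetical): a site-local PDM record is mapped to a coarser, harmonized enterprise PLM item, while the detailed CAD structure and local workflow data deliberately stay local.

def to_enterprise_item(local_record: dict, site: str) -> dict:
    released = local_record["status"] in ("approved", "frozen")    # local statuses are site-specific
    return {
        "item_id": f"{site}-{local_record['local_id']}",           # keep traceability back to the site
        "description": local_record["description"],
        "lifecycle_state": "Released" if released else "In Work",  # harmonized, coarser states
        # detailed CAD structure and local workflow data intentionally stay in the local PDM
    }

local = {"local_id": "4711", "description": "Drive unit", "status": "frozen"}
print(to_enterprise_item(local, site="OSLO"))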

Processes and “how to do things” (methods or practices if you will) are not the same, and to be able to deliver and consolidate information in a coherent and consistent way, “the way you do things” might very well have to be aligned as well. An example of an area which requires this thinking is CAD/CAM. In other words, it is important to understand whether you have this type of dependency so that you can act accordingly in your decentralized PDM environment.

Using a PDM tool closer to the authoring tools will often result in a better connection, fewer translation issues and richer details. But keeping the innovation loops within PDM and consolidating in PLM cements the functional silo, as it delays the point at which a cross-functional view can be “visualized”.

Conclusion

As I see it, you actually don’t need to centralize everything under the PLM “umbrella”. This of course depends a bit on your type of business.

Processes and methods should be carefully implemented. You can reap a great deal of benefit by aligning and consolidating tools and information across the enterprise while still allowing processes to be more autonomous in nature. One trick is to implement the processes at the right (high enough) level in the applications, which allows people to work within them while keeping important flexibility and agility in the tool. It is important to keep a clear distinction between processes and methods, so that you consciously decide what you want the tool to support and what you keep outside it. This allows you to adjust the way you work to temporary situations and differences in business models, while still delivering and using commonly structured information and tools.

Are you convinced that one approach is better than the other? What’s your view on this?

Robert Wallerblad
www.infuseit.com


Tuesday, April 7, 2015

Personalized PLM Tool - make it yours!

Lately I’ve been thinking about how you could make people adopt and embrace an enterprise system such as PLM. Those thoughts led me to these questions: How can you get a feeling of “my” PLM system? And how far do you need to go to get there?

An email application allows the user to categorize and label emails. A to-do application allows the user to categorize, label, set alarms and flag tasks. And cloud solutions like Salesforce CRM that are built for flexibility allow the user some freedom over the user interface and which information to focus on.

These examples allow the user to view and manage data in a highly personalized way, defined by the user.

Individual Personalization through settings and preferences

The above are examples of explicit interaction, and are therefore the result of the user’s direct manipulation of the UI.

If we were to take the same approach in a PLM application context, it could mean (a small sketch follows the list):
  • Changes to color schema
  • Filtering of content
  • Hiding and moving UI components such as columns in a table or tabs/areas with blocks of functionality
  • Setting up dashboards which would potentially aggregate and analyze information and then display it in a tasteful way
  • “Clipboard” function which allows the user to put information into different buckets
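
As a sketch only (the attribute names are invented), explicit personalization like the list above is essentially a per-user settings object that the PLM client stores and applies:

from dataclasses import dataclass, field

@dataclass
class UserUiPreferences:
    user_id: str
    colour_scheme: str = "default"
    content_filters: dict = field(default_factory=dict)    # e.g. {"status": ["In Work"]}
    hidden_columns: list = field(default_factory=list)     # e.g. ["Created by"]
    dashboards: list = field(default_factory=list)         # named, user-defined dashboards
    clipboards: dict = field(default_factory=dict)         # named buckets of object ids

prefs = UserUiPreferences(
    user_id="rwallerblad",
    content_filters={"status": ["In Work", "In Review"]},
    hidden_columns=["Created by"],
    clipboards={"ECO candidates": ["P-1001", "P-1002"]},
)
print(prefs.clipboards["ECO candidates"])
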
Adaptive Design

But there is also something called implicit interaction, where the system adapts to the user and the context without the user having to interact with the UI directly. Some examples from the mobile world are the backlight changing on your device depending on the light around you, or the text being enlarged depending on the movement of your device. In both cases the user can digest the information more easily without having to tell the device how to compensate for the conditions or context in which the information is consumed.

A common usage of adaptive design in PLM applications is when the application is adjusted to the user’s profile. Roles, teams, skills etc can all be used to expose the user to a suitable set of information and functions.

There are also examples where adaptive design attempts to use the situational and temporal context in which the user exists to create a better experience or a more suitable set of information. Examples could be adapting information to the location of the user (geographically, inside or outside the company network, etc.), network connectivity or bandwidth.

Conclusion & What can we ask from the future?

We will “never” be able to know and anticipate what preferences the user has when it comes to how they want to look at, work with, and analyze information. So individual personalization is needed to create flexibility, at least when trying to address a larger user community. I believe that personalization will have to be there in all the variants we have today, but more sophisticated, and that we will get more powerful capabilities to create dashboards, feeds and searches which allow users to collect, view and analyze information according to their preferences.

But here is an interesting observation: if you look at mobile apps, you will not find that many personalization options which allow the user to adapt them. Instead they go for simplicity, where the app dictates how the user sees and uses the information. And still people get addicted.

So, what would happen if we join simplicity and personalization? Making the system adapt based upon the user’s interaction with it, instead of designing all preferences and rules up front. Think “recently opened” on steroids. Here are some examples, off the top of my head, of what that could look like in a PLM setting (a toy sketch of the first example follows the list):
  • Get reminders based upon patterns found in how you act. An example could be a recurring event: let’s say that you normally check for orders to approve on Monday mornings, or that you check the deliveries against the Gantt chart on Friday afternoons. Now the system reminds you (“hey, have you forgotten to do X?”) and perhaps even provides you with the capabilities to actually perform what it thinks you should do.
  • Or think of Gmail’s categorization of your mails as important, based upon how you have interacted with the sender in previous emails and chats, as well as which keywords have frequently occurred in the mails you have recently opened.
  • Search is another area which could be hugely improved by being more sensitive both to context and to the user’s previous usage
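
Purely as a toy example of the first idea (the log format, dates and threshold are assumptions), a system could mine its own activity log for weekday patterns and turn them into reminders:

from collections import Counter
from datetime import datetime

activity_log = [  # (timestamp, action) pairs the system already records
    (datetime(2015, 3, 2, 8, 30), "approve_orders"),
    (datetime(2015, 3, 9, 8, 45), "approve_orders"),
    (datetime(2015, 3, 16, 9, 0), "approve_orders"),
    (datetime(2015, 3, 13, 15, 0), "check_gantt"),
]

def recurring_reminders(log, min_occurrences=3):
    # Count how often each (weekday, action) pair occurs and suggest a reminder
    # for the pairs that look like a habit.
    counts = Counter((ts.strftime("%A"), action) for ts, action in log)
    return [f"You usually do '{action}' on {weekday}s - have you done it this week?"
            for (weekday, action), n in counts.items() if n >= min_occurrences]

print(recurring_reminders(activity_log))
# ["You usually do 'approve_orders' on Mondays - have you done it this week?"]
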
In the above discussion I used the word system or application in the singular. The assumption was that we need to address the needs of the individual using one application. But maybe we can agree that working within processes actually makes you use multiple different applications. So instead of using PLM as a label for a system, let us use it as a label for the concept. Then what makes PLM mine is that “your” information follows you independently of which application you have in front of you at a certain point in time, and that the information is the center of attention while the applications are secondary and transparent. That is what would really benefit the user and give the user the notion of “my”: that your information actually follows you in a coherent and consistent way across applications throughout the complete process, and that independent of application you would have your dashboards, feeds, federated searches, subscription inbox, etc. available, collecting information and actions from different sources and applications.

What do you think? What will the future look like for enterprise systems such as PLM in terms of personalization? Will we aim to be more adaptive in system design? And what use cases do you see triggering it?

Robert Wallerblad

Monday, March 30, 2015

Why is it so difficult to measure PLM success?

Everybody talks about measuring the result of their PLM initiative, but nobody does anything! This is of course exaggerated, but in general too few companies know what PLM has given them (or cost them). Many companies do not really know whether the PLM project is a success or not. The result is that management only cares about PLM once a year, when they see the cost in the budget. Why is this so, and what can be done about it?

There are few numbers available

In my 18 years in the PLM business, I have seen very few monetary values put on PLM achievements. Of course, the PLM solution providers have their slide decks with generic numbers and maybe a few concrete examples they use over and over again. It is difficult to use those numbers directly in another company. The KPIs can be reused, but you have to have your own numbers to be able to trust them.

This quote from Accenture is quite illustrative:

“The CEO came to the department and asked: I have spent 100 million euros on the implementation – what are the business results? They couldn't give him a proper answer.”

Unfortunately, this CEO is not alone.

PLM vendors have Excel sheets where you fill in a predefined list of possible improvements. By magic, you get a huge number that nobody (except the PLM sales guy) trusts.

We also use the lack of numbers as an excuse: the others do not have the numbers, so I will manage without them as well.

The engineer’s flaws

I think one reason for this lack of measurement is that it is mostly engineers (like me) who care deeply about PLM. In addition, engineers have two flaws in this context:
  • We are satisfied when things work (then we move to the next task)
  • We want answers to be unambiguous
We engineers want efficient and consistent workflows and a CAD integration that works. Yes, it would be nice to know what value that brings, but what we really care about is getting it to work. We are satisfied when it works and we have a feeling that this is good for the company. We do not see the importance of measuring what value it brings.

Engineers also like things to be accurate and without uncertainty. Calculating a business benefit and ROI is not an exact science, so we do not trust it and do not do it unless we have to. If we are forced to, we tend to look only at the benefits we can measure and get an accurate number for, e.g. number of documents stored, ECO throughput time etc. Such numbers can show how PLM progresses, but they do not tell much about actual business value. We miss the big picture and do not include areas that may be more important but harder to measure, e.g. global collaboration enabling two sites to share resources and work together, or increased service margins due to better access to information.

Do not know how to do it

Given the engineering background, we find it hard to identify what to measure and to define calculation methods we trust. Many PLM people agree that it would be good to have these values, but do not know where to begin and give up before starting. Many PLM people feel they lack the financial tools and understanding to attack this beast properly. They focus instead on the things they know: getting a good configuration management process in place.

Hard to measure

Yes, it can be hard to measure business benefits and you cannot always trust the numbers 100%. It is easier to measure reduced volume of products in storage or faster production processes than improved product information quality. ERP focuses on physical products and things you have, while PLM focuses on virtual products and things you do not yet have. This is still no excuse. We have to start showing the value of PLM or it will continue to be seen as a product development support tool, or even just a CAD management tool.

A typical excuse is also that we lack the numbers from before PLM came in and do not have anything to compare with. That is the same as saying that PLM is cemented and will not improve beyond the initial improvement. If that is true, the management is right in not investing more in PLM.

Why do you need to measure PLM success?

PLM people always complain that ERP gets all the attention and PLM at best gets the leftovers when it comes to management attention and budget. The typical complaint is that management (and thus the budget owners) do not understand PLM and its importance. We can only blame ourselves. We are unable to talk to management in a language they understand. My experience is that some ERP business values can also be tough to calculate, but the ERP guys and girls are much better at speaking a language management understands: business benefits, calculated savings, earnings and cost. If PLM people were able to do the same, PLM would be in a much better position.

You cannot expect management attention and large investment or operations budgets for PLM (at least not any more) if you cannot show what PLM brings and an ROI that management trusts. A trend is that PLM investment decisions are made at C-level.

Another thing is that measuring in itself gives improvements, as you put attention on the area and follow the progress.

What can you do?

We also touched on this in a previous blog, where we looked at data from PLM-related processes with a method improvement focus. Here is another interesting blog from Melanie Manning about how to create effective metrics.

First, you have to agree on what to measure. Tie PLM to the business strategy. What is important for management? Where will your business go in the next 2 to 5 years? Identify the areas where PLM contributes to the overall business strategy. PLM can be an enabler for change. Identify KPIs related to the business strategy.

Start thinking like the financial people. Act like a business analyst. Is there a business improvement somewhere supported by PLM? How can you measure it? Do not only focus on the simple bottom-up stuff. Look at the big picture and involve the business and management. Think about short-term, long-term, tangibles, intangibles, cost reduction and revenue increase. As a common exercise, you can find improvement areas. Get business and management to give their thoughts on possible improvements. If they provide the numbers, they are more likely to trust the result.

Some examples are: product introductions per engineering hour, degree of collaboration, engineering efficiency, number of executed projects, project execution time, increased service revenue, reduced number of applications, product error costs, degree of CTO vs ETO, and amount of manual entry of information. You have to agree on what makes sense for your company. The trick is turning such KPIs into money values.
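
A deliberately simple, hypothetical illustration of that trick: take one measured KPI improvement, for example shorter ECO handling time, and multiply it out into an annual money value that management can relate to. All numbers below are made up.

eco_count_per_year = 400           # measured in your own system, not a vendor estimate
hours_saved_per_eco = 1.5          # measured before/after the improvement
engineering_cost_per_hour = 90.0   # assumed fully loaded hourly rate, in EUR

annual_saving = eco_count_per_year * hours_saved_per_eco * engineering_cost_per_hour
print(f"Annual saving from faster ECO handling: {annual_saving:,.0f} EUR")   # 54,000 EUR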

The most important thing is to start measuring. Agree on some KPIs and follow them over time. And show what the KPIs mean in money.

Tore Brathaug

Sunday, March 8, 2015

You don’t know what you don’t know about PLM

It is hard for non-PLM experts to know all the aspects and possibilities of PLM solutions. I have seen it over and over again. How can companies looking for a PLM solution make the right choices? Should they start specifying what they need, or should they just trust their hunch and pick the one at the top of their mind?

The topic is also valid for companies that have had PLM for a long time and are stuck. It is hard to know how and in which direction to go. In this blog I focus on first- or even second-time buyers that don’t have large PLM organizations and a lot of internal PLM knowledge.

PLM can be so much

There are as many opinions about what PLM is as there are people. On an overall level you can probably agree that the core is:
  • Part Management
  • BOM Management
  • Document Management
  • Change Management
Throw in some CAD integrations and an ERP integration. Maybe variant management or project management. Or maybe PLM is all the tools and processes that create or manage product information?

Enterprise PLM solutions like Teamcenter, Windchill, Agile and Enovia have an enormous breadth, but it is also a challenge to understand the details in a selected PLM area, for example CAD integration or the change management process.

The core functionality mentioned above is really no more than classical PDM, which is what most companies still need the most. But it can be hard enough. There are so many details that you probably didn’t think you had to care about.

  • What should the naming rules be? Running number? Any logic?
  • What metadata do you need, and can you agree on it? How does that match, for instance, ERP?
  • What are the approval steps for different object types? What is an object type by the way?
  • Easy revisioning in development and a full-blown change process in production/manufacturing?
  • How do you get a BOM from CAD? Can you use it directly?
  • Etc.

There is so much detail in core PDM that you will not be able to cover it all during an evaluation phase. The devil is in the details, and some of these differences can have a significant impact on your way of working. Unless you really know what to look for, you will not be able to identify the details that matter for you until you implement or even roll out the solution.

The evaluation process

Buying PLM is like buying a house. You visit the house once or maybe twice for half an hour, read the prospectus and then pay more than you planned. Then you are surprised a couple of months later when you take over the house, as there are so many details you see now that you could not possibly see during the sales process. The good thing is that the human brain usually rationalizes bad decisions to make them look good: the good things about the house are exaggerated and the bad things are ignored. It can be the same with PLM. However, problems with PLM tend to be visible to others than the ones making the decisions, and that makes the bad things hard to ignore.

The problem with buying a PLM system, and a house, is that unless you are an expert you don’t know what to look for. You look at everything and end up seeing nothing, and make a decision based on who did the best demonstration and whom you trusted most in the sales process. By the way: those people you trust the most are not likely to be part of the implementation anyway.

The sales presentations (and the information you find on the vendors’ websites) from the competitors could probably be swapped without you noticing. At the sales level all solutions can do everything. You have to find the differences yourself. I like this quote from David Stewart at Zerowait-state: "..it’s hard to discern which is the better (PLM) choice. I still feel like the best way to know or choose is based on experience".

To compensate for not being able to see clear differences between the PLM solutions, you typically describe your requirements and send out RFQs with a lot of detail. If you are considering any of the enterprise tools, the problem is that the solution providers can probably say yes to almost everything and do the same things as their competitors. It is very hard to distinguish PLM solutions based on a list of requirements. See this blog for some insights into why traditional RFQ processes are a problem.

The typical tendencies when specifying needs/requirements as a first-time buyer are:

  • Not specifying at all, or at such a high level that it does not give any value. The answers all look the same, and you end up picking the one you know or the one you believe in the most.
  • Or specifying in detail everything you can think of. You spend tremendous effort in detailing a very long list of requirements to be sure you cover all aspects. You end up with a complex solution.

The first one might be as good as the second. A big requirements phase is a danger in itself. You easily end up specifying today’s way of working. You don’t see all the changes a PLM solution should bring.

I suggest that you spend more time on reference visits and focus on the supplier, and not so much on the PLM solution itself. Find some of the references yourself; don’t just take the ones picked by the suppliers. The key question: is the supplier capable of being a long-term partner ensuring your success?

Hard to differentiate the PLM tools (even for experienced people)

I have been on the specifying side as a customer, on the responding side as an implementer, and now on the customer side as an advisor in many PLM evaluations. And I see it again and again: the people evaluating PLM solutions are not able to distinguish clearly and rationally between the different options.

It is very hard to find facts about the differences, and the truth is that at a high level the differences among the enterprise PLM tools are quite small. You will see differences if you consider some lighter “PLM” solutions that are more CAD-management oriented, or ERP with PLM functionality. But even there it might be difficult: CAD-management oriented tools might fall short on multi-CAD and integration to ERP, while ERP might fall short on CAD integrations or on advanced PLM functionality like systems engineering. But for the core functionality it will seem that they can all do it.

The vendor’s tendency

The vendors tend to fall into one of two categories:

  • The oversimplifiers – they oversimplify the message and the effort. This makes it easy to understand for management and other stakeholders who are not experts in this field.
  • The scope-extenders – They want to extend the scope to show all the magnificent solutions they have that you didn’t ask for in the beginning

Be very wary of people at the extreme end of either of these two.

  • The too-good-to-be-true usually is just that. You will end up with a limited solution and realize you got less than you expected.
  • The large scope might be too large. You end up with a very long and complex project which is more expensive than you thought, and it is hard to get the organization on board.

The sales guys will paint a nice picture anyway. The challenge for you is to see behind the fancy slide decks and prepared sales messages.

What should you do?

It is hard for first-time PLM buyers to grasp the totality and at the same time identify the crucial details that can make or break a PLM project.

Don’t over-specify to compensate. You will not be able to specify up front how the solution and processes should look in three years, and even if you could, the world will change in the meantime. It is more important to draw up the big lines and have a long-term vision, strategy and plan. You should start with the most important areas and specify to such a level that you can differentiate. There are differences between the PLM solutions; you just have to know what to look for. If the differences are in an area of great importance to you, it is better to find that out during the evaluation phase than in the implementation phase.

To be able to do this you could recruit or hire people who have done PLM evaluations and implementations before. It might seem expensive, but it will pay off with a solution that is better fitted to your needs.

You might consider running a proof of concept (POC), but you risk falling into the same trap: you have to know what to look for in the POC to get any value out of it. Done properly, a POC based on your data, your products and your desired way of working can be a really good approach.

You should look at other factors than just the feature list. The most important is the supplier itself: Have they done this many times before? Do they understand your business? Do they act as real advisors, guiding you in the right direction? Do they have references in your region?

See our How to select a PLM system blog post for more details about how to run a PLM evaluation.

Summary

PLM buyers: You don’t know what you don’t know about PLM. That means it is hard for you to distinguish the tools and the suppliers. Don’t try to specify everything. Have a clear vision of where you want to go and what you want to achieve and what is important for the business. Establish a commonly agreed strategy and a high level plan on how to get there.

Don’t go into too much detail, just where it matters. The challenge is to know which areas and details are important for you and at the same time can differentiate the tools and suppliers.

Focus instead on the suppliers and identify the one that is most likely to become a long-term partner ensuring your success. It is far more important to select the right supplier than to select the right tool.

Tore Brathaug

Thursday, January 15, 2015

The PLM-user Pitch


The PLM system pitch and the related discussions are almost always focused on the decision makers: how should you convince management to buy, and how should you show that your implementation provides value?

The topic is most often focused on the disconnect between IT and Business and how to bridge the gap.

It’s of course an important topic but today we will look at it from another angle.

There is another void to fill: the one between the benefit to the enterprise and the actual user of the system.

Neither vendors nor the companies looking for PLM systems have (enough of) this in focus. There is a functional focus, I agree, but that is not necessarily the same thing as a user-oriented PLM focus. It is more about having a checklist to see that an application can fulfil the functional requirements, which is not really the same as putting the user at the center.

So what would happen if we focused on the users? Because once bought, a system such as PLM is not only good for the business as a whole; it is also intended to help users in their everyday work.

Tools and processes for the greater good

An enterprise tool is often positioned as the tool which should support the complete company’s needs and not necessarily the individual’s. We focus on overall process/information improvement and harmonization, and not on the end users’ tasks and daily work.

A company-oriented pitch is also often more future-oriented than the way you would phrase it to an end user. Employees are more focused on the present and on solving the challenges of today. That is where our pitch should be focused: the present and what it means for the individual.

An individual productivity tool

If you take the scope of PDM, you should be able to pitch the actual idea as being about enabling the individual: making it easier to find the right information, and enabling earlier transparency as well as collaboration. For complex data sets and/or tasks it will help in keeping data integrity and dependencies, thereby offloading the workers from otherwise tedious and error-prone tasks.

But unfortunately there are challenges with this pitch:
  • End users, and the way they want a system to behave and support them in their daily work, are diverse. What makes a good fit for one will not necessarily fit others. Basically I don’t believe there is a “Heinz ketchup of PLM” that fits all tastes. I rather believe that the need is as diverse as the salsas you can buy in your local supermarket.
  • If we talk about PDM systems, functions associated with PDM come with quite a lot of baggage in terms of old system behaviour which has not always been perceived as enabling. A shift in technology and the ability to work more seamlessly will most likely help make PDM applications less of a struggle in the future.
A Tool for Knowledge

The productivity pitch will not get much traction if your PLM system is used “only” to specify your products once you have developed them, instead of the products being developed within it. Unfortunately this (mis)usage is not as uncommon as you might think. In many cases putting things into the system is an administrative task at the end of what one considers the value-adding activities, and this really undermines the individual’s perception of having the system support her needs.

But independent of scenario, you would probably gain one thing, and that is a knowledge bank. The system will create transparency which benefits the individual, as searchable and structured information allows for higher productivity the second time around.

This transparency will also benefit the people further down the chain. The earlier you manage to accomplish it, the more power you will get out of it, and it is also an opportunity for business intelligence to analyze trends earlier, which again could be used as a pitch for certain end users or consumers of information.

Democratization

By sharing knowledge we enable the decentralization and distribution of tasks. Technology will enable this, because by systemizing knowledge we can put it in the hands of “anyone”.

Think about simulation, which previously was something only highly specialized people worked with. Today software takes the first hit by checking the output of the individual’s work before it is integrated with the rest of whatever solution one is working on. All domains have it: software, hardware, mechanical design, electronics, etc. And some will take the step to create mock-ups which bring multiple disciplines together.

There will always be a place for specialized skills, but the frontier is moving and will continue to move as we manage to systemize knowledge. And this should appeal to the expert, as it allows her to focus on things which are less bread and butter, while giving the non-expert more confidence in the output she produces.

A trendy phenomenon is the Internet of Things: think what we can do with data collected from products in the field once we systemize that knowledge. How will that translate into the way we design our products or conduct our service and maintenance business? Once that data is cracked, it can be used as BI put into the hands of the individual.

Could this bottom-up approach result in benefits on company level (for “the greater good”)?

Could we flip this around to make it about the company, and what is best for it? Of course we could. Thinking about and addressing the needs of the individual will in the end find its way to the bottom line, resulting in better overall quality, better flows, higher productivity, higher data quality, etc.

Conclusion

IT and PLM should not be seen as a support function next to the core business: your PLM processes are actually part of your business. In many companies today it is therefore within your PLM systems that you conduct your business. If we embrace that fact, the focus can’t always be on the benefits at company level; the day-to-day work has to find its way into the PLM pitch.

Robert Wallerblad

This %&$#?@! PLM Application!



Have you ever encountered a bad user experience in a PLM tool? I have, a few times. So how do we address user experience in enterprise software such as PLM systems?

UI is only one part of UX

When thinking about the user experience (UX) of an application, you might associate it with words like a nice and modern-looking graphical user interface (GUI). But it is so much more: at its core it is about understanding the end users’ needs and making the work tasks as simple, effective and intuitive as possible. Level 1 is to address things such as consistent usage of interaction patterns, wording, icons and other features that you might associate with the GUI. But that doesn’t necessarily mean that you will develop the right features. You might still miss the actual goal of your implementation.

Understanding the user and the context

So, back to what I consider the core: it’s all about understanding the context and the usage. Is the target an expert user? Then it might be important to have effective pixel usage and advanced features, which might have some threshold to learn but are motivated by the value they provide. Or is it aimed at the broader audience, where it has to be intuitive, self-explanatory and with zero threshold? To understand this you need to work more or less in the same way as any business analyst, grasping the essence and details of the tasks you are about to develop support for. Knowing the business impact goal of the implementation is an important part of this, as it sets the framework for whatever feature or function you are about to develop. You can probably implement “a thing” in a hundred different ways, but only a few of them will match the desired outcome (e.g. better overview, higher efficiency, transparency or traceability).

Another important thing, which is taken to its extreme when working with customers in the fashion industry (as I do now), is the amount of data that the user is exposed to, and the speed at which the user has to perform his or her tasks. This means that every click counts – if you do thousands of product developments per season you will have high demands on the tools supporting your processes. Things which normally might be considered annoying become critical issues, and bad performance becomes a show stopper as the users find other ways to conduct their daily work. If you think about it, it's not that different from any other industry. It's just that the situation is more extreme, which makes UX a good focal point for efforts made to make life easier for the end-user.

Don't lay the load on the user

A typical trap is to make things over-complicated. This is often found in PLM tools. Tools providing out-of-the-box capabilities often take the path of providing functions which are as generic as possible, allowing the most flexible usage possible. This flexibility is also manifested in the fact that every screen comes fully loaded with features – something that you most probably can configure or adjust through preferences. The issue is that none of that is actually done to enhance the experience. You will most likely find this in a lot of highly customized or bespoke software as well, and for the same reason: it is hard to take decisions which limit your options, even if it is for a good cause – to keep it simple. In many cases you need to take a stand and choose a path for the usage of the software to make it work in a streamlined and effective way. By not taking the decision yourself, you force it upon the end user.

To quote Steve Jobs:

“Simple can be harder than complex: You have to work hard to get your thinking clean to make it simple. But it’s worth it in the end because once you get there, you can move mountains.”

There is money to be made from good design

The value of good design is not only about pleasing the end-user. Areas which contribute to a faster ROI are:
  • Less need for support
  • Fewer errors
  • Less need for user training
  • Higher throughput
  • More time spent on value-adding tasks
But I have also heard some "softer" reasons for pushing for good design, and that is to be able to appeal to new, young co-workers; you need to provide tools that don't make their hair stand on end. That fight is not easily won, as applications from the "consumer world" have transformed what we expect from an application in terms of usability. And mobile and cloud have done the same with availability, as we demand access from anywhere at any time.

What can we ask from the future?

I will not take the consumer and mobile path on this one, and I won't take you to "Star Trek" millennia either…

Steve Krug's book "Don't Make Me Think" is often mentioned when usability is discussed. Isn't it time for someone to write the book "Don't make me enter more information, please I beg you!"? I hope and believe that a lot of today's tedious clicking and entering will be replaced by capabilities that allow the user to shift focus from administrative to more value-adding tasks. Aren't we soon in the age of more intelligent PLM applications, which could enable more automated metadata creation? I'm not talking about mind-reading applications. It's more about utilizing the context in which the data resides, releasing data by moving out of the boundaries of native file formats, and borrowing some from predictive coding. In short: don't make a human do the work that a machine can do.
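
To make that a bit more concrete, here is a minimal Python sketch of the idea, under the assumption that the application can see the context a document is created in (the folder path and the part the user is working on). The helper name, the folder conventions and the keyword rules are all made up for illustration – a real implementation would plug into the PLM system's own APIs and could borrow classification techniques from predictive coding.

    # Hypothetical sketch: derive document metadata from context instead of asking the user.
    import re
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class DocumentMetadata:
        project: str
        discipline: str
        doc_type: str
        related_part: Optional[str]

    # Simple keyword rules standing in for smarter classification.
    DISCIPLINE_KEYWORDS = {"piping": "Piping", "elec": "Electrical", "mech": "Mechanical"}

    def infer_metadata(file_path, active_part_number=None):
        """Guess metadata from the folder path and the part currently being worked on."""
        segments = file_path.lower().split("/")
        project = next((s.upper() for s in segments if re.fullmatch(r"p\d{4}", s)), "UNKNOWN")
        discipline = next(
            (name for key, name in DISCIPLINE_KEYWORDS.items() if any(key in s for s in segments)),
            "General",
        )
        doc_type = "Datasheet" if "datasheet" in file_path.lower() else "Document"
        return DocumentMetadata(project, discipline, doc_type, active_part_number)

    # The user only confirms the suggestion instead of filling in four fields manually.
    print(infer_metadata("projects/p1234/mech/datasheets/pump_101.pdf", "PRT-000101"))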

Robert Wallerblad
www.infuseit.com

*The title was inspired by the book "Jävla skitsystem" by Jonas Söderström

Monday, November 17, 2014

LCI – Document package or continuous process?

LCI, or Life Cycle Information, is a hot topic in the Norwegian Oil & Gas industry. For my international readers who do not know the term, it has to do with managing huge amounts of documentation from plant engineering through product engineering and fabrication, and then cross-checking it all through multiple iterations. The documentation is supplied to the operator at several milestones in the project, from early design through commissioning.

A more international term would be preparation of Documentation For Installation (DFI) and Documentation For Operation (DFO).

Rigorous demands on LCI from the operators
Operators put rigorous demands on what information they want in a project and at which points in time they need it, both to monitor progress in the projects and to be compliant with safety and regulatory standards. The engineering, procurement & construction (EPC) contractor is responsible for collecting, checking and supplying the documentation from its own disciplines, as well as from all suppliers, in the required way.

LCI Coordinators
Traditionally it has been the domain of a whole host of LCI coordinators to make sure that all documentation is present and, if not, make sure it is created… However, the "best" LCI coordinators manage to produce the information without "bothering" engineering too much. It has largely been a document-centric process, separated from the plant/product engineering process.

Varying LCI requirements from operators
One of the real headaches for the EPCs is the varying requirements from different operators in different countries, and especially when the end customer is a yard. I've witnessed rigorous and detailed LCI deliveries in a project for an operator, and a completely different set of deliveries for a yard. This challenge has led more EPCs to define their own LCI strategy and processes as their own best practice, while treating requirements from operators in different parts of the world as "add-ons" to their already existing LCI process.



Going from document-centric to data oriented approach
In recent years, I've started to see a shift from the separated document-centric approach to a more data-oriented approach where data is harvested from different data structures – linked data in context, if you will. This process is no longer separate from the plant design and project execution process, but rather an integrated part of it. The shift is very similar to how the aerospace industry executes its projects; one such example is the Airbus DMU (Digital Mock-Up). With this approach it is easier to share and collaborate on data. Dependencies and consequences of changes are more easily understood, and experience can be harvested from one project to the next by copying data structures from one project to the other. Some EPCs are also creating best-practice template structures or libraries. I've seen this approach used successfully to facilitate re-use and to minimize engineering time during the FEED phase (Front End Engineering & Design) and also during contract execution.
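
As a rough illustration of the data-oriented idea (not any particular vendor's model), here is a minimal Python sketch: information is attached to tags in a plant structure rather than locked inside document packages, and a best-practice template structure can be copied into a new project. The Tag class, the attribute names and the copy_template helper are hypothetical.

    # Hypothetical sketch: a tag-centric structure with linked documents and LCI requirements.
    from copy import deepcopy
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Tag:
        tag_number: str
        description: str
        documents: List[str] = field(default_factory=list)     # linked document numbers
        requirements: List[str] = field(default_factory=list)  # e.g. operator LCI requirement codes
        children: List["Tag"] = field(default_factory=list)

    def copy_template(template, project_prefix):
        """Copy a best-practice template structure into a new project."""
        new_root = deepcopy(template)
        def renumber(tag):
            tag.tag_number = project_prefix + "-" + tag.tag_number
            tag.documents = []  # documents are project specific; requirements are reused
            for child in tag.children:
                renumber(child)
        renumber(new_root)
        return new_root

    pump = Tag("PA-001", "Export pump", requirements=["DFO-DATASHEET", "DFI-CERT"])
    skid = Tag("SK-100", "Pump skid", requirements=["DFO-LAYOUT"], children=[pump])
    project_structure = copy_template(skid, "P789")
    print(project_structure.tag_number, project_structure.children[0].tag_number)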

If you want more information regarding the differences between a document-centric and a data-oriented approach, I would recommend Jos Voskuil's blog series on the subject.

Conclusion
The Oil & Gas industry is under pressure to save money and be more efficient. LCI is one of the domains where there is a lot to be learned from other industries. Building the LCI processes into your engineering and project execution processes will greatly reduce the LCI effort. Of course this demands that you have some way to collect, control and consolidate engineering and design data from various sources.
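
As a tiny illustration of what "collect, control and consolidate" can mean in practice, the sketch below merges per-tag attribute values from two sources and flags conflicts for follow-up instead of hiding them. The consolidate helper, the source names and the attributes are hypothetical.

    # Hypothetical sketch: merge tag data from several sources and surface conflicts.
    def consolidate(tag_number, sources):
        """sources: dict of source name -> attribute dict for the same tag."""
        merged, conflicts = {"tag_number": tag_number}, []
        for source, attrs in sources.items():
            for key, value in attrs.items():
                if key in merged and merged[key] != value:
                    conflicts.append((key, merged[key], source, value))
                else:
                    merged[key] = value
        return merged, conflicts

    merged, conflicts = consolidate(
        "PA-001",
        {
            "3D model": {"weight_kg": 420, "material": "SS316"},
            "supplier register": {"weight_kg": 435, "serial_number": "SN-998"},
        },
    )
    print(conflicts)  # [('weight_kg', 420, 'supplier register', 435)]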

Bjørn Fidjeland
www.infuseit.com

Sunday, October 5, 2014

PLM vs ERP – the everlasting trench war or the best PLM ROI?

PLM vs ERP discussions have a tendency to end in trench war between the ERP people and the PLM people. If not open hostility, there is often little understanding of the opponent. It does not have to be like this. PLM and ERP can live together and make the total outcome better if they have found their roles and co-operate.


In this post I am talking about PLM and ERP tools, not the PLM concept covering many applications and the whole lifecycle.

This is one of the everlasting PLM topics. There are a lot of blogs about it, and more will come, as the topic will exist as long as PLM and ERP tools exist. Here are some of them: Beyond PLM, The Virtual Dutchman, Engineering and Engineering.com.

Why is PLM vs ERP a challenge?

Traditionally, PDM and MRP had clear roles: PDM took care of the parts and the BOM from engineering, and MRP took care of purchasing and production planning. This picture is not that clear anymore. PLM and ERP have overlapping functionality and processes, and both try to handle "all product information". PLM stretches into the ERP domain with, for example, sourcing functionality, and ERP stretches into the PLM domain with, for example, CAD integrations.

Another reason is that PLM and ERP are typically owned by different organizations. ERP is owned by the finance people and PLM is owned by the technology people. There are differences in culture, focus and language, and they often have a hard time understanding each other.

One oversimplified difference is that with ERP you optimize what you have (existing, physical products), while with PLM you optimize what you do not have yet (new products, intellectual property).

How to approach this

The challenge is identifying clear, non-overlapping roles for the PLM system and the ERP system in your company. The first step is getting the PLM and ERP people into the same room, getting them to understand each other, and seeing this from a bigger picture: What is best for our company? Can we achieve something better by co-operating and perhaps "giving away" something to the other side?

There are clear benefits if you can get PLM and ERP to work together: less time re-entering information, better data quality, faster change processes, etc. Get out of the trenches and look at the information flow, processes and business needs. Do this without prejudice about where and how to solve it. When you understand what your business needs, you can look at a logical flow: which tools have the best functionality, and which roles are supposed to do the work? Don't make people work in both systems and switch back and forth.

Use a top-down approach: what is best for the company, not "how can I build as much functionality into my tool as possible".

Several of the companies where we have run PLM-ERP workshops with this approach have responded: "This is the first time both camps say they understand each other and agree that together they can really improve things."

A typical scenario

There is no single correct answer to what the PLM-ERP set-up should be. The most common solution, though, is this (a minimal sketch of the transfer step follows the list):
  • The parts and EBOM are born and approved in PLM
  • The EBOM is transferred to ERP when it is approved
  • The EBOM is re-transferred when it is changed and approved again
  • The MBOM is created in ERP based on the EBOM
  • The change process includes manufacturing planning (at least manually)
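
To make the set-up above a bit more tangible, here is a minimal, hypothetical Python sketch of the transfer step: when an EBOM revision is approved in PLM, a flat payload is built and handed to the ERP side. The BomLine fields, the payload format and the send_to_erp call are illustrative assumptions, not any specific system's API.

    # Hypothetical sketch: flatten an approved EBOM revision into a message for ERP.
    from dataclasses import dataclass, asdict
    from typing import List

    @dataclass
    class BomLine:
        parent: str
        child: str
        quantity: float
        position: int

    def build_transfer_payload(part_number, revision, ebom):
        """Build the message that is sent when the EBOM revision is approved."""
        return {
            "type": "EBOM_RELEASE",
            "part": part_number,
            "revision": revision,
            "lines": [asdict(line) for line in ebom],
        }

    ebom = [
        BomLine("CONV-100", "ROLLER-20", 24, 10),
        BomLine("CONV-100", "BELT-5", 1, 20),
    ]
    payload = build_transfer_payload("CONV-100", "B", ebom)
    # send_to_erp(payload)  # hypothetical: e.g. via a message queue or the ERP import interface
    print(payload["type"], payload["revision"], len(payload["lines"]), "lines")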

The next level could be feedback from ERP, e.g. cost, stock, change-implemented status, preferred suppliers, etc.
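
A hedged sketch of what such feedback could look like, assuming the ERP side can report a handful of facts back and the PLM item simply stores them as reference data. The attribute names are made up; the point is that PLM-owned data is not overwritten.

    # Hypothetical sketch: enrich a PLM item with selected facts reported back from ERP.
    def apply_erp_feedback(plm_item, erp_feedback):
        """Copy whitelisted ERP facts onto the PLM item without touching PLM-owned data."""
        allowed = ("standard_cost", "stock_quantity", "change_implemented", "preferred_supplier")
        plm_item.update({k: v for k, v in erp_feedback.items() if k in allowed})
        return plm_item

    item = {"part_number": "CONV-100", "revision": "B"}
    feedback = {"standard_cost": 1250.0, "stock_quantity": 4, "change_implemented": True}
    print(apply_erp_feedback(item, feedback))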

Some companies create and name parts in the ERP system, and others create the MBOM in the PLM system. However, most companies with a working PLM-ERP integration follow the set-up above.

The challenge lies in the business processes, information flow and business rules, not on the technology side.

Typical PLM-ERP integration tasks

When you start looking at a PLM-ERP integration there are some tasks you have to do. Here is a typical list:

  1. Objective & Ambition – What do you want to achieve? What is the business pain?
  2. Processes – Which processes will be covered/changed?
  3. Information Flow – What information will be handled and how should it flow?
  4. Business Rules – What are the rules ensuring data and process consistency?
  5. Data Model – How are the data elements related to each other? Is the data the same in PLM and ERP?
  6. Transfer Mechanisms – What is triggering the transfer? What is transferred?
  7. Data mapping – How do PLM and ERP data map?
  8. Data ownership – Which tool owns which data at which step in the process?
  9. Integration Technology – What integration technology should you use?
  10. Technical Implementation – Programming/configuring the solution

The first eight are business topics; only the last two are technology topics. The challenge lies in the first eight: if you get those right, you will find the last two to be manageable.
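
As an example of how small the technical part can become once the business questions are answered, here is a minimal sketch of task 7 (data mapping) as a declarative table with simple transforms. The PLM attribute names are invented, and the ERP field names only mimic SAP-style material master fields for illustration.

    # Hypothetical sketch: a declarative PLM-to-ERP attribute mapping with simple transforms.
    FIELD_MAP = {
        "part_number": ("MATNR", str.upper),
        "name":        ("MAKTX", lambda v: v[:40]),  # ERP text fields are often length limited
        "uom":         ("MEINS", lambda v: {"pcs": "PCE", "kg": "KG"}.get(v, v)),
        "weight_kg":   ("NTGEW", float),
    }

    def map_item(plm_item):
        """Translate one PLM item into the corresponding ERP fields."""
        erp_item = {}
        for plm_attr, (erp_field, transform) in FIELD_MAP.items():
            if plm_attr in plm_item:
                erp_item[erp_field] = transform(plm_item[plm_attr])
        return erp_item

    print(map_item({"part_number": "conv-100", "name": "Conveyor belt assembly", "uom": "pcs"}))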

Conclusion

PLM-ERP integration is one of the really big benefits you can get from a PLM investment. It will ensure good data quality and reduce the manual effort of entering data.

There is no single right answer to how this integration should look. There are overlapping processes and functionality. Get out of the trenches and find out what is best for your company.

Tore Brathaug
www.infuseit.com

Sunday, August 31, 2014

How do we get PLM right?


How do we get PLM right and who is the PLM expert? Who knows best how it fits the users and the company? Can we get it right without involving the end users?

I'm not claiming to be a user experience (UX) expert of any kind, but I have reflected and learned a few things over the years. One of them is that the best way of really getting it right is by testing it on end-users. And I intentionally expressed this in the plural – you need a group of people to validate the chosen solution. It is arrogant to think that you will get it right just because it says expert (or PLM advisor, as in my case) on your business card.

But a UX expert wouldn't ask an end-user "So, what is it that you want?". To put it in the classic Ford Model T example: if you had asked people what they wanted, they would have said "faster horses", when their actual need was faster transportation. So it is not solutions that one should ask for and seek; it is the underlying problem that you want to solve.

After having talked to the users about their challenges, wishes and dreams, she would bring a suggestion to the table to try out, preferably before and during implementation. That is where the UX expert has her domain: understanding the end-user, suggesting solutions and validating them (which is a craft of its own to master).

So how is it with PLM? Should we consider that the same applies to this domain? That the crowd knows better than the expert? And the expert’s job is to suggest solutions and then validate them?

Who knows the customer best?

Well, I would say it would be strange if it were not the customer itself. The customer, though, might need some assistance in understanding, expressing, describing and documenting their needs, problems and challenges.

Who knows the domain best?

The customer knows how they are doing things, but that is not necessarily the same as saying it is the best way (even for that customer). External "experts" will bring ideas to the table, as they have experience from many different companies, which gives them another view on the topic.

Who knows what to aim for, and what the vision should be?

Is it the ones working in the process day-to-day? Well, you need some perspective on your own work, and exposure to input from others, to set a goal with a trajectory that aims high enough. Don't get me wrong! You will get good initiatives going by listening to the end-users. But you might find them to be "micro-ambitious" – aiming to solve the challenges and evolve the work of a few, but not taking the enterprise view of the PLM agenda. You will also most likely get feedback aimed at solving the problems of today, which leads to reactive behavior rather than a proactive approach that takes your enterprise to the next "level".

Who knows the tools?

There is actually at least one more "expert" in this equation, at least once you talk about IT solutions and not just strategies and processes. The questions above do not take into account that there are constraints in the IT tools which limit your options. This is where the application expert finds his place. The platform/application will create constraints which the expert, whether UX or PLM, has to consider at least to some extent when deciding on solutions.

The other side of the coin is that the application may offer processes or solutions that you did not think about, but which can give short-term improvements with little effort.

Have we found all the “experts” now?

We have a good mix, but there is one more who could spice things up and act as a catalyst: the generalist. In some situations you will benefit from not being an expert in the field – an outside perspective will allow you to see new connections and reframe your situation with expertise and experience from other industries and business domains.

An interesting take on this is whether you are looking for someone who should help you solve a problem or find a problem. To truly take your PLM vision and strategy to another level it’s not enough to solve a known problem; you need to find problems that you didn’t know you have. Who is most suited to guide you in that quest?

But what happened to the crowd/the end-user?

They are there, because whether you are implementing processes or tools to support them, the end-user will still be the judge of whether it fits her needs and works in the reality of her daily work.

Conclusion

Implementing PLM (and knowing how and what to do) is not a one-man job. You need to ensure that your team is multi-disciplinary, with both broad and deep knowledge of the domain, the business and the technology. I would also emphasize that you need real end-user involvement to make sure that you, in the end, get their acceptance.

I leave you with a quote that stuck in my mind after a conversation with a colleague:

"A 'newbie' will look for evidence to guide her in the path to the most appropriate actions to a greater extent than an expert would. Therefore she will win in the long run."

What if you had a team of experienced people who use the tools and mindset of the newbie?

Robert Wallerblad