Sunday, December 15, 2013

PLM for plant design projects?

When I started out on my PLM (Product Lifecycle Management) journey more than a decade ago it was hard for me, coming originally from oil & gas, to understand the difference between the functional decomposition (tag structure) used in multi-discipline plant design and the EBOM (Engineering Bill Of Material) used in the traditional manufacturing industry. It was easy, however, to see the language differences and the fact that PLM focused on the BOM and serial production.

Having worked with PLM in different industries, it is now clear to me that PLM can play a significant role in project-focused industries such as plant design. What PLM systems are really good at is creating a globally accessible information backbone, connecting this information to processes, and connecting the processes to people, ensuring that the right information is there at the right time so that people can make decisions based on the latest approved “version of the truth”.

Using a PLM system can offer many benefits over the traditional way of running plant design projects, where information and experience tend to be locked into different information repositories and systems in each project. Each project has its own IT infrastructure and setup of tools, and it becomes very hard to reuse knowledge, processes and parts of the design from one project to the next.
If all design information were in a PLM system, an EPC could consolidate all the plant design disciplines, connect the design information as deliverables in the project plan and monitor the project's progress in real time. Change management processes could be enforced, design templates could be used, partial designs could be copied from one project to another, and equipment requirements could be selected from a catalog, all of it across projects. These are just a few of the benefits that could be harvested.

So why hasn't this been done long ago?
Well, in my view there are two main factors.

One is that there have not been sufficient drivers to do so in the past. The primary drivers for changing from a project-centric approach to one that promotes reuse, harmonization of processes and standardization are cost pressure and competition. We've seen this in every industry that has adopted PLM: it's not until margins really suffer that PLM gains any real traction. At the moment, as a result of these factors, we see a growing interest in PLM among several large EPCs.

The other factor is the actual design process and the tools used. They differ quite a lot between plant and product design.
The difference in process, however, is not the main challenge. There are several PLM systems that are flexible enough to allow such processes to be created.

The main challenge lies in the fact that for a PLM system to work effectively, all the design information must flow from the design tools into the PLM platform. Not many PLM systems today have good integrations with plant design tools, so it is difficult to leverage all the benefits that a PLM system could bring. Product design is quite different, since almost all PLM systems have integrations with most product design tools.

So am I saying that PLM should not handle plant design projects?

Actually no, because there is one other development in the industry that has led to a golden opportunity to get all the needed information into PLM systems and start harvesting the benefits.
In recent years there has been a lot of pressure, especially from Owner/Operators, to standardize information exchange between all parts of the value chain: from the Owner/Operator, through the EPCs, to the product companies.
The need for the EPC to consolidate internal disciplines, but also to communicate with outside parties such as product companies, and especially to support the big handover of information from EPC to Owner/Operator after installation and commissioning, has led to a lot of focus on standardizing information exchange. One such example is ISO 15926 and the preferred exchange format XMpLant. Most plant design authoring tools claim to support this standard today. In my view, this is the key to bringing all of the engineering information under process control in a PLM system. From there it can be shared with internal parties, but also with external ones, like the Owner/Operator or product companies.
Essentially, this is killing two birds with one stone.
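To make the idea concrete, here is a minimal sketch of what "bringing engineering information under process control" via a neutral exchange file could look like. Note that the element and attribute names below are purely hypothetical illustrations, not the actual XMpLant/ISO 15926 schema; the point is only the pattern of flattening a tag-structured export into records a PLM platform could manage.

```python
import xml.etree.ElementTree as ET

# Hypothetical neutral exchange snippet (NOT real XMpLant markup).
SAMPLE = """
<PlantModel>
  <Equipment tag="P-101" type="CentrifugalPump">
    <Property name="DesignPressure" value="16" unit="bar"/>
  </Equipment>
  <Equipment tag="V-201" type="Vessel">
    <Property name="Volume" value="12" unit="m3"/>
  </Equipment>
</PlantModel>
"""

def load_equipment(xml_text):
    """Flatten a neutral exchange file into tag-keyed records that a
    PLM platform could import as managed items."""
    root = ET.fromstring(xml_text)
    items = {}
    for eq in root.iter("Equipment"):
        # Collect properties as name -> (value, unit) pairs.
        props = {p.get("name"): (p.get("value"), p.get("unit"))
                 for p in eq.iter("Property")}
        items[eq.get("tag")] = {"type": eq.get("type"), "properties": props}
    return items

items = load_equipment(SAMPLE)
print(items["P-101"]["type"])  # CentrifugalPump
```

Once the design tools can all emit such a neutral format, the PLM system only needs one importer per standard rather than one per authoring tool.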

What is my conclusion?
PLM systems “out of the box” are not suited to support plant design projects. However, PLM systems are very much suited as a platform for consolidating plant design information and processes across all disciplines in a plant design project, provided that they support information exchange in a standardized manner such as with ISO 15926 and XMpLant.
Such a PLM system would solve two major headaches for the EPCs: firstly, the internal consolidation and follow-up of the internal disciplines, and secondly, the ability to exchange information with other stakeholders in the project in a standardized manner.
Those PLM platforms that can scale well enough, support standardized information exchange and handle the sheer amount of information involved in running multiple plant design projects could have a bright future in this domain.


Saturday, November 16, 2013

Doing Things the Right Way?

In a previous blog post, titled Doing the Right Thing, the question was whether we would add value by shifting focus away from a process-centric approach when implementing a PLM tool. Today it's time to look at the other side of that coin: why would you want to focus on doing things the right way, or in other words being process-centric? And to repeat: the main point here is not to dwell on whether we need business processes or not, it's about whether to create IT support for them in a PLM system.

First of all, let's be frank; it's not necessarily the right way we are talking about, it's more a way; a way that has been chosen as the "right one". The criteria for being the right one could be many: what we know to be the best way at this point in time, a way prescribed by a standard or required to become compliant, or simply something that the stakeholders could agree upon.

So why would we want to implement processes into an IT tool?
  • Compliance with different standards and regulations is a huge driving force. Being able to enforce, and show auditors, that you are following the directives prescribed by a governing body is sometimes a good enough argument. Having change management, review processes, approval processes, follow-ups, document management, history records, etc. supported by an IT tool makes life easier (even if your business isn't regulated), at least if you have the right IT support.
  • Consolidating and uniting companies, subsidiaries, sites, etc. in their way of working is quite often a requirement for initiatives such as joint manufacturing facilities, sourcing, procurement, eCommerce, PIM, or effective after-market support. But I would dare to say that in most cases it's more about securing the data than the processes. Put the cards on the table, and you will probably find that besides the initiatives listed above, the motive can be found among the softer aspects of implementing IT support for a process: unification across borders through enforcement, thereby creating the notion of one company.
  • Complex development processes which require consolidation and exchange of information across multiple disciplines and organizations will also benefit greatly from harmonized data and processes, using a system to support their tasks.
  • Scalable "know-how", as it's captured in the tool, allowing it to guide people in their work. Phasing in new employees will thereby be easier if there is process support guiding and enforcing the way they should work.
  • Improved and enforced quality, as you get control and are able to guide users through the company's working procedures.
  • Automation of process steps becomes much easier once we have a repetitive flow of events.
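As a small illustration of the last two points, here is a minimal sketch (names and states are my own, not from any particular PLM product) of an enforced change-management flow. Because the allowed transitions are explicit, every step is both auditable for compliance and easy to automate.

```python
# A minimal, hypothetical change-approval state machine:
# allowed transitions are explicit, so the flow can be enforced,
# audited and automated.
TRANSITIONS = {
    "Draft":     {"submit": "In Review"},
    "In Review": {"approve": "Approved", "reject": "Draft"},
    "Approved":  {"release": "Released"},
}

class ChangeOrder:
    def __init__(self, name):
        self.name = name
        self.state = "Draft"
        self.history = []  # audit trail for compliance

    def apply(self, action, user):
        allowed = TRANSITIONS.get(self.state, {})
        if action not in allowed:
            # Enforcement: work outside the defined process is rejected.
            raise ValueError(f"'{action}' not allowed in state '{self.state}'")
        self.history.append((self.state, action, user))
        self.state = allowed[action]

co = ChangeOrder("ECO-042")
co.apply("submit", "alice")
co.apply("approve", "bob")
print(co.state)  # Approved
```

The history list is the kind of record that makes it possible to show an auditor exactly who did what, and when.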
So as you can see, there is no "that is the question" as in Hamlet's monologue. It all depends. There are advantages to both approaches. Depending on your needs (some could perhaps be more justified than others ;) you could see the advantages of getting a PLM system to support your processes. But don't take it as something that you have to do! If you use the approach from the previous post as a "reflex standpoint" (don't tie down your IT system with intricate rules to support an ever-changing process), then I bet you will get a better Total Cost of Ownership (TCO) in the end and a happier customer (the end user).

Robert Wallerblad

Sunday, October 20, 2013

Doing the Right Thing?


This post is inspired by a definition found in the comments that followed Oleg's post, and by a discussion with a friend of mine who is responsible for PLM at a small car manufacturing company.

First, let's apply the definition of processes and practices that Michael Grieves uses:

“Processes are well defined routines that give organizations the outcomes they desire. However, a great deal of what companies do, such as innovation, is driven by a desired end result, which is what practices are all about.”
You could also rephrase this as "processes are about doing things the right way", dictating how work should be done. Focusing on the what instead, we would allow decisions by knowledgeable workers and IT automation to determine much of the how.

To generalize, traditional PLM implementations are focused on data capture and processes (according to the above definition). Capturing data in a common repository enables transparency and access, and makes it possible to connect the dots (data) in new ways. Usually, we then cover the data with a process layer, targeted at effective and repetitive management of that data.

But what if we were to focus on supporting “doing the right thing” instead? What would that mean? For me, it would mean focusing our effort on data and on functions to manipulate, communicate and collaborate around it. Another critical role for a PLM tool would be to have capabilities to analyze that data, helping workers make informed decisions that would allow them to “do the right thing”.

Are we on to something? What about "best practice" processes? Could we say that we can deliver value without them? Can we do this without being banged on the head by sales and marketing saying that we can't sell without out-of-the-box process support?

What if we state that "best practices" live at the data level, basically in how we choose to model reality in a smart way? But to avoid becoming merely a consolidating integration hub, we would have to have tools to manage that data accordingly. With decoupled features we then address "best practices" of managing data without "tying it down" with processes.


To avoid "just" becoming a good data management tool, we would need to wrap the whole thing with communication and analytics capabilities, allowing the user to collaborate and make “the right” decisions.


What would we have then?
- A replacement for mail, Excel and other data silos, which was exactly what my friend at the car company was asking for: a tool that would still allow him the required flexibility, but with transparency and the synergy of previously disconnected data sets. This would give him the means to reach the desired result in a perhaps unstructured way, but secured via data and supported through communication and visual analytics. Wouldn't that add value? Wouldn't that create an innovation pull, as we then focus on "doing the right thing" rather than paving complicated cow paths?

Robert Wallerblad

Sunday, September 22, 2013

Looking beyond the Social Hype of Enterprise Collaboration Tools

When starting a discussion about enterprise collaboration tools, the “visionary” individual in the group says: “I want Facebook for me and my colleagues”. As Ed Lopategui and Oleg Shilovitsky pointed out in their blogs last weekend, Facebook and social platforms are probably not exactly what she should be looking for. So instead of being fooled by the allure of marketing promises from social enterprise tools, what should she be looking for?
What Facebook and most other social applications provide is not only real-time chat but something that is asynchronous and persistent, and doesn't require that someone actually receives the message at the point in time when it is sent. Given the global environment in which companies work today, this is a “must have”, as time zones would otherwise be too much of an obstacle.
Many collaboration tools come from vendors already offering CAD, ERP, CRM, PLM, etc., but these are primarily focused on the users and practices of that specific system. They are basically not user-centric! If you take the user's standpoint in such a solution, you easily see that it allows for cool collaboration, but only within the island of that application ... and for the end user that is not enough: how many live in a one-app or even a one-suite world? So we need something that crosses the application barriers and can work standalone or integrated, depending on need.
Moving towards a “neutral” collaboration platform would also be a step forward in enabling a much-discussed part of enterprise collaboration these days: collaboration with suppliers. Not only would we have the “normally” implemented structured exchange of information in place, it would also enable us to support the more unstructured and ad-hoc part of it, moving away from mail and creating a more transparent way of communicating.
This takes me to the next point: transparency. When I say transparency I don't necessarily mean that everything is accessible to anybody. But you can share and you can collaborate across disciplines and multiple users without having to cc each other as you do with email! This is a key feature and required if we want to break out of the silos that email creates.
So what about mail then? Well, I wouldn't be so bold as to say that mail will disappear in the coming years, but we could definitely fight to reduce the volume of messages going back and forth. With good hooks and APIs to import and generate emails, and to plug in and expose features in other tools and portals, I bet we could give it a good fight too ;)
Another thing about email is that it allows the user to structure her things in her own way. Creating folders and tasks are features that allow the user to create her own world and semantics, letting her find and organize information her way.
The thing that neither Facebook nor email has is context: the user is the context in these applications. To bring the features of email and Facebook into enterprise collaboration we need to hook them into an item context. And that is exactly what the vendors of CAD, ERP, CRM and PLM tools offer: the right data context! But only from their application's point of view! Users work within processes supported by multiple applications. In reality, “the context” is something moving across multiple applications and across functional domains, and we need to treat the communication the same way.
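The idea of an item context that moves across applications can be sketched as a data structure. This is a hypothetical illustration (class and field names are my own): conversations are keyed by an item identifier rather than by a per-user inbox, so anyone with access to the item sees the whole discussion without being cc'd, regardless of which tool each message was posted from.

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class Message:
    author: str
    text: str
    source_app: str  # which tool the message was posted from

class ItemContextFeed:
    """One transparent conversation thread per item, shared across
    the applications that touch that item (hypothetical sketch)."""

    def __init__(self):
        self._threads = defaultdict(list)  # item id -> list of messages

    def post(self, item_id, author, text, source_app):
        self._threads[item_id].append(Message(author, text, source_app))

    def thread(self, item_id):
        # The same thread is visible no matter which tool posted to it.
        return list(self._threads[item_id])

feed = ItemContextFeed()
feed.post("PART-1001", "eva", "Tolerance updated to ±0.05", "CAD")
feed.post("PART-1001", "liam", "Supplier confirmed new spec", "ERP")
print(len(feed.thread("PART-1001")))  # 2
```

The key design choice is the key of the dictionary: the item, not the user, is the unit of context.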
Tying together most of the features implied above are accessibility and searchability: for the communication, collaboration, transparency, email and context to be worth anything, you really need to be able to find stuff.
So now I have stated some features I consider important when thinking about collaboration tools. Although good, they bring some softer challenges to the table, which might be easier to mitigate in current collaboration solutions.
One challenge is to do the right thing in the right place and in the right forum ... and still make it understandable to the user.
  • Data in the “right” place - will we create a mess and store important information in too unstructured a way?
  • Decisions in the “right” place - will decisions become invisible, and will we lose the formality required for some decisions? Statuses, do we need them anymore ;)
Then we have an obvious challenge: getting people to move from a silo mail-world to a world that is no longer under “my” control. Transparency is scary. There is no way around that.
So to summarize: Facebook is probably not the way forward for enterprise collaboration tools, and there are some challenges to address on both a technical and a human level with today's collaborative tools. One thing is for sure: even if we re-label "the solution" from time to time, the need will still be there. The problem is when the label brings assumptions to the table that are not suited for an enterprise collaboration tool. I'm sure that we will get past the current misconceptions, and as collaboration and digital technologies advance, they will allow us to solve our challenges through new influences and tools.

Robert Wallerblad

Friday, August 30, 2013

The Achilles' Heel of Data Modeling Standards

The benefit of a data standard is obvious: being able to share and exchange information and integrate business processes in both early and late stages of a product's lifecycle is a tempting thought. It also addresses challenges that are high on many executives' agendas.

Taking standards into an everyday perspective, you see their benefit in, for example, electrical sockets. They are also a good example of how hard it is to unite around something, as different standards exist across the globe. The biggest advantage of a standard is not that it is optimized or perfect; it is that it unites.
Data modeling standards have one Achilles' heel: one has to map the client's business models into the standard model in a consistent way. That is actually the purpose of the standard: the same thing should be modeled in the same way, independent of source or organization. This is key to the success of both the standard itself and its actual implementation at a customer.
So let’s try to digest that statement.
If we have a standard which semantically covers the complete definition of a certain thing, leaving no room for flexibility, there shouldn't be any problem, right? This requires that the area of the standard is mature. An example of such a standard would be STEP's AP214/AP203 (ISO 10303), which covers CAD geometry.
But what happens if you look at standards which are more open to interpretation, such as PLCS (ISO 10303-239) or ISO 15926? If a standard is open to interpretation it can be "misused", missing the goal of the implementation completely, since the intent of the standard is to enable data exchange through a neutral format. If the same thing is modeled differently, we suddenly find ourselves in the normal integration trap, with multiple mappings needed to get to a consistent, homogeneous and coherent information set. Standards like these basically require that you either have a developed agreement on how to apply them for a certain item and business process (in PLCS's case, a DEX), or that you use people who more or less embody the standard and know by experience how it should be done. An alternative is to use a more industry-specific standard, which is thereby more semantically meaningful and less "error prone".
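The integration trap is easy to show with a toy example. The data shapes below are entirely made up, not any real standard's serialization: two organizations export the same fact (the design pressure of pump P-101) to a "neutral" format, but model it differently, so a consumer still needs a source-specific normalization rule for each of them.

```python
# Hypothetical exports: the same fact, modeled two different ways.
export_a = {"Pump.P-101": {"design_pressure_bar": 16}}
export_b = {"P-101": {"DesignPressure": {"value": 1.6, "unit": "MPa"}}}

def normalize_a(data):
    # Source A embeds the class in the key and fixes the unit to bar.
    return {tag.split(".")[-1]: rec["design_pressure_bar"]
            for tag, rec in data.items()}

def normalize_b(data):
    # Source B nests value and unit; convert to bar (1 MPa = 10 bar).
    out = {}
    for tag, rec in data.items():
        p = rec["DesignPressure"]
        out[tag] = p["value"] * 10 if p["unit"] == "MPa" else p["value"]
    return out

# The two sources agree only after per-source mappings are applied,
# which is exactly the multiple-mapping burden a strict standard
# (or an agreed usage profile) is supposed to remove.
print(normalize_a(export_a) == normalize_b(export_b))  # True
```

With a strictly specified standard, or an agreed usage profile such as a PLCS DEX, both sources would have emitted the same shape and neither normalization function would be needed.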
Taking the road towards standardized data exchange is an easy pick if you do it in a mature area, such as CAD geometry, which has a stable foundation in a standard. But when it comes to jumping on (standard) trains less traveled, one should really understand the potential pitfalls in the specific area one is looking to standardize: the maturity of the standard and its usage, and the reasoning behind both the standard and your own initiative.

Robert Wallerblad