
Sunday, October 3, 2010

#PLM > The mysteries of the ETO industry

Have you ever wondered how an elevator is made? There is no standard elevator. They all have different sizes, serve buildings of different heights, and have different speeds and designs...
In a world that asks for more standardization, how can a company reduce its engineering costs on products that are never the same? And when maintenance is performed, how can you manage the history of changes made to an elevator?

This is the problem of every Engineering To Order (ETO) industry. You can tell me that Aerospace is an ETO industry, and that Shipbuilding is too, but their volumes are too low for them to face exactly the same problems (for the moment).

There are three main processes that I propose to look at in future posts:
  • Management of standard products and their "standard options" (what elevator models does the company offer, and what are the "catalog options"?)
  • Management of orders and their specific requirements (when I receive an order, what are my customer's "specific requirements" that I will need to take into account?)
  • Management of on-site services (how do I manage changes to an existing installation, whether I installed it or someone else did?)
We will look closely at those three processes through the lens of the broadest PLM approach possible: how to manage data, schedules, design definition, configuration definition... We will then see where the differences lie with industries where PLM is already well implemented (High Tech, Aerospace,...) and what challenges (technical, but not only) a PLM implementation implies for such industries.
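To give an idea of how these three views hang together, here is a toy sketch (the elevator model, options and service entries are all invented for illustration) of a standard product with its catalog options, an order carrying customer-specific requirements, and the installed-base record that holds the on-site change history:

```python
# Toy data model for the three ETO processes above; everything here is invented.
from dataclasses import dataclass, field

@dataclass
class StandardProduct:
    model: str
    catalog_options: dict[str, list[str]]   # option family -> allowed values

@dataclass
class Order:
    order_id: str
    model: str
    chosen_options: dict[str, str]
    specific_requirements: list[str] = field(default_factory=list)

@dataclass
class Installation:
    site: str
    order_id: str
    change_history: list[str] = field(default_factory=list)   # on-site service log

catalog = StandardProduct("ELEV-MR-2000",
                          {"speed": ["1.0 m/s", "1.6 m/s"], "doors": ["center", "telescopic"]})
order = Order("ORD-4711", catalog.model,
              chosen_options={"speed": "1.6 m/s", "doors": "center"},
              specific_requirements=["non-standard shaft depth: 1350 mm"])
site = Installation("Main Street Tower", order.order_id,
                    change_history=["2010-09: controller board replaced by third party"])

print(order.specific_requirements, site.change_history)
```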

So talk to you soon.

Saturday, June 26, 2010

#Cloud > What is the next generation of CAD?

Commoditisation
Let's talk first about the commoditisation process. Commoditisation is the process by which an innovation (Computer Aided Design) goes from a bespoke product to a packaged product (CATIA, NX, Pro/E,...), and eventually to a commodity. Like electricity.


Electricity... I like this analogy. Who still buys their own generator to produce their own electricity? When you need electricity, what do you do? Do you go to the grocery store and buy some? Or do you just plug your computer into the grid?

I firmly believe that this is what is going to happen with CAD and, eventually, 3D design as a whole. It is just a matter of time. Some PLM companies have started their "Cloud Computing Strategy"... Dassault Systemes has launched 3DVIA Shape: you can design 3D models and save them directly on the website's servers. Is that cloud computing? Only partially, because most of the computing is still done on your own computer. It's not even SaaS, because the only service offered is storage, not the whole software (I hope one day I'll be able to explain why SaaS and cloud computing are not the same thing). So what would the next steps be?

The Technology

Let's take the customer's point of view. As a large industrial company, I have thousands of machines that I have used for CAD over the years... and I do not know how to leverage all that "dead" hardware. CAD becomes more and more complex and requires more and more resources, and therefore more powerful desktops for designers. As an IT manager, I am tired of planning hardware and software upgrades for each computer.
What would life be like if, instead of using the computing resources of a single computer, I could leverage the computing resources of my data center?
Today, the technology is here, and it is surprisingly simple.
To reduce my local computing needs to a minimum, I could simply receive a video stream from a server. You'd connect to the server via a seamless virtual instance that runs your CAD application; the server takes care of the rest: computation, graphics acceleration,...
And for companies that do not have data centers? They can go to a public one, provided the data is secure and persistent.
For companies the benefits would be huge: share their licences, and make the most of existing hardware (if the only thing your computer has to do is display a video stream and interpret mouse clicks, a simple browser can do the job! Unix, Windows, HP... any computer with an Ethernet port could do it).
Migrations and upgrades would be simplified, or even become seamless on a public cloud controlled by the PLM vendor: you just have to perform them once in the data center and replicate the virtual machines...
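To make this concrete, here is a minimal sketch of what such a thin client boils down to: forward input events, receive rendered frames, and leave everything else to the data center. The server address and the length-prefixed protocol are invented for illustration; a real remote-CAD product would use its own optimized streaming protocol.

```python
import json
import socket
import struct

# Hypothetical endpoint of a rendering server in the data center.
SERVER = ("cad-render.example.com", 9100)

def send_event(sock, event):
    """Send one input event (mouse/keyboard) as a length-prefixed JSON message."""
    payload = json.dumps(event).encode("utf-8")
    sock.sendall(struct.pack("!I", len(payload)) + payload)

def recv_frame(sock):
    """Receive one length-prefixed, already-encoded video frame from the server."""
    header = sock.recv(4)
    if len(header) < 4:
        return None
    (size,) = struct.unpack("!I", header)
    frame = b""
    while len(frame) < size:
        chunk = sock.recv(size - len(frame))
        if not chunk:
            return None
        frame += chunk
    return frame

def main():
    # The thin client only forwards input and displays frames; all CAD
    # computation and graphics acceleration happen server-side.
    with socket.create_connection(SERVER) as sock:
        send_event(sock, {"type": "mouse_move", "x": 120, "y": 340})
        frame = recv_frame(sock)
        if frame is not None:
            print(f"received {len(frame)} bytes of rendered frame to display")

if __name__ == "__main__":
    main()
```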

The Business Model
But then comes the money. How can one monetize this approach?
Think Amazon. Amazon sells computing resources. Dassault, Siemens and PTC could do the same... (or they could buy computing resources from Amazon and resell them). How does Amazon bill its customers? By the resources used for computing and by the resources used for storage (to which a vendor would add the CAD value).
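Billing would then look like a metered utility. Here is a toy sketch, with entirely made-up rates, of what a usage-based monthly bill could look like:

```python
# Hypothetical usage-based pricing for "CAD from the grid": the rates and the
# pricing structure below are invented for illustration, not any vendor's offer.
COMPUTE_RATE_PER_HOUR = 1.20   # rendering/compute time on the provider's servers
STORAGE_RATE_PER_GB = 0.10     # monthly storage of models and session data
CAD_PREMIUM_PER_HOUR = 3.00    # the CAD vendor's added value on top of raw compute

def monthly_bill(compute_hours: float, storage_gb: float) -> float:
    """Return the monthly cost of a designer's usage under this toy model."""
    infrastructure = compute_hours * COMPUTE_RATE_PER_HOUR + storage_gb * STORAGE_RATE_PER_GB
    cad_value = compute_hours * CAD_PREMIUM_PER_HOUR
    return round(infrastructure + cad_value, 2)

# A designer working ~140 hours a month with 50 GB of models:
print(monthly_bill(compute_hours=140, storage_gb=50))  # 593.0
```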
I honestly do not know whether there is more or less money to be made for the vendors (probably less), but the impact on the market will be huge for the first one to make the jump. And if it is not a PLM vendor, it could well be one of their partners. What would prevent a partner from buying computing resources from Amazon, buying concurrent licences from PTC and leveraging them 24/7 with such an approach? Nothing.
The video game industry has already started. Check OnLive.com. And OnLive is not a video game software company. This is no different from what I am describing, and video games have computing resource needs similar to CAD's.
As I said, it is just a matter of time. Whoever makes the jump first, the others had better be ready, because it will be a significant earthquake in the PLM industry.


Friday, June 25, 2010

#Cloud > What Google Apps is missing...

I was browsing through the Google Apps Marketplace and was amazed by the ideas people are developing on the Google SDK platform. I really think this swarm of ideas and concepts is beneficial to everybody. But when you address a company, it becomes messy.

Why?
  • Companies are looking to reduce their TCO. A large part of the TCO of an information system comes from the integrations between legacy systems.
  • Integrations are needed because applications do not speak the same language and do not manipulate the same concepts (homework: ask an ERP consultant what a BOM is, then ask a PLM consultant, then make them agree on a common definition; a small illustration follows this list).
  • In the Google Apps Marketplace, there are several applications like process factory, myERP.com or Zoho CRM, but they cannot talk to each other.
  • Each has its own notion of process, but in a company all processes are related to each other (otherwise the PLM and ERP footprints would not be so big!).
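To make the homework concrete, here is a toy illustration (all part numbers and quantities are invented) of how a PLM system and an ERP system can describe the very same product with different structures and even different components:

```python
# The same pump, as a PLM system (engineering view) and an ERP system
# (manufacturing view) might each see it. Everything here is invented.

# Engineering BOM: organised by design function, one level per subsystem.
ebom = {
    "PUMP-100": {
        "HYDRAULIC-MODULE": ["IMPELLER-A", "VOLUTE-A", "SEAL-KIT"],
        "DRIVE-MODULE": ["MOTOR-2KW", "COUPLING"],
    }
}

# Manufacturing BOM: organised by how the plant actually builds and buys,
# with design modules regrouped by assembly station and consumables added.
mbom = {
    "PUMP-100": {
        "ASSY-STATION-1": ["VOLUTE-A", "IMPELLER-A", "SEAL-KIT", "GREASE-50ML"],
        "ASSY-STATION-2": ["MOTOR-2KW", "COUPLING", "PAINT-RAL5010"],
    }
}

def leaf_parts(bom: dict) -> set:
    """Flatten a two-level BOM into its set of leaf components."""
    return {part for product in bom.values()
                 for group in product.values()
                 for part in group}

# The two systems do not even agree on which components exist:
print(leaf_parts(mbom) - leaf_parts(ebom))  # {'GREASE-50ML', 'PAINT-RAL5010'}
```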
What is going to happen is that companies will start to make the same mistakes. They might find myERP and Zoho CRM very interesting and develop web services to make them communicate... Integration.

I think there are two ways to prevent that from happening:
  • A process management API, connected to the Google APIs and developed by a third party. It must become the standard for workflow definition. It must be flexible, scalable and unified. The problem is how to make it a standard...
  • This brings me to the second idea: Google has the power to define the standard for data and information exchange along those workflows. A rough sketch of what such a shared definition could look like follows.
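Purely as an illustration (the applications, field names and statuses below are examples, not an existing Google API), a shared definition could be as simple as a common workflow object that every marketplace application reads and updates:

```python
# A rough sketch of a vendor-neutral workflow definition, so that two marketplace
# apps (say a CRM and an ERP) pass the same object around instead of building
# point-to-point integrations. All names are invented.
from dataclasses import dataclass, field

@dataclass
class WorkflowStep:
    name: str
    owner_app: str           # which application is responsible for this step
    status: str = "pending"  # pending -> in_progress -> done

@dataclass
class Workflow:
    business_object: str     # e.g. "sales_order", understood by every app
    object_id: str
    steps: list[WorkflowStep] = field(default_factory=list)

    def handoff(self) -> WorkflowStep | None:
        """Return the next pending step, whichever application owns it."""
        return next((s for s in self.steps if s.status == "pending"), None)

# A quote-to-cash process crossing application boundaries without custom glue:
order = Workflow("sales_order", "SO-2010-042", [
    WorkflowStep("qualify_opportunity", owner_app="Zoho CRM", status="done"),
    WorkflowStep("create_invoice", owner_app="myERP"),
    WorkflowStep("schedule_delivery", owner_app="myERP"),
])
print(order.handoff())  # WorkflowStep(name='create_invoice', owner_app='myERP', status='pending')
```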
The benefit for Google would be twofold:
  • Significantly increase the number of compatible applications while ensuring the transparency that is so hard to get from young cloud companies (at least you would be certain that the API will be supported for a long time; with a financially unstable company, what happens when it goes bankrupt?). For this reason it has to be open sourced.
  • Increase Google's credibility in business software and allow the cloud to enter industrial companies (because despite everything we see on the internet, cloud adoption is still low in industry).
But maybe such a solution already exists...

Thursday, January 21, 2010

#PLM > Reboot

It has been a while since I last wrote here... I started this blog in a period when I had lots of free time, but things have changed a lot since then.

I have been involved in the most exciting project I have taken part in for years. It is challenging everything I knew about product development, the tools I implement, and the best practices I have been pushing to my customers for years.

I have learned so much in the last six months that I have to capture it somehow. Rebooting this blog seemed like the best idea.

My current customer has an Engineering To Order (ETO) business model with very complex and very large products. What does ETO mean?
It means that the product is ordered before it is engineered. A bit like buying a house off plan: you know you will have a bathroom, a kitchen, an entrance hall, a heating system, maybe a garage for one car or two. The architect designs its shape, and then you buy it. At this stage nobody knows which parts will be used for the heating system, but you know you will have one.

First, I would like to refer to the excellent article from Jos about BOMs in ETO, and to list the topics I would like to discuss in upcoming posts:
  • System Engineering design
  • Transition from EBOM to MBOM in complex products
  • Work schedule and resource allocation
  • Role of configuration
  • Design constraints for ETO businesses
  • ...
These will not come in any particular order, but as my thoughts on how to organize them mature.

Friday, June 26, 2009

#PLM > Managing Projects with PLM

PLM was brought to the market by CAD vendors. By extension, it became not only the management of product data but also the management of the processes that drive the product definition.

Some will use Microsoft Project (or similar tools like PSNext), some will prefer Excel (the most extreme case I have seen is managing project plans with PowerPoint presentations...).
But a project process does not involve only the engineers: Marketing, Sales, Suppliers, Quality,... every department of the company can be concerned.

How would PLM help?

1 - Centralize the Information
"The single version of the truth"... the motto of the PLM consultant. This is the first key: find the right information, easily.

2 - Gather Project Documents
Your PLM implementation should give the project team a workspace, and access to this workspace should be controllable. Globally distributed teams love that...

3 - Manage Tasks and People
Of course, you need to be able to define tasks. But beyond definition, you should be able to distribute the work and follow the progress of each task. How? Not by simply giving an approximate percentage of completion, but with a standardized life cycle (e.g. In Work -> Review -> Complete).
Assigning people to each task and identifying who is responsible will give you a smoother process.
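As an illustration, here is a minimal sketch of such a standardized life cycle with assignment and responsibility; the states, names and rules are examples, not any particular vendor's data model.

```python
# Minimal sketch of a standardized task life cycle with a sign-off rule.
from dataclasses import dataclass

LIFE_CYCLE = ["In Work", "Review", "Complete"]

@dataclass
class Task:
    name: str
    assignee: str            # who does the work
    responsible: str         # who signs off at the Review gate
    state: str = "In Work"

    def promote(self, actor: str) -> None:
        """Move the task to the next state; only the responsible can close it."""
        next_state = LIFE_CYCLE[LIFE_CYCLE.index(self.state) + 1]
        if next_state == "Complete" and actor != self.responsible:
            raise PermissionError(f"{actor} cannot complete {self.name}")
        self.state = next_state

task = Task("Draft heating layout", assignee="designer1", responsible="lead_eng")
task.promote("designer1")   # In Work -> Review
task.promote("lead_eng")    # Review -> Complete
print(task.state)           # Complete
```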

4 - Manage Deliverables
In a company I see three types of documents:
  • Product Documents: valid for the whole life of the product (specifications, user guides,...)
  • Project Documents: valid only during the project phase (expenses, project finances,...)
  • Enterprise Documents: separate from the rest, they define the way the company operates (quality manuals, process documents, standards,...)
Deliverables are of the first two kinds. Product Documents are related to the product structure, Project Documents to the workspace structure, but both need to be attached to the task in order to facilitate reviews and document searches.
This is exactly what the usual tools cannot do.
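As a small illustration of this classification, here is a sketch (titles and attributes are invented) showing how deliverables can be filed against a product structure or a project workspace, yet still be retrieved through the task they belong to:

```python
# Sketch of the three document types and of deliverables attached to tasks.
from dataclasses import dataclass, field
from typing import Literal

DocKind = Literal["product", "project", "enterprise"]

@dataclass
class Document:
    title: str
    kind: DocKind
    attached_to: str          # product structure node or project workspace folder
    tasks: list[str] = field(default_factory=list)   # deliverable of these tasks

docs = [
    Document("Pump specification", "product", attached_to="PUMP-100", tasks=["Write spec"]),
    Document("Q3 project budget", "project", attached_to="Workspace/Finance", tasks=["Budget review"]),
    Document("Welding standard", "enterprise", attached_to="Quality manual"),
]

def deliverables_for(task: str) -> list[str]:
    """Everything a reviewer needs for a task, regardless of where it is filed."""
    return [d.title for d in docs if task in d.tasks]

print(deliverables_for("Budget review"))  # ['Q3 project budget']
```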

5 - Manage the Risks
Managing a project without strong risk management... is a big risk :). Keeping risk management very close to your project gives you a better identification of what may go wrong, and ensures that risk mitigation is tracked.

6 - Review Dashboards
With all the information in one system (or several, as long as your PLM implementation gets the information it needs), you can very easily build consolidated dashboards.

I think these six points justify the need for a centralized data system. Of course you can extend that view to project quality (Six Sigma), project finances, resource management, actual hours entry, project templates... but that's another story.

Project management is a great entry point for a PLM approach: because it involves everyone, and because you can limit the change, and therefore the reluctance to change.

Wednesday, June 24, 2009

#PLM > An ideal team?

Yesterday, during a meeting with a customer, their representative asked a very interesting question:
"What profiles do you recommend for my team in order to increase the success of the implementation?"
It's not often that a customer asks this kind of question, and I was more than happy to answer.

1 - The project manager
On top of classical project management skills, the project manager must know the company pretty well. It is key that the project manager has a global understanding of the effects of a PLM implementation on his company. He must also have a knowledge of the tool, and not only at a high level.
Of course he cannot know everything about everything, but he must have a clear understanding of the stakes and the issues the users will face. This is one of the reasons I would not recommend someone from the IT department for that role.

2 - IT
IT should nevertheless be strongly involved in all discussions. We need people who know the architecture of their enterprise and who will be able to understand our constraints. Honestly, in my experience I have seldom been disappointed with a customer's IT staff once they understood what we are doing and why.

3 - Key Users
If a PLM implementation fails, it is often because the users don't use it. The rejection rate in PLM can be very high. Why? When there is a process in place, everybody finds good reasons not to follow it in order to go faster; with such a tool, you cannot do that anymore. The choice of the key users is then... key. They will champion the project to the users, so they need to be convinced that what we do and what they do is the right thing.
Rule #1: Never have key users from IT or from a department not affected by the implementation. They will never use the tool.
Rule #2: Never choose people who are new to the company. They don't know how you work.
Rule #3: Find the most old-school and reluctant person in the department and make him your champion. You convince him; he will convince the others.
Rule #4: Have technophiles in the team, people who will find the tool fun. Enthusiasm is important; in the long run it will pay off. They will exchange with the old-school members in a way they would never have done otherwise, and will learn a lot. That tends to reduce the generational gap.
Rule #5: Choose people who have natural leadership.

This might sound very simple, but too often I have seen low commitment from key users ("I don't have time", "This is crap, I don't want to do that", "Bah, they put me in this room with you so I don't bother them anymore",...).

PS: Of course, for big implementations, you need to add some layers: decompose the project into domains, then define key users by domain... etc. Maybe one day I will explain how to prepare the organization for the change...