Sunday, August 30, 2015

Manufacturing Industry Leads Cloud Adoption

It was good to see a blog by Gary Mintchell revealing that the manufacturing sector is leading cloud adoption, yet so often I hear the words, “it will not happen in our company or industry for years!”

Gary writes a great set of blogs, always worth keeping a link to.

http://t.co/LIPLIc6SgV

Some Quotes from this blog:

“…a global study that indicates cloud is moving into a second wave of adoption, with companies no longer focusing just on efficiency and reduced costs, but rather looking to cloud as a platform to fuel innovation, growth and disruption.
The study finds that 53 percent of companies expect cloud to drive increased revenue over the next two years. Unfortunately, this will be challenging for many companies as only 1 percent of organizations have optimized cloud strategies in place while 32 percent have no cloud strategy at all.”

So often I hear sales and other people saying that cloud adoption is driven by a change in cost model, but in all my interviews with customers and strategic thinkers it is cloud as a platform for addressing the “changing speed of change and the flexibility needed today” that is driving it. The world is changing faster and faster, and the ability to deliver the RIGHTS:
  • Right Product
  • Right Price
  • Right Cost
  • Right Time
  • Right Location

is key, and this means rolling out an increasing number of new products across a distributed set of value assets (plants) that must produce smaller lots (production runs) at lower cost.
Understanding NOW the state of inventory, work in progress, and the equipment needed to get to market is key.

We are also seeing the “walls of a plant” expand beyond the manufacturing site, treating the whole manufacturing and distribution supply chain as part of manufacturing. The traditional MES (Manufacturing Execution System) is therefore expanding to offer the ability to model operations from plant to store, where product must be tracked for compliance and work items are distributed to workers and assets across that distribution chain.

“In the study IDC identifies five levels of cloud maturity: ad hoc, opportunistic, repeatable, managed and optimized. The study found that organizations elevating cloud maturity from ad hoc, the lowest level, to optimized, the highest, results in dramatic business benefits, including:
  • revenue growth of 10.4 percent
  • reduction of IT costs by 77 percent
  • shrinking time to provision IT services and applications by 99 percent
  • boosting the IT department’s ability to meet SLAs by 72 percent
  • doubling the IT department’s ability to invest in new projects to drive innovation.”


Cloud Adoption by Industry
“By industry, manufacturing has the largest percentage of companies in one of the top three adoption categories at 33 percent, followed by IT (30 percent), finance (29 percent), and healthcare (28 percent). The lowest adoption levels by industry were found to be government/education and professional services (at 22 percent each) and retail/wholesale (at 20 percent). By industry, professional services, technology, and transportation, communications, and utilities expected the greatest impact on key performance indicators (KPIs) across the board.”

The above findings do not surprise me, based upon my own engagements in the field and the increasing realization that speed of change matters: traditional large projects are going out the door, replaced by rapid projects that leverage existing industry expertise and add value through a company's own operational processes to differentiate.

Sunday, August 23, 2015

Can Sustainable Manufacturing Operations Management Exist without some sort of Master Data Management?

Over the last couple of months we have seen customers increasingly investigating strategies to answer this question: “How do we enable alignment across ‘Level 3’ operational applications?”

Aligning Level 3 applications without rip and replace will become one of the core requirements for making Manufacturing Operations Management sustainable and effective.

To sync between systems, people look at data warehouses or do manual binding, but these are just not practical in a sustainable and ever-changing world. There are many systems, usually upwards of 20+, which come from different vendors, and even when they come from the same vendor they are implemented by different cultures in the plants. Even the thinking on something as basic as asset naming differs between these groups.

Again, borrowing from Gerhard Greeff, Divisional Manager at Bytes Systems Integration, as he put it in his paper "When last did you revisit your MOM?":

MDM or Master Data Management is the tool used to relate data between different applications.
So what is master data and why should we care? According to Wikipedia, “Master Data Management (MDM) comprises a set of processes and tools that consistently defines and manages the non-transactional data entities of an organization (which may include reference data). MDM has the objective of providing processes for collecting, aggregating, matching, consolidating, assuring quality, persisting and distributing such data throughout an organization to ensure consistency and control in the ongoing maintenance and application use of this information.”

Processes commonly seen in MDM solutions include source identification, data collection, data transformation, normalization, rule administration, error detection and correction, data consolidation, data storage, data distribution, and data governance.
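
To make a few of these processes concrete (collection, matching, error detection, consolidation), here is a minimal sketch in Python of building a “golden record” for a material from several source systems. The field names, the use of a CAS number as the matching key, and the rules are my own illustrative assumptions, not any vendor's data model.

```python
# A minimal sketch of matching and consolidation into a "golden record".
source_records = [
    {"system": "ERP",  "cas": "7647-01-0", "name": "Hydrochloric acid 11%", "uom": "kg"},
    {"system": "LIMS", "cas": "7647-01-0", "name": "11% HCl",               "uom": "kg"},
    {"system": "MES",  "cas": "7647-01-0", "name": "Pool Acid",             "uom": "l"},
]

def consolidate(records):
    """Match records on a shared key, detect attribute conflicts, and build one golden record."""
    golden = {"cas": records[0]["cas"], "aliases": set(), "uom": None, "issues": []}
    for rec in records:
        golden["aliases"].add(rec["name"])          # data collection: keep every local alias
        if golden["uom"] is None:
            golden["uom"] = rec["uom"]              # first unit seen becomes the candidate
        elif rec["uom"] != golden["uom"]:
            # error detection: conflicting units are routed to governance, never guessed
            golden["issues"].append(f"{rec['system']} reports uom={rec['uom']}")
    return golden

print(consolidate(source_records))
```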

Why is it necessary to differentiate between enterprise MDM and Manufacturing MDM (mMDM)? According to MESA, in the vast majority of cases, the engineering bill-of-materials (BOM), the routing, or the general recipe from your ERP or formulation/PLM systems simply lacks the level of detail necessary to:

1. Run detailed routing through shared shop resources
2. Set up the processing logic your batch systems execute
3. Scale batch sizes to match local equipment assets
4. Set up detailed machine settings

This problem is compounded by heterogeneous legacy systems, mistrust/disbelief in controlled MOM systems, data ownership issues, and data inconsistency. The absence of a strong, common data architecture promotes ungoverned proliferation of data definitions, point-to-point integration, and data management strategies that are far from cost-effective. Within the manufacturing environment, all this translates into many types of waste and added cost.

The master data required to execute production processes is highly dependent upon individual assets and site-specific considerations, all of which are subject to change at a much higher frequency than typical enterprise processes like order-entry or payables processing. As a result, manufacturing master data will be a blend of data that is not related specifically to site level details (such as a customer ID or high-level product specifications shared between enterprise order-entry systems and the plant) and site-specific or “local” details such as equipment operating characteristics (which may vary by local humidity, temperature, and drive speed) or even local raw material characteristics.

This natural division between enterprise master data and “local” or manufacturing master data suggests specific architectural approaches to manufacturing master data management (mMDM) which borrow heavily from Enterprise MDM models, but which are tuned to the specific requirements of the manufacturing environment.

Think of a company that has acquired various manufacturing entities over time. They have consolidated their enterprise systems, but at site level things are different. Different sites may call the same raw material different things (for instance 11% HCl, Hydrochloric acid, Pool Acid, Hydrochloric 11%, etc.). The same raw material may also have different names in the Batch system, the SCADA, the LIMS, the Stores system, the Scheduling system, and the MOM. This makes it extremely difficult to report, for instance, on the consumption of hydrochloric acid from a COO perspective; without an mMDM, the consumption query has to be tailored for each site and system in order to extract the quantities for use.
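
As a thought experiment, the mMDM cross-reference for this hydrochloric acid example could be as simple as a mapping from (site, system, local name) to one canonical material ID, so a single consumption query can span sites. The structure and names below are illustrative assumptions only:

```python
# Illustrative mMDM cross-reference: (site, system, local name) -> canonical material ID
MATERIAL_XREF = {
    ("SiteA", "Batch",  "11% HCl"):           "MAT-0001",
    ("SiteA", "LIMS",   "Hydrochloric acid"): "MAT-0001",
    ("SiteB", "SCADA",  "Pool Acid"):         "MAT-0001",
    ("SiteB", "Stores", "Hydrochloric 11%"):  "MAT-0001",
}

def canonical_id(site: str, system: str, local_name: str) -> str:
    """Resolve a site/system-local material name to the master material ID."""
    try:
        return MATERIAL_XREF[(site, system, local_name)]
    except KeyError:
        # Governance hook: unmapped names should be routed to the data owner, not guessed
        raise LookupError(f"No master mapping for {local_name!r} in {system} at {site}")

# A single consumption report can now aggregate by master ID instead of per-site names
consumptions = [
    ("SiteA", "Batch", "11% HCl", 120.0),
    ("SiteB", "SCADA", "Pool Acid", 80.0),
]
total = sum(qty for site, sys_, name, qty in consumptions
            if canonical_id(site, sys_, name) == "MAT-0001")
print(total)  # 200.0
```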

The alternative, of course, is to initiate a naming standardization exercise that can take years to complete, as changes will be required on most Level 2 and 3 systems. That is not even taking into account the redevelopment of visualization and the retraining of operators. The question is, once the naming standardization is complete, who owns the master naming convention and who ensures that plants don’t once again diverge over time as new products and materials are added?

The example above is a very simple one, for a raw material, but the same applies to other areas.
When talking to customers, you see comments and projects where so many are trying to deal with this issue without really looking at the bigger problem and planning for it.


If a company has, for instance, implemented a barcode scanning solution, the item numbers for a specific product or component may differ between suppliers. How will the system know what product/component has been received or issued to the plant without some translation taking place somewhere? mMDM will thus resolve a lot of issues that manufacturing companies are experiencing today in their drive for more flexible integration between Level 3 and Level 4 systems.
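
To make the barcode point concrete, a small (again purely illustrative) translation table between supplier item numbers and internal part IDs might look like this:

```python
# Illustrative supplier-item to internal-part cross-reference for scanned barcodes
SUPPLIER_ITEM_XREF = {
    ("ACME",   "A-4711"): "PART-100",
    ("GLOBEX", "GX-009"): "PART-100",   # same component, different supplier numbering
}

def internal_part(supplier: str, scanned_item: str) -> str:
    """Translate a scanned supplier item number into the internal part ID."""
    key = (supplier, scanned_item)
    if key not in SUPPLIER_ITEM_XREF:
        raise LookupError(f"Unmapped supplier item {scanned_item!r} from {supplier}")
    return SUPPLIER_ITEM_XREF[key]

print(internal_part("GLOBEX", "GX-009"))  # PART-100
```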

The objective of the proposed split in architecture is to increase application flexibility without reducing the effectiveness and efficiency of the integration between systems. It also abstracts the interface mechanisms out of the application into services that can operate regardless of application changes. This gets rid of numerous “point-to-point” interfaces and makes systems more flexible in adapting to changing conditions. The mSOA (manufacturing service-oriented architecture) also abstracts business processes and their orchestration from the individual applications into an operations business process management layer. Now one person is able to interact with multiple applications to track or manage a production order without even realizing that he or she is jumping between applications.
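
One way to picture this abstraction, as a sketch only, is a thin service interface that hides which underlying application answers a request, so the orchestration layer, and the user, never deal with point-to-point specifics. The class and method names below are hypothetical:

```python
from abc import ABC, abstractmethod

class OrderStatusService(ABC):
    """Hypothetical operations service: the orchestration layer calls this,
    never the underlying application directly."""
    @abstractmethod
    def status(self, order_id: str) -> str: ...

class MesAdapter(OrderStatusService):
    def status(self, order_id: str) -> str:
        # In reality this would call the MES API; stubbed here
        return f"{order_id}: in production"

class WarehouseAdapter(OrderStatusService):
    def status(self, order_id: str) -> str:
        # In reality this would call the warehouse/WMS API; stubbed here
        return f"{order_id}: staged for dispatch"

def track_order(order_id: str, services: list[OrderStatusService]) -> list[str]:
    """Business-process layer: one request fans out across applications transparently."""
    return [svc.status(order_id) for svc in services]

print(track_order("WO-1001", [MesAdapter(), WarehouseAdapter()]))
```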

Even with mSOA and mMDM, integration will not be efficient and effective unless message structures and data exchange are in a standard format. This is where ISA-95 once again plays a big part in ensuring interface effectiveness and consistency. Without standardized data exchange structures and schemas, not even mMDM and mSOA will enable interface re-use.

ISA-95 Part 5 provides standards for information exchange as well as standardized data structures and XML message schemas based on the Business-to-Manufacturing Markup Language (B2MML) developed by WBF, including the verbs and nouns for data exchange. Standardizing these throughout manufacturing operations ensures that standard services can be developed to accommodate multiple applications. Increasingly, we are seeing process industries such as oil and gas and mining looking towards these standards and building on them to address the growing challenge of expansion versus sustainability.
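
As a rough illustration of what a standardized exchange message buys you, the snippet below assembles a simplified, B2MML-flavoured XML fragment using Python's standard library. The element names are loosely modeled on ISA-95 concepts and are not the official B2MML schema:

```python
import xml.etree.ElementTree as ET

# Simplified, illustrative ISA-95-style material message (NOT the official B2MML schema)
msg = ET.Element("MaterialInformation")
mat = ET.SubElement(msg, "MaterialDefinition")
ET.SubElement(mat, "ID").text = "MAT-0001"
ET.SubElement(mat, "Description").text = "Hydrochloric acid 11%"
prop = ET.SubElement(mat, "MaterialDefinitionProperty")
ET.SubElement(prop, "ID").text = "Concentration"
ET.SubElement(prop, "Value").text = "11"

print(ET.tostring(msg, encoding="unicode"))
```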

Saturday, August 15, 2015

Virtual Expert Teams Provide One Answer to “Time to Performance”

We often hear of the aging workforce as a big problem, and it certainly is, because what is happening is not an evolutionary transition to the next generation of workers but a leap to a new generation, skipping at least one generation in between: from "baby boomers" straight to Gen Y or Millennials.

As has been stated many times, with this comes a change in the level of experience with the job and the site, but correspondingly a change in the way this generation natively works: engaging with others, sharing more, asking more, which provides the opportunity to bridge this experience transition.

Last week, in some discussions around Integrated Operational Centers (IOCs), it was clear that an IOC is not about bringing what happens in field control rooms into a central location; the real transformation happens when experience and operational work themselves are transformed, shifting to an operational experience where:
  • Experience can be shared across sites and workers through standard operational interfaces and experiences.
  • Monitoring shifts to an exception-based experience, where the user interacts with the system only when required.
  • Planners, operational control, and subject matter experts can align and collaborate in real time, sharing the same view of a situation.

But it was clear that the companies making the big step are going further, really introducing the "Flexible Operational Team" concept: shifting from an operator to a true operational team. From site workers who are now agents (eyes) for the rest of the team on site, to central control working closely with real-time planning and work order execution, to experts in maintenance, process, safety, and management providing real-time knowledge and experience across multiple sites and multiple situations. The diagram below shows this "flexible operational team" and the associated transformation in operational work across the team, driven by the new work ethic of sharing and asking.




But what we are seeing in the market are some innovative approaches to solving this experience gap and its transfer, through the use of “Virtual Expert Teams”. So what is this concept?


The key is to have these highly valued knowledge experts, who could span different aspects of the business (e.g. asset management, process, planning, optimization, quality), empowered to work across the enterprise's many production assets/plants. Today many of these experts are restricted to contributing to their local plants, but a number of companies have started strategies that say, "we must leverage these people." These experts must be able to access the state and information of the plants in a consistent manner even though they may never have been there. They must also have a natural ability to collaborate with the local teams.


This means a local team is able to call upon the best experts for the situation they are dealing with. The expert can go online and access the plant situation in “near real time” so they can see it while collaborating with the site team. They are able to drill in and do analysis, drawing upon their experience to provide advice in real time to the local team.

Now, is this easy? I would say no. Just accessing the plant data is not enough, as the expert may be on the other side of the world and may never have been to the plant, so the data will be in different measures and context to what he or she expects. In order to achieve this virtual team, we need a “trusted information” system, where the data and information are in a consistent context.
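
A minimal sketch of what “consistent context” could mean in practice is a per-site registry that maps local tag names and units into the canonical view the remote expert expects. The tag names, units, and conversion factors here are assumptions for illustration:

```python
# Per-site context registry: local tag name -> (canonical name, local unit, factor to m3/h)
SITE_CONTEXT = {
    "plant_au": {"FLOW_01": ("feed_flow", "m3/h", 1.0)},
    "plant_us": {"FI-1001": ("feed_flow", "gal/min", 0.2271)},  # gal/min -> m3/h (approx.)
}

def expert_view(site: str, tag: str, value: float) -> tuple[str, float, str]:
    """Present a site-local reading in the canonical name and unit the remote expert expects."""
    canonical_name, local_unit, factor = SITE_CONTEXT[site][tag]
    # The stored factor converts the site's local unit into the canonical m3/h
    return canonical_name, value * factor, "m3/h"

print(expert_view("plant_us", "FI-1001", 100.0))  # ('feed_flow', 22.71, 'm3/h')
```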

But the above concept is real and valid, with leading companies currently implementing these "virtual communities" with subject matter experts on call across sites and situations. The operational team experience is not a "rip and replace"; it is built on the existing automation/supervisory systems installed in the '90s and 2000s, but now aligned with a validated model and context, plus a collaborative user experience where systems notify of "abnormal situations", controllers share and collaborate in real time, and virtual experts can access their own view of the situation in real time. Note that the expert's view may be a different angle on the data relative to their expertise.

But the system is NOT a one-way information system; it is a bidirectional, interactive experience with accountability of action and role, and a built-in ability to evolve the knowledge and experience of the system for the future.

Too often I see slices of this operational team landscape being implemented without the whole picture in mind, missing the paradigm shift in value.





Monday, August 10, 2015

How Realistically Are We Treating the State of Connectivity in Your Operations Relative to the Success of the Business?

Control over your business across different sites, the supply chain, and targeted markets is key to survival. Control ensures that you are delivering the correct “Rights” in order to maintain “shelf space” and therefore access to customers, market position, and potential growth.


But as time moves forward, the requirement for tighter control over a wider value chain is becoming critical. This relates directly to the “connectivity” within your business, especially across the value chain: supply chain, manufacturing/production, and then distribution. With increased regulations set by government and the public/market, brand integrity is core, as brand loyalty has gone.

Consistency in decisions, consistency in actions, and real-time awareness drive the operational world towards “self-aware” production and “self-aware” products that enable timely awareness and action.


Connectivity means real-time alignment between people at all levels and the value chain assets relative to their current production/operation. The diagram below shows how the world is changing and how control is becoming key, yet it is connectivity, and that means not just data but “knowledge” and “wisdom”, that makes this possible.

Will you let opportunities pass due to unawareness or the inability to be agile? Even if you do have the data, can you act on it with your current operational systems and operational culture?

In mining and oil and gas, we are seeing strategies built around “integrated operational centers” that take a paradigm shift in operational connectivity between key functions, such as planning, experts, and operational control over multiple assets, by putting them in a common room. No longer is it a call or a meeting; in real time, people can cross the room and talk, review each other's situation, and call small ad hoc decision meetings. The returns have been significant, and when you then combine this with “trusted data” and total transparency, in real time and in context across the value chain, decisions can be made.

Take that one step further with a “self-aware” system that knows what it is meant to be doing and at what efficiency and safety level, and that can see into the immediate operational future through embedded simulation, so that it can draw attention to critical decisions that can be acted on. If the system knows the collaboration required to reach a resolution and maintain operational continuity, then it can interact with other systems and key people in real time, with the correct context for their contribution.
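
As a sketch only, such a “self-aware” check might compare a simulated short-horizon forecast against the plan and raise a collaboration request only when the gap matters. The forecast function and thresholds below are placeholders, not a real simulation:

```python
def simulate_throughput(current_rate: float, hours: int) -> float:
    """Placeholder for an embedded simulation: project output over the next few hours."""
    return current_rate * hours * 0.97   # assume a small degradation trend

def self_aware_check(current_rate: float, planned_output: float, horizon_h: int = 4) -> str | None:
    """Return a notification only when the projected shortfall needs a decision (exception-based)."""
    projected = simulate_throughput(current_rate, horizon_h)
    shortfall = planned_output - projected
    if shortfall / planned_output > 0.05:   # 5% tolerance before involving people
        return (f"Projected {shortfall:.0f} t shortfall over next {horizon_h} h: "
                f"notify planner and process expert")
    return None

print(self_aware_check(current_rate=120.0, planned_output=520.0))
```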

Can we not deliver this? Can we be competitive without connectivity?

Monday, August 3, 2015

Transforming the Way We Should View Integration

Even though the levels of worker engagement seem clear, in many cases their true impact on solution design has not been considered. I thought the diagrams below introduced the concepts of integration in relation to worker empowerment well. These are very important in designing for the “new world of work” or “smart work”; too many of us talk about mobility or delivery anywhere, when this should simply be assumed in the system, not treated as a feature.


There are important distinctions in how we should view integration, not in terms of the actual connection but relative to how that integration will be used by the worker/user. Too often we look at integration in terms of how the linkage happens, but there are key requirements on the integration based upon the user.


Data integration: the user expects transparent access to data across all systems/sources, with assumed reliability.

Process integration: the data is transformed into information, in context relative to the asset and process at the time, in a trusted form, but is presented in an open-loop dashboard as decision support.

Performance integration: data has been transformed into context-driven information for decisions, now aligned to the activities of the time, place, and role, so that actionable decisions can be taken. Key to a closed-loop actionable decision is the timely delivery of knowledge and associated actions, with accountability; this will in many cases require collaboration in order to accelerate repeatable, actionable decisions.
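
A compact, hypothetical way to read the three levels in code: data integration returns a raw value, process integration adds context for an open-loop dashboard, and performance integration turns that into an accountable, role-assigned action. Everything below is illustrative:

```python
from dataclasses import dataclass

# Data integration: transparent access to a raw value from some source system
def read_value(tag: str) -> float:
    return 78.0   # stubbed reading; in reality fetched from the source system

# Process integration: the raw value in context (asset, limit), for open-loop decision support
@dataclass
class ContextualReading:
    asset: str
    measure: str
    value: float
    limit: float

def contextualize(tag: str) -> ContextualReading:
    return ContextualReading(asset="Dryer 3", measure="temperature_C",
                             value=read_value(tag), limit=75.0)

# Performance integration: a closed-loop, role-assigned action with accountability
@dataclass
class Action:
    owner_role: str
    description: str

def decide(reading: ContextualReading) -> Action | None:
    if reading.value > reading.limit:
        return Action(owner_role="Shift operator",
                      description=f"Reduce {reading.asset} setpoint; over limit by "
                                  f"{reading.value - reading.limit:.1f} C")
    return None

print(decide(contextualize("TI-301")))
```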

Again, so many of the discussions on designing systems for the next 10 years and beyond are shifting from applications to the “activities” required to deal with new work, whether engaging with human assets, capital assets, or applications. The above discussion on rethinking “integration” relative to how the results of that integration will be used aligns with this transformation in solution design.