This was a question asked of me this week by an engineer at the New Zealand User Conference, and I stepped back surprised to hear it coming from the remote South Island of New Zealand, but pleased. The question was asked in all seriousness, particularly with the utility industry (water, etc.) in mind: why would anyone install servers today and take on all the hassle of sustaining them, rather than simply rent space and capability at a managed data center? The point is extremely valid. As networks become more robust and virtualization becomes the accepted norm for deploying applications, why put up with the hassle of local servers?
This especially applies to plants with limited IT capability: a server requires upgrades, virus protection, and OS support, and it must be physically maintained. Within two to three years it also has to be replaced, either because the software has advanced beyond the hardware's capability or because parts are no longer available. Picture the South Island of New Zealand, with its remote sites and long distances on winding roads; every PC installed out there creates another maintenance issue.
is created. If a basic local capability that interfaces with devices such as valves
and instruments, maybe a local driver and basic SCADA, on a “shoe box”
industrially suited box with limited moving parts, e.g.| fans and hard drives. All
the data is up to the “cloud” or in this case a remote data center where the
heavy lifting of the historian, and the execution of supervisory control is
performed. Many of you are probably saying not viable for reliability reasons
but is that truly the case with the improving infrastructure? BHP and Rio Tinto
the two biggest mining companies in the world have both gone live with remote
operational centers, where large equipment are operated from 1500km away, yes
with local supporting teams. Yes, they have local control available if needed,
but in the last 3 years Rio Tinto has not had to use local control except in
controlled situations for maintenance etc. So why not do this for distributed
water utilities etc, makes a lot of sense?
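To make the split concrete, here is a minimal sketch (Python, purely illustrative) of how such a “shoe box” node might behave: it samples local instrument values, pushes them to a hypothetical remote historian endpoint over HTTPS, and buffers readings locally whenever the link is down. The endpoint URL, tag names, and payload format are my own assumptions for illustration, not any particular vendor's API.

```python
import json
import time
import urllib.request
from collections import deque

# Hypothetical remote historian endpoint (illustrative only, not a real API).
HISTORIAN_URL = "https://historian.example-datacenter.com/api/values"

# Local buffer so readings survive short network outages.
buffer = deque(maxlen=10000)

def read_instruments():
    """Placeholder for the local driver that polls valves and instruments."""
    return {"timestamp": time.time(), "pump_pressure_kpa": 412.5, "valve_open": True}

def push(batch):
    """Send a batch of readings to the remote historian over HTTPS."""
    req = urllib.request.Request(
        HISTORIAN_URL,
        data=json.dumps(batch).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req, timeout=10)  # raises on failure

while True:
    buffer.append(read_instruments())
    try:
        push(list(buffer))  # heavy lifting (historian, supervisory logic) happens remotely
        buffer.clear()      # only discard readings once the data center has them
    except OSError:
        pass                # link down: keep buffering locally, retry next cycle
    time.sleep(5)
```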
In another presentation at the conference, a company with a distributed wind turbine application showed how it was using the “cloud” to provide secure connections to a remote data center hosting its data historians and SCADA. At the ARC conference in the US two weeks ago, another company showed gas pipeline control and leak detection running in the cloud. Tens more applications are going in, solving the core issue of sustaining remote servers in harsh environments by shifting that computing to a managed environment.
What surprised me is that when I posted a blog on the cloud about six months ago it was one of my least read posts, yet over the last six months acceptance of remote computing capability has clearly been growing. In six months to a year it will be mainstream, and engineers like this one down on the South Island of New Zealand will be implementing architectures with remote computing and a secure cloud as a matter of course.
Certainly over the course of this year Invensys will release managed services that provide the opportunity to move to this architectural freedom, and I expect the trend will only accelerate into 2014.
Now look forward to 2020: networks, wired or wireless, will be extremely stable and capable. In remote areas people will naturally use wireless infrastructure, employing cloud capabilities such as Microsoft’s Azure for secure managed access and leasing computing in the “cloud” or in managed data centers. Even if local controls are retained for high availability and reassurance, the normal day's activities will rely on this remote capability, with someone else providing a service to maintain and upgrade the system hardware and sustain a high level of availability.
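One simple way to picture that arrangement is a supervisory loop that prefers the remote capability and only falls back to the local controls when the link cannot be reached. The sketch below is an assumed pattern rather than any product feature; the health-check URL and function names are hypothetical.

```python
import time
import urllib.request

REMOTE_HEALTH_URL = "https://ops-center.example.com/health"  # hypothetical endpoint

def remote_available(timeout=3):
    """Cheap health check against the remote operations center."""
    try:
        urllib.request.urlopen(REMOTE_HEALTH_URL, timeout=timeout)
        return True
    except OSError:
        return False

def remote_supervision():
    """Normal operation: setpoints and supervision come from the managed data center."""
    print("Running under remote supervisory control")

def local_fallback():
    """High-availability fallback: basic local control keeps the plant safe."""
    print("Link down - holding last known safe setpoints locally")

while True:
    if remote_available():
        remote_supervision()
    else:
        local_fallback()
    time.sleep(10)
```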
It was refreshing to see engineers embracing this capability, not as a threat but as an advantage that will help them satisfy their customers and applications.