The Good and the Bad
Unless you have taken a sabbatical for the last 11 years, you have surely noticed that the enterprise computing environment has changed radically since the introduction of SaaS-based solutions in 1999.
Let's face it: the adoption rate and reach of these new-age solutions across all functional areas of the enterprise show no end in sight. Add to this the more recent introduction of private cloud computing, and you end up with a highly distributed computing environment that mixes public and private resources, and an IT team facing a completely new set of challenges and opportunities.
One of those challenges and opportunities is how the IT team will manage the integration and automation of these resources into process workflows that convert enterprise data and processing power into online content, reports, analytics, and dashboards. In the last five years, the demand for timely information has grown exponentially, to the point that business users and customers depend on daily, and sometimes hourly, information updates to do their work.
Can in-house scripts and job schedulers do the job?
Can you really continue to integrate and automate your computing environment with batch files, scripts, and job schedulers? Simply put… no!
In-house scripts and enterprise job schedulers are too limiting. They are typically time-based; require skilled, specialized professionals to modify, maintain, and monitor; do not deal well with complex multi-dimensional events; and do not natively interconnect with external SaaS-based solutions. In-house scripts, in particular, are typically poorly documented, scattered across the environment, and highly dependent on a few individuals. Job schedulers, for their part, are usually very capable tools, but they tend to be cost-prohibitive and are not yet adapted to the SaaS part of the computing environment.
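To make the time-based limitation concrete, here is a minimal, purely illustrative Python sketch of the classic scheduling logic such scripts rely on. The function and times are hypothetical, not taken from any real scheduler: the job fires on the clock alone, with no awareness of whether the upstream data has actually arrived.

```python
from datetime import datetime

def should_run(now: datetime, scheduled_hour: int = 2) -> bool:
    """Classic time-based trigger: fire at a fixed hour, whether or not
    the upstream extract has actually landed."""
    return now.hour == scheduled_hour

# The job runs blindly on the clock; if the upstream file is late,
# the batch processes stale or missing data.
print(should_run(datetime(2010, 6, 1, 2, 0)))  # True  (scheduled hour)
print(should_run(datetime(2010, 6, 1, 3, 0)))  # False (data arriving now is missed)
```

Nothing in this logic reacts to events; a late file simply waits until the next scheduled run, which is exactly the brittleness described above.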
Consider the impact of introducing SAAS-based solutions in combination with your own on-premise computing or cloud computing environment.
1) Some of the enterprise data is off-premise, within a public SaaS solution;
2) Public APIs, rather than direct access via the command line, are used to request processing or data extracts;
3) Access controls, and the management of those access controls, are inherent to accessing and processing the data;
4) Both external and internal events drive the process sequences and outputs.
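Point 2 can be sketched in a few lines of Python. The endpoint, resource name, and token below are entirely hypothetical; the point is that a data extract now takes the form of an authenticated HTTP request to a provider's public API, with the access controls of point 3 carried on every call.

```python
import urllib.request

def build_extract_request(base_url: str, resource: str, token: str) -> urllib.request.Request:
    """Data leaves the SaaS provider through an authenticated public
    API call, not a local command-line invocation."""
    req = urllib.request.Request(f"{base_url}/api/v1/{resource}?format=csv")
    # Access control is inherent to the request itself: every call
    # must carry credentials issued and managed by the provider.
    req.add_header("Authorization", f"Bearer {token}")
    return req

req = build_extract_request("https://crm.example.com", "contacts", "TOKEN123")
print(req.full_url)  # https://crm.example.com/api/v1/contacts?format=csv
```

Contrast this with an on-premise batch job, which could simply read the file or invoke a command directly, with no token lifecycle to manage.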
So can today's enterprise job schedulers or in-house scripts deal with these new complexities easily, without a skilled and very creative operator and programmer? Again, to put it simply… no!
Without a new way of orchestrating today's computing environment, the modern-day IT operation is at risk. It will increasingly struggle to ensure control and integrity over the production environment, and to ensure that documentation and compliance requirements are met.
Even more concerning: without an agile processing environment, how can the enterprise maintain its competitive advantage?
The Next Generation IT Process Automation Tool
So what is this new way? A new product category called “Business Process Automation Tools.” These next-generation process automation tools:
1) Require no programming, and use visual drag-and-drop interfaces to sequence hundreds or thousands of individual processes into clear, visually intuitive process workflows;
2) Can easily deal with complex events, so that production is both event-based and time-based;
3) Can natively connect to today's popular SaaS-based solutions to process data;
4) Can natively exploit the features of today's most popular information management and business intelligence applications;
5) Provide central control and monitoring while being self-documenting;
6) Can exploit both on-premise and off-premise computing power.
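Point 2, combining event-based and time-based triggers, can be sketched as follows. This is a hypothetical toy model, not the design of any actual product: a workflow step becomes runnable only once all the events it waits for have fired, where a time-of-day cutoff can itself be one of those events.

```python
class Step:
    """A workflow step that waits on a set of named events."""

    def __init__(self, name: str, triggers: list[str]):
        self.name = name
        self.triggers = set(triggers)  # events this step waits for

    def runnable(self, fired_events: set[str]) -> bool:
        # Runnable only when every trigger has fired; a trigger may be
        # an external event, an internal event, or a time cutoff.
        return self.triggers <= fired_events

load = Step("load-warehouse", ["crm-extract-done", "erp-file-arrived"])
print(load.runnable({"crm-extract-done"}))                       # False
print(load.runnable({"crm-extract-done", "erp-file-arrived"}))   # True
```

Unlike the clock-only trigger of a traditional scheduler, the step here fires as soon as its real-world preconditions are met, however early or late they occur.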
Business process automation tools have been available since 2004. However, it is today's new process automation challenges that make these next-generation IT process automation tools a must-have for the agile enterprise. Many analysts have likewise identified these new tools as a required investment to keep today's enterprise computing environment productive.
SaaS for everything?
Nobody will argue that SaaS-based solutions have merit, or that they will continue to offer new and more interesting options for the IT team. But can every aspect of the computer room be served up through a wire by a public service provider, like a utility? Simply put… maybe.
As more and more applications, computing power, and data shift onto public infrastructure, it is how these services are used that will be the differentiator, just as in brick-and-mortar industries. Basically, how these raw materials are processed will become the enterprise's competitive advantage, rather than proprietary software and infrastructure.
So depending on how strategic IT processes are to the enterprise, the way process workflows are created and managed may need to be secured deep within the private part of the computing environment. This does not mean that public SaaS resources cannot be exploited; they remain the raw materials to be orchestrated, along with the private resources, into a seamless information production environment.
In this sense, the next-generation automation tool is the conductor that orchestrates the entire enterprise computing environment, public and private, into an effective information production environment that can, and should, give the enterprise the competitive edge it needs to secure its growth.
Seek them out today!