In my view, Microsoft bought GitHub for 2 major reasons – access and information. Access is the first reason, and it enables an extension of their own tools and cloud. My assumption is GitHub will soon make Microsoft's own lineup the first option for tools and for cloud. Why would a developer publish to AWS, Oracle, Google, or IBM when a single button press gets you the latest features and tightest integration on Azure? They won't eliminate or block the others; they'll just make Microsoft the default.
I don't think Microsoft is buying GitHub to bury it or ruin it. Microsoft is not exactly the biggest promoter of open source, but they are an active player. This is not like Gillette buying the stainless steel razor blade patent so they could drag their feet on producing one and get more money out of their existing products. If Microsoft blocked GitHub, I think the world would just develop an alt-GitHub or shift to a competitor.
The second reason is probably the more important: information. GitHub is where developers, programmers, and coders dream. They put up snippets of code which are glimmers of the future. Simply understanding which libraries, languages, databases, tools, and clouds are being used, how frequently, and in what combinations will yield bright headlights into the near future. If you release a new library, you can now easily see its uptake in the community. Put more money into it if it's yours, alter yours to look more like the winner, partner where you can't win, or buy it up if it's a good investment.
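The kind of mining I'm describing is not exotic. Here is a minimal sketch of counting library usage and co-occurrence across repositories; the repo names and dependency lists are made up for illustration, and a real analysis would pull manifests (package.json, requirements.txt, etc.) via the GitHub API.

```python
from collections import Counter
from itertools import combinations

# Hypothetical data standing in for dependency manifests
# harvested from public repositories.
repo_dependencies = {
    "repo-a": ["react", "express", "postgres"],
    "repo-b": ["react", "express", "redis"],
    "repo-c": ["vue", "express", "postgres"],
}

library_counts = Counter()
pair_counts = Counter()
for deps in repo_dependencies.values():
    library_counts.update(deps)
    # Count which libraries appear together in the same project.
    pair_counts.update(combinations(sorted(deps), 2))

print(library_counts.most_common(3))  # which libraries are winning
print(pair_counts.most_common(3))     # which combinations are common
```

Scaled up to millions of repositories, frequency and combination counts like these are exactly the "headlights into the near future" the acquisition buys.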
As long as Microsoft uses a respectful hand and doesn't become the evil overlord, I think the purchase of GitHub will yield a bounty of information by which they can steer their own development of tools and products. For a company that has jumped in late on the Internet, open source, and cloud, they sure do pull off impressive about-faces.
Digital Twins enable real-world testing of complex systems. The concept of a living test lab has been a dream for testers. Digital Twins are not static; they allow for constant new input from the real world via IoT sensors. Digital Twins make use of AI, Machine Learning, and IoT to simulate complex system behaviors. IBM is working in our labs and with our clients to find exciting new ways to use and create digital twins.
When flight first started, a man had to risk his life to test each innovation. An innovation had a high threshold since the bet was a human life. Eventually, engineers built wind tunnels where they could simulate the effect of air flow over the plane. While it no longer risked a life, it had limitations on size (you can't fit an entire 747 in a wind tunnel), was artificial, and was costly. And how do you simulate more complex events like sudden downdrafts, lightning strikes, rough landings, or wear and tear over years (metal fatigue, corrosion)? Now with a digital twin, you can test the effect of changes on the digital twin of the airplane. We can run hundreds or thousands of changes and combinations of changes to identify the impacts. Only the best of these changes will be put into use.
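The pattern of sweeping combinations through a twin and keeping only the best can be sketched in a few lines. This is purely illustrative: the "drag" function and the two design parameters below are toy stand-ins, where a real digital twin would be a full physics model calibrated by IoT data.

```python
import itertools

def simulated_drag(winglet_angle, surface_roughness):
    # Toy twin: drag is lowest near a 5-degree winglet angle
    # and grows with surface roughness.
    return (winglet_angle - 5.0) ** 2 + 10.0 * surface_roughness

angles = [0.0, 2.5, 5.0, 7.5]
roughness_levels = [0.1, 0.2, 0.3]

# Run every combination of changes against the twin and rank them.
results = [
    (simulated_drag(a, r), a, r)
    for a, r in itertools.product(angles, roughness_levels)
]
best = sorted(results)[:3]  # only the best changes go into use
for drag, a, r in best:
    print(f"angle={a}, roughness={r}, predicted drag={drag:.1f}")
```

With hundreds of parameters instead of two, the combination space explodes, which is exactly why the compute question comes up later.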
While the changes put into use could be small, similar to an agile-built software application, they would add up to significant impacts. The feedback from the IoT devices in the real world will then update the digital twin, allowing new sets of changes to be developed, deployed, and tested before the best combinations are rolled out in rapid succession. As most planes are now fly-by-wire and highly digital, incremental changes are possible to many of the systems. Today it might not be possible to reshape physical parts like wings, fuselages, and rudders, but future technologies might reshape the surface to change physical parts of the plane. Clearly there would need to be a progression from test bed, to unmanned, to test flights before changes went into passenger aircraft, but the rate of innovation in a safety-related industry goes up by orders of magnitude and the risk and costs come down proportionally, too.
The ability to try millions or even billions of combinations in each digital twin is not yet possible, as it would overwhelm the compute power of traditional binary computers. The rapidly evolving quantum computer may provide the power required to make machine learning nearly unlimited in capacity, enabling deep learning and unlimited numbers of combinations of factors in our digital twins. You can even try out quantum for yourself in IBM's DevOps environment, Bluemix.
The benefits of digital twins can apply to almost any machine, group of machines, or ecosystem of groups of machines. I wonder if, in the future, a quantum digital twin could be more complex and subtle in its simulation than the real world. As of today, our models of reality pale in complexity next to the real world. Below is a simple mind map of machine systems with a focus on transportation machines. It shows how digital twins can use data from other digital twins. It is a model composed of multiple models.
How could a digital twin help your industry? How can you take advantage of a digital twin to improve the quality of life and leverage the vast amount of data pouring out from the mushrooming number of IoT sensors? It is an exciting problem to explore with real business implications.
Blockchain will significantly change how we hold and transmit items of value. The business process can finally be designed from the ground up as a digital process – truly transformed. The record of choice for the last 100+ years has been paper. Even when you scan a record, if the paper exists, it is the legal record.
We have birth and death certificates, laws, deeds, bank statements, and even money (bills) on paper. The problem is we are becoming a digital society and paper is hugely inefficient. In addition, the processes for handling paper have stayed in place even within digital systems and are highly error prone. How likely is it that an error occurs upon copying or reading a document? Blockchain offers the opportunity for processes to become 100% digital, secure, and low friction from birth to destruction.
The simplest definition of a blockchain is a digital ledger, not terribly different from an old-fashioned paper accounting ledger. A well-implemented blockchain has 3+1 key characteristics. It is immutable, meaning once a transaction is entered, it can't be removed or altered. It is sequential in that each transaction is tied to the one before it and after it. It has consensus-based peer nodes that can be distributed. I'll add a fourth for a "well-implemented" blockchain: it has inherent security with multiple levels and is highly resistant to attack.
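The immutability and sequential linkage can be shown in a minimal sketch: each entry embeds the hash of the previous entry, so altering any past transaction breaks every hash after it. This toy version leaves out the consensus across distributed peer nodes that a real blockchain adds.

```python
import hashlib
import json
import time

def make_block(transaction, previous_hash):
    # Each block records a transaction plus the hash of the
    # block before it, which creates the sequential chain.
    block = {
        "transaction": transaction,
        "previous_hash": previous_hash,
        "timestamp": time.time(),
    }
    payload = json.dumps(block, sort_keys=True).encode()
    block["hash"] = hashlib.sha256(payload).hexdigest()
    return block

def verify_chain(chain):
    # Recompute every hash and check the linkage between blocks.
    for i, block in enumerate(chain):
        body = {k: v for k, v in block.items() if k != "hash"}
        payload = json.dumps(body, sort_keys=True).encode()
        if block["hash"] != hashlib.sha256(payload).hexdigest():
            return False
        if i > 0 and block["previous_hash"] != chain[i - 1]["hash"]:
            return False
    return True

chain = [make_block("genesis", "0" * 64)]
chain.append(make_block("Alice pays Bob 10", chain[-1]["hash"]))
chain.append(make_block("Bob pays Carol 4", chain[-1]["hash"]))

print(verify_chain(chain))  # the untouched chain verifies
chain[1]["transaction"] = "Alice pays Bob 1000"  # tamper with history
print(verify_chain(chain))  # verification now fails
```

Tampering with one entry invalidates the whole chain from that point on, which is what makes the ledger immutable in practice.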
A blockchain is not Bitcoin or any single crypto-currency. Crypto-currencies like Bitcoin do run on blockchain technology. The data entered in the ledger for a crypto-currency are financial transactions representing value. What fascinates me, and is the subject of this blog, is what else you can do with a blockchain.
In 3 recent experiences with obtaining mortgages, I've had a 10%, 50%, and 90% paper-based process. I'll exclude the closing process, which is still done on paper thanks to the government being nearly 100% paper based.
Bank 1 was 90% paper based. Everything went to the branch office as paper, which they put in a box and shipped to the home office where it was processed. Only the communication via e-mail was electronic.
Bank 2 was about 50% paper based. They allowed us to submit our documents via upload, most documents were e-signed, and the few that presumably the government required to be wet-ink signed we'd print, sign, and then upload.
Bank 3, actually a mortgage servicer, never gave us a piece of paper, but I'd argue it was still only 90% digital. Someone was still transcribing from the uploaded images into the lender's databases. Even if all of the paper was eliminated, Bank 3 was still working a workflow designed for paper. It took limited advantage of the fact that everything was now digitized.
Blockchain will change the above mortgage process. There will be no transcription and therefore fewer chances of error. Hypothetically, you'll upload your information from a blockchain-based digital store of IDs, which will include multiple biometric authentication methods to confirm it's really you. You'll permission your lender to research your creditworthiness in various financial blockchains. It may even eliminate the need for credit scores and credit bureaus, as credit data can be gathered directly and relatively quickly (more on that later). At the same time, they will review your deed and other documents with the government, making sure you are lien free.
All of the approvals, audits, and additional documents will be kept in the lending agency's blockchain, but can link with permissions to other blockchains or simply make copies with reference to the source. Finally, your signatures will cause the down payment to be transferred, along with the signatures of everyone involved from the bank, regulators, attorneys, auditors, county and state tax authorities, county court records, insurance agencies, buyers, and sellers. In theory, the whole process from discussion of the mortgage to its completion becomes a day or few-day affair.
The biggest issue with blockchains, besides the fact that they are new and we are just starting to build applications for them, is that they are slow in terms of computer transactions. The slowness is mostly due to the consensus element. For all the nodes (computers) to agree an entry is valid can take up to a few minutes. While this is a huge leap forward for recording legal records, which today can take weeks or months, it is far too slow for sub-second transactions like purchasing on the internet or recording an entry at a help desk. So for now, it is best applied to large, block-type transactions of higher value which fit the characteristics of blockchains.
Here are a few areas where blockchains are a natural:
Medical records (individual, hospital, doctor, etc.)
Government documents (deeds, judgments, laws, titles, licenses, etc.)
Change is eternal in life, nature, business, and technology, and you only have 3 options: 1) Adapt; 2) Move; or 3) Die. I learned this truism in my 7th-grade social studies class regarding animals' responses to ecological change, but the same is true for changes in our business environment. The technology environment has been forever changed by CLOUD (IaaS, PaaS, and SaaS). The software business and the systems integration (SI) business will never be the same and are now presented with 3 simple options: 1) Adapt; 2) Move; or 3) Die.
As humans, especially the business people, engineers, etc. who populate much of the technology field, we believe we can overcome or stem the tide. While this may work for the short term or against small storms of change, it will not defeat real, substantial change any more than you can push back on the walls of water from a hurricane.
SAP has made a massive shift toward SaaS and is adapting. In 2013, Jon Reed noted that even SAP executives would love to go on selling traditional on-premise perpetual licenses when he paraphrased an executive: 'Hey, if we could continue to sell software to customers the way we've sold it to them for the last 40 years, we would. But they want new options.' (more from Jon Reed's Diginomica blog). Fast forward to 2016, and about 80% of SAP's revenue from SaaS comes from 4 acquired products: SuccessFactors, Fieldglass, Concur, and Ariba. If SAP can figure out S/4HANA cloud, they might even become a dominant SaaS ERP player.
Cloud, and specifically SaaS, is a category 5 hurricane of change driving a wall of water at the software industry. Remember when virtualization was only for non-production? Now, most systems depend on virtualization.
Moving and adapting take time. So while almost everything will go cloud, it will take time. It will have to make IT and financial sense to move. The argument that some applications will not run well on the cloud will be a moot point when they are rewritten for the cloud.
The hurricane of cloud in all its forms is coming. What are you doing to make sure you're ready to move or adapt (and not die)?
We continue to automate and improve business systems. I’ve spent my whole career improving business efficiency. Each time we do so, we mostly disrupt lower level service jobs and now some medium level professional jobs. We do this because making a business more efficient, effective, and cost competitive keeps that business ahead of its competition.
The recent CIO Insight article "How Repetitive Tasks Waste $1.8 Trillion" made me consider the consequences, both bad and good. That $1.8 trillion amounts to a lot of people's jobs. The downside will be the elimination of jobs. I recall once discussing how we were going to put in telephonic automation for the service desk when someone said, "You know, we just fired 300+ people." We observed about 30 seconds of silence, swallowed hard, and then finished our task of designing the solution. It was going to happen regardless, as most of their competitors had already eliminated large human level 1 service desks. Now we are observing the impact of readily available cloud wiping out many small and medium data center and application support jobs. I'm certainly not against cloud solutions. IoT, mobile, and SaaS solutions all stem from basic cloud capability and are creating NEW job markets and careers.
Jobs are both a wage and an identity for most of us, so I take this personally and seriously. I've both done the laying off of people and been laid off. Neither is fun. After I had to lay off my staff, I was physically ill, and just thinking about it gives me the chills. I was able to get the best of them lined up with new job opportunities. No one wants to be told they are no longer needed and can be discarded.
On the positive side, people can be moved to new jobs. The best companies work with their people to find them jobs that can help the company grow. As individuals, we all need to be on the lookout for the possibility we'll be disrupted by new technologies. No job is entirely immune. Hands-on trades people are probably the least susceptible, but even they must learn new skills constantly to stay employed. If you are in a job that can be digitized, you need to start planning how to adapt. Your job will inevitably be under threat.
Companies are not social employment agencies, and I don't advocate socialism. I do think it is in their best interest to be part of the community, since ultimately it is the community that consumes from them and makes them successful. Companies in a capitalistic market must out-compete each other, and to do so must make money for their owners / stockholders. In addition, if a company does not continue to move forward ahead of its competition, it will fail and NO ONE will be working for that company.
In the end, the march of improvement and technology is an inevitable part of human history. Stopping progress is neither possible nor wise. We can and should think about how to do it humanely by recognizing the impact and helping those impacted find ways to be productive members of society. We can use it wisely to improve our conditions as a planet and as human beings.
Living on your past successes is the road to ruin. Change is as inevitable as the sunset. Whether you see it or not, the sun sets. The tree that falls in the forest causes vibrations in the air even if you are not there to perceive them as sound. In 2015, you will see the beginning of the end of on-premise software and systems, including ERP systems like SAP.
Clients are no longer willing to buy the infrastructure, software, and services to implement business functions. There will always be a niche market for special software, but it is no longer necessary for core business functions. Even if a business pays a few more pennies on the dollar, they want to buy the service directly, and more and more vendors have entered the market with a realistic scope and depth of business function, to the point that running very large businesses on SaaS-based solutions is possible.
It is not a flat Total Cost of Ownership (TCO) argument. It is all about speed and the velocity of change in technology and in business. Whether it is reacting to a technology change like "electronic payment" or business conditions like "extremely low cost of oil," it is necessary to react and react swiftly. SaaS-based systems are better able to do so, whether because the SaaS system you are on rolls out upgrades without sympathy, something the IT department could never get away with, or because you can switch to another SaaS provider thanks to more standardized interfaces and a more universal user interface (UI) that requires less training. SaaS solutions provide a velocity and agility not found in on-premise solutions.
While the on-premise system may look promising on TCO, as its 5-year cost may be lower than the competing SaaS system's, the argument falls apart when the inevitable change occurs. The on-premise TCO is based on the idea that the 3-to-5-year roll-out will occur with limited change, but change always occurs, and in half a decade it can be dramatic. In fact, change is the only consistent truth you can bet on.
Next you'll argue SaaS doesn't have enough functionality. That is a limited truth for now. It is rapidly changing as the SIs plunge into the market to fill the gaps with extensions that verticalize each SaaS solution or extend each solution via internal options (using namespaces in the application) or external options such as other cloud-based systems. It becomes a question of SaaS agility and velocity vs. on-premise optimization; however, optimization fails massively when the conditions of the system that was optimized change, and again, change is inevitable.
The question for each of us to answer is: how will SaaS-based systems change your job? If you are functional, what is the SaaS-based system that will eclipse your on-premise role and skill set? If you are technical and hands-on, will you work for a SaaS vendor, or will you move to an area that is still in demand like architecture or networking? IT is still in high demand, maybe higher than ever, as technology is not just required for business but is the very fabric of business. Everyone who works on business software needs to evaluate their future based on the inevitable change brought by SaaS.
In 2015, software and especially ERP software will evolve due to change. With change, we have the three choices: move, adapt, or die. You can go to a part of the world where on-premise is still new, you can learn how SaaS will impact you and update your skills, or you can hang on for dear life hoping everyone else changes their ways. As for the latter option, I'm not hopeful. Your success in 2015 comes from recognizing the shift to SaaS, just like knowing the sun has gone down even though you were stuck in the office.
While we are on New Year’s resolutions, do try to get out and see the sunrise or sunset with someone you love just a few times in 2015. Here is to true success for you, your friends, and your loved ones in 2015.
I'm on the flight to Las Vegas for SAP D-Code (aka TechEd). For one of the first years, I'm not presenting, but IBM has a wealth of content in our booth and being presented. Be sure to check out Scott Geddes on the Apple + IBM relationship and how it will help build the Individual Enterprise. I'm in the IBM booth all week, too, so come by and say HI!
We’ll be talking a lot about the exciting announcements for SAP to sell their applications on IBM’s Cloud. If you’re not clear on how it works for your organization, we’ll have lots of people who can help guide you at the booth. Mike Ryan is giving 2 sessions on moving to IBM cloud, too.
From SAP, I hope they have a few major Suite on HANA references. I'm looking for $5B-and-up companies that have made the migration, not just a division. I'm hoping to gather more details on their own SaaS strategy, especially on Business-by-Design. I know Wall Street was disappointed with SAP's earnings, but I take it as a positive sign of their rate of transition to cloud. We all know that is our future, but I don't think even the most aggressive analysts see how soon that future will arrive.
Please do check out the IBM sessions and at least come by and say hello, discuss SAP, IBM, or any technology topic. Have a great show and learn big!
Thursday, October 23, 10:30 AM – 11:30 AM
Session ID: MOB105 – Bellini 2105, Level 2
Title: Apple + IBM: Evolving to the SAP Enabled Individual Enterprise
Speaker: Scott Geddes
Description: What's next, now that you've done your first waves of transformation with SAP? How do you empower end users in ways never possible before and unleash the power of your SAP implementation? In this session we will explore how Apple + IBM are working together to change the way people work and create new, never before seen capabilities.
EXPERT NETWORKING SESSION:
Thursday, October 23
12:00pm – 1:00pm Lounge #3
Apple + IBM: Evolving to the SAP Enabled Individual Enterprise (IBM and Apple alliance discussion cont’d)
Scott Geddes, IBM SAP Global Business Services – Mobility
Chuck Kichler, IBM SAP iCoC CTO

Tuesday, October 21, 2:00 PM – 3:00 PM
Session ID: DMM137
Title: IBM’s Recommended Approach for Optimizing Different Kinds of SAP Workloads
Speaker: Guersad Kuecuek
Description: Today, customers face various requirements to effectively deal with different kinds of workloads. Key aspects are high Service Level Agreements while maintaining optimal performance for analytical (OLAP) and transactional (OLTP) workloads. Find out how customers like Audi, Balluff, and Coca-Cola have mastered these challenging requirements.
Tuesday, October 21, 3:15 PM – 4:15 PM
Session ID: DMM142
Title: SAP HANA on IBM Power – Value, Performance and Experience
Speaker: Alfred Freudenberger
Description: With the announcement of the testing and evaluation program for SAP HANA on IBM Power Systems at SAPPHIRE NOW in 2014, a new option for SAP HANA deployments will soon be available. Why should SAP clients consider this option? For which environments is it well-suited? What have IBM and SAP learned during development, testing, and evaluation?
EXPERT NETWORKING SESSION:
Wednesday, October 22
11:30am – 12:00pm Lounge #4
SAP HANA on IBM Power – Value, Performance and Experience
Alfred Freudenberger, IBM Leader NA SAP on Power
Tuesday, October 21, 4:30 PM – 5:30 PM
Session ID: DMM145
Title: Simplify IBM Database Performance Tuning with the DBA Cockpit
Speaker: Thomas Rech
Description: In today’s IT world, it is crucial to maintain high SAP system performance to meet demanding Service Level Agreements. The DBA Cockpit for IBM DB2 Linux, Unix, and Windows is an easy, fully integrated solution for database monitoring and administration with SAP. Learn about the design concept, the capabilities, and discuss customer use cases.
Wednesday, October 22, 11:45 AM – 12:45 PM
Session ID ITM220
Title: Business Continuity for SAP HANA-Based Applications – Shared Experiences
Speaker: Irene Hopf
Description: Learn about the options to keep business continuously running when you migrate SAP application landscapes to SAP HANA. High availability and disaster recovery are essential for business-critical applications. Discuss experiences with your peers and learn how other customers have implemented it.
Wednesday, October 22, 5:45 PM – 6:45 PM
Session ID INT206
Title: Integrating Shop-Floor with Enterprise in Real-Time – SAP MII In Action
Speaker: Dipankar Saha
Description: How to integrate heterogeneous shop-floor systems with SAP ERP using SAP Manufacturing Integration and Intelligence (SAP MII) and custom frameworks, with various industry case studies. This includes manufacturing integration use cases, real-time integration using SAP MII, and the architecture and case studies of integration using the frameworks.
Thursday, October 23, 8:00 AM – 9:00 AM
Session ID UXP117
Title: Experience with Google Glass and Business Applications
Speaker: Markus van Kempen
Description: Google Glass presents a mobile form-factor which allows for new possibilities. This session discusses examples of user experiences, including the disconcerting experience of “wearing” a camera all the time, reactions from others, and navigation challenges. We show how to design for Google Glass and demonstrate business applications.
Thursday, October 23, 10:45 AM – 11:45 AM
Session ID ITM235
Title: Establishing Architectural Patterns for SAP in the Cloud at CokeOne +
Speaker: Michael Ryan
Description: The CokeOne + migration to cloud for their non-production SAP environments included the establishment of architectural patterns to take advantage of the services provided by cloud computing. This session focuses on establishing the architectural patterns needed to transform businesses by moving business systems and processes to a cloud model.
Thursday, October 23, 2:00 PM – 3:00 PM
Session ID DMM127
Title: Streamline SAP HANA Solution with Near-Line Storage Solution by PBS and IBM
Speaker: Elke Hartmann-Bakan
Description: Streamline your SAP HANA solution by keeping only hot data in memory and moving warm data to near-line storage (NLS). This allows you to maintain a lean SAP HANA database and sustain high performance. The PBS and IBM NLS solution offers near real-time speed on NLS, ultra fast load time from the online database to the NLS, and extreme compression.
Thursday, October 23, 4:30 PM – 5:30 PM
Session ID ITM123
Title: Planning Your Company’s SAP Systems Migration to the Cloud
Speaker: Michael Ryan
Description: The opportunity to move the SAP infrastructure to cloud is a game changer. Businesses are offered a level of speed and agility that has not been available in the past. However, moving to cloud does not solve basic issues that we experience in the IT world. We take a look at some of the key issues and think about the impact across enterprises.
EXPERT NETWORKING SESSION
Tuesday, October 21
2:30pm – 3:00pm Lounge #4
SAP Applications on IBM Cloud – from self-service to fully managed
Keith Murray, Global Offerings Manager SAP on SoftLayer, IBM SmartCloud Services
Wolfgang Knobloch, IBM GTS Global Offering Manager, SAP
Production HANA on VMware is in "controlled availability, allowing selected customers, depending on their scenarios and system sizes, to go live with SAP HANA on VMware vSphere immediately," per SAP OSS Note 1788665 – SAP HANA Support for VMware vSphere Environments. However, the SAP marketing team left these small stipulations off the press release and got everyone very hot and bothered:
Must be approved for controlled availability by SAP
Must be on VMware vSphere 5.5
Must be on SAP HANA SPS07
Maximum of 1 TB
Must be on SAP approved HANA server and storage
Must comply to SAP’s current recommendations for vCPU and RAM
Must not over-provision the CPU or RAM
Maximum of 1 Virtual Machine (VM)
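The restrictions above amount to a checklist, which can be read as a simple validation function. This is my own sketch; the field names are made up, and the limits are just the ones from the list above (per OSS Note 1788665 at the time of writing), so treat it as illustration, not an official compliance check.

```python
def check_hana_vmware_config(cfg):
    """Return a list of restrictions the proposed config violates."""
    problems = []
    if not cfg.get("approved_for_controlled_availability"):
        problems.append("not approved for controlled availability by SAP")
    if cfg.get("vsphere_version") != "5.5":
        problems.append("must be on VMware vSphere 5.5")
    if cfg.get("hana_sps") != "SPS07":
        problems.append("must be on SAP HANA SPS07")
    if cfg.get("memory_tb", 0) > 1:
        problems.append("maximum of 1 TB")
    if cfg.get("vm_count", 1) > 1:
        problems.append("maximum of 1 VM")
    if cfg.get("cpu_overprovisioned") or cfg.get("ram_overprovisioned"):
        problems.append("CPU and RAM must not be over-provisioned")
    return problems

config = {
    "approved_for_controlled_availability": True,
    "vsphere_version": "5.5",
    "hana_sps": "SPS07",
    "memory_tb": 1,
    "vm_count": 1,
}
print(check_hana_vmware_config(config))  # an empty list means it qualifies
```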
In other words, you need the latest and greatest versions of VMware and HANA running on your HANA-approved appliance in a nearly non-virtual manner. While this is less than what we all want, it is a step in the right direction. It will allow you to manage the HANA instance under your VMware management utilities. It makes HANA part of your Software Defined Environment strategy. I'm confident that over time, as it becomes Generally Available, production HANA will have far fewer restrictions.
I'm actually looking forward to when we can run production HANA on lots of virtualization schemes. I look forward to more of a software-defined service level agreement (SLA) with SAP so that other virtualization environments, including the cloud providers, can provide production services. Right now it is about shipping hardware to Walldorf, Germany for certification, and the process is so specific it is not practical even for hardware manufacturers.
A move by SAP to a software-defined SLA would be good for everyone: for SAP, by making HANA more available and reducing the effort to certify platforms; for hardware and cloud providers, who want the ability to vary the make-up of servers based on market conditions and newer evolutions of chipsets; and especially for my clients, who want HANA, but running in a completely virtual world they are defining, not the one SAP is trying to define for them.
As Vishal Sikka (former SAP CTO) exits, limited production HANA on VMware is a great first step for the product he called his child, HANA. Unlimited production HANA on VMware would be a great toddlerhood. I really look forward to seeing it rapidly reach its teenage years and start trying to run on everything everywhere. Isn't that what teenagers do?
There are 2 dimensions that have meaning when discussing cloud solutions, and the terms public, private, and hybrid contain too many overlaps and ambiguities. Is a cloud private if it is hosted? Is a cloud public because I access it over the internet, and does that mean my corporate data center accessed via VPN is public? Worse, the term hybrid cloud gets bounced around so many different ways that it is no longer relevant at all. It could mean I have SAP ERP and SuccessFactors, or SAP ERP production in a corporate data center with the non-production SAP ERP systems on an IaaS cloud provider such as IBM's SmartCloud IaaS, or it could mean I do some ERP functions on SAP ERP on premise and some using SAP Business-by-Design (BbyD) or NetSuite ERP. In the end, it only means I'm using more than one type of solution.
There are meaningful dimensions to describing cloud solutions. The two (2) dimensions that matter are: 1) location, and 2) separation.
First is location. Where is the solution residing? Is it on my premises, a site or facility that I own or at least consider part of my corporate network of locations? Or is it away from the bulk of my IT assets, such that I need a WAN to access it? For clients with highly distributed data centers this becomes a moot point since everything connects over the WAN; however, most clients have consolidated their corporate applications and data centers into just a few locations. IBM runs its corporate business in just a few data centers, with SAP in just 2 of them globally.
Second is separation. What separates the resources? Are the solution resources separated by physical boundaries such as server, application, or database, or is the solution separated by layers of software?
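Because the two dimensions each take two values, every offering lands in one of four boxes. Here is a tiny sketch of that grid; the category labels on the right are my own shorthand for this post, not industry-standard terms.

```python
def classify_cloud(location, separation):
    """location: 'on_premise' or 'off_premise';
    separation: 'physical' or 'software'."""
    labels = {
        ("on_premise", "physical"): "traditional data center",
        ("on_premise", "software"): "software-defined environment (SDE/SDDC)",
        ("off_premise", "physical"): "hosted dedicated infrastructure",
        ("off_premise", "software"): "shared cloud, e.g. SaaS",
    }
    return labels[(location, separation)]

# A SaaS offering: away from my IT assets, software separated.
print(classify_cloud("off_premise", "software"))
```

Two questions, four answers; far more informative than arguing over whether something is "private" or "hybrid."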
The trend is clearly toward SaaS, where location is off premise and all the infrastructure and application components of the solution are software separated. In a SaaS solution, it is clear you are using the SaaS provider's data center of choice and not your own. You are also accepting the secured division of servers, network, and storage. Even more importantly, you are accepting that your data in flight (within the application) and at rest is kept secured and separated from others, including your competitors. I have seen numerous cases where direct competitors use the same SaaS solution. In fact, most SaaS providers count on competitors using their application to scale, since you can't build a SaaS business on a single client. Clearly, software separation has come of age.
Software-divided applications and infrastructure are becoming the rule. If you want to take the software-divided and secured environment on premise, IBM calls it a Software Defined Environment (SDE). Another major player, VMware, calls it the Software Defined Data Center (SDDC). Regardless, each is an attempt to virtualize the infrastructure for more effective use of all assets. This is good both for cost savings and for agility. Deploying virtual infrastructure is far faster than physical. Often clients think the value of the business case is in the cost savings, but ultimately it is found in the agility that translates into real application and business value.
So, next time someone comes to you with a hosted semi-private cloud infrastructure offering with hybrid capabilities, just ask them 2 questions. Where is the solution residing? What is separating the infrastructure and applications? Concise answers will reveal a lot about the solution and you’ll have more time to do something beautiful like grow your hybrid tea roses and find PEACE.
Rather than considering SaaS the enemy of on-premise ERP, consider how you can leverage its capabilities to consolidate, optimize, and simplify your on-premise ERP systems to achieve a better, faster, more agile result. On-premise ERP is not going away tomorrow, at least not if you run it well and provide the required agility to the business, and SaaS can help. The key is providing the right service levels across the entire environment.
In my last blog, 2-tier ERP: A cure for the smaller markets in a global implementation, I discussed how you can use a SaaS-based ERP implementation to eliminate some of the complexity of implementing simpler, smaller markets. The focus there was geographic or unit based. Similar logic can be applied to functionality. Some Line-of-Business (LOB) applications and Human Capital Management (HCM) / Human Resources (HR) functions can also be pulled into SaaS. The result is even more acceleration of your overall program.
Most of my clients are either using or reviewing the use of SAP SuccessFactors, SAP Ariba, and Salesforce as SaaS-based LOB solutions. These three SaaS solutions show up repeatedly at many SAP clients. I am well aware that SAP has CRM, and even a CRM RDS (Rapid Deployment Solution) on cloud, including IBM's SmartCloud, and even CRM capability in Business ByDesign, but Salesforce is dominant in the market and needs to be considered. Clearly there are alternatives to these three SaaS solutions in the market; I use them simply as common examples, and your company needs to do a comprehensive review of each area to find the solutions that best fit.
The idea is that if you can remove the scope of configuring SRM by using Ariba, most HR/HCM scope by using SuccessFactors, and CRM scope by using Salesforce, the reduced scope in the core, plus running these as parallel projects, can enable the overall implementation to move far faster and introduce some heavily requested mobile capability.
In addition, by using SaaS-based products you can potentially lower the costs of procurement, implementation, and support. You can definitely accelerate the roll-out of these functions and bring significant mobile capability to the business in these areas, which might help your business case by delivering benefits earlier. These three areas are especially important since they touch so many individuals in mobile roles.
By design, every SaaS program I've seen is heavily mobile enabled. Mobile and cloud are ideally suited to each other. The more recent thinking around agility and velocity drives a simpler and more targeted design for most SaaS solutions. The SaaS mindset better fits minimalist mobile screens than the highly optimized processes found in most core on-premise ERP systems. Cloud serves mobile well since it is equidistant to all devices. Mobile devices are not clustered like terminals around a headquarters-based system, which until recently was the assumption for on-premise ERP.
Even if the SaaS implementation is a wash on cost, the acceleration of delivery and the enhanced mobile capabilities may push the business case over the goal line. Keep in mind that mobile applications are native to most SaaS offerings, so you don't have to build up a support staff to maintain the mobile interfaces as you would with many ERP add-on products; they are as native as browser support is in SAP ERP. You will still need to manage devices, and may even want to build some custom mobile apps, but the native mobile capability will lessen your IT support burden.
SaaS is not a panacea. For companies that require extremely complex or significantly optimized processes, it may not meet the requirements. If your company prefers CapEx over OpEx (utilities generally prefer CapEx for IT), then SaaS may not be a good fit, since subscriptions are pure OpEx. Finally, SaaS applications still require integration, most likely some optimization (custom functions), and good project management, organizational change management, communication, and training, none of which differs significantly from an on-premise implementation. A simpler program may demand less of these soft-skill deliverables, but it is not the SaaS itself that changes the effort.
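To make the OpEx-versus-CapEx trade-off concrete, here is a minimal sketch comparing five-year totals. All of the figures (subscription price, license cost, infrastructure cost, support percentage) are hypothetical assumptions for illustration, not real vendor pricing:

```python
# Hypothetical 5-year cost comparison: SaaS subscription (pure OpEx) vs.
# on-premise license plus infrastructure (CapEx) with ongoing support.
# Every number below is an illustrative assumption.

def saas_total_cost(annual_subscription, years):
    """Pure OpEx: a recurring subscription fee paid every year."""
    return annual_subscription * years

def on_prem_total_cost(license_capex, infra_capex, annual_support_pct, years):
    """Up-front CapEx plus yearly support as a percentage of the license."""
    return license_capex + infra_capex + license_capex * annual_support_pct * years

years = 5
saas = saas_total_cost(annual_subscription=200_000, years=years)
on_prem = on_prem_total_cost(license_capex=500_000, infra_capex=150_000,
                             annual_support_pct=0.20, years=years)

print(f"SaaS 5-year total:    ${saas:,.0f}")     # $1,000,000
print(f"On-prem 5-year total: ${on_prem:,.0f}")  # $1,150,000
```

Under these made-up numbers the totals are close, which is exactly the point of the paragraph above: the spend profile (recurring OpEx versus up-front CapEx) can matter to the business case as much as the total itself.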
In your next transformation program, consider using SaaS-based applications to remove scope, increase velocity and agility, reduce the overall program timeline, accelerate benefits, and potentially lower total costs.
In my next blog, I’ll examine how this idea can be applied to an existing SAP landscape requiring renovation, optimization, or simplification.