GitHub purchase by Microsoft

From my view, Microsoft bought GitHub for two major reasons: access and information. Access is the first reason, and it enables an extension of their own tools and cloud. My assumption is that GitHub's first option for tools and for cloud will soon be Microsoft's own lineup. Why would a developer publish to AWS, Oracle, Google, or IBM if a single button press got you the latest features and tightest integration by going to Azure? They won't eliminate or block the others; they'll just make Microsoft the default.

I don't think Microsoft is buying GitHub to bury it or ruin it. Microsoft is not exactly the biggest promoter of open source, but they are an active player. This is not like Gillette buying the stainless steel razor blade patent so they could drag their feet on producing one and get more money out of their existing products. If Microsoft blocked GitHub, I think the world would just develop an alt-GitHub or shift to a competitor.

The second reason is probably the more important one: information. GitHub is where developers, programmers, and coders dream. They post snippets of code that are glimmers of the future. Simply understanding which libraries, languages, databases, tools, and clouds are being used, how frequently, and in what combinations will shine bright headlights onto the near future. If you release a new library, you can now easily see its uptake in the community. Put more money into it if it's yours, alter yours to look more like the winner, partner where you can't win, or buy it up if it's a good investment.
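To make that concrete, here is a minimal sketch of the kind of signal I mean. It is not how Microsoft actually mines GitHub; it just walks a folder of already-cloned repositories (the ./cloned_repos path and the requirements.txt convention are assumptions for illustration) and counts which Python libraries show up and which pairs show up together.

```python
# A minimal sketch: count library usage and co-occurrence across cloned repos.
# The directory layout and file names here are illustrative assumptions.
from collections import Counter
from itertools import combinations
from pathlib import Path

def library_trends(repos_root: str):
    lib_counts = Counter()    # how often each library is declared
    pair_counts = Counter()   # which libraries are used together
    for req_file in Path(repos_root).glob("*/requirements.txt"):
        libs = set()
        for line in req_file.read_text().splitlines():
            line = line.strip()
            if line and not line.startswith("#"):
                # keep only the package name, dropping version pins like "==1.2"
                libs.add(line.split("=")[0].split(">")[0].split("<")[0].strip())
        lib_counts.update(libs)
        pair_counts.update(combinations(sorted(libs), 2))
    return lib_counts, pair_counts

if __name__ == "__main__":
    counts, pairs = library_trends("./cloned_repos")  # hypothetical path
    print("Most used libraries:", counts.most_common(5))
    print("Most common pairings:", pairs.most_common(5))
```

Scale that kind of counting up to tens of millions of repositories and you get a live map of where developer attention is heading.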

As long as Microsoft uses a respectful hand and doesn't become the evil overlord, I think the purchase of GitHub will yield a bounty of information by which they can steer their own development of tools and products. For a company that jumped in late on the Internet, open source, and cloud, they sure do pull off impressive about-faces.

 


Digital Twin: the 2018 agile wind tunnel with a quantum future

Digital Twins enable real-world testing of complex systems. The concept of a living test lab has long been a dream for testers. Digital Twins are not static; they take in constant new input from the real world via IoT sensors. Digital Twins make use of AI, machine learning, and IoT to simulate complex system behaviors. IBM is working in our labs and with our clients to find exciting new ways to use and create digital twins.

When flight first started, a man had to risk his life to test each innovation. An innovation had a high threshold since the bet was a human life. Eventually, engineers built wind tunnels where they could simulate the effect of air flow over the plane. While this no longer risked a life, it had limitations on size (you can't fit an entire 747 in a wind tunnel), was artificial, and was costly. Also, how do you simulate more complex events like sudden downdrafts, lightning strikes, rough landings, or wear and tear over years (metal fatigue, corrosion)? Now, with a digital twin, you can test the effect of changes on the digital twin of the airplane. We can run hundreds or thousands of changes and combinations of changes to identify their impacts, as sketched below. Only the best of these changes will be put into use.
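Here is a toy sketch of that "run thousands of combinations" loop. The drag and weight formulas are invented purely for illustration; a real digital twin would be a physics-based model fed by IoT telemetry rather than a three-line function.

```python
# Toy sketch: sweep combinations of design changes against a stand-in "twin".
# The simulate() model below is invented for illustration only.
from itertools import product

def simulate(wing_sweep_deg, skin_thickness_mm, winglet_height_m):
    """Return a crude fuel-burn score for one design variant (lower is better)."""
    drag = 100 - 0.4 * wing_sweep_deg + 2.0 * winglet_height_m ** 2
    weight = 50 + 3.0 * skin_thickness_mm + 4.0 * winglet_height_m
    return drag + weight

# Try every combination of candidate changes against the twin.
candidates = product(range(25, 41, 5),      # wing sweep in degrees
                     [1.5, 2.0, 2.5],        # skin thickness in mm
                     [0.0, 0.5, 1.0, 1.5])   # winglet height in metres
scored = sorted((simulate(*c), c) for c in candidates)

# Only the best few variants would move on to physical testing.
for score, combo in scored[:3]:
    print(f"score={score:.1f}  sweep={combo[0]} deg  skin={combo[1]} mm  winglet={combo[2]} m")
```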

While the changes put into use could be small, similar to an agile-built software application, they would add up to significant impacts. The feedback from the IoT devices in the real world will then update the digital twin, allowing new sets of changes to be developed, deployed, and tested before the best combinations are rolled out in rapid succession. As most planes are now fly-by-wire and highly digital, incremental changes are possible to many of the systems. Today it might not be possible to reshape physical parts like wings, fuselages, and rudders, but maybe future technologies could reshape the surfaces to change physical parts of the plane. Clearly there would need to be a progression from test bed, to unmanned flights, to test flights before changes went into passenger aircraft, but the rate of innovation in a safety-related industry goes up by orders of magnitude while the risk and costs come down proportionally, too.

The ability to try millions and even billions of combinations in each digital twin is not yet possible, as it would overwhelm the compute power of traditional binary computers. The rapidly evolving quantum computer may provide the power required to make machine learning nearly unlimited in capacity, enabling deep learning and vast numbers of combinations of factors in our digital twins. You can even try out quantum computing for yourself on IBM's cloud platform, Bluemix.
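As a small taste of what that looks like today, here is a minimal sketch using IBM's open-source Qiskit library together with its local Aer simulator (assuming both packages are installed; the exact API has shifted across Qiskit versions). It builds an entangled two-qubit circuit and samples it locally rather than on the real cloud hardware.

```python
# Minimal Qiskit sketch: a two-qubit Bell-state circuit on a local simulator.
# Assumes the qiskit and qiskit-aer packages are installed.
from qiskit import QuantumCircuit, transpile
from qiskit_aer import AerSimulator

qc = QuantumCircuit(2, 2)
qc.h(0)             # put qubit 0 into superposition
qc.cx(0, 1)         # entangle qubit 1 with qubit 0
qc.measure([0, 1], [0, 1])

sim = AerSimulator()
counts = sim.run(transpile(qc, sim), shots=1024).result().get_counts()
print(counts)       # roughly half '00' and half '11'
```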

The benefits of digital twins can apply to almost any machine, group of machines, or ecosystem of many groups of machines. I wonder if, in the future, a quantum digital twin could be more complex and subtle in its simulation than the real world. As of today, our models of reality pale in complexity next to the real world. Below is a simple mind map of machine systems with a focus on transportation machines. It shows how digital twins can use data from other digital twins. It is a model composed of multiple models, an idea sketched in code after the figure.

A network of machine ecosystems that can become digital twins
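In code, that "model composed of multiple models" idea can be approximated as a composite twin that fans incoming sensor readings out to its sub-models. The classes and sensor fields below are invented for this sketch, not taken from any IBM product.

```python
# A hedged illustration of a composite digital twin: one twin aggregates
# the state of smaller twins beneath it. All fields are illustrative.
from dataclasses import dataclass, field

@dataclass
class EngineTwin:
    temperature_c: float = 0.0
    def update(self, reading: dict):
        self.temperature_c = reading.get("engine_temp_c", self.temperature_c)
    def healthy(self) -> bool:
        return self.temperature_c < 900

@dataclass
class AirframeTwin:
    fatigue_cycles: int = 0
    def update(self, reading: dict):
        self.fatigue_cycles += reading.get("pressurization_cycles", 0)
    def healthy(self) -> bool:
        return self.fatigue_cycles < 60_000

@dataclass
class AircraftTwin:
    engine: EngineTwin = field(default_factory=EngineTwin)
    airframe: AirframeTwin = field(default_factory=AirframeTwin)
    def ingest(self, reading: dict):
        # One IoT reading fans out to every sub-model in the composite twin.
        self.engine.update(reading)
        self.airframe.update(reading)
    def healthy(self) -> bool:
        return self.engine.healthy() and self.airframe.healthy()

twin = AircraftTwin()
twin.ingest({"engine_temp_c": 610, "pressurization_cycles": 1})
print("aircraft healthy:", twin.healthy())
```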

How could a digital twin help your industry? How can you take advantage of a digital twin to improve the quality of life and leverage the vast amount of data pouring out of a mushrooming number of IoT sensors? It is an exciting problem to explore, with real business implications.

Terrific, practical 10-step path to success with Analytics from Jerry Kurtz.

Thirty years of experience talking. Worthy of 15 minutes of your time.

Folks, I am very proud and happy to have my dear friend Jerry Kurtz do a guest blog on my site. Jerry runs the Cognitive and Analytics businesses in my portfolio and is a long-time IBMer. He has been in this field for 30 years across SAP, Managed Business Process Services, and Analytics and […]

via And Jerry Says : A Path to SUCCESS with Advanced Analytics — Vijay’s thoughts on all things big and small