Technology-as-a-Service: An enterprise M2M strategy

Machine to Machine (M2M), Industrial Internet, Internet of Things (IoT)… different names, but viewed through different lenses they all converge on the same thing. This is one of the biggest emerging trends in network computing. Mega-trends like these happen once in a decade or longer. At Sun Microsystems, where I spent 10 years, we had a tag line from beginning to end: The Network is the Computer. What a vision that was, and it outlasted the company! Scott McNealy and others at Sun would frequently talk about the connected refrigerator. Today, it is a reality. Smart machines are here not only to stay, but to grow and thrive. A company that sells any kind of hardware (washing machines to MRI machines, cars to planes, computers to mobile devices; the list goes on to virtually every physical object sold by someone) cannot afford to sit on the sidelines and watch this mega-trend demolish it. These companies have to react and respond, and do it now.

GE, Cisco, and many other large corporations have long realized the importance of IoT. One of my favorite TV ads (I don’t see it on anymore) comes from GE, where all kinds of machines are coming home (those videos are no longer active on YouTube, but I did find one that still works, sort of). Also check out what Dave Evans, Chief Futurist at Cisco, says about this topic.

The point is, not only have we been talking about IoT for a long time; it has quietly become a reality. Those still asleep at the wheel of their organizations can ill afford not to notice and take action.

MQ Identity Methodology

I was talking to my friends Alok Batra and Jane Ren, who just launched their new company MQIdentity. Alok and Jane are well-known thought leaders and change leaders behind the Industrial Internet and M2M-related transformations at Cisco and GE. Their knowledge of this space is vast and impressive. So I asked them, what is MQ? They explained that it stands for Machine Quotient, and went on to describe two concepts: Machine Quotient (MQ), the technical measure of machine efficiency, which you want to be really high; and Service Coefficient (SC), the business measure of service competency. Your organization needs to find the right combination of MQ and SC to bring out the maximum effective business value. Does all this sound too complex? This is where I think Alok and Jane, with their MQIdentity methodology, can help.

The most fascinating aspect dawned on me when I read their white paper titled Technology-as-a-Service (TaaS), which they just released on their website. It is a well-written, thought-provoking paper that lays a solid foundation for thinking about how to transform your business for the new IoT economy. They argue that TaaS will be the path by which tech industries transform into a service-centric economy. The paper is quite detailed and there is a lot to think about, but if I were to distill it to a few key takeaways, here they are 1:

  • TaaS gives you a new way to expand your proprietary IP and technology, unlocking access to a new customer base by leveraging other proven service models (IaaS, SaaS, etc.).
  • Your business is heavily disrupted by technology trends. You cannot stand by any longer and watch your customer base erode to newer solutions while huge value remains locked inside your proprietary technology and products, value that can generate new revenues and expand your business in ways you could not otherwise imagine.
  • Your business is also disrupted by new startups offering state-of-the-art solutions at a much lower price. Even though their solutions may not be as excellent as yours, you will see customers defect to the “good-enough” alternatives.

In short, M2M/IoT is a space that totally fascinates me, and it is going through tremendous innovation. As a result, in the coming years a new wave of creative solutions, products, and business models will emerge. In my opinion, Alok and Jane are right in the thick of it. So watch this space; it will be exciting!

1 – There are many other implications, consequences and micro-disruptions at work in this area; read their white paper or their executive summary for more details. Don’t just take my word for it.

Balancing Act: Build and Buy Strategy

In most startups, you begin with a hypothesis of how you are going to address a need, and you hope to build a product or a solution that fulfills that need and to build a business out of it. Time to market is one of the many pressures weighing on your mind. How do you execute in this scenario?

  1. Build: You can build everything on your own. Though you don’t fully know what to build yet, you will discover it along the way.
  2. Buy: You can find other technologies that address certain problems in a generic way and bundle (integrate) them to create your product. 1
  3. Build and Buy: You can build what you think is the core competency that your startup is going to focus on, and buy/acquire the other peripheral components to create your full offering.

Sometimes building everything on your own is foolish; why waste your time and resources solving problems that have already been solved? And you can’t just buy everything and cobble together a unique product that is worth something. So you have to come up with a Build and Buy strategy so that your product is both uniquely valuable and addresses time to market. Time kills startups.

So if you are in charge of building the product, you are bound to face all these pressures from the CEO, CTO, Marketing and Sales visionaries in your company. This is where, if you are not careful, you will end up building something that takes more time, is not elegant, and does not add any real value. And if the product has no real value, how can you expect your company to establish itself as a viable business? The paranoid around you will start proposing ideas, most of which tend to be hare-brained, spur-of-the-moment notions coming from the loudest voices in the organization. This is the time when you have to stand strong. After all, you are there to build a new product that offers a new and unique value proposition and changes the dynamics of the market.

I have gone through this experience myself and wondered: if I and our product team had caved to these pressures and simply bundled what was already in the market, would we really be a product company or just another integrator? Where would innovation rank? Where would be the IP that makes your company highly valuable and sets you apart? As a product owner, you need to think hard and have courage, conviction and vision. You need to lead and convince others that what you propose to build will be more valuable than just cobbling together pieces of other technologies. At JackBe, we established very early on that innovation was going to be the key value my team was going to live (or die) by, and as such, we went against the opposing forces time and time again to innovate and build the core technology and IP that established JackBe as the market leader in enterprise mashups and real-time intelligence. Our product gained customer admiration and won several awards, and eventually the company was acquired by Software AG because of the value of our unique product and IP.

I am not saying that your complete product has to be 100% home-grown, with every line of code written by your team. There are plenty of FOSS products you can use to build your unique product or platform. And there may be commercial products or components that you might want to OEM and license to build your product. There is no need to build those from scratch; that would be silly and a stupendous waste of time and resources. The key is to identify what is core to your differentiation and what is commodity, and then build your core by leveraging the commodity components out there. That is how you balance the forces and address time to market. Some common commodity components to consider (I mention only a few to illustrate)2:

  • Server Side (Java Based): Application Server (Tomcat/Jetty/etc.), Application Frameworks (Spring, Akka, Play, etc.), Utility Libraries (Apache Commons, Apache Axis, Saxon XML Parser, etc.)
  • Database Systems: RDBMS (Apache Derby, MySQL, etc.), NoSQL (MongoDB, Redis, etc.)
  • JavaScript: jQuery, jQuery Mobile, Prototype, AngularJS, etc.
  • Visualization: FusionCharts, Highcharts, D3, NVD3, etc.
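To make the “build your core, buy the commodity” idea concrete, here is a minimal sketch (names like `EventStore` are hypothetical, invented for illustration) of isolating a commodity component behind your own thin interface, so the core logic never depends on the commodity directly and the component can be swapped later. Python’s stdlib sqlite3 stands in here for any commodity database from the list above:

```python
import sqlite3

class EventStore:
    """Our own thin interface; the commodity component hides behind it."""
    def save(self, name, value):
        raise NotImplementedError
    def total(self, name):
        raise NotImplementedError

class SqliteEventStore(EventStore):
    """Commodity-backed implementation (sqlite3 stands in for MySQL, MongoDB, etc.)."""
    def __init__(self):
        self.db = sqlite3.connect(":memory:")
        self.db.execute("CREATE TABLE events (name TEXT, value REAL)")
    def save(self, name, value):
        self.db.execute("INSERT INTO events VALUES (?, ?)", (name, value))
    def total(self, name):
        row = self.db.execute(
            "SELECT SUM(value) FROM events WHERE name = ?", (name,)
        ).fetchone()
        return row[0] or 0.0

def core_business_logic(store: EventStore):
    # The "secret sauce" depends only on our interface, never on sqlite3 itself.
    store.save("sale", 100.0)
    store.save("sale", 250.0)
    return store.total("sale")

print(core_business_logic(SqliteEventStore()))  # 350.0
```

The design choice is the point: the bought component sits behind a seam you own, so replacing it never touches the differentiated core.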

So my message to you is this: don’t just bundle COTS products to create your product. That won’t help you build a revolutionary new product. Instead, make your own unique secret sauce and mix it with what is already commodity to create a high-value, game-changing product.

1 – Buy: For the purpose of this discussion, I am using ‘buy’ very generically. In some cases you don’t really buy; you can acquire the technology via a partnership. For open source products and components, you just use them (beware of the open source licenses they come with and the restrictions they impose). My favorite FOSS licenses are Apache 2, MIT, and BSD. I tend to stay away from GPL (all versions) and LGPL.
2 – Free or Commercial: Note that some of these components are open source, while others are sold commercially by other vendors under an OEM-friendly license that you must acquire from the respective vendor for a price.

JackBe Engineering Methodology

Often I have been asked by customers, partners and friends what methodology we use at JackBe to deliver our product releases. The closest I have come to describing it is to brush off the question by saying we use an Agile development methodology. But that is just scratching the surface; when pressed for specifics, I end up scratching my head trying to explain it coherently. So I thought I would briefly share my thoughts to see if I can capture the main message I want to convey on this topic. For a brief comparison of the various approaches under the agile umbrella, see what Martin Fowler has already said about the subject.

At JackBe, the methodology we adopted was crafted out of several approaches and stabilized over time. The number one goal was to keep delivering releases and features, and not let processes get in the way. We did that successfully over the past 7 years, averaging 3-4 releases per year, which I believe is a significant and consistent achievement in the enterprise software products business. Let me discuss the two most popular methodologies and my take on how they were relevant to our approach:

  • On XP: The values espoused by the XP approach (Communication, Feedback, Simplicity, Courage, and Respect) were fundamentally intrinsic to our team, and it took a lot of time and effort for us to cultivate them initially. However, once this was in place, as new members came onto the team, they quickly absorbed the approach, contributed to it, and became an integral part of the team. Though we initially started with a bit of Test Driven Development (TDD), over time it became hard to maintain in our development cycle. This often meant that tests took a back seat to other aspects of development, but we were only accumulating technical debt, as testing the software later racked up the bill. The number of unit tests we wrote as a team dropped drastically as the rush to push out new features became a business priority; this is one of the traps that most development teams fall into.
  • On Scrum: While Scrum recommends a periodic sprint (say, monthly) that the entire team coalesces around delivering, we adopted this principle only sometimes. We did not hold daily scrum meetings, and there was no Scrum Master to run them. Instead, we valued open and continuous communication among all the members of the team, including myself, so that no task would get held up on the way to delivering a software milestone. Our team did not wait for daily scrum meetings; we took up each issue as it arose and resolved it. We did have a weekly team meeting, as it was necessary to get everyone aligned and re-aligned to ensure we were making continuous progress. But the main work of communicating and aligning was done daily, on a continuous basis, allowing the weekly meetings to provide a wider update and status check for the team. Many in the industry who use the Scrum approach rigidly have complained that the daily scrum meetings, which should take very little time, turn into hours of debate. If you have a good Scrum Master, perhaps this can be avoided; it also depends on the overall personality of the team and the individuals on it.

So in my experience, there is no single methodology that one can simply adopt for the entire team. I believe it is the collective responsibility of the engineering leader and the team to craft the team’s own agile methodology by choosing the best practices that are most suitable for the team, the product and the business. Each methodology comes with its own framework, values, practices and constraints. They are meant to provide guidance in determining your own successful course, and there is no need to box yourself in rigidly. Make your own agile methodology by choosing the best of breed and blending it with your own and your team’s experience and skill set.

At any cost, do not forget the original purpose of your work, which is delivering high-quality software, and do not become a slave to the process. After all, delivering software is our business, and we must enslave the process to do our bidding!

The Trifecta: Big Data, Mobile and Real-Time

[Cross-posted from JackBe blog]

While it is interesting to come up with your own predictions, I am more interested in seeing what the people I follow are predicting. First, to see if there are synergies: if they too say what I said, it somehow validates my prediction, or so I feel anyway. Second, and more important, to learn from all the smart people out there saying something significant about the technology space in 2013. I recently saw this blog post from Mike Gualtieri, another analyst/researcher I like to follow, and lo and behold, it is all about Big Data!

I particularly liked one thing he says, which I think others are also saying based on what I have read in the last couple of days: “All data is Big Data”. In my prediction, I hinted that Big Data would get segmented along the now-familiar dimensions of velocity, volume and variety, which is my way of saying “all data is Big Data,” though we have to deal with each segment in slightly different ways. Another thing in Mike’s predictions stands out in my mind. He says:

“Real-time architectures will swing to prominence. Firms that find predictive models in Big Data must put them to use. Firms will seek out streaming, event processing, and in-memory data technologies to provide real-time analytics and run predictive models. Mobile is a key driver, because hyperconnected consumers and employees will require architectures that can quickly process incoming data from all digital channels to make business decisions and deliver engaging customer experiences in real time. The result: In 2013, enterprise architects will step out of their ivory towers to once again focus on technology — real-time technology that is highly available, scalable, and performant.”

Hurray! Big Data coupled with two things: Real-Time and Mobile. Now, that’s something else! This area is ripe for innovation and disruption. We at JackBe have long focused on the real-time aspects of dealing with data, be it for Real-Time BI or for Mobile BI. We believe that data can be mashed up in real time and served on a platter to business users in the form of insights, delivered to them as Apps Anywhere™. And as Big Data enters the equation, most people are quick to realize that technologies like Hadoop (currently the most prominent in Big Data discussions) are not real-time at all and definitely need augmentation. Business moves fast and needs insights fast. You can’t wait until all the data is collected and all the batches are run to get your insights. You can’t wait until your data warehouses (big or small) are built to make the decisions that move your business. You need real-time answers to fast-changing data in your environment.
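The batch-versus-real-time contrast can be sketched simply: instead of waiting for all the data to land and then running a batch job, a streaming aggregate updates its answer with every arriving record, so an insight is current at any moment. A toy illustration (the metric and values are hypothetical, not from any particular product):

```python
class RunningStats:
    """Incrementally maintain count, sum, and mean as records stream in,
    so an up-to-date answer exists at every instant -- no batch wait."""
    def __init__(self):
        self.count = 0
        self.total = 0.0

    def update(self, value):
        self.count += 1
        self.total += value
        return self.mean()  # insight is refreshed on every event

    def mean(self):
        return self.total / self.count if self.count else 0.0

stats = RunningStats()
for reading in [10.0, 20.0, 30.0]:   # events arriving one at a time
    current = stats.update(reading)  # answer available after each event
print(current)  # 20.0
```

Real streaming engines do far more (windows, joins, event time), but the core idea is the same: the answer is maintained continuously rather than recomputed from scratch.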

What do you think? How are you reacting to the three forces of Big Data, Mobile and Real-Time?

Next Generation of SOA is Here: Are You Taking Advantage?

[Cross posted from my JackBe blog]

Since blogging my 2013 BI predictions, I’ve come across ZapThink’s predictions, and one of them caught my eye. No, it is not about Big Data. It’s about something that has almost gone out of fashion: SOA. Here is the excerpt from ZapThink:

Next generation SOA begins to coalesce – For years, ZapThink has touted the difference between the practice of SOA and purported implementations of SOA. Our mantra has always been that SOA is protocol and technology independent: it doesn’t require Web Services, or ESBs, or any of the heavyweight IT infrastructure that has given SOA its reputation of complexity and failure. With the rise of Cloud Computing, architects are increasingly realizing that the bits and pieces of SOA best practice – loose coupling, intermediary-based abstraction, and automated governance, to name a few – can and should be applied as appropriate, independent of the existence of any specific, funded SOA effort.               

Back when we started building our Presto platform at JackBe (it seems long ago now), our goal was to build a bridge between all the services and data sources in an enterprise and the business users who really need to access them. Over time, we built our flagship Enterprise Mashup Server to fulfill this need and introduce this slice into the enterprise architecture as a meaningful way of extracting ROI from your SOA investments and delivering real value to end users and business users. Now, we do that and more.

For example, we’re delivering comprehensive insight into the full spectrum of operations, for critical decision-making and for measuring the business impact of every occurrence. And we’re doing it all in real time. We don’t advocate complex architectures or re-architecting your existing systems in the name of SOA; why would you, if there is a better way? We like to work with what already exists, leveraging and complementing it to unlock the value of these otherwise inaccessible systems and data sources.

I like to think that we liberate these silos of information, bring them together in the quick, nimble way we are known for, and generate new business value out of thin air. By correlating existing data sources and combining new and old information sources, we are able to mash them up to generate new insights in real time for our customers. This now-proven approach is already here, and it is what we have always believed to be the continuation of SOA’s march toward the users. Such an approach embraces the concepts of loose coupling, intermediary-based abstraction, and so forth, and yes, it also works with the stringent security mechanisms already in place in your enterprise architecture.

So, the next generation of SOA is already here; you don’t need to wait for 2014 to take advantage of it. Are you ready to exploit it?

Big Data Gets a Real-Time Face-Lift

[Cross posted from my JackBe blog]

Read through any technology publication, website or blog and I dare you to tell me you didn’t come across a Big Data story. Big Data is hyped, and that’s not the first time I’ve said so. In fact, the last time I addressed this topic, I declared that one of the biggest problems with Big Data is the disconnect between how much it’s talked about and how much it’s understood. Cindi Howson of BI Scorecard recently hit the nail on the head by drawing an integral connection between Big Data and BI. She dared to say (against advice from a fellow Strata Conference attendee that it might be considered blasphemy) that Big Data is more than Hadoop. She couldn’t be more right.

And Gartner’s Hype Cycle for Emerging Technologies 2012 prominently featured Big Data and linked it to various other emerging technologies, with Big Data just entering the “peak of inflated expectations” and a prediction that it will take 3-5 years to reach the “plateau of productivity.” But don’t just walk away. As much as Big Data is hyped, it is also real; it is here to stay and is making new inroads every day.

My colleague Dan Malks recently shared some of his 2013 BI predictions, and now I would like to follow them up with a few of my own. Yes, I’m focusing on Big Data. But it also turns out, in a great-minds-think-alike kind of way, that real-time intelligence will be a critical element in navigating the way we consume, analyze and leverage data of all types and from all places for instant decision-making. Here goes…

  • Big Data will go more real-time, for real. Forget about batch processing. Data that is stashed away without the ability to understand it on a continuous real-time basis is just another old data warehouse, albeit a Big Data warehouse.
  • Business users will want direct access to insights derived from Big Data, again in real-time, and anywhere. Here I am talking about the kind of tooling and processes needed to make it easy for the business decision makers to have insights delivered to them in real-time from analyzing Big Data warehouses— especially on smartphones and tablets.
  • Big Data will get further segmented into Big Data, Big Small Data, Fast Big Data, and Fast Small Data, along the dimensions of speed/velocity and size/volume. The capabilities (technical and non-technical) needed to handle these segments differ from one to the next.
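One way to read that segmentation is as a simple two-dimensional grid over velocity and volume. A toy sketch (my own illustrative reading of the prediction, not an official taxonomy):

```python
def classify_data(velocity_high: bool, volume_big: bool) -> str:
    """Map the two dimensions (speed/velocity, size/volume) to the four
    segments named in the prediction above."""
    if velocity_high:
        return "Fast Big Data" if volume_big else "Fast Small Data"
    return "Big Data" if volume_big else "Big Small Data"

# e.g. a high-velocity stream of small sensor readings:
print(classify_data(velocity_high=True, volume_big=False))  # Fast Small Data
```

The point of the grid is that each cell calls for different tooling: batch warehouses suit the slow/big cell, while streaming technologies suit the fast cells.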

That’s my two cents. What do you think will happen next year? Share your predictions with us in the comments section or on Twitter using the hashtag #BIin2013 for consideration in an upcoming post.

Using enterprise mashups to save billions

I just came across a post from Joe McKendrick on ZDNet Blogs that caught my eye: Study: Increase data usability, save billions.
Here is an excerpt:
Researchers say data usability can be improved by focusing on the following factors:
  • Intelligence of data “can be improved through the accuracy of the prediction, trends analysis, recommendations and profile matching/associations made by the associated applications. For example, what percentage of recommendations made by a business intelligence application results in cross-selling?”
  • Remote access to data and applications is essential in an increasingly mobile workforce.
  • Sales mobility “involves the ability of salespersons to use portable devices and applications to exchange information related to all aspects of a deal or transaction with a customer.”
  • Improvements in data quality will result in improvements that “may come through better and timely decisions (which may increase customer satisfaction, loyalty and hence revenues), as well as fewer errors and rework, lower working capital requirements, faster receivables, etc. (which will lower costs).”

A 10 percent improvement can add up to big dollars. Researchers determined that if a median Fortune 1000 business (36,000 employees and $388,000 in sales per employee) increased the usability of its data by just 10 percent, it would translate to an increase of $2.01 billion in total revenue every year, or $55,900 in additional sales per employee annually. (End of excerpt.)

I find this very interesting. But the question is: how do you go about achieving it?
  1. You don’t want to spend millions to save millions.
  2. You don’t want to take years to achieve this goal.

If you can afford to do either, then I suggest you read no further.

To me, enterprise mashups have been addressing this for a few years. Take remote access to data and applications, for instance. It is dead easy for us to create a new enterprise mashup that wraps existing data and applications, creates a specific, usable view of that data, and then exposes this mashup as a Web Service (SOAP or REST) or as Apps for your end users and customers. This does not take years; it can be done in hours and days today.
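As a rough illustration (the source systems, field names, and functions here are hypothetical, not from any real product), an enterprise mashup is essentially a thin layer that wraps existing sources, joins them into one purpose-built view, and returns a payload a REST endpoint could serve:

```python
import json

# Two existing "systems" the mashup wraps (stand-ins for, say, a CRM and an ERP).
def crm_customers():
    return [{"id": 1, "name": "Acme"}, {"id": 2, "name": "Globex"}]

def erp_orders():
    return [{"customer_id": 1, "total": 1200.0}, {"customer_id": 1, "total": 300.0}]

def customer_spend_mashup():
    """Join the two sources into one concise, usable view for business users."""
    spend_by_customer = {}
    for order in erp_orders():
        cid = order["customer_id"]
        spend_by_customer[cid] = spend_by_customer.get(cid, 0.0) + order["total"]
    view = [
        {"name": c["name"], "spend": spend_by_customer.get(c["id"], 0.0)}
        for c in crm_customers()
    ]
    return json.dumps(view)  # the payload a REST endpoint would return

print(customer_spend_mashup())
```

In practice this view sits behind a SOAP or REST endpoint; when a new business need arises, you change or add a mashup function rather than touching the underlying systems.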

Consider mobility. You want not only to create a more usable view of data, but also to ensure that this data is available for your mobile users to interact with, wherever they are, via any portable device. This too is fairly easy to achieve using enterprise mashups.

Basically, enterprise mashups create an agility layer in your enterprise architecture that delivers concise, specific, usable data and applications to your users without disrupting your current enterprise architecture. This agility layer can respond rapidly to new business needs by changing existing enterprise mashups or creating new ones as required.

Enterprise mashups don’t solve the problem of data cleansing in the traditional way: Extract/Transform/Load (ETL). That’s the whole point. Most customers can’t afford (in time or resources) to cleanse data that way. I think enterprise mashups thrive precisely where conventional solutions become expensive and time consuming.