Business · Data

Understanding the value of data integration

To understand the value of data integration, one has to first understand the changing data landscape. In the past few years, more data has been created than existed in all of time prior. In 2014, I penned a post asking ‘Are enterprises prepared for the data tsunami?’ When it comes to data, enterprises of all sizes and maturity face two core issues: 1) how to effectively manage the sheer volume of data in a meaningful way and 2) how to extract insights from that data. Unfortunately, the traditional ways to manage data start to break down when considering these new challenges.

DIVERSE DATA SETS

That post referenced an IDC report suggesting that by 2020, the total amount of data will reach 40,000 exabytes, or 40 trillion gigabytes. That is more than 5,200 gigabytes for every man, woman and child in 2020.
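As a quick back-of-the-envelope check on that figure (a rough sketch; the 2020 world population of roughly 7.6 billion is my assumption, not a number from the IDC report):

```python
# Rough sanity check of the IDC projection cited above.
exabytes = 40_000                       # projected total data by 2020 (EB)
gigabytes = exabytes * 1_000_000_000    # 1 EB = 10^9 GB -> 40 trillion GB
world_population_2020 = 7_600_000_000   # assumed ~7.6 billion people

print(f"Total data: {gigabytes:,} GB")
print(f"Per person: {gigabytes / world_population_2020:,.0f} GB")
# Roughly 5,300 GB per person, in line with the 'more than 5,200 GB' figure.
```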

However, unlike data in the past, this new data will come from an increasingly varied list of sources. Some of the data will be structured. Other data will be unstructured. And then there is the metadata derived through analysis of these varied data sets. All of it needs to be leveraged by the transformational enterprise.

In the past, one might have pooled all of this data into a classic data warehouse. Unfortunately, many of the new data types do not fit nicely into this approach. Then came the data lake as a way to simply pool all of the data. That approach has met with challenges of its own, as many enterprises watch their data lakes turn into data swamps.

Even beyond data generated internally, enterprises are increasing their reliance on externally sourced data. Since this data is not created by the enterprise, there are limits on how it can be leveraged. In addition, bringing all of this data into the enterprise is neither simple nor always feasible.

Beyond their diversity, these new data sets create ‘data gravity’ as they grow in size, forming an ever-stronger bond between the data set and the application that leverages it. As the size of a data set grows, so does its ‘gravity’, which resists movement. All of these factors create significant friction against moving data at all.

VALUE OF INTEGRATING DATA

The solution rests with data integration: essentially, leave data where it resides and connect to the various data sets through integration methods in order to create insights. There are two components to consider when integrating data.

There is a physical need for data integration and one that is more logical in nature. The physical component is how to physically connect the different data sources. This is easier said than done. It was already challenging when all of the data was managed within the enterprise. Today, the data resides in the hands of many other players and platforms, which adds complexity to the integration effort. Modern data integration methods rely on Application Programming Interfaces (APIs) to create these integration points. In addition, there are security ramifications to consider.
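To make the physical side concrete, here is a minimal sketch of API-based integration: the data stays where it lives, and only query results are pulled together. The endpoints, field names and tokens are hypothetical placeholders, not any specific product’s API.

```python
import requests

# Hypothetical endpoints: one internal system, one external data provider.
CRM_API = "https://crm.example.internal/api/customers"
EXTERNAL_API = "https://partner.example.com/api/engagement"

def fetch(url, token):
    """Pull records from a REST endpoint; authentication via a bearer token."""
    resp = requests.get(url, headers={"Authorization": f"Bearer {token}"}, timeout=30)
    resp.raise_for_status()
    return resp.json()

def integrate(crm_token, partner_token):
    # Leave the data at its source; only the results move.
    customers = fetch(CRM_API, crm_token)
    engagement = fetch(EXTERNAL_API, partner_token)

    # Join the two sets on a shared key to build a richer profile.
    by_id = {rec["customer_id"]: rec for rec in engagement}
    return [
        {**cust, **by_id.get(cust["customer_id"], {})}
        for cust in customers
    ]
```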

The logical integration of data often centers around the customer. One of the core objectives for enterprises today is customer engagement. Enterprises are finding ways to learn more about their customers in an effort to build a more holistic profile that ultimately leads to a stronger relationship. Not all of that data is sourced internally. This really is a case of 1+1=3, where even small insights can lead to a larger impact when combined.

THE INTERSECTION OF DATA INTEGRATION AND ADVANCED FUNCTIONS

Data integration is a deep and complicated subject that is evolving quickly. Newer advancements in the Artificial Intelligence (AI) space are leading enterprises to insights they had not even considered. Imagine a situation where you thought you knew your customer, but the system surfaced aspects you had not thought of. AI has the opportunity to significantly augment human capability, producing more accurate insights, faster.

Beyond AI, other newer functions such as Machine Learning (ML) and Internet of Things (IoT) present new sources of data to further enhance insights. It should be noted that neither ML nor IoT is able to function in a meaningful way without leveraging data integration.

DATA INTEGRATION LEADS TO SPEED AND INSIGHTS…AND CHALLENGES

Enterprises that leverage AI and ML to augment their efforts find increased value from both the insights and the speed with which they respond. In today’s world, where speed and accuracy are becoming strong competitive differentiators, using as much data as possible is key. And to harness that sheer amount of data, enterprises must embrace data integration to remain competitive.

At the same time, enterprises are facing challenges from new regulations such as the General Data Protection Regulation (GDPR). GDPR has many facets that will only add to the complexity of data integration and management.

While enterprises may have relied on custom approaches to solve the data integration problem in the past, today’s complexities demand a different approach. The combination of these challenges pushes enterprises toward advanced tools that assist in integrating data to gain greater insights.

 

This post sponsored by:


https://www.sap.com/intelligentdata

Business · Cloud · Data

Microsoft empowers the developer at Connect


This week at Microsoft Connect in New York City, Microsoft announced a number of products geared toward bringing intelligence and the computing edge closer together. The tools continue Microsoft’s support of a varied and growing ecosystem of evolving solutions. At the same time, Microsoft demonstrated their insatiable drive to woo the developer with a number of tools geared toward modern development and advanced technology.

EMBRACING THE ECOSYSTEM DIVERSITY

Microsoft has tried hard over the past several years to shed their persona of Microsoft-centricity, of a .NET-and-Windows world. Similar to their very vocal support for inclusion and diversity in culture, Microsoft brings that same perspective to the tools, solutions and ecosystems they support. The reality is that the world is diverse, and it is this very diversity that makes us stronger. Technology is no different.

At the Connect conference, similar to their recent Build & Ignite conferences, .NET almost became a footnote as much of the discussion was around other tools and frameworks. In many ways, PHP, Java, Node and Python appeared to get mentioned more than .NET. Does this mean that .NET is being deprecated in favor of newer solutions? No. But it does show that Microsoft is moving beyond just words in their drive toward inclusivity.

EXPANDING THE DEVELOPER TOOLS

At Connect, Microsoft announced a number of tools aimed squarely at supporting the modern developer. This is not the developer of years past. Today’s developer works in a variety of tools, with different methods and potentially in separate locations. Yet, they need the ability to collaborate in a meaningful way. Enter Visual Studio Live Share. What makes VS Live Share interesting is how it supports collaboration between developers in a more seamless way, without the cumbersome screen sharing approach previously used. The level of sophistication that VS Live Share brings is impressive in that it allows each developer to walk through code in their own way while they debug and collaborate. While VS Live Share is only in preview, other recently announced tools have already seen significant adoption, ranging into millions of downloads in a short period of time.

In the same vein of collaboration and integration, DevOps is of keen interest to most enterprise IT shops. Microsoft showed how Visual Studio Team Services embraces DevOps in a holistic way. While the demonstration was impressive, the question of scalability often comes into the picture for large, integrated teams. It was mentioned that VS Team Services is currently used by the Microsoft Windows development team and their whopping 25,000 developers.

Add to that scale the ability to build ‘safe code’ pipelines, with automation that creates triggers to evaluate code in-process, and one can quickly see how Microsoft is taking the modern, sophisticated development process to heart.
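The mechanics of such a gate can be sketched in a few lines. This is a generic illustration in Python, not Visual Studio Team Services pipeline syntax, and the specific checks (pytest for unit tests, flake8 for static analysis) are assumptions chosen for the example.

```python
import subprocess
import sys

def run(description, command):
    """Run one pipeline check and report whether it passed."""
    print(f"Running {description}...")
    result = subprocess.run(command)
    return result.returncode == 0

def main():
    # Each trigger evaluates the code in-process; any failure blocks promotion.
    checks = [
        ("unit tests", ["pytest", "-q"]),
        ("static analysis", ["flake8", "."]),
    ]
    failed = [name for name, cmd in checks if not run(name, cmd)]
    if failed:
        print(f"Gate failed: {', '.join(failed)}")
        sys.exit(1)
    print("All checks passed; code is safe to promote.")

if __name__ == "__main__":
    main()
```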

POWERING DATA AND AI IN THE CLOUD

In addition to developer tools, time was spent talking about Azure, data and Databricks. I had the chance to sit down with Databricks CEO Ali Ghodsi to talk about how Azure Databricks is bringing the myriad of data sources together for the enterprise. The combination of Databricks on Azure provides the scale and ecosystem that highlights the power of Databricks to integrate the varied data sources that every enterprise is trying to tap into.
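As an illustration of what bringing those data sources together looks like in practice, here is a small PySpark sketch of the kind of job that runs on a platform like Azure Databricks. The storage paths, column names and metric are hypothetical, not taken from any demo shown at Connect.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("customer-360").getOrCreate()

# Two very different sources: structured CRM extracts and semi-structured clickstream.
crm = spark.read.parquet("abfss://data@example.dfs.core.windows.net/crm/customers/")
clicks = spark.read.json("abfss://data@example.dfs.core.windows.net/web/clickstream/")

# Integrate the sets on a shared key and derive a simple engagement metric.
profile = (
    crm.join(clicks, on="customer_id", how="left")
       .groupBy("customer_id", "segment")
       .count()
       .withColumnRenamed("count", "click_events")
)

profile.write.mode("overwrite").parquet(
    "abfss://data@example.dfs.core.windows.net/curated/customer_profile/"
)
```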

MIND THE DEVELOPER GAP

Developing applications that leverage analytics and AI is incredibly important, but not a trivial task. It often requires a combination of skills and experience to fully appreciate the value that comes from AI. Unfortunately, developers often have neither the data science skills nor the business context needed in today’s world. I spoke with Microsoft’s Corey Sanders after his keynote about how Microsoft is bridging the gap for the developer. Both Sanders and Ghodsi agree that the gap is an issue. However, through the use of increasingly sophisticated tools such as Databricks and Visual Studio, both believe Microsoft is making a serious attempt at bridging it.

It is clear that Microsoft is getting back to its roots and considering the importance of the developer in an enterprise’s digital transformation journey. While there are still many gaps to fill, it is interesting to see how Microsoft is approaching the evolving landscape and complexity that is the enterprise reality.

Business · CIO · Cloud

IBM Interconnect expectations, a CIO’s perspective


This week is IBM’s annual cloud conference in Las Vegas. Quite a bit has changed in the past year for IBM, and at this year’s IBM Interconnect there are a few things I’m looking for. Each of them centers on mainstream enterprise demand. Here’s the quick rundown:

IBM CLOUD CURRENT STATE AND DIRECTION

Over the past several years, IBM made strategic acquisitions that feed directly into its core cloud strategy, including SoftLayer and Blue Box Cloud. Since last year’s Interconnect conference, I’m looking to hear how things have progressed and how that impacts IBM’s direction. Both acquisitions are key to enterprise engagement.

UNDERSTANDING THE IBM CUSTOMER

IBM is well known for catering to their existing customer base. As enterprises evolve, I’m looking for indications of how non-IBM enterprise customers are choosing to engage IBM. Is most of the demand still coming from existing IBM customers? Or have others started to gravitate toward IBM…and why?

In addition, how has the recent partnership announcement with Salesforce changed this engagement? Granted, the ink is still wet on the agreement, but there may be a few tidbits to glean here.

PORTFOLIO HALO EFFECTS

IBM’s Watson provides an interesting opportunity for enterprises looking to engage analytics, machine learning (ML) and artificial intelligence (AI). Watson, along with the strides IBM has made with the Internet of Things (IoT), provides some interesting opportunities for both existing and prospective IBM customers. I’m looking to see whether these are creating a halo effect for IBM’s cloud business…and if so, how and where.

LEADERSHIP CHANGES

Finally, IBM is changing up the leadership team. Longtime IBMer Robert LeBlanc has departed from leading the IBM Cloud division, and changes are afoot in marketing. How will these changes impact how IBM approaches cloud and how IBM is perceived in the broader enterprise market?

 

Overall, IBM is clamoring to be a leader in the enterprise cloud space, but faces some stiff competition. Cloud has been a key element in IBM’s enterprise portfolio for some time. This week should provide greater insights on their current state and path moving forward.

Business · CIO · Cloud · IoT

The five most popular posts of 2016


While 2016 is quickly coming to a close, it offers plenty to reflect on. For the CIO, IT organizations and leaders who work with technology, 2016 offered a glimpse into the future and the cadence at which it arrives. We learned how different industries, behaviors and technologies are impacting business decisions, societal norms and economic drivers.

Looking back on 2016, here is a list of the top-5 posts on AVOA.com.

#5: Understanding the five tiers of IoT core architecture

In this July post, I suggest an architecture to model IoT design and thinking.

#4: Changing the language of IT: 3 things that start with the CIO

This May post attracted a ton of attention from CIOs (and non-CIOs) as part of their transformation journey.

#3: IT transformation is difficult, if not impossible, without cloud

Another May post on the importance of the intersection between transformation and cloud.

#2: Microsoft Azure Stack fills a major gap for enterprise hybrid cloud

One of only two vendor-related posts in the top five, this one digs into the importance of Microsoft’s hybrid cloud play.

And the #1 post…

#1: Is HPE headed toward extinction?

This provocative post looks at business decisions by HPE and how they impact the enterprise buyer.

2017 is already shaping up nicely with plenty of change coming. And with that, I close out 2016 wishing you a very Happy New Year and an even better 2017!

Business · CIO · Cloud · IoT

Three things to look for at Amazon’s upcoming AWS re:Invent conference


As folks in the US prepare for the Thanksgiving holiday, those of us in technology are looking to Amazon’s annual cloud confab, AWS re:Invent, the week after Thanksgiving in Las Vegas. In preparation for the sold-out event, there are a few things to look for.

ENTERPRISE ENGAGEMENT

Amazon has done a good job of attracting the webscale and startup markets. One could go so far as to say that Amazon has cornered these markets. For these folks, the options are wide open to address their specific and scaling requirements. The requirements for enterprises, however, are vastly different from those of their webscale and startup counterparts.

Look for indications that Amazon is starting to learn how to engage enterprises and is moving in that direction. The method, language and solutions vary greatly between a prospective customer with an existing footprint and one looking to build its first.

INTELLIGENCE: AI & MACHINE LEARNING

Amazon already has a Machine Learning solution in their portfolio today. Look for further expansion beyond just tools and into the realm of intelligence. Amazon has done a great job of creating a bevy of infrastructure tools. However, moving into the intelligence space is both necessary and the logical next step in maturing Amazon’s position in this market. Also, look for Amazon’s response to the growing interest in AI solutions. In addition, the combination of AI and Machine Learning is paramount to more mature IoT solutions.

MOVING UP THE STACK

To date, most of Amazon’s portfolio targets the infrastructure end of the stack. There are a few solutions that move up the stack, but not many. Even so, Amazon has done a stellar job with its existing efforts. If, however, Amazon intends to capture more of the enterprise market and move beyond being simply a tool provider, it needs to move up the stack into the application realm. To date, it is not clear whether Amazon has both the capability and the interest to do so.

Across the board, Amazon’s competition may not have the depth in IaaS cloud that Amazon has today. Nor do they have the ecosystem that Amazon has worked hard to build over the past 10 years. However, what they lack in IaaS depth is countered by their breadth up and down the stack. And while they may lack the breadth of features in the IaaS space compared with AWS, each is quickly catching up and posting impressive growth rates.

Next week should provide another exciting event for Amazon and those working in the Cloud space. Whether coming from the startup, webscale or enterprise space, all eyes are on Amazon and their next move.

Cloud · IoT

Understanding the five tiers of IoT core architecture

Internet of Things (IoT) is all the rage today. Just tagging something as belonging to the IoT family brings quite a bit of attention. However, this tagging has also created quite a bit of noise in the industry for organizations trying to sort through how best to leverage IoT. Call it IoT marketing overload. Or IoT-washing.

That being said, just about every single industry can leverage IoT in a meaningful way today. But where does one begin? There are many ways to consider where to start your IoT journey. The first is to understand the basic fundamentals of how IoT solutions are architected. The five tiers of IoT core architecture are: Applications, Analytics, Data, Gateway and Devices. Using this architecture, one can determine where any given IoT solution fits…and the adjacent components required to complete the solution.

THE FIVE TIERS OF IOT CORE ARCHITECTURE

  • DEVICE TIER

The device tier is the physical hardware that collects telemetry (data) about a given situation. Devices can range from small sensors to wearables to large machines. The data itself may be presented in many forms, from raw electrical signals to IP-based data.

The device may also display information (see Application tier).

  • GATEWAY TIER

The sheer number of devices and interconnection options creates a web of complexity in connecting the different devices and their data streams. Those streams may come in forms as diverse as mechanical signals or IP-based data. On the surface, these streams are completely incompatible, yet correlating data requires a common denominator. Hence the need for a gateway to collect and homogenize the streams into manageable data.

  • DATA TIER

The data tier is where data from gateways is collected and managed. Depending on the type of data, different structures may be called for. The management, hygiene and physical storage of data is a whole discipline unto itself, simply due to the four V’s of data (Volume, Variety, Velocity, Veracity).

  • ANALYTICS TIER

Simply managing the sheer amount of data coming from IoT devices creates a significant hurdle when converting data into information. Analytics are used to automate the process for two reasons: manageability and speed. Together, they turn the varied and complex data coming from devices into insights. As the number and types of devices grow and become more complex, so will the demand for analytics.

  • APPLICATION TIER

Applications may come in multiple forms. In many cases, the application is the user interface that takes information coming from the analytics tier and presents it to the user in a meaningful way. In other cases, the application may be an automation routine that interfaces with other applications as part of a larger function.

Interestingly, the application may reside on the device itself (i.e., a wearable).

IoT Architecture

 

Today, many IoT solutions cover one or more of the tiers outlined above. It is important to understand which tiers are covered by any given IoT solution.
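To make the tiers concrete, the sketch below walks a single reading through all five in miniature. The sensor values, thresholds and function names are invented for illustration; a real deployment would split these stages across separate devices and systems.

```python
import json
import statistics

# Device tier: a sensor produces raw telemetry (here, simulated readings).
def read_device():
    return {"sensor_id": "temp-01", "raw_celsius": 23.7}

# Gateway tier: homogenize diverse signals into a common, structured format.
def gateway_normalize(reading):
    return json.dumps({"id": reading["sensor_id"], "value": reading["raw_celsius"], "unit": "C"})

# Data tier: collect and store the normalized records.
data_store = []
def store(record):
    data_store.append(json.loads(record))

# Analytics tier: turn accumulated data into information.
def analyze():
    values = [r["value"] for r in data_store]
    return {"mean": statistics.mean(values), "alert": max(values) > 30}

# Application tier: present the insight (or trigger an automated action).
if __name__ == "__main__":
    for _ in range(5):
        store(gateway_normalize(read_device()))
    print(analyze())
```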

CLOUD-BASED IOT SOLUTIONS

Several major cloud providers are developing IoT solutions that leverage their core cloud offerings. One thing that is great about these solutions is that they help shorten IoT development time by providing fundamental offerings that cover many of the tiers outlined above. Most of the solutions focus on the upper tiers to manage the data coming from devices. Three such platforms are Amazon AWS IoT, IBM Watson IoT and Microsoft Azure IoT Suite. Each of these emphasizes a different suite of ancillary solutions. All three allow a developer to shorten the development time for an IoT solution by eliminating the need to build all five tiers from scratch.
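For a sense of the developer experience, here is a hedged sketch of a device publishing telemetry over MQTT, a transport these platforms commonly accept. The broker hostname, topic and credentials are placeholders, and each provider’s own SDK layers on its specific authentication (certificates, tokens and the like).

```python
import json
import time
import paho.mqtt.client as mqtt  # generic MQTT client; provider SDKs wrap similar calls

BROKER = "iot.example-cloud.com"   # placeholder endpoint for a cloud IoT service
TOPIC = "devices/temp-01/telemetry"

# paho-mqtt 1.x style constructor; newer versions also take a callback API version argument.
client = mqtt.Client(client_id="temp-01")
# Real platforms require TLS and provider-specific credentials before connecting.
client.connect(BROKER, 1883)
client.loop_start()

for reading in (22.4, 22.9, 23.1):
    payload = json.dumps({"sensor_id": "temp-01", "celsius": reading, "ts": time.time()})
    client.publish(TOPIC, payload, qos=1)
    time.sleep(1)

client.loop_stop()
client.disconnect()
```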

THE SECURITY CONUNDRUM

One would be remiss to discuss IoT without mentioning security. The security of devices, data elements and data flows is an issue that needs greater attention today. Instead of a one-off project or add-on solution, security needs to be part of the DNA infused in each tier of a given solution. Based on current solutions, there is still a long way to go on this front.

That being said, IoT has a promising and significant future.

Data · IoT

The intimacy of Internet of Things along with security and privacy

IMG_4092

Internet of Things (IoT) is hot. Really hot! Every single industry, from buildings to healthcare to financial services, is looking at IoT. As more and more organizations look to capitalize on the blistering IoT market, they, and consumers, risk the equivalent of ‘running with scissors’ when it comes to security and privacy.

PERSONALIZING MECHANICAL DATA

IoT presents a number of new sources of data. Some of that data is mechanical and non-personal in nature. A good example is machine data from sensors in a data center. The sensors tell a story about how the data center is performing; they may monitor electrical equipment, temperature, airflow and water consumption.

However, some machine data can be personalized to provide insights into a person’s behaviors. An example in the retail industry might be sensing how many times a person enters a store, the path they take within the store and ultimately what they purchase. In an office building, it may be telemetry around movements of people in and out of the building, use of restrooms, meeting rooms and the like. At home, it might be temperature settings, entering through the garage door, or turning lights on and off.

The fact is: sensors are everywhere, and organizations are starting to correlate the data from those sensors with people’s behaviors. Now, while most have the best intentions in integrating this data (creating better work environments, automating efficiencies, understanding purchase decisions, etc.), it does bring up the question of how the data is used.

THE INTIMACY OF WEARABLES

Adding to the streams of machine data, a large source of IoT data comes from personal devices, including those we wear: wearables. And in some cases, this data is being correlated with machine data to provide even more personal automation.

The data coming from wearable devices is interesting…and personal. How personal? Very personal. Sure, wearable devices can register your exercise, activity patterns and sleep patterns. They can also tell when you have an elevated heart rate or make swift movements. They essentially start to identify patterns…even intimate patterns. Marry this data with other data around location and purchase habits, and you start to see how telling the data streams can be. For example, one could surmise with accuracy what products were purchased before and after certain activities, which in turn could provide a very personal perspective on the person.

Now imagine if that data or those patterns were made public. One starts to see how the privacy concerns about our behaviors (intimate or otherwise) quickly become apparent. That is not to say that we should avoid IoT and wearables.

SHIFTING FROM AN AFTERTHOUGHT TO CORE

The obvious solution is to take care with the data and understand how it is used. As with many technology solutions, the security of the device and the data often comes as an afterthought. Unfortunately, IoT is following in those well-understood footsteps.

Security is often thrown to the wind to avoid constricting innovation and speed to delivery. This is true across the IoT strata, from devices through gateways and all the way to applications. Privacy and security are a recurring subject for IoT, and they are not new challenges for innovation in its infancy. But the level and intensity of interest in privacy is starting to reach a fever pitch as device users begin to consider the implications.

The issues are not limited to wearables either. The same issues exist for corporate IoT solutions that start to understand and react to user behaviors. Even the historically most mundane things, like a building, are starting to develop a personality, one that understands the behaviors of its users. No longer are security and privacy relegated only to newer solutions.

Now is the time to stop thinking of security and privacy as an afterthought. It is possible to infuse both into each step of the process. It requires a change in the culture and the way that solutions are developed. Consumers can help drive these changes through their buying habits: look for solutions that take security and privacy seriously. End users, whether corporate or consumer, have the best opportunity to effect change.