#CIOitk: Are IT organizations headed toward irrelevance?

Several IT executives and CIOs (former and current) turned a Twitter conversation about the ‘platform economy’ into a candid discussion about the core issues facing IT. Is IT really in such a dire downward spiral? And if so, is it too late to course-correct?

Why are these conversations important? IT executives learn from their peers. However, there is a general lack of transparency in the market, not to mention quite a bit of fear, uncertainty and doubt (FUD). The goal of these conversations is to cut through the noise and quickly get to the heart of the issues. Enough of the motherhood and apple pie. It’s time to roll up our sleeves, take a hard look at the situation and dig in.

We assembled a group with first-hand experience on the front lines of IT – CIOs ‘in the know’ (#CIOitk) – to have a candid conversation about the issues. Together, we will get to the bottom of IT’s value moving forward. Does IT really matter…still? The questions may not be immediately obvious, but once identified and answered, they will bring greater impact.


Stuart Appley: CIO of Shorenstein

Tim Crawford: CIO Strategic Advisor of AVOA

Bob Egan: CEO of Sepharim Group

Mark Thiele: EVP Ecosystem Evangelism at Switch


Join us for the LIVE conversation:

#CIOitk: Are IT organizations headed toward irrelevance?

Monday, August 3rd at 10am Pacific Time

YouTube Live Stream: http://avoa.co/1ghcTT5

Twitter Hashtag: #CIOitk


In addition to the Twitter chat, feel free to pose questions here. This is an ongoing conversation, not a one-time chat. Looking forward to continuing the dialog in a variety of ways.

Outages happen. How prepared are you for the next one?

Significant outages hit several major firms today, including United Airlines, the New York Stock Exchange (NYSE) and the Wall Street Journal. And that was just this morning. While many suggest a correlation between the three, I will leave that to the experts.

The point is, outages happen. How prepared you are and how you respond to the next outage is what matters.

DR/BC: Not new and still broken

The bottom line is that disaster recovery and business continuity (DR/BC) are not new and are still broken. IT organizations have cobbled together solutions for decades with limited success. There are shining lights, but they are often startups built on an entirely different culture.

Are you only using backups? How do you know you’re backing up the right information? Is that enough? Generally, the answers are less flattering than most would want.
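If backups are the mainstay, the least an organization can do is verify them continuously. Below is a minimal sketch of that idea, assuming backups land as files in a directory and a manifest of expected paths and checksums exists; the directory, file names and manifest format are illustrative, not a reference to any particular product.

```python
import hashlib
import json
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Compute the SHA-256 digest of a file in streaming fashion."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_backup(backup_dir: str, manifest_file: str) -> list[str]:
    """Compare a backup directory against a manifest of expected
    relative paths and checksums. Returns a list of problems found."""
    # Hypothetical manifest format: {"relative/path": "checksum", ...}
    manifest = json.loads(Path(manifest_file).read_text())
    problems = []
    for rel_path, expected in manifest.items():
        target = Path(backup_dir) / rel_path
        if not target.exists():
            problems.append(f"missing: {rel_path}")
        elif sha256_of(target) != expected:
            problems.append(f"corrupt: {rel_path}")
    return problems

if __name__ == "__main__":
    issues = verify_backup("/backups/nightly", "manifest.json")
    print("backup verified" if not issues else "\n".join(issues))
```

The same principle extends to periodic restore tests; a backup that has never been restored is an assumption, not a safeguard.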

Redundancy doesn’t cut it either. Organizations continue to build redundancy into systems to the point of overcomplicating them. And complication leads to a greater risk of missing a step along the way.

Today, backups and redundancy are not enough. Building to increase the number of 9’s of uptime is not the answer either. Organizations need to go beyond that.
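To make the 9’s concrete, here is the simple arithmetic behind availability targets. Each additional 9 buys a tenfold reduction in allowable downtime, while the complexity and cost required to achieve it typically grow much faster than tenfold.

```python
# Allowed downtime per year for each additional "9" of availability.
MINUTES_PER_YEAR = 365 * 24 * 60

for availability in (0.99, 0.999, 0.9999, 0.99999):
    downtime = (1 - availability) * MINUTES_PER_YEAR
    print(f"{availability:.3%} uptime -> {downtime:,.1f} minutes of downtime per year")
```

Chasing a fifth 9 means engineering for roughly five minutes of downtime per year, which is exactly the kind of overcomplication described above.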

Time to streamline

There are a number of ways this problem can be resolved. DR/BC is a must, but not via traditional means. Investment is needed, and there are clever solutions available today. This is more than just a technology problem. It involves process, organization and culture. Think about DevOps. Think about how to streamline.


Remember: The more complicated the system, the longer it takes to troubleshoot. And the greater the risk to both the company and its customers. How prepared are you for the next outage?

Are the big 5 enterprise IT providers making a comeback?

Not long ago, many would have written off the big five enterprise IT firms as slow, lethargic, expensive and out of touch. Who are the big five? IBM (NYSE: IBM), HP (NYSE: HPQ), Microsoft (NASDAQ: MSFT), Oracle (NYSE: ORCL) and Cisco (NASDAQ: CSCO). Specifically, they are the companies that provide traditional enterprise IT software, hardware and services.

Today, most of the technology innovation is coming from startups, not the large enterprise providers. Over the course of 2015, we have seen two trends pick up momentum: 1) consolidation in the major categories (software, hardware and services) and 2) acquisitions by the big five. Each of them is making huge strides in different ways.

Here’s a quick rundown of the big five.

IBM guns for the developer

Knowing that the developer is the starting point of the application development process, IBM is shifting gears toward solutions that address the new developer. Just look at the past 18 months alone.

  • February 2014: The Dev@Pulse conference showed a mix of COBOL developers alongside promotion of Bluemix. The attendees didn’t resemble those of your typical developer conference. More details here.
  • April 2014: The Impact conference celebrated 50 years of the mainframe. Impact also highlighted the SoftLayer acquisition and the integration of mobile and cloud.
  • October 2014: The Insight conference went further to bring cloud, data and Bluemix into the fold.
  • February 2015: InterConnect combined a couple of previous conferences into one. IBM continued the drive with cloud, SoftLayer and Bluemix while adding their Open Source contributions, specifically around OpenStack.

SoftLayer (cloud), Watson (analytics) and Bluemix are strengths in the IBM portfolio. And now, with IBM’s recent acquisition of Blue Box and partnership with Box, it doesn’t appear they are letting up on the gas. Add their work with Open Source software and it creates an interesting mix.

There are still significant gaps for IBM to fill. However, the message from IBM supports their strengths in cloud, analytics and the developer. This is key for the enterprise both today and tomorrow.

HP’s cloudy outlook

HP has long had a diverse portfolio that addresses the needs of the enterprise today and into the future. Of the big five providers, HP’s portfolio is one of the best matched to those enterprise needs.

  • Infrastructure: HP’s portfolio of converged infrastructure and components is solid. Really solid. Much of it is geared for the traditional enterprise. One curious point is that their server components span the enterprise and service provider markets. However, their storage products squarely target the enterprise to the exclusion of the service providers. You can read more here.
  • Software: I have long felt that HP’s software group has a good bead on the industry trends. They have a strong portfolio of data analytics tools with Vertica, Autonomy and HAVEn (being rebranded). HP’s march to support the Idea Economy is backed up by the solutions they’re putting in place. You can read more here.
  • Cloud: I have said that HP’s cloud strategy is an enigma. Unfortunately, discussions with the HP Cloud team at Discover this month further cemented that perspective. There is quite a bit of hard work being done by the Helion team, but the results are less clear. HP’s cloud strategy is directly tied to OpenStack, and their contributions to the project support this move.

HP will need to move beyond operating in silos and support a more integrated approach that mirrors the needs of their customers. While HP Infrastructure and Software are humming along, Helion cloud will need a renewed focus to gain relevance and mass adoption.

Microsoft’s race to lose

Above all other players, Microsoft still has the broadest and deepest relationships across the enterprise market today. Granted, many of those relationships are built upon their productivity apps, desktop and server operating systems, and core applications (Exchange, SQL Server, etc.). There is no denying that Microsoft probably has relationships with more organizations than any of the others.

Since Microsoft Office 365 hit its stride, enterprises are starting to take a second look at Azure and Microsoft’s cloud-based offerings. This still leaves a number of gaps for Microsoft, specifically around data analytics and open standards. Moving to open standards will require a significant cultural shift for Microsoft. Data analytics could come through the acquisition of a strong player in the space.

Oracle’s comprehensive cloud

Oracle has long been seen as a strong player in the enterprise space. Unlike many other players that provide the building blocks to support enterprise applications, Oracle provides the blocks and the business applications.

One of Oracle’s key challenges is that its solutions are heavy and costly. As enterprises moved to a consumption-based model by leveraging cloud, Oracle found itself flat-footed. Over the past year or so, Oracle has worked to change that position with their cloud-based offerings.

On Monday, Executive Chairman, CTO and co-founder Larry Ellison presented Oracle’s latest update in their race for the enterprise cloud business. Oracle is now providing the cloud building blocks from top to bottom (SaaS, PaaS, IaaS). The message is strong: Oracle is out to support both the developer and the business user through their transformation.

Oracle’s strong push to go after the entire cloud stack should not go unnoticed. In Q4 alone, Oracle cloud cleared $426M. That is a massive number. Even if they did a poor job of delivering solutions, one cannot deny the sheer size of an opportunity that overshadows others.

Cisco’s shift to software

Cisco has long been the darling of the IT infrastructure and operations world. Their challenge has been to create a separation between hardware and software while advancing their position beyond the infrastructure realm.

In general, networking technology is one of the least advanced areas when compared with compute and storage infrastructure. As cloud and speed become the new mantra, networking becomes more important than ever.

As the industry moves to integrate both infrastructure and developers, Cisco will need to make a similar shift. Their work in SDN with ACI, along with their thought-leadership pieces, is making significant inroads with enterprises.

Summing it all up

Each is approaching the problem in its own way with varying degrees of success. The bottom line is that each of them is making significant strides to remain relevant and support tomorrow’s enterprise. Equally important is how quickly they’re making the shift.

If you’re a startup, you will want to take note. No longer are these folks in your dust; they are your potential exit strategy.

It will be interesting to watch how each evolves over the next 6-12 months. Yes, that is a very short timeframe, but it echoes the speed at which the industry is evolving.

Containers in the Enterprise

Containers are all the rage right now, but are they ready for enterprise consumption? It depends on whom you ask, but here’s my take. Enterprises should absolutely be considering container architectures as part of their strategy…but there are some considerations before heading down the path.

Container conferences

Talking with attendees at Docker’s DockerCon conference and Red Hat’s Summit this week, you hear a number of proponents and live enterprise users. For those not familiar with containers, the fundamental concept is a fully encapsulated environment that supports application services. Containers should not be confused with virtualization. In addition, containers are not to be confused with microservices, which can leverage containers but do not require them.

A quick rundown

Here are some quick points:

  • Ecosystem: I’ve written before about the importance of a new technology’s ecosystem here. In the case of containers, the ecosystem is rich and building quickly.
  • Architecture: Containers allow applications to be broken apart into smaller components. Each of the components can then spin up and down and scale as needed. Of course, automation and orchestration come into play.
  • Automation/Orchestration: Unlike typical enterprise applications that are installed once and run 24×7, the best container architectures spin instances up and down and scale as needed. Realistically, the only way to do this efficiently is with automation and orchestration (see the sketch after this list).
  • Security: There is quite a bit of concern about container security. With potentially thousands or tens of thousands of containers running, a compromise could have significant consequences. If containers are architected to be ephemeral, the risk footprint shrinks dramatically.
  • DevOps: Container-based architectures can run without a DevOps approach, but with limited success. DevOps brings a different methodology that works hand-in-hand with containers.
  • Management: There are concerns that the short lifespan of a container creates challenges for audit trails. Using traditional audit approaches, this would be true. Newer methods provide real-time audit capability.
  • Stability: The $64k question: Are containers stable enough for enterprise use? Absolutely! The reality is that legacy applications would not move directly to containers. Only applications that are significantly modified or rewritten would leverage containers. New applications are able to leverage containers without increasing risk.
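To illustrate the automation and ephemerality points, here is a minimal sketch using the Docker SDK for Python. It is not a production orchestrator; the image and command are placeholders, and a real deployment would hand this job to a scheduler. The point is the lifecycle: the container exists only for the duration of its task.

```python
# Minimal sketch of an ephemeral container lifecycle.
# Requires the Docker SDK for Python: pip install docker
import docker

client = docker.from_env()

def run_ephemeral_job(image: str, command: str) -> str:
    """Spin up a container, run a single task, then tear it down.
    Nothing persists afterward, which keeps the risk footprint small."""
    container = client.containers.run(image, command, detach=True)
    try:
        container.wait()                  # block until the task finishes
        return container.logs().decode() # capture output before teardown
    finally:
        container.remove(force=True)     # no lingering state left behind

if __name__ == "__main__":
    # Placeholder image and command, for illustration only.
    print(run_ephemeral_job("alpine:3", "echo 'task complete'"))
```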

Cloud-First, Container-First

Companies are looking to move faster and faster. In order to do so, the problem needs to be broken into smaller components. As those smaller components become microservices (vs. large monolithic applications), containers start to make sense.

Containers represent an elegant way to leverage smaller building blocks. Some have equated containers to the Lego building blocks of enterprise application architecture. The days of large, monolithic enterprise applications are past. Today’s applications may be complex in sum, but they are a culmination of much smaller building blocks. These smaller blocks provide the nimbleness and speed that enterprises are clamoring for today.
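As a sketch of what one such building block looks like, here is a hypothetical single-purpose service using only the Python standard library. The endpoint and data are made up; the point is the shape: one small, stateless responsibility that can be containerized, scaled and replaced independently.

```python
# A deliberately tiny, single-purpose service: one endpoint, no shared state.
# Each "Lego block" like this can be containerized and scaled on its own.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class PriceQuoteHandler(BaseHTTPRequestHandler):
    """Hypothetical microservice that does exactly one thing: quote a price."""

    def do_GET(self):
        sku = self.path.lstrip("/")
        body = json.dumps({"sku": sku, "price": 9.99}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), PriceQuoteHandler).serve_forever()
```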

Containers are more than Technology

Containers alone are not enough for success; other components are needed. Containers represent the technology building blocks. Culture and process are needed to support the change in technology. DevOps provides the fluid that lubricates the integration of the three components.

Changing the perspective

As with any of the newer technologies coming, other aspects of the IT organization must change too. Whether you are a CIO, IT leader, developer or member of the operations team, the very fundamentals of how we function must change in order to truly embrace and adopt these newer methodologies.

Containers are ready for the enterprise…if the other aspects are considered as well.

HP Software takes on the Idea Economy

The Idea Economy is being touted pretty heavily here at HP Discover, starting with CEO Meg Whitman’s keynote. Paul Muller (@xthestreams), VP of Strategic Marketing in HP Software, took us on a journey of how HP Software is thinking about solving today’s problems and preparing for the future state. Unlike many of the other presentations, the journey is just as important as the projects. It helps organizations, partners, customers and providers align their vision and understand how best to respond to the changing business climate.

Non-digital natives look at new technology one way, while millennials approach it in a completely fresh way; the combination creates a bit of a challenge. Millennials often create and support disruption. That is quite a different approach from their non-digital-native counterparts. According to HP’s Muller, a full “25% of organizations will fail to make it to the next stage through disruption.” If you’re an existing legacy enterprise, how do you embrace the Idea Economy while at the same time running existing systems? This presents a serious and very real challenge for any established enterprise today.

Muller then turned the conversation to ‘bi-modal IT’ as a potential answer to the problem. Bi-modal IT is also discussed as ‘hybrid IT’ or two-speed IT, addressing the differences between running existing core systems while innovating with new products and services. In addition to the technology challenges, bi-modal IT creates a number of other challenges that involve process and people. Side note: Look for an upcoming HP Discover Performance Weekly episode we just recorded on the subject of bi-modal IT with Paul Muller and Paul Chapman, CIO of HP Software. In the episode, we take a deeper dive from a number of perspectives.

HP Software looks at five areas that people need to focus on:

  1. Service Broker & Builder: Recognize that the problem is no longer a buy vs. build question. Today, both are needed.
  2. Speed: The speed at which a company innovates by turning an idea into software is key. Most companies are just terrible at this process. DevOps plays a key role in improving the situation.
  3. BigData & Connected Intelligence: Understand the differences between what customers ask for vs. what they use. BigData can provide insights here.
  4. User Experience: What is the digital experience across platforms and functions?
  5. Security: Securing digital assets is key. 33% of successful break-ins have been related to a vulnerability that had been known for two years (Stuxnet, for example).

HP leverages their Application Lifecycle Management process to address each of these five areas with data playing a fundamental role.

There was some discussion about the maturity cycle of companies regarding BigData. Trends show that companies start by experimenting with data outside the enterprise, in the cloud. The data used is not sensitive or regulated. When it’s time to move into production, the function is brought back in-house. The next step in the maturity cycle is moving production BigData functions back out into the cloud. Very few folks are doing this today, but it is the emerging trend.

And finally a core pain point that is still…still not managed well by companies: Backup and Disaster Recovery. This is nothing new, but an area ripe for disruption.

Overall, it was refreshing to hear more about the thought leadership that goes into the HP Software machine rather than a rundown of products and services.

HP Discover Executive Storage Discussion

HP kicked off their coffee talk series with executives from the HP Storage group. Manish Goel is the new SVP & GM of HP Storage, replacing David Scott. While Manish has only been with HP for two months, he has a firm grasp on where the HP Storage group is headed. Manish was formerly with NetApp and understands the shifts in the storage marketplace today.

One of the core premises from the HP Storage group is to drive the cost of storage down, specifically in the flash category. HP is still sticking with both its 3PAR and StoreVirtual strategies. Both solutions are squarely geared toward the enterprise market, with a myriad of models to suit the varied requirements.

In addition, HP’s approach is to address the storage framework with three tiers:

  1. Infrastructure and Hardware: These are the core building blocks.
  2. Infrastructure Technology: Backup and other core storage services.
  3. Storage Management: This is where HP OneView comes in to provide a single pane of glass view across storage.

Flash is a hot topic in the storage communities. HP’s objective is to lower the cost of all-flash arrays. One statistic supporting this move: one out of every three arrays that HP ships is an all-flash array. But will flash continue the march down the cost path and eventually replace tape? Doubtful in the near term. Many of the cost comparisons take place between flash storage and conventional spindles.

When asked about HP’s strategy to provide storage solutions to the service provider market, the conversation changes a bit. It’s clear that HP is focused on the on-premises enterprise market. Storage support for the service provider market will come via the HP Helion Cloud solutions.

On the subject of data services, the storage team is fully engaged with OpenStack, REST APIs and data integration. This is one area to watch as HP Storage moves forward strategically.

What to watch for at HP Discover this week

This week marks HP’s annual Discover conference in Las Vegas. HP has come a long way in the past couple of years, and this year should prove interesting in a number of ways. Here is a list of items to watch over the coming couple of days:

Announcements: There are a couple of significant announcements planned this week. While the announcements themselves are interesting, the long-term impact should prove a more interesting opportunity for HP’s strategy post-split. Watch the keynotes Tuesday and Wednesday for more details.

Split Update: News about the HP split into two companies is not new. Look for more details on the progress of the split and what it means for each of the two entities. On the surface and through a number of ‘hallway conversations’ I’ve had, it seems that the split is bringing greater focus to the enterprise teams. This is good for HP and for customers.

Software: The HP Software team is a large and diverse bunch. The areas I’m particularly interested in are the progress around HAVEn, Vertica and Autonomy. Initial conversations point to some really interesting progress for customers. As BigData, analytics and data (in general) become front-and-center for organizations, look for this area to explode. We have only scratched the surface; more opportunities abound. I’m looking at ways HP is educating customers on the value opportunities in a way they can consume. While there are themes, we are moving to a ‘market of one’.

Cloud: The HP Helion Cloud team has a number of things happening at the conference. I’m particularly interested in the progress they’ve made around commercial offerings of OpenStack and private cloud. Overall, cloud adoption is still very anemic (not just for HP). I’m looking for ways HP is creating the onramps to cloud to reduce apprehension and increase adoption rates. Many of the challenges extend beyond the technology itself. Look for ways HP is engaging customers in new and different ways. In addition, watch for changes in how the solutions are shifting from supporting enterprises directly to supporting service providers. Bridging the gap here is key, and the needs are very different.

Infrastructure: Many enterprise customers still maintain a large infrastructure presence. Even if their strategy is to shift toward a cloud-first methodology, there are reasons to support internal infrastructure. Look for ways HP is evolving their infrastructure offerings to support today’s enterprise along with its evolution to a cloud-first model. As the sophistication of data increases, so will storage solutions to meet the ever-changing requirements. Similarly, it will be interesting to watch how solutions like Software Defined Networking (SDN) address networking complexity.

Wild Cards: There are a number of wild cards to watch for as well. The first is DevOps. DevOps is critical to the IT organization both today and moving forward, and it applies differently to different orgs. Watch for the subject to be addressed in keynotes. The second wild card is an update from HP Labs. HP Labs has a number of really interesting…and innovative solutions in the works. Look for an update on where things stand and how HP sees innovation changing.

Finally, I have a number of video interviews scheduled over the next couple of days where I dive deeper into each of these areas. Plus, I will cover an update on the state of the CIO. Look for links to those either via the #HPDiscover hashtag or on the blog after the show.

As always, feel free to comment and join the conversation on Twitter. The hashtag to follow is: #HPDiscover

IBM and Weather Company deal is the tip of the iceberg for cloud, data and IoT

Technology, and how we consume it, is changing faster than we realize. Need proof? Just look at last night’s announcement between IBM and Weather Company. It was just a short 4.5 months ago that I sat in the Amazon AWS re:Invent keynote on Nov 13, 2014, listening to Weather Company’s EVP, CTO & CIO Bryson Koehler discuss how his company was leveraging Amazon’s AWS to change the game. After the keynote, I had the opportunity to chat with Bryson a bit. It was clear at the time that while Amazon was a key enabler for Weather Company, they could only go so far.

The problem statement

Weather Company is a combination of organizations that brings together a phenomenal amount of data from a myriad of sources. Not all of the sources are sophisticated weather stations. Bryson mentioned that Weather Company is “using data to help consumers gain confidence.” Weather Company uses a number of platforms to deliver weather results, including the Weather Channel, weather.com and Weather Underground. Weather Underground is their early testbed for new methods and tools.

Weather Company produces 15 billion forecasts every day. Those forecasts come from billions of sensors across the globe. The forecasts for 2.2 million locations are updated every four hours, with billions more updated every 15 minutes. The timeliness and accuracy of their forecasts is what ultimately builds consumer confidence.
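A quick back-of-the-envelope calculation puts that volume in perspective, using only the figures above:

```python
# Back-of-the-envelope math on the forecast volume cited above.
forecasts_per_day = 15_000_000_000
seconds_per_day = 24 * 60 * 60

rate = forecasts_per_day / seconds_per_day
print(f"{rate:,.0f} forecasts per second")  # ~173,611 per second, sustained
```

Sustaining roughly 174,000 forecasts per second, around the clock, is the kind of workload that makes the cloud-plus-analytics pairing discussed below a necessity rather than a convenience.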

Timing

The sheer number of devices makes Weather Company a perfect use case for the Internet of Things (IoT), powered by cloud, data and analytics. Others may start to see parallels between what Weather Company is doing and their own industry. In today’s competitive market, the speed and accuracy of information is key.

IBM’s strategy demonstrated leadership in the cloud and data/analytics space with their SoftLayer and Watson solutions. Add in the Bluemix platform and the connection between these solutions becomes clear. Moving to IoT was the next logical step in the strategy.

Ecosystem Play

The combination of SoftLayer, Bluemix and Watson…plus IoT was no accident. Considering the direction companies are taking by moving up the stack to the data integration points, IoT is the next logical step. IoT presents the new driver that cloud and data/analytics enable. Bluemix becomes the glue that ties it all together for developers.

The ecosystem play is key. Ecosystems are everything. Companies are no longer buying point solutions. They are buying into ecosystems that deliver direct business value. In the case of Weather Company, the combination of IBM’s ecosystem and portfolio provides key opportunities to produce a viable solution.

Next Steps…

That being said, the move by IBM & Weather Company should not be seen as a one-off. We should expect to see more enterprises make moves like this toward broader ecosystems like IBM’s.

Acquisitions are key for enterprise providers, but strategies are killing innovation

Over the course of a career, employees may take part in one or two acquisitions. Over the course of my career, I was part of the core team for more than a dozen acquisitions and mergers. Access to such a high volume of transactions, from both sides (acquirer and acquired), provides a unique view of the process. And those numbers do not count the deals where due diligence was performed but the transaction ultimately did not proceed. Over the course of each transaction, the good, the bad and the ugly take on a whole new form.

It is no secret that large enterprise providers need a healthy funnel of acquisition targets that lead to deals in order to remain relevant. Smaller startups are driving toward being the next acquisition target. That part of the story is going pretty well. What happens next is where the train comes off the rails.

Acquisition vs. internal development

If acquisitions are so problematic, why bother? There are several schools of thought here. One is that internal development may provide a more specific solution to support the business strategy. The challenge is that the company would need to start several initiatives to determine which one will ultimately pan out. This approach takes time, resources and money, and it introduces risk. Of five different research projects, maybe only one makes it. The other four go nowhere.

The alternative is to let startups develop the solutions and ‘test’ the market. Once a solution is proven to succeed, the company can swoop in and acquire it. That seems like a great approach, but it is fraught with risk, a premium on the acquisition price and, ultimately, potential failure.

The process of acquisition

Acquisitions are hard. An acquisition is a form of courtship at a corporate level. There is a sort of dance that companies go through during the courting, due diligence and acquisition process. Each tries to entice the other while trying to fully understand the other party. And like relationships, it may look good on paper, but without good chemistry, the entire thing falls apart.

Goliath learning from David

In most strategies, the acquiring company is the larger and more established of the two. The larger firm looks at ways to ‘integrate’ the smaller one into their collective and portfolio. There are good reasons to take this approach: scaling, established processes, a larger customer footprint, etc. For the smaller acquisition target, these are all enticing traits to gain immediate access to.

The problem is when the smaller organization is made to operate just like the larger one. When the processes and (often) bureaucracy of the larger organization descend on the smaller organization, the innovation process starts to grind to a halt. Staff and customers that were once used to an agile, innovative company are now working with a larger, process-driven organization.

Goliath is taking over David. However, the process needs to work in the other direction. Goliath needs to learn from David to evolve and stay relevant. Yes, this is possible! In my past, we evaluated learning opportunities from the acquisition target. How could we evolve our company to leverage components as part of the process? Essentially, how do we leverage the best of both worlds?

Changing the acquisition strategy

Is it possible to avoid killing innovation through acquisition? Absolutely! It does not matter whether the acquisition strategy is portfolio-driven, gaining market share or customers, acquiring technology or an acqui-hire. All of them can leverage a Goliath-learning-from-David approach.

If the process is not changed, innovation will continue to develop outside the established enterprise providers, get acquired, then stagnate or die. This sequence is playing out over and over again with little change.

There is so much innovation happening right now in the technology industry outside of the established enterprise providers. It would be a shame to see it flourish, only to stagnate without fully realizing its potential. By changing the acquisition strategy, innovation has the opportunity to fully develop and mature in a way customers are clamoring for.

If you are the acquirer, consider how to learn from David. If you are the acquisition target, consider how you could help Goliath evolve. If both sides work together, the impact and success of the acquisition will only increase.

Uncertainty is the only certainty in technology today

Last week was spent at the IBM InterConnect and Green Data Center conferences, in Las Vegas and San Diego respectively. At each of the conferences, there were a ton of great conversations around the CIO, cloud computing, social media, big data analytics and data centers. While more details will come out in future posts, a common theme became crystal clear. We are squarely in a period of extreme disruption, and no amount of Dramamine will settle the tides. The needs of the many far outweigh the needs of one, two, or three.

The power of social media

Social media plays a central role in gathering our collective thoughts and banter. The conversations that ensue further innovation through new ideas and critiques. However, today we are only scratching the surface with social. Many of the ‘conversations’ happening across social media are one-way conversations, usually sharing information but with little interaction. The vast majority of tweets coming from the conferences are either a promotion or a sound bite overheard during a session or conversation. In addition, there is quite a bit of ‘noise’ that contributes to the confusion. If one were to try to follow the threads, they would appear an eclectic mix of varied thoughts taken from some complex juxtaposition. A better approach is needed to improve the level of two-way engagement.

Cloud, the great equalizer

Cloud is very similar to social in terms of missed opportunities. Cloud presents the single largest opportunity for organizations today, regardless of size. At the InterConnect conference, cloud was at the forefront of many discussions. The challenge many had was how to effectively embrace and leverage cloud. Those challenges tie back to a gap between the freeway and the on-ramps. We do not need more freeways; we need more on-ramps. Yet we continue to build new freeways.

Is it possible that cloud has gotten too far ahead of itself? One of the many discussions was that of the speed of innovation versus adoption. Is it possible we have reached a point where we are actually innovating too quickly without fully considering the ramifications? There is more to be written on this issue alone.

Understanding the customer

Ironically, much of this may come back to understanding the customer. For the vendor or provider, it is understanding who is buying (or should buy) the solution and why. It is about shifting from a transactional sale to a consultative one. That is easier said than done, as context is required to do so.

Enterprises are not immune from the confusion. According to a recent IBM survey of CEOs, 31% doubt that c-suite executives understand the changes coming from customers and the marketplace. That is a huge number when looking across the entire c-suite. If the same question were asked of the CIO specifically, the number would most likely increase. That is not a good position, considering the role technology plays in the customer relationship today.

Changes in paradigms

The chasm may simply tie back to a difference in understanding evolution. The customer base is moving very quickly. For the past decade, the number of digital natives in the workplace has only increased. And they are having a strong influence on other generations. They are more familiar with technology and more comfortable with rapid adoption. Yet the solutions we deliver leave them wanting.

Understanding the root of discomfort

And so the problem comes full circle. As with any problem, it is important to understand the root of the issue. When I discuss this in detail with IT leaders and staff members, the root issue comes back to uncertainty: uncertainty about the solutions themselves, nerves about job loss and doubt about the general path forward.

Forging a path ahead

Change is hard. Change is confusing. Change brings stress and burnout. And at the edge, it kills. Think that is a bit dramatic? Just read my friend John Willis’ moving post about Karojisatsu.

But change is not something we should fear. At this point, we must stick together and drive hard toward the future. Our very future depends on our ability to adapt and change.

For the foreseeable future, the tech industry will continue to present confusion and uncertainty. Our ability to adapt and accept uncertainty is directly related to our ultimate success.

Originally posted @ Gigaom Research 3/2/2015

http://research.gigaom.com/2015/03/uncertainty-is-the-only-certainty-in-technology-today/