IBM Dev@Pulse Conference Impressions

The Dev@Pulse conference is IBM’s developer conference that tags onto the core IBM Pulse conference. This is IBM’s first year holding a developer-centric conference (Dev@Pulse) alongside its flagship cloud conference (IBM Pulse). Dev@Pulse is about as far away physically as you can get from the main conference and is held in a nightclub rather than a conference center. While Hakkasan LV Nightclub is a trendy choice for a developer summit, attendance seemed thin and drawn largely from the existing IBM user base.

In talking with folks at the event, it seems IBM is trying new things like lightning talks, demos and a playground to woo developers. There are several labs to choose from too. Yet in talking with devs prior to the event, many of them had not heard of Pulse, let alone Dev@Pulse. This seems like a missed opportunity on IBM’s part to attract new developers to the IBM ecosystem. After attending several sessions at Dev@Pulse, I’d say that it has the makings of a decent developer conference.

A good example was IBM’s technical rundown of the just-announced IBM BlueMix. The session went soup-to-nuts on deploying an app with BlueMix and served as a solid introduction to how the new service works. However, only 50 or so folks were in attendance for the talk here at Dev@Pulse. One presenter even asked the audience how many were COBOL programmers, and several raised their hands. That is a very different crowd from the typical Node.js, Rails or Python crowd.

On the other hand, IBM is identifying the need to bridge the gap between cloud-based apps and traditional apps. It also notes that the challenge lies in extensively leveraging APIs. The API economy is a reality…not just at IBM but across the industry today. That’s pretty forward thinking (and refreshing) for a traditional enterprise organization like IBM. The question will be how to get that thinking to permeate the rest of the organization beyond just BlueMix and SoftLayer.

Trying to attract developers to a conference like IBM Pulse or Dev@Pulse will be hard to do in isolation. There are a number of other factors, in terms of location, audience, social impact and content, that IBM needs to address in order to make significant, sustained waves.

IBM Pulse Conference Impressions

IBM’s cloud conference is taking place this week in Las Vegas at the MGM Convention Center. There are two core parts to the conference: the main Pulse general conference and the Dev@Pulse conference, which specifically targets developers. The two conferences are…and feel…completely separate from one another.

IBM Pulse Core
The main IBM Pulse conference is IBM’s rebranding of its past Tivoli conference into a cloud conference. Pulse 2014 is branded as ‘The Premier Cloud Conference’. I’m not sure I would go that far, though in one sense it is more than a cloud conference: IBM is bringing much more of its software portfolio to bear. Having attended, keynoted and presented at a number of cloud conferences around the globe, I find this one takes on quite a different vibe. It’s far more subdued and less energetic than its cloud rivals. That could stem from IBM’s past culture creeping forward. But there are many trying to shed the IBM of old for a fresh and modern perspective. Just talking with folks around the room, it’s clear that individuals want to turn the corner, but it’s hard to turn a battleship in a bathtub. Changing culture is hard.

The presentations have been a mix of lukewarm depth and buzzword bingo. While the right things are being said and the cloud-centric focus is decent, the impact IBM has for its customers remains unclear. This seems to be a pretty significant gap for IBM to fill in the coming months. One area IBM needs to address is how to attract new developers and customers. Looking around the room, half of the gentlemen are in suits while the other half are simply jacket-less. You rarely see jeans, tee shirts or hoodies walking the floor. This speaks more to IBM’s current user base than to the new millennials who are the real x-factor in cloud development today.

And speaking of developers, IBM’s Dev@Pulse developer conference is happening in tandem with IBM Pulse. More on Dev@Pulse and IBM Pulse perspectives coming soon!

Who are the biggest cloud providers?

Cloud computing has taken off like wildfire. Even so, there is a real discrepancy between who the largest cloud company is perceived to be and who it actually is. Is perception reality? One could argue that it is. And the perception today is that Amazon (NASDAQ:AMZN) is the largest. The reality, however, is far different.

Amazon by the Numbers

Today, Amazon’s market cap is $180 billion. In fiscal 2012, Amazon reported $61 billion in revenue. In November 2013, Alex Williams (@alexwilliams) of TechCrunch reported that the AWS (Amazon Web Services) portion is “…widely believed to be a $3.5 billion business.” That is a substantial business on its own, but it pales in comparison with the revenue from the rest of the Amazon portfolio, namely amazon.com. Meaning, AWS may account for less than 6% of Amazon’s total revenue.
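The arithmetic behind that figure is simple enough to check. A quick sketch using the numbers above (both of which are press estimates, not official disclosures):

    # Back-of-the-envelope: AWS as a share of total Amazon revenue.
    # Both figures are the estimates cited above, not official disclosures.
    amazon_revenue_b = 61.0  # fiscal 2012 total revenue, in $ billions
    aws_revenue_b = 3.5      # widely reported AWS estimate, in $ billions

    share = aws_revenue_b / amazon_revenue_b
    print(f"AWS share of Amazon revenue: {share:.1%}")  # -> 5.7%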

Largest Cloud Computing Providers

Now let’s look at the larger, publicly traded companies that focus solely on cloud computing:

Market Cap (as of Jan 24, 2014)

Google (NASDAQ:GOOG) = $379 billion

Salesforce (NYSE:CRM) = $36 billion

LinkedIn (NYSE:LNKD) = $25 billion

Workday (NYSE:WDAY) = $16 billion

NetSuite (NYSE:N) = $8 billion

ServiceNow (NYSE:NOW) = $8 billion

Concur (NASDAQ:CNQR) = $6 billion

Now, these are closer to pure-play cloud providers, meaning their only business is cloud-based services. And they don’t represent two other groups: diversified public companies and privately held companies.

Diversified Cloud Companies

While cloud computing may not be their primary source of revenue, one cannot exclude this group from the mix. Some of the larger companies include:

Microsoft (NASDAQ:MSFT) = $309 billion -> Office 365, Dynamics, Azure

Oracle (NYSE:ORCL) = $168 billion -> Cloud

SAP (NYSE:SAP) = $92 billion -> Cloud

IBM (NYSE:IBM) = $196 billion -> SoftLayer

HP (NYSE:HPQ) = $55 billion -> Cloud

Verizon (NYSE:VZ) = $137 billion -> Terremark

Rackspace (NYSE:RAX) = $5 billion -> Cloud

Cloud revenue alone from each of these companies can easily reach into the billions of dollars.

Privately Held Cloud Companies

Aside from publicly traded companies, there are a number of privately held companies, some of which are startups. Of late, companies such as Box and Dropbox have received valuations of $2 billion and $8 billion, respectively. And this is before either has gone public.

Perception vs. Reality

The point is, Amazon’s AWS may garner the 800-pound-gorilla perception, but there are a number of other viable, and larger, cloud providers in the market today. And this doesn’t account for the up-and-coming providers that could give Amazon some healthy competition.

Are Enterprises Prepared for the Data Tsunami?

Companies are in for a major change in how they operate, manage and leverage data in the coming years. Data is quickly becoming the new currency, and leading businesses are looking for ways to capitalize on this change.

The Data Deluge Problem

A recent IDC report on data suggests the sheer amount of data generated doubles every two years. By the year 2020, the total amount of data will equate to 40,000 exabytes, or 40 trillion gigabytes. To put that in perspective, that’s more than 5,200 gigabytes for every man, woman and child in 2020.
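As a sanity check on that per-capita figure, here is a minimal sketch; the ~7.6 billion projected 2020 world population is my assumption, not a number taken from the IDC report:

    # Rough check of the per-capita data figure for 2020.
    total_data_gb = 40_000 * 1e9  # 40,000 exabytes, expressed in gigabytes
    population_2020 = 7.6e9       # assumed projected world population in 2020

    gb_per_person = total_data_gb / population_2020
    print(f"{gb_per_person:,.0f} GB per person")  # -> ~5,263 GB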

However, the problem is not the data itself. The problem rests with knowing what to do with the data, and how. To complicate matters, much of the data generated comes from new sources such as wearable devices, mobile devices, social media and machine data.

Social Data Streams

The impact from social media is significant on its own:

Twitter: 400 million tweets per day

Facebook: 4.75 billion content items shared per day

According to one whitepaper, Facebook currently houses more than 250 petabytes of data, with 0.5 petabytes of new data arriving every day. And Facebook and Twitter represent only two of the more popular social data sources; there are many more.

IoT and Machine Data

A relatively recent source of data is the Internet of Things (IoT). IoT represents a collection of uniquely identifiable items, each of which generates its own sets of data. Data may also come in the form of ‘machine data’, or industrial data, which is generated through the use of equipment.

For example, GE’s GEnx next-generation turbofan engines, found on Boeing 787 and 747-8 aircraft, contain some 5,000 data points that are analyzed every second. Put that into perspective: according to Wipro research, a single cross-country flight across the United States generates 240TB of data, and the average Boeing 737 engine generates 10 terabytes every 30 minutes of flight.

A bit of math makes the problem fairly apparent. Using data from MIT’s Airline Data Project and the total number of Boeing 787s in use as of December 31, 2013:

Total data generated every day by the global 787 fleet in operation today:

(20TB/hr x 9hr avg operation per day) x 2 engines x 114 787 aircraft = 41,040 terabytes (or 40 petabytes)

For Southwest Airlines alone, the data challenge is more significant:

Total data generated every day by Southwest Airlines’ fleet of 607 Boeing 737 aircraft:

(20TB/hr x 10.8hr avg operation per day) x 2 engines x 607 737 aircraft = 262,224 terabytes (or 256 petabytes)
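A minimal sketch that reproduces these back-of-the-envelope figures; the per-engine rate (10TB every 30 minutes, or 20TB per hour) and the fleet assumptions come straight from the sources above, and the petabyte conversions use binary (1,024TB) units:

    # Reproduces the fleet-level figures above. The per-engine rate and
    # fleet assumptions are the cited estimates, not measured values.
    TB_PER_ENGINE_HOUR = 20  # 10TB every 30 minutes of flight

    def fleet_tb_per_day(hours_per_day, engines, aircraft):
        """Daily data volume, in terabytes, for an entire fleet."""
        return TB_PER_ENGINE_HOUR * hours_per_day * engines * aircraft

    global_787 = fleet_tb_per_day(hours_per_day=9, engines=2, aircraft=114)
    swa_737 = fleet_tb_per_day(hours_per_day=10.8, engines=2, aircraft=607)

    print(f"Global 787 fleet: {global_787:,.0f} TB/day (~{global_787 / 1024:.0f} PB)")
    print(f"Southwest 737s:   {swa_737:,.0f} TB/day (~{swa_737 / 1024:.0f} PB)")
    # -> 41,040 TB/day (~40 PB) and 262,224 TB/day (~256 PB)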

256 petabytes is a lot of data. GE included a couple more examples of industrial data in “The Case for an Industrial Big Data Platform.” From these examples, the sheer volume of data coming from IoT and machine sources is readily apparent. And these examples only highlight a small, specific use case that does not take into account other aspects of the airline industry.

Not All Data is Equal

Unlike traditional enterprise data, which is structured in nature, these new sources of data come in many forms and are typically unstructured. This presents a challenge to traditional data warehouses, which are accustomed to consuming and managing structured data.

When thinking about how to ‘consume’ these new sources of data, several key considerations reside with the data itself. Much of the data, in essence, has a half-life that drives its value over time. An important consideration is which data to keep and for how long. The challenge is knowing now what data might be needed in the future. That is easier said than done.
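One way to make the half-life notion concrete is to score a data set’s value as an exponential decay and weigh retention cost against it. A toy sketch, with purely hypothetical half-life numbers:

    # Toy model: data value decays exponentially with age.
    # The half-life values below are hypothetical, for illustration only.
    def data_value(initial_value, age_days, half_life_days):
        """Decayed value of a data set after age_days."""
        return initial_value * 0.5 ** (age_days / half_life_days)

    # A clickstream log may lose value in days; a lab result, over years.
    print(round(data_value(100, age_days=30, half_life_days=7), 1))    # -> 5.1
    print(round(data_value(100, age_days=30, half_life_days=365), 1))  # -> 94.5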

The default action taken by many enterprises today is to simply keep all data, which is costly just for the storage needed to house it. Unfortunately, this is leading to ‘data landfills’ of mixed data with varying degrees of value. As the volume of data increases, so will the landfills, unless a different approach is taken.

The Holy Grail of Data Correlation

In addition to stockpiling data, the real value for many will come in the form of correlation. Leveraging one data stream provides valuable insight. But when that stream is paired or correlated with multiple other streams, a much clearer picture emerges.

Think of the value to a company when it can compare social data, operational data and transactional data. For many, marrying these data streams presents multiple challenges. Now imagine that the number of streams (sources) and the volume of data are both increasing. It becomes clear how the problem gets complicated quickly.
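To make the correlation point concrete, here is a toy sketch joining three hypothetical streams on a shared customer key (every name and value below is invented):

    import pandas as pd

    # Three hypothetical streams keyed by customer. Alone, each tells a
    # partial story; joined, a pattern (e.g., churn risk) starts to emerge.
    social = pd.DataFrame({"customer_id": [1, 2], "sentiment": [0.8, -0.3]})
    operational = pd.DataFrame({"customer_id": [1, 2], "support_tickets": [0, 4]})
    transactions = pd.DataFrame({"customer_id": [1, 2], "monthly_spend": [120, 45]})

    combined = (social.merge(operational, on="customer_id")
                      .merge(transactions, on="customer_id"))
    print(combined)  # customer 2: negative sentiment + tickets + low spend

Real pipelines also face entity resolution, schema drift and scale problems that this sketch ignores, which is exactly why the difficulty compounds as the streams multiply.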

Consumer vs. Corporate

Given the increase in consumer adoption of devices and services over the past few years, it is clear that consumers are ready to generate more data. Enterprises need to prepare for the oncoming onslaught.

As consumers, we want enterprises to succeed in leveraging the data we provide. Take healthcare, for example. Imagine if healthcare providers could correlate data between lab results, pharmacy data, claims data and social media streams. The outcome might be pre-emptive diagnosis based on trends of epidemics and illness across the globe. In addition, the results would be highly personalized and would lower the overall cost of healthcare. Done well, this would deliver significant economic and social improvements.

The Data Driven Economy

In summary, the onslaught of data is both concerning and exciting at the same time. The information generated from that data presents major opportunities across industries, from providing greater work efficiency to saving lives. Business as a whole is becoming even more reliant on information, and therefore data-driven. Data ultimately provides greater insight, personalization and accuracy for business decisions.

It is important for enterprises to quickly evaluate new methods for data consumption and management. The success or failure of companies may very well reside in their ability to address the data tsunami.

Top 5 Posts of 2013

Over the course of 2013, I wrote a number of posts about CIOs, Cloud Computing, Big Data, Data Centers and IT in general. Here are the five most popular posts of 2013:

5. Time to get on the Colocation Train Before it is Too Late

In the number 5 spot is a post addressing the forthcoming challenges to the data center colocation market and how the ripple effect hits IT.

4. A Workload is Not a Workload, is Not a Workload

Number 4 is a post written in 2012 about the discrepancy between cloud computing case studies. Not all workloads are the same and many of the examples used do not represent the masses.

3. The IT Role in Value Creation is Not a Technology

The number 3 spot goes to a post that addresses the direction of IT organizations within the business and how it is evolving. It is this very evolution that is both very difficult and very exciting at the same time.

2. Motivation And Work Ethics: Passion Fuels the Engine

Another post from 2012 takes the number 2 spot, which shows that some subjects (like the importance of passion) have staying power. This post addresses important characteristics for a leader to consider: the intersection of passion, work ethic and motivation.

1. What is Your Cloud Exit Strategy?

The number 1 spot goes to probably the most controversially titled post of the year. It addresses the challenges faced with cloud when one doesn’t think about their end-state and evolution.

Honorable Mention: So Which Is It? Airplane Mode or Turn Devices Completely Off?

Back in April 2012, I was traveling and noticed that many didn’t turn off their devices even though they were instructed to…which prompted the post. Even though the FAA has since changed its rules in the US, this post still gets quite a bit of attention.

Rise of the CDO…do you need one?

The CIO (Chief Information Officer) and CMO (Chief Marketing Officer) roles are in a state of flux and starting to evolve significantly. In the past year, the role of CDO, or Chief Data Officer, started picking up steam. 2014 will bring the CDO role to prominence alongside the CIO and CMO roles. But what exactly is a CDO, and why is it needed now? And where is the role headed, especially in light of the changes to the CIO and CMO roles?

THE CDO (CHIEF DATA OFFICER) ROLE

First, the CDO role is not new. Enterprises have had people filling it for several years now. The specific function of the role, however, varies greatly…and continues to do so. Organizations tag the CDO with functions ranging from ensuring regulatory compliance to IT system integration. With the advent of Big Data, companies are now looking to the CDO to manage the huge influx of data.

Late last year, Gartner published results from a recent study offering five facts about Chief Data Officers. Namely:

  1. There are over 100 CDOs today (double the number from 2012).
  2. Most reside in Banking, Government and Insurance industries.
  3. 65% reside in the US.
  4. Over 25% of those are in New York or DC.
  5. Over 25% are women (almost twice that of CIOs).

WHAT SHOULD A CDO DO AND WHY NOW?

The sheer volume of data, along with the variety of data types and sources, is growing exponentially. The methods used by traditional marketing and IT organizations are simply inadequate to keep up. A new perspective is needed. In addition, the CMO and CIO are fully bogged down with existing challenges to streamline and evolve their organizations.

Data is simply too valuable for a company to wait for existing organizations to evolve; there is a half-life to the value of data. This is where the CDO comes in. The CDO can take a business-centric approach to data without being inhibited by existing challenges. Data spans multiple facets, from marketing to product development to customer service and beyond. By leveraging a separate organization that is not biased by the priorities of either the IT or marketing organization, companies can tap the true value of their data more readily.

Managing data is less about how to store it on physical drives and more about correlating data points and trends.

CDO DIRECTION

There was a time when I felt the CDO function should be part of the CIO’s role. I still feel that way, but it will take time. As outlined above, the CDO role needs to be autonomous from the CIO and CMO roles for now. However, as each of those roles evolves, so will the CDO role. More specifically, as the CIO role evolves, so will the CDO role.

As IT organizations shift from technology-centric to business-centric, their role with data also evolves. The prevalence of data as a business driver presents a unique challenge and opportunity for IT. Stronger companies will seize this opportunity through the evolution of their CIO and collaboration with the CDO function.

IN SUMMARY

Over time, the CIO and CDO roles will become far more interrelated and eventually merge into a single CIO role. That will take time, however. In the meantime, every company should be considering who is responsible for truly managing (and leveraging) its data for business intelligence and growth opportunities. In many cases, that may be the CDO.

CIO Predictions for 2014

This year, I thought I would shift the focus from cloud-specific predictions to the broader agenda for CIOs. But before I jump into my predictions for 2014, let’s take a trip down memory lane and review how I did on my Cloud Predictions for 2013.

How did I do?

  1. Rise of the Cloud Verticals: We have seen an uptick in ‘cloud brokers’ but very little in the way of cloud verticals targeting specific industries or suites of services. There has been a feeble attempt at integration between solutions, but even that was lukewarm at best in 2013. (1/2pt)
  2. Widespread Planning of IaaS Migrations: Spot on with this one! Over 2013, the number of IT organizations planning IaaS migrations stepped up in a big way. That’s great news for the IT organization, the business units and the industry as a whole. It demonstrates progress along the maturity continuum. (1pt)
  3. CIOs Look to Cloud to Catapult IT Transformation: This has been a mixed bag. Many have leveraged cloud because they were forced into it rather than seeing it as one of the most significant opportunities of our time. There are exceptions, but they are not yet prominent. (1/2pt)
  4. Mobile Increases Intensity of Cloud Adoption: Mobile is taking off like wildfire. And cloud is enabling the progress, as traditional methods would simply be too challenging and slow. (1pt)
  5. Cloud Innovation Shifts from New Solutions to Integration & Consolidation: Over 2013, the number of new solutions progressed at a fever pitch. The good indicator is that new solutions are taking into account the need to integrate with other solutions in their ecosystem. While consolidation among cloud providers started to pick up in the 2nd half of 2013, I expect it to increase into 2014. (1pt)

Total Score: 4/5

Overall, slower general adoption of cloud paired with strong adoption of specific cloud solutions led to 2013’s progress. I had hoped to see us further along…but alas, 2014 is shaping up to be a very interesting year.

What to look for in 2014?

  1. Cloud Consolidation: Look for plenty of M&A activity as larger incumbents gobble up cloud point solutions. Also look for incumbents to flesh out their ecosystems more fully.
  2. CIOs Focus on Data: Conversations move beyond the next bell or whistle and onto the one thing that really changes the economic landscape for a company: data. Look for the CIO to shift focus to data and away from infrastructure.
  3. Colocation is in Vogue: As the CIO moves up the maturity model toward higher-value functions, look for IT organizations to move to colocation in droves. The challenge will be moving before it’s too late.
  4. CIO, CMO + Other Execs Become Best Friends: We’ve talked for some time about how the CIO strives for a ‘seat at the table’. The challenge is in how to be a relevant participant at the table. As the CIO role shifts from support org to business driver, look for the relationships to change too.
  5. One Size Does NOT Fit All: As we talk about newer technologies, CIOs, IT organizations, vendors and service providers get realistic about where their products/services fit best…and where they don’t. OpenStack and HP Moonshot are great examples of awesome solutions that fit this statement.

As I’ve said before, this has got to be the best time to work in Information Technology. How will you embrace and leverage change? Here’s to an awesome 2014!

Simple and Standard…or Custom and Complex

Over the years, IT has had the ability to customize the heck out of applications. The industry even enabled this addiction to feature creep. Vendors asked what new button, bell or whistle customers wanted and then delivered what they could. Customization became a hallmark of IT trying to please the customer and meet their ever-changing requirements.

Custom configurations led to the ability to do more and increased the value of the application or service to the user. But as the number of customizations increased, so did the level of complexity. Eventually, that very flexibility starts to work against the value of the customizations themselves.

There is another nasty side effect of customization: it creates a form of lock-in. Essentially, the further a solution is customized, the more unique it becomes and the harder it is to move to an alternative. The customizations create such a unique solution that alternatives struggle to compete…unless they offer the exact same features, functionality and customization options.

In the end, significant customization is really only possible with applications that are run internally. When moving to a shared or cloud environment, the level of possible customization drops precipitously. For many, this presents a significant hurdle to moving to a new model like cloud (i.e., SaaS).

The question really comes down to: What is the true value of the customizations? Do they provide more value than they cost? And is this really what customers want? Here at HP Discover in Barcelona, this very issue became a hot button of discussion. Ultimately, the outcome was that customers want simple and standard over custom and complex. There is a difference between want, need and should.

Bottom Line: In IT, we’ve had the opportunity to customize the heck out of applications. Why? Because we could, and we truly believed it was valuable. That may have been the case in the past, but today it is about business value. And there are larger considerations (like alternatives, agility and choice) that play a more significant role in our decisions.

HP Discover Barcelona: What to Watch For

Today kicks off HP’s Discover conference in Barcelona, Spain, with a bevy of information on tap. Looking over the event guide, it is clear that HP is targeting the enterprise customer with an emphasis on Cloud Computing, Data (including Big Data) and Converged Infrastructure. HP’s definition of ‘converged infrastructure’ spans many of its core infrastructure components.

With an emphasis on cloud and data, HP is really targeting the future direction of technology, not just traditional IT. HP is a large company, and it can take a bit of work to evolve the thinking from traditional IT to transformational IT. It is good to see the changes.

Of note is the expansion of the data story beyond just Big Data. For many, the focus continues to rest on Big Data, yet for many enterprises, data expands well beyond it. Look for more information, beyond the existing NASCAR example, on both the breadth and depth of HP’s data portfolio. In addition, there are sessions that provide a deep dive specifically for HAVEn partners. It is good to see HP consider the importance of its partner program.

The core areas of printing and mobility are also making an appearance here at Discover. However, their presence pales in comparison with the big three themes above.

So, what to look for… With cloud and data, the keys for HP rest with how well it enables adoption. How easy does HP make it for customers to adopt new technologies? Adoption is key to success. With converged infrastructure, has the story of integration moved beyond a reference architecture and single-SKU approach? Look for more details on how far HP has come in developing its portfolio, along with its execution of the integration between the different solutions. This integration and execution are key.

Time to get on the Colocation Train Before it is Too Late

The data center industry is heading toward an inflection point that will have a significant impact on enterprises. It seems many aren’t looking far enough ahead, yet the timeline appears to be 12-18 months, which is not that far out! The issue is a classic supply chain problem: supply, demand and timelines.

A CHANGE IN THE WINDS

First, let’s start with a bit of background… The advent of Cloud Computing and newer technologies is driving an increase in the number of enterprises looking to ‘get out of the data center business.’ I, along with others, have presented many times on the ‘Death of the Data Center.’ The data center, which used to serve as a strategic weapon in an enterprise IT org’s arsenal, is still very much critical, but it is fundamentally becoming a commodity. That’s not to say that overall data center services are becoming a commodity, but the facility is. Other factors, such as geographic footprint, network and ecosystem, are becoming the real differentiators. And enterprises ‘in the know’ realize they can’t compete at the same level as today’s commercial data center facility providers.

THE TWO FLAVORS OF COLOCATION

Commercial data center providers offer two basic models of data center services: wholesale and retail. Digital Realty and DuPont Fabros are examples of major wholesale data center providers, while Equinix, Switch, IO, Savvis and QTS are examples of major retail colocation providers. It should be noted that some providers offer both wholesale and retail options. While there is a huge difference between wholesale and retail colocation space, I will leave the details of why an enterprise might consider one over the other for another post.

DATA CENTER SUPPLY, DEMAND AND TIMELINES

The problem is the same for both types of data center space: there is a bit of surplus today, but there won’t be enough capacity in the near term. Data center providers are adding capacity around the globe, but they’re caught in a conundrum of how much capacity to build. It typically takes anywhere from two to four years to build a new data center and bring it online. And the demand isn’t there yet to support significant growth.

But if you read the tea leaves, the demand is getting ready to pop. Many folks are only now starting to consider their options with cloud and other services. So why are data center providers not building data centers now in preparation for the pop? There are two reasons. On the supply side, it costs a significant amount of capital to build a data center today, and an idle data center burns significant operational expenses too. On the demand side, enterprises are just starting to evaluate colocation options. Evaluating is different from being ready to commit spending on colocation services.

Complicating matters further, even for the most aggressive enterprises, the preparation can take months and the migrations can be years in the making. Moving a data center is not a trivial exercise and is often peppered with significant risk. There are applications, legacy requirements, third-party providers, connections, depreciation schedules, architectures, and organization, process and governance changes to consider…just to name a few. In addition to the technical challenges, organizations and applications are typically not geared up to handle multi-day outages and moves of this nature. Ponder this: When was the last time your IT team moved a critical business application from one location to another? What about multiple applications? The reality is, it just doesn’t happen often…if at all.

But just because it’s hard does not mean it should not be done. In this case, it needs to be done. At this point, every organization on the planet should have a plan for colocation and/or cloud. Of course there are exceptions and corner cases, but today they are few and shrinking.

COMPLIANCE AND REGULATORY CONCERNS

Those with compliance and regulatory requirements are moving too…and not just their non-production or Disaster Recovery systems. Financial Services organizations are already moving their core banking systems into colocation, while Healthcare organizations are moving their Electronic Health Record (EHR) and Electronic Medical Record (EMR) systems into colocation…and in some cases, the cloud. This is in addition to core legacy and greenfield applications. Compliance and regulatory requirements are an additional component to consider, not a reason to stop moving.

TIME CHANGES DATA CENTER THINKING

Just five years ago, a discussion of moving to colocation or cloud would have been far more challenging to have. Today, we are starting to see this migration happen, but only among a very small number of IT organizations around the globe. We need to significantly increase the number of folks planning and migrating.

DATA CENTER ELASTICITY

On the downside, even if an enterprise started building its data center strategy and roadmap today, it is unclear whether adequate capacity will exist to meet the demand once it is ready to move. Now, that’s not to say the sky is falling. But it does suggest that enterprises (en masse) need to get on the ball and start planning for the death of the data center (their own). At a minimum, doing so would provide data center providers with greater visibility into the impending demand and timeline. In the best scenario, it creates a healthy supply/demand ecosystem without the rubber-band effect of supply and demand fluctuating toward equilibrium.

BUILDING A ROADMAP

The process starts with a vision and understanding of what is truly strategic. Recall that vitally important and strategic can be two different things. Power is vitally important to data centers, but data center providers are not building power plants next to each one.

The next step is building a roadmap that supports the vision. The roadmap includes more than just technological advancements; the biggest initial hurdles will come in the form of organization and process. In addition, a strong visionary and leader will provide the right combination of skills to lead the effort and ask the right questions to achieve success.

Part of the roadmap will inevitably include an evaluation of colocation providers. Before you start down this path, it is important to understand the differences between wholesale and retail colocation providers, what they offer and what your responsibilities are. That last point is often lost in the evaluation process.

Truly understand what your requirements are. Space, power and bandwidth just scratch the surface. Take a holistic view of your environment and portfolio, and understand what will change, and how, when moving to colocation. This is as much a clear snapshot of your current situation as it is a picture of where you’re headed over time.

TIME TO GET MOVING

Moving into colocation is a great first step for many enterprises. It gets them ‘out of the data center business’ while keeping their existing portfolio intact. Colocation also provides a great way to move the maturity of an organization (and portfolio) toward cloud.

The evaluation process for colocation services is much different today from just five years ago. Today, some of the key differentiators are geographic coverage, network and ecosystem. But a stern warning: the criteria for each enterprise will be different and unique. What applies to one does not necessarily apply to the next. It’s important to clearly understand this and how each provider matches up against your requirements.

The process takes time and effort. For this and a number of other reasons, it may take months or even years, even for the most aggressive movers. As such, it is best to get started sooner rather than later, before the train leaves the station.

Further Reading:

Applying Cloud Computing in the Enterprise

Cloud Application Matrix

A Workload is Not a Workload, is Not a Workload