Let the Kids Define our Technology Roadmap

[Image: 2010 CeBIT Technology Fair]

I think we all recognise that technology is evolving at an exponential rate. Ten years ago we could see and track innovation in years: the new Nokia phone or 15K hard drives would be anticipated for months, and you could prepare yourself well in advance. If you were a technologist in a business, you had time to warm up the CFO or budget holder, and you could work across the stakeholders to get them onside, so that by the time the product was launched you had the whole company champing at the bit. However, because this took so long, other technologies had often come out in the meantime that either impacted the anticipated performance boost or features, or introduced an incompatibility issue, which meant you now had to make another choice.

The problem today is that new products and features are being released in much shorter timeframes, so you no longer have time to work the stakeholders, and if you did, the product or feature you are promoting would be out of date by the time you got it agreed and deployed. And it’s great to have a five-year plan, but who can really forecast which technologies are going to be hot in five years? Sure, you can throw the biggies at it: Cloud, Big Data, Mobility. But what does that really mean?

How many times have you heard this: “We should use cloud to enable our business, analyse the data on the cloud, then publish the results to a mobile-enabled workforce”? An insightful statement, but the reality is that our kids are already using cloud on their smartphones and seeing dashboards of their likes and status, so of course business will adopt these things.

So let’s think about that. What does a typical teenager today expect from their IT experience?

  • Being online all the time, through multiple devices
  • Access to app stores for instant application purchase and provisioning
  • Collaborative working, with data shared easily and securely
  • Working from anywhere, with a consistent experience
  • Multitasking and application integration
  • Everything in the cloud

I would imagine that if you looked at most CIO, CTO and CEO strategies you would see a comparable list, maybe with more business jargon thrown in just to help maintain the illusion, but the reality is that the younger generations are already doing this and more today. And those teenagers, in five, ten or fifteen years’ time, are going to expect these things to be in place when they enter the workplace.

An example of this was recently demonstrated in our home. I tried to explain to my daughter that the cloud was actually a physical building somewhere, with servers, storage and networks hosting and providing the apps and data she uses on her phone and tablet. She looked at me, said “I know”, and pointed me to the `Wikipedia App` on her smartphone. It was at this point I realised that daddy was no longer the fountain of all knowledge.

So what do we need to do? First, watch the younger generation and how they operate with IT. They are not fussed by the actual devices (OK, I might bow down to the Apple brand a bit on that one); mainly they want the user experience to be easy, flexible and real-time. If you wish to experiment with your own kids, remove a device, then remove two, and see how they adapt. If you wish to be really cruel, shut down the WiFi in your house. This may have two effects: either they do not talk to you for a few hours, or they find another WiFi access point outside your control and reconnect (and they still may not talk to you). In most cases the kids will continue to work and operate, maybe in a new location on a different device, but they are working.

So next time you develop a roadmap, or think about the technology strategy for your business, consider spending time looking at how the next generation of employees would wish to work and operate; it may help you think about the direction your business needs to take.

Next Generation IT: Considerations and Conclusion Part 6 of 6

Other Considerations

Although we have covered the four main areas of Next Generation IT solutions, there are three other key elements to consider: Open Source, Mobility and As a Service. Let’s tackle each of these to help complete the picture.

In the past, Open Source was viewed as something that looked interesting, but when it came to mainstream production, commercial applications won the battle in terms of support and ownership. Since the commercial products came with support and maintenance contracts, we had the proverbial “throat to choke” if something went wrong. However, the open source world started to make its mark over 12 years ago with the introduction of the Linux operating system. Although it took 4-5 years to establish, we would now consider Linux one of the preferred operating systems for data centres and application hosting. If we leapfrog to today, we are seeing a range of open source products hitting the market and being considered for key production-based workloads. One of the main areas is Big Data and the introduction of Hadoop, developed out of Yahoo and matured through the Internet service providers. This open source product has revolutionised business analytics. With open source products like Hadoop, the risk of feeling disconnected from the developer / product owner or having no real support framework is now mitigated by vendors providing third-party consulting support for your implementations. So you have the look and feel of a commercial product with the flexibility and resources of a crowdsource-developed open source product.

Mobility is a key feature today for any application. How you push data and information to your workforce is critical to productivity. At the same time, collecting data from the mobile workforce is beneficial to business operations. Enabling people to have access to systems securely and quickly means that the workforce can always be online and can operate at, or close to, 100% of their in-office productivity. As business applications enable the mobile workforce to access sales data, ERP and CRM systems, we should also consider pushing information about system operations and threat analysis so that events can be handled pro-actively versus reactively.

As a Service can be nicely aligned to cloud delivery models. The reason for raising the As a Service factor is its valuable purchase model and innovation potential. Traditional purchase models are good for businesses with large Capex budgets, but these are few and far between nowadays. And even the Capex-rich businesses are still looking to spend wisely and have a more predictable commercial model. The As a Service model allows businesses to buy defined services at an agreed-upon unit rate, charged on a consumption or allocation basis, typically with minimum volume and time commitments. Once over the minimum levels, organisations can flex their usage up and down to meet the peaks and troughs of the business. An example is retailers who need to scale up their online ordering systems during the holiday season. With a traditional Capex model the retail organisation would have to purchase IT systems to handle the highest utilisation rate; therefore, during quiet times (i.e., non-holiday seasons) the systems would be underutilised. The As a Service model frees up funds for the organisation to spend on new, innovative solutions that drive the business forward rather than maintain the status quo.
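To make that commercial model concrete, here is a minimal, illustrative Java sketch of consumption-based billing with a minimum commitment. The unit rate and minimum volume are invented figures for the example and do not reflect any particular provider's pricing.

```java
public class ConsumptionBillingSketch {

    // Illustrative contract terms only (hypothetical figures).
    static final double UNIT_RATE = 0.12;      // agreed price per unit consumed, e.g. per VM-hour
    static final int MINIMUM_UNITS = 10_000;   // contractual minimum volume per month

    static double monthlyCharge(int unitsConsumed) {
        // The bill is based on whichever is higher: actual consumption or the committed minimum.
        return Math.max(unitsConsumed, MINIMUM_UNITS) * UNIT_RATE;
    }

    public static void main(String[] args) {
        System.out.println("Quiet month  (8,000 units):  " + monthlyCharge(8_000));   // pays the minimum
        System.out.println("Holiday peak (25,000 units): " + monthlyCharge(25_000));  // flexes up with demand
    }
}
```

In quiet months the business pays the committed floor; in peak months the charge flexes up with actual consumption, which is the behaviour the retail example above relies on.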

Conclusion: Hybrid Stacks and the Art of the Possible

There is no single solution stack that will address all your needs. Understanding this will allow you to think about each layer and what is needed to provide the right hosting platform, the right security and management services, and the right application delivery and development frameworks to meet the needs of the business process or question you are trying to resolve. The fact is, you will have a hybrid solution stack that combines public and private cloud solutions. Where possible, migrating to new, more agile platforms will provide future proofing and enable easier integration with other solutions. This makes good business sense, as every business must focus on maximizing the value of its applications and data, whether held internally or externally.

We started out by asking: What is Next Generation IT? Next Generation IT may be the latest buzz word, but what is new today is old tomorrow. That said, we can define Next Generation IT by focusing on some key areas:

  • The adoption of Cloud technologies and services is pivotal to Next Generation IT, whether for infrastructure, platform or business application services.
  • Cyber Security is always a threat. Ensure that the solutions and services you buy or build provide adequate levels of protection for your business and your clients.
  • To help businesses make better decisions, the ability to mine and query a wide variety of Big Data is critical to achieving better insight into business operations and direction.
  • Mobility should be a consideration across your application landscape, enabling the workforce and client base to operate from any location and feel connected to the business. This is essential in today’s world.
  • In order to achieve these business gains, enterprises must move forward with Application Modernisation, which should be treated as a driver of business change.

When taking on this journey, work with system integrators and service providers who can work with confidence across public and private cloud services, are able to operate from the Business Process layer to the Infrastructure layer, and can consider the service management and security wrappers that are needed. As open source products mature, consider them as a way to avoid vendor lock-in, which is key to having a more flexible and agile future. Above all, talk to your business not about the restrictions of legacy ball-and-chain infrastructure but about the art of the possible with Next Generation IT solutions.

Link to Part 1

Next Generation IT: Application Part 3 of 6

Application

We could simply break down the Application layer into industry-specific and cross-industry applications and be done with it. However, we need to ensure that applications support Big Data, Business Continuity and Mobility. This includes common APIs and protocols for easy integration. For this, consider the data and its relevance to the business.

One of the key challenges in the Application layer is that you may end up with application sprawl; depending on the size of the organisation and how long it has been operating, there is a good chance you will have multiple applications performing similar if not duplicate tasks. This happens in large global organisations and presents big challenges to CIOs and CTOs who are trying to both consolidate applications and create a unified organisation.

Taking stock of your entire application landscape is key. It is typically not easy to retire applications, as you normally find that one or two business units depend on them and their productivity would stop. Just introducing new applications and asking people to start using them perpetuates the application sprawl; you end up with new and old apps, and data integration becomes people copying and pasting data from one application to the other. This is hardly productive, causes problems with data accuracy and consistency, and is a burden on the employees.

As you review your application landscape, the key concept to understand for Next Generation solutions is Application Modernisation: the practice of taking your legacy application estate and transitioning it to a new application and new platform, or upgrading to the latest versions to provide the features and functionality that businesses are expecting. The move could be small or large depending on your starting point and the end state you want to achieve. Many enterprises are looking to cloudify their apps, giving them a new platform and commercial framework.

However, we can now start to consider some of the various delivery mechanisms that can help us be more agile and improve our time to market. Let’s start with Cloud Apps, a key enabler in the Next Generation solution set. Although we typically think of Amazon and Google, there are many vendors and products in the enterprise cloud application space. Look at the success of cloud applications like Salesforce; five years ago we would have run for the hills at the thought of hosting our sales and other proprietary data on a public cloud.

A key focus now for CIOs and CTOs is how to migrate their legacy apps to the new cloud-enabled solutions. This can be an expensive but valuable exercise, as we see the maturity and coverage of cloud applications becoming the norm for a majority of businesses. This will provide a good stepping stone for future-proofing your estate and taking advantage of new development and delivery processes like DevOps, which enable rapid development and roll-out of applications and code updates in a seamless, low-risk way, making change the norm versus the exception. Anyone who uses Google or Amazon apps today knows that updates to their applications are rolled out without incident, new features or bug fixes are deployed continuously during the day, and no Change Control ticket or Service Outage notice is created. CIOs and CTOs want their business applications to inherit these principles that are rooted in the consumer space.
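One common pattern behind this style of continuous rollout is the feature flag: new code ships to production switched off and is enabled without a redeployment or outage window. The Java sketch below is a minimal, hypothetical illustration of the idea; the flag name and the hard-coded flag store are invented for the example, and a real system would read its flags from a configuration service.

```java
import java.util.Map;

public class FeatureFlagSketch {

    // Hypothetical flag store; in practice this would be fetched from a configuration service at runtime.
    static final Map<String, Boolean> FLAGS = Map.of("new-checkout-flow", false);

    static String checkout() {
        // The new code path is already deployed but dark; flipping the flag releases it without an outage.
        if (FLAGS.getOrDefault("new-checkout-flow", false)) {
            return "new checkout flow";
        }
        return "legacy checkout flow";
    }

    public static void main(String[] args) {
        System.out.println(checkout());
    }
}
```

Because the old path stays in place until the flag is flipped, a problem can be handled by turning the flag off rather than rolling back a deployment, which is what makes change routine rather than exceptional.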

Link to Part 4 Platform

Next Generation IT: Infrastructure Part 5 of 6

Infrastructure

Infrastructure is the concrete physical foundation of any IT service. Don’t be fooled by the word “cloud.” Behind every cloud is a data centre with servers, storage devices and network gear. In the past we would take clients around data centres and show off shiny boxes and flashing LEDs. A lot of hardware vendors even made design decisions based on how sexy their product looked. Today you are less likely to walk around a data centre. Google and Amazon are good examples of cloud providers who would rather not discuss their infrastructure or data centres, though they have invested hundreds of millions of dollars to provide a global data centre footprint. So easy, right? Build one big data centre (two if you want redundancy), put all your applications and data into it, and job done.

Unfortunately, it’s not that easy; a combination of data regulations, regional restrictions and speed of access are some of the key considerations. This is why you see the cloud providers standing up more cloud data centres across the globe to handle these requirements. Your business may well be in the same position, and therefore you will end up with a dispersed infrastructure footprint.

Acknowledging that we need good infrastructure, what are the key considerations? Much of it comes down to securely leveraging the resources you have. Let’s start with the data centre itself. This is a major investment, and the cost of running and maintaining these facilities must be considered. Using physically segregated infrastructure within the facility can provide added security, ensuring there is no chance of data bleed between applications, business units or clients. If you provide disaster recovery services, you typically need another data centre, suitably connected and managed.

Much focus in the data centre is on network connectivity. In today’s connected world, data no longer needs to flow just within the traditional Intranet networks. In fact, Intranet is becoming a thing of the past. Now we are simply connected; we need to connect to applications and data sources from both internal and external locations. The network should support secure and resilient connections with integrated secure VPNs, firewalls, intrusion detection systems and high availability configurations to ensure services are available in the event of an issue or outage.

In terms of compute and storage solutions, there has been a move from traditional server and storage infrastructure — a “build-it-yourself” mind-set — to the new converged infrastructure, which has pre-packaged server, storage and network products in an integrated stack with predefined sizing and growth options. This can be an accelerator, as these converged infrastructures are pretty much ready to go and can be deployed like an appliance, versus the traditional months of negotiating with vendors, arguing the values of preferred products, and delay in knitting all this together in the data centre. So with converged infrastructures, job done.

Well, not quite. The issue with converged infrastructures is that they come at a price. Typically the products used are enterprise grade, designed not to fail, and have the support and backing of major vendors. Today, these elements are being challenged by the application space, with new apps that are self-healing and able to operate at web scale. Therefore, all the resilience and high-end products in the Infrastructure layer are just adding cost to the architecture. Hadoop is a prime example of an open source product that is designed to be built on commodity hardware products; if a server fails, you throw it away and put in a new one. The cluster reforms and off you go again. As we look at email and other business applications, there are more of these cluster-based solutions that are challenging the infrastructure to keep up and meet the cost-to-function needs of the business.
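To illustrate the commodity-hardware point, the sketch below shows, under assumed settings, how an HDFS client can request that each file block be replicated across three data nodes. The cluster address and file paths are hypothetical, and the snippet assumes the standard Hadoop client libraries are available on the classpath.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsReplicationSketch {
    public static void main(String[] args) throws Exception {
        // Hypothetical cluster address; each block written by this client is copied to three nodes.
        Configuration conf = new Configuration();
        conf.set("fs.defaultFS", "hdfs://namenode.example.com:8020");
        conf.set("dfs.replication", "3");

        try (FileSystem fs = FileSystem.get(conf)) {
            // Copy a local file into the cluster; HDFS spreads the replicated blocks across data nodes.
            fs.copyFromLocalFile(new Path("/tmp/sales.csv"), new Path("/data/sales.csv"));
        }
    }
}
```

With three copies of every block, the loss of a single commodity server triggers re-replication rather than data loss, which is what removes the need for high-end, failure-proof hardware in the Infrastructure layer.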

If we also consider that not all applications need to be hosted in your own data centre, you start getting into hybrid solutions. Although you may wish to host your critical production applications and data in your controlled facilities, this typically means that those parts of your business get caught up in the change control and restrictions typically imposed in the data centre. However, less sensitive environments like development and testing can be hosted outside your facilities. If you use cloud-based services, you can typically improve response times and decrease your time to market.

Link to Part 6 Other Considerations and Conclusion

Next Generation IT: What is it, and How Do I Do It – Part 1 of 6

Next Generation IT: What Is It, and How Do I Do It?

Abstract

What does “Next Generation IT” really mean? This paper takes a holistic view of the various layers and components that comprise Next Generation IT and guides CIOs and CTOs on what to look for in modernising their applications and creating Next Generation IT solutions. While the layers and components are important in and of themselves, the real value comes from integrating all the pieces without technology bias.

Keywords: Big Data, Business Continuity, Mobility, DevOps, Service Management, Security, Orchestration, Open Source, As a Service, Modernisation, Business Process, Application, Platform, Infrastructure

————————–

What is Next Generation IT?

Next Generation IT solutions are becoming buzz words in the IT arena. Your business has to be considering Next Generation, or you’re going to be Old Generation, and that will never do. But what is Next Generation IT, and how does it fit into my existing legacy IT estate? I can’t just rip and replace everything I have installed over the last 15 years; my CFO will have a heart attack! But every time my CEO meets with analysts or our vendor partners, all are energised to improve time to market, reduce cost, or improve operational efficiency by simply deploying Next Generation IT.

So what is Next Generation IT? What does it consist of, how can it be placed within your IT estate, and how will your business benefit? This paper addresses these questions, targeting CTOs and CIOs who are considering adding to or renewing part or all of their IT estate. It focuses on the issues to consider to help your business move towards a more agile, scalable and future-proof IT estate.

Start with the Stack

This stack diagram is not new; it forms the foundation for where we place and consider technologies and solutions. Before we get into detail, let’s align some terminology. I like using the familiar Lego bricks example. Think of a single technology product as an individual block — i.e., server, network switch, ERP app, etc. A solution is the combination of technologies integrated together to solve a business need. Solutions can only work if the pieces integrate successfully. Lego bricks link together because they have standard interfaces (you cannot link a Lego brick and a Duplo brick together), and all Lego bricks can link together to create complex designs. If we go back to our four-layer stack, we can consider both the technologies and the solutions that fit into each layer. The focus of this paper is on the solutions rather than the technologies, as these are a key aspect of Next Generation IT.

Let’s start from the top and outline what should be considered when thinking about Next Generation solutions.

Link to Part 2 Business Process

Connecting the Boxes

As we develop IT solutions, it is very easy to focus on the core elements: infrastructure, platform and application layers, and the big components such as storage and compute, ERP and middleware technologies. However, as we think about architectures and systems integration, focusing on the connectivity of the data and application is critical to a successful deployment and to satisfying both operational and regulatory requirements.

This focus on connectivity is particularly important as we move to modern, cloud-based applications. In today’s architecture we worry less about the basic interoperability of big components because the vendors typically have this well covered. Unless you’re trying to put the proverbial square peg in a round hole, your risk is low. As we look to make our applications more agile and consider moving workload from public cloud to private cloud or hosted solutions, and as we think about moving from testing to production, what we need to worry about more is the connection between data and applications. Is the line that connects these boxes well designed for today and tomorrow?

Consider the plumbing in your house. Would one type of pipe and fittings handle high- and low-pressure water, gas and oil-based systems? Fittings and pipe structure need to be designed specifically to ensure they integrate and operate with the appliances they connect. Now consider an IT architecture. Don’t confuse the lines that connect the boxes with network cables or network connection protocols; the OSI model handles those connections up to layer 4, typically in the Infrastructure layer. The layers I want to focus on are those that deal with data transportation between applications (layers 5-7), where the lines between the boxes are the protocols and APIs that connect the applications together.

These connections need to not only function as interconnections between applications but also take on the attributes of the overall solution. For example, if you are operating in a secure, regulated environment, you must ensure you are using secure protocols (e.g., SSL, SFTP, HTTPS, SSH), making sure that data is encrypted as it moves between applications. If you are writing APIs, Java with the Java Cryptography Extension (JCE) can be used to secure the data connections through encryption.
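As a rough illustration of that last point, the sketch below uses standard JCE classes to encrypt a small JSON payload with AES-GCM before it is handed to the transport layer. The payload and key handling are invented for the example; in practice the key would come from a key management service rather than being generated in place.

```java
import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;
import javax.crypto.spec.GCMParameterSpec;
import java.nio.charset.StandardCharsets;
import java.security.SecureRandom;
import java.util.Base64;

public class PayloadEncryptionSketch {
    public static void main(String[] args) throws Exception {
        // Generate a 256-bit AES key (illustrative only; a real system would use a key management service).
        KeyGenerator keyGen = KeyGenerator.getInstance("AES");
        keyGen.init(256);
        SecretKey key = keyGen.generateKey();

        // A random 12-byte IV is required for AES-GCM and must travel alongside the ciphertext.
        byte[] iv = new byte[12];
        new SecureRandom().nextBytes(iv);

        // Encrypt the application payload before handing it to the transport layer.
        Cipher cipher = Cipher.getInstance("AES/GCM/NoPadding");
        cipher.init(Cipher.ENCRYPT_MODE, key, new GCMParameterSpec(128, iv));
        byte[] cipherText = cipher.doFinal("{\"orderId\": 42}".getBytes(StandardCharsets.UTF_8));

        System.out.println(Base64.getEncoder().encodeToString(cipherText));
    }
}
```

The receiving application decrypts the payload with the same key and IV, so the data stays protected end to end regardless of which protocol carries it between the boxes.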

As part of the design, when considering APIs and protocols, strive to future-proof yourself. As we have seen in the Web space, RESTful APIs have become the protocol of choice. This reduces risk around application integration, the availability of skilled resources and support from application vendors, providing flexibility and adaptability for future developments.
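For illustration, here is a minimal sketch of what consuming such a RESTful API over HTTPS looks like, using the HTTP client built into Java 11 and later; the endpoint URL is hypothetical.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class RestClientSketch {
    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();

        // Hypothetical endpoint; HTTPS keeps the data encrypted in transit between the applications.
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://api.example.com/v1/orders/42"))
                .header("Accept", "application/json")
                .GET()
                .build();

        HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + ": " + response.body());
    }
}
```

Because the contract is just HTTPS, JSON and a URL, the same call works whether the service sits in a public cloud, a private cloud or a legacy data centre, which is part of what makes REST a safe choice for future integrations.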

Consider a client looking to migrate from their legacy applications to new, modern apps, moving both platform and hosting to cloud-enabled solutions. A critical aspect is ensuring that connectivity is in place both for the migration itself and for the migrated operational components afterwards. Much of the success of application modernisation projects is based on the ability to move and reconnect new applications and data sources into the legacy estate.

As we look forward we are already seeing products, both commercial and open source, that help solution designers interconnect their applications through common data connectors and APIs.  One to draw your attention to in the open source space is EzBake, developed by 42Six Solutions (a CSC company). EzBake is in the final stage of being launched. This open source project aims to simplify the data connectivity, federated query and security elements within the big data space.  There are already public cloud-based platforms that enable you to buy a service that connects your data source to a target through a common set of APIs and protocols. EzBake will likely sit in the private cloud space, focused on connecting big data applications and data stores, but the ability to make these application and data connections easily is usable across the IT landscape.

It all comes down to the line connecting the boxes. Ensuring that this is given as much thought and consideration as the data and applications when designing a solution will pay dividends, enabling your architecture to integrate and operate successfully. And with correctly chosen protocols, your solution will be future proofed for the next integration or migration project.