
Digital transformation in healthcare begins in the Cloud

The healthcare industry’s technological obstacles are numerous and sometimes difficult to overcome. Many practitioners rely on outdated IT infrastructures that cannot support changing industry practices, along with insufficient security measures that do not adequately protect against increasing cyberattack risk. Migrating to the Cloud is a cost-effective and reliable first step to jumpstart digital transformation in the healthcare industry.

This blog will examine the benefits of cloud technology in healthcare and review the emerging technology fueling modernization and transformation across the industry.

The need for digital transformation in healthcare

Many providers—especially hospitals—have relied on multiprotocol label switching (MPLS) infrastructures. While affording great privacy, these networks are rapidly aging and ill-equipped to meet modern challenges.

Cybersecurity is another core concern for the healthcare industry. Due to the sensitive nature of patient data and information, providers are often the target of ransomware attacks that can shut down vital systems for weeks. And with the influx of Internet of Medical Things (IoMT) devices, the number of avenues to breach security has multiplied exponentially.

Cost and lack of IT resources hinder healthcare providers from starting their digital transformation journey. Moreover, even when companies can take the plunge, maintenance often falls by the wayside because IT staff are overburdened and unable to update applications. As a result, providers’ systems risk security breaches and falling out of HIPAA compliance.

Also read: Cloud security controls that help mitigate risk

What is cloud technology, and how does it relate to healthcare?

For the healthcare industry, digital transformation enables companies to keep up with tech advances, making it critical to future success. Moving to the Cloud means data is stored and shared from remote servers, an alternative to onsite data centers. Any given organization may have one or many different cloud solutions (think Google Drive, OneDrive, Azure, and so forth).

But perhaps the most powerful functionality of cloud technology in healthcare is networking. Cloud technologies, such as software-defined wide area network (SD-WAN), Network as a Service (NaaS), and Unified Communications as a Service (UCaaS), are all methods of boosting the speed of network connections while utilizing existing data lines.

In addition, cutting-edge security protocols like secure access service edge (SASE) and AI-powered tools proactively seek out and block emerging cybersecurity threats.

Learn more: How Microsoft Cloud for Healthcare empowers your organization

Benefits of cloud technology in healthcare

As healthcare providers begin the process of digital transformation, cloud tech offers many benefits. They include:
  • Scalability.
  • Data storage and sharing.
  • Data loss prevention and disaster recovery.
  • Enhanced collaboration and communications.
  • Improved cybersecurity.
  • AI and machine learning deployments.
  • Increased networking speed and efficiency.
  • Greater data merging and access.
  • Increased patient engagement through various sets of applications and automation.
  • Management of IoMT devices and data.

Also read: Revolutionize your Cloud disaster recovery capabilities with DRaaS

The technologies that are driving healthcare into the future

Emerging cloud technology offers solutions to many of healthcare’s most significant concerns.

Internet of Medical Things (IoMT)


Doctors and providers can now access more significant insights into patient health through IoMT devices. Wearable monitors and apps afford providers real-time connectivity to patient data, a streamlined workflow through connected devices, and cost-effective scalability.

However, legacy IT systems can’t keep up with the sheer amount of data generated by IoMT devices. As a result, compliance and security are both top concerns with IoMT.   

Data analytics and management

Creating more intuitive, accurate, accessible, and compliant EHRs is a considerable undertaking that cloud technology in healthcare is perfectly aligned to address. Cloud systems help merge massive amounts of data while keeping it secure.

AI and machine learning


AI and machine learning technology offer many benefits for medical providers and businesses. Automation can eliminate record keeping or billing redundancies and send messages or reminders to patients. AI enables researchers to track, examine, and extrapolate data from subjects as diverse as cancer and protein folding. Machine learning security protocols help to track and block threats before they become breaches.

SD-WAN


SD-WAN is a virtual wide area network that allows faster networking speed through a cloud-based architecture. With other solutions from CBTS, such as NaaS and UCaaS, SD-WAN generates the potential for explosive growth and productivity for providers, clinics, and hospitals.

SASE


SASE is a security methodology that works with SD-WAN to keep the Cloud secure.

Learn more about the CBTS cloud implementation process by downloading this e-book: CIO Field Guide: Cloud Assessment Services


Medical providers don’t often have the resources to oversee the overhaul of their IT systems. Moreover, training staff to use and maintain new systems is an ongoing challenge. It’s not enough that the new systems are adopted—they must be used correctly or risk falling out of compliance or a security breach.

As a seasoned provider for digital transformation, CBTS brings numerous critical capabilities to the development of hybrid cloud environments and managed services for all relevant cloud technology in healthcare. CBTS has broad experience helping our clients choose, implement, and maintain the right technology solutions.

Learn more about how CBTS can help you on your modernization journey.

Serverless vs containers: complementary or competing technologies?

Enterprise computing partners continuously seek the best way to develop, deploy, and manage applications in this era of new ideas, devices, and virtual experiences. A question at the forefront of this digital transformation is whether to implement a serverless solution or utilize containers. Each has pros and cons. Calling on its extensive knowledge of cloud technologies, CBTS guides enterprise clients through the process of determining which path is right for their unique needs.

This blog will explore the role of serverless vs containers in an IT environment.

What is serverless computing?

Using serverless computing, a developer can create and run applications free from concerns about server limitations such as provisioning, scaling, and managing.

Functions are executed in the cloud and are billed based on the time the process is running rather than by how long the server is up. An event triggers a function that runs for a set length of time. Then, the function remains inactive until an event triggers it again. For the most part, functions only run for a short time—usually five minutes or less. The brief runtime of functions is one of the advantages of using serverless computing, as it helps to keep costs low. However, it also can be a downside when you need a lot of computing power over a prolonged period, especially compared to the “always-on” model of containers.

Serverless computing does not actually eliminate the need for a server. Instead, the code is outsourced to the cloud provider’s infrastructure, where the application is run and ultimately returns the result. Serverless computing allows a developer to create applications without concern for the limitations of the server. Instead, the developer can focus exclusively on the code.
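To make the pay-per-execution billing model concrete, here is a minimal Python sketch. The per-GB-second price, memory size, and hourly server rate are illustrative assumptions, not actual AWS Lambda or Azure Functions rates.

```python
def serverless_cost(invocations, avg_runtime_ms, memory_gb, price_per_gb_second):
    """Estimate a pay-per-execution bill: you are charged only while a
    function runs, not for idle server time."""
    gb_seconds = invocations * (avg_runtime_ms / 1000.0) * memory_gb
    return gb_seconds * price_per_gb_second

def always_on_cost(hours_up, price_per_hour):
    """Estimate a traditional "always-on" bill: charged for uptime,
    whether or not requests arrive."""
    return hours_up * price_per_hour

# One million short invocations vs. a server that stays up all month
# (all prices are hypothetical).
burst = serverless_cost(1_000_000, avg_runtime_ms=200, memory_gb=0.5,
                        price_per_gb_second=0.0000167)
steady = always_on_cost(hours_up=730, price_per_hour=0.10)
```

Under these assumptions, the bursty workload costs a fraction of the always-on server, while a workload that runs continuously would tip the comparison the other way.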

Benefits of serverless computing
  • Code-intensive projects benefit because the developer focuses on the application rather than the infrastructure.
  • When traffic patterns change independently, a serverless system allows functions to ramp up or down depending on the needs of traffic flow.
  • Speedy launches are possible with serverless because the focus is on code over infrastructure. Apps, websites, and other products can be launched in days or weeks instead of months.
  • Costs stay low because you only pay for the time that a function runs.
  • Scaling is easy and automatic.

What are containers?

A container is an isolated package for a service or application that is ready for deployment, execution, and scaling.

A container allows a user to run an application in isolation. This model improves efficiency by eliminating the need to run a virtual machine (VM) for each application.

By using containers, a developer can package code, configurations, and dependencies into easy-to-use building blocks that promote:

  • Institutional consistency.
  • Operational efficiency.
  • Developer productivity.
  • Version control.

Containers use less space than VMs, can handle more applications, and require fewer VMs and operating systems.
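A rough back-of-the-envelope sketch of that space savings; the 2 GB operating-system footprint and 0.5 GB per-application figures are hypothetical, chosen only to illustrate the shape of the comparison.

```python
def vm_footprint_gb(apps, guest_os_gb=2.0, app_gb=0.5):
    # One guest operating system per VM, one app per VM.
    return apps * (guest_os_gb + app_gb)

def container_footprint_gb(apps, host_os_gb=2.0, app_gb=0.5):
    # Containers share a single host operating system.
    return host_os_gb + apps * app_gb
```

With ten applications, the VM model needs 25 GB while containers need 7 GB under these assumptions, and the gap widens as the application count grows.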

Key benefits of containers
  • Memory and size are not an issue, even for large or complicated applications.
  • You retain complete control over an app’s administration, security, and resources.
  • When migrating an old or large application, containers can be easier to implement than serverless functions.
  • A container runs consistently across multiple operating systems and environments.

Similarities and differences in serverless vs containers

Serverless computing and containers can both be used to strategically position enterprise users to leverage the next phase of digital transformation to achieve optimal results. However, managing these two different technologies requires different strategies.

Similarities between serverless and containers

Serverless computing and containers both allow code to operate inside isolated, discrete environments. While they are not identical technologies, they achieve similar results, but in different ways.

Both serverless environments and containers are designed to meet future changes and leverage the latest innovations in cloud computing. Serverless computing and containers both:

  • Use finite pieces of code that function in microservice architectures (though serverless generally works better with microservices).
  • Deploy easily across distributed architectures and are commonly used in the Cloud.
  • Start quickly, often within a few seconds.
  • Rely heavily on APIs to coordinate integration with external resources.
  • Employ external resources to manage persistent storage needs.

Differences between serverless environments and containers

In a serverless environment, end users typically do not control the host server and the operating system on which applications run. Workloads may consume large amounts of data in a short amount of time. Because of this, avoiding unnecessary resource consumption becomes critically important in managing the computing bill. Most workloads are run on a public cloud using AWS Lambda or Azure Functions, which limits the number of tools available to manage and secure those functions.

Containers rely heavily on the host operating system. Efficiency is less important than in a serverless environment because container applications are designed to run for longer periods of time and may not constantly consume resources. Because containers are often deployed on-premises or on generic cloud infrastructure, the toolset is less restrictive than in a serverless environment.

Watch this video to learn how serverless computing and containers can be applied in a business environment.

Serverless vs containers: choosing the best path for your business

Serverless solutions are best suited to short, small, single-function operations. Developers can quickly and efficiently access cloud-specific services for speedy development and deployment.

While a serverless environment eliminates concerns about over-provisioning, deployment, and maintenance, developers lose direct access to the containers. Losing direct access to the containers can make it difficult to debug issues. By choosing a serverless environment, developers sacrifice autonomy for increased speed and lower costs.

Containers are more portable and offer developers more control over how the application runs and performs. However, containers are more difficult to build and are more complex to orchestrate and deploy.

One approach is to use serverless computing strategies and containers in the same project but for different purposes. Serverless functions can be used for data processing and other triggered events. Containers can be used when you need control, scalability, and management through orchestration tools.
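One way to sketch that split is a simple routing rule. The thresholds and task fields below are illustrative assumptions, not a formal guideline; real platform decisions weigh many more factors.

```python
def choose_platform(task):
    """Route a workload to serverless or containers based on its profile.

    Heuristic sketch: short, event-triggered jobs fit serverless;
    long-running jobs or jobs needing full control fit containers.
    """
    if task.get("needs_full_control"):
        return "container"
    if task.get("event_triggered") and task.get("runtime_minutes", 0) <= 5:
        return "serverless"
    return "container"
```

A thumbnail-generation job triggered by a file upload would route to serverless under this rule, while a stateful orchestration-managed service would route to containers.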

If your organization is struggling to answer the serverless vs containers question, work with a technology partner who can identify and implement the right tools and ensure they are being used to provide optimal results.

Contact the experts at CBTS today to begin strategizing your application modernization journey.

Benefits of a Managed Data Lake Solution

One of the most significant challenges facing modern businesses, especially large corporations, is how to store and manage their data. Even a small business could have dozens of different data streams from various platforms, apps, IoT devices, and more. Compounding the issue, many platforms save data in a proprietary file type that is unreadable outside the software. As a result, the contemporary data flow has been a stress test for on-prem data storage systems. Data lakes have emerged as a solution to these common data storage and management challenges.

This post will examine the benefits of a data lake and how CBTS leverages them to maximize results for its clients.

What is a Data Lake?

A data lake is a reservoir into which data streams can flow. The benefits of a data lake are numerous. It pulls data from disparate sources and deposits it in one place. That data can be structured or unstructured, of multiple file types, and imported as-is without converting files. A data lake is easily searchable, fast, and more cost-effective than on-prem systems.
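The "imported as-is" idea can be sketched with standard-library Python: records from a JSON stream and a CSV stream land in one store without upfront schema conversion. The stream names are hypothetical, standing in for the apps and devices feeding a real lake.

```python
import csv
import io
import json

lake = []  # one landing zone for every stream

def ingest(stream_name, raw, fmt):
    """Deposit records as-is (schema-on-read): no conversion at import time."""
    if fmt == "json":
        records = json.loads(raw)
    elif fmt == "csv":
        records = list(csv.DictReader(io.StringIO(raw)))
    else:
        raise ValueError(f"unsupported format: {fmt}")
    for rec in records:
        lake.append({"source": stream_name, "data": rec})

# Two different file types from two different sources, one destination.
ingest("iot_devices", '[{"device": "d1", "temp": 21.5}]', "json")
ingest("crm_export", "name,plan\nAcme,gold\n", "csv")
```

Structure is applied later, at query time, which is what keeps import cheap and flexible.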

Users can then manage their data in multiple ways to create powerful business intelligence reports, track analytics, and generate custom dashboards. Data lakes are cloud-based and can be accessed remotely from anywhere worldwide if the user has proper permissions. Additionally, the consolidated data can be analyzed and manipulated with AI and machine learning.

Read the case study: CBTS solution modernizes, simplifies critical security environment

Why switch to a data lake?

An important benefit to the data lake architecture is that it avoids the pitfalls of on-prem data storage. Using a cloud-based system removes the cost of continually expanding or upgrading data resources and management. A data lake storage system lets you scale elastically, only paying for the storage and services you need in a pay-as-you-go model.

Other common on-prem challenges include:

  • Data constraints, both computationally and in terms of accessibility.
  • Maintaining or renovating legacy systems.
  • Time and resources sunk into correctly structuring data, double-checking for accuracy, and otherwise managing the database(s).

Read more: Howdy Partner panel discusses business benefits of Data Lake Kickstarter tools

Switching to a data lake from on-prem storage has many benefits, including:
  • Highly targeted, fast delivery and increased speed-to-value of customer data.
  • The ability to leverage data in new ways and generate increased business intelligence.
  • Flexible scaling as needed.
  • Clean data that is optimized and structured during import, with a complete history of metadata available.
  • Improved customer experience and reduced operational inefficiencies.
  • Up-to-date data, pulled in regularly at 15-minute intervals.
  • Annual data durability of 11 nines, i.e., 99.999999999%; lost data is a thing of the past.
  • Speedy deployment—customers can begin in as little as five minutes.

CBTS implementation of data lakes

CBTS implements a data lake solution by migrating high-value data to a cloud-native format for the client. Then, CBTS can effectively build out custom scripts and solutions using Amazon Web Services (AWS) and ServiceNow to create meaningful insights into the client’s rich data, including new provisioning and support systems. CBTS has worked with clients to pool data in data lakes from dozens of other services, apps, platforms, and websites. Deployment is speedy and highly targeted, helping achieve increased speed-to-value.

CBTS deploys data lakes for its clients using open structure as a guiding principle. Customers own their data and can use it however they see fit. Users aren’t locked into a single platform or aging technology. Instead, data is clean and stored in a serverless system queried with AWS Athena. Using this model, customers don’t need access to the AWS suite of tools to interact with their data. Instead, users log into a customized dashboard.

Security is a top concern for any data management system or tool. A benefit of the CBTS implementation model of data lakes is that no user may access the data without specific permissions. Additionally, all data is automatically encrypted, ensuring that the data remains safe from cyberattacks. The data is exceptionally durable as well. AWS cloud storage maintains annual durability of 11 nines (99.999999999%). In other words, even with one billion pieces of data stored in the data lake, it’s improbable that even a single file could be lost.
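The durability claim is easy to check arithmetically: with eleven nines, the expected number of objects lost per year out of one billion stored works out to about one hundredth of an object.

```python
durability = 0.99999999999      # 11 nines of annual durability
objects_stored = 1_000_000_000  # one billion objects in the lake

# Expected objects lost per year = objects * annual loss probability.
expected_annual_loss = objects_stored * (1 - durability)
```

That comes to roughly 0.01 objects per year, i.e., on average one object lost per hundred years, which is why loss is described as improbable.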

Learn more: Streamlining the Data Lake to take on emerging security threats


In many ways, data lakes are a future-proof solution. Because AWS leverages the Cloud, your data lake can scale almost infinitely while keeping costs low. Additionally, as cloud tools and machine learning continue to emerge, the ability to manipulate your data will grow in new and meaningful ways. The experts at CBTS are experienced with deploying data lakes and can launch your lake in as little as five minutes. CBTS engineers leverage storage best practices to optimize your data, keep data encrypted, and maximize the speed of search and retrieval.

Contact CBTS to learn more about the ways a data lake could benefit your organization.

Cloud security controls that help mitigate risk

As I mentioned in my previous post on cloud security, depending on the kind of cloud solution you have, you might be the one responsible for implementing any and all security controls.

All major cloud providers have risks and also have ways of implementing controls to mitigate those risks. There are whole categories of security providers for various parts of a cloud security program. As you begin to plan your move to a cloud solution, you will see acronyms like CASB, CSPM, CWPP, and SASE.

It can get a little confusing with all the acronyms, but each product has a reason for existing.

Let’s start with CASB or cloud access security broker

A cloud access security broker ensures that the user trying to access a cloud service (think Salesforce or Office 365 or SAP) should be able to access the service, and that they are doing only the things they are supposed to do.

Obviously, there are some fundamental controls that you want to have in place for your cloud applications. You want to be able to see what your users are doing in the cloud (visibility), you want to detect threats to your systems and data, and you want to make sure you maintain compliance with the regulations that apply to your organization.

At the most basic level you want to make sure only the people you allow can access the cloud services you use. In other words, should John be able to access customer data stored in Salesforce?

In addition—and more importantly—you want to make sure they can only do things they are supposed to do. As a security professional, you want to make sure John does not delete or modify data he shouldn’t. CASB provides controls and visibility over what John does when he signs into Salesforce.

The basics just won’t cut it against today’s security challenges

You might think, I already have Active Directory (AD) or some other identity management (IM) tool (Okta, OneLogin, Centrify, etc.), why do I need a CASB solution? Well, your IM solution might only work for local access, or it might not be tied into or connected to your cloud solution. CASB is designed, as the name implies, to broker the access between the IM solution and the cloud service.

For example, think about the steps that go into giving a new hire access to all the services they need to do their job. You want to give the new hire an e-mail account, access to the payroll system to enter their time, and then—if they are in sales—access to Salesforce or a similar tool to track and follow up on leads. If they are writing or reading reports, they need access to the collaboration tool/Office product (O365 or Google Workspace, etc.).

One of the big gaps for a lot of companies is often overlooked: de-provisioning services when someone leaves an organization. Provisioning a new hire with access to the applications they need to do their job is often automated with a well-designed workflow with few manual steps. De-provisioning access is often not as well automated; frequently, employees retain access days or weeks after they have left the company, even when the separation (i.e., firing) was not on good terms.

A CASB solution that controls who has access to what cloud services can help simplify both ends of the provisioning workflow. As a result, you can end up with an automated workflow that can very quickly grant and remove access with the click of a button.
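A toy sketch of that single-button workflow. The `AccessBroker` class, the role names, and the service list are all hypothetical, standing in for what a real CASB automates; the point is that grant and revoke are each one call.

```python
class AccessBroker:
    """Toy CASB-style broker: one call grants or revokes every entitlement."""

    def __init__(self):
        self.grants = {}  # user -> set of cloud services

    def provision(self, user, role):
        # Everyone gets the baseline; roles add entitlements on top.
        services = {"email", "payroll"}
        if role == "sales":
            services.add("salesforce")
        self.grants[user] = services

    def deprovision(self, user):
        # The often-missed step: revoke everything in one action.
        self.grants.pop(user, None)

broker = AccessBroker()
broker.provision("john", role="sales")
broker.deprovision("john")  # no lingering access after separation
```

Centralizing grants in one broker is what makes the de-provisioning side as quick and complete as the provisioning side.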

Now we will look at cloud security posture management or CSPM

CSPM is a tool or set of tools that ensures that the controls you want to have in place for your cloud environment are correct. Your organization might have to follow a particular security standard like NIST 800-53 or ISO 27000 due to government regulations. A CSPM tool can ensure all your cloud infrastructure stays in compliance with those security standards.

Numerous security breaches have happened due to misconfigured permissions with cloud storage. Mismanaged Amazon S3 buckets have caused major data disclosures. Companies that thought they had good practices in place—like Booz Allen Hamilton and Deep Root Analytics in 2017—leaked data because of misconfigurations.

A CSPM will constantly monitor your cloud environment for configuration changes and settings to make sure that the rules and controls you want to have in place for your environment are in place. Additionally, some solutions will automatically fix incorrect settings to ensure compliance with privacy laws and government regulations regarding data privacy.
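In spirit, a CSPM check is a loop comparing live settings against a desired baseline and optionally remediating. This is a simplified sketch, not a real provider API; the bucket names and settings are hypothetical.

```python
# Desired baseline for every storage bucket.
BASELINE = {"public_access": False, "encryption": True, "versioning": True}

def audit(buckets, auto_fix=False):
    """Return (bucket, setting) pairs that drift from the baseline;
    optionally remediate the configuration in place."""
    findings = []
    for name, config in buckets.items():
        for setting, wanted in BASELINE.items():
            if config.get(setting) != wanted:
                findings.append((name, setting))
                if auto_fix:
                    config[setting] = wanted
    return findings

# A bucket accidentally left open to the public, caught and corrected.
buckets = {"patient-data": {"public_access": True, "encryption": True, "versioning": True}}
issues = audit(buckets, auto_fix=True)
```

Run continuously, a loop like this is what catches the kind of misconfigured-bucket exposure described above before it becomes a disclosure.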

Go straightforward with a cloud workload protection platform (CWPP)

A cloud workload protection platform is designed—as the name sounds—to protect what you are doing in the Cloud from attacks by malware or viruses. Just as you run endpoint protection software on servers in your datacenter, you want the same thing happening in your cloud environment if you are hosting your own servers or virtual machines. Most CWPP solutions offer an agent version, just like you use now, or an agentless version that pulls information from your cloud-hosting environment. The agent version typically provides better intelligence, but at the cost of some performance in your cloud environment. The agentless version usually has no impact on your cloud workload, but typically will not give you all the details you get from an agent.

Relative newcomer secure access service edge (SASE) can give smaller businesses a stronger security posture

Secure access service edge, known as SASE (pronounced “sassy”), is a cloud-based information technology model where both the network and the security for the network are offered on demand without having ownership of the hardware or security tools. This kind of solution is growing in popularity for small startup companies and companies that are very flexible because you purchase your networking and security as you need it.

SASE typically has four main components:

  1. A CASB solution to provide security for your cloud applications.
  2. A secure web gateway (SWG) for access to your cloud applications.
  3. A zero trust network (ZTN) implemented through that gateway.
  4. Firewall as a Service (FWaaS).

This is a lot of acronyms and buzzwords, but these technologies can and do work together. If you design your cloud environment with SASE in mind, you can implement very good security controls.

SASE works best, and is easiest to adopt, when you have an all-cloud environment. You can see why that would make it appealing to startup companies that do not have legacy hardware, storage, and other technology that must have security “bolted on” later to make it cloud-friendly.

I can hear some of you saying, “What is the key takeaway?” 

For CIOs and IT directors, the key takeaway is that there are advantages to moving on-premises storage and computer systems to a cloud service. However, you need to carefully plan what you are moving, why you are moving it, and what controls you will have in place to make sure the systems and data you move to a cloud service (SaaS, IaaS, PaaS) are as secure as you need them to be.

For security practitioners, you need to recognize that the security controls you use for on-premises assets are not always the same controls you use for cloud assets. Consequently, your thinking needs to shift, and you need to make sure the controls you use are appropriate for cloud-hosted assets.

If your company is relatively new and does not have a significant investment in on-premises computer resources, your move to the cloud could be smooth and painless. On the other hand, if your company is a mature company with lots of assets on premises and in-house, as well as custom applications, your journey will likely be longer and require significantly more planning and preparation.

I hope this has been helpful. Reach out and let me know if you have any questions.

Read more from John Bruggeman:

Weighing the risks and benefits of moving to the Cloud

2022 Cybersecurity Predictions

Cyber Insurance, part 1: What is Cyber Insurance and do I need it?

Cyber Insurance, part 2: Getting ready for the insurance company questionnaire

Cyber Insurance, part 3: Filling out the questionnaire

Cyber Insurance, part 4: What do you do if your cybersecurity insurance policy is denied?

Optimize your remote work migration with a VMware service provider

The coronavirus pandemic amplified the need for organizations to become more agile and resilient. As a VMware service provider, CBTS has been helping businesses across industries embrace remote work, redefine their digital strategy, and accelerate large-scale migration efforts.

According to the 2020 State of Remote Work, approximately 98% of remote workers said they would like to work remotely—at least part-time—for the rest of their careers. With remote work here to stay, organizations supporting a dispersed workforce can leverage the CBTS and VMware partnership to transform their enterprise WAN connectivity and manage thousands of devices securely.

Establishing a powerful WAN Edge infrastructure with CBTS and VMware

For the fourth year in a row, Gartner has recognized VMware as a Magic Quadrant leader for their completeness of vision for WAN Edge Infrastructure. CBTS is a top VMware service provider and VMware Partner Innovation Award winner for SD-WAN. Together, CBTS and VMware transform enterprise WAN connectivity by harnessing the public cloud to host critical applications. 

As part of the VMware SASE™ vision, the SD-WAN solution is built to be future-proof, with the speed, simplicity, and security today’s agile businesses need. CBTS delivers on that vision by providing organizations with cloud on-ramp services via global cloud gateways, implementation of VMware orchestrator, and rapid deployment of edge appliances based on assessed needs.

CBTS has demonstrated expertise as a VMware service provider, enabling organizations to take full advantage of the SD-WAN technology through deployment and ongoing monitoring and management across locations. The award highlights the significant ways CBTS empowers organizations of all sizes in their digital migration, with a full suite of flexible technology solutions that drive business outcomes, improve operational efficiency, mitigate risk, and reduce costs for clients.

Read more: How SD-WAN security enhances critical business applications

Adopting UEM strategies for modern desktop management and improved security

Last year, VMware was also named a 2021 Gartner Magic Quadrant leader for unified endpoint management (UEM) tools. VMware scored highest in three of four use cases in the 2021 Gartner Critical Capabilities for UEM tools. CBTS has been at the forefront of helping customers mobilize for the “anywhere” workforce spanning geographies, use cases, and a myriad of devices.

Organizations leverage CBTS expertise in VMware Workspace ONE UEM to establish modern desktop management, including BYOD and increasingly diverse endpoints, for Windows, Linux, macOS, and Chrome OS, among others. CBTS assesses network vulnerabilities to enforce security-centric UEM strategies to protect against cyberattacks on the perimeter.

CBTS and VMware partnership in practice: a powerful networking combination 

CBTS teamed with VMware for SD-WAN beginning in 2017 to help enterprises deploy flexible, secure WAN connectivity across their remote and branch locations. 

So, when a large healthcare system in Indiana approached CBTS to build out several pop-up clinics during an upswing in COVID-19 cases, CBTS engineers were ready to deploy an SD-WAN solution to support clinic operations in only 24 hours.

Read the full case study: IU Health deploys COVID-19 remote testing centers in 24 hours with SD-WAN

Partnering with CBTS as your VMware service provider enables your organization to:
  • Address performance issues associated with latency while accessing data sources distributed across locations, devices, and geographies.
  • Reduce overall costs and stabilize IT budget by eliminating sweeping redesigns and the expense of maintaining outdated, disparate networking equipment.
  • Improve network reliability, redundancy, scalability, and security.
  • Remove IT burden from staff through reliable automation and cloud management.
  • Transform business outcomes with enhanced real-time operations and agility.

Growing alongside an evolving ecosystem

CBTS and VMware power the world’s complex online networking infrastructure, an ecosystem that will evolve alongside ever-changing enterprise needs. CBTS SD-WAN technology powered by VMware optimizes available bandwidth for remote workers in three ways:

  • Giving higher priority to business traffic than to social media and streaming service traffic.
  • Monitoring the traffic path 24×7 for packet loss or delay while applying forward-error correction to increase throughput and reduce packet loss.
  • Using traffic-handling techniques to throttle unnecessary application traffic.
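The prioritization in the first bullet can be sketched as a priority queue: business traffic is dequeued ahead of streaming traffic even when it arrives later. The class labels and priority values are illustrative, not an SD-WAN configuration.

```python
import heapq

PRIORITY = {"business": 0, "default": 1, "streaming": 2}  # lower leaves first

def enqueue(queue, seq, traffic_class, payload):
    # seq breaks ties so packets within a class keep arrival order.
    heapq.heappush(queue, (PRIORITY.get(traffic_class, 1), seq, payload))

def dequeue(queue):
    return heapq.heappop(queue)[2]

q = []
enqueue(q, 1, "streaming", "video chunk")  # arrives first
enqueue(q, 2, "business", "EHR sync")      # arrives second
```

Despite arriving second, the business packet is dequeued first, which is how limited bandwidth stays available for critical applications.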

By leveraging the CBTS and VMware partnership, your organization can implement cloud-based connectivity—and all its benefits of ubiquity, high bandwidth, and low cost—with assured quality, reliability, and security.

Weighing the risks and benefits of moving to the Cloud, part 1

In this blog post, I’ll talk about the risks and benefits of moving some or all of your information technology to a cloud provider.

Cloud computing. Cloud storage. Cloud infrastructure. Everything seems to be moving to the Cloud and everything in the Cloud is better. Right?

Often, people assume that moving their computer hardware (servers, switches, firewalls, applications) to the Cloud brings all of the benefits automatically: everything is secure and safe “up there,” most of your security problems are fixed, and there is nothing to worry about now!

Not so fast there, partner.

As with so many things in life, it’s more complicated once you get into the details. If idioms like “God is in the details” or—conversely—“The Devil is in the details” come to mind, feel free to choose which one works best for you.

The reality is that things can be more secure when outsourced to a cloud provider, but that is not always the case. You need to be very clear and precise when you sign a contract with your cloud provider (AWS, Azure, GCP, or a local provider) so that you get the functionality that you want, need, and expect.

To start, ask yourself two fundamental questions before you sign on the bottom line.

1. What do you want to outsource to your vendor partner? Your data center? The day-to-day operations of your IT department? Your nightly backups? The patching and updating of your software and hardware?

2. Are you trying to defer risk or lower cost by using a cloud vendor? If so, what risk? The risk of a power failure taking your computer systems offline for hours or days? The risk that a tornado will destroy your computer facilities and take you offline for weeks or months?

Let’s look at the first question.

What do you want to outsource to your vendor partner?

Depending on your level of commitment, you can realize a range of benefits by moving to the Cloud. For some of you, the desire is to get the hardware out of your current space and move it to a trusted, more physically secure space that has good backups, redundant power supplies, a generator, etc.—that is to say, a secure cloud environment.

In this case, you are outsourcing the physical hardware to a third party so that your IT staff can focus on the software and applications you need to run your operations. This is a good choice if you have a limited staff because then they can focus on making sure the operating systems and applications stay patched and up to date. You can also transfer capital costs to your operations line, which can help with your budget. Instead of spending $100,000 or $500,000 (or more) every three years to upgrade hardware, you have a fixed fee for a fixed period of time (3-5 years), which makes it easier to budget. This is often called Infrastructure as a Service (IaaS).

In other cases, you need to outsource more than just the hardware. You want to outsource the hardware, software, licenses, and your applications to a trusted vendor partner. As a result, you can remove several lines from your capital budget (CapEx) and perhaps some from your operations budget and turn them all into operating expense (OpEx). This can be very helpful for the CFO and the finance department when budgeting. It can also improve return on investment and cash flow and pay dividends (real dividends) to your stakeholders. You now have a Platform as a Service (PaaS) to operate your business, and you need far less from your IT staff.

You can also outsource just parts of your IT operations to improve efficiencies, ensure critical functions happen when expected (backups, patching, vulnerability scans), and document that you are meeting compliance requirements.

Key things to remember when reviewing your options for either PaaS or IaaS:
  • Purchase just what you need but make sure you can grow or shrink as needed.
  • Make sure you have service level agreements (SLAs) for the services you purchase. Do you need 99.999% uptime or 99% uptime? There is a big difference in price.
  • Have a way to get your data back if/when you want to change vendors.
  • Assume nothing; confirm everything in the contract.
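The difference between those SLA tiers is easy to quantify. A short calculation (a hypothetical helper, not vendor tooling) shows how much annual downtime each uptime percentage actually allows:

```python
def max_downtime_per_year(uptime_pct: float) -> float:
    """Return the downtime budget, in hours per year, for a given uptime SLA."""
    hours_per_year = 365 * 24  # 8,760 hours
    return hours_per_year * (1 - uptime_pct / 100)

for sla in (99.0, 99.9, 99.99, 99.999):
    minutes = max_downtime_per_year(sla) * 60
    print(f"{sla}% uptime allows about {minutes:,.1f} minutes of downtime per year")
```

Five nines works out to roughly five minutes of downtime per year, while 99% allows more than three and a half days. That gap is why the price difference between the two tiers is so large.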

Now we can look at the second question.

Are you trying to defer risk or lower cost by using a cloud vendor?

Most people move to the Cloud for one (or both) of two reasons: to lower costs (which doesn’t always happen) or—more often now—to defer risk.

What risk do you want to defer?

Downtime? If the goal is to minimize the risk to your company or organization from a power outage or a natural disaster, ensure your vendor partner's SLA guarantees five nines (99.999%) of uptime so that the site (or sites) will not be down for hours or days if you lose power or a flood or hurricane hits.

Keep in mind that moving to the Cloud will help minimize the risk of downtime from a natural event, but human error can be as big a factor in terms of taking a site down for hours or days. If you have granted too much privilege to a user who does something bad—either intentionally or accidentally—you can go down as easily as if your site was hit by a tornado. Make sure you have clearly identified the risk you want to mitigate. Review your risk assessment and mitigate that risk based on the value of the asset or assets. You don’t want the cost of the control or protection of an asset to exceed the value of that asset.

Read more: After the Smoke Clears – What we can learn about risk management

Other kinds of risk you might want to defer by moving to a cloud solution:
  1. Reliability of your current IT infrastructure: Your hardware might be old and unreliable.
  2. Managing your current environment might be hard or impossible due to poor or limited documentation: documentation that is out of date or lacking in detail makes the environment difficult to manage or audit for compliance or change management purposes.
  3. Growing your IT infrastructure might be hard or impossible to do because of the constraints of the current environment (for example, the server room might not have enough power to add more hardware).
  4. Physical security could be hard or impossible to implement due to where the hardware is deployed. Making sure only the right people have access to the servers and switches can be difficult when IT growth is organic and not planned.

Is it possible to lower costs if you move to the Cloud?

The answer to that question is a very firm maybe. Just like the cost of a car depends on the features you want, the cost of moving to the Cloud depends on what you want from your cloud environment.

A high-end sports car costing upwards of $130,000 or more will get you to the grocery store, but do you need that high-end sports car? Probably not. It will look cool and go fast, but a small SUV might be just as good to get the groceries, and that small SUV might only cost $30,000.

You can spend $130,000 a month (or more) on cloud services, but do you need everything all that spend buys? It depends. Ask yourself these questions as you begin your journey to the Cloud:
  1. What am I spending now for IT? Namely, what do the servers, switches, storage, processing power, cooling, and electricity cost on a monthly basis?
  2. Do I need everything I have now? Do I need 20 TB of storage or is some of that data legacy data that can be deleted as part of my data retention policy?
  3. Do I need redundant servers or the amount of capacity I have right now? Can I retire that legacy technology and consequently reduce my recurring spend?
  4. Do I have duplicate services for my information processing needs? Do I have one system for CRM, or do I have the “main” one and another department has a duplicate system? Can I remove inefficiencies from my information technology stack?
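A back-of-the-envelope comparison can frame the first of those questions. Every dollar figure below is a hypothetical placeholder; substitute your own numbers from the questions above:

```python
# Illustrative comparison: effective monthly cost of on-prem IT vs. a quoted
# cloud fee. All figures are hypothetical placeholders.
hardware_refresh = 300_000       # hardware upgrade spend per refresh cycle
refresh_cycle_years = 3          # how often that spend recurs
power_cooling_monthly = 2_500    # electricity + cooling
admin_monthly = 4_000            # staff time for patching and maintenance

onprem_monthly = (
    hardware_refresh / (refresh_cycle_years * 12)  # amortized CapEx
    + power_cooling_monthly
    + admin_monthly
)

cloud_monthly = 12_000           # quoted IaaS fee (fixed for the contract term)

print(f"On-prem effective monthly cost: ${onprem_monthly:,.0f}")
print(f"Cloud quoted monthly cost:      ${cloud_monthly:,.0f}")
```

With these sample numbers the cloud quote comes in lower, but the point of the exercise is the structure, not the verdict: until you amortize your refresh cycle and count power, cooling, and staff time, you cannot compare the two honestly.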

Decide which benefits of moving to the Cloud are most important to you

There are other questions to ask, but these are a good start as you evaluate your move to the Cloud.

It will help a lot if you have your risk registry and strategic plan in hand so that you make good decisions based on data. Moving your IT operation—even just a portion of it—to the Cloud is not a decision that should be made quickly.

Taking your IT systems out of your realm of control and placing them with a trusted third party is a strategic decision, so take time and think it through.

Now that you know all the benefits, over the next few weeks, I will cover the major cloud providers and the risks that you need to consider with each of them along with the general risks inherent with moving to the Cloud.

Stay tuned!

Read more from John Bruggeman:

2022 Cybersecurity Predictions

Cyber Insurance, part 1: What is Cyber Insurance and do I need it?

Cyber Insurance, part 2: Getting ready for the insurance company questionnaire

Cyber Insurance, part 3: Filling out the questionnaire

Cyber Insurance, part 4: What do you do if your cybersecurity insurance policy is denied?

Webcast recap: What does the workplace of the future look like?

When the Coronavirus pandemic of 2020 sparked a nationwide shift toward remote work, many assumed physical offices would return eventually. Today, the business world realizes that there is no going back to the collaboration methods of the past. Which leaves enterprise leaders wondering: What does the workplace of the future look like?

Experts from CBTS recently hosted a webcast with the goal of shedding light on the future of remote work. Head of Business Development Jon Lloyd and Global Solutions Architect Justin Rice shared their thoughts and fielded audience queries. The conversation was centered around answering the burning question: What does the workplace of the future look like? For Lloyd and Rice, this issue boils down to three primary concepts.

Three must-know topics for building your workplace of the future:
  • Maintaining a consistent and satisfying user experience.
  • Ensuring your network is designed for a distributed workforce.
  • Securing your remote workforce against potential threats.

Also read: Key SD-WAN advantages your hybrid work from home model needs.

Experience is everything

The key to building toward the future of remote work is delivering reliability and ease of use to employees, Lloyd explains. Regardless of whether they’re in the office or working from home, employees expect their networks to function without difficulty. Members of a distributed workforce expect three things: no delays, no downtime, and “always-on” connectivity.

“Employees want the ability to work from anywhere,” Lloyd said. “The importance of the end-user experience is critical. That has to be (priority) number one.”

Lloyd urges enterprise leaders to think of remote work not as closing one centralized office but as opening hundreds of smaller offices. This means that each employee should enjoy the same performance at home as they would at company headquarters. One effective way to achieve this remote performance is to decentralize proprietary applications and shift away from physical data centers. That’s where the cloud shines.

“The first big winner in this new shift of work-from-home is cloud adoption,” Lloyd said. “There’s just no way for you to financially take an application and distribute it across the globe without utilizing a cloud provider.”

As organizations ask themselves, “what does the workplace of the future look like?” cloud-native network architecture is rapidly emerging as the answer.

Also read: How SD-WAN & NaaS come together to supercharge remote work productivity

Cloud-native, remote ready, and secure

Rice adds that prior to the pandemic, high costs discouraged many enterprises from cloud adoption. As public health concerns pushed this technology to the forefront, many businesses leaped before they looked. This had the unfortunate result of distributed workforces utilizing applications that weren’t properly built for the cloud. That’s why cloud-native applications play a significant role in the future of remote work.

Cloud-native applications not only boast greater reliability but also tend to be more cost-effective. According to Rice, one network component that benefits heavily from decentralization is security. This is especially the case for enterprises struggling to secure the connections used by their distributed workforces.

That’s where Secure Access Service Edge (SASE) comes in. SASE works by implementing zero-trust network access at every endpoint. This means remote workers can safely utilize the applications they need without having to connect to centralized data centers.

When implemented and supported by an expert provider, SASE also has the benefit of superior visibility. With a single dashboard allowing access to your entire SASE framework, your enterprise can greatly simplify its cloud transition process.

These topics add up to one common goal: empowering remote workers to do their jobs from anywhere while maintaining high standards of network security and performance.

Also read: A Q&A on Microsoft Teams in the Big Picture

“What does the workplace of the future look like” and other questions

However enterprises decide to enable their distributed workforces, Lloyd and Rice were confident that remote work is here to stay. Whereas remote work was seen as optional pre-2020, the pandemic has made it mandatory and part of the “new normal.” As a result, enterprises are investing in giving their remote employees the highest-quality connections possible.

Flexibility is also becoming a key factor in the future of remote work, Lloyd and Rice added as they fielded audience questions.

“It’s not ‘work from home,’ it’s ‘work from anywhere,’ and it’s thousands of new offices on your network,” Lloyd said. “And I think—in the past—we treated work from home as ‘it doesn’t have to be as good’ or ‘it isn’t permanent.’”

“There’s certainly been a cultural shift,” Rice added. “Working from home, especially if you had a corporate office, it kind of felt like you’re on an island. Now, working from anywhere has become the new normal.”

Webcast attendees submitted questions on subjects ranging from outsourcing to how recruiters can make in-person office environments appealing again. Lloyd and Rice suggest that since the expectations of modern employees have changed, companies will need to adapt accordingly.

“It’s about making the office interesting and accepting that you’re going to have employees who just don’t want to come in,” Rice said. “What we’re seeing is more co-working spaces. It’s not coming back to a cubicle; it’s more of an open, collaborative environment.”

“We’re redefining what a workplace is. From a recruiting standpoint, for companies that are going to require (in-office attendance), they’re going to have to pay for it,” Lloyd added. “If you’re going to require folks to come in, you’re going to have a responsibility to keep them safe, which is going to increase costs, and you’re probably going to have to offer more than you typically would for that position.”

Also read: Give your remote teams the tools to connect seamlessly with Cisco Webex

“Howdy Partner” panel discusses business benefits of Data Lake Kickstarter tools

When it comes to explaining technical topics, there’s no substitute for an expert panel sharing knowledge in an audience-friendly format. That’s why CBTS joined a recent installment of “Howdy Partner” by AWS to discuss the business benefits of data lake optimization.

Howdy Partner is a weekly live stream series hosted by AWS, typically featuring solution architects and engineers discussing specific technical subjects. In early September, AWS hosted CBTS experts to discuss how the Data Lake Kickstarter initiative can enable enterprises to evolve their networks. AWS partner solution engineers and Michael Lanthier hosted the stream with CBTS solution architects Tim Selaty, Davis Gossett, and Scott Franke.

The panel discussed the business benefits of improving data lake functionality while demonstrating the CBTS Data Lake Kickstarter’s time-saving features.

How to create a Data Lake Kickstarter program with CBTS
  • Replace aging databases with scalable, flexible data lake infrastructures.
  • Easily access any data from any sector of the organization with the Simple Data Integrator (SDI).
  • Scan more data in less time and reduce the number of engineers required to execute deployments.

What is driving data lake adoption?

Before delving into the business benefits of implementing Data Lake on AWS, the panel discussed the history of database architecture. For many years, relational databases were commonplace solutions for collecting, storing, and analyzing large quantities of data, Tim Selaty said.

However, these databases and their sources were often fragmented, and managing them was complicated and very time consuming. This meant that maintaining large-scale databases was expensive, and queries with large results tended to slow down operations. As a result, database engineers often spent more time maintaining than innovating.

Data lakes—centralized repositories capable of storing structured and unstructured data with equal ease of access—have since caught on as popular database solutions. Despite their many strengths, however, using data lakes effectively takes significant insight and experience. To address this, CBTS introduced the Data Lake Kickstarter, which offers prescribed, serverless workflows and customizable admin interfaces.

The Data Lake Kickstarter program, which is built directly off of the original AWS Data Lake framework, is delivered as a managed service. This means bugs are fixed and features are added on behalf of the user. Also, since the data lake solution is built on AWS serverless, clients can deploy prescribed workflows, keep full control of their AWS account, and pay only for what they use.

Also read: Managed services solution enables client transformation

Get to know the business and organizational benefits of Data Lake Kickstarter

  • Eliminate the heavy lifting of creating a data lake and decrease time to insights from months to minutes with “one click” data lake creation and data loading.
  • Eliminate the need for developers, DevOps engineers, DBAs and data architects to create and maintain a data lake with this service from CBTS.
  • 95% savings in processing costs vs. a standard data lake in the cloud.
  • 95% faster processing vs. a standard on-prem data lake solution.
  • Optimized data lake tools for ServiceNow data for better and faster insights.
  • Scalable to billions of records.

Where legacy databases struggled with unstructured data, Data Lake Kickstarter can access nearly any data thanks to Simple Data Integrator (SDI). The serverless framework of SDI allows for simplified product deployment and extensive data access. With these tools, users will have full access to the data they need without struggling with the interface or backend barriers.

During the Howdy Partner live stream, Tim Selaty demonstrated the Data Lake Kickstarter’s features in real-time, showing viewers how workflows can be tweaked and customized to save time and effort. By simplifying both product deployment and data access procedures, Data Lake and its various features make life simpler for developers and end-users alike.

Also read: CBTS Managed Public Cloud: Powered by AWS

Find the right partner to help you build your data lake

Knowing how to build a data lake environment is only part of the battle—being able to call upon a qualified partner is also invaluable. As one of only 258 AWS Advanced Consulting Partners worldwide that is certified as Well Architected, CBTS is up to the challenge. Data Lake Kickstarter by CBTS offers a shared ownership model, meaning CBTS is accountable for the performance of your organization’s cloud environment.

Contact us to learn more about the business benefits of deploying an AWS data lake environment customized for your enterprise.

Delivering the promise of Microsoft Workloads on AWS

This blog will illuminate how CBTS can deliver the full potential of Microsoft Workloads on AWS to your enterprise.

A nimble, scalable public cloud has the ability to improve the bottom line and drive internal efficiencies for the busiest of enterprise organizations. A dozen years of satisfied customers running Microsoft Workloads on AWS indicates a platform known for high performance and reliability, not to mention top-of-the-line security, migration support, ownership cost, and flexible licensing options.

Microsoft Workloads on AWS is known for high performance and reliability.

CBTS can enhance your digital transition on the service and management side, delivering deep functionality and a robust suite of services to lower the costs of running Microsoft Workloads on AWS. Benefits of this best-in-class technology include:

  • Dependability. Globally-dispersed availability zones result in fewer downtime hours.
  • Cost-effectiveness. AWS helps customers save on Microsoft Workloads compute costs through pricing models such as Spot and a full set of Amazon Elastic Compute Cloud (EC2) instances.
  • Powerful security. A selection of security, compliance, and governance services makes AWS the superior choice over the next largest cloud provider.
  • Performance efficiency. SQL Server on AWS enables faster performance when utilizing a TPC-C-like benchmark tool as compared to competing cloud providers.

Also read: The value of the cloud in our new reality

AWS brings premium performance and capabilities

AWS is the ideal cloud on which to run Microsoft Workloads, while CBTS will support everything your company needs to modernize Windows-based applications on the cloud. Partnering with CBTS brings top-tier performance to your network alongside a broad range of services.

Here’s a deeper dive into what CBTS on AWS can offer your fast-moving organization:

Optimized performance and dependability: The AWS global infrastructure includes 77 availability zones across 24 regions. Re-establishing your Microsoft Workloads on AWS will give your customers a 98% reduction in unplanned downtime, 71% faster deployment, and 26% higher developer productivity, per figures from IDC. According to Principled Technologies, AWS delivered twice the performance and 62% lower costs for an SQL Server workload when tested against the next largest cloud provider.

Top-flight security and identity services: AWS boasts 230 security, compliance, and governance features—about five times more than the nearest industry competitor. Through the AWS Nitro System, for example, offloading virtualization functions to dedicated hardware and software minimizes the attack surface. Busy enterprise organizations can rely on AWS Identity Services for the management of identities and permissions at scale.

Lower TCO: By running Windows on AWS, companies can reduce five-year operational costs by 56% and infrastructure expenses by 37%, according to IDC. AWS also offers clients pricing models like Amazon EC2 Spot, which can reduce costs by up to 90% on compute instances for fault-tolerant workloads.

CBTS enhances the AWS adoption process in four key ways:
  • Assessment. CBTS provides a detailed analysis of existing network infrastructure, allowing clients to understand the entire IT environment and how it supports Microsoft Workloads on AWS.
  • Application Modernization. A team of certified AWS experts from CBTS will help you establish the most efficient, cost-effective cloud environments, including the refactoring of existing applications.
  • Easy transition. Once design is complete, CBTS engineers convert legacy applications and infrastructure, then oversee the entire atmosphere’s migration to AWS.
  • Management. A fully-managed Microsoft Workloads infrastructure provided by CBTS is bolstered by day-to-day monitoring and adjustment.

Also read: Efficiency, security, and savings are in reach with managed database services

A nimble and innovative enterprise cloud provider

AWS is known as an innovative cloud provider thanks to new capabilities and services introduced to enterprise customers. Take the case of a travel-booking outfit with plans to migrate 80% of its mission-critical apps to the cloud; while the company considered a host of other cloud providers, it chose AWS due to its global infrastructure support of Asia-Pacific customers.

Today, this customer relies on AWS to develop applications, troubleshoot problems, and scale to process large volumes of data. By using AWS to build a standard deployment model, IT teams can rapidly create infrastructure for new initiatives.

Task CBTS with your digital transformation

Leveraging CBTS to migrate your Microsoft Workloads to AWS allows you to commission one, hundreds, or thousands of server instances simultaneously, with greater reliability across globally dispersed availability zones. As a seasoned provider for digital modernization, CBTS brings numerous critical capabilities to the development of hybrid cloud infrastructure and to the AWS environment itself.

Also read: How to build a full-spectrum cloud security strategy

Read this e-book to learn more about how CBTS can assist with your Microsoft Workloads on AWS transformation.

Read more from Kevin Muldoon, Sr. Director of Cloud Transformation:

Securing success with an AWS Well-Architected Framework

Take control of third-party programs with proprietary AWS platform solution

What does a successful migration to the public cloud look like?

Answering the vital question: What is NaaS, and how can it improve business outcomes?

Modern organizations face a host of communication obstacles, among them the issue of business growth outstripping network resources. When IT personnel are stretched thin, enterprise networks can suffer increased downtime and higher operating expenses. This can lead to lag and outages even during day-to-day operations.

Businesses across industry sectors are turning to Network as a Service (NaaS) to streamline their IT needs. With this solution, clients can offload their daily tasks to CBTS experts, freeing up resources better dedicated to mission-critical strategic initiatives.

So what exactly is NaaS, and how can it benefit your enterprise? NaaS from CBTS provides businesses with a scalable method of supporting, maintaining, expanding, and securing modernized commercial networks. This service also enables networks complete with cloud integration, security, switching, Wi-Fi, management, monitoring, and software-defined wide area networks (SD-WAN).

Whether you have a single location or multi-site operation, NaaS delivers features that support the unique needs of every client, including:

  • 24×7 monitoring, management, and support.
  • Instant service upgrades over any connection type.
  • Connections to single locations or multiple offices.

This article will cover the advantages, along with how this technology can be integrated into your business.

Watch: NaaS analytics provide business intelligence to drive growth

Simplifying the everyday needs of any enterprise

When asking the question, “What is NaaS?” executives must understand the limitations of their outdated network environments. Traditional networks are simply not equipped to meet today’s high demand for reliable, full-spectrum communication.

NaaS can provide managed networking capabilities as both a comprehensive service and a simple transport method. When combined with SD-WAN and hosted unified communications (UC), NaaS can address a wide variety of Wi-Fi and firewall concerns. NaaS is easily scalable for small and midsize businesses (SMBs) requiring network expansion across multiple locations or facilities. 

MPLS architecture vs NaaS architecture

As a managed service, NaaS essentially means clients rent cloud networking services from a provider. When deployed properly, it can replace hardware-centric VPNs, load balancers, firewall appliances, and multiprotocol label switching (MPLS) connections.

NaaS is built to simplify the day-to-day processes of every client, regardless of vertical or the size of their business. For example, it can regulate guest Wi-Fi usage and provide unified threat management (UTM) protection. It can also turn retail networks into powerful tools capable of enriching the customer service experience.

NaaS can improve critical municipal data infrastructure functions such as emergency services, utility management, and even traffic control. Just as crucially, it allows municipalities to shift costs from a capital expenditure (CapEx) model to an operational expenditure (OpEx) model. The results are lower upfront costs for network rollouts and an organization-wide focus on improving public services for constituents.

To ensure client success, CBTS establishes a monitoring environment that allows for continuous system auditing, patching, and improvements. A partnership between CBTS and software vendor ServiceNow allows for automated monitoring processes that proactively maintain their network databases long-term.

Also read: What is Cloud Networking?

By harnessing NaaS, industries of all types are driving real business outcomes.
  1. A concrete manufacturer with 17 locations used NaaS to increase visibility in their network and save over 87% in network upkeep costs.
  2. An auto retailer with 16 sites reduced reliance on expensive multi-protocol label switching (MPLS) connections.
  3. CBTS solved a school district’s unique needs for a reliable, secure network.
  4. A specialty foods distributor in the Southern U.S. embraced NaaS to integrate new locations and create a consistent customer experience.
  5. A single-site, customer-facing business used NaaS to regulate guest Wi-Fi experiences and keep customer data secure.

What is NaaS? A future-forward, managed solution for your business

Through a NaaS solution, enterprise networks can quickly expand to new locations and automatically provision site-to-site VPNs for authorized users. Enterprises can also use it to ensure access to organizational servers. CBTS has established NaaS for more than 600 clients at 5,000 locations since launching the solution in 2016. This was possible with the help of partners such as Cisco and VMware.

What are the benefits?

  • No-touch deployment and management, where certified experts support the network on a year-round basis.
  • A centralized, mobile-enabled cloud dashboard with real-time data and network analytics.
  • Application, user, and device control where users can be searched for by any device—PC, mobile phone, tablet, and more.
  • Improved, lower-cost performance and security compared to MPLS.
  • Flexibility to add new network locations without the expenses associated with MPLS expansion.

What are the disadvantages?

  • Usage of a public cloud carries some inherent risk of vulnerability—keep this in mind when selecting a provider.
  • Downtime is another potential hazard of managed network solutions.
  • Dependency on managed services, or “vendor lock-in,” is another aspect of managed cloud networking that merits consideration.

Also read: How SD-WAN & NaaS come together to supercharge remote work productivity

Integrate Cisco Meraki-powered NaaS into your network

NaaS can be enhanced by Cisco Meraki, a platform that enables seamless expansion, maintenance, and security of a managed network. This puts the enterprise network in capable hands during scale-up, giving clients room to reallocate IT resources toward growth objectives.

Meraki-powered NaaS bolsters geographic expansion and increased revenue without the added burden of unnecessary operational expenses. Pricing packages that fold hardware, licensing, configuration, and implementation into one monthly expense can be scaled as needed.

The current iteration of NaaS by CBTS comes complete with Meraki’s single-pane-of-glass dashboard. This originated from a need to modernize outdated equipment without increasing costs or otherwise negatively impacting a client’s budget. The Meraki dashboard features a managed service provider’s (MSP) portal; this enables the management of multiple customer networks with fewer personnel than is possible with simple network management protocol (SNMP) tools. The partnership with Meraki has cut down on the provider’s quote-to-cash interval and operating expenses.

Also read: How Cisco Meraki + CBTS NaaS team up to deliver cost-efficient modernization for your network

A streamlined network solution built for your needs

Commitment, flexibility, and collaboration are the benchmarks for answering the question, “What is NaaS, and what can it do for my enterprise?” By utilizing the latest technology available and anticipating customer needs, CBTS delivers future-forward managed solutions for organizations of every size.   

Contact us for more information on how CBTS NaaS can work for you.