Agencies want control over their data as they embrace the cloud — here’s how

Although the initial pace has been slow, federal agencies have been quietly adopting cloud services in recent years. Federal cloud spending has hovered in the hundreds of millions of dollars annually, and it is expected to climb into the billions in current and future years.

Many experts expect a significant expansion of federal cloud adoption in the near term for several key reasons. First, most agencies have already made initial forays into the cloud, gaining a better understanding, learning lessons, and building confidence that they can apply to more ambitious cloud ventures.

Second, agencies face considerable internal and external pressure to treat the cloud as a critical tactic as they modernize IT operations for improved service delivery and mission support. Internally, agencies are under heavy pressure to rationalize costly IT infrastructures, step up security, address mounting data storage needs, and keep pace with the customer-centric service-delivery models citizens have come to expect from the commercial world. Externally, initiatives such as the recent President’s Management Agenda, the new IT modernization Centers of Excellence being established by the General Services Administration, the Data Center Optimization Initiative (DCOI), and the Modernizing Government Technology Act (MGT Act) are aligning to press agencies to look seriously at cloud options.

A key concern for federal agencies pursuing cloud options is having the flexibility to operate seamlessly across multiple on-premises and cloud-based infrastructures. They require the mobility to migrate data freely (and securely) from one IT environment to another, manage access to data wherever it is hosted, and operate across multiple environments as needed. With this capability, end users should not need to know where their data resides, and security, access controls, and governance policies should follow the data wherever it goes, whether at rest or in transit. In practice, these concerns often stymie agencies’ efforts to migrate legacy applications and data into modern cloud-based environments.

Neola Group helps federal clients navigate these challenges by deploying a single, on-premises system that stores, shares, syncs, protects, preserves, analyzes, and retrieves file data for multiple on-premises and cloud-based environments. The system — called the Hitachi Content Platform (HCP) — effectively serves as a policy-based data router that “sees” all data, wherever it resides, and determines where those data are needed based upon the IT, mission, and security policies associated with them. Configurable policies regulate which users and applications can have access to certain data, how long data should exist, and what security controls should apply.
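The policy-based routing idea described above can be illustrated with a minimal sketch. This is a hypothetical example for clarity only — the `Policy` fields and `route` function are illustrative assumptions, not HCP’s actual API:

```python
from dataclasses import dataclass

@dataclass
class Policy:
    """Illustrative data-governance policy (not an HCP construct)."""
    classification: str      # e.g. "public" or "sensitive"
    allowed_users: set       # who may store/access data in this class
    retention_days: int      # how long the data should exist
    target_store: str        # where the data should live

def route(file_meta: dict, policies: dict) -> str:
    """Return the storage target dictated by the file's classification policy,
    enforcing the policy's access controls along the way."""
    policy = policies[file_meta["classification"]]
    if file_meta["owner"] not in policy.allowed_users:
        raise PermissionError(f"{file_meta['owner']} may not store {file_meta['name']}")
    return policy.target_store

# Example policy set: sensitive data stays on-premises, public data may go to the cloud.
policies = {
    "public": Policy("public", {"alice", "bob"}, 365, "cloud_object_store"),
    "sensitive": Policy("sensitive", {"alice"}, 2555, "on_prem_archive"),
}

print(route({"name": "report.pdf", "owner": "alice", "classification": "sensitive"}, policies))
# -> on_prem_archive
```

The point of the sketch is that the placement decision lives in the policy, not in the application: end users and applications ask for data by name, and the governance rules determine where it resides.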

HCP also provides powerful archiving capabilities that eliminate the need for tape-based backups. Its high-density storage offers built-in compression, single instancing, and support for a variety of media to keep storage costs under control. With dynamic data protection, content integrity checks, data retention enforcement, erasure coding, and many other technologies to preserve and protect content, HCP delivers compliance-quality data lifecycle protection.

HCP enables end users to rapidly access and query data and applications across physically disaggregated storage pools to get what they need. Data can reside in and be seamlessly accessed from a cloud-based application or a private data center, or both, depending on the policies in effect, providing maximum agility.

This capability is especially valuable for agencies looking to connect cloud-based applications to their legacy data stores, or to consolidate multiple disaggregated data stores — sometimes called “data puddles” — into an aggregated data lake. It also helps tame the accelerating growth of unstructured data and application proliferation that many agencies are experiencing. And it preserves agencies’ flexibility to employ multiple clouds, or to disengage more easily from one cloud service provider in favor of another if they choose.

As federal agencies continue embracing cloud options, they want to maintain the ability to easily adjust those deployments along the way. Employing a transparent data layer that sees and manages all data — regardless of type or where it resides — gives agencies a critical and empowering capability to control exactly how they leverage those cloud investments, without having to worry about how their data inventories may be affected.


To learn more about how Neola Group can help your agency, go to www.neolagroup.com.
