Last year was a big year for the cloud. The provider landscape became more competitive in 2016, with the likes of Google Cloud Platform, IBM, Microsoft Azure and Oracle stepping up to challenge AWS’s dominance and secure positions as viable contenders for enterprise cloud business.
Legacy applications and systems are an important component of nearly all established organizations. With investments already made, it’s hard to abandon them when new technologies like cloud become available. Yet many IT professionals are still eager to start leveraging the cost, scale and other benefits of cloud-based object storage. However, running legacy applications in the cloud presents several obstacles. This "Ask Avere Anything" video takes a look at those challenges and discusses how they can be easily solved.
Adding cloud accessibility to your data center isn’t a decision most IT directors and systems engineers enter into lightly. A few posts back, in Five Benefits of Hybrid Cloud Data Centers, we shared five benefits of going hybrid within a data center environment. We only scratched the surface, and we’re back with another five reasons why hybrid cloud has enterprise data centers looking beyond their four walls.
When one thinks of healthcare IT, visions of doctors and nurses typing away on laptops mounted on wheeled carts come to mind. Lately, to improve personalized service, hospitals have shifted these stations to “charting rooms” that look like a cramped college computer lab from the ’90s, before every kid had a computer in their dorm room. However, healthcare IT that supports research looks very different. As innovative research into personalized and precision medicine increases, how these two opposed arms of the same system operate comes into increasing focus.
This article originally appeared in modified form on Cloud Computing Today as a guest blog post on August 29, 2016.
THE ART OF THE CLOUD WARS
Chinese military strategist Sun Tzu once wrote that battles are won or lost before they are ever fought, but can the same be said for the cloud wars? Though many industry thought leaders have made predictions, the future of how the battle between public cloud providers will unfold remains hazy. Despite documented projections that global IT spending will fall in 2016, investment in public cloud services is expected to grow 16% this year, fueling the fire of the ongoing war. Most industry experts agree that AWS, Google Cloud Platform, Microsoft Azure and IBM Cloud Services are the key players to watch; however, many CIOs still struggle to determine which cloud service provider is right for them.
With so many initiatives targeting efficiency in government data centers over the past decade, many agencies have begun the transition to alternative cloud-ready solutions. But for many others, cloud readiness is still in the “someday soon” pile. Many things delay cloud adoption — resources and budget frequently top the list. While the shift may seem insurmountable, when broken into smaller steps, making the data center ready to accept cloud resources is both manageable and rewarding.
In a recent webinar, Avere’s director of cloud products, Scott Jeschonek, suggested three steps that not only assist in gaining access to the cloud, but also help modernize the federal data center overall. In this instance, modernization refers to adopting new, recommended cloud compute and storage rather than more efficient data center heating and cooling mechanics. Each step adds flexibility and performance while requiring little disruption to daily operations.
"And it sounds a little bit simple, but the reality is that there's so much inertia all over these organizations in continuing to do things the same way they've been done for the last number of years, for a variety of different reasons," Andy Jassy said from the stage at the AWS Public Sector Summit in Washington, D.C., on Tuesday.
Speaking in his keynote, the head of Amazon Web Services (AWS) talked not only about cloud inertia, but also about how the retailer got into the cloud services business, the shifting market from servers to services, and the facilitation of innovation to support public sector missions.
One of the object storage use cases gaining traction is active archive, a storage library where users can access older files more easily than with other archival options such as tape or long-term cold cloud storage like Amazon Glacier. Object storage offers a nice combination of benefits that aligns well with growing collections of content — from digital media files to the historical trend data commonly used by analysts in many industries.
When storing file data as cloud objects, many people automatically take a direct approach by writing applications to natively use object storage for data processing. While this can have some advantages, it isn’t the best choice for every workload. So, when storing cloud objects, what is the best way?
In a recent Ask Avere video, we answered this question by comparing file-to-object with direct native object calls, explaining the key differences between the two approaches. However, we felt the topic was important enough to warrant an expanded response. To do this, we’ll look at what you get with cloud file storage that you don’t get with direct native object calls to your chosen cloud storage provider.
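To make the contrast concrete, here is a minimal Python sketch of the two approaches. It is illustrative only and not from the original post: the bucket name, key, and mount path are hypothetical, and the native-call example assumes the boto3 SDK and S3-style storage; the file-to-object example assumes a gateway or filer has already presented the bucket as an ordinary POSIX mount.

```python
def write_native(bucket, key, payload):
    """Direct native object calls: the application itself must speak the
    provider's API (S3-style here, via the boto3 SDK, assumed installed),
    manage object keys, and handle provider-specific errors."""
    import boto3  # imported here so the file-based path below has no SDK dependency
    boto3.client("s3").put_object(Bucket=bucket, Key=key, Body=payload)

def write_via_file(path, payload):
    """File-to-object: a gateway presents the bucket as a POSIX filesystem,
    so an unmodified legacy application keeps using ordinary file I/O and
    the translation to object storage happens behind the scenes."""
    with open(path, "wb") as f:
        f.write(payload)

# Hypothetical usage — same logical result, very different application code:
# write_native("example-bucket", "results/run-001.csv", b"sample,data\n1,2\n")
# write_via_file("/mnt/cloud/results/run-001.csv", b"sample,data\n1,2\n")
```

The practical difference is where the integration burden lands: with native calls it lives in your application, while with file-to-object it lives in the infrastructure layer, which is why legacy applications can often move to object storage without code changes.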