Avere Systems: Joining Microsoft to Bring Scalable Hybrid Solutions with Azure

Posted by Ron Bianchini on Wed, Jan 03, 2018 @ 09:00 AM

When we started Avere Systems in 2008, our founding ideology was to use fast, flash-based storage in the most efficient, effective manner possible in the datacenter. Along the way, our team of file systems experts created a technology that not only optimized critical on-premises storage resources but also enabled enterprises to move mission-critical, high performance application workloads to the cloud. Avere’s cloud solutions provide low-latency data access to datacenter storage resources, remote offices and the public cloud. Our customers efficiently share both storage and compute resources across multiple data centers, and effectively implement and use private and public cloud infrastructures.
Read More

Topics: Hybrid Cloud NAS, Technology Community

5 Predictions for What's Coming in 2018

Posted by Gretchen Weaver on Wed, Dec 27, 2017 @ 11:00 AM

As we wind down the year, we took some time to have some fun by asking two of our most vocal employees what they predict 2018 will bring. Technical Director Dan Nydick and Director of Cloud Products Scott Jeschonek narrowed down their thoughts and settled on the following five trends that deserved to be captured as their official new year predictions.

Read More

Topics: Cloud Storage, Technology Community, Cloud Compute

2018 Outlook: Howard Marks Shares Insights on Data Center Trends

Posted by Gretchen Weaver on Thu, Dec 21, 2017 @ 02:00 PM

A swath of revolutionary new technologies is transforming the once static data center into a highly dynamic environment. Driving this innovation are the demands of today's modern applications and workloads, which require new levels of agility and performance. In a recent webinar with Avere, Howard Marks, founder and chief scientist at DeepStorage.net, discussed some of the major 2018 data center trends that IT should be paying close attention to––including orchestration, containers, and hybrid cloud––and why these are ripe for adoption. Let's review his thoughts as we get ready for the new year to begin.

Read More

Topics: Enterprise Storage, Data Center Management, Technology Community

The New FXT 5850 Edge Filer for High-Performance Hybrid Architectures

Posted by Jeff Tabor on Wed, Dec 13, 2017 @ 09:20 AM

When we launched the Avere 5000 Series last year, the FXT 5600 model delivered the most performance, density, and capacity of the line. Customers quickly adopted it to support cloud-ready workflows that demanded performance.

But one thing is certain: fast today is not fast tomorrow. So we keep developing to let these workloads run at optimal performance both in the cloud and on-premises. Our next step is the introduction of the Avere FXT 5850 Edge filer, which delivers double the performance, capacity, and network bandwidth of the Avere FXT 5600.

Read More

Topics: Enterprise Storage, NFS Acceleration, Technology Community

One Theorem You Should Understand Before Shifting Applications to the Cloud

Posted by Gretchen Weaver on Wed, Dec 06, 2017 @ 11:31 AM

Not understanding the give and take that exists between consistency, availability and partition tolerance has caused more than a few headaches as people try to shift and optimize applications in the cloud. Like those basic rules you learned in kindergarten, this is one of the biggies to keep in the back of your head now that you're all grown up and working in information technology.

While in Las Vegas for AWS re:Invent, we jumped at the opportunity to let our CEO and former Carnegie Mellon professor return to the board to review the CAP theorem. Grab a cup of coffee and watch this interactive discussion on the give and take between these three important characteristics of computing.
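To make the trade-off concrete before you watch, here is a minimal, purely illustrative sketch (not Avere's implementation, and all class names are hypothetical): two toy replicated key-value stores face a network partition, and each must sacrifice either consistency or availability.

```python
# Toy illustration of the CAP trade-off between two replicas of a
# key-value store. When the network partitions, a CP design refuses
# writes it cannot replicate everywhere, while an AP design accepts
# them locally and lets replicas diverge until the partition heals.

class Replica:
    def __init__(self):
        self.data = {}

class CPStore:
    """Chooses consistency: rejects writes during a partition."""
    def __init__(self):
        self.replicas = [Replica(), Replica()]
        self.partitioned = False

    def write(self, key, value):
        if self.partitioned:
            raise RuntimeError("unavailable: cannot reach all replicas")
        for r in self.replicas:
            r.data[key] = value

class APStore:
    """Chooses availability: accepts writes, replicas may diverge."""
    def __init__(self):
        self.replicas = [Replica(), Replica()]
        self.partitioned = False

    def write(self, key, value):
        # During a partition only the local replica sees the write.
        targets = self.replicas[:1] if self.partitioned else self.replicas
        for r in targets:
            r.data[key] = value

cp, ap = CPStore(), APStore()
cp.partitioned = ap.partitioned = True

ap.write("price", 101)       # accepted, but the replicas now disagree
try:
    cp.write("price", 101)   # refused, so the replicas stay consistent
except RuntimeError as err:
    print(err)
```

Neither choice is wrong; which one fits depends on whether your application can tolerate stale reads or rejected writes, which is exactly the judgment call the webinar walks through.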

Read More

Topics: Data Center Management, Technology Community, Cloud Compute

Moving HPC Workloads to the Cloud

Posted by Gretchen Weaver on Mon, Dec 04, 2017 @ 10:15 AM

Scientists and engineers use High Performance Computing (HPC) to solve complex problems. The requirements for success frequently include big compute, high throughput, and fast networking. For some time, the cloud simply wasn't an option for these workloads: they were too big to move, the latency was too high, or it was simply too costly. But now all of that is changing.

The cloud providers are looking to bring HPC workloads into their infrastructures, as was obvious at this year's SC17 in Denver. Both AWS and Google were talking scale and speed to attendees, promoting the ability to scale parallel tasks beyond what is realistic in traditional local infrastructure. But getting the workloads to the cloud services is another story. To learn how to move file-based applications to the cloud non-disruptively, attendees approached Avere.

In the video below, insideHPC editor Rich Brueckner speaks with Bernie Behn, principal engineer at Avere Systems, about how to get high-performance workloads from network-attached storage (NAS) environments onto the cloud.

Read More

Topics: HPC, Enterprise Storage, Hybrid Cloud NAS

IoT, Automated Cars, and Data Growth: How to Leverage the Cloud

Posted by Christine Tompkins on Thu, Nov 30, 2017 @ 10:20 AM

The Internet of Things (IoT) is a term commonly used to describe everyday devices, such as lights, refrigerators, and even cars, that send and receive data via the Internet. This interconnectivity provides us with more information than ever before. For example, think about how much data you collect every day just walking around with your Fitbit. From counting steps to measuring heart rate, your Fitbit collects all of this data so that you can later visit its app to see your progress throughout the week, trends, and more. All of this adds up to massive data growth requiring more and more storage resources.

Read More

Topics: Hybrid Cloud NAS, Cloud Storage

Choosing an Architecture For Simulations and Backtesting

Posted by Scott Jeschonek on Wed, Nov 15, 2017 @ 01:35 PM

Evaluating Your Options to Boost Alpha Throughput via Cloud Compute

In our last related post, we discussed what alpha throughput is and how technology affects it. By increasing throughput, firms can improve their alpha potential and gain a competitive advantage.

Read More

Topics: Hybrid Cloud NAS, Cloud Compute, Financial Analysis

How to Transfer Large Data Sets to the Cloud

Posted by Christine Tompkins on Thu, Nov 09, 2017 @ 09:53 AM

Each day, organizations are moving more workloads to the cloud, and these workloads are getting bigger. For those with decades of data, often reaching into the petabytes, transferring large data sets to the cloud isn't so simple. It's not only a matter of moving the data, but also of what happens once the data is in the cloud.

First, traditional methods of data movement can be time-consuming. Second, file-based applications that currently use NAS protocols in the data center will now face object-based protocols. Traditionally, this would require rewriting applications, which is also time-consuming and not cost-effective.
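To see why the protocol gap matters, here is a minimal sketch (hypothetical classes, not Avere's product): file-based applications expect POSIX-style open/read/seek semantics, while object storage offers whole-object GET and PUT, so something has to translate between the two.

```python
# Sketch of the NAS-to-object protocol gap. ObjectStore is an
# in-memory stand-in for a real object service (e.g. S3-style
# GET/PUT); FileShim is a hypothetical adapter that presents an
# object as the seekable, file-like stream applications expect.

import io

class ObjectStore:
    """Stand-in for an object store: whole-object GET and PUT only."""
    def __init__(self):
        self._objects = {}

    def put(self, key, data: bytes):
        self._objects[key] = data

    def get(self, key) -> bytes:
        return self._objects[key]

class FileShim:
    """Expose a stored object through a file-like read/seek API."""
    def __init__(self, store, key):
        # Fetch the whole object once and wrap it in a byte stream.
        self._buf = io.BytesIO(store.get(key))

    def read(self, n=-1):
        return self._buf.read(n)

    def seek(self, pos):
        self._buf.seek(pos)

store = ObjectStore()
store.put("datasets/sample.dat", b"ACGT" * 4)

f = FileShim(store, "datasets/sample.dat")
f.seek(4)
print(f.read(4).decode())  # ACGT
```

A real translation layer has to do far more than this (partial reads, caching, writes, consistency), which is why rewriting every application against object APIs, or bridging the protocols transparently, is the central design decision.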

Read More

Topics: Data Center Management, Cloud Storage, Ask Avere Anything

Technology and Alpha Potential

Posted by Scott Jeschonek on Mon, Oct 30, 2017 @ 11:15 AM

People in the financial services industry, especially in the hedge fund space, are always looking for new and profitable trades. However, with large quantities of data available to more players than ever before, the challenge of finding an 'edge' is greater than ever. This challenge reaches back from the analysts into the IT department, as the work is completely dependent on a firm's ability to not only crunch data in less and less time, but also access large amounts of it readily.

Read More

Topics: Data Center Management, Cloud Compute, Financial Analysis