Object Storage Isn’t Just S3 Access

High-throughput, feature-rich S3 object storage for hot/warm content and search, combined with the economics of tape

Many in the storage world still have a skewed view of object storage. This really isn’t a surprise, given that there are three concepts that need to be defined when talking about object storage, and different storage vendors often define them differently. The three concepts are interface/protocol, object structure/system and storage media. In this blog, I am going to focus primarily on interface and protocol.

Object Storage Interfaces and Protocols

Before Amazon S3, every object storage solution had its own proprietary RESTful interface enabling distributed access over HTTP. In fact, the now de-facto-standard Amazon S3 API is itself a proprietary interface that Amazon has been kind enough to open up to the broader market, and it is now supported by all major object storage solutions. Most vendors still provide their own proprietary API for feature sets beyond what the S3 API allows. Adding to the complexity, the S3 interface is also available on non-object storage devices like filers, NAS devices and now tape. If you would like a high-level overview of storage interfaces and protocols, check out this blog.
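
To make that concrete, below is a minimal sketch in Python (using the boto3 SDK) of what “speaking S3” looks like in practice. The endpoint URL and credentials are placeholders, and the same calls work against any vendor that exposes the S3 API, whether it is backed by disk, a filer gateway or tape.

```python
# Minimal sketch: the S3 API is the common denominator. Endpoint and
# credentials below are placeholders -- swap in your own object store's values.
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://objectstore.example.com",  # hypothetical S3-compatible endpoint
    aws_access_key_id="ACCESS_KEY",
    aws_secret_access_key="SECRET_KEY",
)

# The same call works whether the backend is a pure object store,
# a NAS with an S3 gateway, or a tape-backed archive tier.
for bucket in s3.list_buckets()["Buckets"]:
    print(bucket["Name"])
```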

S3 on NAS and Filers

For many traditional workflows, S3 access on an existing NAS or filer may be an adequate “quick fix.” It really depends on scale. As capacity and the underlying NAS and filer infrastructure grow, so do the traditional NAS and filer issues of cost, recovery times, expansion and management. Additionally, as distributed workflows become the norm and file counts grow, the need to provide easy-to-manage distributed access to an increasing user base and to enable metadata customization and search becomes more important.

Additionally, I would caution that a NAS that performs well for traditional file-system-based workflows will not necessarily deliver similar performance for RESTful use cases. When you layer an S3 interface on top of a file system, there is continual translation and conversion between the file system and the S3 interface. Ultimately, depending on load, your system will reach its limit and you will need to expand. This is where both cost and management become an issue when you use NAS/filers with an S3 interface, versus implementing a pure object storage solution.

S3 on Tape

For many years, tape and object storage were seen as competitors, a direct result of object storage being relegated to the “cheap-and-deep” category of storage. It should come as no surprise that tape is, and for the foreseeable future will likely remain, the cheapest way to archive content. That said, the use of pure object storage as a hot/warm tier in front of a cold tier of tape-enabled object storage has increased in popularity. This is how Amazon has architected the S3 service, utilizing disk-based object storage with millisecond response times while also offering S3 Glacier Deep Archive with response times measured in hours.
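
To illustrate that hot/warm-in-front-of-cold pattern on Amazon itself, here is a hedged sketch of an S3 lifecycle rule that keeps objects on disk-backed storage for 90 days and then transitions them to S3 Glacier Deep Archive. The bucket name, prefix and day count are assumptions made for the example.

```python
# Sketch: tier objects from the hot/warm disk tier to the cold, tape-like tier.
# Bucket name, prefix and the 90-day threshold are illustrative assumptions.
import boto3

s3 = boto3.client("s3")

s3.put_bucket_lifecycle_configuration(
    Bucket="media-archive",  # hypothetical bucket
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "cold-tier-after-90-days",
                "Filter": {"Prefix": "finished-projects/"},  # hypothetical prefix
                "Status": "Enabled",
                "Transitions": [
                    # Objects stay on millisecond-latency storage for 90 days,
                    # then move to Glacier Deep Archive (hour-scale retrieval).
                    {"Days": 90, "StorageClass": "DEEP_ARCHIVE"}
                ],
            }
        ]
    },
)
```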

The market has followed suit. Earlier this year, FUJIFILM Corporation announced their Object Archive software, which uses OTFormat, an open-source file format they developed specifically for object data. Caringo Swarm certification was just announced earlier this week, but we have been working with FUJIFILM on the integration for some time.

Benefits of the Caringo + FUJIFILM Solution

The Caringo Swarm Intelligent Data Management Platform + FUJIFILM Object Archive solution brings numerous benefits to organizations by combining high-throughput, feature-rich object storage for hot/warm content and search with the economics of tape. This solution is ideal for organizations looking to provide a complete S3-like service internally or externally. Amazon has optimized availability, performance and cost by mixing the performance characteristics of HDD and tape under the S3 API, and the combination of Caringo Swarm and FUJIFILM Object Archive delivers the same result.

The Caringo + FUJIFILM solution is also ideal for highly secure deployments with petabytes of data, as the solution can enable RESTful-interface-based workflows while still benefiting from the cost-effectiveness of putting infrequently accessed data on a cold archive enabled by tape. And, what makes it even better is that integrating the storage products is straightforward, as demonstrated in this video.

S3 on a Pure Object Structure/System

When you get down to it, the underlying structure and system that directs the incoming data to storage media and infrastructure is what defines object storage. In all object storage systems, there is some form of key/value addressing system in a flat address space—meaning data is “checked in” and an ID for that specific piece of data is returned. All you need is that ID to retrieve it. This is the “key” to object storage (pun intended) and what makes object storage ideal for enabling distributed workflows.

You don’t need to keep track of server names, directory structure and file names. In addition, you have the ability to manage multiple tenants, customize metadata, search, and deliver content directly from the storage layer over HTTP (with range reads). As distributed workflows become the standard, storage devices optimized for distributed access will become more critical for all types of organizations.
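
As a short, hedged illustration of that key/value model through the S3 API: you store an object under a key, attach your own metadata, and later retrieve it (or just a byte range of it) knowing nothing more than the bucket and key. All names and values below are illustrative.

```python
# Sketch of key/value-style access over the S3 API (names are illustrative).
import boto3

s3 = boto3.client("s3", endpoint_url="https://objectstore.example.com")  # hypothetical endpoint

# Store: the bucket + key is the only "address" you ever need to keep.
with open("interview-master.mov", "rb") as source:
    s3.put_object(
        Bucket="assets",
        Key="2020/11/interview-master.mov",
        Body=source,
        Metadata={"project": "brews-and-bytes", "codec": "prores"},  # custom metadata
    )

# Retrieve just the first megabyte with an HTTP range read -- useful for
# previewing or delivering content directly from the storage layer.
clip = s3.get_object(
    Bucket="assets",
    Key="2020/11/interview-master.mov",
    Range="bytes=0-1048575",
)["Body"].read()
```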

An Ending Note

With all that said, this should not be an “either-or” discussion. You should take a tiered approach employing different storage devices and solutions based upon your evolving requirements.

I invite you to register for episode 13 of our Brews & Bytes webinar where I will host a panel to discuss the future of data archive. My guests will be Eric Dey, Caringo Director of Product, Rich Gadomski, Fujifilm Head of Tape Evangelism, and Nami Matsumoto, Fujifilm Director of Data Management Solutions. We will discuss:

  • The functional role of the archive in today’s workflows
  • Leveraging tape, HDD and object storage to optimize accessibility and cost
  • Where archive technologies are headed, and a look at active archiving

As always, we are here to help. Let us know if you want to set up a consultation with one of our experts to discuss your needs.

View Webinar

Swarm 12 Intelligent Data Management for Content Access, Delivery & Archive

Swarm v12 is still object storage, but has evolved into an intelligent data management platform for content access, delivery and archive

As if the data landscape was not already challenging, it became even more so because of COVID-19. With myriad companies suddenly moving to an almost completely remote workforce, the accessibility of files for remote workers was and remains on the critical path for just about every type of business you can imagine.

Challenges for Storing Data in the 21st Century


As Swarm release version 12 was in planning, we identified a number of challenges that organizations faced in dealing with data in this third decade of the 21st century. As data piled up, traditional storage was often augmented with public cloud storage. For some, a public cloud solution was ideal, but for others, it was lacking in security, accessibility and affordability. So, with Swarm 12, we set out to make it easier for businesses to:

  • Add storage capacity continuously while transparently providing data access
  • Comply with industry and government regulations
  • Keep both TCA (total cost of acquisition) and TCO (total cost of ownership) within budget
  • Archive data securely long term
  • Ensure that remote staff can access the data they need
  • Leverage cost benefits of public cloud for disaster recovery

What’s New in Swarm Version 12?


In Swarm 12, we continue to evolve object storage into an intelligent data management system. From the beginning, we’ve worked to make Swarm smart (if you are not familiar with The Smarts of the Swarm whitepaper, it is worth a read). And, in our latest release, we’ve focused on enabling more flexible distributed protection and immediate global content use across geographically dispersed sites. We’ve done this through a variety of changes including:

  • Automated management of distributed synchronous workflows via remote synchronous write (RSW) policies
  • Ease of use for end-users as they store, organize, find and share files through an updated user interface (UI) that includes drag & drop, multi-upload and an improved content view that is familiar to users
  • Expanded identity management connectivity via single sign-on (SSO) with SAML 2.0 that includes support for services like Okta, OneLogin and Google G Suite
  • Cost optimization for long-term (cold) storage leveraging tape through AWS S3 Glacier and AWS S3 Glacier Deep Archive support and certification with FUJIFILM Object Archive (a tape object solution)
  • Architectural optimizations for flash and optimal utilization of dense storage nodes


Learn More About Swarm Version 12


Join me and Sr. Consultant John Bell for our Tech Tuesday webinar to learn more about Swarm 12 Intelligent Data Management for Access, Archive and Content. We will take a deep dive into new features and functionality as well as the benefits that Swarm 12 brings to organizations struggling with storing and managing data. Feel free to bring your questions to the webinar, or email them to us at info@caringo.com if you need more information immediately.

Register Now

Fighting Pandemic Boredom? Technology to the Rescue

From doom scrolling to binge watching to video conferencing, technology is used not just for work but to fight pandemic boredom.

With new lockdowns being announced across the world and travel restrictions in place (even between some states in the US), the world now has a second pandemic to add to COVID-19—boredom. For most of us, the cure for boredom is at our fingertips, and wowee—have we used those fingers!

From doom scrolling on social media to binge watching series on streaming services to video conferencing not just with colleagues but with our family and friends—technology has been a lifeline for us to find information, connect with others and keep ourselves entertained. And, of course, for many of us, it is how we get our work done.

What Do the Statistics Show About Rise in Use of Technology?

According to Ofcom’s annual study of UK media habits, adults spent nearly six and a half hours per day watching TV and online video, and 12 million people signed up for new streaming services (such as Netflix, Amazon Prime and Disney+). The statistics for use of video conferencing tools are also astounding. Zoom’s revenue has quadrupled during the pandemic, with the tool being used for work, education, telemedicine and socialization. Location has become increasingly unimportant for many workers and students, and it looks like many of these changes are here to stay, at least for the foreseeable future.

The “New Abnormal” or Just More of the Norm?

We’ve all heard the term “the new abnormal” thrown around, but, for many, this is just more of the norm, given that the infrastructure for all of these activities was in place long before any of us heard of COVID-19. While certain activities we love might be off the table right now (for example, singing in large groups or going to live concerts and sporting events), for many people, life was already full of technology—from those of us who already worked from home to those of us who were, as my son says, “built for lockdown.”

What Are We Seeing in the Storage Industry?

For the answer to this question, make sure to tune in to episode 12 of our Brews & Bytes webcast, Storage Trends to Be Thankful For!, featuring Caringo’s Adrian J Herrera, VP Marketing, TW Cook, VP Engineering, and Tony Barbagallo, CEO. As we head into the holiday season, they will take a look at characteristics of data storage that we can be thankful for and how technology, storage in particular, has played a role in keeping our world connected during the unprecedented events of 2020.

Questions they will address include:

  • How has storage evolved over the last decade?
  • What true stories can you share about times that organizations were “thankful” for storage?
  • Where is storage headed in the coming decade?

Register now to watch live on November 19 or to be notified when the recording is available.

Why Use a Cloud-Native Approach to Development?

We’ve worked with customers on cloud-native applications since 2006 and that approach is used in our own product development.

“What should we talk about next month on our Tech Tuesday webinar?”

This is a question I ask monthly, and this time, I got an answer that seemed to come out of the blue:

“developing cloud-native applications.”

What is Cloud-Native?

“Cloud-native” is just another way to refer to a native cloud application (NCA), which is a program specifically designed for a cloud-computing architecture, whether that be on-prem, remote or hybrid.

Developing cloud-native applications may not seem like something an object storage company would talk about. But then again, Caringo isn’t just any object storage company—it is an organization that has always been ahead of its time.

What Does the Cloud-Native Approach Have to Do with Object Storage?

According to Caringo’s Principal Solutions Architect Ryan Meek, who joined Caringo in 2006 as a developer, we’ve worked with customers on cloud-native applications since the initial launch of our object-based storage—long before the term “cloud-native” was coined. Ryan also explained that the cloud-native approach has greatly impacted the processes that we use for our own product development.

Why Use a Cloud-Native Approach to Developing Applications?

Applications developed for cloud-native services can scale as needed, and they can leverage immutable infrastructure, containers, service meshes and declarative APIs flexibly on private, public or hybrid cloud storage. Benefits to this approach include cost-effectiveness, scalability and portability of code.
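
As a rough sketch of what those principles can look like in practice, here is a hypothetical stateless worker in Python: configuration comes from the environment and durable state lives in S3-compatible object storage rather than on the local node, so identical containers can be scheduled, replicated or moved between private, public and hybrid clouds. All names are illustrative rather than taken from any particular product.

```python
# Hedged sketch of a cloud-native pattern: a stateless worker that keeps no
# local state and reads all configuration from the environment, so identical
# containers can run on any private, public or hybrid infrastructure.
import json
import os

import boto3

# Declarative configuration -- injected by the orchestrator, not hard-coded.
ENDPOINT = os.environ.get("OBJECT_STORE_ENDPOINT")  # e.g. an S3-compatible URL (hypothetical)
BUCKET = os.environ.get("RESULTS_BUCKET", "results")

s3 = boto3.client("s3", endpoint_url=ENDPOINT)


def handle(job_id: str, payload: dict) -> None:
    """Process one job and persist the result to object storage."""
    result = {"job": job_id, "items": len(payload.get("items", []))}
    # All durable state goes to the storage layer, so this process can be
    # killed, rescheduled or scaled out without losing anything.
    s3.put_object(
        Bucket=BUCKET,
        Key=f"jobs/{job_id}.json",
        Body=json.dumps(result).encode("utf-8"),
        ContentType="application/json",
    )
```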

Learn More About Developing Cloud-Native Applications

Join us November 3 for our Tech Tuesday webinar, Developing Cloud-Native Applications. Ryan Meek will be our subject-matter expert and, as host, I’ll be asking the questions, including:

  • What are the benefits of a cloud-native approach?
  • What are the top 3 mistakes developers make when developing cloud-native applications?
  • What best practices should be followed when developing cloud-native applications?

In addition, Ryan will reveal how our Caringo Swarm Intelligent Data Management & Object Storage platform uses the cloud-native approach to provide the massively scalable, self-healing object-based storage we are known for.

Make sure you stick around until the end of the webinar for the live Q&A, and bring your questions!

Register Now

In Conversation with Media Distillery

In this IABM TV interview, Roland Sars (CEO, Media Distillery) discusses Media Distillery’s main product offerings and how customers are using video intelligence to drive a better user experience. Roland also discusses their priorities for the next 6-12 months.

In Conversation with Pebble

We caught up with Peter Mayhead, CEO at Pebble, to discuss their recent rebrand: why they chose to rebrand, what it means for them and their customers, whether there will be any changes as a result, and where Peter sees the future of the industry and Pebble as we start to emerge from COVID.

In Conversation with Metaliquid

We catch up with Metaliquid’s Head of Sales Maria Lodolo D’Oria to find out more about their recent project with Minerva, where they delivered a revolutionary viewing experience of the recent US Presidential Debates.

Accelerating & Redefining Remote Workflow Technologies

With the pandemic bringing change across the globe, journalists and broadcasters are adapting to working from home with remote workflow technologies. Powered by IABM’s data-driven insights on the business of media technology, this session explores the drivers of change in the news sector, including the impact of COVID-19 on news consumption, workflows and technology demand. Learn more from a panel of experts as they debate how technology is helping accelerate remote news production and distribution workflows.

Charting The Uncharted – Exploring New Technologies & Business Models In The Broadcast & Media Industry

The session looks at the challenges and opportunities in this new media ecosystem.

IABM’s Head of Insight and Analysis, Lorenzo Zanni, opens the event, sharing IABM’s latest research and analysis of the current state of Broadcast and Media, and how its future is likely to unfold over the coming months and years. This is then followed by a panel of experts from the Broadcast and Media Industry who explore the challenges and opportunities that new technologies and business models will bring to the industry.