
Tuesday, October 12, 2021

Lessons learned from Spectra Logic following its ransomware attack

Spectra Logic, the well-known secondary storage vendor, has learned from the Netwalker ransomware attack it suffered. The team shared the story during the recent IT Press Tour and explained how it leveraged this tough period to significantly improve the attack-prevention capabilities of its products.

The company designs, develops and builds tape libraries such as TFinity and T950, among others, as well as data management solutions named StorCycle and BlackPearl. Nathan Thompson, founder and CEO, launched the company more than 40 years ago in Boulder, Colorado. Well established worldwide with key partnerships, the firm protects data in 80 countries with more than 20,000 installations. But being a strong player in data protection doesn't immunize it against cyber attacks such as the one it suffered in 2020.

Back to the story: Nathan Thompson himself recounted that stressful moment for his company. On May 7th, 2020, something went wrong as systems started to fail one after the other. Consoles displayed messages that files had been encrypted by Netwalker, and the team realized that a serious attack was threatening the health of the company. All indicators moved in the wrong direction: servers failed, applications stopped, and the business was of course brought to a halt. Management decided to notify the FBI about the attack.

The attackers asked Spectra Logic to pay $3.6 million within a few days to have all files decrypted and the business restarted. Nathan and his team decided not to pay the ransom and started to recover files from the backup images they had internally. In fact, the attack served as proof that Spectra's own technology is a key component in resisting cyber crime. One could argue that Spectra shares this story because it came out of the nightmare positively; what would have happened if Spectra had not been able to restart?

In less than a week, the Spectra IT team was able to recover enough data and systems to allow an incremental restart of the business. It took a few more weeks to clean everything up, but it was clearly a success.

They learned from the experience and realized that their story is not unique, but that what they develop as a storage vendor, and how they recovered, could trigger additional data protection mechanisms within their product line. A common-sense and obvious decision. We see here a positive combination of people and technology, associated with the right decisions at the right moment: proof of good management in a time of crisis.

So product management and company leadership decided to implement in their products the missing or needed features to resist such attacks. The result is a program named "Attack Hardened" to strengthen the product line.


In detail, it means adding new capabilities to the tape libraries, BlackPearl and StorCycle:

  • Tape libraries
    • create a special zone within a tape library to isolate cartridges,
    • consider a real air gap extraction feature to export tapes,
    • of course encrypt all data free of charge, and
    • make the media immutable as WORM media
  • BlackPearl
    • add scheduled snapshots stored on immutable storage (see the sketch after this list),
    • enrich access protection with multi-factor authentication,
    • couple snapshots with backup software like Commvault, Veeam and others, and
    • replicate data sets to another location to create a new level of redundancy and increase data durability
  • StorCycle
    • encrypt data and allow snapshots to be written to BlackPearl NAS,
    • support multiple storage technologies such as disk, tape and cloud, and
    • extend usage to data tiering to minimize sensitive data on more exposed devices.
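
To make the immutable-snapshot idea above more concrete, here is a minimal sketch of how a backup object could be locked on an S3-compatible endpoint using generic boto3 calls. The endpoint, bucket, keys and retention period are hypothetical placeholders, and the article does not confirm that BlackPearl exposes S3 Object Lock in exactly this way; this is only an illustration of the WORM concept.

```python
# Sketch: applying a WORM-style retention period to a backup object on an
# S3-compatible endpoint. Endpoint, bucket and credentials are placeholders;
# this is generic boto3 usage, not a documented BlackPearl or StorCycle API.
from datetime import datetime, timedelta, timezone

import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://blackpearl.example.local",  # hypothetical endpoint
    aws_access_key_id="ACCESS_KEY",
    aws_secret_access_key="SECRET_KEY",
)

# Upload a snapshot image, then lock it in COMPLIANCE mode for 30 days so it
# cannot be overwritten or deleted, mimicking the immutable-snapshot idea.
s3.put_object(Bucket="snapshots", Key="db-2021-10-12.img", Body=b"...")
s3.put_object_retention(
    Bucket="snapshots",
    Key="db-2021-10-12.img",
    Retention={
        "Mode": "COMPLIANCE",
        "RetainUntilDate": datetime.now(timezone.utc) + timedelta(days=30),
    },
)
```

Note that this pattern assumes a versioned bucket with object lock enabled; the point is simply that a retention date, not an administrator, decides when the copy can disappear.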

This story is remarkable, as Spectra is transparent about its attack, how it contained it and how it restarted from it. The company learned some key lessons and identified missing or needed features to enhance its products. This transparency now seems to be a common attribute of the company. We also learned that something new is coming: a new approach to data management.



Friday, October 08, 2021

Quantum unveils a strong cold data storage model

Quantum, a leader in secondary storage, joined the IT Press Tour yesterday to cover in detail the recent news about its cold data strategy.

Long recognized as a trusted vendor thanks to its rich portfolio, Quantum continues to develop its product line with technologies built internally but also obtained through several acquisitions such as Atavium and CatDV, and I expect a similar pattern with Pivot3 and EnCloudEn.

For sure, this will help improve the image of the company, which remains strong on the product and technology side but questionable on the financial side. Since Jamie Lerner joined as CEO in 2018, I have seen great progress. One visible effect is the hyperscaler adoption of Quantum products, with more than 30EB deployed.

The company made a big splash with this cold data announcement, which creates real momentum around several products in its portfolio.


When I read the press release, I was surprised by the claim that tape and object storage can't be connected. A few solutions on the market already deliver this, such as Fujifilm with Object Archive, Spectra Logic with BlackPearl, Atempo, Cohesity, Grau Data, Komprise, Nodeum, Point Software and Systems, QStar, StrongBox or Versity, among others.

The company sells point products to address this cold data need, but the glue between them was limited on Quantum's side and very often relied on partner solutions. The firm developed and released Artico, but the product disappeared from the catalog; StorNext can also do some integration with tiering. The same remark applies to ATFS, which should replace and extend Artico in the future, or at least I hope so.

So the new thing is the glue between ActiveScale and the Scalar tape libraries, represented by a new storage class, the S3 and S3 Glacier APIs, a matrix-based model for tape libraries and tape drives, an easy consumption model inspired by the cloud, and an attractive pricing model.

Quantum made a very positive decision last year in acquiring ActiveScale from Western Digital. Amplidata, the original Belgian company behind the ActiveScale software, was first acquired by HGST, a WDC business, in 2015. That move was rather surprising, as Quantum was an investor in the Belgian player. Having identified the need for object storage but without any product of its own, Quantum had OEMed this product as Lattus. The product disappeared for some time, exposing a hole in the portfolio that had not previously been visible, and then Quantum made the asset acquisition from WDC. The team confirmed once again that object storage is a must-have technology for every storage vendor addressing secondary storage and cold data in particular. For an update on object storage market moves and dynamics, I invite you to check the timeline I refreshed a few months ago.

This initiative shows several key developments:

- a global access model based on the S3 and S3 Glacier APIs, coupled with a lifecycle mechanism to move data between a "normal" S3 zone and a cold zone (see the sketch after this list). In each zone, data is protected with BitSpread, the erasure coding engine invented by Amplidata, which delivers fifteen 9s of durability. The two zones are exposed via a single namespace that hides the complexity of this data movement. In terms of lifecycle, data can also be moved to a public cloud as long as it exposes S3.

- an extension of the cold data zone beyond disk to an entity filled with tape drives. This entity is configured with multiple Scalar tape libraries arranged into a new protection scheme named 2D erasure coding. With this protection model, the libraries are organized in a RAIL (Redundant Array of Independent Libraries) mode. Data durability reaches nineteen 9s, with the elegant ability to restore data from only one tape, as the EC model works at the object-group level and not at the individual object level. The team chose a subtle projection of the equations over drives and libraries.

- and an offering delivered as a cloud service, promoted with a very compelling pricing model.
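
As a concrete illustration of the lifecycle mechanism described in the first bullet, here is a minimal sketch of an S3 lifecycle rule that transitions objects to a Glacier-class cold tier after 30 days, using generic boto3 calls. The endpoint, bucket name and storage-class string are assumptions on my part; the announcement does not detail Quantum's exact API surface.

```python
# Sketch: a lifecycle rule that transitions objects from the "normal" S3 zone
# to a Glacier-class cold zone after 30 days. Generic boto3 call against an
# S3-compatible endpoint; the endpoint, bucket name and the exact storage-class
# string used by Quantum's cold storage class are assumptions.
import boto3

s3 = boto3.client("s3", endpoint_url="https://activescale.example.local")

s3.put_bucket_lifecycle_configuration(
    Bucket="archive",
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "tier-to-cold",
                "Status": "Enabled",
                "Filter": {"Prefix": ""},  # apply the rule to all objects
                "Transitions": [
                    {"Days": 30, "StorageClass": "GLACIER"},
                ],
            }
        ]
    },
)
```

The appeal of this model is that applications keep talking plain S3 to a single namespace while the back end quietly moves the bytes to tape.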

As of today, it seems that only Quantum products are supported; I expect more tape libraries from HPE, IBM or Spectra Logic to be supported in a future release, and the same remark applies to public cloud support.

I understand that CatDV could be used to create a large index of all the content submitted to this secondary data farm.

This initiative and key announcement represent a major milestone for Quantum, confirming that cold data is a huge opportunity for vendors and giving a new example of the battle on the cold storage market segment. I expect other key players in that segment to announce solutions in the next few days, so I invite readers to watch the market news carefully.


Tuesday, October 05, 2021

Atempo accelerates on Miria

Atempo, the European leader in data protection with its three well-known products Tina, Lina and Miria, participated once again in the recent IT Press Tour. The team decided to dedicate its session to unstructured data, and especially to the demanding large environments covered by Miria.


First, it's important to mention that Miria is a product line comprising Miria for Backup, Miria for Migration, Miria for Archiving and Miria for Data Move and Sync, plus a new member, Miria for Analytics.


Miria embeds several key technologies that illustrate the deep domain expertise of Atempo engineering. Among them are the Data Mover model, FastScan, SnapStor and Analytics.

Data Movers are deployed as agents on various systems in the configuration to handle data transfer between sources and targets. They are a common component across backup, archive, move and copy tasks and bring load balancing, parallelism and failover to maintain a high level of service.

The team has added features that are very useful in a ransomware prevention campaign, such as encryption of course, a 3-2-1-1 backup model and air gap.

Miria also leverages its FastScan module, which offers universal file system crawling via industry-standard file sharing protocols like NFS or SMB, and even POSIX for parallel file systems like IBM Spectrum Scale, StorNext or Lustre. This technology is very efficient and avoids the proprietary modules required by some vendors.
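
To illustrate the general idea of crawling a file system over standard protocols, here is a minimal sketch of a concurrent POSIX scan of a mounted share. It is only a conceptual illustration, not Atempo's FastScan implementation; the mount point and worker count are arbitrary.

```python
# Sketch: concurrent POSIX directory crawling over a mounted file system
# (NFS, SMB or a parallel file system mount point). This only illustrates
# the general scan-phase idea; it is not Atempo's FastScan code.
import os
from concurrent.futures import ThreadPoolExecutor

def scan_dir(path):
    """Return (files, subdirs) found directly under 'path'."""
    files, subdirs = [], []
    with os.scandir(path) as it:
        for entry in it:
            if entry.is_dir(follow_symlinks=False):
                subdirs.append(entry.path)
            else:
                files.append(entry.path)
    return files, subdirs

def crawl(root, workers=8):
    """Breadth-first crawl of 'root' using a pool of scanning threads."""
    all_files, pending = [], [root]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        while pending:
            results = list(pool.map(scan_dir, pending))
            pending = []
            for files, subdirs in results:
                all_files.extend(files)
                pending.extend(subdirs)
    return all_files

# Example: inventory a hypothetical mount point before scheduling data movers.
# print(len(crawl("/mnt/scratch")))
```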

Beyond this, Atempo also developed SnapStor, a specific module that provides DR functions for Miria Backup. One key element of SnapStor is its ability to expose the backup image via NFS or SMB to speed up the restore process and, globally, the RTO.


The new module, built by Nextino, the AI lab of Atempo, is Miria for Analytics. It will be released on October 20th and marks a new key milestone for Atempo and its Miria product line. This Analytics module contributes to a deep understanding of users' data landscape, supporting two crawling modes: online, i.e. on-premises, and offline, i.e. cloud. This mandatory module, named Analytics Essentials, will be complemented by Analytics Augmented, planned for April 2022, probably aligned with the NAB show. We'll surely get a preview of it during the January IT Press Tour when we meet Atempo again.

And last but not least, the product supports a large variety of file, object, tape and cloud storage, at both the source and target level, making it probably one of the most open and flexible solutions on the market.


Monday, October 04, 2021

Podcast - NetApp SnapDiff shakes the backup ecosystem

I organized a new round-table episode of the French Storage Podcast around a topic that is creating debate in the backup ecosystem: the famous SnapDiff API from NetApp. Three speakers were invited: Jean-Michel Magnani from NetApp, Louis-Frédéric Laszlo from Atempo and Pierre-François Guglielmi from Rubrik. It is episode #73 and we invite you to listen to it below; note that the conversation is in French.



Tuesday, September 28, 2021

Arctic World Archive Resists the Pandemic

Piql organized a new deposit ceremony a few days ago at the Arctic World Archive (AWA), located in Longyearbyen in the Svalbard archipelago, beyond the Arctic Circle. This event was the first of 2021 due to the pandemic. Attending it was a rare privilege and I thank Piql for the invitation.

As the company keeps a rather low profile, I invite readers to check an article I wrote in August 2020 about Piql and its technology; you can refer to it here.


Started as Cinevator, Piql was founded in 2002 in Drammen, Norway, to address the growing need for long-term data preservation with a radically new approach. For such a requirement, you need a medium that lasts a long, very long time, a format to write on it, the means to generate the data and read it back later, and a guarantee that this access will remain possible in the future. The goal is to preserve data for several hundred years under favorable climate conditions. And even if many of us know that tape or other storage flavors could serve as the medium, a technology refresh is usually mandatory every few years, which creates complexity over time. So Piql is a technology provider developing an archive product that can be used within a corporation. It has also added an optional vaulting service, in a unique place on the planet, still based on its solution.


Let's summarize some information about their products.
In their search for a medium, Piql engineers eliminated magnetic media, as they are sensitive to radioactivity and of course to magnetic sources. This could affect how long data survives, since the media themselves could suffer damage. They decided to use an ultra-high-resolution nano-film, in a 35mm format, optimized for data preservation. Tested through official verification processes, the film has a projected life of more than 500 years without any need to refresh or migrate to a new medium. It is also offline, a key attribute under the current ransomware pressure, immutable and, as mentioned, a permanent medium. They encode data with a specific process of their own, generating very dense QR codes in 4 levels of grey with 8 million pixels per frame, all encapsulated in open-source TAR sessions. Each PiqlFilm has a capacity of 120GB.
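
A quick back-of-envelope calculation based on the figures above (4 grey levels, 8 million pixels per frame, 120GB per film) gives an idea of the frame count involved. It ignores error-correction and framing overhead, so the result is only a rough estimate, not an official Piql number.

```python
# Back-of-envelope check of the figures above: 4 grey levels = 2 bits per
# pixel, 8 million pixels per frame, 120 GB per PiqlFilm. Error-correction
# and framing overhead are ignored, so the frame count is only an estimate,
# not an official Piql number.
GREY_LEVELS = 4
PIXELS_PER_FRAME = 8_000_000
FILM_CAPACITY_BYTES = 120 * 10**9

bits_per_pixel = GREY_LEVELS.bit_length() - 1                  # log2(4) = 2 bits
raw_bytes_per_frame = PIXELS_PER_FRAME * bits_per_pixel // 8   # ~2 MB per frame

frames_needed = FILM_CAPACITY_BYTES / raw_bytes_per_frame
print(f"~{raw_bytes_per_frame / 1e6:.0f} MB raw per frame, "
      f"~{frames_needed:,.0f} frames for 120 GB (before overhead)")
```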

Speaking during the event with Rune Bjerkestrand, founder and managing director of Piql, I learned that the AWA was inspired by the Global Seed Vault project started in 2008, which uses coal mine #7, a few hundred meters from the airport, well below the permafrost, in a very stable location with specific temperature, humidity, seismic and other characteristics. Piql imagined a similar approach to store and preserve digital data over the long term. It also uses a coal mine, in this case mine #3, with a special zone inside it, completely reshaped to host a container similar to a cargo container; inside, in a hermetic bag, the PiqlFilm is placed on its reel in a PiqlBox. External identification of the data is done with metadata written on the box itself, as shown below with the GitHub examples.

As of today, it seems that the AWA container holds more than 450 PiqlFilms, representing more than 50TB. As the first container is full, I anticipate a second one very soon in the same zone.

For those of you who know the place, the seed vault and the AWA mine are located below a flat field on top of the mountain, covered with hundreds of satellite antennas. This place is unique.

We took the time to ask the Piql executive about the next phase and, in total transparency, Bjerkestrand told me that they are working on a SaaS offering named PiqlConnect, an online service to submit data and generate film. They also plan to deliver a mobile application to make this even more transparent and give the project a sense of ubiquity, even if the archive location is "hard coded" in the mountain. Another project is to offer the PiqlWriter and PiqlReader service directly at the entrance of the mine to provide more bandwidth and flexibility for data injection and retrieval. Today, deposits inside the mine are a manual process, as is data access, and both take time and require some planning.

And finally, for those of you who wish to hear it first-hand, I recorded an interview with Rune Bjerkestrand, Piql managing director, during our stay in Svalbard. You can listen to this episode, posted here on "The French Storage Podcast" website.

Obviously, I have to mention other storage projects targeting fixed or reference data, based on DNA, holographic, optical or glass technologies.

