SUMMARY: Backing up millions of files

From: sburch@derwent.co.uk
Date: Thu Sep 25 1997 - 10:26:00 CDT


     Bit overdue, this one... my original query:
     
     I have a requirement to back up a large number of smallish files:
     approximately 2 million, averaging 5K in size (roughly 10GB in total).
     I don't want to back up the whole partition using ufsdump, and tar
     takes too long (about a day). What would be the quickest method of
     doing this? I presume some sort of block dump as opposed to a
     file-oriented one.
     
     
     A number of people suggested incremental backups and staging-area
     scenarios, but what I really wanted was a means of backing the whole
     lot up to one tape for simple storage and retrieval. The answer was
     simple: use ufsdump on the area in question, e.g.
     
     ufsdump 0f /dev/rmt/0 /data/images
     
     I should have tried this, but I had always been under the impression
     that ufsdump worked only on partitions, as opposed to individual
     directories.
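     For retrieval, the companion command is ufsrestore. A minimal sketch
     follows, assuming the tape drive is /dev/rmt/0 as above; note that
     when ufsdump is given a directory rather than a whole filesystem,
     only a full (level 0) dump is supported, not incrementals:

     ```shell
     # Full (level 0) dump of the directory to the first tape drive.
     # Dumping a subdirectory instead of a raw partition forces a
     # level 0 dump -- incremental levels are not available here.
     ufsdump 0f /dev/rmt/0 /data/images

     # List the tape's table of contents to verify the backup.
     ufsrestore tf /dev/rmt/0

     # Restore everything into the current directory; paths on the
     # tape are relative to the directory that was dumped.
     cd /data/images && ufsrestore rf /dev/rmt/0

     # Or browse the tape and pull back individual files interactively.
     ufsrestore if /dev/rmt/0
     ```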
     
     Many Thanks to all those who replied
     
**************************************************************************
*                                                                        *
*  Stuart Burch                         Derwent Information Publishing   *
*  (Unix & Internet Support Analyst)    14 Great Queens Street           *
*  Systems & Database Group, IT         London WC2B 5DF                  *
*                                                                        *
*  Email: sburch@derwent.co.uk          Tel: 0171-424 2149               *
*                                                                        *
**************************************************************************
      



This archive was generated by hypermail 2.1.2 : Fri Sep 28 2001 - 23:12:04 CDT