norton ghost in linux?

ftzsee
Premium
join:2001-11-22
reply to subcultured
Re: How to do backups in Linux...

You'll want to skip the /dev directory too right?


benyto
Premium
join:2000-07-09
Chico, CA
Yeah, you'll skip that, too.


benyto
Premium
join:2000-07-09
Chico, CA

reply to bluesea1
Re: norton ghost in linux?

I figured I'd better toss in something useful to this conversation. subcultured wrote up some pretty good stuff about using tar to create an image of a partition. I'd like to add a little to what he had to say.

My desktop has only two partitions on it, / and /boot. Let's say I want to image them separately. Here's what I would do. First, I would create a file called 'tar_ignore' in the root directory of whichever file system I am currently going to image. In this file I will include a list of directories I don't want to be included in the archive. Mine looks like this for my / partition:
dev
proc
tmp
mnt
This allows your command line to be quite a bit shorter, and also allows you to change which directories you want/don't want without having to alter the command line you use. This is helpful if you use a shell script to create the archive. I will then create the archive with the following command line:

# tar -cvplP -X tar_ignore -f /tmp/root.tar /

What are my switches doing?
•'c' is 'create'
•'v' is verbose (only use if you actually want to see what is going on. I don't use this one normally.)
•'p' maintains file permission information
•'l' will only archive directories within the current filesystem (i.e., if you are archiving /, it will automatically skip any directories which are part of other file systems)
•'P' does not strip the leading slash from file names
•'X' specifies the file to read to get a list of files to leave out of the archive (in my case 'tar_ignore')
•'f' specifies the name of the file to create ('/tmp/root.tar' in this case)

The final slash specifies what I want to archive, which is / in this case.

This creates a file called 'root.tar' in my /tmp directory. This file is an image of my / file system, minus any directories I don't need archived. I'd repeat the procedure for my /boot partition, but I don't need a 'tar_ignore' file there. There isn't anything I want to skip for that partition.
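The exclude-file approach above can be seen end to end in a small, safe demonstration. The /tmp/demo paths below are placeholders I've made up so it can be run anywhere as a normal user; substitute / (and run as root) to image a real system as described above:

```shell
# Build a throwaway tree standing in for a root filesystem.
set -e
mkdir -p /tmp/demo/etc /tmp/demo/dev /tmp/demo/proc
echo hello > /tmp/demo/etc/conf

# One name per line, relative to the directory being archived --
# the same idea as the dev/proc/tmp/mnt list in the post.
printf 'dev\nproc\n' > /tmp/demo/tar_ignore

# Create the archive, skipping everything listed in tar_ignore.
cd /tmp/demo
tar -cp -X tar_ignore -f /tmp/demo.tar .

# Listing the archive shows etc/conf present, dev and proc absent.
tar -tf /tmp/demo.tar
```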


benyto
Premium
join:2000-07-09
Chico, CA

In addition to my above post, I will copy and paste a PM I sent to subcultured earlier. It may have some helpful information for other people:

said by benyto:

Actually, you've taken it farther than I ever have. You've done a real nice write-up on how to do it. I'm guessing the stumbling block you're having is skipping certain mounted file systems. A good solution would be a shell script that first unmounts any file system you don't want backed up, does the tar bit, and then remounts the file systems. That, of course, leaves those FSs unavailable during the backup procedure, which may not be acceptable.

A better solution, in case those file systems may be required during the backup, is to use the -X option instead of --exclude. This lets you list the files/directories to exclude in a text file. That way you can update the file as needed without having to actually edit your tar command line. In that file, simply specify the files/directories you do *not* want backed up.

Also, for automated backups, you may want to leave off the -v option. There is really no need for verbosity if nobody is going to be awake to view the output; it will only slow the whole thing down.

One last point: where you have the -d option, I *think* you want the -u option. The -d switch checks for differences between the archive and the file system. The -u switch updates the archive with only the files that are newer than those in the archive itself.

OK, a few more final notes. You are correct in telling people not to compress the archive. Only backups that will be stored on a separate medium, like CD-ROM or DVD, for long-term archival purposes should be compressed, and only then if space is tight. The problem with compressing archives is that if the file gets corrupted, the whole thing is basically shot; gzip can't recover well from corrupted archives. It may also be a good idea to store the archive on a different partition than root. It would be beneficial to have a partition or HDD dedicated solely to the backup archive. That way, in case of FS corruption or failure, the backup should be safe.
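The -u behaviour described in that PM can be demonstrated on a scratch archive (the /tmp/upd paths are invented for the demonstration; note that -u only works on uncompressed archives, which fits the advice above about not compressing):

```shell
# Create an archive holding one file.
set -e
mkdir -p /tmp/upd
cd /tmp/upd
echo one > a.txt
tar -cf /tmp/upd.tar a.txt

# Later a new file appears; -u appends only members that are new or
# newer than their copies already in the archive.
echo two > b.txt
tar -uf /tmp/upd.tar a.txt b.txt

# Both files are now present in the archive listing.
tar -tf /tmp/upd.tar
```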



subcultured
Premium
join:2001-08-21
Jamaica Plain, MA
  thanks benyto...

tar still rules ;]


guycad$
In Search Of Free Speech
Premium
join:2002-05-02
Pompton Lakes, NJ

reply to bluesea1
I happen to like both tar and dd. And the tar notes written up in this thread are among the clearest I've seen.

There are situations where you will not be able to use these for whatever reason. This is because these instructions are essentially written from the perspective of backing up a working system. For non-working systems, Ghost or something like it can be more helpful.
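For a non-working or unmounted filesystem, a raw byte-for-byte image with dd is the closest free analogue to what Ghost does. A sketch, using an ordinary file I've made up as a stand-in for a real device node such as /dev/hda1, so it can be run harmlessly; substitute your actual (unmounted!) partition for SRC on a real system:

```shell
set -e
SRC=/tmp/fake_partition      # stand-in for e.g. /dev/hda1

# Fabricate a 64 KB "partition" to image.
dd if=/dev/zero of="$SRC" bs=1024 count=64 2>/dev/null

# Take the image: every byte, filesystem structures and all.
dd if="$SRC" of=/tmp/partition.img bs=4k 2>/dev/null

# cmp exits silently when the image is byte-identical to the source.
cmp "$SRC" /tmp/partition.img
```

Restoring is the same command with if/of swapped. The trade-off versus tar is that dd copies free space too, and the image can only be restored to a partition at least as large as the original.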

Note: there are a number of caveats to what I'm about to say so follow along closely. I'm not trying to say do this or that because it's better. I'm pointing out that no one backup / image copy procedure works best for every situation and that it's a good idea to have a few different 'tricks up your sleeve' so to speak.

I tear down and rebuild systems a lot. This means I do a lot of disk swapping between disparate systems, including disparate OSes. I even swap partitions between drives from time to time.

To that end, I've got a minimal system with an itty bitty hard drive in it running MSDOS 6.4. On it, I have Partition Commander and Spinrite. In the case, I have two of those 'removable' hard drive cases. So when I'm working 'directly' on a hard drive, I pop it into one of these cases, slide the case in, boot to setup, identify the HD to the BIOS, and then reboot.

Partition Commander allows me to copy hard drive partitions and to shrink or expand (some of) them, depending on the partition type. It also allows me to perform surface scanning without tying up one of my main machines, i.e., I can set Spinrite up to 'refresh' a hard drive's sectors for as long as it takes without losing the use of my normal machines.

Now granted, this is overkill for most people. However, if you need to initialise hard disks with the same partition layout (say, you're setting up several identical machines) or you need to do a lot of swapping and diagnostics, then you probably want to have this kind of tool on hand as well. You don't need a working destination system to untar the image, just the hard disk you intend to use.

The only thing Ghost has to offer over Partition Commander is the ability to retrieve an image from over a network. I consider Partition Commander to be much more useful.

BTW, I used to use Partition Magic. Then they really annoyed me with their upgrade policy and the fact that they put out version 3 with lots and lots of bugs just to be able to say they supported FAT32. I don't like having someone else's bugs eat my data.

For backing up an image / partition on a regular basis, in my opinion, you're probably better off using tar.



Of course, as always - YMMV
--
People who describe M$ software as 'mediocre' don't know the half of it. My Pictures.


benyto
Premium
join:2000-07-09
Chico, CA

reply to bluesea1
I'm going to add another bit of information to this thread. Today I wanted to transfer the data from my desktop hard drive to another hard drive. I wanted to do this for two reasons: 1) I was running a bit shy on free space on the five gig partition; 2) It allows me to compare the exact same install on two different systems simultaneously.

Here's how I did it:

First, I removed the 20 gig hard drive from one of my computers and plugged it into my current desktop. I then wiped what were the /boot and / partitions on this drive by running mke2fs. Once that was finished, I mounted what was going to be the / partition (/dev/hdd3) in /mnt. I changed to the / directory on my desktop drive and issued this command:

# tar -cplP -X tar_ignore -f - . | (cd /mnt; tar xf -)

What's happening with this command?

Just about everything before the pipe is the same as in my previous example. This time, however, I removed dev from my tar_ignore file; if you don't copy over the /dev directory you won't be able to boot properly. The file being 'created' is a dash (-), which tells tar to write the archive to standard output rather than to a file, allowing us to pipe it to another command. The parentheses after the pipe open a subshell and run the commands within it: first we change to the /mnt directory (where the / partition of the new hard drive is mounted), then we run tar to extract from a file that is also a single dash. When extracting, the dash tells tar to read from standard input. So it extracts the stream we sent to standard output before the pipe and places it into the current directory (/mnt, the / partition of the new drive). And that's it.

I then did the same thing with my /boot partition (with the exception of the -X switch; there isn't anything in there I want to skip over).

I powered down, removed the drive, put it back in my other computer, and booted up. I had copied the hard drive and was up and running in a matter of minutes, and now had two machines running identical software setups.
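The tar pipe at the heart of that procedure can be exercised between two scratch directories before trying it on real partitions. Here OLD and NEW are stand-ins I've made up for / and the freshly mounted target (/mnt):

```shell
# OLD stands in for the source root; NEW for the mounted target partition.
set -e
OLD=/tmp/oldroot
NEW=/tmp/newroot
mkdir -p "$OLD/etc" "$NEW"
echo config > "$OLD/etc/conf"

# The writer sends the archive to stdout (-f -); the subshell after the
# pipe changes to the target and extracts from stdin, so the tree is
# copied without an intermediate archive file ever touching disk.
(cd "$OLD" && tar -cpf - .) | (cd "$NEW" && tar -xpf -)

ls "$NEW/etc"
```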


subcultured
Premium
join:2001-08-21
Jamaica Plain, MA

  ^great explanation -- thanks :]

quick question, though: by piping the standard output of tar in such a way, is it creating unnecessary overhead? i think i remember reading that such a command doesn't actually archive anything, but can't remember... would a find command piped to "cp" be another viable option? a better option?
--
this must be thursday. i never could get the hang of thursdays...


guycad$
In Search Of Free Speech
Premium
join:2002-05-02
Pompton Lakes, NJ
reply to benyto
oOOOOooooo! This is kewl! I've never seen it done this way before. This thread is definitely a keeper!

Thanks Benyto!!!


benyto
Premium
join:2000-07-09
Chico, CA

reply to subcultured
quote:
quick question, though: by piping the standard output of tar in such a way, is it creating unnecessary overhead? i think i remember reading that such a command doesn't actually archive anything
That's pretty much the point. It allows you to make exact copies without having to first store the data in an archive file on your hard drive. As far as overhead, ehh, I'm not sure that's a problem. It literally took just a few minutes to copy my root partition, and because the computer had to be brought down in order to put the other drive in it, it can be assumed nothing mission-critical was going on.
quote:

would a find command piped to "cp" be another viable option? a better option?
Quite possibly. I'd be interested to know if there are better ways to do this. However, I don't know them.


benyto
Premium
join:2000-07-09
Chico, CA
reply to guycad$
Glad you found it helpful, guycad. I agree that this is a good thread. Some of the more mundane things like this really aren't discussed an awful lot.


bluesea1

join:2001-07-25
Durham, NC
 reply to bluesea1
hehe. Thanks, you guys.
--
bluesea


guycad$
In Search Of Free Speech
Premium
join:2002-05-02
Pompton Lakes, NJ

reply to benyto
Some of the more mundane things like this really aren't discussed an awful lot.

Nope, you're right. Most people just set up one way to do their backups and then never think about it.
--
People who describe M$ software as 'mediocre' don't know the half of it. My Pictures.
Monday, 21-Sep 10:46:08
over 10 years online! 1999-2009 dslreports.com