How to Copy a Directory in Linux
Copying files and directories is one of those things I do every day on Linux. Whether I’m backing up config files, moving photos around, or deploying code to servers, knowing how to copy stuff properly is essential.
Let me walk you through what I’ve learned about using cp, rsync, and why you probably shouldn’t use scp anymore.
The cp Command
The cp command is your basic workhorse for copying files and directories locally. It’s simple, reliable, and does exactly what you’d expect.
Basic syntax:
cp [OPTIONS] SOURCE... DESTINATION
Copy a single file:
cp fileA fileA.bkp
Copy a directory recursively:
cp -R source destination
That -R flag is crucial when copying directories—it tells cp to copy everything inside, not just the folder itself.
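You can see the difference with a throwaway directory (the src and dest names below are just scratch examples):

```shell
cd "$(mktemp -d)"              # work in a scratch directory
mkdir -p src && echo hello > src/file.txt

cp src dest 2>/dev/null        # fails: cp refuses directories by default
echo "exit status without -R: $?"

cp -R src dest                 # succeeds: copies the directory and its contents
cat dest/file.txt              # prints: hello
```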
Real-World Example
Here’s something I actually do: backing up my home directory to an external USB drive before reinstalling Linux.
cp -R /home/bitslovers /mnt/media/usb01
My USB drive happens to be mounted at /mnt/media/usb01, so this copies my home directory there, the bitslovers folder itself along with everything in it.
Copy Multiple Files at Once
You can specify multiple files and a destination directory:
cp file1 file2 file3 destination-folder/
Copy Directory Contents Only
Sometimes I want to copy the files inside a directory without the directory itself. Like when I’m organizing photos by year from a camera’s SD card.
Using wildcard:
cp -R /mnt/media/usb01/DCM0001/* /home/bitslovers/Pictures/2021
The issue with wildcards? They skip hidden files (those starting with a dot).
Using -T flag:
cp -RT /mnt/media/usb01/DCM0001 /home/bitslovers/Pictures/2021
This copies everything including hidden files, but not the source directory itself.
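The difference is easy to demonstrate on scratch directories (the photos and backup names below are just examples):

```shell
cd "$(mktemp -d)"
mkdir -p photos backup1
echo img  > photos/pic.png
echo meta > photos/.hidden

cp -R photos/* backup1/   # the shell glob skips dotfiles
ls backup1                # pic.png only; .hidden was not copied

cp -RT photos backup2     # -T copies the contents, dotfiles included
ls -A backup2             # .hidden and pic.png
```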
Don’t Overwrite Existing Files
I’ve accidentally overwritten files more times than I care to admit. Here’s how to avoid that:
Skip existing files (-n):
cp -n /home/bitslovers/backup-apache2.sh /home/bitslovers/backup-scripts
Prompt before overwriting (-i):
cp -i /home/bitslovers/backup-apache2.sh /home/bitslovers/backup-scripts
You’ll see a prompt like:
cp: overwrite '/home/bitslovers/backup-apache2.sh'?
Type y or n to confirm.
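Here's a small sketch showing that -n really leaves the existing file alone (scratch filenames; note that recent GNU coreutils releases also return a nonzero exit status when -n skips a file, hence the || true):

```shell
cd "$(mktemp -d)"
echo original > notes.txt
mkdir safe && echo old > safe/notes.txt

# -n: never overwrite; || true absorbs the nonzero status
# that newer coreutils versions return when a copy is skipped
cp -n notes.txt safe/ || true
cat safe/notes.txt             # still prints: old
```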
Copy Multiple Directories
Need to copy several directories at once? Just list them all:
cp -R /etc /home/bitslovers /mnt/media/usb01
This copies both /etc and /home/bitslovers to the USB drive.
Preserve Permissions and Timestamps
When you copy files, they get new ownership and timestamps by default. That’s usually fine, but sometimes you need to preserve everything exactly as-is.
Preserve mode, ownership, and timestamps (-p):
sudo cp -p /home/bitslovers/.bashrc /home/bitslovers/backup
Preserve only timestamps:
sudo cp --preserve=timestamps /home/bitslovers/.bashrc /home/bitslovers/backup
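You can verify the difference with stat (GNU stat syntax; the filenames are scratch examples):

```shell
cd "$(mktemp -d)"
echo data > config.txt
touch -d '2021-01-15 12:00:00' config.txt   # give the source a known mtime

cp config.txt plain.txt                     # default copy: fresh timestamp
cp -p config.txt preserved.txt              # -p keeps mode, ownership, timestamps

stat -c '%y  %n' config.txt plain.txt preserved.txt
```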
Force Copy
Sometimes a copy fails because the destination file exists but can't be opened for writing. The -f flag tells cp to delete that destination file and try again:
cp -f ~/.bashrc /tmp
Copy Symbolic Links
By default, when you name a symbolic link on the command line, cp follows it and copies the file it points to. If you want to copy the link itself:
cp -d /home/bitslovers/backup-apache2.sh /usr/bin/
Dereference Links (Copy Actual Files)
If you want to copy what symlinks point to instead of the links:
cp -rL /tmp/docs/ /home/bitslovers/
Copy Only Newer Files
Handy for incremental backups—only copy files that have changed:
cp -ru /tmp/docs/ /home/bitslovers/
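A quick local sketch of how -u behaves (scratch paths; note that with cp, unlike rsync, a trailing slash on the source makes no difference, so docs/ is copied into backup/ as backup/docs):

```shell
cd "$(mktemp -d)"
mkdir -p docs backup
echo v1 > docs/report.txt
cp -ru docs/ backup/          # first run copies everything

sleep 1
echo v2 > docs/report.txt     # source is now newer than the copy
cp -ru docs/ backup/          # -u copies only because the source is newer
cat backup/docs/report.txt    # prints: v2
```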
Create Hard Links Instead of Copying
This is a cool trick. Hard links point to the same data on disk, so changes in one location appear in the other. Unlike symlinks, hard links don't break if you delete the original name, though both links must live on the same filesystem.
cp -lr ~/Documents /tmp
I use this for quick local backups where I want changes to sync automatically between locations.
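You can confirm both names really share one inode (scratch directory names below):

```shell
cd "$(mktemp -d)"
mkdir -p Documents
echo draft > Documents/essay.txt

cp -lr Documents backup       # -l creates hard links instead of copying data

# both paths report the same inode number
stat -c '%i  %n' Documents/essay.txt backup/essay.txt

echo final > Documents/essay.txt   # truncate-and-rewrite edits the shared inode
cat backup/essay.txt               # prints: final
```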
Copy Files by Extension
Combine ls and xargs to copy specific file types (modern GNU xargs prefers -I over the deprecated -i):
ls *.png | xargs -n1 -I{} cp {} /home/bitslovers/Pictures
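Piping ls breaks on filenames containing spaces, so a safer sketch uses find with -exec (scratch filenames below, including one with a space on purpose):

```shell
cd "$(mktemp -d)"
mkdir -p Pictures
echo png1 > a.png
echo png2 > 'holiday photo.png'   # a space that would break the ls | xargs pipe
echo text > notes.txt

# find handles awkward names and matches only the extension you ask for
find . -maxdepth 1 -name '*.png' -exec cp {} Pictures/ \;
ls Pictures                       # the two .png files, no notes.txt
```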
Copying to Remote Servers
This is where things get interesting. You’ve got options, and some are better than others.
rsync: My Go-To for Remote Copies
rsync is fantastic. It’s fast, efficient, and has been around forever for good reason.
Check if you have it:
whereis rsync
Install if needed:
# Ubuntu/Debian
sudo apt-get install -y rsync
# CentOS/RHEL (use dnf instead of yum on Fedora and recent releases)
sudo yum install -y rsync
Basic syntax:
rsync -ar <origin-folder> <user>@<host>:<path>
Real example—backing up Apache configs:
rsync -ar /etc/apache2 [email protected]:/home/bitslovers/apache2_backup
The -a (archive) flag preserves permissions, timestamps, and other attributes; it also implies -r (recursive), so the extra -r is harmless but redundant.
Copy directory contents only:
rsync -ar /etc/apache2/* [email protected]:/home/bitslovers/apache2/
scp: Deprecated, Avoid Using It
Here’s the thing: the SCP protocol is deprecated. The OpenSSH 8.0 release notes (2019) called it outdated and inflexible after a series of security vulnerabilities, and since OpenSSH 9.0 the scp command actually uses the SFTP protocol under the hood.
If you’re still using scp, it’s time to switch.
Old way (don’t use):
scp -r /etc/apache2 [email protected]:/home/bitslovers/apache2
New way (use rsync instead):
rsync -avz /etc/apache2 [email protected]:/home/bitslovers/apache2
Why rsync Beats scp
- rsync only copies changed files—huge time savings for large directories
- rsync can resume interrupted transfers
- rsync has more options for compression, permissions, filtering
- rsync shows better progress information
The only advantage scp had was simplicity, but rsync isn’t exactly complicated.
Modern Alternatives to Consider
If you’re working with cloud storage or need something more modern:
rclone—Excellent for cloud storage (S3, Google Drive, Dropbox, etc.) but works over SSH/SFTP too:
rclone copy /local/path remote:bucket/path
sftp—For interactive file transfers:
sftp user@host
# Then use: put, get, mput, mget commands
Practical Tips from Experience
After years of copying files around, here’s what I’ve learned:
For local copies: Stick with cp. It’s fast, simple, and reliable.
For remote copies: Use rsync. The delta-transfer algorithm alone saves massive amounts of time.
For cloud storage: Check out rclone. It handles retries, checksums, and bandwidth limiting automatically.
For big transfers: Compress first. A single tar.gz file transfers faster than thousands of small files:
tar czf backup.tar.gz /path/to/files
rsync -avz backup.tar.gz user@host:/backups/
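Before involving a remote host, you can sanity-check the compress-and-restore round trip locally (scratch paths below):

```shell
cd "$(mktemp -d)"
mkdir -p files
echo a > files/one.txt
echo b > files/two.txt

tar czf backup.tar.gz files            # one archive instead of many small files
mkdir restore && tar xzf backup.tar.gz -C restore

diff -r files restore/files && echo "round trip OK"
```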
What Changed Since 2021?
- scp is now officially deprecated; use rsync or sftp instead
- There’s a Rust rewrite of GNU coreutils (uutils), faster and safer, though your distro almost certainly still ships the C version
- rclone matured significantly; if you work with cloud storage, it’s worth checking out
- Better progress tools: progress (formerly “coreutils viewer”) shows progress for cp, mv, dd, and other commands
Final Thoughts
Copying files isn’t glamorous, but getting it right saves headaches later. Use cp for local stuff, rsync for everything remote, and don’t touch scp anymore unless you absolutely have to.
Want to dive deeper? Check the man pages:
man cp
man rsync
Or combine cp with find for more complex operations. And if you’re moving lots of files, consider zipping them first or using tar.gz to speed things up.