Linux Commands with examples

cp

Copy file1 to file2 preserving the mode, ownership and timestamp.

  $ cp -p file1 file2

mv

Rename file1 to file2. If file2 exists, prompt for confirmation before overwriting it.

  $ mv -i file1 file2

rm

Get confirmation before removing each file.

  $ rm -i file*

cd

change directory.

  $ cd path

change back to previous directory

  $ cd -

pwd

Print working directory, shows the path of the current directory

  $ pwd

more

Echo the contents of the file to the screen one page at a time; see also less and cat.

  $ more /opt/example/file.txt

ls

Display file sizes in human-readable format (e.g. KB, MB).

  $ ls -lh
-rw-r----- 1 jsmith team-dev 8.9M Jun 12 15:27 arch-linux.txt.gz

Order Files Based on Last Modified Time (In Reverse Order) Using ls -ltr

  $ ls -ltr

Visual Classification of Files With Special Characters Using ls -F

  $ ls -F

vim or vi

Go to the 143rd line of file

  $ vim +143 filename.txt

Go to the first match of the specified search term
  $ vim +/search-term filename.txt

Open the file in read only mode.
  $ vim -R /etc/passwd

Write the changes and Quit vi

  :wq

whatis

Display one-line manual page description of a Linux command

  $ whatis mkdir

man

Display the full manual page description of a Linux command

  $ man mkdir

history

Display recent Linux commands

  $ history

 

Display the recent commands with git

  $ history | grep git

top

Display system utilization for CPU, memory, etc. Press Ctrl-C to exit. (See also bpytop for a prettier version.)

  $ top

chmod

Change permissions on a file to read-write for the user, read-only for the group and everyone else.

  $ chmod 644 /opt/example/file 
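Each octal digit is the sum of read (4), write (2) and execute (1) permissions for user, group and other. A minimal sketch using a hypothetical scratch file:

```shell
# 6 = 4+2 = rw- for the user; 4 = r-- for group and other.
touch /tmp/perm_demo.txt                 # hypothetical scratch file
chmod 644 /tmp/perm_demo.txt
ls -l /tmp/perm_demo.txt | cut -c1-10    # prints -rw-r--r--
rm /tmp/perm_demo.txt
```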

tar

Create a tar archive of a directory

  $ tar cvf archive_name.tar dirname/

View an existing tar
  $ tar tvf archive_name.tar

Extract Files
  $ tar xvf /tmp/mybackup.tar

Extract a tar.gz file to a directory (-C). Note: the z flag filters the archive through gzip.

  $ tar xvfz /tmp/mybackup.tar.gz -C ~/documents

Extract a tar.bz2 file to a directory (-C). Note: bzip2 compresses better than gzip.
  $ tar xvfj /tmp/mybackup.tar.bz2 -C ~/documents
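The c, t and x modes above can be exercised end to end. A sketch using a hypothetical scratch directory under /tmp:

```shell
rm -rf /tmp/tardemo && mkdir -p /tmp/tardemo/data
echo "hello" > /tmp/tardemo/data/file.txt
cd /tmp/tardemo
tar cf backup.tar data/            # c = create an archive
tar tf backup.tar                  # t = list its contents
mkdir restore
tar xf backup.tar -C restore       # x = extract into the -C directory
cat restore/data/file.txt          # prints hello
```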

grep

Search for a string in a file (case-insensitive search).
  $ grep -i "the" demo_file

Print the matched line, along with the 3 lines after it.
  $ grep -A 3 -i "example" demo_text

Search for a given string in all files recursively
  $ grep -r "ramesh" *

Count lines that don't match
  $ grep -cv John /etc/passwd

Show names of files that match (recursive, case-insensitive)
  $ grep -ril john /root

Beginning of line (^)
  $ grep "^start" messages

End of line ($)
  $ grep "term.$" messages

Count empty lines
  $ grep -c "^$" messages

Single character (.)
  $ grep ".ello" messages
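The anchors and the dot can be checked against a small hypothetical sample file:

```shell
printf 'start here\nmid start\nend term.\n\nhello\n' > /tmp/grepdemo.txt
grep -c "^start" /tmp/grepdemo.txt   # 1 - only the line beginning with start
grep -c "^$" /tmp/grepdemo.txt       # 1 - one empty line
grep ".ello" /tmp/grepdemo.txt       # hello - the dot matches any single character
```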

find

Find files by file name (case-insensitive find)
  # find -iname "MyCProgram.c"

Execute commands on files found by the find command
  $ find -iname "MyCProgram.c" -exec md5sum {} \;

Find all empty files in home directory
  # find ~ -empty

Find all files larger than 20M
  # find / -type f -size +20M

Find all files older than 60 days
  # find . -mtime +60

Find all files modified in last 2 days
  # find . -mtime -2

Find all files not modified in the last 60 days under the /home/jsmith directory and create an archive under /tmp named ddmmyyyy_archive.tar.
  # find /home/jsmith -type f -mtime +60 | xargs tar -cvf /tmp/$(date +%d%m%Y)_archive.tar

ssh

Login to remote host
  $ ssh -l jsmith remotehost.example.com

Debug ssh client
  $ ssh -v -l jsmith remotehost.example.com

Display ssh client version
  $ ssh -V
  OpenSSH_3.9p1, OpenSSL 0.9.7a Feb 19 2003

scp

Securely copy a directory on a remote computer to the home directory, recursively
  $ scp -r user@host.com:/opt/example ~/my_examples

sed

Convert DOS \r\n line endings to the Unix \n format by deleting the trailing \r from each line.
  $ sed 's/.$//' filename
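The effect is easy to see by piping a hypothetical DOS-style line through the same expression:

```shell
# s/.$// deletes the last character on each line - here, the carriage return.
printf 'abc\r\n' | sed 's/.$//'    # prints abc with no trailing \r
```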

Print file content in reverse order
  $ sed -n '1!G;h;$p' thestuff.txt

Add line numbers to all non-empty lines in a file
  $ sed '/./=' thestuff.txt | sed 'N; s/\n/ /'

Delete the comments from a file
  $ sed -e 's/#.*//' thegeekstuff.txt

Delete the last 3 characters from each line in a file
  $ sed 's/...$//' thegeekstuff.txt

Delete the HTML Tags
  $ sed -e 's/<[^>]*>//g'
  This <b> is </b> an <i>example</i>.
  This is an example.

awk

Awk is mostly used for pattern scanning and processing. It searches one or more files to see if they contain lines that match the specified patterns, and then performs the associated actions.
Awk views a text file as records and fields. Awk has variables, conditionals, loops, and arithmetic and string operators. If a line has 4 words, they are stored in $1, $2, $3 and $4; $0 represents the whole line.

  awk '/search pattern1/ {Actions}
       /search pattern2/ {Actions}' file

Remove duplicate lines using awk
  $ awk '!($0 in array) { array[$0]; print }' temp

Print all lines from /etc/passwd that have the same uid and gid
  $ awk -F ':' '$3==$4' passwd.txt

Print only specific field from a file.
  $ awk '{print $2,$5;}' worker.txt
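The field variables can be seen with a hypothetical worker.txt whose whitespace-separated columns are id, name, title, department and salary:

```shell
printf '100 Thomas Manager Sales 5000\n200 Jason Developer Tech 5500\n' > /tmp/worker.txt
awk '{print $2,$5;}' /tmp/worker.txt
# Thomas 5000
# Jason 5500
```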

diff

Ignore white space while comparing.
  # diff -w name_list.txt name_list_new.txt

< John Doe
---
> John M Doe
> Jason Bourne

sort

Sort a file in ascending order
  $ sort names.txt

Sort a file in descending order
  $ sort -r names.txt

Sort passwd file by 3rd field.
  $ sort -t: -k 3n /etc/passwd | more
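The -t and -k flags can be tried on a small hypothetical colon-delimited file; the n modifier makes the sort numeric rather than lexical:

```shell
printf 'root:x:0\ndaemon:x:2\nbin:x:1\n' > /tmp/sortdemo.txt
sort -t: -k 3n /tmp/sortdemo.txt
# root:x:0
# bin:x:1
# daemon:x:2
```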

export

To view oracle related environment variables.
  $ export | grep ORACLE
declare -x ORACLE_BASE="/u01/app/oracle"
declare -x ORACLE_HOME="/u01/app/oracle/product/10.2.0"
declare -x ORACLE_SID="med"
declare -x ORACLE_TERM="xterm"

To export an environment variable:
  $ export ORACLE_HOME=/u01/app/oracle/product/10.2.0

xargs

xargs is a command that takes the output of one command and passes it as arguments to another command.
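Its basic behaviour is easiest to see with echo; -n1 passes one argument per invocation, so each whitespace-separated token becomes its own line:

```shell
printf 'a b\nc\n' | xargs -n1 echo
# a
# b
# c
```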

Copy all images to external hard-drive
  # ls *.jpg | xargs -n1 -I {} cp {} /external-hard-drive/directory

Search for all jpg images on the system and archive them.
  # find / -name "*.jpg" -type f -print | xargs tar -cvzf images.tar.gz

Download all the URLs mentioned in the url-list.txt file
  # cat url-list.txt | xargs wget -c

locate

Show all files on the system whose names contain the word crontab.

  $ locate crontab
  /etc/anacrontab
  /etc/crontab

ip

Show all the network interfaces.

  $ ip a

mkdir

Make a directory.

  $ mkdir ~/newdirectory

df

Display filesystem information in human readable form with the type of filesystem.

  $ df -hT

du

Display disk usage for current directory path in human readable format

  $ du -h

$CDPATH

Similar to the PATH variable, you can add more than one directory entry
to the CDPATH variable, separating them with a colon (:), as shown below.
  $ export CDPATH=.:~:/etc:/opt
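With CDPATH set, cd resolves bare directory names against each listed entry, and the shell prints the resolved path when a CDPATH entry is used. A sketch with a hypothetical directory under /tmp:

```shell
mkdir -p /tmp/cdpathdemo/projects
export CDPATH=/tmp/cdpathdemo
cd projects          # found via CDPATH; the resolved path is printed
pwd                  # /tmp/cdpathdemo/projects
```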

Correct Directory Spelling

Use shopt -s cdspell to correct the typos in the cd command
automatically as shown below.
  # shopt -s cdspell
  # cd /etc/mall
  # pwd
  /etc/mail

2> /dev/null

Suppress standard error using 2> /dev/null
  # cat invalid-file-name.txt 2> /dev/null
In a crontab entry, suppress all output of a cron task:
  30 1 * * * command > /dev/null 2>&1
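The order matters: redirections are processed left to right, so 2>&1 must come after the stdout redirect. A sketch with a hypothetical missing path:

```shell
# stderr follows stdout into the file; nothing reaches the terminal.
ls /no/such/dir > /tmp/out.log 2>&1 || true
cat /tmp/out.log     # the "No such file or directory" message is in the file
```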

cut

Display the 1st field (employee name) from a colon delimited file
  $ cut -d: -f 1 names.txt
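With a hypothetical names.txt of name:salary:city records, -f 1 keeps just the first field:

```shell
printf 'Emma:2000:Boston\nAlex:1500:Dallas\n' > /tmp/names.txt
cut -d: -f 1 /tmp/names.txt
# Emma
# Alex
```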

ac

Show connect time for all users, per user (-p) and by day (-d)
  $ ac -pd

&

Execute a command in the background; nohup keeps it running after you log out.

  $ nohup ./my-shell-script.sh &

at

Execute command at 10 am tomorrow

  $ at -f backup.sh 10am tomorrow

dstat

Show CPU, Disk, Network, paging utilization
  $ dstat
You did not select any stats, using -cdngy by default.
----total-cpu-usage---- -dsk/total- -net/total- ---paging-- ---system--
usr sys idl wai hiq siq| read  writ| recv  send|  in   out | int   csw 
  1   0  99   0   0   0|2934k  201k|   0     0 | 367B  590B| 277   554 
  0   0 100   0   0   0|   0     0 | 262B  842B|   0     0 |  94   156  


  $ dstat -tcndylp --top-cpu 

This is identical to

  $ dstat --time --cpu --net --disk --sys --load --proc --top-cpu

traceroute

Show network route to host

  $ traceroute myblog.com

 

nc

Scan a host and port; nc is also known as netcat

  $ nc -zv 8.8.8.8 80

uname

print operating system name and information

  $ uname -a

ab

Apache benchmark tool; shows how many HTTP requests per second a server can handle. Request the page at http://myblog.com 100 times through 5 concurrent connections.

  $ ab -n 100 -c 5 http://myblog.com

wget

Get a web page, see also curl
  $ wget http://my.recipes.com

Crontab entry that gets the URL and suppresses output
30  *  *  *  0 wget -O - -q -t 1 http://recipes.web3us.com

-t <n> changes the default number of tries from 20 to n.

-q prevents wget from writing to standard output and makes it totally silent.

-O <file_name> specifies the downloaded file name; use '-' (as in -O -) to write the document to standard output instead of a file.