Vim maintains a set of open files, called “buffers”.
A Vim session has a number of tabs, each of which has a number of windows (split panes).
Each window shows a single buffer. Unlike other programs you are familiar with, like web browsers, there is not a 1-to-1 correspondence between buffers and windows; windows are merely views.
A given buffer may be open in multiple windows, even within the same tab.
This can be quite handy, for example, to view two different parts of a file at the same time.
By default, Vim opens with a single tab, which contains a single window.
" Comments in Vimscript start with a `"`." If you open this file in Vim, it'll be syntax highlighted for you." Vim is based on Vi. Setting `nocompatible` switches from the default" Vi-compatibility mode and enables useful Vim functionality. This" configuration option turns out not to be necessary for the file named" '~/.vimrc', because Vim automatically enters nocompatible mode if that file" is present. But we're including it here just in case this config file is" loaded some other way (e.g. saved as `foo`, and then Vim started with" `vim -u foo`).setnocompatible" Turn on syntax highlighting.syntaxon" Disable the default Vim startup message.setshortmess+=I
" Show line numbers.setnumber" This enables relative line numbering mode. With both number and" relativenumber enabled, the current line shows the true line number, while" all other lines (above and below) are numbered relative to the current line." This is useful because you can tell, at a glance, what count is needed to" jump up or down to a particular line, by {count}k to go up or {count}j to go" down.setrelativenumber" Always show the status line at the bottom, even if you only have one window open.setlaststatus=2" The backspace key has slightly unintuitive behavior by default. For example," by default, you can't backspace before the insertion point set with 'i'." This configuration makes backspace behave more reasonably, in that you can" backspace over anything.setbackspace=indent,eol,start" By default, Vim doesn't let you hide a buffer (i.e. have a buffer that isn't" shown in any window) that has unsaved changes. This is to prevent you from "" forgetting about unsaved changes and then quitting e.g. via `:qa!`. We find" hidden buffers helpful enough to disable this protection. See `:help hidden`" for more information on this.sethidden" This setting makes search case-insensitive when all characters in the string" being searched are lowercase. However, the search becomes case-sensitive if" it contains any capital letters. This makes searching more convenient.setignorecasesetsmartcase" Enable searching as you type, rather than waiting till you press enter.setincsearch" Unbind some useless/annoying default key bindings.nmap Q <Nop>" 'Q' in normal mode enters Ex mode. You almost never want this." Disable audible bell because it's annoying.setnoerrorbellsvisualbellt_vb=" Enable mouse support. You should avoid relying on this too much, but it can" sometimes be convenient.setmouse+=a" Try to prevent bad habits like using the arrow keys for movement. This is" not the only possible bad habit. 
For example, holding down the h/j/k/l keys" for movement, rather than using more efficient movement commands, is also a" bad habit. The former is enforceable through a .vimrc, while we don't know" how to prevent the latter." Do this in normal mode...nnoremap<Left> :echoe"Use h"<CR>nnoremap<Right> :echoe"Use l"<CR>nnoremap<Up> :echoe"Use k"<CR>nnoremap<Down> :echoe"Use j"<CR>" ...and in insert modeinoremap<Left><ESC>:echoe"Use h"<CR>inoremap<Right><ESC>:echoe"Use l"<CR>inoremap<Up><ESC>:echoe"Use k"<CR>inoremap<Down><ESC>:echoe"Use j"<CR>
On macOS, there may not be a default .vimrc file in ~.
But never mind: just type vim ~/.vimrc, paste the config into it, then type :wq.
There are tons of plugins for extending Vim. Contrary to outdated advice that you might find on the internet, you do not need to use a plugin manager for Vim (since Vim 8.0). Instead, you can use the built-in package management system. Simply create the directory ~/.vim/pack/vendor/start/, and put plugins in there (e.g. via git clone).
We’re trying to avoid giving an overwhelmingly long list of plugins here. You can check out the instructors’ dotfiles (Anish, Jon, Jose) to see what other plugins we use. Check out Vim Awesome for more awesome Vim plugins. There are also tons of blog posts on this topic: just search for “best Vim plugins”.
Create the plugins directory with mkdir -p ~/.vim/pack/vendor/start
Download the plugin: cd ~/.vim/pack/vendor/start; git clone https://github.com/ctrlpvim/ctrlp.vim
Read the documentation for the plugin. Try using CtrlP to locate a file by navigating to a project directory, opening Vim, and using the Vim command-line to start :CtrlP.
Customize CtrlP by adding configuration to your ~/.vimrc to open CtrlP by pressing Ctrl-P.
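A sketch of that configuration (these are documented ctrlp.vim options; note that `<c-p>` is actually CtrlP's default mapping, set explicitly here for clarity):

```vim
" Open CtrlP by pressing Ctrl-P (this is also CtrlP's default mapping).
let g:ctrlp_map = '<c-p>'
let g:ctrlp_cmd = 'CtrlP'
" Optional: start searching from the nearest ancestor containing .git,
" falling back to the directory of the current file ('ra' mode).
let g:ctrlp_working_path_mode = 'ra'
```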
Solution
...TO DO
Problem 3
To practice using Vim, re-do the Demo from lecture on your own machine.
(Advanced) Convert XML to JSON (example file) using Vim macros. Try to do this on your own, but you can look at the macros section above if you get stuck.
To assign variables in bash, use the syntax foo=bar and access the value of the variable with $foo. Note that foo = bar will not work, since it is interpreted as calling the foo program with arguments = and bar. In general, in shell scripts the space character performs argument splitting. This behavior can be confusing at first, so watch out for it.
Strings in bash can be defined with ' and " delimiters.
Strings delimited with ' are literal strings and will not substitute variable values.
Strings delimited with " will substitute variable values.
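A quick illustration of both points:

```bash
foo=bar
echo "$foo"   # prints bar: double quotes substitute the variable
echo '$foo'   # prints $foo: single quotes are taken literally
```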
As with most programming languages, bash supports control flow techniques including if, case, while and for. Similarly, bash has functions that take arguments and can operate with them.
Here is an example of a function that creates a directory and cds into it.
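The function body itself didn't make it into these notes; the version from the lecture is:

```bash
mcd () {
    mkdir -p "$1"   # create the directory (and any missing parents)
    cd "$1"         # then change into it
}
```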
```bash
vim mcd.sh
# paste the code into it...
source mcd.sh
mcd test
# And you will find that you are in the test directory, which is a
# child dir of the one in which you ran the mcd function
```
Unlike other scripting languages, bash uses a variety of special variables to refer to arguments, error codes, and other relevant variables. Below is a list of some of them. A more comprehensive list can be found here.
$0 - Name of the script
$1 to $9 - Arguments to the script. $1 is the first argument and so on.
$@ - All the arguments
$# - Number of arguments
$? - Return code of the previous command
$$ - Process identification number (PID) for the current script
!! - Entire last command, including arguments. A common pattern is to execute a command only for it to fail due to missing permissions; you can quickly re-execute the command with sudo by doing sudo !!
$_ - Last argument from the last command. If you are in an interactive shell, you can also quickly get this value by typing Esc followed by . or Alt+.
Commands will often return output using STDOUT, errors through STDERR, and a Return Code to report errors in a more script-friendly manner.
The return code or exit status is the way scripts/commands have to communicate how execution went.
A value of 0 usually means everything went OK
anything different from 0 means an error occurred.
Exit codes can be used to conditionally execute commands using && (and operator) and || (or operator), both of which are short-circuiting operators.
Commands can also be separated within the same line using a semicolon ;.
The true program will always have a 0 return code
the false command will always have a 1 return code.
```bash
false || echo "Oops, fail"          # Oops, fail
true || echo "Will not be printed"  #
true && echo "Things went well"     # Things went well
false && echo "Will not be printed" #
true ; echo "This will always run"  # This will always run
false ; echo "This will always run" # This will always run
```
Another common pattern is wanting to get the output of a command as a variable.
This can be done with command substitution. Whenever you place $( CMD ) it will execute CMD, get the output of the command and substitute it in place.
For example, if you do for file in $(ls), the shell will first call ls and then iterate over those values.
A lesser known similar feature is process substitution, <( CMD ) will execute CMD and place the output in a temporary file and substitute the <() with that file’s name.
This is useful when commands expect values to be passed by file instead of by STDIN. For example, diff <(ls foo) <(ls bar) will show differences between files in dirs foo and bar.
Let's see an example that showcases some of these features.
It will iterate through the arguments we provide, grep for the string foobar, and append it to the file as a comment if it’s not found.
```bash
#!/bin/bash

echo "Starting program at $(date)" # Date will be substituted
echo "Running program $0 with $# arguments with pid $$"

for file in "$@"; do
    grep foobar "$file" > /dev/null 2> /dev/null
    # When pattern is not found, grep has exit status 1
    # We redirect STDOUT and STDERR to a null register since we do not care about them
    if [[ $? -ne 0 ]]; then
        echo "File $file does not have any foobar, adding one"
        echo "# foobar" >> "$file"
    fi
done
```
In the comparison we tested whether $? was not equal to 0. Bash implements many comparisons of this sort - you can find a detailed list in the manpage for test.
When performing comparisons in bash, try to use double brackets [[ ]] in favor of simple brackets [ ]. Chances of making mistakes are lower although it won’t be portable to sh. A more detailed explanation can be found here.
When launching scripts, you will often want to provide arguments that are similar. Bash has ways of making this easier, expanding expressions by carrying out filename expansion. These techniques are often referred to as shell globbing.
Wildcards
? matches exactly one character.
* matches any number of characters.
For instance, given files foo, foo1, foo2, foo10 and bar, the command rm foo? will delete foo1 and foo2 whereas rm foo* will delete all but bar.
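Trying exactly that scenario in a scratch directory:

```bash
cd "$(mktemp -d)"          # scratch directory for the demo
touch foo foo1 foo2 foo10 bar

rm foo?    # removes foo1 and foo2 (? matches exactly one character)
rm foo*    # removes the remaining foo and foo10
ls         # only bar is left
```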
Curly braces {} - Whenever you have a common substring in a series of commands, you can use curly braces for bash to expand this automatically. This comes in very handy when moving or converting files.
```bash
convert image.{png,jpg}
# Will expand to
convert image.png image.jpg

cp /path/to/project/{foo,bar,baz}.sh /newpath
# Will expand to
cp /path/to/project/foo.sh /path/to/project/bar.sh /path/to/project/baz.sh /newpath

# Globbing techniques can also be combined
mv *{.py,.sh} folder
# Will move all *.py and *.sh files

mkdir foo bar
# This creates files foo/a, foo/b, ... foo/h, bar/a, bar/b, ... bar/h
touch {foo,bar}/{a..h}
touch foo/x bar/y

# Show differences between files in foo and bar
diff <(ls foo) <(ls bar)
# Outputs
# < x
# ---
# > y
```
Note that scripts need not necessarily be written in bash to be called from the terminal. For instance, here’s a simple Python script that outputs its arguments in reversed order:
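The script itself got lost from these notes; a minimal version of what the lecture shows (wrapping the logic in a function here is my addition, for testability):

```python
#!/usr/bin/env python
import sys


def reverse_args(args):
    """Return the given list of arguments in reversed order."""
    return list(reversed(args))


if __name__ == "__main__":
    # argv[0] is the script name itself, so skip it.
    for arg in reverse_args(sys.argv[1:]):
        print(arg)
```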
The kernel knows to execute this script with a Python interpreter instead of a shell command because we included a shebang line at the top of the script.
It is good practice to write shebang lines using the env command that will resolve to wherever the command lives in the system, increasing the portability of your scripts.
Btw, if you really want to figure out where your program is, type which python (or which <program> in general), and you could put that output in the shebang line directly.
To resolve the location, env will make use of the PATH environment variable we introduced in the first lecture. For this example the shebang line would look like #!/usr/bin/env python.
Some differences between shell functions and scripts that you should keep in mind are:
Functions have to be in the same language as the shell, while scripts can be written in any language. This is why including a shebang for scripts is important.
Functions are loaded once when their definition is read. Scripts are loaded every time they are executed. This makes functions slightly faster to load, but whenever you change them you will have to reload their definition.
Functions are executed in the current shell environment whereas scripts execute in their own process. Thus, functions can modify environment variables, e.g. change your current directory, whereas scripts can’t. Scripts will be passed by value environment variables that have been exported using export
As with any programming language, functions are a powerful construct to achieve modularity, code reuse, and clarity of shell code. Often shell scripts will include their own function definitions.
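The environment difference above is easy to see with a function that changes directory; a script with the same body would only change the directory of its own (separate) process:

```bash
# Defined as a *function*, this runs in the current shell, so the
# cd actually affects your shell's working directory:
go_tmp() {
    cd /tmp
}

go_tmp
pwd   # the current shell is now in /tmp
```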
The first-order approach is to call said command with the -h or --help flags.
A more detailed approach is to use the man command. Short for manual, man provides a manual page (called manpage) for a command you specify.
For interactive tools such as the ones based on ncurses, help for the commands can often be accessed within the program using the :help command or typing ?.
Sometimes manpages can provide overly detailed descriptions of the commands, making it hard to decipher what flags/syntax to use for common use cases.
TLDR pages are a nifty complementary solution that focuses on giving example use cases of a command so you can quickly figure out which options to use.
To use tldr, just run tldr <command> (reminder: you need to install it first).
```bash
# Find all directories named src
find . -name src -type d
# Find all python files that have a folder named test in their path
find . -path '*/test/*.py' -type f
# Find all files modified in the last day
find . -mtime -1
# Find all zip files with size in range 500k to 10M
find . -size +500k -size -10M -name '*.tar.gz'
```
Beyond listing files, find can also perform actions over files that match your query. This property can be incredibly helpful to simplify what could be fairly monotonous tasks.
grep, a generic tool for matching patterns from the input text.
For now, know that grep has many flags that make it a very versatile tool. Some I frequently use are -C for getting Context around the matching line and -v for inverting the match, i.e. print all lines that do not match the pattern.
For example, grep -C 5 will print 5 lines before and after the match. When it comes to quickly searching through many files, you want to use -R since it will Recursively go into directories and look for files for the matching string.
But grep -R can be improved in many ways, such as ignoring .git folders, using multi CPU support, &c.
Many grep alternatives have been developed, including ack, ag and rg. All of them are fantastic and pretty much provide the same functionality. For now I am sticking with ripgrep (rg), given how fast and intuitive it is. Some examples:
```bash
# Find all python files where I used the requests library
rg -t py 'import requests'
# Find all files (including hidden files) without a shebang line
rg -u --files-without-match "^#\!"
# Find all matches of foo and print the following 5 lines
rg foo -A 5
# Print statistics of matches (# of matched lines and files)
rg --stats PATTERN
```
The history command will let you access your shell history programmatically.
It will print your shell history to the standard output. If we want to search there we can pipe that output to grep and search for patterns. history | grep find will print commands that contain the substring “find”.
In most shells, you can make use of Ctrl+R to perform backwards search through your history. After pressing Ctrl+R, you can type a substring you want to match for commands in your history. As you keep pressing it, you will cycle through the matches in your history.
```
[root@iZbp12idmwavjjcx2k19kjZ course2]# ls -a -l -h -t --color=auto
total 88K
-rw-r--r--  1 root root  54K Dec  3 21:46 install_lts.sh
drwxr-xr-x  5 root root 4.0K Dec  3 21:46 .
-rwxrwxrwx  1 root root   83 Dec  3 20:04 script.py
drwxr-xr-x 11 root root 4.0K Dec  3 20:02 bar
drwxr-xr-x 11 root root 4.0K Dec  3 20:02 foo
-rwxrwxrwx  1 root root   50 Dec  3 19:21 mcd.sh
-rwxrwxrwx  1 root root  485 Dec  3 19:21 f.sh
drwxr-xr-x  2 root root 4.0K Dec  3 19:17 test
drwxr-xr-x 14 root root 4.0K Dec  3 19:12 ..
```
-a : includes all files
-l : use a long listing format
-h : print sizes like 1K 234M 2G etc.
-t : sort by recency
--color=auto : colorize output
Problem 2
Write bash functions marco and polo that do the following. Whenever you execute marco the current working directory should be saved in some manner, then when you execute polo, no matter what directory you are in, polo should cd you back to the directory where you executed marco. For ease of debugging you can write the code in a file marco.sh and (re)load the definitions to your shell by executing source marco.sh.
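One possible solution, sketched: save the directory in an exported variable.

```bash
marco() {
    export MARCO_DIR="$(pwd)"   # remember the directory marco was run in
}

polo() {
    cd "$MARCO_DIR"             # jump back to the saved directory
}
```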
Say you have a command that fails rarely. In order to debug it you need to capture its output but it can be time consuming to get a failure run. Write a bash script that runs the following script until it fails and captures its standard output and error streams to files and prints everything at the end. Bonus points if you can also report how many runs it took for the script to fail.
```bash
#!/usr/bin/env bash

n=$(( RANDOM % 100 ))

if [[ n -eq 42 ]]; then
    echo "Something went wrong"
    >&2 echo "The error was using magic numbers"
    exit 1
fi

echo "Everything went according to plan"
```
```bash
#!/usr/bin/env bash

count=0
echo > out.log

while true
do
    ./task3.sh &>> out.log
    if [[ $? -ne 0 ]]; then
        cat out.log
        echo "failed after $count times"
        break
    fi
    ((count++))
done
```
Problem 4
As we covered in the lecture find’s -exec can be very powerful for performing operations over the files we are searching for. However, what if we want to do something with all the files, like creating a zip file? As you have seen so far commands will take input from both arguments and STDIN. When piping commands, we are connecting STDOUT to STDIN, but some commands like tar take inputs from arguments. To bridge this disconnect there’s the xargs command which will execute a command using STDIN as arguments. For example ls | xargs rm will delete the files in the current directory.
Your task is to write a command that recursively finds all HTML files in the folder and makes a zip with them. Note that your command should work even if the files have spaces (hint: check -d flag for xargs).
. specifies the current directory as the starting point.
-type f filters the results to include only files (not directories or other types).
-name "*.html" matches files with the .html extension.
Output: A list of .html file paths.
| (Pipe symbol)
Passes the output of find (list of .html files) as input to the next command, xargs.
xargs -d '\n'
xargs converts the input (list of file paths) into arguments for the tar command.
-d '\n' specifies that each line of input is treated as a separate file. This ensures paths with spaces or special characters are handled correctly.
tar -cvzf html.zip
tar creates compressed archives.
-c : Creates a new archive.
-v : Displays a list of files being added to the archive (verbose mode).
-z : Compresses the archive using gzip.
-f html.zip: Names the output file html.zip.
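Putting the breakdown above together (note: -d is a GNU xargs option; BSD/macOS xargs lacks it, where find -print0 | xargs -0 is the portable alternative):

```bash
# Scratch setup: a few HTML files, one with a space in its name.
cd "$(mktemp -d)" && mkdir sub
echo "<html></html>" > "index page.html"
echo "<html></html>" > sub/about.html

# Recursively find all HTML files and archive them into html.zip;
# -d '\n' makes xargs split input on newlines only, so spaces in
# file names survive intact.
find . -type f -name "*.html" | xargs -d '\n' tar -cvzf html.zip
```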
Problem 5
(Advanced) Write a command or script to recursively find the most recently modified file in a directory. More generally, can you list all files by recency?
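A sketch of one approach (caveat: if there are more files than fit in one command line, xargs may invoke ls several times, which breaks the global ordering):

```bash
# Most recently modified file under the current directory
# (-print0/-0 keep paths with spaces intact; ls -t sorts by mtime):
find . -type f -print0 | xargs -0 ls -t | head -n 1

# List *all* files by recency: just drop the head.
find . -type f -print0 | xargs -0 ls -t
```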
Uh... I find that I'm still quite uncomfortable with the grammar of bash scripts. The next step, I think, before the next lecture, is to pursue a deeper and better understanding of bash scripting.
All kinds of shells share one common core: they allow you to run programs, give them input, and inspect their output in a semi-structured way.
As I use macOS, my shell is zsh, running inside the Terminal app. (Uh... btw, to be honest, I didn't quite know the differences between zsh, the terminal, and the shell: roughly, the terminal is the program that displays text, the shell is the command interpreter running inside it, and zsh is one particular shell. Hope I'm right..)
This is the main textual interface to the shell. It tells you that you are on the machine missing and that your “current working directory”, or where you currently are, is ~ (short for “home”). The $ tells you that you are not the root user (more on that later). At this prompt you can type a command, which will then be interpreted by the shell.
If you want to provide an argument that contains spaces or other special characters (e.g., a directory named “My Photos”), you can either quote the argument with ' or " ("My Photos"), or escape just the relevant characters with \ (My\ Photos).
But... it seems that my macOS doesn't need those quotes or backslashes; it also printed the correct arguments.
If the shell is asked to execute a command that doesn’t match one of its programming keywords, it consults an environment variable called $PATH that lists which directories the shell should search for programs when it is given a command:
First, the d at the beginning of the line tells us that missing is a directory.
Then follow three groups of three characters (rwx). These indicate what permissions the owner of the file (missing), the owning group (users), and everyone else respectively have on the relevant item.
A - indicates that the given principal does not have the given permission.
Above, only the owner is allowed to modify (w) the missing directory (i.e., add/remove files in it).
To enter a directory, a user must have “search” (represented by “execute”: x) permissions on that directory (and its parents).
To list its contents, a user must have read (r) permissions on that directory. For files, the permissions are as you would expect.
Quick summary: rwx represents the permissions: r for read, w for write/modify, x for execute, and - for a permission the principal does not have.
Notice that nearly all the files in /bin have the x permission set for the last group, “everyone else”, so that anyone can execute those programs.
5. Other useful programs
mv (to rename/move a file)
cp (to copy a file)
mkdir (to make a new directory).
man (takes as an argument the name of a program and shows you its manual page; press q to exit)
cat is a program that concatenates files. When given file names as arguments, it prints the contents of each of the files in sequence to its output stream. But when cat is not given any arguments, it prints contents from its input stream to its output stream (like in the third example above).
You can also use >> to append to a file
The | operator lets you “chain” programs such that the output of one is the input of another:
The command tail prints the last lines of a file; -n specifies the number of lines counted from the end, so tail -n1 prints exactly the last line of the file.
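For example, piping the output of ls -l into tail:

```bash
# Print only the last line of ls -l's output for the root directory:
ls -l / | tail -n1
```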
On most Unix-like systems, one user is special: the “root” user.
The root user is above (almost) all access restrictions, and can create, read, update, and delete any file in the system.
using the sudo command. As its name implies, it lets you “do” something “as su” (short for “super user”, or “root”)
One thing you need to be root in order to do is write to the sysfs file system mounted under /sys. sysfs exposes a number of kernel parameters as files, so that you can easily reconfigure the kernel on the fly without specialized tools. Note that sysfs does not exist on Windows or macOS.
For example, the brightness of your laptop’s screen is exposed through a file called brightness under
Simply speaking, the error occurs because the shell (which is authenticated just as your user) is the one that opens the brightness file for writing when it sets up the output redirection for sudo echo, and it is prevented from doing so since the shell does not run as root.
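The standard workaround is to let the privileged program, rather than the shell, open the file; tee does exactly that. The sysfs path varies per machine, so the runnable demo below uses an ordinary file instead:

```bash
# Conceptually:  echo 3 | sudo tee /sys/<path-to>/brightness
# tee reads STDIN and writes it to the named file; since tee itself runs
# under sudo, the file is opened by a root process, not by your shell.

# The same mechanism without root, on an ordinary file:
echo 1060 | tee /tmp/brightness_demo
```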
The first line might be tricky to get working. It’s helpful to know that # starts a comment in Bash, and ! has a special meaning even within double-quoted (") strings. Bash treats single-quoted strings (') differently: they will do the trick in this case. See the Bash Quoting (Bash Reference Manual) manual page for more information.
7. Use chmod to make it possible to run the command ./semester
First, a quick dive into chmod
chmod : change file modes or Access Control Lists
Usage: chmod [OPTION]... MODE[,MODE]... FILE...
The mode defines the permissions of the file or directory, usually as three digits. Each digit represents the permissions of the user (owner), the group, and others, respectively.
The read, write, and execute permissions each have the following numeric value:
r (read) = 4
w (write) = 2
x (execute) = 1
no permissions = 0
To find out the file’s permissions in numeric mode simply calculate the totals for all users classes. For example, to give read, write and execute permission to the file’s owner, read and execute permissions to the file’s group and only read permissions to all other users you would do the following:
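Worked out, that description is mode 754 (shown here on a stand-in file named semester, as in the exercise):

```bash
cd "$(mktemp -d)"
touch semester            # empty stand-in for the exercise's script
# owner: rwx = 4+2+1 = 7;  group: r-x = 4+1 = 5;  others: r-- = 4
chmod 754 semester
ls -l semester            # permissions now show as -rwxr-xr--
```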