Reading files is no big deal with bash: you can redirect input to the script, pipe the output of another command into it, or read the files inside the script if their names are predetermined. You can also use process substitution to pass the open files (actually command pipelines) in from the command line.
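A minimal sketch of the approaches mentioned above, assuming a hypothetical script sum_lines.sh that sums the numbers it reads from standard input and a hypothetical input file data.txt:

    #!/bin/bash
    # sum_lines.sh: read numbers, one per line, from standard input
    total=0
    while read -r n; do
        total=$((total + n))
    done
    echo "$total"

    # Ways to feed it a file:
    #   ./sum_lines.sh < data.txt               # input redirection
    #   grep -v '^#' data.txt | ./sum_lines.sh  # piped from another command
    #   ./sum_lines.sh < <(seq 1 10)            # process substitution
    # Or, if the file name is predetermined, read it inside the script:
    #   while read -r n; do ...; done < data.txt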
The Bash shell provides a builtin command, test, that evaluates an expression and returns an exit status indicating whether the condition holds. Together with the exit status of ordinary commands, this gives you a way to make decisions about what will happen next in a script.
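A short sketch of how test and exit statuses fit together; /etc/passwd is just a convenient example file:

    #!/bin/bash
    file="/etc/passwd"

    # test sets its exit status according to the expression it evaluates;
    # [ ... ] is the same builtin under another name.
    if test -r "$file"; then
        echo "$file is readable"
    else
        echo "$file is not readable"
    fi

    # The exit status of the most recent command is available in $?.
    grep -q "^root:" "$file"
    echo "grep exited with status $?"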
In addition to the fairly common forms of input/output redirection, the shell recognizes something called process substitution. Although not documented as a form of input/output redirection, its syntax and effects are similar.
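Two common forms of process substitution, sketched with hypothetical directory names dir1 and dir2:

    # Input side: each <(...) runs a pipeline and presents its output to
    # diff as if it were a file, so no temporary files are needed.
    diff <(ls dir1 | sort) <(ls dir2 | sort)

    # Output side: >( ... ) sends standard output to a command as if it
    # were a file.
    ls > >(tee listing.txt)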
Oftentimes you’ll be in a situation where you want to run a command on a remote machine that will take a long time to complete, but you want to be able to issue the command and then log off and have that command run in the background. There are many ways you could achieve this, perhaps by using cron or at to schedule the command to run right away. However, there is a better way.
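The article's recommended technique isn't named in this excerpt, but two common ways to keep a job alive after logging off are nohup and disown; a sketch using a hypothetical long_build.sh script:

    # Start the job immune to hangups, with output captured to a file:
    nohup ./long_build.sh > build.log 2>&1 &

    # Or detach an already-running background job so the shell will not
    # send it SIGHUP when you log out:
    ./long_build.sh > build.log 2>&1 &
    disown -h %1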
In the last article I talked about simple command pipelines, one of the features that makes the Linux command line so powerful and so worth learning. So if you want to get comfortable using the command line, here are some tips that will make it a lot easier.
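For reference, a typical pipeline of the kind described there, chaining several small commands into one job:

    # Count the distinct login shells in use on the system:
    cut -d: -f7 /etc/passwd | sort | uniq -c | sort -rn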
There are times when you will want to trim some information from the output of a command. This may be because you want to feed that output into another command. Whatever the reason for wanting to manipulate the output, awk is one of many tools available in GNU/Linux to perform this task.
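A small example of the kind of trimming awk makes easy, here applied to df output (the 90% threshold is just an illustration):

    # Keep only the filesystem name and percent used; NR > 1 skips the header.
    df -h | awk 'NR > 1 {print $1, $5}'

    # Trim the field further before acting on it: strip the % sign and
    # report any filesystem more than 90% full.
    df -h | awk 'NR > 1 {gsub("%", "", $5); if ($5+0 > 90) print $1}'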