One of my grad students just went to remove some unwanted, automatically created files in his directory and accidentally deleted some things he wanted. I use a script to do cleanups to prevent these kinds of silly errors (which we're all prone to). Here's the script:
#!/bin/bash
# Park junk files in $HOME/.rmd instead of deleting them outright.
if [ ! -e "$HOME/.rmd" ]
then
    mkdir "$HOME/.rmd"
fi
# Sweep up editor backups, autosave files, core dumps, and stale .backup
# files, skipping the .rmd directory itself so we don't re-collect them.
find "$HOME" \( -name '.rmd' -prune \) -o \
    \( -name '*~' \
    -o -name ',*' \
    -o -name '#*#' \
    -o -name '*.bak' \
    -o -name '*.backup' -atime +5 \
    -o -name 'core' \
    \) \
    -print -exec mv -f {} "$HOME/.rmd" \;
# Anything that has sat untouched in .rmd for more than five days goes away.
find "$HOME/.rmd" -atime +5 -exec rm -f {} \;
The script creates a directory called .rmd if it doesn't exist, finds files matching a certain set of patterns and moves them to that directory, and finally removes things in that directory that were moved there more than five days ago. It's not perfect--files with the same name are just moved over the top of each other.
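The overwriting has never bothered me much, but if it bothers you, one way around it (a sketch of a variation, not what my script actually does) is to have each run drop its catch into a timestamped subdirectory of .rmd and then expire whole subdirectories instead of individual files. This assumes a find that understands -mindepth and -maxdepth, which older versions may not:

#!/bin/bash
# Variant sketch: avoid name collisions by giving each run its own
# timestamped subdirectory under .rmd.
trash="$HOME/.rmd/$(date +%Y%m%d-%H%M%S)"
mkdir -p "$trash"
find "$HOME" \( -name '.rmd' -prune \) -o \
    \( -name '*~' \
    -o -name ',*' \
    -o -name '#*#' \
    -o -name '*.bak' \
    -o -name '*.backup' -atime +5 \
    -o -name 'core' \
    \) \
    -print -exec mv -f {} "$trash" \;
# Throw away whole runs that are more than five days old.
find "$HOME/.rmd" -mindepth 1 -maxdepth 1 -type d -mtime +5 \
    -exec rm -rf {} \;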
I name it "clean" and put it in my personal bin directory. You might add or delete individual line items depending on what kinds of files your programs create. When I was a grad student, disks were expensive and I worked on a system that enforced quotas, so I ran it from a cron job once a day. Now I just run it whenever things look ugly--the same approach I take to dusting.
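If you do want it on a schedule, a cron entry is all it takes. Assuming the script really is sitting in your ~/bin as "clean" (adjust the path if yours lives elsewhere), something like this runs it once a day at 4 a.m.:

# Edit your crontab with "crontab -e" and add a line like:
0 4 * * * $HOME/bin/clean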
Building or modifying a script like this can be dangerous since a bug could cause things you care about to be systematically removed. I recommend testing it on an account that doesn't have anything you care about in it before you blindly trust it.
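A low-risk way to do that is a dry run: use the same find expression with only -print, so you see exactly what would be swept up without moving or deleting anything. For example:

# Dry run: list the files the clean script would move, touching nothing.
find "$HOME" \( -name '.rmd' -prune \) -o \
    \( -name '*~' \
    -o -name ',*' \
    -o -name '#*#' \
    -o -name '*.bak' \
    -o -name '*.backup' -atime +5 \
    -o -name 'core' \
    \) \
    -print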
One last thing: I used Linux in the title, but this will obviously work on anything with bash and find, including the various flavors of Unix and OS X. These days I'm running it on OS X rather than Ultrix or 4.3BSD. Note that not all versions of find have a -prune option.