
Posts published in February 2008

Senate passes bill to prohibit gun confiscations

Thanks in no small part to your calls to legislators, the state Senate today passed AB581. Authored by Representative Scott Gunderson (R-Waterford), this is the bill that prohibits the governor and other public officials from confiscating guns during declared states of emergency.

Under the bill, public officials will not be able to exercise emergency powers to restrict the possession, transfer, sale, transport, storage, display or use of firearms or ammunition.

AB581 passed in the Assembly in December by a vote of 84 to 13, with two members not present. The vote in the Senate today was 26 in favor, 5 opposed, with two members not present.

Voting against the bill were Senators Spencer Coggs (D-Milwaukee), Lena Taylor (D-Milwaukee), Judy Robson (D-Beloit), Mark Miller (D-Monona) and Fred Risser (D-Madison). By their votes, these senators are on record as being in favor of allowing the governor or other public officials to confiscate firearms. Not just Evil Black rifles, or handguns, but ALL guns.

Governor Doyle’s office has said that he has not yet decided if he will sign the bill. He’s known about this bill for months, knows what it’s all about, and certainly should have decided what his position would be on the bill by now.

Please help Governor Doyle decide what his position should be by contacting his office at 608-266-1212.

Also call your state senator to thank him or her for a vote in favor of gun rights. If you don’t know who your state senator is, go to and enter your street address to get your senator’s name and contact information. Please also thank Senate Majority Leader Russ Decker (D-Wausau) for scheduling a vote on this very important bill.

Towel Heads

Okay, this was passed along via email… just too good not to post for folks to get a good chuckle from:

In these troubled times, it has become very difficult to distinguish the good towel-heads from the bad towel heads. Just where are the moderate Muslims, anyway? Do they actually exist?

The following is provided, to help you distinguish between a BAD “towel-head” and a GOOD “towel-head.” You must study the pictures carefully so that you will not confuse the two in a moment of indecision… it could save your life!

Bad TowelHead



GOOD towel-head


Now you got it straight? 🙂

Check web page contents

Another simple script I’ve written and use often (it actually runs via cron every 5 minutes) to verify the contents of a particular page and determine if it has changed for any reason, such as a cracked/hacked webserver or dynamic content that isn’t what’s expected.

In this case it is specifically monitoring a page from a PeopleSoft application that I do not entirely trust. If the content doesn’t match my checksum, that generally means the Tuxedo application server is hosed and needs to be restarted.

Easily adapted to fit your particular needs; I use it on all of my public websites that are on shared hosting accounts, etc.

Comments for the bigger sections are thrown in to give you an idea of what to tweak for your needs. Remember to update $file2 with a new copy of the markup if you modify the landing page you are monitoring!


#!/usr/local/bin/perl
# This just checks what the current working directory is,
# if dev or test then set the debug flags to on...etc.
unshift(@INC, "/pshome/psmgr/bin");
unshift(@INC, "/pshome/psmgr/bin.test") if ("/pshome/psmgr/bin.test" eq "$ENV{'PWD'}");
unshift(@INC, "/pshome/psmgr/stat/bin") if ("/pshome/psmgr/stat/bin" eq "$ENV{'PWD'}");
require sr;

$debug = 0;
$debug = 1 if ("/home/bin.test" eq "$ENV{'PWD'}");
$debug = 1 if ("/home/stat/bin" eq "$ENV{'PWD'}");
$execute = 1;

$file1 = "/home/files/sitename.tmp";
$file2 = "/home/files/sitename_known_good.txt";
$url = "";   # URL edited out, set to the page you are monitoring
$url = "" if ("/pshome/psmgr/bin.test" eq "$ENV{'PWD'}");
$url = "" if ("/pshome/psmgr/stat/bin" eq "$ENV{'PWD'}");
$restarting = "/pshome/psmgr/files/careers.restarting";
$restartingold = "/pshome/psmgr/files/careers.restarting.old";
$check = 0;

# check to see if the restart is already in progress so we don't interrupt it
# likely a cleaner way to do this, but it works for now!
if (-e $restarting) { # check if restart file exists
    print "Restart file exists, servers may be restarting, check again in 5 mins\n";
    print "Renaming file and exiting\n";
    $cmd = "mv $restarting $restartingold";
    print "$cmd\n" if ($debug);
    system($cmd) if ($execute);
    exit;
} # end if restart file exists

if (-e $restartingold) { # check if restarting.old file still exists
    print "Restarting file existed for at least 5 mins, servers should be back up, rechecking\n";
    print "Removing restart file\n";
    $check = 1;
    $cmd = "rm $restartingold";
    print "$cmd\n" if ($debug);
    system($cmd) if ($execute);
} # end if restarting.old file exists

$cmd = "wget -b --no-check-certificate --output-document=$file1 $url";
print "$cmd\n" if ($debug);
system($cmd) if ($execute);

$cmd = "sleep 15";
print "$cmd - sleeping 15 secs to allow complete xfer of file\n" if ($debug);
system($cmd) if ($execute);

$diff1 = `cksum $file1`;
$diff2 = `cksum $file2`;
$diff1value = substr($diff1, 0, 9);
$diff2value = substr($diff2, 0, 9);

print "diff1value = $diff1value\n" if ($debug);
print "diff2value = $diff2value\n" if ($debug);

if ($diff1value ne $diff2value) { # checksums differ, page is wrong
    &notifydown;
    &repair;
} else {
    print "Files match, this site is up!\n";
    print "check = $check\n" if ($debug);
    if ($check == 1) { # check if recovering from restart
        &notifyup;
    } # end check if recovered from restart
}
&cleanup;

sub notifydown
{
    print "This site appears to be down, sending page and email\n" if ($debug);
    &sr::send_email($debug, "Site Down!", "Site is down, restarting app servers now!", "David Cochran");
    &sr::send_page($debug, "Careers site is down, restarting app servers now!", "David Cochran");
} # end sub notifydown

sub repair
{ # restart HRMSEXT App servers
    $cmd = "mv $file1 $restarting";
    print "$cmd\n" if ($debug);
    system($cmd) if ($execute);
    $cmd = "/commands to restart the application server go here";
    print "$cmd\n" if ($debug);
    system($cmd) if ($execute);
} # end sub repair

sub cleanup
{ # cleanup temp files
    $cmd = "rm -f wget-log*";
    print "$cmd\n" if ($debug);
    system($cmd) if ($execute);
    $cmd = "rm -f $file1";
    print "$cmd\n" if ($debug);
    system($cmd) if ($execute);
} # end sub cleanup

sub notifyup
{
    print "Site is back up, sending page and email.\n" if ($debug);
    &sr::send_email($debug, "Site is OK!", "Careers site is OK.", "David Cochran");
    &sr::send_page($debug, "Careers site is back up", "David Cochran");
} # end sub notifyup
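For what it’s worth, seeding $file2 and the checksum comparison can be sanity-checked straight from the shell the same way the script does it. This is just a sketch with placeholder paths, and printf standing in for the wget fetch; the real thing runs from cron every 5 minutes:

```shell
#!/bin/sh
# Sketch: seed the known-good copy and compare checksums the way the
# script above does. Paths are placeholders, not the real site files.
GOOD=/tmp/sitename_known_good.txt
TMP=/tmp/sitename.tmp

printf 'hello world\n' > "$GOOD"   # stand-in for: wget -O "$GOOD" "$URL"
printf 'hello world\n' > "$TMP"    # stand-in for the fresh fetch

sum1=`cksum "$GOOD" | awk '{print $1}'`
sum2=`cksum "$TMP"  | awk '{print $1}'`

if [ "$sum1" = "$sum2" ]; then
    echo "Files match, this site is up!"
else
    echo "Checksums differ, restart needed"
fi
```

Same idea, different clothes: identical content always yields the same cksum value, so any difference means the page changed.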

Checking SSL Certificate expiration dates

Managing a lot of SSL certificates? Hate being surprised when they expire on you and break your web site? How about a simple process to notify you in advance?

All of the above sound about right? It does to me. I literally manage close to 100 web sites that require SSL encryption for sensitive data transfers, and it seems almost impossible to get and keep them all lined up on expiration dates. Even when they were lined up, something new would come along and mess up the rotation; in short order it looks like a shotgun blast to a calendar was the deciding factor.

Here is an easy Perl script I wrote to check the dates of existing SSL certs. It gets the list of URLs to check from cert.urls, compares each certificate’s expiration date to the current date, and sends emails at the specified intervals. Pretty simple, but like most, it’s very effective when run daily via cron.

I’ve edited some of the partially confidential stuff out, but not enough to render the script unusable by any means; just point the email and paging calls at your local email facility.

#!/usr/local/bin/perl
# v 0.01
# Initial version - Dave Cochran 9/13/07
use Switch;
use Time::Local;

unshift(@INC, "/pshome/psmgr/bin");
unshift(@INC, "/pshome/psmgr/bin.test") if ("/pshome/psmgr/bin.test" eq "$ENV{'PWD'}");
unshift(@INC, "/pshome/psmgr/stat/bin") if ("/pshome/psmgr/stat/bin" eq "$ENV{'PWD'}");
require sr;

$debug = 0;
$debug = 1 if ("/pshome/psmgr/bin.test" eq "$ENV{'PWD'}");
$debug = 1 if ("/pshome/psmgr/stat/bin" eq "$ENV{'PWD'}");
$execute = 1;

$path = "/pshome/tmp";
@urls = `cat /pshome/psmgr/files/cert.urls`;

foreach $url (@urls) { # start for each url
    chomp $url;
    print "\nChecking $url\n";
    $cmd = "echo \"\" | openssl s_client -connect $url:443 > $path/certificate";
    print "\n\n $cmd\n\n" if ($debug);
    system($cmd) if ($execute);

    $cmd = "openssl x509 -in $path/certificate -noout -enddate > $path/outdate";
    print " $cmd\n\n" if ($debug);
    system($cmd) if ($execute);

    open(OUTDATE, "$path/outdate") || die "couldn't open the file!";
    $enddate = <OUTDATE>;
    close(OUTDATE);
    chomp $enddate;
    print "SSL enddate is: $enddate\n" if ($debug);
    $expire = substr($enddate, 9, 20);
    print "Expire date : $expire\n" if ($debug);
    $month = substr($expire, 0, 3);
    $day = substr($expire, 4, 2);
    $year = substr($expire, 16, 4);

    switch ("$month")
    { # map the month name to the 0-11 month number timegm wants
        case "Jan" { $month = 0 }
        case "Feb" { $month = 1 }
        case "Mar" { $month = 2 }
        case "Apr" { $month = 3 }
        case "May" { $month = 4 }
        case "Jun" { $month = 5 }
        case "Jul" { $month = 6 }
        case "Aug" { $month = 7 }
        case "Sep" { $month = 8 }
        case "Oct" { $month = 9 }
        case "Nov" { $month = 10 }
        case "Dec" { $month = 11 }
        else { print "$month is not a valid month. You have problems!\n" }
    } # end switch on month
    $day =~ s/ /0/; # pad single-digit days
    $expire_date = timegm(1, 0, 0, $day, $month, $year - 1900);
    $today = timegm(1, 0, 0, `date +%d`, `date +%m` - 1, `date +%Y` - 1900);
    $thirty_days = $today + (86400 * 30);
    $fifteen_days = $today + (86400 * 15);
    $seven_days = $today + (86400 * 7);
    $one_day = $today + 86400;

    print "Today is $today and the cert expires on $expire_date\n" if ($debug);
    print "1 day is $one_day\n" if ($debug);
    print "7 days is $seven_days\n" if ($debug);
    print "15 days is $fifteen_days\n" if ($debug);
    print "30 days is $thirty_days\n" if ($debug);

    $subject = "$url certificate expiration";
    if ($today > $expire_date) { # start if the certificate is expired
        &sr::send_page($debug, "$url cert expired", 'Paul Hofmann', 'David Cochran');
        $message = "The $url certificate is expired";
        print "\t$message\n";
        &sr::send_email($debug, $subject, $message, 'David Cochran');
    } # end if the certificate is expired
    elsif ($one_day == $expire_date) { # start else if the certificate will expire in 1 day
        &sr::send_page($debug, "$url cert expires in 1 day", 'David Cochran');
        $message = "The $url certificate will expire in 1 day";
        print "\t$message\n";
        &sr::send_email($debug, $subject, $message, 'David Cochran');
    } # end if the certificate will expire in 1 day
    elsif ($seven_days == $expire_date) { # start else if the certificate will expire in 7 days
        $message = "The $url certificate will expire in 7 days";
        print "\t$message\n";
        &sr::send_email($debug, $subject, $message, 'David Cochran');
    } # end if the certificate will expire in 7 days
    elsif ($fifteen_days == $expire_date) { # start else if the certificate will expire in 15 days
        $message = "The $url certificate will expire in 15 days";
        print "\t$message\n";
        &sr::send_email($debug, $subject, $message, 'David Cochran');
    } # end if the certificate will expire in 15 days
    elsif ($thirty_days == $expire_date) { # start else if the certificate will expire in 30 days
        $message = "The $url certificate will expire in 30 days";
        print "\t$message\n";
        &sr::send_email($debug, $subject, $message, 'David Cochran');
    } # end if the certificate will expire in 30 days
} # end for each url


URLs are stored in cert.urls with no http:// prefix
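As an aside, the same end-date arithmetic is easy to sanity-check straight from the shell if you have GNU date handy (its -d flag parses the openssl date format directly). The notAfter string below is a made-up example of what openssl x509 -noout -enddate prints:

```shell
#!/bin/sh
# Sketch: compute days until expiry from an openssl-style notAfter string.
# Requires GNU date (-d); the sample date below is fabricated for illustration.
enddate="notAfter=Jun 10 12:00:00 2038 GMT"
expire=`echo "$enddate" | sed 's/^notAfter=//'`

exp_epoch=`date -d "$expire" +%s`
now_epoch=`date +%s`
days_left=$(( (exp_epoch - now_epoch) / 86400 ))
echo "certificate expires in $days_left days"
```

Handy for spot-checking a single cert without firing up the whole script.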


You get the idea…

Happy hacking!

Finding large files on Linux/Unix

Depending on how you build your filesystem, locating large files such as logs or other output from scripts, programs, or daemons can be frustrating at best. Since I’m not a GUI kind of guy, this little snippet for the shell can save tons of time. (Tested with the KSH and Bash shells; it should work with most, if not all.)

[code]find / -type f -size +20000k -exec ls -lh {} \; | awk '{ print $9 ": " $5 }'[/code]

This will search, beginning at the root (/) directory, for all files over 20M. To adapt it for your use, simply change the / to any beginning path you wish and the 20000k to the minimum file size you would like to find.

Hard to get any easier than this. A detailed description of the all-powerful awk can be found in the associated man pages, should you be so inclined.
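If you’d rather have the worst offenders float to the top, a small variation does the trick. This sketch assumes GNU find (for -printf) and builds a throwaway test directory so you can see it in action; adjust the path and threshold as before:

```shell
#!/bin/sh
# Sketch: find files over a size threshold and list them largest-first.
# GNU find's -printf prints size-in-bytes then path; sort -rn orders by size.
dir=/tmp/findtest
mkdir -p "$dir"
dd if=/dev/zero of="$dir/big.log"   bs=1024 count=200 2>/dev/null  # ~200k file
dd if=/dev/zero of="$dir/small.log" bs=1024 count=10  2>/dev/null  # ~10k file

find "$dir" -type f -size +100k -printf '%s %p\n' | sort -rn
```

Only big.log clears the +100k bar, so it is the only line printed.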

Happy hacking!

Killing run-away jobs

Occasionally recurring jobs will get hung, especially jobs fired off from cron that are scheduled to run every minute. In my case there is a job that runs every minute to pick up status files from numerous remote servers; the script gets the files, parses them, and then plugs them into a MySQL database. That data is later parsed on demand by a PHP page to display different status information about the remote servers. All in all it works quite well, until something gets hung, especially if the NFS mount for the database has a hiccup. Once the first database connect hangs, it tends to hang the subsequent jobs as well, quickly filling up the available memory and of course crashing or hanging any other web pages trying to access the database.

A quick and dirty way to right things without typing or copy/pasting a bazillion PIDs to a kill -9:

for pid in `ps -ef | grep process_name | grep -v grep | awk '{print $2}'`
do
  kill -9 $pid
done

Nothing magical about it: the for loop scans the output from the ps command for the process_name, awk parses out the PID, and the loop performs a kill -9 for each PID returned.
Like I mentioned, nothing magical about it, but a quick, efficient method of killing what can be hundreds of processes.
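For what it’s worth, most modern Linux/Unix boxes also ship pgrep/pkill, which collapse the whole loop into a single command. The sketch below uses a throwaway sleep as a stand-in for the hung process; substitute your own process name, and preview with pgrep before pulling the trigger:

```shell
#!/bin/sh
# Sketch: the pgrep/pkill equivalent of the kill loop above.
# A backgrounded sleep stands in for a hung job; -x matches the exact name.
sleep 300 &            # stand-in for a hung recurring job
pgrep -x sleep         # preview the matching PIDs first
pkill -9 -x sleep      # then kill them all in one shot
```

The for loop works anywhere, though, which is why it is my go-to on older boxes.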

Happy hacking!