Let cURL do your heavy lifting

In a previous post, I talked about using cron to do your bidding. If you do a lot of web development like I do, you may have some web-based tasks that you need to automate. A good example of this could be clearing a file cache or tipping off a script to rebuild your XML site map.

One of the best ways to do that would be to use cURL and cron together. cURL is an application used to transfer the contents of a URL. Incidentally, to transfer the contents of a URL, curl has to connect to that URL, and the server has to process that URL/script… do you see where I am going with this? cURL can be used to do anything you would normally do from the URL bar of your browser.

This is the usage of cURL:

curl [options] [URL…]

Let’s say you have the following URL that you want to hit on a regular schedule:
http://SomeDomain.com/SomeFile.php?var1=abc&var2=zyz

You would use the following cURL command:
/usr/bin/curl -G -d var1=abc -d var2=zyz http://SomeDomain.com/SomeFile.php

The “-G” flag tells cURL to send a “GET” request, and each “-d” flag supplies a name/value pair that gets appended to the URL as the query string.

Note: “/usr/bin/curl” is the location where cURL is installed on my computer.

Running cURL from the command line will show you the results that get returned from the processing of the script.

To fully automate this process, add the curl command to cron. Now you are one step closer to world domination!
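
For example, a crontab entry that hits the script every night might look something like this sketch (the schedule and URL are just placeholders; adjust them for your own setup):

```shell
# Hit the script every night at 2:30 AM.
# -s keeps curl quiet so cron doesn't email you the page output;
# redirecting to /dev/null throws away whatever the script returns.
30 2 * * * /usr/bin/curl -s -G -d var1=abc -d var2=zyz http://SomeDomain.com/SomeFile.php > /dev/null
```

Drop the `-s` and the redirect if you actually want cron to email you the script’s output each run.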

CRON will do your bidding

It’s common knowledge that everyone dreams about one day having their own evil robots to do their bidding… ok, so maybe that’s just me.

I’m not an electrical engineer, so my dream of having physical robots under my control most likely won’t come true, but I am a decent programmer and computer nerd, so the option of having invisible virtual robots is very realistic.

I would like to introduce you to cron. Cron is a daemon program (I told you I liked evil robots!) that lives in most Unix and Linux powered machines (that includes you Mac users!). To sum it up, cron runs tasks on a schedule for you. For example, backups of data are important, but time consuming, and easy for a system admin or user to forget to do. Cron doesn’t forget… being the evil robot that he is, cron will do whatever you tell him to do, including grunt work like creating backups. If you want to get fancy you can mix in some PHP and a little MySQL and do all sorts of wild things, like have him manage your Twitter account, but I would never do that!

Cron jobs live in a crontab: each line has five time fields (minute, hour, day of month, month, day of week) followed by the command you want run.
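
In text form, the crontab field layout looks like this, with a few example schedules (the script paths are placeholders, of course):

```shell
# *    *    *    *    *   command to run
# |    |    |    |    |
# |    |    |    |    +-- day of week (0-6, Sunday = 0)
# |    |    |    +------- month (1-12)
# |    |    +------------ day of month (1-31)
# |    +----------------- hour (0-23)
# +---------------------- minute (0-59)

0 3 * * *     /path/to/nightly-backup.sh    # every day at 3:00 AM
*/15 * * * *  /path/to/check-something.sh   # every 15 minutes
0 9 * * 1     /path/to/monday-report.sh     # Mondays at 9:00 AM
```

An asterisk means “every value,” and `*/15` means “every 15th value,” so the second entry fires four times an hour.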

As you can see, cron is very flexible and can be commanded to do almost anything, at any time. Cron will handle anything you throw at him, and he can even email you the results if you want.

Some of the things that I use cron for include:

  • backing up files
  • backing up databases
  • pulling RSS feeds into WordPress
  • optimizing MySQL database tables
  • emailing me error reports from my server
  • generating site maps and submitting them to Google
  • a few things here and there with Twitter (shhhhh, it’s my secret!)
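
As a concrete example, the database backup job from that list might look something like this sketch, using the standard mysqldump tool (the credentials, database name, and paths here are all placeholders):

```shell
# Dump and compress the database every night at 1:00 AM.
# Note the backslash before %: the percent sign is special in crontabs
# and has to be escaped.
0 1 * * * /usr/bin/mysqldump -u backupuser -pSecret mydatabase | gzip > /backups/mydatabase-$(date +\%F).sql.gz

# An hour later, delete dumps older than 7 days so the disk doesn't fill up.
0 2 * * * find /backups -name 'mydatabase-*.sql.gz' -mtime +7 -delete
```
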

All in all, automated tasks can really free you from a lot of admin duties, make your life a lot easier, and empower you to do a lot of interesting things with your server or personal computer.

Do a Google search on cron and you will find all sorts of good resources that will teach you, better than I can, how to use cron successfully.

I will be making some posts in the future showing some of the scripts that I have cron run for me. Until then, read up on cron and start building your own evil robot army!

Solution for: MySQL server has gone away at mysqlhotcopy line 528

Recently I was backing up a large MySQL database (several hundred megabytes) with the awesome mysqlhotcopy script when I started getting the following error:

DBD::mysql::db do failed: MySQL server has gone away at mysqlhotcopy line 528.

I had no clue what that error meant; mysqlhotcopy worked great on all of my other, smaller databases. I did a little searching on my old friend Google, and after sniffing around a bit, I came up with a resolution to the problem… the script was timing out, so I just had to increase the allowed time in the /etc/my.cnf file.

Here are the steps I took:

pico /etc/my.cnf

Add these lines to the file:
interactive_timeout = 3600
wait_timeout = 3600

Save the file, then restart MySQL:

/etc/init.d/mysqld restart

I ran mysqlhotcopy again, and this time the backup completed without any errors.
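
For reference, the mysqlhotcopy invocation itself looks something like this sketch (the credentials, database name, and target directory are placeholders, not my real setup):

```shell
# Copy the database's table files to /backups while the server is running.
/usr/bin/mysqlhotcopy --user=root --password=Secret my_large_db /backups
```

Keep in mind mysqlhotcopy works on the raw table files, which is why a big database can hold its locks long enough to trip the server timeouts above.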

Troubleshooting: SoftException in Application.cpp:544

I recently moved all of my sites to a new dedicated web server. In the move, my web host helped me mirror my old applications and stand them up on the new server. This is where my problems started: several of my applications just wouldn’t run; they would throw “500 errors”.

Checking my logs, the errors looked like this:

[Tue Dec 30 09:20:18 2008] [error] [client ip.address.here] SoftException in Application.cpp:544: Directory “/home/username/public_html/someDirectory” is writeable by group

I searched for the error on Google like any good geek would.

I found several posts about the problem, and they all pointed to the way Apache was configured on my new server.

I came across this post, and it really shed light on the subject for me:

“The specified directory name has been made writable by group. That suggests that your server’s apache configuration doesn’t allow you to make directories writable by group.”

In English: you’ve got the wrong permissions set on one or more folders.

I checked the directory named in the 500 error, and I saw that its permissions were 777, world writable. I changed the directory’s permissions to 755, and Apache was happy again and let the code run.

Simple as that. My new Apache install won’t run code that sits in a group- or world-writable folder. This is a good thing for sure. If you are having issues like I was, check your permissions, and be careful what is 777!
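
If you want to hunt down the offending directories before Apache complains, something like this will do it on a server with GNU find (the path is a placeholder; run the plain find first and review the list before adding the chmod):

```shell
# List directories under the site that are group- or world-writable.
# -perm /022 matches if the group-write or other-write bit is set.
find /home/username/public_html -type d -perm /022

# Once you've reviewed the list, reset those directories to 755.
find /home/username/public_html -type d -perm /022 -exec chmod 755 {} +
```
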