Posts tagged 'script'

Automatically upload screenshots in XFCE4

published on February 13, 2012.

XFCE4 has a nice little tool for making screenshots - xfce4-screenshooter. My only gripe with it is that it can't automatically upload the images to a server and give me the URL to the image (to be honest, it can, but it uploads the images to a shady looking website, and I don't like that). And then one day I saw Evan Coury's GtkGrab - a set of scripts which does exactly what I want! But, sadly, that's for Gnome. So, based on Evan's work, I put together this little script:

#!/bin/bash
# based on GtkGrab by @EvanDotPro
# DOMAIN, LOCALPATH and REMOTE are placeholders - adjust them to your setup
DOMAIN='http://example.com/s/'
LOCALPATH=$HOME/screenshots/
REMOTE='user@example.com:/var/www/s/'
LIMIT=5
I=0
function rename_file()
{
    NEWFILE=$(echo $1 | md5sum | cut -c-5)'.png'
}
# take the screenshot, grab the newest file from the local path, rename it
xfce4-screenshooter -r --save=$LOCALPATH
LOCALFILE=$(ls -tr $LOCALPATH | tail -n 1)
rename_file $LOCALFILE
# on a name collision, re-hash; give up after $LIMIT tries
while [ "$I" -lt "$LIMIT" -a -f "$LOCALPATH$NEWFILE" ]
do
    rename_file $NEWFILE
    I=`expr $I + 1`
done
mv "$LOCALPATH$LOCALFILE" "$LOCALPATH$NEWFILE"
scp "$LOCALPATH$NEWFILE" $REMOTE
echo "$DOMAIN$NEWFILE" | xclip -selection clipboard
notify-send "Screenshot uploaded, URL in clipboard"
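The renaming step boils down to hashing the old filename and keeping the first 5 characters of the digest. It can be tried on its own; the filename below is just an example:

```shell
# new name = first 5 chars of the md5 of the old name, plus .png
echo "screenshot-2012-02-13.png" | md5sum | cut -c-5
```

Whatever the input name is, the result is always a short 5-character hex string, which keeps the public URLs tidy.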

Save this script somewhere on your computer, configure the DOMAIN, LOCALPATH and REMOTE variables, set the script to be executable and then create a shortcut combination for it via Settings -> Keyboard -> Application Shortcuts. Programs you'll need to have installed for this to work are xfce4-screenshooter, xclip and notify-send. If you don't want to be prompted for the password/passphrase for the scp command each time, set up a passwordless login for your user on your remote server.

Happy hackin’!

Benchmarking pages behind a login with ab

published on November 09, 2011.

Tonight I decided to relax a bit, and what better way of relaxing is there for a geek than to do some bash scripting?! So for fun and no profit I decided to try to benchmark pages that are behind a login with ab, the Apache HTTP server benchmarking tool. Turns out, it's pretty easy after reading some man pages ;)

ab's help page gives a few possible leads. We can POST data with the -p option, which would be great if we wanted to benchmark the login process itself. But we want to test the page after the login. So we'll need ab's -C option, which allows for passing cookies in cookie-name=value pairs.

The login process itself is done with curl, as it allows us to POST data to a server and store the cookies received from the server in a cookie jar. curl writes the cookies in the Netscape cookie file format, whatever that is. A sample file looks like this:

# Netscape HTTP Cookie File
# This file was generated by libcurl! Edit at your own risk.
localhost	FALSE	/	FALSE	0	PHPSESSID	[RANDOM_SESSION_ID]

From this output we're interested in the [RANDOM_SESSION_ID] cookie value, as the cookie name is simply PHPSESSID and we can just hard-code it. To get the value, we use some obscure *nix magic: grep and cut. grep to grep the line with the PHPSESSID cookie and cut to cut out the 7th column from that line. Easy!
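To see the extraction in isolation, here's the same pipeline run against a made-up, tab-separated cookie-jar line where the value is abc123:

```shell
# fake one-line cookie jar; field 7 is the cookie value
printf 'localhost\tFALSE\t/\tFALSE\t0\tPHPSESSID\tabc123\n' > /tmp/demojar.txt
grep PHPSESSID /tmp/demojar.txt | cut -f 7
# prints: abc123
```

cut splits on tabs by default, so no -d option is needed here.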

Now that we have the value of the cookie, we just pass it along with ab and done! We're benchmarking pages behind a login.

The entire script is:



#!/bin/bash
# LOGIN_URL and PAGE_URL are placeholders - point them at your own app
LOGIN_URL='http://localhost/login'
PAGE_URL='http://localhost/members/'
COOKIE_JAR='/tmp/cookiejar.txt'

echo "Logging in..."
# POST the credentials, store the received cookies in the cookie jar
curl -s -c $COOKIE_JAR -d username=user -d password=h4x0r $LOGIN_URL > /dev/null

echo "Getting the session id..."
PHPSESSID=$(grep PHPSESSID $COOKIE_JAR | cut -f 7)

echo "The session id is: $PHPSESSID"
echo "=================="

# pass the session cookie along so ab hits the page as a logged-in user
ab -n 10 -c 10 -C PHPSESSID=$PHPSESSID $PAGE_URL

The script is also on Github here.

Tip: use ab's -v option to test for HTTP codes and/or redirects to see if you are really on the page you want to be.

Happy hackin’!

Backup script for mysql

published on November 05, 2010.

This post is more of a reminder for myself. Anywayz, a little bash script that backs up a database, gzips it and deletes all backups older than 3 days.


#!/bin/bash
# DBUSER, DBPASS, DBDB and BACKUPROOTDIR are placeholders - adjust to your setup
NOW=$(date +"%Y-%m-%d-%H-%M-%S")
BACKUPSQL="$BACKUPROOTDIR/mysqlbackup-$NOW.sql"

mysqldump -u$DBUSER -p$DBPASS $DBDB > "$BACKUPSQL"
gzip "$BACKUPSQL"
find $BACKUPROOTDIR -type f -name "mysqlbackup*" -mtime +3 | xargs rm -f
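To run the backup automatically, a cron entry like the one below would do; the path is hypothetical, assuming the script was saved as

```shell
# crontab fragment (placeholder path): dump the database every night at 03:00
0 3 * * * /home/user/bin/
```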

Kudos to @zsteva for looking at it to spot any errors I might have made.

Tags: backup, mysql, script, shell.
Categories: Development.

pywst - setting up web projects quickly

published on February 22, 2009.

I wrote a Python script for automating the steps required to setup a web project environment on my local dev machine that runs on Ubuntu. Called it pywst: Python, Web, Svn, Trac. That's the best I could do, sorry :P

The main steps for setting up a new project are:

  • Create a virtual host
  • Add it to /etc/hosts
  • Enable the virtual host
  • Import the new project to the SVN repository
  • Checkout the project to /var/www
  • Create a TRAC environment for the project
  • Restart Apache

After these steps I have http://projectName.lh/ which points to /var/www/projectName/public/, SVN repo under http://localhost/repos/projectName/ and the TRAC environment under http://localhost/trac/projectName/.
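The first few steps from that list could look roughly like this in Python. This is a hypothetical sketch assuming Ubuntu's Apache layout, not pywst's actual code:

```python
# Rough sketch of pywst's first steps (hypothetical helper, Ubuntu-style paths)
import subprocess


def vhost_config(project):
    # Apache virtual host pointing projectName.lh at /var/www/projectName/public
    return (
        "<VirtualHost *:80>\n"
        "    ServerName {0}.lh\n"
        "    DocumentRoot /var/www/{0}/public\n"
        "</VirtualHost>\n"
    ).format(project)


def setup_vhost(project):
    # write the vhost, map the hostname, enable the site and restart Apache
    with open("/etc/apache2/sites-available/{0}".format(project), "w") as f:
        f.write(vhost_config(project))
    with open("/etc/hosts", "a") as f:
        f.write("\t{0}.lh\n".format(project))
    subprocess.call(["a2ensite", project])
    subprocess.call(["/etc/init.d/apache2", "restart"])
```

Writing under /etc and restarting Apache both need root, which is why the script has to run with sudo powers.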

As I have this ability to forget things, I always forget a step or 2 of this process. Thus, I wrote pywst (note: it's a txt file; to use it, save it to your HDD and rename it). It's not the best and nicest Python script ever written, but it gets the job done. All that needs to be done to set up a project with pywst is:

sudo ./ projectName

2 things are required: to run it with sudo powers and to provide a name for the project.

Future improvements

The first, and the most important, is to finish the rollback() method. Right now it only exits pywst when an error occurs, but it should undo all the steps made prior to the error.
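One way to sketch that undo behaviour (again hypothetical, not pywst's actual code): record an undo action right after each step succeeds, and replay them newest-first when a later step fails.

```python
# Sketch of a rollback helper: undo completed steps in reverse order
class Rollback:
    def __init__(self):
        self.undo_steps = []

    def register(self, func, *args):
        # call this right after a step succeeds, with its undo action
        self.undo_steps.append((func, args))

    def run(self):
        # replay the undo actions, newest first
        while self.undo_steps:
            func, args = self.undo_steps.pop()
            func(*args)
```

For example, after importing the project into SVN, the script would register a step that deletes it from the repository again; a failure in a later step then cleans everything up by calling run().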

Second, to make it work on other distros, not only on Ubuntu. That would require me getting those other distros, setting them up, looking at where they store Apache and stuff, where the default document root is, etc. Hmm… This will take a while :)

Third, support PHP frameworks - Zend Framework, CodeIgniter and CakePHP - ZF is a must :P By support I mean creating the basic file structure for them automagically.


Robert Basic


Software developer making web applications better.


Robert Basic © 2008 — 2020