Monday, 18 July 2011

Picasa Photo Manager

Some background

When I first started using Picasa I wasn’t too impressed; yet another photo editor, I thought. It was only after I discovered that photos and videos could be stored online that I realized how useful it could be as a photo backup solution. It is not as cheap as Windows SkyDrive, but at 5 USD per year it is not too bad either.

Picasa works really great in a single-user, single-machine configuration. As soon as you reinstall your OS or try to set it up on a different computer it gets ugly. After installing Picasa on my second computer I learnt the first bad thing about it: downloaded albums would have a different folder structure than my original setup, and I had to manually fiddle with the files to put them in order.

Strike one!

Another problem was keeping all my photos in sync on my NAS server. Since it is not a very powerful machine I didn’t want to run the full-blown Picasa desktop application on it; I would rather run a small-footprint service that keeps my photo collection up to date. Ideally this application shouldn’t require any user input beyond the initial set-up. Picasa will not download new files when they become available online, nor will it upload a file created locally to the online web album.

Strike two!

Before I started using Picasa Web Albums I used Snapfish to share my photos. You can upload any number of them to the free online storage and then ask your friends to create their own Snapfish accounts to view your snaps. This was a bit too painful, so when I learnt about Picasa Web Albums I was very pleased. I could finally organize my photo library into logical albums. Sharing pictures with family was great, but there was a problem – how do I share photos from a holiday without sharing those few pictures which I would rather keep private? The hard truth was that there is no way other than creating a separate album.

That was strike three!

After a few years of using Picasa I still like it, but I started looking for a better solution. I wanted Picasa Web Albums to work the same way as Dropbox: I create a folder locally and Picasa creates a web album with the same name for me; I drop a picture into the folder and it gets synchronized to my online album.

The New Era

Picasa Photo Manager
I started a small project, Picasa Photo Manager, which is supposed to address those limitations using Java and the Picasa API. I defined the initial requirements for the application as:
  • Browse Picasa Web Albums
  • Selective download of the Picasa albums (one off download)
  • Watch web albums for changes and update them with new/modified pictures
  • Watch file system folder for new albums and upload them to Picasa Web Albums
  • Watch file system folder for new/modified pictures and upload them to Picasa Web Albums
  • Edit Picasa Web Album properties (album name, date, location, visibility, etc.)
  • Multi account support
  • Copying pictures between accounts
  • Virtual albums to add granular permission management
  • Run the application in the background (minimize to tray in Windows and Mac; not sure about Linux)
  • Integrate with Picasa desktop application (not sure to what extent and how to do it yet)
  • Embed Picasa photo metadata (starred photos, face tags, tags, etc.) into the image EXIF data and use this data to update the Picasa database (could be tricky :)
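The folder-watching requirements above could be sketched as a simple polling scan. This is an illustrative Python sketch, not the actual Java implementation — all names here are mine:

```python
import os

def find_new_files(folder, known):
    """Return the set of files under folder not seen on the previous scan,
    plus the full current set to pass back in as 'known' next time."""
    current = set()
    for root, _dirs, files in os.walk(folder):
        for name in files:
            current.add(os.path.join(root, name))
    return current - known, current
```

A background service would run such a scan on a timer and queue anything new for upload through the Picasa API.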

Wednesday, 8 December 2010

Generating random strings with VBScript

How do you generate random strings of various lengths, for use as a password database in bulk user creation or for any other purpose? It is a relatively simple task with some VBScript skills. Of course the strings won’t be truly random, as VBScript can only generate pseudo-random numbers. That means that if someone knows the random seed and the entropy used to generate the passwords, it will be possible to reproduce the same set of passwords.

The script I’m presenting here generates a random number and then uses it as an index into the entropy array (i.e. the array of all characters which could be used in the password). The number of characters in the password is also random, between the PASS_LEN_MIN and PASS_LEN_MAX boundaries (inclusive). To make all passwords the same length, use the same number for PASS_LEN_MIN and PASS_LEN_MAX.

If security is the main concern then this utility probably shouldn't be used. Having said that, here are some tips on how to improve password quality:
  • expand the entropy – add special characters
  • mix the entropy – the entropy character order could be altered before the passwords are generated, to make the process even less predictable
  • use a different seed – to make the random number sequence less predictable, a user-specified seed could be used. This can be done by passing a parameter to the Randomize function:
    Randomize 767554354

    Note: Provided that the script is unchanged, using the same seed will always generate the same result file. By default no seed value is passed to the Randomize function, which then uses the current timestamp.
  • random pick – generate more passwords than required and pick a random subset

By default the script generates 100 passwords and saves them to the PasswordDictionary.txt file in the script working folder. Running the script from the Desktop will generate a password file called PasswordDictionary.txt on the Desktop.
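The same approach can be sketched in another language. Below is an illustrative Python equivalent — the entropy characters and function names are my assumptions, not the original VBScript:

```python
import random

# Entropy array and length bounds mirror the script's configuration
# (these particular characters are an assumption for illustration).
ENTROPY = "abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789"
PASS_LEN_MIN = 8
PASS_LEN_MAX = 12

def random_password():
    # Pick a random length within the inclusive boundaries...
    length = random.randint(PASS_LEN_MIN, PASS_LEN_MAX)
    # ...then pick a random index into the entropy array per character.
    return "".join(ENTROPY[random.randrange(len(ENTROPY))] for _ in range(length))

def generate(count=100):
    return [random_password() for _ in range(count)]

# A fixed seed reproduces the same password set, like Randomize 767554354.
random.seed(767554354)
passwords = generate(100)
```

Seeding the generator explicitly corresponds to passing a parameter to Randomize; omitting the seed gives timestamp-based behaviour, just as described above.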

The Random String Generator utility can be downloaded from here.

Saturday, 16 October 2010

LoadRunner Clean-up utility is a simple VBScript tool which removes all unnecessary files from a LoadRunner script folder to reduce its size. The utility is pre-configured to remove all standard files generated by LR during script or scenario execution.

All default settings can be modified by editing the first few lines of the script file. The following is the first section of the script, which defines which files and folders should be removed. Please note that, in order to simplify the listing and make it easier to read, I’ve removed the full list of files and folders for each of the four filters.

To see the full list and an explanation of VUGen files which can safely be removed from the script folder, please read this post.

Dim aFolderFilter: aFolderFilter = Array(".DS_Store", "result1")
Dim aFileNameFilter: aFileNameFilter = Array("pre_cci.c", "output.txt")
Dim aFileExtentionFilter: aFileExtentionFilter = Array("idx", "bak", "ci")
Dim aFileFilterRegEx: aFileFilterRegEx = Array("combined_*.c")

Dim aIgnoreFolders: aIgnoreFolders = Array(".Trash")

'Leave blank to use script location as a start
Dim sStartFolder : sStartFolder = ""

Const CLEANUP_SCRIPT_NAME = "CleanUp.vbs"
' You shouldn't need to change anything below this line

The first four lines of the script define the four filters which will be used to recognise files and folders to remove.
  • aFolderFilter – list of folder names which should be removed
  • aFileNameFilter – exact names of files to be removed
  • aFileExtentionFilter – exact extensions of files to be removed
  • aFileFilterRegEx – file name patterns to be removed, where an asterisk (*) represents any sequence of characters
To add a new entry, simply insert an additional parameter into the Array function call. In the following example a new folder name called data is added to the folder filter:

Dim aFolderFilter: aFolderFilter = Array(".DS_Store", "result1", "data")

Start folder
The utility will scan all files and subfolders of the folder where it is located and remove all files matching the filter description (see the description of the Trash folder below). It scans the folder structure recursively, going into all subfolders and applying the same filters to each folder.

This behaviour can be overridden by setting a start location in the script configuration section. If the sStartFolder value is set to an empty string (represented by two double quotes), the default behaviour is used and the script starts processing from the folder where it is located.

Using the following line instead of the default one will start processing from the “C:\LoadRunnerScripts” folder regardless of where the script file is located:

Dim sStartFolder : sStartFolder = "C:\LoadRunnerScripts"

The clean-up script will always process folders before attempting to remove any files. This ensures that no individual files will be removed by the file-matching filters when the whole folder is due to be removed.

The script will always attempt to locate a custom CleanUp script in each subfolder it processes. If it finds a CleanUp.vbs file inside a subfolder, it will delegate responsibility for processing that subfolder to the custom script and move on to the next folder.

The name of the CleanUp script can be changed by setting the value of the constant called CLEANUP_SCRIPT_NAME.

Trash folder
The clean-up utility will not remove any file or folder from the machine; it only moves them to the Trash folder located in the same folder as the script. If the script is copied to and run from “Desktop\LR_Repository”, the Trash folder will be created within “Desktop\LR_Repository”.

The utility recreates the folder structure of the original files and folders within the Trash folder. If the file to be removed is “Desktop\LR_Repository\Scripts\HR\HolidayApproval\output.txt”, it will be moved to “Desktop\LR_Repository\.Trash\Scripts\HR\HolidayApproval\output.txt”.

The same rule applies to folders.
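To summarize the behaviour described above, here is an illustrative Python sketch of the same flow — folders first, then file filters, with matches moved under .Trash rather than deleted. This is not the VBScript utility itself; the names are mine, and the asterisk patterns are treated as shell-style globs:

```python
import fnmatch
import os
import shutil

# Filter lists mirror the script's configuration section.
FOLDER_FILTER = [".DS_Store", "result1"]
FILE_NAME_FILTER = ["pre_cci.c", "output.txt"]
FILE_EXT_FILTER = ["idx", "bak", "ci"]
FILE_PATTERN_FILTER = ["combined_*.c"]  # asterisk = any character sequence
TRASH = ".Trash"

def matches(name):
    """True if the file name hits any of the three file filters."""
    ext = name.rsplit(".", 1)[-1] if "." in name else ""
    return (name in FILE_NAME_FILTER
            or ext in FILE_EXT_FILTER
            or any(fnmatch.fnmatch(name, pat) for pat in FILE_PATTERN_FILTER))

def move_to_trash(start, trash, path):
    """Recreate the original folder structure under .Trash instead of deleting."""
    dest = os.path.join(trash, os.path.relpath(path, start))
    os.makedirs(os.path.dirname(dest), exist_ok=True)
    shutil.move(path, dest)

def clean(start):
    trash = os.path.join(start, TRASH)
    for root, dirs, files in os.walk(start):
        dirs[:] = [d for d in dirs if d != TRASH]  # never descend into .Trash
        # Folders first, so files inside a removed folder
        # are not matched individually.
        for d in list(dirs):
            if d in FOLDER_FILTER:
                move_to_trash(start, trash, os.path.join(root, d))
                dirs.remove(d)
        for f in files:
            if matches(f):
                move_to_trash(start, trash, os.path.join(root, f))
```

Running `clean` on a repository folder leaves untouched source files (e.g. Action.c) in place while the result1 folder, output.txt and combined_*.c files end up under .Trash with their paths preserved.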

Sunday, 10 October 2010

Put your LoadRunner scripts on diet!

There are multiple ways of making an LR script folder a bit smaller, which can be beneficial if the script needs to be archived and stored for extended periods of time. Just to name a few methods:

  • Saving and opening LoadRunner scripts in ZIP archives
  • Saving LoadRunner scripts in compressed Windows folders
  • Removing unnecessary files
Maintaining a big script repository and taking regular backups of the entire repository can be a very disk-space-hungry exercise, particularly if all runtime data files and execution results are also unnecessarily backed up.

As an example, a simple few-step script could have a size of about 10 MB after recording, and the folder could grow to 13 MB after script execution, which is about a 30% increase. This increase in size is caused by run-time data files, logs, VUGen execution result folders and backup files which are not required for the script to work properly.

I want to make clear what I mean by VUGen execution result folders: I don’t mean the result folder which is created during LR scenario execution. VUGen results are stored within the script folder, are created when the script is run on its own from within VUGen, and are called result1, result2, result3, etc.
The simplest way to save some space quickly (which I’ve been using for the past few years) is to remove the VUGen execution result folders, which will usually release a few hundred MB of disk space without removing any vital part of the script.

Additional space can be released by removing a variety of other files, which I will now explain.


Folder names:
  • data – useful folder containing HTML files, pictures and other items captured by LoadRunner during recording. I would recommend keeping that folder within the script folder, but if disk-space usage is crucial and the script size needs to be reduced for archiving, this folder can also be deleted. It is not required for successful execution of the test script.
  • result1 – default result directory for VUGen execution results. Note that the tester can specify a different result folder manually. LR will create a folder with the next number if it is not possible to obtain full rights to the result1 folder.
  • .DS_Store – this item is not created by LR or even by the Windows operating system, but by Mac OS. It can appear if someone accesses the script folder from a Mac, which might be the case if the scripts are stored in a shared location such as a corporate shared drive.

File extensions:

  • idx – binary index file used to store parameter values
  • bak – backup files for the script, output, and other types of VUGen files
  • ci – the final Compiled Image of the LR script

Full file names:
  • pre_cci.c – concatenation of all C code and the lrun.h header file
  • output.txt – the same text which is displayed in the LR GUI Replay Log
  • options.txt – options passed to the LR compiler pre-processor
  • logfile.txt – additional log file
  • logfile.log – additional log file
  • mdrv_cmd.txt – parameters passed to the mdrv.exe utility, which compiles and runs the LR script
Other files with random filename parts (the asterisk represents the random part of the file name):
  • combined_*.c – concatenated/combined list of header files used by LR script
  • mdrv*.log – log file for the mdrv.exe process
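The asterisk in those patterns behaves like a shell wildcard. As a quick illustration (in Python, with made-up file names), fnmatch-style matching picks out exactly those files:

```python
import fnmatch

# A few file names as they might appear in a script folder (illustrative).
files = ["combined_Action.c", "Action.c", "mdrv_cmd.txt",
         "mdrv1234.log", "output.txt"]

# The asterisk stands for any sequence of characters, as in shell globbing.
removable = [f for f in files
             if fnmatch.fnmatch(f, "combined_*.c")
             or fnmatch.fnmatch(f, "mdrv*.log")]
```

Only combined_Action.c and mdrv1234.log match the wildcard patterns; mdrv_cmd.txt and output.txt are caught by the exact-name list instead.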
All of those files can be removed manually with the help of the Windows built-in search utility. In order to simplify that task I’ve built a small but mighty VBScript tool which can automate it.

To learn more about the clean-up tool for LR script folders please read on here. You will find a link to the utility and a brief explanation of how to configure it to your liking.

If you want to understand how the LoadRunner compiler works "under the hood", have a look at the LoadTester website.

Friday, 8 October 2010

On-line tools useful in scripting

With the arrival of Web 2.0 technologies such as AJAX and its JSON-based variation (AJAJ), performance testers have to deal with new kinds of requests and responses.
Although the JSON structure is more lightweight, it is harder to read JSON code than it is to read XML.
Code prettifiers can make it a bit simpler to understand the structure of a JSON object, but it is still somewhat difficult to read in plain-text form.

While looking for a solution I came across this fantastic JSON to HTML converter tool, which is available online at:
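For a quick offline alternative, any JSON library can at least re-indent the text. A minimal Python sketch (the sample JSON is made up):

```python
import json

# A compact JSON response as it might appear in a recorded request.
raw = '{"user":{"name":"John","roles":["tester","admin"]},"active":true}'

# Re-serialize with indentation so the nesting becomes visible.
pretty = json.dumps(json.loads(raw), indent=2)
print(pretty)
```

This doesn't render an HTML tree like the online converter, but it makes the structure readable enough for most debugging.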

Posix timestamp
Occasionally a web application will use timestamps in the web request or response in POSIX/Unix time format. It is a 10- or 13-digit number which represents the number of seconds (for a 10-digit number) or milliseconds (for a 13-digit number) since 1 January 1970.

Since converting POSIX time to a human-readable form by hand is probably not an option for anyone, here is a link to an online tool which can be of use to everyone. It allows converting a POSIX timestamp to a human-readable version and vice versa:
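The conversion itself is also easy to script. A small Python sketch, assuming the 10-digit/13-digit convention described above:

```python
from datetime import datetime, timezone

def from_posix(value):
    """Convert a 10-digit (seconds) or 13-digit (milliseconds)
    POSIX timestamp to a UTC datetime."""
    if len(str(int(value))) >= 13:
        value = value / 1000.0  # milliseconds -> seconds
    return datetime.fromtimestamp(value, tz=timezone.utc)
```

For example, both 1286668800 and 1286668800000 map to 10 October 2010, 00:00 UTC.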

XPath test bed
While working with XML documents it is sometimes useful to refer to individual nodes by their location and attributes. Having to re-run the whole test each time we get the XPath query wrong can be very frustrating. This online tool allows uploading an XML document and running XPath queries against it:
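For quick offline experiments, Python's xml.etree supports a useful subset of XPath (the sample document below is made up):

```python
import xml.etree.ElementTree as ET

# A small test document for trying out queries.
doc = ET.fromstring(
    "<users>"
    "<user role='tester'><name>Anna</name></user>"
    "<user role='admin'><name>John</name></user>"
    "</users>"
)

# Select the name of every user with role='admin'.
admins = [u.find("name").text for u in doc.findall(".//user[@role='admin']")]
```

Adjusting the predicate and re-running takes seconds, which beats re-running a whole test to discover a typo in the query.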

URL encode and decode
Any string passed to the server in the query string has to be URL-encoded before it can be sent.
In most cases it is just a single value or a few words, which can easily be read with some basic understanding of URL encoding.

For those more complex cases where a longer value is URL-encoded, the following online tool can be used. It can encode and decode any string:
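The same operation is available in the Python standard library, which is handy for scripted checks (the sample string is made up):

```python
from urllib.parse import quote, unquote

# safe="" forces even '&' and '=' to be percent-encoded.
encoded = quote("name=John Smith&city=London", safe="")
decoded = unquote(encoded)
```

`quote` turns the space into %20, '=' into %3D and '&' into %26; `unquote` reverses the transformation exactly.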

Syntax highlighter
This last utility is probably less useful than the others but can nevertheless be handy at times. As the name suggests, a syntax highlighter applies style colours to many types of code, making it easier to read.

Tuesday, 28 September 2010

Random data generation tool

In performance testing it is often required to have a sufficient amount of volume data (i.e. data present in the database before commencing the performance test) and parameter data (i.e. data used for parametrisation) for a realistic load test.
Random data generator

Although random names, street names and email addresses can be generated using random characters, it is sometimes required to use realistic-looking test data. This can happen if the system under test validates data against some kind of pattern in order to filter out potential attacks.

Random data generator is a simple Java-based tool which can generate random data based on a user-specified input file. Generated data is saved in CSV format, which can then be opened in Excel for easy viewing.

The tool comes with a sample database of base data (in CSV format) which should be sufficient to generate enough quality random test data for most projects. The bundled data source can be further expanded or even replaced with your own data, making the output more relevant for your specific use.

The following summarizes the types of test data the tool can generate:
  • Name – from data source
  • Surname – from data source
  • Company name – from data source
  • Date – between two boundary dates
  • Flat number – random number
  • House number – random number
  • House name – not implemented
  • Street name – from data source
  • Town name – from data source
  • Email address – combination of name, surname and company name
  • Phone number – random number with a predefined prefix
  • Generic random number – random number between two boundary values
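A few of those generators are simple enough to sketch. The following illustrative Python snippet mirrors the rules above for email addresses, phone numbers and dates — the sample data and names are mine, not the tool's bundled CSV:

```python
import random
from datetime import date, timedelta

# Tiny stand-ins for the bundled CSV data sources (values are made up).
NAMES = ["John", "Anna", "Piotr"]
SURNAMES = ["Smith", "Kowalski", "Brown"]
COMPANIES = ["example.com", "acme.com"]

def random_email():
    """Email address - a combination of name, surname and company name."""
    return (f"{random.choice(NAMES)}.{random.choice(SURNAMES)}"
            f"@{random.choice(COMPANIES)}").lower()

def random_phone(prefix="+44", digits=9):
    """Phone number - a random number with a predefined prefix."""
    return prefix + "".join(str(random.randrange(10)) for _ in range(digits))

def random_date(start, end):
    """Date between two boundary dates (inclusive)."""
    return start + timedelta(days=random.randrange((end - start).days + 1))
```

Writing each generated row out with the csv module would reproduce the tool's CSV output for viewing in Excel.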

Since the tool is Java-based it can be run on any operating system which supports Java.

The tool can be downloaded from here.

Wednesday, 22 September 2010

Basic Unix/Linux commands

In any type of testing (be it functional or non-functional) some basic Linux/Unix skills can be very beneficial, if the test environment is using that operating system. Testers can connect to the server and perform basic tasks on their own, relieving the development team of some duties such as:
  • restarting the application server and/or the operating system
  • accessing server logs to verify that the scripts are not causing server-side exceptions which are not displayed in the browser
  • parsing server logs to calculate the application load profile (i.e. number of concurrent users, pauses between requests, etc.)
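The log-parsing task in the last bullet can be illustrated with a short sketch. The log format below is entirely hypothetical — real server logs will differ:

```python
from collections import Counter

# Hypothetical access-log lines: "ISO-timestamp user request".
log_lines = [
    "2010-09-22T10:00:01 alice /login",
    "2010-09-22T10:00:03 bob /login",
    "2010-09-22T10:00:45 alice /report",
    "2010-09-22T10:01:10 bob /logout",
]

def requests_per_minute(lines):
    """Bucket requests by minute - a first step towards a load profile."""
    return Counter(line.split("T")[1][:5] for line in lines)

def concurrent_users(lines):
    """Distinct users seen in the log - a rough concurrency upper bound."""
    return len({line.split()[1] for line in lines})
```

With real logs the split would be replaced by a pattern matching the actual line layout, but the counting idea is the same.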
The most common way to connect to a Unix system is to use an SSH client such as PuTTY. This utility connects to the remote Unix server and allows unlimited access to it, provided that the user has sufficient access rights.
Open Putty session

It is considered bad practice to share the Administrator’s password (or, as it is called in Unix environments, the root password) with everyone, and thus you should always use a username with limited privileges for day-to-day tasks.

The basic commands anyone connecting to a Unix server should know can be categorised into a few groups.
  • Directory navigation and discovery
    • cd - change current folder to another (e.g. “cd /etc/init.d”)
    • ls - lists the content of the current folder; often used with the -lah options for additional info about file size, last modification date and permissions (e.g. “ls -lah”)
    • pwd - displays current folder (e.g. "pwd")
    • find - searches for files in a specified folder (e.g. “find /var/log -name "*.log"”)
    • du - checks file or folder size (e.g. “du -hs /var/log”)
  • Text file processing
    • vi - very popular text editor for Unix operating systems; might be a bit difficult for a non-experienced user. Easier alternatives would be “nano” and “pico”
    • cat - prints file content in the console (e.g. “cat /var/log/messages”)
    • less - text viewer which allows backward navigation (e.g. “less /etc/passwd”); to quit press q
    • echo - prints text passed to the command in the console or sends it to a file (e.g. “echo "Hello World" > /tmp/filename.txt”). Redirecting an empty string can also be used to clear the content of a file (e.g. “echo -n "" > /tmp/zimbra.log”)
    • grep - filters the file or result of other command and prints lines containing search term (e.g. “cat /tmp/server.log |grep error” or “ps aux |grep apache”)
    • sed - command line version of the search and replace utility (e.g. “cat file.txt| sed -e "s/SearchFor/ReplaceWith/g" > ResultFile.txt”)
    • awk - pattern scanning and text processing language. The following example parses the /etc/passwd file using a colon as the field delimiter (-F:) and prints the content of the first column for each record (e.g. “cat /etc/passwd | awk -F: '{ print $1 }'”)
  • Operating System - Process control
    • ps - lists all processes running in the system (e.g. for Linux “ps aux” and for Unix “ps -ef”)
    • kill - forces the process to quit (e.g. “kill -9 ProcessID”); Process id can be found by executing ps command
    • killall - terminates all processes by the process name (e.g. “killall zimbra”)
    • free - displays the amount of free and used memory in the system (e.g. “free”)
    • top - lists all processes and sorts them by CPU usage (e.g. “top”); to quit press q

If the only reason for accessing a Unix server is to read/modify files or to transfer files between the server and the local workstation, it might be better to use a tool with a graphical user interface (which Windows users will be more used to) such as WinSCP.

WinSCP uses the SSH protocol (the same protocol PuTTY uses) to connect to the Unix server and transfer files over a secure connection to and from the local host.

The tool provides two different connection options (i.e. SCP and FTP). FTP will only work if an FTP server is running on the remote machine, whereas SCP will always work as it uses the SSH connection for file transfers. Note that SCP sends all data over the secure SSH channel, while plain FTP does not.

Both tools are free to use, and their small size (a few MB) makes them ideal to run from a memory stick.