Now you're going to record a script that tests several functions of the Name Database. Testing multiple functions in a single recorded test script isn't necessarily the recommended way to build tests--Chapter 2, "Using a Modular Approach" discusses this in more detail--but for now it provides a tour of the JavaStar features. It also gives you a deeper appreciation of why you'll later want to compose tests of many small scripts instead.
Because this process contains a lot of steps--click here, type this, click that--this part of the tutorial is broken into sections. These sections are all part of one script, though, so you can't carry out only some of them and have the script run properly. If you're the type of person who learns best by reading instructions all the way through before carrying them out, you'll find this organization more readable.
Create Test Script dialog
If you created a Project File as described in "Creating a Project File," the Class and Classpath fields should already be filled in.
This brings the advanced options to the forefront. See Figure 2-3 for an example. Here you can override more project file settings, including the directory settings you defined in the Project Settings Test panel. For now, you're just going to override the logfile name.
Advanced tab for Create Test Script
In the logfile field, type tutorial.log.
The Record/Playback window and the Name Database application open.
Record/Playback window
If you get the message:
There is some problem accessing the class classname. Either:
1. It is not in the CLASSPATH.
2. It is being accessed in the wrong way.
(e.g. String is invalid, java.lang.String is correct.)
Here are some things to try:
Verify that you typed the class name without the .class extension. Make sure this class is the "main" class of your application. Use Browse to navigate to the directory and make sure the .class file is still there.
Verify that you are using a fully-qualified class name. If your class is within a package, be sure to type packageName.className. For this tutorial, the class name is TestNameDB.
Note - If you were recording a script that interacted with a Canvas component, recording with delays would be a good idea. With a Canvas, the speed of user interaction has an effect on the result, especially when combined with the speed of the system on which you run the test. Taking advantage of the delay feature gives you more control--you can later scale that delay to compensate for system speed.
You're now in record mode. Anything you do in a Name Database window will be recorded to your script.
The first step of the test will be to load the database into the Name Database application. Whether or not this first step succeeds is critical to the integrity of the test--if it does not succeed, your later results are compromised. To ensure this doesn't happen, you'll insert a synchronization comparison.
JavaStar provides two types of comparisons: verifications and synchronizations. The difference between the two is not in what you can compare (that's identical) but in how JavaStar responds to the results during playback.
With a verification, JavaStar performs the comparison, evaluating the results as you specified. If the verification fails, JavaStar checks the component again, repeating until a specified timeout interval has elapsed. If the verification hasn't succeeded by the time the timeout expires, JavaStar logs this as a verification failure and continues on with the test.
A synchronization works the same way with one exception--if the comparison fails at the timeout, JavaStar throws an exception and the script ends abnormally. Terminating the current script after a failed synchronization and indicating that the termination was abnormal is important, because a synchronization error requires recovery. (In the next chapter, you'll learn how to compose a test that automatically handles this kind of recovery.)
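The two behaviors described above can be sketched in plain Java. This is not JavaStar code--the names compareWithTimeout, verify, and synchronizeOn are invented for illustration--but it shows the shared retry-until-timeout loop, with a verification logging its failure and continuing while a synchronization throws:

```java
// VerifySketch.java -- an illustrative sketch (not JavaStar's implementation)
// of the difference between a verification and a synchronization.
import java.util.function.Supplier;

public class VerifySketch {

    // Both comparison types share this loop: retry the check
    // until it succeeds or the timeout interval elapses.
    static boolean compareWithTimeout(Supplier<Boolean> check, long timeoutMs) {
        long deadline = System.currentTimeMillis() + timeoutMs;
        while (System.currentTimeMillis() < deadline) {
            if (check.get()) return true;
            try { Thread.sleep(25); } catch (InterruptedException e) { return false; }
        }
        return check.get();
    }

    // Verification: log the failure and let the script continue.
    static boolean verify(Supplier<Boolean> check, long timeoutMs, String purpose) {
        boolean ok = compareWithTimeout(check, timeoutMs);
        if (!ok) System.out.println("VERIFY FAILED: " + purpose);
        return ok;
    }

    // Synchronization: throw, so the script ends abnormally.
    static void synchronizeOn(Supplier<Boolean> check, long timeoutMs, String purpose) {
        if (!compareWithTimeout(check, timeoutMs)) {
            throw new RuntimeException("SYNC FAILED: " + purpose);
        }
    }

    public static void main(String[] args) {
        verify(() -> false, 100, "item count matches");   // logged, script goes on
        System.out.println("script continues after failed verification");
        try {
            synchronizeOn(() -> false, 100, "correct file loaded");
        } catch (RuntimeException e) {
            System.out.println("script ended abnormally: " + e.getMessage());
        }
    }
}
```

Note how the failed verification leaves the flow of control untouched, while the failed synchronization abandons everything after the call--which is exactly why a synchronization error requires recovery.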
While you follow the steps in this section, the Record/Playback window should be open and visible on your desktop. This window contains the controls for recording (to the left) and dynamically displays the log file for this session in the text panel to the right.
If the file dialog doesn't open in the tutorial directory, navigate to that directory. Select test.db from the list of files. Click Open. The file dialog closes.
Note - JavaStar doesn't record specific mouse-clicks or movements because, if it did, the tests wouldn't be platform independent--what worked for a UNIX file browser might not work for a Windows 95 file browser. By passing only a relative file call to the dialog, JavaStar keeps your test platform-independent, even if the file dialog itself is not.
Select for Synchronization
Selection code for the object you selected appears in the synchronization select panel. Immediately below, JavaStar lists the default method of synchronization, and gives you the option to choose the default or customize the method of comparison.
The bottom panel displays the log file for this session.
The panel changes to prompt you for a purpose for the comparison. The object to compare and the method of comparison are displayed here, as well.
Synchronization prompt for purpose
In the purpose field, type Continue only if correct file loaded.
While you don't have to type a purpose for the comparison, it can be helpful to do so. JavaStar includes the purpose in the test results, making it easier for you to evaluate results. Because this string also appears in the script code, it can help you understand the script if you decide to edit it in the future.
JavaStar returns you to the first synchronization panel.
The right portion of the window changes to display recording data, and record mode resumes.
In the Name field, type Count von Count. Using the TAB key to advance through the fields, enter the following information into the remaining fields:

Now you're ready to verify whether the search function will locate the record you just added. As mentioned earlier, the verify option works much like synchronize--the dialog is almost identical. The difference is in how JavaStar processes the two types of comparisons. Synchronizations that fail throw an exception, but do not affect the pass/fail count for comparisons. In contrast, verifications that fail do not throw exceptions (so your test is not interrupted), but JavaStar notes them as comparison failures in the test log. Each type of comparison is useful in different situations.
For comparing the search results, a verification makes sense. If the search fails, it doesn't necessarily affect the integrity of the tests that follow. Noting the failure in the log is sufficient.
This part of the test uses two types of verification:
When you verify the search results list, you'll be looking for the number of items returned by the search. For this exercise, you'll obtain that value using the VerifyAny feature, which compares the return values of any of the component's methods and variables. You'll compare the return value of the list component's getItemCount() method.
To verify each field of the record that the search operation returns, you'll use the "using text" option. This is the same option you used when synchronizing to the database filename.
Search window
This operation will now search only the address2 field.
In the search field, type Transylvania.
Note - At this point, instead of selecting an object--or to override an object you already selected--you could toggle on the All visible windows option. This option would instruct JavaStar to verify all objects in all visible windows for your application under test. For this example, though, you need specific information on a single component, so leave this option unchecked.
The panel changes to show the object to verify and to prompt for a verification method.
How to Verify panel
The panel changes to show you a list of all available simple data members and methods, sorted by name.
Find the line int getItemCount() in the list, and select it.
To select the method, click anywhere on the line other than on the returns button. A black bar highlights the line to let you know it is selected. You can select multiple lines when you verify using simple methods and data members, and JavaStar will compare them all. For this exercise, though, you'll compare only getItemCount().
If you want to preview the return value to make sure this is the correct method, click the returns button next to the name. This shows you the current return value based on your interaction with the application--in this example, the return value should be new Integer(1).
Select methods to use for verification
The panel changes to show you the object you selected for comparison and the comparison method. It also prompts you for a string to identify the purpose for the comparison.
Type Verify number of items found in the purpose field.
JS.frame("Search").member("java.awt.List").verifyAnyMethod(this, false, true,
    "getItemCount", new Integer(1), "Verify number of items found");
This statement calls the verifyAnyMethod() method from the JavaStar API library.
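To make the idea behind this kind of comparison concrete, here is a minimal sketch (not JavaStar code--the method and class names here are invented for illustration) of how a tool can look up a method by name with Java reflection, invoke it, and compare its return value against an expected object, much as verifyAnyMethod compares the list's getItemCount() result to new Integer(1):

```java
// ReflectionVerify.java -- a hedged sketch of comparing a method's return
// value by name. The names verifyReturn/ReflectionVerify are invented;
// only the general reflection technique is illustrated here.
import java.lang.reflect.Method;
import java.util.ArrayList;
import java.util.List;

public class ReflectionVerify {

    // Look up a no-argument method by name, invoke it on the target,
    // and compare the result with the expected value.
    static boolean verifyReturn(Object target, String methodName,
                                Object expected, String purpose) throws Exception {
        Method m = target.getClass().getMethod(methodName);
        Object actual = m.invoke(target);
        boolean ok = expected.equals(actual);
        System.out.println((ok ? "PASS: " : "FAIL: ") + purpose
                + " (expected " + expected + ", got " + actual + ")");
        return ok;
    }

    public static void main(String[] args) throws Exception {
        List<String> results = new ArrayList<>();
        results.add("Count von Count");  // the one record the search found
        // Compare results.size() to 1, in the spirit of comparing the
        // AWT list's getItemCount() to new Integer(1).
        verifyReturn(results, "size", Integer.valueOf(1),
                "Verify number of items found");
    }
}
```

The purpose string travels with the comparison result, just as it does in the JavaStar log.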
In the purpose field, type Verify text entry. The recorded comparison looks like this:

JS.frame("Name Database").member("namedb").member("java.awt.TextField", 0).verify(this, "Count von Count", "Verify text entry");
You might need to create a test that exercises a feature of the product that, at the time you create the test, returns an incorrect value. If you insert a comparison (whether a verification or synchronization) you will be setting the test to compare to an incorrect value--meaning that as long as the tested feature returns the wrong value, the test will pass. What can you do to create a test that checks for values that are not yet correct at the time you record the script?
In this case, you can manually edit the script and replace the string JavaStar uses for comparison with the correct text. Because this is a text comparison, this is relatively easy to do. You'll learn how to edit scripts in the chapter "Adding Parameters for Flexibility."
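As a sketch of what such a hand edit might look like, suppose the application currently returns a misspelled name (the misspelling here is invented for illustration). You would record the comparison as usual, then change only the expected string in the recorded verify call to the value the feature should return:

```java
// Recorded with the (incorrect) value the application returned:
JS.frame("Name Database").member("namedb").member("java.awt.TextField", 0)
    .verify(this, "Count von Cuont", "Verify text entry");

// Edited by hand to compare against the correct value:
JS.frame("Name Database").member("namedb").member("java.awt.TextField", 0)
    .verify(this, "Count von Count", "Verify text entry");
```

Until the feature is fixed, the test fails--which is the behavior you want.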
JavaStar saves your script as a .java file and compiles the code into a .class file. It writes these files to your work directory.
Copyright © 1998
Sun Microsystems, Inc. 901 San Antonio Road, Palo Alto, CA 94303.
All rights reserved.