Tuesday, May 26, 2009

Application Security: OWASP top 10

By: Avinash K Tiwari

The Open Web Application Security Project (OWASP) is a worldwide free and open community focused on improving the security of application software. The OWASP community includes corporations, educational organizations, and individuals from around the world, working together to create freely available articles, methodologies, documentation, tools, and technologies. OWASP's purpose is to find and fight the causes of insecure software.

Official web site: www.owasp.org

OWASP’s most successful projects include the book-length OWASP Guide and the widely adopted OWASP Top 10 awareness document.

In this post, I am going to focus on what the OWASP Top Ten is all about.

The Open Web Application Security Project (OWASP) Top Ten Project provides a minimum standard for web application security. It lists the ten most critical web application security vulnerabilities, representing a broad consensus. Project members include a variety of security experts from around the world who have shared their expertise to produce this list. You should consider adopting these standards and begin verifying that your web applications do not contain these security flaws. Addressing the OWASP Top Ten is an effective first step towards changing your software development culture into one that produces secure code for your web applications.

Following are the OWASP Top 10 vulnerabilities, with a brief description of each:

Cross-Site Scripting (XSS) flaws:
Hackers can impersonate legitimate users and control their accounts.
Impact: Identity theft, sensitive information leakage, …
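
As a minimal illustration (the page snippet and variable names are mine, not from this post), the root cause of XSS is untrusted input echoed into a page without encoding; HTML-escaping that input is the basic defence:

```python
# Sketch of the XSS root cause and its basic fix: HTML-escape untrusted
# input before echoing it into a page.
from html import escape

user_input = '<script>alert("stolen cookie")</script>'

unsafe_page = "<p>Hello " + user_input + "</p>"        # the script would execute
safe_page = "<p>Hello " + escape(user_input) + "</p>"  # rendered as inert text

print("<script>" in unsafe_page)  # True
print("<script>" in safe_page)    # False
```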

Injection flaws:
Hackers can access backend database information, alter it, or steal it.
Impact: Attackers can manipulate queries to the database, LDAP, or other systems.
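
A short sketch of the flaw and its standard fix, using Python's built-in sqlite3 module for illustration (the table and inputs are hypothetical):

```python
# Sketch of SQL injection and the standard fix: parameterized queries
# instead of string concatenation.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

attacker_input = "x' OR '1'='1"

# Vulnerable: the input becomes part of the SQL text itself
vulnerable = conn.execute(
    "SELECT * FROM users WHERE name = '" + attacker_input + "'").fetchall()
print(len(vulnerable))  # 1 -- the injected OR clause matched every row

# Safe: the driver passes the input as data, never as SQL
safe = conn.execute(
    "SELECT * FROM users WHERE name = ?", (attacker_input,)).fetchall()
print(len(safe))  # 0 -- no user is literally named "x' OR '1'='1"
```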

Malicious File Execution:
Hackers can execute shell commands on the server, up to gaining full control.
Impact: Site modified to transfer all interactions to the hacker.

Broken Authentication and Session Management:
Session tokens are not guarded or invalidated properly.
Impact: A hacker can "force" a session token on a victim; session tokens can be stolen after logout.

Cross-Site Request Forgery (CSRF):
Attackers can invoke "blind" actions on web applications, impersonating a trusted user.
Impact: Blind requests to a bank account transfer money to the hacker.

Information Leakage and Improper Error Handling:
Attackers can gain detailed system information.
Impact: Leaked system information may assist in developing further attacks.

Insecure Storage:
Weak encryption techniques may lead to broken encryption.
Impact: Confidential information (SSNs, credit cards) can be decrypted by malicious users.

Insecure Communication:
Sensitive information is sent unencrypted over an insecure channel.
Impact: Unencrypted credentials are "sniffed" and used by a hacker to impersonate the user.

Failure to Restrict URL Access:
A hacker can forcefully browse to and access a page past the login page.
Impact: Hacker can access unauthorized resources.

Insecure Direct Object Reference:
The web application returns the contents of a sensitive file (instead of a harmless one).
Impact: Attackers can access sensitive files and resources.

We will be discussing each of these vulnerabilities in detail in the coming posts.

More information about these vulnerabilities is available on the OWASP website: http://www.owasp.org/index.php/OWASP_Top_Ten_Project

(Copyrighted by CresTech Software Systems Pvt. Ltd.)

Your Testing Partner


http://www.qacampus.com
http://www.crestech.in
http://www.crestechsoftware.com.au

Sample code to click on dynamic link using Browser DOM

By Navneesh Garg

Let us try to understand a practical scenario. You have a web page on which the total number of links changes dynamically. The links on this page are not static; they are created dynamically based on inputs on the previous page. The user needs to click on a link with a particular title on this page.

There can be multiple solutions to this problem. Below is a solution which uses the browser's Document Object Model (DOM) to browse through the links and click on the specified link.

Solution:

1. Use the browser DOM to get a reference to the web page.
2. Get a reference to the collection of links on the web page.
3. Loop through each link and, based on the required property match, click on the required link.

Sample Code

************************************************************************

Function ClickLinkByTitle(Expected_Title)  'the original listing left the function unnamed

    Set obj = Browser("Simple Validation").Page("Simple Validation").Object.body.document
    Set Linkcollections = obj.links

    msgbox "No. of links is " & Linkcollections.length
    Dim counter
    counter = 0
    For Each Element In Linkcollections

        strTitle = Element.title  'native DOM property; GetROProperty applies only to QTP test objects
        If strTitle = Expected_Title Then
            Element.Click
            Exit For
        Else
            counter = counter + 1
        End If

    Next

End Function
************************************************************************

Another possible solution could be to use a Description object to get a reference to all link objects on the page.


HTTP (’A Beginner’s Blog to Performance testing’ continued…..)

By Happy Himanshu Gupta

As I read more on performance testing, I thought of trying it on a practical application: testing my college website. The search results on Google presented the name of an open-source performance testing tool named OpenSTA. It was not difficult to get familiar with the tool's interface, and I began with what is called script recording of the application.

After the recording was complete, I opened the script to see what it actually looks like. You know guys, something more horrible than any general software code appeared on the screen; I had never read such text before in any of my course books. It was then that I realized that something more important needed to be learned before I played around with performance measuring tools. The concept I had missed reading about was HTTP, the global language of the web.

Most of us open our mailbox daily and go through a number of other websites. But do we ever think about what helps us access our mails and the unlimited information around the globe? Well, I never thought of it before this question actually struck me a few days ago. Is my dumb machine intelligent enough to obey my orders every time I ask it to do so? Well, friends, let's now understand the way the computer listens to our request and returns the response.

HTTP, which stands for "Hypertext Transfer Protocol", is a common language which lays the path of communication for all web clients, servers, and related web applications. HTTP clients and HTTP servers together make up the basic components of the World Wide Web (WWW).

The browser that we use every day is the one that plays the role of a web client. When we wish to access a page, say, www.google.com, the browser sends an HTTP request to the web server sitting at the backend. On accepting the request, the server searches for the desired object/page. On a successful search, it returns the object to the waiting browser in the form of an HTTP response.

The content that transfers over the web is composed of various resources. Web content can be as simple as a static file: images, HTML content, videos, movies, Word files, and so on. Web content also includes dynamic resources, which are generated on demand.

Now, the question that comes to my mind is how to locate these resources on the World Wide Web. The Uniform Resource Locator (URL) is the most common form of resource identification on the WWW. The descriptive format of a URL tells us the way to fetch a resource from a particular location on the server.
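
For instance, Python's standard library can split a URL into the pieces a client needs to locate a resource (the example URL is hypothetical):

```python
# Sketch of how a URL breaks down into the parts that tell a client
# where and how to fetch a resource: scheme, host, path, query.
from urllib.parse import urlparse

url = "http://www.example.com/search/results.html?q=http"
parts = urlparse(url)

print(parts.scheme)  # http
print(parts.netloc)  # www.example.com
print(parts.path)    # /search/results.html
print(parts.query)   # q=http
```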

Now let's take a look at the concepts of HTTP request and response messages. The messages which are sent from web clients to web servers are called request messages; the messages which the web client receives from the server are called response messages. The structures of request and response messages are very similar. HTTP messages are composed of three primary parts: the start line, header fields, and body.

Start Line:
All HTTP messages begin with a start line. The start line of a request message asks the server to do something to a resource: it contains a method which describes the function that the server needs to perform on the resource described by the URL. Similarly, in a response message, the start line conveys the status information back to the client, thus completing the operation.

Header Fields:
HTTP headers add more information to request and response messages, appearing after the start line. They are basically name-value pairs that give additional information about the message being transferred. A simple HTTP header has the following syntax: a name, followed by a colon (:), followed by optional whitespace, followed by the field value, followed by a CRLF.

Body:
HTTP messages were actually designed to transfer this part of the message, which carries varied kinds of digital data. Images, videos, HTML documents, software applications, e-business data, e-mail content, and so on all form part of the arbitrary binary data contained in the HTTP body. Of course, the body can also contain text.
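
The three parts can be seen by composing a request message by hand. This is a sketch of the message layout only, not a working HTTP client (the helper name and header values are my own):

```python
# Compose the three parts of an HTTP request message: start line,
# header fields, and body, separated from the headers by a blank line.

def build_request(method, path, headers, body=""):
    # Start line: method, resource path, protocol version
    start_line = f"{method} {path} HTTP/1.1"
    # Header fields: name, colon, optional whitespace, value, each ending in CRLF
    header_lines = [f"{name}: {value}" for name, value in headers.items()]
    # A blank line separates the headers from the body
    return "\r\n".join([start_line, *header_lines, "", body])

request = build_request("GET", "/index.html",
                        {"Host": "www.google.com", "Accept": "text/html"})
print(request.split("\r\n")[0])  # GET /index.html HTTP/1.1
```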

Friends, I would recommend you all read more on HTTP and explore these terms further. I personally find it interesting and a rich source for understanding areas of the World Wide Web which are difficult to comprehend. HTTP also forms the core of performance testing for any web application.

I will continue with a few leftover headings covered under HTTP in my next blog. Till then, happy reading… :)



SOA Testing Simplified (Series-II)

By Pallavi Sharma

In the last article, 'Series-I', we got familiar with SOA architecture and some of its components. We saw what a web service WSDL file looks like and learnt how to decipher the complex information present in the file, so that we understand the web service better. In this series we will dig deeper into SOA architecture and try to figure out how exactly the various components of SOA architecture communicate with each other to solve complex business needs.



Sending Mail from QTP using Outlook

By Navneesh Garg
As a generic automation requirement, most projects want an automatic email to be sent after a QTP script finishes executing or ends in a failure. Generally, Outlook is configured on the systems on which the scripts run, so below is sample code for sending email using Outlook from QTP.

'*******************************************************************************
' Function: Outlook_SendEmail
'
' Sends an email using Outlook.
'
' Input Parameters:
'
' strTo - The email address or Outlook contact to whom the email should be sent.
' strSubject - The email's subject.
' strBody - The email's body (this may of course include newline characters).
'
' Output Parameters:
'
' None.
'
' Returns:
'
' Not applicable. This is a sub, not a function.

Sub Outlook_SendEmail(strTo, strSubject, strBody)
    'TODO: maybe add support for CC, BCC, etc.?

    'Create an Outlook object
    Dim Outlook 'As New Outlook.Application
    Set Outlook = CreateObject("Outlook.Application")

    'Create a new message (0 = olMailItem)
    Dim Message 'As Outlook.MailItem
    Set Message = Outlook.CreateItem(0)
    With Message
        'You can display the message to debug and see its state
        '.Display

        .Subject = strSubject
        .Body = strBody

        'Set destination email address
        .Recipients.Add strTo

        'Set sender address if specified.
        'Const olOriginator = 0
        'If Len(aFrom) > 0 Then .Recipients.Add(aFrom).Type = olOriginator

        'Send the message
        .Send
    End With
End Sub



Siebel: Testing Challenges and Siebel Test Automation Tool

By Pratham Kailash

Customer relationship management (CRM) is a term applied to processes implemented by a company to handle its contact with its customers. CRM software is used to support these processes, storing information on current and prospective customers. Information in the system can be accessed and entered by employees in different departments, such as sales, marketing, customer service, training, professional development, performance management, human resource development, and compensation. Details of any customer contact can also be stored in the system. The rationale behind this approach is to improve services provided directly to customers and to use the information in the system for targeted marketing and sales purposes.
Oracle's Siebel CRM product suite is a market leader in the CRM software domain, enabling organizations to transform the customer experience. With solutions tailored to more than 20 industries, Siebel CRM delivers:
• Comprehensive CRM capabilities
• Tailored industry solutions
• Role-based customer intelligence and pre-built integration
It helps organizations successfully manage important needs like:
• Sales
• Marketing
• Contact Center Infrastructure and Service
• Customer Data Integration
• Quote, Order and Billing
• Partner Relationship Management
• Business Intelligence Applications
• Price Management
Oracle’s Siebel CRM technology provides the server framework to support Siebel applications. It delivers solutions for:
• Development
• Deployment
• Diagnostic
• Integration
• Productivity
• Mobile services
Quality Assurance challenges and testing tool for Siebel CRM applications
Any organization that relies on CRM application(s) to serve the needs of internal clients or customers recognizes that application quality is a prerequisite for success, not an option. A crucial ingredient for this success is an efficient, disciplined testing process to verify that applications have achieved a level of fitness that either meets or exceeds project expectations. Slipping schedules, frequently changing application user interfaces, and recurrent feature regression introduce variables that ad-hoc testing practices are unable to handle.
IBM Rational Functional Tester Extension for Siebel Test Automation is one of the tools built to address these issues. Rational Functional Tester Extension for Siebel Test Automation records user interactions with Siebel 7.7 applications, creating a test script that - when executed - reproduces those actions. During recording, the user can insert verification points that extract specified data or properties from the application under test. During playback, these verification points are used to compare recorded information with live information to ensure consistency. Following any test recording activity, testers have the option of adding custom code to the test script to perform an unlimited array of tasks, including the data manipulation and environment configuration activities that are often necessary to ensure the test lab is properly constituted for the test run. Following test execution, Rational Functional Tester Extension for Siebel Test Automation generates a report listing the results of the verification point comparisons. With Rational Functional Tester Extension for Siebel Test Automation, teams are able to more reliably and efficiently expose problems in Siebel 7.7 applications, increasing the opportunity for defect capture and repair before product deployment.
Features and benefits of IBM Rational Functional Tester Extension for Siebel Test Automation:
• Supports Siebel controls for GUI automated testing
Siebel 7.7 delivers a rich UI comprised of standard and complex controls. Rational Functional Tester Extension for Siebel integrates with the Siebel Test Automation interfaces to provide robust automation support for this rich environment. By supporting standard web controls in addition to Siebel Standard-Interactivity and High-Interactivity controls, Functional Tester Extension for Siebel generates scripts with advanced UI control recognition and readability.

• Support for testing of Java, Web and Visual Studio .NET WinForm-based applications
Test teams are often required to assess applications built upon more than one technology base. IBM Rational Functional Tester provides equally robust automation support for applications constructed using Java, HTML/DHTML and Visual Studio .NET WinForm technologies.

• Choice of language - Java or Visual Basic .NET - for test script customization
Test script customization is mandatory in order to perform anything but the most basic tests. Functional Tester for Siebel gives you a choice of powerful, mainstream scripting languages to make this possible. Choose between Java and Visual Basic .NET - both options can be used with all the supported user interface technologies. By working with Functional Tester for Siebel, testers quickly learn to work with basic language constructs and acquire programming skills that facilitate more productive communication with developers.

• Native Java and Visual Basic .NET editor and debugger for advanced testers
Test script editing is important, but it can be difficult without a good editor and debugger. Functional Tester for Siebel delivers industrial-strength options to address this concern. Testers using Java can work in the Eclipse Java Development Toolkit (JDT), and those using Visual Basic .NET can work in Visual Studio .NET. Both integrated development environments offer a host of options to simplify test enhancement, including a helpful code-complete feature that suggests code to accelerate editing. GUI developers will find this feature particularly useful, as they can access it within the IDE they use to build the user interface.

• ScriptAssure technology to accommodate frequent UI modifications
Frequent changes to an application’s user interface can break tests, which embody assumptions about how to identify the interface’s objects during playback. Functional Tester for Siebel introduces an advanced ScriptAssure™ technology to accommodate these changes and avoid increases in maintenance overhead. ScriptAssure uses configurable algorithms to locate objects during test execution, even if the objects have changed since test creation.

• Automated data correlation and data-driven testing eliminate need for manual coding
Functional tests typically need to vary data during playback to properly simulate true users. Functional Tester for Siebel can automatically detect data entered during test recording and prepare the test for data-driven testing. Using a spreadsheet-like data editor, you can then create customized data sets to be inserted into the script during playback. In this way, you can produce highly personalized tests without manual coding.

• Multiple verification points with regular expression pattern matching support
Verification points help to ensure there is no regression from one build of the application under test to the next. Functional Tester for Siebel provides a wide range of verification points to test various aspects of your application, and it includes pattern matching support for tests in which you cannot predict the exact data response.

• Advanced object map maintenance capabilities
Functional Tester for Siebel uses an object map to store information used during test execution to locate user interface objects. It also provides maintenance capabilities to update this object map automatically whenever changes are made to the application’s user interface.

• Ships with IBM Rational Manual Tester
For teams not yet prepared to automate all of their testing efforts, IBM Rational Manual Tester is included in the Functional Tester for Siebel product box. Rational Manual Tester brings control and organization to manual testing efforts, introducing a novel test-step reuse technology to improve the resiliency of manual tests despite changes made to the applications under test.

• Ships with IBM Rational ClearCase LT for automated version control
Typically, more than one version of an application is deployed within an organization, and testers must therefore maintain groups of tests for each version. Without the help of automated version control, this can be extremely difficult. Functional Tester for Siebel is designed to support automated version control, which not only provides a mechanism to maintain multiple test sets, but also enables parallel development and supports geographically dispersed teams. To help teams take advantage of this support, a full version of IBM Rational ClearCase LT, an entry-level version control tool designed for small project workgroups, is included in the product box. Rational Functional Tester users also have the option of upgrading to the standard version of IBM Rational ClearCase.



IIS Performance Settings

July 28th, 2008 | Posted in Performance Testing

By Pankaj Goel

Performance tuning tips for IIS 6.0
Tuning the IIS server is slightly involved in terms of understanding the performance-critical parameters and tuning them to meet the performance specifications. Some of the ways to improve the performance of applications hosted on the .NET platform are:

a) Recycling Worker process
b) Reducing thread contention
c) Kernel Mode caching

a) Recycling Worker Processes
If a Web application contains code that causes problems, and you cannot easily rewrite the code, it might be useful to limit the extent of the problems by periodically recycling the worker process that services the application. You can accomplish this by using what is known as Worker Process Recycling. Worker process recycling is the replacing of the instance of the application in memory. IIS 6.0 can automatically recycle worker processes by restarting the worker process, or worker processes, that are assigned to an application pool. This helps keep problematic applications running smoothly, and minimizes problems such as memory leaks. You can trigger the recycling of the worker processes assigned to an application pool by using worker process recycling methods that are based on elapsed time, the number of Hypertext Transfer Protocol (HTTP) requests, a set time of day, and two kinds of memory consumption, in addition to recycling on demand.
To configure all of the above settings, open the Properties window of the application pool in which your web application is running, using the IIS Manager. The Recycling, Performance, and Health tabs let you configure thresholds for each of these recycling methods.
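
The recycling triggers described above can be sketched as a simple threshold check. This is my own illustrative model, not the IIS API, and the limit values are hypothetical defaults:

```python
# Illustrative model of worker process recycling triggers: elapsed time,
# request count, and memory consumption (limits are hypothetical).

def should_recycle(elapsed_minutes, requests_served, private_bytes_mb,
                   max_minutes=1740, max_requests=35000, max_memory_mb=500):
    """Return True when any configured recycling threshold is exceeded."""
    return (elapsed_minutes >= max_minutes
            or requests_served >= max_requests
            or private_bytes_mb >= max_memory_mb)

print(should_recycle(100, 200, 120))  # False -- under every threshold
print(should_recycle(100, 200, 900))  # True  -- memory limit exceeded
```

A real application pool would also recycle at a set time of day or on demand, which the sketch omits.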





Telecom Testing

By Rohit Aggarwal

Testing in the telecom world is as vast a subject as the domain itself. This blog talks about readiness to become a telecom tester. The intent is to address apprehensions and misconceptions; the actual tools and methodologies involved form the next level of discussion.

Kick-start
Do not jump to the System Under Test (SUT). Start from the bigger picture.
For instance, understand the technological domain/network, then the network nodes, then the SUT as a black box.

W5H of the System Under Test
There is a lot of food for thought. It is important to answer a few questions to be able to prepare the use cases and test details.
i. What does it do: functionality
ii. Where is it placed: network positioning
iii. Why is it needed: network architecture, business drivers
iv. Who will use it: end user – could be human being or another network node
v. When is it used: time, service, peak usage etc.
vi. How does it work: protocol, language, interfaces

Knowledge of the deployment zone is a value-add. Test criteria need to consider regional policies, infrastructure, and usage patterns.

Documentation
A lot of information about the SUT is available through:
• Technical specification sheets
• Release bulletins
• User manuals
• Requirement specifications
and other technical documents.

Types of Testing
There are multiple behavioral elements of the SUT that need to be tested:
• Functional
• Compliance
• Performance
• Load
• Soak

One of the key facets is that most of the telecom equipment is real-time.

Test Environment
Test tools can be:
• Purchased: cost-driven
• Produced: time-driven
• Procured: contract-driven

Investment
• Ramp up
  - Time: a large amount of time is needed to understand the technical specifications
  - Cost: almost the entire knowledge base is freely available on the web
• Test bed
  - Time: to code test stubs
  - Cost: to buy test stubs, e.g., OS, compiler, network software, equipment



A Beginner’s Blog to Performance Testing

July 29th, 2008 | Posted in General Concepts

By Happy Himanshu Gupta

Class 12th Board results were out and I ran to my PC to check the result. The anxiety level was rising every moment as it was taking time for the result page to open. Why is this page taking so long to open…?? Have I scored less…or perhaps I have failed in Chemistry..?? Oh god……!!!! What is going to happen… :( ?? Finally after 7 minutes and a few seconds, I rested in peace when the internet page opened and highlighted PASS on the screen. I just kicked my PC for taking the life out of me in those 7 minutes.

It had been just a few days since I stopped cursing my poor machine for those worst 7 minutes of my life when I came across a term called "Performance Testing". I felt like sharing the definition of this term with you people and got down to writing this first blog. :)

Performance testing is an emerging science in the field of software testing. When we talk about effective development of an application that makes life easy, we cannot neglect high performance as one of the major factors contributing to its quality. Performance testing is done to test a specific behavior of the application. But is there any specific criterion to define performance?

To define the performance of any system software or general application (web/system), we can use the phrase "meeting requirements with timeliness". A well-running application will always meet its requirements within the benchmark indicators. We map these benchmarks to actual performance counters, which help us judge how the application is performing in the current environment.

These performance counters can broadly be categorized along two major dimensions, namely responsiveness and scalability.

The response time of an application is the time required to process an instruction and return the result. It is the waiting time, and its measure is directly proportional to the number of concurrent users working on the application.
The throughput of the system is the number of events which an application can process within some interval of time. The more the throughput, the better the performance of the application.

The scalability of an application can be defined by its ability to meet its throughput and response time objectives even as the load on the application continues to increase. Better performance is achieved by high scalability of the application.
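
The two counters can be sketched with a little arithmetic over a list of per-request response times (the numbers and duration below are made up for illustration):

```python
# Compute the two performance counters discussed above from a list of
# hypothetical per-request response times, in seconds.
response_times = [0.8, 1.1, 0.9, 1.3, 0.7]
test_duration = 10.0  # seconds the load ran (assumed)

avg_response_time = sum(response_times) / len(response_times)
throughput = len(response_times) / test_duration  # requests per second

print(round(avg_response_time, 2))  # 0.96
print(throughput)                   # 0.5
```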

Though performance testing is often done in conjunction with stress testing, performance testing is not only stress testing. It also covers other aspects, such as load testing and endurance testing, along with stress testing.

There are a number of tools, both commercial and open-source, to facilitate performance testing of any application. Commercial tools like LoadRunner (LR) and Rational Performance Tester (RPT) are well equipped to support a number of different protocols for performance testing. Open-source tools such as OpenSTA, JMeter, Grinder, WebLOAD, etc. have made it easy for small-scale IT companies to test the performance of their applications without spending a substantial sum on licenses for their commercial counterparts.

Performance testing allows the user to identify the bottlenecks in the application that cause poor performance. Relational graphs and the statistical values of the performance counters help in analyzing the behavior of the application under increasing load (number of users).

Well friends, the heavy rush on the results website was the actual reason for the delay in the results that night, not my poor PC. I believe such websites should go through performance testing before giving nightmares to innocent children…. :)

2 Responses to “A Beginner’s Blog to Performance Testing”

1.
Aaisha Ghosh Says:
July 30th, 2008 at 10:47 pm

:-) A nice read, Himanshu. I am a functional tester and have been curious to know how to get started in performance testing. Since you have already taken this first step, can you guide me on what to read?

Is there some good training institute where I can get performance testing training?
2.
Happy Himanshu Gupta Says:
August 4th, 2008 at 3:22 am

Thanks for your comments Aaisha. I will keep updating the blog, adding more on this particular domain of testing, to help all learners gain a primary knowledge of performance testing.

"QACampus" is a one-stop testing portal maintained by the Learning and Training Division of our organization. Just check out the 'Contact Us' section on the website to reach us with any query. I believe QACampus is the best place to learn any aspect of testing.





http://www.qacampus.com
http://www.crestech.in
http://www.crestechsoftware.com.au

Introduction To HR Payroll


By Anurag Vasisht

Introduction

Payroll is one business area that is of relevance to all of us. We all work to get paid (or most of us, anyway). In the good old days, this payment used to be a straightforward affair: a fixed salary, or a simple multiplication of the number of hours by an hourly rate. Not any more. Nowadays, it is a much more complicated process, with concepts like deductions, contributions, exemptions, taxation laws, union contracts, commissions, loans, bonus payments and other special payments/deductions. HR Payroll also provides input to downstream systems like Financial Accounting, General Ledger etc.

For the purpose of understanding the nuts and bolts of HR Payroll, let us define employees under the following broad categories. This distinction is required because the Payroll processing is vastly different for these categories. Most employers, big or small, follow this segregation at the topmost level, drilling down to different levels of complexity based on size and diversification of their businesses.

1. Non-Exempt
These employees are required to accurately record the time that they spend on work. If different types of work have different payment rates, then they are expected to record time against different “attendance/work codes”. If time is not recorded for work done, then no payments are made for that work.

Such employees are set up with expected daily/weekly hours and a basic rate of pay instead of a fixed/basic salary. If these employees work more than their schedule, they are paid overtime at a normal or premium rate, as defined.

In some cases, such employees are contracted through Labor Unions and are paid as per the terms and conditions as defined in the respective contracts.

2. Exempt
These employees are exempt from recording their time at work. Such employees are set up with a schedule (such as Mon-Fri 9-6) and are expected to report time only when there is an exception to their normal schedule (such as vacation, jury duty or military duty). Unless they report an exception, it is understood that they followed their schedule. Employers do set up in/out devices for monitoring purposes, and deviations are questioned, but these typically do not impact payments unless there is a drastic variation.

Typically, management employees fall under this category while non-management employees are under Non-Exempt category.

2. Stages of Payroll processing
1. Employee setup
This is the static information which is typically set up for an employee at Hire time and updated typically at the start of a financial year. This is expected to remain the same till there is change in status of the employee, caused by events such as promotion, termination, demotion, change of job profile or location etc. Setup information includes data such as type of employee, job profile, work location, work schedule, leave entitlements, payroll cycle (weekly, semi-monthly, monthly etc), basic pay (or rate of pay), compensation and benefits, medical and insurance plans etc. This setup pretty much defines the way actual processing happens at the end of every payroll cycle.

2. Attendance/Absence Records
Non-exempt employees record each hour and minute of work against the relevant attendance code. This time is compared against the work schedule to determine whether any premium is to be paid for overtime, holidays worked, etc.

In case of absence during scheduled work hours, time is coded against the relevant absence codes. This is applicable to both exempt and non-exempt employees. Absences are compared against the leave entitlements to determine whether they need to be paid out or whether any deductions need to be made.

3. Calculation of Gross Pay

It is typically the summation of all earnings. For non-exempt employees, the figure arrived at from number of hours x hourly rate can serve as the basis, while for exempt employees the basic salary forms the basis. To this figure, various additional earnings are added to arrive at the Gross Pay. These include overtime/holiday-worked premiums, commissions, bonus payments, arrears from previous salaries, miscellaneous benefits etc.
4. Calculation of Net Pay

This is calculated by subtracting the various deductions and taxes from the Gross Pay. Deductions may include loan payments, union dues, overpayments from previous salaries, and contributions to miscellaneous health/benefit/insurance plans. Taxes are generally applied after deductions and include federal, state and local taxes. Net Pay is what an employee gets in his account.
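As a hedged illustration of steps 3 and 4 (the function names, rates and figures here are invented, not taken from any real payroll system), the gross-to-net flow might be sketched like this:

```python
# Toy sketch of the gross-to-net calculation described above.
# Rates, names and figures are invented for illustration only.

def gross_pay(hours_worked, hourly_rate, overtime_hours=0,
              overtime_multiplier=1.5, bonuses=0.0):
    """Gross = base earnings + overtime premium + other earnings."""
    base = hours_worked * hourly_rate
    overtime = overtime_hours * hourly_rate * overtime_multiplier
    return base + overtime + bonuses

def net_pay(gross, deductions, tax_rate):
    """Net = gross minus deductions, with taxes applied afterwards."""
    taxable = gross - sum(deductions)
    return taxable * (1 - tax_rate)

# 80 regular hours at 20.0/hr, 5 overtime hours, a 100.0 bonus
g = gross_pay(80, 20.0, overtime_hours=5, bonuses=100.0)   # 1850.0
# 75.0 total deductions, then a flat 25% tax on the remainder
n = net_pay(g, deductions=[50.0, 25.0], tax_rate=0.25)     # 1331.25
print(g, n)
```

Note how taxes are applied only after deductions, matching the order described above.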

3. Key Considerations in Testing
1. Setup & Locale: Payroll processing is different for each country, and sometimes there are major differences between states within a country. Even though generic rules are good for an overview, a thorough understanding of the local (financial, labor and taxation) laws is essential. Similarly, the HR policies of a company that govern all Payroll processing are important to learn and comprehend before jumping headlong into the IT solution that implements those policies. Finally, knowledge of the IT solution (implementation) is recommended, as each vendor differs vastly from the others; even in the case of black-box system testing, it does no harm to know the vital aspects.

2. Movements: Any kind of movement brings complexity to the system. Whether it is a Hire, Rehire, Termination, Transfer from one location to another, Promotion or Demotion: all of these visibly impact payroll processing. Much of the complexity can be removed if a company prohibits movements in the middle of a pay period. Then there is a clear demarcation, and the process can easily be developed and tested to take notice of such a movement. However, if there is no such restriction, it is a nightmare, requiring much more analysis in understanding, implementing and testing the solution.

3. “Worked/Earned” vs. “Paid”: This concept is critical for reports that run out of Payroll. Consider this example: an employee works for the period 12/16/07 to 12/31/07. His Payroll processing happens on 1/3/08 and he gets paid on 1/5/08. Now assume there is a monthly report which displays employee wages. Should the amount for the work done between 12/16/07 and 12/31/07 appear in the Dec 07 report or the Jan 08 report? There is no default answer; it varies from case to case, determined by the business that uses the report. So this question must definitely be asked when testing any Payroll report.
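To make the distinction concrete, here is a tiny hypothetical sketch using the dates from the example above; which date the report keys on decides the bucket the wages fall into:

```python
# Hypothetical sketch of "worked/earned" vs. "paid": the same wages
# land in different monthly reports depending on the key date chosen.
from datetime import date

work_period_end = date(2007, 12, 31)   # work done 12/16/07 - 12/31/07
pay_date        = date(2008, 1, 5)     # wages actually paid

def report_month(key_date):
    """Return the report bucket (month) for a given key date."""
    return key_date.strftime("%b %Y")

print(report_month(work_period_end))  # report keyed on "earned" -> Dec 2007
print(report_month(pay_date))         # report keyed on "paid"   -> Jan 2008
```

The business decides which key date is correct for each report; the tester's job is to ask.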



Tuesday, May 19, 2009

IBM Rational Functional Testing


By Vaibhav Agarwal

Nowadays most of us (in the IT industry) know of or have heard about automated testing tools.

Rational Functional Tester (RFT), as the name suggests, is an automated functional testing tool from Rational. The tool tests your application automatically, without human intervention.

IBM Rational Functional Tester is an object-oriented automated testing tool that lets you test a variety of applications. You can quickly generate scripts by recording tests against an application, and you can test any object in the application, including the object’s properties and data. Rational Functional Tester offers you a choice of scripting language and development environment — Java in the Eclipse framework or Microsoft Visual Basic .NET in the Microsoft Visual Studio .NET Development Environment. That means that regardless of the language or platform your development staff has chosen, you should be able to integrate with them and leverage some of their expertise as you develop your automated tests.

Rational Functional Tester offers these powerful capabilities
• Play back scripts against an updated application
• Update recognition properties for an object
• Merge multiple test object maps
• Display associated scripts
• Use pattern-based object recognition
• Integrate with UCM



Working with Excel objects in QTP


We create frameworks for automating applications. For this we need an independent structure for reporting and data, and Excel plays a very important role in this approach.

QTP has its own test-result mechanism with a predefined format. Once the test is run, a result sheet is generated which gives you insight into the script run – the points of failure, warnings and passes.

We can also create customized checkpoints in the script, and the result file can be customized according to whether each checkpoint passes or fails.

In most cases we want to create a summarized or detailed report of the entire test in Excel. The reasons to create a customized report are that the file can be kept in a central location and the report can be in our own format.

In this article we are going to learn the interaction of Excel with VBScript.

The whole mechanism goes in the following steps:
1. Understanding the hierarchy of Excel Application.
2. Creating the Excel Object
3. Opening an existing workbook or creating the new one
4. Setting the objects for various sheets in workbook.
5. Writing and fetching the data values in the cells.
6. Saving and closing the workbook
7. Closing the application and releasing the memory

We will go through each of the above stated steps with a suitable example to understand the approach properly.

Understanding the hierarchy of Excel Application

We will not go into the details of the complete hierarchy of the Excel application, only to the extent required.

Excel Application
  Workbooks
    Sheets
      Cells

Creating the Excel Object

The first step in the process of reporting via Excel is to create an Excel object. Reporting in Excel can either be done in the background, without making the application visible, or you can make it visible to the user while the data is being written or fetched. Either way, creating the Excel Application object is required.
It goes as:
Dim xl
Set xl = CreateObject("Excel.Application")

Opening an existing workbook or creating the new one

Once the Excel object has been created, the Excel application has been invoked but is not visible. You can either perform the operations in the background, or make the application visible first and then perform them.

To make the application visible:
xl.visible = true

To open a new Workbook:
xl.workbooks.Add

To open an existing Workbook:
xl.workbooks.Open("File Name with complete path")

Setting and accessing the objects of sheets in workbook.

Once the workbook has been opened, either existing or new one, we need to write some data in various cells in various sheets of that workbook.
By default there are 3 sheets in a workbook, and various operations can be performed on them. One needs to create objects to reference these sheets, as that makes them easy to access and you don't have to mention the complete hierarchy over and over again.

Say one has to create a reference to the sheet with index i (indexes start from 1):
Set sht1 = xl.activeworkbook.sheets(i)

One can add or delete any number of sheets from the active workbook.
To add a sheet in workbook –
xl.activeworkbook.sheets.add

To delete a particular sheet, where i represents the index (starting from 1) –
xl.activeworkbook.sheets(i).delete

To change the name of the sheets –
xl.activeworkbook.sheets(i).name = "Name of your choice"

To count the total number of sheets in the workbook
Cnt = xl.activeworkbook.sheets.count

Writing and fetching the data values in the cells

To write data to an Excel sheet, one should know the cell address where the data has to be written. The same applies to fetching data from cells.

To write data into sheet 2 at cell address D8, we write the following command (the cell address is given as row number followed by column number) –
xl.activeworkbook.sheets(2).cells(8,4) = "hello"
To fetch the data from sheet3 cell address A7 –
Val = xl.activeworkbook.sheets(3).cells(7,1)

If one has already created the object of the particular sheet, you don’t have to write the complete hierarchy but simply –
Object.cells(row,col) = value

Saving and closing the workbook

Once the work is completed, you can save the newly created workbook to a specified location, or save the changes made to an already open workbook.

To save as in case of new workbook
xl.activeworkbook.saveas "path_with_file_name.xls"

To save in case of existing workbook
xl.activeworkbook.save

To close the workbook
xl.activeworkbook.close

Closing the application and releasing the memory

To close the application
xl.quit

To release the memory of all the objects
Set xl = nothing




Rational Functional Tester- Features and benefits


1. Ensure playback resilient to application changes with ScriptAssure technology -

Frequent changes to an application’s user interface can hamper test script execution. IBM Rational Functional Tester introduces an advanced ScriptAssure™ technology to accommodate these changes and avoid increases in maintenance overhead. ScriptAssure uses fuzzy matching algorithms to locate objects during test execution, even if the objects have changed since test creation.
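ScriptAssure's internals are proprietary, so the following is only a conceptual sketch of weighted, fuzzy property matching (the property names, weights and threshold are all invented, and this is NOT RFT's actual algorithm):

```python
# Conceptual illustration of weighted object recognition: each
# recognition property carries a weight, and a candidate whose score
# clears a threshold is still accepted even if some properties changed.

def match_score(recorded, candidate, weights):
    """Fraction of total weight whose properties match exactly."""
    total = sum(weights.values())
    matched = sum(w for prop, w in weights.items()
                  if recorded.get(prop) == candidate.get(prop))
    return matched / total

recorded = {"name": "submitBtn", "class": "Button", "label": "Submit"}
changed  = {"name": "submitBtn", "class": "Button", "label": "Send"}
weights  = {"name": 40, "class": 30, "label": 30}

score = match_score(recorded, changed, weights)
print(score)          # 0.7 - two of three weighted properties still match
print(score >= 0.6)   # True: the object is still recognized
```

The practical point is the same as ScriptAssure's: a cosmetic UI change does not have to break every recorded script.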

2. Increase script re-use with wizard for data driven test creation –

Data driven tests are functional tests that perform the same series of test actions, but with varying data. IBM Rational Functional Tester can automatically detect data entered during test recording and prepare the test for data-driven testing. Using a spreadsheet-like data editor, you can then create customized data sets to be used by the test during playback. In this way, you can achieve test script re-use without time consuming manual coding.


3. Streamline automation with keyword testing –

Rational Functional Tester provides the ability to define and automate keywords which can be reused in both functional and manual tests. This promotes script reuse and enables manual testers to easily and selectively leverage the power of automation within manual test cycles.

4. Proxy SDK –

This feature allows testers to define support for custom controls.

5. Choice of test editing language - Java or Visual Basic .NET –

Test script customization is mandatory in order to perform anything but the most basic tests. IBM Rational Functional Tester gives you a choice of powerful, mainstream scripting languages to make this possible. Choose between either Java or Visual Basic .NET - both options can be used with all the supported user interface technologies. By working with Functional Tester, testers quickly learn to work with basic language constructs and acquire programming skills that facilitate more productive communication with developers.

6. Validate dynamic data with dynamic data validation wizard –
Validating dynamic data, such as time stamps or order confirmation numbers, can be a tedious, time-consuming task involving complex manual coding. IBM Rational Functional Tester includes wizard-driven support for regular-expression-based dynamic data validation. Users can effortlessly validate dynamic data against template patterns without having to write complex code.


7. Manual Test Automation support –

For teams not yet prepared to automate all of their testing efforts, IBM Rational Manual Tester is included in the Rational Functional Tester product box. Rational Manual Tester brings control and organization to manual testing efforts, introducing automated data entry and data validation to manual testing.


8. Test script version control for parallel development –

Typically, more than one version of an application is deployed within an organization, and testers must therefore maintain groups of tests for each version. Without the help of automated version control, this can be extremely difficult. IBM Rational Functional Tester is designed to support automated version control, which not only provides a mechanism to maintain multiple test sets, but also enables parallel development and supports geographically dispersed teams. To help teams take advantage of this support, a full version of IBM Rational ClearCase LT, an entry-level version control tool designed for small project workgroups, is included in the product box. Rational Functional Tester users also have the option of upgrading to the standard version of IBM Rational ClearCase.


9. Native Java and Visual Basic .NET editor and debugger for advanced testers –

Test script editing is important, but it can be difficult without a good editor and debugger. IBM Rational Functional Tester delivers industrial-strength options to address this concern. Testers using Java can work in the Eclipse Java editor and those using Visual Basic .NET can work in Visual Studio .NET. Both integrated development environments offer a host of options to simplify test enhancement, including a helpful code-complete feature that suggests code to accelerate editing. GUI developers will find this feature particularly useful, as they can access it within the IDE they use to build the user interface.


10.Linux test editing and test execution support –

For cross-platform Java applications, IBM Rational Functional Tester offers scripting to create, edit, and execute tests on the Linux platform - including everything except a test recorder. It also supports the Windows platform for all recording, editing, and execution capabilities.


11.Add-on support available for testing 3270/5250 terminal-based applications –

Teams responsible for testing mixed workload environments - environments comprised of both Java/Web/Visual Studio .NET-based applications and mainframe-based applications - can purchase the IBM Rational Functional Tester Extension for Terminal-based Applications. This extension enables automation and testing for 3270 and 5250 terminal-based applications, including automated data entry and response verification, using the testing tool and scripting language the tester favors for Windows and Linux-based applications.

12.Extended automated functional and regression testing for Siebel® and SAP® applications –

IBM Rational Functional Tester Extension for Siebel Test Automation interfaces with Siebel to provide robust automation support for both Siebel 7.7 and 7.8. A similar extension provides automated test creation, execution and analysis for SAP GUI applications.



Application Security : Unlocking the myth


By Avinash K Tiwari

More and more companies are relying on Web-based applications to provide online services to their employees, to support e-commerce sales and to leverage portals, discussion boards and blogs that help staff better communicate with customers, partners and suppliers. However, as the number and complexity of Web applications have grown, so have the associated security risks. With increasing frequency, incidents of Web application breaches resulting in data theft are popping up as front-page news.

Some news you can’t ignore

World story

• 40 Million Credit Cards Compromised
• 55 Million Customer Records Exposed, 130+ Security Breaches in 2005
• $105 Billion In Cyber crime Proceeds in ’04, More than Illegal Drug Sales

There have been quite a few Govt and FSS security breaches in India recently.
• Hacker breaks into 17 bank a/cs
• Bank of India site hacked, served up 22 exploits
• Maharashtra govt website hacked
• Goa govt’s info website hacked

A fact which can’t be ignored:

Close to 80% of web sites are vulnerable to cross-site scripting, which can be exploited even by a novice hacker. Your website could be one of them.

A myth that our website is safe:

“We have Firewalls in Place”
“We Use Network Vulnerability Scanners”
“We Use SSL”

What’s the real picture??

Let’s take a look at the different security layers used in the protection of a web application:

In order to protect the desktop, users and organizations install antivirus protection.
The communication layer is protected by encrypting the traffic using SSL.
Firewalls and IDS are used to allow only specific communication (port) access (i.e. web traffic).

Let’s take each layer one by one

When we talk about web traffic, desktop security is not in the picture.

Using SSL will encrypt the web traffic and make life difficult for a hacker intercepting it. But SSL can’t stop a hacker from manipulating the data by accessing the application directly.

Firewalls are set up to allow outsiders access to specific resources, and to prevent them from accessing other resources. For example, an outside individual wouldn’t be allowed to connect directly to a database, but they can make a request to a web server. This means the firewall would be configured to deny traffic on the standard database port 1433, but allow traffic through ports 80 and 443 – the web application ports. This setup is clearly no protection at all against malicious attacks aimed at the web application itself.

The next protection an HTTP request encounters is an intrusion detection system (IDS). The IDS is set up to look for signatures in the traffic that might indicate an attack. For example, it may look for a SQL statement embedded within a request, or a script tag that indicates a potential XSS attack. The challenge with these systems is that if the request is encoded in some alternative format (such as Unicode Transformation Format, UTF-7), or the traffic is encrypted using SSL, the intrusion detection system is often not able to interpret or understand the requests. The IDS therefore offers little to no protection against web application attacks.
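A toy sketch of why encoding defeats naive signature matching (the "IDS" below is just a string check invented for illustration, not any real product, and the payload is a classic XSS probe):

```python
# A naive signature check catches a plain payload but misses the same
# payload once it is URL-encoded - it only matches after decoding.
from urllib.parse import unquote

SIGNATURE = "<script>"

def naive_ids(request):
    """Flag the request if the raw text contains the signature."""
    return SIGNATURE in request

plain   = "/search?q=<script>alert(1)</script>"
encoded = "/search?q=%3Cscript%3Ealert(1)%3C%2Fscript%3E"

print(naive_ids(plain))             # caught
print(naive_ids(encoded))           # slips past the raw signature match
print(naive_ids(unquote(encoded)))  # caught only after decoding
```

A real IDS faces the same problem with every encoding (and with SSL) it cannot decode before matching.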

The last system the HTTP request might encounter before the web server is an application firewall. These are the smartest of the network protections and can be configured explicitly to allow only traffic known to be good. The problem with these systems is that maintaining the correct configurations and valid algorithms, and testing that they recognize good traffic, is very expensive. If the web application firewall has been designed to fail securely (a web application security principle) – that is, to block any request it is not sure about – it may block legitimate traffic. For this reason, most application firewalls are configured to break that founding principle of security by allowing through traffic that they don’t understand.

So, as you can see, even with antivirus, firewalls and IDS in place, you still have to allow web traffic through. This traffic can be friendly or malicious – and there is no way to know which. Since users can access your application over the web, they can perform all sorts of malicious activities and steal, manipulate or destroy information.

So the bottom line is: THE WEB APPLICATION MUST DEFEND ITSELF.

(Copyrighted by CresTech Software Systems Pvt. Ltd.)



Batch Execution of QTP Scripts


By Navneesh Garg

The user has several scripts which test different parts of an application and would like to execute them all together in batch mode, i.e. with a main script that calls the other scripts. The end objective is that the complete suite of scripts can be executed one after another without any human intervention.

Here are some ways to run a set of several scripts together.
1. Use the Automation Object Model (AOM)
Use the Automation Object Model (AOM) to define the tests to be run and execute them. Using the AOM you can set various QuickTest Professional settings and values; it is the closest thing to command-line functionality that QuickTest Professional offers. Through the AOM you get a reference to the QTP application object, and using this reference you can call and execute all the QTP test scripts you want to run in sequence/batch mode. To find out how to use the AOM, refer to the Object Model Reference guide of QTP.

2. Use TestDirector for Quality Center
Use TestDirector for Quality Center (Quality Center) to schedule and run an execution flow. Quality Center will launch QuickTest Professional and run the tests for you, so you will not need to launch QuickTest Professional, run a script, and close the script, in order to launch the next script.

3. Use Reusable Actions
Make the Actions in the test scripts reusable, then insert a copy of, or a call to, each action into the main test. This will invoke the Actions of the scripts.

4. Use the Multi-Test Manager utility.

This utility is available on the Mercury support site and helps in batch execution of scripts. It has some decent features such as drag and drop of scripts, scheduled script execution and automatic emailing.

5. Use the Test Batch Runner utility.

You can access it by going to Start -> Programs -> QuickTest Professional -> Tools -> Test Batcher. For more information on the Test Batch Runner, please refer to the QuickTest Professional User’s Guide (QuickTest Professional User’s Guide -> Running and Debugging Tests and Components -> Running Tests and Components -> Running a Test Batch).



SOA Testing Simplified (Series-I)


By Pallavi Sharma

If you have reached this blog, then maybe you have heard about SOA and testing SOA applications, or maybe you understand testing and would like to know what SOA testing is all about, or maybe you want to know what both SOA and testing are all about… Well, however complicated your case may be, I will explain SOA and testing SOA applications in an uncomplicated way in a series of blogs, and this is the very first one.

SOA stands for Service-Oriented Architecture. It describes an IT infrastructure which allows different applications to exchange data with one another as they participate in business processes. The aim is a loose coupling of services from the operating systems, programming languages and other technologies which underlie applications [1].

These services inter-operate based on a formal definition (or contract, e.g., WSDL) that is independent of the underlying platform and programming language. Services written in C# running on .NET platforms and services written in Java running on Java EE platforms, for example, can both be consumed by a common composite application (or client). Applications running on either platform can also consume services running on the other as Web services, which facilitates reuse.

Let’s understand the term ‘Web Service’. A Web Service, in simplistic terms, is a web-enabled API that can be accessed over the network. In an SOA architecture it is a component which takes care of a specified business need. Each web service has its own contract file, which is required to communicate with it. This contract file is a WSDL (Web Services Description Language) document. As the name suggests, it provides information about what a web service is all about. The kinds of information you can fetch from the WSDL file are:
1. Where the web service is hosted.
2. What functionality the web service provides.
3. What methods the web service consists of.
4. What arguments the web service takes and what responses it returns.
Let’s take an example web service and its WSDL file and try to make sense of it. The example web service is ‘MoneyConverter’; this service converts a currency value provided in Indian Rupees to Dollars and vice versa. It solves a business requirement and consists of two methods – you guessed them right:

a. RupeesToDollars
b. DollarsToRupees

Let’s take a look at the WSDL file for this service, which is an XML file describing the web service. [File attached]

To decipher the WSDL document, let’s begin from the end. The last XML node of the WSDL file tells us where the service is hosted:


<wsdl:service name="MoneyConverter">
  <wsdl:documentation>A web service for converting currencies</wsdl:documentation>
  <wsdl:port name="ConverterSoap" binding="...">
    <soap:address location="..." />
  </wsdl:port>
</wsdl:service>
Moving up, we find information about what methods are present, what arguments they take, and what they return:

<wsdl:portType name="ConverterSoap">
  <wsdl:operation name="RupeesToDollars">
    <wsdl:documentation>Convert from Rupees to Dollars</wsdl:documentation>
    <wsdl:input message="..." />
    <wsdl:output message="..." />
  </wsdl:operation>
  <wsdl:operation name="DollarsToRupees">
    <wsdl:documentation>Convert from Dollars to Rupees</wsdl:documentation>
    <wsdl:input message="..." />
    <wsdl:output message="..." />
  </wsdl:operation>
</wsdl:portType>

Now we would like to know what the input and output messages look like; for this, let’s move a bit up the document. The wsdl:types node defines all the data types which are passed as arguments or returned as responses after a method invocation. Let’s take one node and explain it further:

<wsdl:types>
  <s:schema>
    <s:element minOccurs="1" maxOccurs="1" name="amount" type="s:decimal" />
  </s:schema>
</wsdl:types>
It states that the element is of type ‘decimal’, its name is ‘amount’, and it is a mandatory argument for the method, which can be understood from the minOccurs and maxOccurs attributes of the s:element node.

So, from the above information provided by the WSDL document of MoneyConverter, we now know the following:

1. The port where the web service is hosted.
2. The methods which are present in the web service.
3. The arguments and the return value of each method of the web service.
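As a rough illustration of reading such a contract programmatically (the WSDL fragment below is a simplified, made-up stand-in for the real file, and this is just one possible approach), the operation names can be extracted with Python’s standard library:

```python
# Hypothetical sketch: pulling operation names out of a simplified
# WSDL document. The XML is a made-up fragment modelled on the
# MoneyConverter service described above.
import xml.etree.ElementTree as ET

WSDL = """
<wsdl:definitions xmlns:wsdl="http://schemas.xmlsoap.org/wsdl/">
  <wsdl:portType name="ConverterSoap">
    <wsdl:operation name="RupeesToDollars"/>
    <wsdl:operation name="DollarsToRupees"/>
  </wsdl:portType>
</wsdl:definitions>
"""

NS = {"wsdl": "http://schemas.xmlsoap.org/wsdl/"}
root = ET.fromstring(WSDL)
ops = [op.get("name") for op in root.findall(".//wsdl:operation", NS)]
print(ops)   # ['RupeesToDollars', 'DollarsToRupees']
```

Test tools such as SoapUI do essentially this when they load a WSDL and list the operations available to invoke.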

Empowered with all this information, in the next blog in the series we will understand how exactly to use this web service, so that we are better prepared to test it.
References:

1. Newcomer, Eric; Lomow, Greg (2005). Understanding SOA with Web Services. Addison Wesley. ISBN 0-321-18086-0.

(Copyrighted by CresTech Software Systems Pvt. Ltd.)



What’s in Web 2.0


The rising popularity of user-driven online services, including MySpace, Wikipedia, and YouTube, has drawn attention to a group of technological developments known as Web 2.0. So, what does Web 2.0 consist of?
Read on till the end…

Blogs (short for Web logs) are online journals or diaries hosted on a Web site and often distributed to other sites or readers using RSS (see below).

Collective intelligence refers to any system that attempts to tap the expertise of a group rather than an individual to make decisions. Technologies that contribute to collective intelligence include collaborative publishing and common databases for sharing knowledge.

Mash-ups are aggregations of content from different online sources to create a new service. An example would be a program that pulls apartment listings from one site and displays them on a Google map to show where the apartments are located.

Peer-to-peer networking (sometimes called P2P) is a technique for efficiently sharing files (music, videos, or text) either over the Internet or within a closed set of users. Unlike the traditional method of storing a file on one machine—which can become a bottleneck if many people try to access it at once—P2P distributes files across many machines, often those of the users themselves. Some systems retrieve files by gathering and assembling pieces of them from many machines.

Podcasts are audio or video recordings—a multimedia form of a blog or other content. They are often distributed through an aggregator, such as iTunes.

RSS (Really Simple Syndication) allows people to subscribe to online distributions of news, blogs, podcasts, or other information.
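At its core, a feed reader simply fetches this XML and extracts the entries. Here is a minimal sketch in Python; the feed content and URLs are invented for illustration:

```python
# Core of a feed reader: parse RSS 2.0 XML and list the entries.
# The feed below is a tiny invented example, inlined instead of fetched.
import xml.etree.ElementTree as ET

FEED = """<rss version="2.0">
  <channel>
    <title>QA Campus</title>
    <item><title>What's in Web 2.0</title><link>http://example.com/web20</link></item>
    <item><title>OWASP Top 10</title><link>http://example.com/owasp</link></item>
  </channel>
</rss>"""

channel = ET.fromstring(FEED).find("channel")
items = [(i.findtext("title"), i.findtext("link"))
         for i in channel.findall("item")]
for title, link in items:
    print(title, "->", link)
```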

Social networking refers to systems that allow members of a specific site to learn about other members’ skills, talents, knowledge, or preferences. Commercial examples include Facebook and LinkedIn. Some companies use these systems internally to help identify experts.

Web services are software systems that make it easier for different systems to communicate with one another automatically in order to pass information or conduct transactions. For example, a retailer and supplier might use Web services to communicate over the Internet and automatically update each other’s inventory systems.

Wikis, such as Wikipedia, are systems for collaborative publishing. They allow many authors to contribute to an online document or discussion.




Use Case Based Testing (UCBT)


Use Case Based Testing (UCBT) is a technique for generating test cases and recommended configurations for system-level testing. In this approach, testers build a test model based on the standard UML notions of use cases, actors, and the relationships between these elements. The use cases are enhanced with additional information, including the inputs from actors, the outputs to the actors, and how the use case affects the state of the system. Generation algorithms use this model to produce a test suite that provides a specified level of coverage of each use case. The tool can also generate workload configurations that combine the test cases according to requirements specified in the model. The generation algorithm performs minimization to reduce the number of test cases required to cover the system to the specified level. The workload configurations are based on desired percentages that the tester provides for each actor. Together, these features form a powerful basis for model-based test case generation.

What testing phase does UCBT address?
UCBT addresses phases where the tester is interested in exploring behavior that flows through multiple use cases, which are typically the late function, system, and solution phases of the test life cycle. In late function level test, the tester has tested the use cases individually, and is now interested in looking at combinations of the use cases. System test addresses the situation when all required functionality for the system is present, and the tester seeks to ensure the proper functioning of the system as a whole. An important component of system test is also ensuring that the system can handle customer-like scenarios and workloads. Solution test addresses the situation in which several complete systems are combined to provide complex functionality through some process that involves the systems. These processes can be captured, modeled, and tested using UCBT.

What does the tester do in UCBT?

To do UCBT, the tester needs to identify four things:
1. the use cases of interest,
2. the actors involved in using the system,
3. the inputs, outputs, and system effects for the use cases, and
4. the flows of interest between the use cases.

A use case is a semantically meaningful function that provides some value from the user’s point of view. For example, saving a file in a word processing system would be represented by the Save File use case. Each use case can have input parameters associated with it, and for each parameter, a set of logical partitions of the values that parameter can take may be identified. Finally, the use cases can be connected using flows that describe a sequence of use cases performed to accomplish some goal.
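The notion of input parameters with logical partitions can be made concrete with a small sketch. This is a generic illustration of partition-based case generation, not the actual UCBT algorithm, and the Save File parameters below are invented:

```python
# Generic sketch: enumerate abstract test cases for a hypothetical
# "Save File" use case by crossing each parameter's logical partitions.
# (Illustration only; UCBT's real generator also minimizes the suite.)
from itertools import product

partitions = {
    "filename": ["valid", "empty", "illegal characters"],
    "disk_space": ["sufficient", "insufficient"],
}

test_cases = [dict(zip(partitions, combo))
              for combo in product(*partitions.values())]
print(len(test_cases))  # 3 x 2 = 6 abstract test cases
for tc in test_cases:
    print(tc)
```

A real generator would prune this cross product down to a minimal suite that still covers the interactions of interest.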

What is produced by UCBT Tool?
UCBT produces two types of test cases. The first type is abstract test cases, which are not executable. These are suitable for inclusion in a test plan and are written in structured English; they show the ordering of use cases to be performed and the inputs and expected results each use case should have during the test. The second type is in a format known as ATS, which can be used to create executable test cases with another tool such as TCBeans. The test suite produced by UCBT is minimized in size by a two-pass optimization based on the input-parameter interactions that the tester specifies, as well as the flows of interest. This keeps the number of test cases reasonable, so execution and evaluation can be done with a feasible amount of effort.




What’s new with QTP9.5


This is a general overview giving a brief description of what is new in QTP 9.5.

New Features:

1. New design time panes:
Various new IDE panes have been introduced. These do not add new functionality as such; rather, operations that were buried deep in sub-menus are now surfaced up front.

Available Keyword Pane:
This pane shows all the available functions in the current test (either in-action or externally added), as well as all the objects in your object repository (local and external). The items are effectively separated into groups, making it easier to search for a relevant item. Double clicking any item in the pane will open it, and dragging the item to the main window will add it to the script in the drop position. Double clicking a function will not only open the hosting file in the main window, but also focus on the exact position of the function within the file.

Test Flow Pane:
This pane lays out the action call structure of the current test. It outlines the order in which the “main” actions are called, as well as the inner action calls between actions. Beyond understanding the test flow, it offers a central place to access and maintain the properties of the actions. One can easily delete an action, change its properties and its action-call properties, access its object repository, and control the order in which it will be called within a test.

Resource Pane:
This pane offers a quick overview of all the external function files, recovery scenarios and object repositories associated with the test; it also provides a quick way to add and remove these resources. One can associate repositories with specific actions, add new external library files and add new recovery scenarios.

Process Guidance Pane:
This pane (or rather – panes) somewhat resembles MS Office on-line help. These panes include a topic list on one side and a content area on the other (though you can position and dock the components as you like). Clicking a topic will load its contents onto the content area, and one can navigate via the Next and Back buttons.
One can add his own “process guidance packages”, which can offer unique topics and contents. For example, these might include the way to store tests and manage code versions, how to work with custom objects in the application, coding standards, and much more. These packages can be easily created with simple HTML pages and a contents XML file.

2. Checkpoint and Output Values Management
Checkpoint and Output Objects in the Object Repository: All your checkpoints and output values are stored as objects in the object repository so that you can manage them together with your test objects.
Enhanced functionality of Bitmap Checkpoint: Bitmap checkpoints now include options for specifying RGB tolerance (percentage) and pixel tolerance (number of pixels or percentage) values. These values enable one to indicate acceptable differences between the actual image and the one stored with the checkpoint.
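The two tolerance settings can be illustrated with a small sketch. This shows only the general idea, not QTP’s internal comparison algorithm; images are modeled as flat lists of RGB tuples:

```python
# Sketch of the two tolerance settings for a bitmap checkpoint:
# - rgb_tol: per-channel difference allowed before a pixel counts as changed
# - pixel_tol_percent: share of changed pixels allowed before the check fails
def bitmaps_match(expected, actual, rgb_tol=10, pixel_tol_percent=5.0):
    mismatched = 0
    for (er, eg, eb), (ar, ag, ab) in zip(expected, actual):
        if abs(er - ar) > rgb_tol or abs(eg - ag) > rgb_tol or abs(eb - ab) > rgb_tol:
            mismatched += 1
    return 100.0 * mismatched / len(expected) <= pixel_tol_percent

# 100-pixel "images": 99 nearly identical pixels plus 1 badly different pixel
expected = [(10, 10, 10)] * 99 + [(200, 200, 200)]
actual = [(12, 10, 8)] * 99 + [(0, 0, 0)]
print(bitmaps_match(expected, actual))  # 1% of pixels differ, within the 5% default
```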

3. Running Scripts in Maintenance Mode
In Maintenance Run Mode, QTP identifies discrepancies between the objects in the repository and those in the application, and then offers solutions for updating objects and steps in real time. The run pauses each time an object is not found. One can point to the required object in the application, and QuickTest will recommend a solution for updating the object repository and test step to match the selected object. Alternatively, one can add comments to a problematic step and address it manually at a later time.

4. Web Add-In Extensibility
QTP 9.5 enables one to extend the support given for Web objects by means of its Web Add-In Extensibility feature. This feature is especially important when the AUT includes unsupported third-party or custom Web controls; newer technologies such as AJAX are also supported this way. To use this feature, JavaScript knowledge is necessary: implementing a Web extension requires configuring two XML files, writing some code in JavaScript, and then deploying these files to the required folders.

5. New Supported Operating Systems and Environments
QTP 9.5 has added new support for the operating systems, browsers, and development environments listed below.
• Windows Vista 64-bit Edition
• Netscape Browser 8.1.3, and 9
• Mozilla Firefox 2 and 3.0 Alpha 7
• Microsoft .NET Framework 3.5
• Oracle Forms and Oracle Applications, version 10g
• Java SWT toolkit, versions 3.2, and 3.3
• Eclipse IDE, version 3.2 and 3.3 (For Java Add-in Extensibility)
• New Terminal Emulator types and versions:
o AttachmateWRQ EXTRA! 9, Terminal Viewer (compatible EXTRA! 9), Reflection for Unix and OpenVMS sessions 14, Reflection 14
(Note: Reflection 13 is not supported.)
o Hummingbird HostExplorer 2007
(Note: HostExplorer 2006 is not supported)
o IBM PCOM 5.9, IBM WebSphere Host On-Demand 10
o NetManage RUMBA 7.5, RUMBA Web-to-Host 5.3
o Seagull BlueZone 4
o Zephyr PASSPORT 2007, PASSPORT PC TO HOST / WEB TO HOST 2004 and 2007

6. Miscellaneous Improvements:
Auto-convert to relative path: Whenever you include a resource (an external library, object repository, etc.), a pop-up will appear, suggesting that you add the relevant path to QTP’s folders list and convert it to a relative one.
New actions are reusable: In QTP 9.5, when you create new actions, they are set as reusable by default.
Text recognition mechanism: One can now configure which text recognition mechanism(s) to use for recognizing text in Windows applications, and in which order the mechanisms should be applied.
Record on SWT: When using the Java Add-in, recording on objects developed using the SWT toolkit is now supported.



Database Handling in QTP using ADODB


Database handling via VBScript is basically done with the following steps:
1. Creating the ADODB object
2. Defining the connection string for the database to connect to
3. Opening the connection
4. Firing the query
5. Accessing data with the Record Set object
6. Closing the connection
7. Releasing the memory occupied by the objects
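For comparison, the same open–query–iterate–close lifecycle looks like this in Python’s built-in sqlite3 module (an analogy only; the rest of this post uses ADODB from VBScript, and the table name here is made up):

```python
# The same connect / query / iterate / close lifecycle, using Python's
# built-in sqlite3 module and an in-memory database.
import sqlite3

conn = sqlite3.connect(":memory:")                 # steps 1-3: create and open
conn.execute("CREATE TABLE table1 (name TEXT)")
conn.execute("INSERT INTO table1 VALUES ('a'), ('b')")

rows = conn.execute("SELECT * FROM table1").fetchall()  # steps 4-5: query, records
for row in rows:
    print(row[0])

conn.close()                                       # step 6: close the connection
# step 7: in Python, garbage collection frees the objects automatically
```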

We will go through each of the above steps with an appropriate example showing how things work in a real-life application.

Creating the object of ADODB –

Set Db = CreateObject("ADODB.Connection")

Specifying the Connection String of the database to connect –

The connection string for a specific connection can be set either with or without a DSN (Data Source Name).

In case of DSN –
You create the DSN depending on whether you want to fetch data from SQL Server, Excel, Access, etc., and on the drivers present in your system. Say, for example, a DSN for MS Access is created with the name "MyDSN" for a pre-specified database. You would write the command as:
Db.ConnectionString = "DSN=MyDSN"

In case you don’t want to create a DSN –
The connection string will now contain the complete information that would otherwise be held in the DSN. This has a benefit over a DSN: since the string can contain the complete network path of the database, it works on all systems, whereas a DSN has to be present on the machine where it is used. Again, taking the database to be on MS Access:
Db.ConnectionString = "Driver={Microsoft Access Driver (*.mdb)};Dbq=C:\mydatabase.mdb;Uid=Admin;Pwd=;"

Opening the connection

Once the connection string has been set, the next step is opening the connection. This sets up the connection/pathway with the database specified in the connection string. The command is:
Db.Open

Firing of the query

The next step is writing the SQL query and executing it. We store the SQL query in a variable and then execute it. On executing the query, a Recordset object is returned which contains the result set of the executed query.
Getting the SQL Query:
SQL = "Select * from table1"

Executing the SQL query and capturing the recordset object returned:
Set rec_ob = Db.Execute(SQL)

Accessing data with Record Set Object

As per the example, rec_ob is the Recordset object containing the result of the query that was executed. Now we can process each record in the recordset. To loop through the recordset until the end, we use the following:
Do While rec_ob.EOF <> True
    ' Operate on the current record, e.g. rec_ob.Fields("column_name").Value
    rec_ob.MoveNext   ' advance the pointer, otherwise the loop never terminates
Loop

Important properties, methods, events, and collections supported by the Recordset object, with explanations:
Properties:
BOF – Returns True if the current record position is before the first record, otherwise False.
EOF – Returns True if the current record position is after the last record, otherwise False.
State – Returns a value that describes whether the Recordset object is open, closed, connecting, executing or retrieving data.

Methods:
Open – Opens a Recordset
Close - Closes the Recordset
MoveFirst - Moves the record pointer to the first record
MoveLast - Moves the record pointer to the last record
MoveNext - Moves the record pointer to the next record
MovePrevious - Moves the record pointer to the previous record
Save - Saves a Recordset object to a file or a Stream Object

Events:
The various events supported by the ADODB Recordset object cannot be handled using VBScript or JScript (only VB, VC++ and VJ++ can handle these events), so we are not going to discuss them here.
Collections:
Fields - Indicates the number of field objects in the Recordset object
Properties - Contains all the Property objects in the Recordset object
The Fields Collection’s Properties:
Count - Returns the number of items in the fields collection. Starts at zero
Item (name/number) – Returns a specified item in the fields collection.

The Properties Collection’s Properties:
Count – Returns the number of items in the properties collection. Starts at zero.
Item (name/number) – Returns a specified item in the properties collection.

Closing the connection

Once all the activities have been carried out on the Recordset object and no more database access is required, you need to close the connection established by the ADODB object. The command is:
Db.Close

Releasing the memory space occupied by the Objects

The final task is to free up the memory occupied by all the objects that were created, so that there is no memory leak.
Set rec_ob = Nothing
Set Db = Nothing



Saturday, March 14, 2009

IBM Rational TestManager




Rational® TestManager is the central console for test activity management, execution and reporting. Built for extensibility, it supports everything from pure manual test approaches to various automated paradigms including unit testing, functional regression testing, and performance testing. Rational TestManager is meant to be accessed by all members of a project team, ensuring the high visibility of test coverage information, defect trends, and application readiness. Rational TestManager is freely available to all users of IBM Rational® Functional Tester, Rational Manual Tester and IBM Rational® Robot. And because of its value to development teams, it is also included in the IBM Rational® Team Unifying Platform.


IBM Rational Test RealTime




* Cross-platform solution for component testing and runtime analysis.
* Designed specifically for those who write code for embedded and other types of pervasive computing products.
* Supports safety- and business-critical embedded applications.
* Allows you to be more proactive in your debugging, discovering and correcting errors before they make their way into production code.
* Automated source code review, which reports on adherence to guidelines for C source code.
* Integrates with IBM ® Rational® solutions for model-driven development, test management, and software configuration management.
* Integrates with industry-leading third-party tools, such as Mathworks Simulink, Microsoft Visual Studio, and TI Code Composer Studio.
* Eclipse plug-in allowing for seamless integration for Runtime Analysis with the Eclipse C/C++ Development Tools (CDT)


IBM Rational Software Modeler




IBM® Rational® Software Modeler is a robust UML™ 2.0-based visual modeling and design tool.

Enables architects, systems analysts, designers and others to specify and communicate development project information from several perspectives and to various stakeholders.

* Extends Eclipse 3.3 open software development environment.
* Is easy to install and use, with flexible installation options in a single product for Microsoft® Windows® and Linux®.
* Provides rich support for modeling with the UML 2.1 as well as for creation of UML-based Domain Specific Language environments and “simplified” UML modeling environments
* Enables flexible model management for parallel development and architectural re-factoring, e.g., split, combine, compare merge models/model fragments.
* Helps ease the transition between the architecture and the code with model-to-model and model-to-code transformations, including reverse transformations.
* Allows you to apply included design patterns — and/or author your own — to ensure that conventions and best practices are followed.
* Integrates with other facets of the lifecycle, including requirements, change management and process guidance; includes Rational ClearCase® LT.
* Operating systems supported: Linux, Windows


IBM Rational Rose Data Modeler




IBM® Rational Rose® Data Modeler offers a sophisticated visual modeling environment for database application development

Accelerate your processes by connecting the database designers to the rest of the development team through a common tool and the Unified Modeling Language™ (UML™) v1.4.

Enables database designers to visualize how the application accesses the database, so problems are escalated and resolved before deployment

Enables the creation of the object models, data models and data storage models and provides the ability to map logical and physical models to flexibly evolve database designs into the application's logic

Supports round-trip engineering between the data model, object model and data definition language (DDL) file/database management system (DBMS) and offers transformation synchronization options (synchronization between data model and object model during transformation)

Offers a data model-object model comparison wizard, supports forward engineering of an entire database at a time, and integrates with other IBM Rational Software Development lifecycle tools

Provides the ability to integrate with any SCC-compliant version control system, including IBM Rational ClearCase®

Provides Web publish models and reports to improve communication across the extended team

Operating systems supported: HP Unix, Linux, Sun Solaris, Windows

Features and benefits: Rational Rose Data Modeler is a visual modeling tool that makes it possible for database designers, analysts, architects, developers and anyone else on your development team to work together, capturing and sharing business requirements, and tracking them as they change throughout the process. It provides the realization of the ER methodology using UML notation to bring database designers together with the software development team. With UML, the database designer can capture information like constraints, triggers and indexes directly on the diagram rather than representing them with hidden properties behind the scenes. Rational Rose Data Modeler gives you the freedom to transfer between object and data models and take advantage of basic transformation types such as many-to-many relationships. This tool provides an intuitive way to visualize the architecture of the database and how it ties into the application.


IBM Rational Robot




Rational Robot is a general-purpose test automation tool for functional testing of client/server applications, aimed at QA teams. It enables defect detection, includes test cases and test management, and supports multiple UI technologies.

o Provides a general-purpose test automation tool for QA teams for functional testing of client/server applications
o Lowers learning curve for testers discovering the value of test automation processes
o Enables test-automation engineers to detect defects by extending test scripts and to define test cases
o Provides test cases for common objects and specialized test cases to development environment objects
o Includes built-in test management, integrates with IBM Rational Unified Process tools
o Aids in defect tracking, change management and requirements traceability
o Supports multiple UI technologies
o Operating systems supported: Windows


IBM Rational RequisitePro




Uses advanced real-time integration with Microsoft® Word to provide a familiar environment for activities such as requirements definition and organization

Incorporates a powerful database infrastructure to facilitate requirements organization, integration, traceability and analysis

Enables detailed attribute customization and filtering to maximize the informative value of each requirement

Provides a fully functional, scalable web interface optimized for usage in a geographically distributed environment

Provides detailed traceability views that display parent/child relationships and show requirements that may be affected by upstream or downstream changes

Performs project version comparisons using XML-based project baselines

Integrates with multiple tools in the IBM Software Delivery Platform to improve accessibility, communication and traceability of requirements

Operating systems supported: Windows

Features and benefits: Software development is a team endeavor, so it is critical that team members possess a shared understanding of their project's vision, goals, specifications and requirements. But how can this be achieved when teams are geographically distributed and functionally isolated, failing to communicate with each other in a timely, clear, consistent manner? The IBM Rational RequisitePro solution addresses this need.

Rational RequisitePro is an easy-to-use requirements management tool that lets teams author and share their requirements using familiar document-based methods while leveraging database-enabled capabilities such as requirements traceability and impact analysis. The result is better communication and management of requirements, with an increased likelihood of completing projects on time, within budget and above expectations. Successful projects start with requirements management: the more effective the execution, the greater the resulting quality and customer satisfaction.
