
Sunday, January 4, 2009

Web Site Quality

Web Site Quality Assurance


The term quality assurance, when applied to web sites, describes the process of enforcing quality control standards and working to improve the processes used to produce the web site and its components, infrastructure and content. When quality assurance is well implemented, a web site should see progressive improvement: a declining defect rate and a general increase in site usability and performance.

Quality assurance should function as a "voice" for the user, a reminder to the designers and developers that the site is designed for users outside the office. Quality assurance as ombudsman would be a positive force for a quality user experience.

If you are limited in what you can accept responsibility for, document those limits. For example, if you can't test data or middleware, announce that fact whenever you provide test results for the quality of the site. Even the best designed and developed sites will experience problems and failures, so a good quality assurance team should set expectations -- for the entire web site team and with management -- for what QA can effectively accomplish.
Focus on Improving Processes

The key to understanding quality assurance is understanding the emphasis on process: quality control focuses on what comes out of the web site creation process (creation, development, publication -- whatever term you prefer for the process that results in the web site). Quality assurance focuses on what goes into the creation process as well as on the process itself, with the goal of improving the quality of output by improving everything "upstream".

Quality assurance looks beyond the structured testcases used by quality control because these testcases are necessarily limited. Quality assurance focuses on more than a site's ability to meet a specific benchmark; quality assurance aims to make the site better so tests are passed more consistently, so that the benchmark can in fact be refined, and so that problem areas can be eliminated.

Quality assurance should be involved in the development process. QA should review new designs before they are finalized, with an eye toward usability and user experience factors; heading off problems before they become real improves quality immediately and reduces problems "downstream".

Quality assurance should be involved in customer service and user-support communications, especially with a commerce site, so that usability defects can be reviewed. With user input, QA can refine user scenarios to better match "real" behavior. There is no substitute for user comments.

Are the tools used to create and maintain your web site appropriate for their tasks? Can the tools be tweaked to shorten the processes or eliminate some steps? Can some tests be incorporated earlier in the creation process, such as spell checking? If your site has changing content, is the content checked after it is published, or before it is entered into the database? Quality assurance should pay attention to all of these issues.
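The idea of moving checks earlier in the process can be made concrete with a small script. Below is a minimal sketch, assuming a newline-delimited word list and a hypothetical draft file name; it illustrates a pre-publish spell check run before content enters the database, and is not a prescribed tool.

```python
import re

def load_word_list(path="words.txt"):
    # Assumes a newline-delimited word list, e.g. /usr/share/dict/words.
    with open(path) as f:
        return {line.strip().lower() for line in f}

def check_content(text, known_words):
    # Return words in the draft that are not in the word list.
    words = re.findall(r"[A-Za-z']+", text.lower())
    return sorted({w for w in words if w not in known_words})

known = load_word_list()
with open("draft_article.txt") as f:    # hypothetical draft file
    unknown = check_content(f.read(), known)
if unknown:
    print("Possible misspellings:", ", ".join(unknown))
```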

Following any major phase of work on your web site, perform a postmortem analysis: review the success of the changes, redesigns, scheduling, file transfers, etc. What could be made more efficient? Which processes could be smoothed out?
Focus on Tracking Problems

Quality assurance also involves a closer involvement with defects and their resolution. During the quality control process, problems are discovered and typically reported and handed off to the people who "own" the defective work. Quality control can be a binary process: something passes, or it fails and is "bounced" back to the team responsible for fixing it enough to be tested again.

Quality assurance catches the problems discovered through use of quality control testcases, but also finds problems uncovered through more general site reviews and ad hoc usability and consistency testing. In addition, quality assurance testing finds areas for improvements that may not be defects, but rather opportunities; user input is a great source for such "opportunities". The set of problems returned by thorough quality assurance testing is therefore larger than the set found through quality control testing. The handling of this larger set of problems is a major function of QA.

Quality assurance should log reported problems in a database of some kind, assigning properties to the problem such as the priority and scope, and recording such attributes as description, error message, affected functionality, etc. In addition, QA should assign and track ownership of the problem, and should track the progress made towards resolution of the problem. Quality assurance must take an active role in getting problems fixed; demanding schedules for the fixes, explanations for the problems, working to eliminate the type of problem in the future -- these are all common actions.
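As an illustration of such a problem log, here is a minimal sketch of a defect record. The field names, priority scale, and status values are assumptions made for the example, not a prescribed schema.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import Optional

@dataclass
class Defect:
    description: str
    affected_functionality: str
    priority: int                  # assumed scale: 1 = blocks release, 3 = cosmetic
    scope: str                     # e.g. "single page" or "site-wide"
    owner: str                     # person or team responsible for the fix
    error_message: str = ""
    reported: date = field(default_factory=date.today)
    target_fix_date: Optional[date] = None
    status: str = "open"           # assumed flow: open -> assigned -> fixed -> verified

# Example record (hypothetical defect):
bug = Defect(
    description="Checkout form rejects valid postal codes",
    affected_functionality="order submission",
    priority=1,
    scope="site-wide",
    owner="commerce team",
)
print(bug)
```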
Source: philosophe.com

Sigma-Aldrich is committed to providing quality products, so it isn't surprising that it also wants its Web site to provide a quality experience for its customers.

"Our Web site brings in millions of dollars a day. If we have any outage, we lose orders -- we lose customers," said Rich Porter, Web administrator at Sigma-Aldrich, a life science and high technology company based in St. Louis, Mo.

About 35% of Sigma-Aldrich's business is conducted via its Web site, which supports more than 100,000 page views per day. Its Internet catalog and store offer more than 130,000 products.

That's why the company, which provides biochemical and organic chemical products and kits used in scientific and genomic research, biotechnology, pharmaceutical development and the diagnosis of disease, continues to evolve and enhance its Web site for usability and a quality experience for its customers. Predeployment automated performance testing is a key component of ensuring a good customer experience, and it will be particularly important as the company transitions its Web site to the IBM WebSphere platform. Its tool of choice is Borland's SilkPerformer for automated load, stress and performance testing.

Porter chronicled the Web site's approximately nine-year evolution, which started as a Lotus Notes platform and then moved to the Haht application server from Haht Software, now owned by Global Exchange Services (GXS). Sigma-Aldrich is now in the process of migrating to a WebSphere environment. The goal is to be able to scale the site to accommodate the company's growth, and reliability and stability are critical, he said.

Sigma-Aldrich has also gone through several evolutions in its quest to do predeployment performance testing, which the company has been doing for about six years, Porter said.

"Our first attempt at load testing, we chose a vendor that gave a good demo, a good sales pitch, was reasonably priced, and had a development team in St. Louis," he said. "But when we started using it, we found the interface difficult to use and the application was buggy. We had their developers here to resolve issues, but after six months we gave it up."

Fortunately, the company's second go at performance testing with a different product went much better, Porter said, but after a few years he was asked to reevaluate the product landscape. "I reevaluated about five vendors, including the one we were using, and SilkPerformer came out on top. It has proven to be more flexible, and I like the interface better. It's also a lot easier to customize the scripts," he said.

Also, he added, "One of my pet peeves is support, and their [Borland's] support is probably the best I've seen. Their software is good -- one problem we had was fixed by an update, and we've had no problems after that. It's very reliable, not buggy."

That's important to Porter: "I have scars from the first software we used."

Performance during peak times
Sigma-Aldrich has been using SilkPerformer for about a year and a half. Porter and a quality assurance person are involved with the testing. "We have peak times we have to be conscious about," he said. The company has experienced a lot of growth from the Pacific Rim and Europe, he said, so when U.S. customers start coming on in the morning the site receives heavy traffic.

"Mornings are peak times," he said. "We have a lot of customers doing searches, ordering, and there's a lot of researching, so we have to be conscious about that time. Borland has made it easy to script for different applications very quickly, and test those apps along with the current environment to see if they have any impact."

For example, he said, the Sigma site provides a Java tool that researchers can use to draw chemical compounds and submit requests based on that image. "That's very CPU-intensive; it could take several minutes to come back based on options the researcher uses," Porter said. His job is to determine if those requests may impact other customers ordering or searching.

"We constantly have folks coming in and doing searches on a large variety of products on the Web site to get pricing and availability information," Porter said. "That information is live, so we're pulling from SAP systems. It's real-time data, so we have to make sure no one is being impacted by anything anyone is doing on the Web site."

Now the company is at the beginning of its WebSphere migration, and it is relying on SilkPerformer throughout that process.

"We do have WebSphere available to limited customers," Porter said. "SilkPerformer is helping to prove the WebSphere environment can handle, say, three times the load the current site can without an impact on the servers or performance."

WebSphere is available to B2B customers, he said, and Sigma hopes to expand that to its public customers going forward.

Sigma-Aldrich as a company is committed to quality and quality assurance systems, and states on its Web site that: "Every employee at Sigma-Aldrich is dedicated to defect-free work, following established procedures, and delivering products and services that are world-class."

Predeployment performance testing is in keeping with that quality commitment.

"It improves the quality of our Web site by improving reliability and performance," Porter said. "If we provide a good customer experience at the Web site then the customer overall has a better idea and sense that Sigma provides quality products and a quality experience. That's how SilkPerformer helps, in improving the overall user experience."
searchsoftwarequality.techtarget.com/

Sunday, December 21, 2008

Ways to Improve Your Website Navigation & SEO

4 Easy Ways to Improve Your Website Navigation & SEO


Posted on December 19th, 2008
by admin in Internet Marketing

Navigation is an extremely important part of your website that helps users find the content they’re after and informs search spiders about the structure of your website. In order to improve the usability and ranking of your site, it’s important to keep both users and SEO in mind when deciding on the structure of your website.

Here are my top 4 (Google-approved) ways to improve your site's navigation:

1. Create a clear structure for your content
Unless your site only has a handful of pages, it’s a good idea to group related content under a subfolder on your site. If you have an online store for example, your products could be structured in the following manner:

/home/
/articles/
/about/
/online-shop/
    /shoes/
        /mens-shoes/
        /womens-shoes/

2. Use text (not images) for navigation
Navigation is going to include some of your most important keywords, so you want Google to index these, right? Google has a hard time making sense of images, so make sure your site is easy to crawl by using text for navigation!

3. Use “Breadcrumb” navigation
Breadcrumb navigation helps show visitors where they are on your website and is also another chance to include important keywords on your pages. These links are usually found at the top or bottom of pages and generally look like this:

DVD Store > Comedy > Top 10 Movies
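As a rough illustration, a breadcrumb trail of this form can be generated directly from a page's URL path. The sketch below assumes hyphen-separated folder names and a hypothetical site name.

```python
def breadcrumbs(path, site_name="DVD Store"):
    # Turn "/comedy/top-10-movies/" into "DVD Store > Comedy > Top 10 Movies".
    parts = [p for p in path.strip("/").split("/") if p]
    titles = [site_name] + [p.replace("-", " ").title() for p in parts]
    return " > ".join(titles)

print(breadcrumbs("/comedy/top-10-movies/"))
# DVD Store > Comedy > Top 10 Movies
```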

4. Use an HTML and XML sitemap
An HTML sitemap is just a simple page that contains links to all your major pages. This can be helpful to users if they become lost or are looking for specific content on your site. An XML sitemap is useful for search engines and helps inform the search spider about the structure of your site. If you don't have an XML sitemap, visit XML Sitemaps to create one for free!
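If you would rather generate the XML sitemap yourself, the following minimal sketch shows the basic shape of the format. The URL list is a placeholder; a real site would pull its URLs from a content database or a crawl.

```python
from xml.sax.saxutils import escape

URLS = [
    "https://example.com/",
    "https://example.com/articles/",
    "https://example.com/online-shop/",
]

def make_sitemap(urls):
    # Emit one <url> entry per page in the standard sitemap format.
    entries = "\n".join(f"  <url><loc>{escape(u)}</loc></url>" for u in urls)
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            f"{entries}\n"
            "</urlset>")

with open("sitemap.xml", "w") as f:
    f.write(make_sitemap(URLS))
```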

Why have major search engines changed their ranking criteria, and why is it difficult to get to the top?

It has become increasingly difficult to get to the top of the major search engines in recent years as the likes of Google and Yahoo! have changed their criteria for how they rank websites. Many old techniques such as metatags and reciprocal links have lost their effectiveness. However, despite these changes, article submission remains one of the most effective ways to gain incoming links that will boost your website’s popularity among the major search engines.
While there are many other aspects to search engine optimization (SEO), article submission can be a cheap and effective way to help you get that coveted spot on the first page of the search engine results for keywords relating to your business.
First let's look at how article marketing works in relation to improving SEO. Basically, every time a link to your website is published on another website your popularity with the search engines increases. These links are known as backlinks. In every marketing article you submit to an article directory you can include a resource box with information about you or your product, and a backlink to your website.
So, even just by submitting to article directories you are already increasing the number of backlinks to your website. However, the real value from submitting these articles is that they can be picked up and published anywhere on the web. Every time an article is published you get another backlink to your website. If your articles are published enough times you could see your search engine page ranking rocket. Bear in mind, though, that the effectiveness of your campaign will depend on where the articles appear. A backlink from a website that itself has a high search engine page ranking will carry more weight than a backlink from a site with a poor ranking.
Keywords vs quality
Carefully choosing keywords is an essential part of article marketing for a number of reasons. First, you want your articles to be easily found in article directories. If they remain buried under similar articles they will never be published on other websites and your efforts will have been wasted.
Also, you want your articles to get good results on the search engines themselves. If your articles find their way onto the front pages of the search engines you will increase exposure to your website or product, and increase your reputation as an article writer.
However, there is more to successful article marketing than keywords. There has been an ongoing difference of opinion among SEO experts as to whether the quality of articles can be sacrificed for the sake of high keyword density within the text. While it is important to put keywords in the text to help with SEO, quality content is king when it comes to article marketing.
The main aim of article marketing is to get your articles published on as many websites as possible. Quality content is at a premium on the internet so if you can provide good articles that people want to read you will be successful. If you stuff your articles with keywords websites will not publish them and readers will rarely read beyond the first paragraph. So, keep keywords to about two per every 100 words of text and do not let them disrupt the flow and the meaning of the article.
Know your neighbor
The most successful article marketers are the ones who gain a reputation as experts in their chosen field, so try to find your niche and work to your strengths. If you write authoritatively and enthusiastically about a subject you know well, you are much more likely to attract readers.
If, for example, you have a website that sells Mac software you could write about new releases and trends in this field. However, you could also boost your profile by becoming involved in the general online Mac community by posting on forums or even creating your own blog. These can provide extra opportunities for creating backlinks and driving traffic to your website, and as you become better known in your field you will find your articles are used more often and more widely.
New Web 2.0 social bookmarking websites are also giving article marketers new opportunities for improving SEO and increasing traffic to their websites. If you come up with an article you think could be very popular on the web you could publish it on your website and submit it to websites like Digg and Reddit. If it gains popularity on these sites it could go viral and bring thousands of hits to your site.
Article marketing offers excellent opportunities to promote your product while improving the SEO of your website. Even if you do not have the time or the will to write articles yourself, you should look into having a freelance writer doing the job for you.
For more useful tips & hints, please browse for more information at our websites: www.article-promotion-course.com, www.articlemarketing.infozabout.com



Are you looking for more traffic and viewers for your pages?


Well, if you’re writing or selling anything on the internet, this has an obvious answer…

Traffic to your pages is really all about having high rankings in the search engines. What this means to you is that any prospect typing your keyword(s) into any search engine’s search bar will be served your link on the first page of results (hopefully).

To direct traffic your way and to generate sales, you’re going to have to do some sharp analysis of the quality, organization and topic relevance of your website pages.

So let's start with a quick overview of just five simple ways to start ranking higher using better search engine optimization (SEO) tactics. First things first…

1. Prepare An Outline For Your Page Content Before You Write It-

For website success, you MUST stay tightly organized and be very resourceful. Search Google using keywords that pertain to your niche and look for sites that have these features. When a site looks clean and orderly, make sure you’ve got the tools in place to help you validate your suspicions. For instance:

Download Sparky, Alexa's toolbar gadget for your Firefox browser. Sparky installs an unobtrusive menu at the bottom right of your browser revealing the Alexa rankings (traffic trends, reach meter & traffic rank) of the pages you are currently viewing. "What, you don't work with Firefox?!"

Firefox is a browser that's incredibly useful for online marketers because there are many widgets and tools compatible with it to assist you in your marketing research.

2. Be Niche Specific-

The more you can carve out a little niche space within a larger industry, the better your chances of achieving higher ranks for that particular market. It’s equally important to find a niche that’s large enough to be profitable for your time & efforts though. Research is the key here and I’ll explore this subject with you at another time.

3. Estimate Your Search Traffic-

One of the most important aspects of SEO is knowing just how much traffic may be searching for your particular niche. Several good keyword tools are available to help you determine which keywords you’ll need to target throughout your webpages to get higher ranks. Keyword Spy is an advanced research engine from the leaders in keyword research technology. Increase your revenues by finding the most profitable keywords. Know your competitor’s online marketing strategies by spying on their choice of keywords! I won’t build a website without this tool.

4. Use Compelling Keywords-

Use the most powerful keywords in the title of your pages and in the meta tags, and scatter them throughout your pages as well. Your copy should feature about one to two uses of the particular keyword per paragraph. Aim for a keyword density between two and four percent per page and no greater, or search engines may consider it spam and overly promotional.
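The density guideline above is easy to check mechanically. Here is a minimal sketch, assuming a single-word keyword and a hypothetical page.txt file holding the page copy.

```python
import re

def keyword_density(text, keyword):
    # Percentage of words on the page that are the keyword.
    words = re.findall(r"[A-Za-z']+", text.lower())
    hits = sum(1 for w in words if w == keyword.lower())
    return 100.0 * hits / max(len(words), 1)

density = keyword_density(open("page.txt").read(), "shoes")
if not 2.0 <= density <= 4.0:
    print(f"Density {density:.1f}% is outside the 2-4% range")
```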

5. Develop A Long Term Plan-

Building a marketing plan & pages without a clearly defined end purpose in sight is like driving at night without the headlights on. You need a clear set of goals your site has been created to meet, and a business plan to give direction toward achieving them.

Wednesday, January 23, 2008

The WebSite Quality Challenge

Dr. Edward Miller
eValid


ABSTRACT

Because of its possible instant worldwide audience, a WebSite's quality and reliability are crucial. The very special nature of Web applications and WebSites poses unique software testing challenges. Webmasters, Web application developers, and WebSite quality assurance managers need tools and methods that can match up to the new needs. Mechanized testing via special-purpose Web testing software offers the potential to meet these challenges.
INTRODUCTION

WebSites are something entirely new in the world of software quality! Within minutes of going live, a Web application can have many thousands more users than a conventional, non-Web application. The immediacy of the Web creates an immediate expectation of quality and rapid application delivery, but the technical complexities of a WebSite and variances among browsers make testing and quality control more difficult, and in some ways more subtle. Automated testing of WebSites is both an opportunity and a challenge.
DEFINING WEBSITE QUALITY & RELIABILITY

A WebSite is like any piece of software: no single, all-inclusive quality measure applies, and even multiple quality metrics may not suffice. Yet, verifying user-critical impressions of "quality" and "reliability" takes on new importance.

Dimensions of Quality. There are many dimensions of quality, and each measure will pertain to a particular WebSite in varying degrees. Here are some of them:

* Time: WebSites change often and rapidly. How much has a WebSite changed since the last upgrade? How do you highlight the parts that have changed?

* Structural: How well do all of the parts of the WebSite hold together? Are all links inside and outside the WebSite working? Do all of the images work? Are there parts of the WebSite that are not connected? (A minimal link-check sketch follows this list.)

* Content: Does the content of critical pages match what is supposed to be there? Do key phrases exist continually in highly-changeable pages? Do critical pages maintain quality content from version to version? What about dynamically generated HTML pages?

* Accuracy and Consistency: Are today's copies of the pages downloaded the same as yesterday's? Close enough? Is the data presented accurate enough? How do you know?

* Response Time and Latency: Does the WebSite server respond to a browser request within certain parameters? In an E-commerce context, what is the end-to-end response time after a SUBMIT? Are there parts of a site that are so slow the user declines to continue working on it?

* Performance: Is the Browser-Web-WebSite-Web-Browser connection quick enough? How does the performance vary by time of day, by load and usage? Is performance adequate for E-commerce applications? Taking 10 minutes to respond to an E-commerce purchase is clearly not acceptable!
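To make the structural checks concrete, here is a minimal link-check sketch using only the Python standard library. The start URL is a placeholder, and a production tool would also handle redirects, timeouts, and crawl depth.

```python
import urllib.request
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        # Collect the href of every anchor tag on the page.
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def check_links(page_url):
    html = urllib.request.urlopen(page_url).read().decode("utf-8", "replace")
    parser = LinkParser()
    parser.feed(html)
    for link in parser.links:
        target = urljoin(page_url, link)
        try:
            status = urllib.request.urlopen(target).status
        except Exception as err:          # broken link, bad scheme, timeout...
            status = err
        print(target, status)

check_links("https://example.com/")       # placeholder start URL
```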

Impact of Quality. Quality is in the mind of the user. A poor-quality WebSite, one with many broken pages and faulty images, with Cgi-Bin error messages, etc., can cost a company in poor customer relations, lost corporate image, and even lost revenue. Very complex WebSites can sometimes overload the user.

The combination of WebSite complexity and low quality is potentially lethal to an E-commerce operation. Unhappy users will quickly depart for a different site! And they won't leave with any good impressions.
WEBSITE ARCHITECTURE

A WebSite can be complex, and that complexity -- which is what provides the power, of course -- can be an impediment in assuring WebSite Quality. Add in the possibilities of multiple authors, very-rapid updates and changes, and the problem compounds.

Here are the major parts of WebSites as seen from a Quality perspective.

Browser. The browser is the viewer of a WebSite and there are so many different browsers and browser options that a well-done WebSite is probably designed to look good on as many browsers as possible. This imposes a kind of de facto standard: the WebSite must use only those constructs that work with the majority of browsers. But this still leaves room for a lot of creativity, and a range of technical difficulties.

Display Technologies. What you see in your browser is actually composed from many sources:

* HTML. There are various versions of HTML supported, and the WebSite ought to be built in a version of HTML that is compatible with the browsers it targets. And this should be checkable.

* Java, JavaScript, ActiveX. Obviously JavaScript and Java applets will be part of any serious WebSite, so the quality process must be able to support these. On the Windows side, ActiveX controls have to be handled as well.

* Cgi-Bin Scripts. This is the link from a user action of some kind (typically from a FORM submission or otherwise directly from the HTML, and possibly also from within a Java applet). All of the different types of Cgi-Bin scripts (Perl, awk, shell scripts, etc.) need to be handled, and tests need to check "end to end" operation. This kind of "loop" check is crucial for E-commerce situations.

* Database Access. In E-commerce applications you are either building data up or retrieving data from a database. How does that interaction perform in real-world use? If you give it "correct" or "specified" input, does the result produce what you expect?

Some access to information from the database may be appropriate, depending on the application, but this is typically found by other means.

Navigation. Users move to and from pages, click on links, click on images (thumbnails), etc. Navigation in a WebSite often is complex and has to be quick and error free.

Object Mode. The display you see changes dynamically; the only constants are the "objects" that make up the display. These aren't real objects in the OO sense, but they have to be treated that way. So, the quality test tools have to be able to handle URL links, forms, tables, anchors, and buttons of all types in an "object-like" manner so that validations are independent of representation.

Server Response. How fast the WebSite host responds influences whether a user (i.e. someone on the browser) moves on or continues. Obviously, Internet loading affects this too, but this factor is often outside the Webmaster's control, at least in terms of how the WebSite is written. Instead, it seems to be more an issue of server hardware capacity and throughput. Yet, if a WebSite becomes very popular -- this can happen overnight! -- loading and tuning are real issues that often are imposed -- perhaps not fairly -- on the Webmaster.

Interaction & Feedback. For passive, content-only sites the only issue is availability, but for a WebSite that interacts with the user, how fast and how reliable that interaction is can be a big factor.

Concurrent Users. Do multiple users interact on a WebSite? Can they get in each other's way? While WebSites often resemble conventional client/server software structures, with multiple users at multiple locations a WebSite can be much different, and much more complex, than a conventional application.
ASSURING WEBSITE QUALITY AUTOMATICALLY

Assuring WebSite quality requires conducting sets of tests, automatically and repeatably, that demonstrate required properties and behaviors. Here are some required elements of tools that aim to do this.

Test Sessions. Typical elements of tests involve these characteristics:

* Browser Independent. Tests should be realistic, but not be dependent on a particular browser, whose biases and characteristics might mask a WebSite's problems.

* No Buffering, Caching. Local caching and buffering -- often a way to improve apparent performance -- should be disabled so that timed experiments are a true measure of the Browser-Web-WebSite-Web-Browser response time.

* Fonts and Preferences. Most browsers support a wide range of fonts and presentation preferences, and these should not affect how quality on a WebSite is assessed or assured.

* Object Mode. Edit fields, push buttons, radio buttons, check boxes, etc. All should be treatable in object mode, i.e. independent of the fonts and preferences.

Object mode operation is essential to protect an investment in tests and to assure tests' continued operation when WebSite pages change. When buttons and form entries change location -- as they often do -- the tests should still work.

When a button or other object is deleted, that error should be sensed! Adding objects to a page clearly implies re-making the test.

* Tables and Forms. Even when the layout of a table or form varies in the browser's view, tests of it should continue independent of these factors.

* Frames. Windows with multiple frames ought to be processed simply, i.e. as if they were multiple single-page frames.

Test Context. Tests need to operate from the browser level for two reasons: (1) this is where users see a WebSite, so tests based in browser operation are the most realistic; and (2) tests based in browsers can be run locally or across the Web equally well. Local execution is fine for quality control, but not for performance measurement work, where response time including Web-variable delays reflective of real-world usage is essential.
WEBSITE VALIDATION PROCESSES

Confirming validity of what is tested is the key to assuring WebSite quality -- and is the most difficult challenge of all. Here are four key areas where test automation will have a significant impact.

Operational Testing. Individual test steps may involve a variety of checks on individual pages in the WebSite:

* Page Quality. Is the entire page identical with a prior version? Are key parts of the text the same or different?

* Table, Form Quality. Are all of the parts of a table or form present? Correctly laid out? Can you confirm that selected texts are in the "right place"?

* Page Relationships. Are all of the links a page mentions the same as before? Are there new or missing links?

* Performance, Response Times. Is the response time for a user action the same as it was (within a range)?

Test Suites. Typically you may have dozens or hundreds (or thousands?) of tests, and you may wish to run tests in a variety of modes:

* Unattended Testing. Individual and/or groups of tests should be executable singly or in parallel from one or many workstations.

* Background Testing. Tests should be executable from multiple browsers running "in the background" [on an appropriately equipped workstation].

* Distributed Testing. Independent parts of a test suite should be executable from separate workstations without conflict.

* Performance Testing. Timing in performance tests should be resolved to 1-millisecond levels; this gives a strong basis for averaging data. (A timing sketch follows this list.)

* Random Testing. There should be a capability for randomizing certain parts of tests.

* Error Recovery. While browser failure due to user inputs is rare, test suites should have the capability of resynchronizing after an error.
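As a rough sketch of the timing requirement above, the following measures response times with sub-millisecond resolution and reports simple aggregate statistics. The target URL is a placeholder, and a real tool would drive a full browser rather than bare HTTP requests.

```python
import time
import urllib.request

def timed_request(url):
    # perf_counter gives sub-millisecond resolution on modern platforms.
    start = time.perf_counter()
    urllib.request.urlopen(url).read()
    return (time.perf_counter() - start) * 1000.0   # elapsed milliseconds

samples = [timed_request("https://example.com/") for _ in range(10)]
print(f"mean {sum(samples) / len(samples):.1f} ms, "
      f"min {min(samples):.1f} ms, max {max(samples):.1f} ms")
```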

Content Validation. Apart from how a WebSite responds dynamically, the content should be checkable either exactly or approximately. Here are some ways that should be possible:

* Structural. All of the links and anchors match with prior "baseline" data. Images should be characterizable by byte-count and/or file type or other file properties.

* Checkpoints, Exact Reproduction. One or more text elements -- or even all text elements -- in a page should be markable as "required to match".

* Gross Statistics. Page statistics (e.g. line, word, and byte counts, checksums, etc.) should be comparable against a baseline. (A small sketch follows this list.)

* Selected Images/Fragments. The tester should have the option to rubber-band sections of an image and require that the selected region match during a subsequent rendition. This ought to be possible for several images or image fragments.
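The "gross statistics" check above can be sketched in a few lines. The file names are illustrative; baseline.html stands in for yesterday's captured copy of a page.

```python
import hashlib

def page_stats(html):
    # Simple "gross statistics" for a page: counts plus a checksum.
    return {
        "lines": html.count("\n") + 1,
        "words": len(html.split()),
        "bytes": len(html.encode("utf-8")),
        "checksum": hashlib.md5(html.encode("utf-8")).hexdigest(),
    }

baseline = page_stats(open("baseline.html").read())   # yesterday's copy
today = page_stats(open("today.html").read())         # today's copy
for key in baseline:
    if baseline[key] != today[key]:
        print(f"{key} changed: {baseline[key]} -> {today[key]}")
```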

Load Simulation. Load analysis needs to proceed by having a special-purpose browser act like a human user. This assures that the performance-checking experiment indicates true performance -- not performance under simulated but unrealistic conditions.

Sessions should be recorded live or edited from live recordings to assure faithful timing. There should be adjustable speed up and slow down ratios and intervals.

Load generation should proceed from:

* Single Browser. One session played on a browser with one or multiple responses. Timing data should be put in a file for separate analysis.

* Multiple Independent Browsers. Multiple sessions played on multiple browsers with one or multiple responses. Timing data should be put in a file for separate analysis. Multivariate statistical methods may be needed for a complex but general performance model. (See the sketch after this list.)

* Multiple Coordinated Browsers. This is the most-complex form -- two or more browsers behaving in a coordinated fashion. Special synchronization and control capabilities have to be available to support this.
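For the multiple-independent-browsers case, here is a minimal sketch that runs several simulated sessions in parallel and writes timing data to a file for separate analysis. The URL and session count are placeholders, and real load tools replay full recorded browser sessions rather than single HTTP requests.

```python
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

def session(session_id, url="https://example.com/"):
    # One simulated "browser" session: a single timed page fetch.
    start = time.perf_counter()
    urllib.request.urlopen(url).read()
    return session_id, (time.perf_counter() - start) * 1000.0

# Run five independent sessions in parallel.
with ThreadPoolExecutor(max_workers=5) as pool:
    results = list(pool.map(session, range(5)))

# Write timing data to a file for separate analysis.
with open("timings.csv", "w") as f:
    f.write("session,ms\n")
    for sid, ms in results:
        f.write(f"{sid},{ms:.1f}\n")
```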

SITUATION SUMMARY

All of these needs and requirements impose constraints on the test automation tools used to confirm the quality and reliability of a WebSite. At the same time they present a real opportunity to amplify human tester/analyst capabilities. Better, more reliable WebSites should be the result.