Frank Cohen

author
Recent posts by Frank Cohen

Hi Mary: There is a lot of documentation on TestMaker. Take a look in the download at testmaker_home/docs for a tutorial. Also see http://docs.pushtotest.com for online documentation and links to the articles that have been published about using TestMaker.

Additionally, the TestMaker user community can be reached at http://lists.pushtotest.com/mailman/listinfo/users

Hope this helps.

-Frank
19 years ago
After you read it, I would appreciate your doing three things:

1) Write a reader review on Amazon.com

2) Tell 5 of your friends about Java Testing and Design.

3) Buy a copy of the book!

Thanks for your interest in Java Testing and Design.

-Frank
19 years ago
It's probably unfair of me to answer this question. It may also be better to ask the TestMaker user community the same question, since lots of people use TestMaker to test J2EE applications. The user community discusses things on users@lists.pushtotest.com.

Hope this helps.

-Frank
[ October 07, 2004: Message edited by: Frank Cohen ]
19 years ago

Talking of scripting languages, I believe the competitive matrix on your site has incorrect information; LoadRunner supports both C and Java for programming the test scripts. I also wouldn't call these "proprietary scripting languages".



I'm ready to take down that competitive matrix. At one time I thought Mercury and PushToTest were competing with each other. With the past three years' experience working with software developers, QA technicians, and IT managers, I see that the tools we use to test and monitor systems are less important than the techniques and best practices. It's like the early days of desktop publishing: the publishing software let everyone choose crappy fonts and bad layouts. I think the same is true of testing tools.

I recently met with Mercury Interactive's management to talk about software developers, testing, and open source. I am now looking into how LoadRunner can be used by developers to turn unit tests into scalability and performance tests. I can envision a jUnit2LoadRunner utility, for example. I'd love to hear your feedback on these thoughts.

-Frank
19 years ago
Hi Lasse: Jython's viability as a language has been in question for a long time. It seems to be a recurring theme. I chose Jython as the TestMaker scripting language two years ago and I've been happy with the choice. The Jython project runs two email lists (jython-users@lists.sourceforge.net and jython-dev@lists.sourceforge.net). All of the questions I have posed there have been answered by other users and the maintainers (Samuele Pedroni, Clark Updike, and others).

The developers are working on Jython 2.2, which will incorporate several changes that have happened on the Python side over the years.

The most impressive thing to me about Jython is its stability. I have run complex scalability tests of Service Oriented Architecture (SOA) with up to 20,000 concurrently running test agents written in Jython. Jython is solid.

Groovy looks like just that: it's groovy. I like the fact that Groovy compiles into Java bytecodes. This is VERY important to the Java community because it puts Java on a better competitive footing with .NET's ability to run multiple languages on its VM. Please read my blog about Java and scripting. The biggest problem for Jython and Groovy is Sun's reluctance to endorse and support scripting.

-Frank
19 years ago

At that point, you could use the archetypes all the way through the development process to design the system in the beginning and test the system in the end



Yes, archetypes help the process from start to finish. Alan Cooper introduced me to Goal-Directed Software Design back in 1997 when I was the principal architect of the Sun Community Server. His mantra is to tell software developers to stop thinking about solving every possible use of their software and instead to concentrate on solving the needs of 4-5 users. I found that the technique could be extended to software testing, and so I coined the phrase User Goal Oriented Testing (UGOT).

I was sitting in a meeting of two companies that were considering partnering with one another. The more established company asked how it could be assured that the service would always be available and offer high quality. I described 5 user archetypes to them. They had built their application under a "whatever-is-needed-now" philosophy, so having them talk about the user archetypes opened up a whole new level of considerations pertaining to both the ongoing maintenance of their software and the method for testing the service. Defining archetypes was a "win" all around.

The really cool thing about using archetypes today is the new interest software developers have in unit testing: those tests implement the archetypes' behavior and can be shared to do scalability, load, regression, and quality-of-service monitoring testing. This is the start of a new "golden age" of testing.

-Frank
19 years ago
Consider the following things when testing a Web-enabled application:

1) Who are the archetypal users? What are their goals?
2) How will I test the system for functionality? This answers the basic question: does it work?
3) Is the datacenter running the application ready for production? Will it handle the number of concurrent users I am forecasting?
4) Does the application perform functions fast enough to satisfy the archetypal users?
5) How will I monitor the service to make certain the functionality still works?

Think of the above 5 questions as chapter headers in a test plan. If you can answer these questions then you will have the plan in your hands!

Good luck.

-Frank

P.S. There is a lot of help for you on testing Web-enabled applications at PushToTest.
19 years ago
Hi Kishore: I like simplicity too, and sometimes the minimal approach works best. For example, if you are in control of the client/consumer and the service/server, then why go through all the trouble of using SOAP when an HTTP POST with some XML in the payload will work just fine? This is the attitude Java Testing and Design takes when discussing architectural and design issues. It assumes that the software developer making protocol, library, and architectural choices is under the normal amount of stress (i.e., lots!) and needs to solve a problem quickly. Java Testing and Design begins with the "Forces at Work Affecting Your Web-Enabled Software." One tenth of that chapter is about technology; the rest is about human nature.
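
To make the minimal approach concrete, here is a rough sketch in plain Java of posting an XML payload over HTTP with no SOAP stack at all. The endpoint URL and payload are invented for the example:

    import java.io.OutputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;

    public class XmlOverHttpPost {
        public static void main(String[] args) throws Exception {
            // Hypothetical endpoint; substitute your own service URL.
            URL url = new URL("http://localhost:8080/orders");
            String payload = "<order><sku>ABC-123</sku><quantity>2</quantity></order>";

            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            conn.setRequestMethod("POST");
            conn.setDoOutput(true);
            conn.setRequestProperty("Content-Type", "text/xml");

            // Write the XML document as the request body.
            OutputStream out = conn.getOutputStream();
            out.write(payload.getBytes("UTF-8"));
            out.close();

            // The HTTP status code tells us whether the service accepted the request.
            System.out.println("Response code: " + conn.getResponseCode());
            conn.disconnect();
        }
    }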

Java Testing and Design suggests practices for achieving interoperability by describing the protocols and technologies available to software developers, including Java and .NET. It then shows the typical problems that come with each choice, including interoperability and scalability problems. For example, Java developers are regularly told by Sun to use SOAP RPC encoding. The book shows that choosing SOAP RPC encoding can cut throughput 30-fold (a 3000% difference) compared to the other encoding styles. Plus, SOAP RPC encoding does not interoperate easily with .NET services.

-Frank
19 years ago
I think I fixed the comma in the signature. Thanks for letting me know. -Frank
19 years ago
Hi J.B.: I'm not an expert in Agile methods, so please take my reply with a grain of salt. I would like to learn from you about Agile Customer Testing (and Acceptance Testing), so feel free to reply.

Bret Pettichord introduced me to Agile Testing a couple of years ago. It seemed very well thought out and something that would challenge engineers to think about testing first. UGOT differs from Agile Testing in these ways:

1) Agile stresses test-first development, where a unit/functional test is built before an engineer writes the application software. UGOT says "test first" is a good idea, but it is optional. UGOT is not nearly as militant as Agile sometimes comes across.

2) UGOT gives a decision maker actionable knowledge for deciding when a service is ready for production. UGOT provides:
- Functional testing to check for regression and successful operation.
- Scalability testing for capacity planning. Scalability is stated as a function of throughput, measured at the consumer/client in requests per second, as the number of concurrent consumers/clients increases. This shows whether the system has enough capacity to serve forecasted levels of users. (See http://www.pushtotest.com/Docs/howto/onscalability2.html for how I have been plotting this for our customers; I'm open to feedback/criticism.)
- Performance testing to make sure the service meets user expectations for response times. Someone who drinks three tall mocha lattes a day isn't going to wait more than 5 seconds for their email client to check for new messages.

3) Bret described "coaching tests," a way of thinking about acceptance tests that turns user stories into tests. UGOT identifies archetypal users by their behavior when using the service. By understanding the archetypal users' behavior, we can write test agent code that drives the service as the archetypal user would. For example, the following are two archetypes for an upcoming test of a payroll service:

Payroll Data Entry Clerk (Irene)
Irene works for a paper supply company in Milwaukee, Wisconsin. She is 26, engaged to be married next spring, and an accomplished long-distance runner. Irene is responsible for managing the company and employee information for the company payroll. On a daily basis she gets questions from employees and contractors about their tax withholding; for example, how many dependents do I have on my W-2? The company runs payroll every 2 weeks, and Irene's work effort greatly increases as the next payroll date approaches.

Payroll Approver (Maggie)
Maggie works for the same paper supply company as Irene. She manages the in-house bookkeeping staff of 3 clerks, including Irene. Maggie has been with the company for the past 12 years as the financial services manager. She has two children, her husband is a small business owner, and she loves to travel. Maggie is the company's final check that payroll information is correct and that the calculated taxes in each payroll are correct. Maggie routinely updates company and employee information.

The user archetypes provide a common, down-to-earth way of discussing typical usage of a service among developers, QA techs, and IT managers. UGOT turns the archetypes into functional tests. This can be done in any framework: JUnit, LoadRunner, JMeter, Java, Jython, etc. (Of course, my preference is Jython and Java in TestMaker.)

With the right framework these functional tests are run multiple times and concurrently to test the system for scalability and performance, as the sketch below illustrates. The same functional tests can be run periodically as a Quality of Service monitor.
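
As a rough illustration, the following Java sketch drives one functional test from many concurrent threads and reports client-side throughput. The checkInbox() method is a hypothetical stand-in for any archetype-based functional test:

    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.TimeUnit;
    import java.util.concurrent.atomic.AtomicLong;

    public class ConcurrentTestDriver {
        // Hypothetical stand-in for an archetype-based functional test.
        static void checkInbox() throws InterruptedException {
            Thread.sleep(50); // simulate one request/response round trip
        }

        public static void main(String[] args) throws Exception {
            final int concurrentUsers = 50;
            final int requestsPerUser = 100;
            final AtomicLong completed = new AtomicLong();

            ExecutorService pool = Executors.newFixedThreadPool(concurrentUsers);
            long start = System.currentTimeMillis();

            for (int i = 0; i < concurrentUsers; i++) {
                pool.execute(new Runnable() {
                    public void run() {
                        for (int r = 0; r < requestsPerUser; r++) {
                            try {
                                checkInbox(); // the functional test, reused as a load test
                                completed.incrementAndGet();
                            } catch (InterruptedException e) {
                                Thread.currentThread().interrupt();
                                return;
                            }
                        }
                    }
                });
            }
            pool.shutdown();
            pool.awaitTermination(10, TimeUnit.MINUTES);

            // Throughput measured at the client, as described above.
            double seconds = (System.currentTimeMillis() - start) / 1000.0;
            System.out.println(completed.get() / seconds + " requests per second");
        }
    }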

Java Testing and Design shows why building archetypes is important to a business, how to go about doing it, and then how to repurpose the archetypes among developers, QA technicians, and IT managers.

-Frank
19 years ago


In one way, I think you should be able to ignore what technology is used to implement an application so that you can test the functionality of that application at face value. Regardless of what was used to create an application, it should still perform the same functions and those functions certainly need to be tested. That would seem to fall right in line with black-box testing.

On the other hand, I think it's only fitting that you do pay attention to the technology that was used to write the application so that you can ensure that every execution path is tested (or at least try to). That would line up nicely with white-box testing. It would be great if the black-box tests covered all possible execution paths but, unless you've got a lot of black-box tests lined up, you're probably going to need to use some white-box tests.



You've hit upon a topic that is a central thesis of Java Testing and Design: black-box and white-box testing do not produce actionable knowledge in a service environment. Black-box testing assumes that the "box" is self-contained: you are the operator of the box, and when you make a sequence of requests the box is supposed to respond the same way. That's just not how a service works. When you make a request to a service, its functionality and speed to perform the request depend on what other requests are being served concurrently. Black-box testing is fine for a simple regression test to make sure the box still performs functions correctly, but you cannot take the response and extrapolate that the function will be correct for subsequent requests.

White-box testing has a similar and deadly problem in a service environment: in my experience there is little to no chance of achieving 100% coverage in a set of test cases. With services you not only have to test the consumer/client-facing request interface but also the backend systems. This leads to test matrices with 20,000 or more test cases. For example, in a test of Web Services for General Motors we looked at a service that offered multiple encoding styles, multiple levels of functions, and different hardware configurations. In all there were 18,000 possible test cases. With each test case taking 30 minutes to set up, run, and tear down, the entire suite would take more than a year to run: 18,000 cases at 30 minutes each is 9,000 hours, about 375 days of round-the-clock testing. Few enterprises are able to test all of those cases.

Java Testing and Design puts forth User Goal Oriented Testing (UGOT). UGOT is a testing methodology that models the behavior of archetypal users of a service. In the book I show how to understand these behaviors and how to turn them into test agent code. The resulting code does a functional test of a service. The functional test is intelligent in that it makes multiple calls to the service to accomplish the user's goals. Rather than testing every function of the service, the UGOT method tests the service against the goals of the archetypal users. This has worked really well with GM, Sun, and others. The book covers three case studies. Take a look at my blog (http://www.pushtotest.com) for additional case studies that happened after I turned in the manuscript.

Anyway, I hope this gets the discussion going here. I'm open to any and all of your new ideas.

-Frank
19 years ago
Hi Guys: There is a world of difference between testing services built with Java and with .NET. For starters, most .NET developers do not understand the impact that SOAP encoding styles, XML handling methods, and coupling techniques have on functionality, scalability, and performance. Visual Studio makes these choices for the developer.

In the Java world, software developers and QA technicians are still arguing over whether Eclipse is better than NetBeans. :-) Java developers are expected to pick their favorite tools, techniques, and designs. For example, given a choice between JAXB, JAXP, Xerces, and JDOM, which is most appropriate for a service design and which will run fastest in production? Java developers have many choices to make, and each choice has a huge impact on scalability and performance.
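
To make that concrete, here is a small sketch of two JAXP parsing styles applied to the same document. DOM materializes the whole tree in memory while SAX streams events past a handler, and that choice alone changes a service's memory footprint and throughput. The input file name is invented for the example:

    import java.io.File;
    import javax.xml.parsers.DocumentBuilderFactory;
    import javax.xml.parsers.SAXParserFactory;
    import org.w3c.dom.Document;
    import org.xml.sax.Attributes;
    import org.xml.sax.helpers.DefaultHandler;

    public class ParserChoices {
        public static void main(String[] args) throws Exception {
            File xml = new File("payroll.xml"); // hypothetical input document

            // DOM: the entire document is materialized in memory.
            Document doc = DocumentBuilderFactory.newInstance()
                    .newDocumentBuilder().parse(xml);
            System.out.println("DOM root: " + doc.getDocumentElement().getTagName());

            // SAX: events stream past; memory use stays flat regardless of document size.
            final int[] elements = { 0 };
            SAXParserFactory.newInstance().newSAXParser().parse(xml,
                    new DefaultHandler() {
                        public void startElement(String uri, String localName,
                                                 String qName, Attributes attributes) {
                            elements[0]++;
                        }
                    });
            System.out.println("SAX counted " + elements[0] + " elements");
        }
    }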

Java Testing and Design is the first book to look at these issues and provide a methodology and framework to understand scalability and performance in a service environment. For example, Chapter 14 shows a huge problem in building scalable SOAP-based Web Services when the developer chooses SOAP RPC-encoding. I'll post a separate reply about this topic.

-Frank
19 years ago
Hi Jeff: Thanks for your interest in Java Testing and Design: From Unit Testing to Automated Web Tests (JTD).

JTD is the first book to make the case that delivering services (software application functionality delivered over routed networks and open protocols) requires the combined effort of software developers, QA technicians, and IT managers. In my experience those three groups rarely cooperate. It's more like they put up with one another. So, when it's time to make design decisions for a new project or to maintain an existing service, there is rarely a group discussion on the best design to use. JTD educates software developers, QA technicians, and IT managers on the available protocols, tools, and techniques for delivering services; for example, choosing between tightly coupled n-tier server architectures and loosely coupled multiple-server designs. JTD encourages a build-it-to-be-tested design philosophy. That's the "design" part of Java Testing and Design.

I hope this answered your question. Please let me know.

Thanks.

-Frank
19 years ago
Hi Lasse: I love a good technical book. Over the years I've bought a lot of them. Some have been the kind of book that is basically a detailed reference guide to a very specific topic. Others are an introduction to a new technology or a new way of doing something. In Java Testing and Design (JTD) I cover the high-level issues of testing services and also give you code-level examples.

The goal of the book is to make this case: Delivering service excellence requires a new level of cooperation between software developers, QA technicians, and IT managers. JTD defines services as any software application that is accessible over a routed network using open protocols. JTD is a book about testing services, including Web applications, Service Oriented Applications, and n-tier applications. JTD begins with testing methodology, covers protocols and architecture decisions, and delivers code-level examples in TestMaker scripts.

TestMaker is a framework and utility that is composed of a number of open-source libraries and tools. The book uses TestMaker to show you by example how to accomplish testing tasks. The examples can easily be applied to other testing tools, languages and platforms.

For example, Chapter 7 talks about the move developers made from HTTP applications through XML-RPC and into SOAP-based Web Services. The chapter shows why XML-RPC is cool and which applications are appropriate for it. It then covers the things that typically go wrong with XML-RPC, to give you some thoughts on how to test an XML-RPC application. The chapter then presents a TestMaker script showing how to make an XML-RPC call. TestMaker uses the Apache XML-RPC library, so the script also demonstrates how Apache XML-RPC could be used in a Java application.
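
Outside of TestMaker, a bare Java client built on the Apache XML-RPC 2.x client API of that era might look like the sketch below. The server URL and method name are invented for the example:

    import java.util.Vector;
    import org.apache.xmlrpc.XmlRpcClient;

    public class XmlRpcExample {
        public static void main(String[] args) throws Exception {
            // Hypothetical XML-RPC endpoint and method name.
            XmlRpcClient client = new XmlRpcClient("http://localhost:8080/RPC2");

            Vector params = new Vector();
            params.addElement(new Integer(41)); // the single method parameter

            // execute() marshals the call to XML, posts it over HTTP,
            // and unmarshals the server's reply into a Java object.
            Object result = client.execute("examples.getStateName", params);
            System.out.println("Server returned: " + result);
        }
    }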

The book breaks down into three parts. JTD starts with an understanding of why the existing testing methodologies fail to deliver excellent services; this part describes a test methodology and several techniques for measuring service excellence. The second part introduces the technologies used to deliver scalable, well-performing services, makes the case for where each is appropriately used, talks about the problems, and then shows a how-to example in code. The third part of JTD shows three case studies of how the methodology is applied to solving enterprise scalability, regression, functionality, and quality-of-service problems in information services.

I hope this answers your question and concern. Feel free to reply if it was off-the-mark.

Thanks for your interest in Java Testing and Design.

-Frank
19 years ago
Hi Dirk, Kishore, and Lasse: It's so nice of Java Ranch to invite me to be here. As anyone who has written a book knows, this isn't the kind of project that will make you rich, but hopefully it will make the world a little better. In the world of testing Web applications and Service Oriented Architecture, there is a lot of education and effort needed to make the testing world better.

I very much appreciate the warm welcome. I'll be checking and replying to messages in this forum all week. If I miss your post feel free to send me an email at fcohen@pushtotest.com.

Thanks, and I look forward to my time at Java Ranch.

-Frank
PushToTest
19 years ago