I have cleared SCJP, SCJD, SCWCD and SCBCD, and I used the respective Whizlabs simulator for each exam (except SCJD, for which it is not relevant). I took the SCJP after almost 10 years in Java, on a friend's recommendation. Little did I know how hard and tricky the exam was, and arrogantly enough, I was under the impression that I could easily clear it with minimal preparation. Just a week prior to the exam I got a little curious about it and did some browsing on the ranch. I was shocked!
Whizlabs SCJP5.0: When I realized what the exam was about, I quickly purchased the SCJP5.0 simulator from Whizlabs. I immediately took the diagnostic exam and failed miserably (this seems to be a Whizlabs pattern, and I will come back to it later). Now I panicked. I spent the rest of the week vigorously testing myself with the simulator, trying to rescue my sorry behind from failing the exam.
I had no experience with Whizlabs products prior to that, and the SCJP simulator seemed to really help me. Many of the questions, especially the code snippets, seemed very messy to me at the time - but I figured this was intentional, with the aim of preparing me for messy code snippets in the actual exam (which they really were). I was used to neat, properly indented code, and reading the simulator's mess was a tough experience for me. However, I swallowed that pill, and it really helped me accept messier code than I was used to. Other than that, there were some annoying issues with the Flash UI.
For instance, whenever I dragged the bookmarks window (Whizlabs allows you to enter your own comments as bookmarks), it would erase everything I had written in it and I had to start over. I quickly learned to initialize it by typing some arbitrary text, saving it, and only then doing whatever I wanted with it. After the first save it was less 'touchy' about dragging.
Another issue was that scrolling (with the mouse wheel) got very slow, and it took me ages to scroll down and review an entire question. I had to read questions, and choose answers, located in different parts of the screen that I could not see together. For the answers, this is hard enough. For the questions, it is simply intolerable! I could not find a trick for that.
All in all, I was pleased with the SCJP5.0 simulator and I felt prepared. I was not! A week is hardly enough preparation for SCJP (any version). On the day of the exam, my Prometric center held exams for many vendors. I was in a classroom full of Microsoft, Cisco and Oracle candidates taking different exams on the same day. Bill's people went home after 20 minutes (wow, that's got to be a tough exam), Cisco followed after 40. The Oracle candidates sweated for almost an hour and a half. I, on the other hand, could only call it a day 1.5 hours after the last Oracle guy left. I did not do too well, either. I cleared the SCJP exam with a mark of 66. Shame on me. 10 years in Java and I barely passed. This taught me my lesson, and I stopped underestimating Sun's certifications.
All in all, the Whizlabs SCJP5.0 simulator is worth its price, and I can say I do not regret purchasing it. My low grade is entirely my fault, and the simulator really prevented me from failing completely (I almost wish I had - now I have to live with that horrible score). It took me three more months to finish part 1 and part 2 of the SCJD, and then I moved on to SCWCD.
Whizlabs SCWCD5.0: In January 2009, I was amazed by the high SCJD standards, and having learned my lesson from SCJP, I planned more than a month of study for SCWCD. I wanted to clear it with a nice mark this time. Naturally, I took the newly purchased simulator's diagnostic exam first. I failed miserably again (I already said this is a pattern). Now, this was not fair. I had been writing servlets for a living since 2002, and I am not too bad at it either. It was an extremely and disproportionately hard exam. I confirmed this after the real exam, which was not nearly as hard. It seems to me this is some kind of marketing gimmick designed to depress average candidates, to convince them they don't know their elbows from their back ends and are in desperate need of the product.
Well, let me tell you the bottom line: I cleared SCWCD with 82. This was on account of the SCWCD5.0 simulator - but not because it was good. It was a nightmare! It is one hell of a messy, buggy and annoying application. I had to verify everything against the Servlet specs. That is why I got so good: I could not trust the crappy thing for anything it told me, so I had to look up every statement it made in the specs and API docs. It is full of errors, careless mistakes, bad English and indentation issues, and it is clear that it never passed even minimal QA.
At some point I started to notice the simulator was marking me wrong for correct answers. Amazingly, even the final simulator exam markings are wrong: correct answers are marked wrong. Not only that, in many cases the question you are asked and the choices provided have nothing to do with one another and are taken from different exam objectives. I am not even mentioning misplaced or forgotten quotes in code snippets (as if this were SCJP again) and extremely messy code.
It turns out the messy code is some kind of Flash problem and not really intentional (as I believed with SCJP5.0). There are cases where you are asked to mark two (out of 4) invalid choices - and only one is provided (and vice versa). In other cases, choices get entangled together, so if you mark 'C' the simulator thinks you marked 'D'. You can actually see the CDATA XML tags get messed up in the GUI. These issues, varying in severity, are so abundant that I quickly lost the ability to evaluate my degree of preparation for the exam.
I did not get true markings, the explanations I got were for the wrong problems, I was marked wrong for right and right for wrong, and I was terrified of the exam (which covered material I had been working with for years). The real exam was not easy! I used 2.5 hours out of the 3 provided, and I was under the impression I was failing the whole time. This time, however, I found the real exam questions a bit different in style than what I was used to from the simulator. The real exam was nevertheless much easier than the Whizlabs SCWCD5.0 diagnostic exam. Also, Sun's drag-and-drop questions are much easier to handle (from a usability point of view) than the Whizlabs simulator's D&D questions, which do not really get dragged to where you expect them to and in most cases cannot be cleared once dragged.
After the exam, I was pleased with the results, but I was really mad about the waste of time with SCWCD5.0. I took two hours of my time and made a collection of 'marvelous' screenshots reflecting the most obvious Whizlabs bugs. I sent this little hall of shame to Whizlabs customer support. I was blowing off steam - I did not expect an answer. Surprisingly, I got a very respectful answer from Whizlabs customer support. They said they were very ashamed. They said they were taking the product off the shelves and replacing it with an online web simulator in which all these issues are fixed, planned for launch the first week of March 2009. It is now the last week of March. This horrible application is still available for $75, and I cannot find any news regarding a new version on the Whizlabs site. They did, however, also offer me a full refund.
Whizlabs SCBCD5.0: I cashed in on the refund offer, but rather than cash, I asked Whizlabs for an activation number for their SCBCD5.0 simulator. They kindly agreed. This time I was not expecting much from the simulator, so I did some prior preparation with the Sun J2EE tutorial (which is, surprisingly for Sun, about the most boring document I have ever read in my life) and the excellent Wiley book Mastering Enterprise JavaBeans 3.0 (free, with source code for many J2EE examples). It turned out I was right not to count on this simulator.
This time, I employed a different tactic. I started by taking short adaptive tests with the simulator - one for each objective. Adaptive, learning-mode exams in the SCBCD5.0 simulator are short (12 - 21 questions), and the neat thing is that, due to some bug in the application, each test produces a 'finalexam_error.xml' file on the desktop with all the explanations, choices and answers. I used this to quickly copy and paste an extensive list of revision notes per objective. Other than this, the adaptive tests had a few minor bugs, such as questions asking you to choose 'incorrect' statements while marking you on 'correct' statements. Annoying, but minor.
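For the curious, that copy-and-paste step could be automated instead of done by hand. Here is a minimal sketch in Python; be warned that the element names ('item', 'question', 'answer', 'explanation') are purely my guess at the file's structure - inspect your own finalexam_error.xml and adjust the tags accordingly.

```python
# Hypothetical sketch: turn the dumped exam XML into per-question revision
# notes. The tag names below are assumptions, NOT the documented Whizlabs
# schema -- adjust them to match the actual finalexam_error.xml contents.
import xml.etree.ElementTree as ET

def extract_notes(xml_text):
    """Return one 'Q / A / Why' note string per <item> element."""
    root = ET.fromstring(xml_text)
    notes = []
    for item in root.iter("item"):
        q = item.findtext("question", default="").strip()
        a = item.findtext("answer", default="").strip()
        e = item.findtext("explanation", default="").strip()
        notes.append("Q: %s\nA: %s\nWhy: %s" % (q, a, e))
    return notes

# A made-up sample in the assumed format, standing in for the real file.
SAMPLE = """<exam>
  <item>
    <question>Which annotation marks a stateless session bean?</question>
    <answer>@Stateless</answer>
    <explanation>See the EJB 3.0 spec on session bean components.</explanation>
  </item>
</exam>"""

if __name__ == "__main__":
    for note in extract_notes(SAMPLE):
        print(note)
```

In real use you would read the desktop file with `open(...)` instead of the SAMPLE string and write the notes out grouped by objective.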
When I had covered all objectives and reviewed my notes, I took the diagnostic test. What can I tell you... I failed miserably again. Only this time, the test was three times as hard. In most cases you are asked to mark 2 to 3 'incorrect' choices out of 4, or 1, 2 or 3 'correct' choices out of 4. This is for questions where not only is the objective unclear and the questions don't match the choices, but the English is bad and the grammar is wrong as well (sometimes words are missing or misplaced, as if you have to complete the question like in 5th grade). Adding insult to injury, you once again have to deal with wise-ass questions requiring you to choose 'incorrect' choices but grading you on 'correct' answers. Later on, the explanation tells you something like 'Choice A is correct because a, b, c...'. Now go figure out whether choice 'A' is correct because it is the right answer or because it is an incorrect answer.
I cannot express enough my discontent with educational training software that asks candidates for 'incorrect' choices and provides a mixture of correct and incorrect choices. This creates such a mess in the candidate's head that they may never really learn right from wrong, and may just accidentally get stuck with the incorrect statement in memory. This, of course, is only under the assumption that the training program 'knows' the right choices from the wrong ones. Now imagine what happens when the candidate is not sure which answer is correct - but the software insists wrong is right, and on top of that phrases the reasoning as: 'choices A, B and D are correct and therefore choice C is the correct answer' or 'Choice C is incorrect and therefore the correct answer'. Now even candidates who knew the right answer in the past can no longer be 100% confident. It's hard to imagine - but this is exactly what's going on in SCBCD5.0. Really ridiculous.
My breaking point came after at least 4 separate incidents in which explanations for questions were simply wrong and, despite referencing the 'EJB 3.0 specs', gave wrong, misleading answers. The SCBCD5.0 simulator also has an annoying tendency to refer you to resources that are simply unavailable, such as Sun training courses (URL: http://www.sun.com/training/catalog/courses/CX-310-091.xml, for example) and Whizlabs proprietary resources (which you have to purchase) - no thanks!
I decided not to get aggravated anymore over buggy software. After 10 years in the industry, worldwide, I know all too well the ridiculous process that leads to these kinds of products. A former Sun or educational expert makes a list of questions and answers; this goes by e-mail, in a Notepad text file, directly to the Flash programmers. These, in turn, copy and paste it into XML format (usually overnight, under pressure), and the entire mess gets shoved into a poorly tested Flash application - for the general 'benefit' of the unsuspecting public, for 75 bucks. Very nice.
If Whizlabs cuts back on QA, I have done more than my share of QA as a customer. This time I was not buying it. The simulator is so buggy it did me more harm than good. I stopped using it, tried to forget everything it told me, and went back to the specs, API docs and books. I cleared SCBCD with 89 - the highest score I ever got on a Sun exam, without a simulator. Well, the exam is not all that difficult either.
Tom Silverman: SCJP5, SCJD6, SCWCD5, SCBCD5, IBM-142, ScrumMaster