James Whittaker’s Quality Software Crusade from Academia to Microsoft, then Google and now back to Microsoft

Updates

Why I joined Microsoft [published: March 21, 2012; written: March 13, 2012]
– James Whittaker @docjamesw, 7:06 AM – 20 Mar 12 via web

A web futurist is someone who hates the web as it is now and envisions a better future for it.

Why I hate search [MSDN Blogs > JW on Tech, March 15, 2012]

We start from scratch each time. We search for things we’ve already found.

The problem with Internet search is that being stupid about it is profitable. The more ugly blue links you serve up, the more time users have to click on ads. Serve up bad results and the user must search again and this doubles the number of sponsored links you get paid for. Why be part of the solution when being part of the problem pays so damn well? It’s 2012 and we are still typing search queries into a text box. Now you know why, a ‘find engine’ swims in the shallow end of the profit pool. Is it any surprise that technology such as Siri came from a company that doesn’t specialize in search? (Where do you place an ad in a Siri use case?)

There’s no more reason to expect search breakthroughs from Google than there is to expect electric car batteries to be made by Exxon.

We can do better. We’ve been searching for over a decade. We know every place possible where the online equivalent of car keys are found. We know where our online pet is, always. We know so many things about the world that no longer need to be served up as search “results.” (Results indeed! If users ever wake up and divorce their search engine, the “results” page is likely to be exhibit A in the separation hearing.)

Search, my friends, is broken. Finding things has become secondary to monetizing the search process. Fixing this situation is not in the best interest of the incumbents. Which, actually, is all well and good because the fix will need a more web-wide effort anyway. The companies that own the data sources, the companies that ingest, store and conflate that data, the myriad small development shops that do interesting things with the data, the cleverness of the people who curate the data and the power of crowdsourced know-how need to come together and make search … better? No, not better, irrelevant.

Search is dead. The web doesn’t need it and neither do we.

Now you see it, in 20 years you won’t [MSDN Blogs > JW on Tech, April 12, 2012]

Google’s Marissa Mayer gave a Flintstonian glimpse at what search might look like in 20 years including “predicting what restaurant you might like in a new city” and “connecting you with strangers based on common interests.” Things that take entire seconds today will take … entire seconds in 2032. Thankfully, for Mayer at least, violating Moore’s Law carries no actual criminal or civil penalties.

In a nutshell, what Mayer (and I assume Google) is proposing is that in twenty years Google and the web will still be standing between knowledge and its consumption. Google has 40 billion reasons to be patient regarding the future.

You want a prediction of the future? The trend of disappearing search will continue. The web will melt into the background and humans will progressively be removed from their labor intensive and frustrating present by automation. In five years the web is likely to be completely invisible. You will simply express your intent and the knowledge you seek will be yours. Users will be seamlessly routed to apps capable of fulfilling their intent. Apps won’t need to be installed by a user; they will be able to find opportunities to be useful all by themselves, matching their capabilities with a user’s intent. You need driving directions? Travel reservations? Takeout? Tickets to a show? Groceries? Tell your phone, it will spare you the ugly links. It will spare you the landing page. It will spare you the ads. It will simply give you what you asked for. This is already happening today, expect it to accelerate.

End of Updates

James Whittaker @docjamesw, 8:03 PM – 29 Feb 12 via web

I got my team today. Hmm…what shall I do with 300 developers? You won’t have to wait long to find out.

About JW on Tech [MSDN Blogs > JW on Tech > About James Whittaker, March 13, 2012]

James Whittaker is a technology executive with a career that spans academia, start-ups and top tech companies. He is known for being a creative and passionate leader and for his technical contributions in testing, security and developer tools. He’s published dozens of peer-reviewed papers, five books and has won best speaker awards at a number of international conferences. During his time at Google he led teams working on Chrome, Google Maps and Google+. He is currently at Microsoft reinventing the web [in a Partner Development Manager role as per LinkedIn, and as a web futurist at Microsoft according to his Twitter account].

Want to read more? James wrote How to Break Software, How to Break Software Security (with Hugh Thompson), and How to Break Web Software (with Mike Andrews). While at Microsoft, James transformed many of his testing ideas into tools and techniques for developers and testers, and wrote the book Exploratory Software Testing [Sept 4, 2009]. His current book, How Google Tests Software (with Jason Arbon and Jeff Carollo), was written when he was a test engineering director at Google [per Whittaker’s Twitter: getting close to the end of printing; otherwise April 8, 2012].

     

How Google Tests Software [Google Testing Blog, May 26, 2011]:
Part 1 – Part 2 – Part 3 – Interlude – Part 4 – Part 5 – Part 6 – Q&A – Part 7

Large-scale Exploratory Testing: Let’s Take a Tour [SQEVideo, published on Oct 2, 2011]

STARWEST 2009, October 5-9, 2009: Manual testing is the best way to find the bugs most likely to bite users badly after a product ships. However, manual testing remains a very ad hoc, aimless process. At a number of companies across the globe, groups of test innovators gathered in think tank settings to create a better way to do manual testing–a way that is more prescriptive, repeatable, and capable of finding the highest quality bugs. The result is a new methodology for exploratory testing based on the concept of tours through the application under test. In short, tours represent a more purposeful way to plan and execute exploratory tests. James Whittaker describes the tourist metaphor for this novel approach and demonstrates tours taken by test teams from various companies including Microsoft and Google. He presents results from numerous projects where the tours were used in critical-path production environments. Learn about the collection of test tours, test cases, and bugs from these case studies and recommendations for using tours on your own products.

[James Whittaker’s Google+ post, Feb 13, 2012]

Signing off of Google+
This will be my last post on Google+. Anyone interested in my post-Google career can follow me on Twitter (@docjamesw).

[James Whittaker’s Google+ post, Feb 3, 2012]

There comes a time when all good things must end and my time at Google is one of them. This is not one of those “Google let me down” rants, nor is it an “I love this company, keep up the good work” farewell … just a realization that even as my perf scores and profile within the company have risen, my ability to lead has diminished. It’s time to stop being part of a team changing the world and time to go lead one. Unfortunately, the place to do that is elsewhere. Today is my last day.

Keep in touch with me on Twitter @docjamesw. Or not.

James Whittaker interview [TCLGroupLimited, Dec 3, 2011]

James Whittaker, Engineering Director of Google interviews for TCL in November 2011. … [3:26] I think Google has a much more mature testing organization than probably almost anyone. In fact, I think, really at the top of the testing tool chain Microsoft and Google probably stand alone. [3:38] … [6:27] I think both companies still have that kind of aura about that net cool factor. [6:32] … [8:08] Traditional publishing is as dead as testing …

James Whittaker’s testing blog posts while with Google [Google Testing Blog, June 8, 2009 – Nov 15, 2011]

James Whittaker joins Google [Google Testing Blog, June 2, 2009]

By Patrick Copeland

I’m excited to announce that James Whittaker has joined us as our newest Test Director at Google.

James comes to us most recently from Microsoft. He has spent his career focusing on testing, building high quality products, and designing tools and process at the industrial scale. In the not so distant past, he was a professor of computer science at Florida Tech where he taught an entire software testing curriculum and issued computer science degrees with a minor in testing (something we need more schools to do). Following that, he started a consulting practice that spanned 33 countries. Apparently, fashion is not high on his list as he has collected soccer jerseys from many of these countries and wears those during major tournaments. At Microsoft he wrote a popular blog, and in the near future you can expect him to start contributing here.

He has trained thousands of testers worldwide. He’s also written a set of books in the How to Break Software series. They have won awards and achieved best seller status. His most recent book, on exploratory testing, is coming out this summer. It is not a stretch to say that he is one of the most recognizable names in the industry and has had a deep impact on the field of testing. If you have a chance, strike up a conversation with James about the future of testing. His vision for what we’ll be doing and how our profession will change is interesting, compelling and not just a little bit scary.

Join me in welcoming James to Google!

James Whittaker’s testing blog posts while with Microsoft 1st time [posted between July 8, 2008  and May 21, 2009]

tour of the month: the exit-stage-right tour [MSDN Blogs > JW on Test, May 21, 2009]

All tours must eventually come to an end and thus it is with my tour with Microsoft. I have resigned my position and am leaving the company. It was a great ride.

But the tours will continue. My book Exploratory Software Testing: Tips, Tricks, Tours and Techniques to Guide Manual Testers is in press and will appear through Addison-Wesley sometime this summer. I am truly thankful for the many wonderful testers at Microsoft who contributed wisdom, thoughts and even case studies to the effort. Special thanks go to Nicole Haugen, Geoff Staneff, David Gorena Elizondo, Shawn Brown and Bola Agbonile. Microsoft is full of great testers and even here, these guys manage to stand out.

I imagine that I will not be long in setting up a new blog as I have very much enjoyed this experience and being the only tester in Developer Division’s top ten bloggers was quite an honor. For that I thank you.

In case you are interested in my landing place, I can imagine that one or two of the more popular testing blogs around town will be talking about it.

Wish me luck …

before we begin [MSDN Blogs > JW on Test, July 8, 2008]

For those of you familiar with my writing I plan to update some of my more dated work (history of testing, testing’s ten commandments, and so forth) and preview some of the information that I will be publishing in paper and book form in the future. Specifically, I now (finally) have enough notes to revise my tutorial on manual exploratory testing: How to Break Software and will be embarking on that effort soon. This blog is where I’ll solicit feedback and report on my progress.

For now, here’s an update on what’s happening, testing-wise, for me at Microsoft:

  • I am the Architect for Visual Studio Team System – Test Edition. That’s right, Microsoft is upping the ante in the test tools business and I find myself at the center of it. What can you expect? We’ll be shipping more than just modern replacements for tired old testing tools. We’ll be shipping tools to help testers to test: automated assistance for the manual tester; bug reporting that brings developers and testers together instead of driving them apart; and tools that make testers a far more central player in the software development process. I can’t wait!
  • I am the Chair of the Quality and Testing Experts Community at Microsoft. This is an internal community of the most senior testing and quality thought leaders in the company. We kicked off the community with record-breaking attendance (the most of any of Microsoft’s technical network communities) at our inaugural event this past spring where some of our longest-tenured testers shared a retrospective of the history of testing at Microsoft followed by my own predictions for the future of the discipline. It was a lively discussion and underscored the passion for testing that exists at this company. In this quarter’s meeting we’re doing communal deep dives into the testing-related work that is coming out of Microsoft Research. MSR, the division responsible for Virtual Earth and the Worldwide Telescope also builds test tools! I can’t wait to ship some of this stuff!
  • I am representing my division (DevDiv) on a joint project with Windows called Quality Quest. Our quest is concerned with quality, specifically, what we need to do to ensure that our next generation of platforms and services are so reliable that users take quality for granted. Sounds like I took the blue pill, doesn’t it? Well, you won’t find us dancing around acting like our software is perfect. Anyone who has ever heard me speak (either before or after I joined Microsoft) has seen me break our apps with abandon. In this Quest, we’ll leave no stone unturned to get to the bottom of why our systems fail and what processes or technology can serve to correct the situation.

New hire into our group – James Whittaker [MSDN Blogs > Michael Howard’s Web Log, May 5, 2006]

I’m pleased to announce, actually I’m *thrilled* to announce, that James Whittaker has joined our group [SDL – Security Development Lifecycle]. James is a well-known author and speaker on software testing and security. He most recently worked as a professor of computer science at Florida Tech where he ran a huge software security research team. James created the “How to Break…” book series with Addison Wesley. He wrote How to Break Software [May 19, 2002], How to Break Software Security [May 19, 2003] and How to Break Web Software [Feb 12, 2006].

     

He’s also one of the folks behind the Holodeck testing tool.

He’s a cool guy, sharp as a tack, with a very dry sense of humor, so we should get along just fine! He’ll be a peer of mine, reporting to Steve Lipner [Trustworthy Computing Initiative chief], and is initially focused on our internal security and privacy training.

As I’m sure most of you will agree, hiring good security people takes time, and hiring talent like James is rare indeed.

Welcome, James!
[Michael Howard now serves as a kind of chief security officer for Microsoft, under the title Principal Cybersecurity Architect, working with customers and partners. Before that he was a long-time member of the Security Development Lifecycle team, in fact a co-founder of the SDL in 2001; the SDL is also closely related to Microsoft’s now 12-year-old Trustworthy Computing Initiative.]

GTAC 2008 Keynote Address: The Future of Testing by James Whittaker of Microsoft [GoogleTechTalks, published on Apr 7, 2009]

Presented by James Whittaker, Microsoft Corp. at the 3rd Annual Google Test Automation Conference (GTAC) held in Seattle, WA on October 23-24, 2008.

An Interview with James Whittaker [Dr.Dobb’s Journal, Sept 26, 2006]

Michael Hunter interviews James Whittaker, noted testing guru and author, to shed some light on his testing philosophy.

James Whittaker is, I dare say, one of the celebrities of the testing world. He was long a professor of computer science at the Florida Institute of Technology, where he became well-known for his efforts to find ways to make testing a teachable skill. He and his research group there created innovative testing technologies and tools, including the popular runtime fault injection tool Holodeck, and became highly skilled at breaking software security. James founded Security Innovation to productize his work, but recently he has left both that company and teaching to join Microsoft as a Security Architect, where he is working to integrate testing into the Security Development Lifecycle (SDL).

James wrote How To Break Software – one of my favorite books on testing, co-wrote How To Break Software Security (also very good) with Hugh Thompson, and co-wrote How To Break Web Software (haven’t read it yet) with Mike Andrews. James’ talks at Microsoft are always standing room only; this interview will give you a taste of why.

DDJ: What was your first introduction to testing? What did that leave you thinking about the act and/or concept of testing?

JW: I was in graduate school in a software engineering group studying high assurance software engineering methodologies (cleanroom to be specific) and the bloody dev group met at 7:30 on Saturday mornings! I missed the first three meetings (dude, in grad school the nerd act doesn’t happen that early on a weekend) so the professor put me in charge of the independent test team (which I discovered was just me). So that left me with the idea that testers get more sleep than devs but that we need it because we are woefully outnumbered.

And that perception remains, sans the sleep part.

DDJ: What has most surprised you as you have learned about testing/in your experiences with testing?

JW: The sheer number of people *passionate* about testing, particularly at Microsoft. It gives me a great deal of confidence in the future knowing that such skill and talent is being applied to the hardest problem the discipline has to offer, which is quality.

DDJ: What is the most interesting bug you have seen?

JW: The most interesting bug is always the latest bug. Just today everyone in our group was surprised at an Inbox with thousands of recall status messages. Someone sent a mail from an alias of 1275 members, then recalled it. The recall then sent success/failure notices to EVERYONE on the alias. That’s 1275 x 1275 (about 1.6 million) emails! How’s that for exploiting a design flaw!

DDJ: How would you describe your testing philosophy?

JW: Eyes open, brain on, test! Or the longer explanation covered in How to Break Software. Thanks for the chance to plug one of my books!

DDJ: What do you see as the biggest challenge for testers/the test discipline for the next five years?

JW: There are a number of trends that testers are going to have to grapple with. The first is that software is getting better. The result of this is that bugs are going to become harder and harder to find and the weaker testers will be relegated to Darwinian insignificance. Keeping sharp, building skills and maintaining a cutting edge testing knowledge has never been more important.

The second is that software process is finally taking over. For years processes haven’t much affected the way software is built (which doesn’t say much for legacy processes). But here at Microsoft the SDL is revolutionizing the way software is constructed. Testers have to figure out their role in this process. We have to be there, working, at project initiation and play a key role in every single phase of the lifecycle. Testing is not a task for the latter stages of the ship cycle. Testers who realize this and customize their work accordingly will rise in prominence within their product group and be able to influence the growth of the SDL rather than be steamrolled by it.

[See my Table Of Contents post for more details about this interview series.]

“if Microsoft is so good at testing, why does your software suck?” [MSDN Blogs > JW on Test, Aug 11, 2008]

What a question! I only wish I could convey the way that question is normally asked. The tone of voice is either partially apologetic (because many people remember that I was a major ask-er of that same question long before I became an ask-ee) or it’s condescending to the point that I find myself smiling as I fantasize about the ask-er’s computer blue-screening right before that crucial save. (Ok, so I took an extra hit of the kool-aid today. It was lime and I like lime.)

After 27 months on the inside I have a few insights. The first few are, I readily concede, downright defensive. But as I’ve come to experience firsthand, true nonetheless. The last one though is really at the heart of the matter: that, talent notwithstanding, testers at Microsoft do have some work to do.

I’m not going down the obvious path: that testing isn’t responsible for quality and to direct the question to a developer/designer/architect instead. (I hate the phrase ‘you can’t test quality in,’ it’s a deflection of blame and as a tester, I take quality directly as my responsibility.)

But I am getting ahead of myself. I’ll take up that baton at the end of this post. Let’s begin with the defensive points:

  • Microsoft builds applications that are among the world’s most complex. No one is going to argue that Windows, SQL Server, Exchange and so forth aren’t complex and the fact that they are in such widespread use means that our biggest competitors are often our own prior versions. We end up doing what we call “brown field” development (as opposed to ‘green field’ or version 1 development) in that we are building on top of existing functionality. That means that testers have to deal with existing features, formats, protocols along with all the new functionality and integration scenarios that make it very difficult to build a big picture test plan that is actually do-able. Testing real end-to-end scenarios must share the stage with integration and compatibility tests. Legacy sucks and functionality is only part of it…as testers, we all know what is really making that field brown! Be careful where you step. Dealing with yesterday’s bugs keeps part of our attention away from today’s bugs.

(Aside: Have you heard that old CS creationist joke: “why did it take god only seven days to create the universe?” The answer: “No installed base.” There’s nothing to screw up, no existing users to piss off or prior functionality and crappy design decisions to tiptoe around. God got lucky, us…not so much.)

  • Our user-to-tester ratio sucks, leaving us hopelessly outnumbered. How many testers does it take to run the same number of test cases that the user base of, say, Microsoft Word can run in the first hour after it is released? The answer: far more than we have or could hire even if we could find enough qualified applicants. There are enough users to virtually ensure that every feature gets used in every way imaginable within the first hour (day, week, fortnight, month, pick any timescale you want and it’s still scary) after release. This is a lot of stress to put our testers under. It’s one thing to know you are testing software that is important. It’s quite another to know that your failure to do so well will be mercilessly exposed soon after release. Testing our software is hard, only the brave need apply.
  • On a related point, our installed base makes us a target. Our bugs affect so many people that they are newsworthy. There are a lot of people watching for us to fail. If David Beckham wears plaid with stripes to fetch his morning paper, it’s scandalous; if I wore my underpants on the outside of my jeans for a week few people would even notice (in their defense though, my fashion sense is obtuse enough that they could be readily forgiven for overlooking it). Becks is a successful man, but when it comes to the ‘bad with the good’ I’m betting he’s liking the good a whole lot more. You’re in good company David.

But none of that matters. We’ll take our installed base and our market position any day. No trades offered. But still, we are always ready to improve. I think testers should step up and do a better job of testing quality in. That’s my fourth point.

  • Our testers don’t play a strong enough role in the design of our apps. We have this “problem” at Microsoft that we have a whole lot of wicked smart people. We have these creatures called Technical Fellows and Distinguished Engineers who have really big brains and use them to dream really big dreams. Then they take these big dreams of theirs and convince General Managers and VPs (in addition to being smart they are also articulate and passionate) that they should build this thing they dreamt about. Then another group of wicked smart people called Program Managers start designing the hell out of these dreams and Developers start developing the hell out of them and a few dozen geniuses later this thing has a life of its own and then someone asks ‘how are we going to test this thing’ and of course it’s A LITTLE LATE TO BE ASKING THAT QUESTION NOW ISN’T IT?

Smart people who dream big inspire me. Smart people who don’t understand testing and dream big scare the hell out of me. We need to do a better job of getting the word out. There’s another group of wicked smart people at Microsoft and we’re getting involved a wee bit late in the process. We’ve got things to say and contributions to make, not to mention posteriors to save. There’s a part of our job we aren’t doing as well as we should: pushing testing forward into the design and development process and educating the rest of the company on what quality means and how it is attained.

We can test quality in; we just have to start testing a lot sooner. That means that everyone from TF/DE through the entire pipeline needs to have test as part of their job. We have to show them how to do that. We have to educate these smart people about what quality means and take what we know about testing and apply it not only to just binaries/assemblies, but to designs, user stories, specs and every other artifact we generate. How can it be the case that what we know about quality doesn’t apply to these early stage artifacts? It does apply. We need to lead the way in applying it.

I think that ask-ers of the good-tester/crappy-software question would be surprised to learn exactly how we are doing this right now. Fortunately, you’ll get a chance because Tara Roth, one of the Directors of Test for Office is speaking at STAR West in November. Office has led the way in pushing testing forward and she’s enjoyed a spot as a leader of that effort. I think you’ll enjoy hearing what she has to say.

Test Talk with James Whittaker [Oct 3, 2011]


James Whittaker has been in software testing for as long as he can remember. During his studies he wrote his graduation paper on Model-Based Testing. He made his name at Microsoft and recently ‘joined the enemy’ by moving to Google. His “How to Break Software” books are bestsellers, and his presentations in which he hacks websites live in front of the audience are fantastic. His latest book, about Exploratory Software Testing, was released last year. I had the opportunity to ask him the questions below.

1. Can you introduce yourself and explain how, judging by the title of your dissertation, you already became a tester during your time at the University of Tennessee-Knoxville?
My name is James Whittaker and I am a Director of Engineering at Google. I own Test for a bunch of Google products including Chrome browser, Chrome operating system, Google Toolbar as well as some Search and Geo products and a bunch of back-end data center infrastructure applications. I also own Development of engineering tools including both developers and testing tools.

I got into test when I was a grad student. Mostly it was by default as my software engineering research team met on Saturday mornings and I had better things to do early Saturday than spend it with a bunch of coding nerds. My professor gave me two choices: get fired or be a tester. Neither he nor I knew what a favor he did for me at the time. I really hit the ground running and did a lot of innovative work in model-based testing. In fact, I got my PhD two years before any of those developers. Testing was a great career move for me even back then.

2. How hard was it to change from a Microsoft employee towards a Google employee, and is testing very different at these two companies?
Not difficult at all. Microsoft was great preparation for Google. Culturally they are polar opposites, with Microsoft being more top down whereas Google is more engineering driven. The high ratio of testers to developers at Microsoft is a case in point: there, the number of people on a project is decided by execs and managers. At Google it is decided by engineers, and no engineer at Google believes a 1:1 ratio is necessary or even healthy. Fewer testers on a project means more involvement by developers in QA. I had a meeting yesterday with the development director for Chrome OS and the entire subject was what they could do to make our job easier. The director was genuinely concerned about whether his developers were engaging deeply enough on testing issues. A culture like that makes the 1:1 ratio irrelevant … everyone on the project is a tester.

3. I read somewhere that you are busy at Google with forging a future in which software just works. Is that possible, a world without software bugs?
Not in my lifetime. However we are getting closer. Even a few years ago I had to pull the battery on my smart phone every week or so. I’ve not even turned my current one off for 3 months and it works fine. Quality assurance for software is much like health care for humans. Humans will always get sick but with good prevention, good hygiene and regular maintenance our bodies do ok. We need to make testing like this: continuous and ongoing. One of the things that annoys me is the whole “push quality upstream” movement. Some people seem to believe that we can rig it so we just write perfect code. That’s like taking all your vitamins when you are a baby and then expecting a long healthy life. Obviously upfront debugging is good, but quality is an ongoing endeavor. It starts at the beginning and is a constant activity throughout the life of a product.

4. Patrick Copeland said that your vision on the future of testing is interesting, compelling and not just a little bit scary. Can you shortly tell us your vision, so we don’t get scared?
I am happy that it scares people and honored that it scares smart people like Patrick who thinks deeply on these matters. Too many people are dogmatic about testing. Some say “avoid rigor and do only exploratory testing” and they say it with a fervor that reminds me of religious fundamentalists who see only black and white. Others say the same of automation with the same amount of self-righteousness. One thing I do know is that when you think your world view is the only view, there is a problem. People like this have stopped thinking about alternatives. They’ve stopped being open minded. They’ve definitely stopped being right.

I am also not going to stand in the middle and start every answer with ‘it depends.’ It turns out that there are some absolutes. There are some testing problems that can be driven to extinction with automation. There are some problems where exploratory testing is exactly the opposite of a good idea. I think it is smart to be problem-oriented and not solution-oriented. The latter is the proverbial hammer solution where every problem looks like a nail because you sell hammers for a living. I’ve laid out my full vision for software testing in my latest book but let me just say here that the part people find scary is that my vision requires far fewer testers than the world currently employs.

5. Your last book is about Exploratory Testing. Can you explain how taking the supermodel tour will improve our testing skills?
All the tours focus a tester’s attention. The idea is to test on purpose. Exploratory testing does not have to lack rigor and it does not have to encompass endless wandering hoping that you find a bug. It also should be about finding important bugs. I find myself endlessly annoyed by speakers who show bugs that no one would care about. Any exploratory method can find easy bugs; what about the hard ones?

Many of the tours focus on a general class of bug. The Supermodel Tour as a specific case focuses on presentation layer bugs. It asks you to first identify important properties of the UI and then choose paths that force those properties to change and then be displayed on the UI. We called it the Supermodel Tour to get the idea across that we are looking only skin-deep for bugs (only at the UI level). The tour gives both general guidance in terms of focusing on displayable properties and specific guidance about what part of the application should be visited during an exploratory session (the functions that allow you to change and then display those values). So you see that it requires some pre-work and planning but then allows for exploration once that planning is done. For example, in Maps we run the Supermodel Tour on our classification of landmarks. We make a list of all the landmarks (national parks, places of interest and so forth) in advance and explore the UI to find each location. We (actually Brendan Dhein) found a bug where Arlington National Cemetery was classified as a restaurant! It’s a subtle bug if you are just exploring. But if you are running the Supermodel Tour it jumps out at you. The idea is that a good tester can become a great tester with the right focus and by testing on purpose.
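To make that pre-work and checklist idea concrete, here is a minimal sketch of how a tester might drive such a tour from a prepared list. This is an illustration only, not Google’s tooling: the landmark list, the stubbed get_displayed_category lookup and the mismatch report are hypothetical stand-ins for whatever application the tester is actually exploring.

```python
# Minimal sketch of a Supermodel Tour checklist (hypothetical example, not Google tooling).
# Pre-work: list the displayable properties you care about up front.
# Exploration: visit each item in the UI and check what is actually displayed.

LANDMARK_CHECKLIST = [
    # (landmark name, category the UI should display)
    ("Arlington National Cemetery", "cemetery"),
    ("Yellowstone National Park", "national park"),
    ("Statue of Liberty", "monument"),
]

def get_displayed_category(landmark: str) -> str:
    """Hypothetical lookup: the category the UI displays for a landmark.

    In a real session this would come from the application under test (or be
    read off the screen by the tester); here it is stubbed for illustration.
    """
    stubbed_ui = {
        "Arlington National Cemetery": "restaurant",  # the kind of bug the tour surfaces
        "Yellowstone National Park": "national park",
        "Statue of Liberty": "monument",
    }
    return stubbed_ui.get(landmark, "unknown")

def run_supermodel_tour(checklist):
    """Flag every landmark whose displayed category differs from the expected one."""
    findings = []
    for landmark, expected in checklist:
        displayed = get_displayed_category(landmark)
        if displayed != expected:
            findings.append(f"{landmark}: displayed '{displayed}', expected '{expected}'")
    return findings

if __name__ == "__main__":
    for finding in run_supermodel_tour(LANDMARK_CHECKLIST):
        print("Possible presentation-layer bug:", finding)
```

The point of the sketch is the shape of the tour: the expected display properties are written down before the session starts, so a wrong classification jumps out instead of hiding among thousands of otherwise plausible screens.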

6. What will your next book be about?
I’m writing a book called How Google Tests Chrome which details our testing process start to finish on Chrome OS. It’s a totally open-kimono assessment of everything that we are doing. Right, wrong, false starts, great ideas, cool innovation, new tools and every test artifact we generate, from plans to test cases to open source automation. I am psyched about this as I don’t believe anyone has ever fully documented and published a complete project before, particularly one of the complexity of an operating system.

I plan to include the browser too in this, but as I have only written the first chapter on the test plan I do not want to overcommit!

7. How is testing managed at Google? From one place or per country or per application or … ?
It’s divided by product lines or what we call “Focus Areas.” In my case I own the Client Focus Area. However since I am in a remote site I also have authority over all the work that goes on in Kirkland and Seattle Washington. I’m a busy boy but I like it that way. A single product would bore me.

It’s funny that on the Dev side each product has a Director in charge of it, whereas I am the Director over many products. I have Test Managers over each product who have to interact with a Director on the dev side. So if you match us up one to one, you might have a Test Manager matching wits on a daily basis with a Development Director three levels above them in rank. You talk about character building, this is the place for that. Google test managers are a breed apart. Cream of the crop.

8. Are you still collecting soccer jerseys? And if so, is there one you really want to add to your collection?
Yes and I cannot wait to wear them during the World Cup. By tradition I never wear anything else during that tournament (sorry for the visual). When people invite me to speak I often get them as gifts and I have dozens. I am hoping I get a Swiss one this trip (hint, hint) and I lost my Australian one (please don’t ask) and am looking for a replacement. But I have a lot of club jerseys too and will relish the chance to wear different colors. Send me a jersey and I’ll send you some signed books!

9. I hear you will be giving a keynote at the Swiss Testing Day. Can you give us a sneak preview on what it will be about?
“Testing On Purpose” is the title. I am talking in far more depth about how we are testing Chrome at Google. I hope to see you there.

All that testing is getting in the way of quality by James Whittaker, Part One  [TCLGroupLimited, Dec 8, 2011]

TCL presents Google’s James Whittaker on “All that testing is getting in the way of quality”. PART ONE starts with the early 2011 changes in the way Google is carrying out engineering work. This made Whittaker rethink the role of testers fundamentally. … CAN TESTING GET CREDIT FOR SOFTWARE GETTING BETTER (which is definitely the case)? … WHY IS SOFTWARE BETTER? … IMMEDIATE COST OF LOW QUALITY (user is moving away) … POST-SHIP BUG FIXING (could fix in-place now, auto update) … SELF-REPAIRING SOFTWARE (crash recovery) … GOODBYE SERVER CONFIG (cloud deployment) … REDUCTION OF [platform] VARIATIONS AND DEPENDENCIES (standards) … ELEGANT PROGRAMMING LANGUAGES (Java with automatic garbage collection, Python, Ruby, … don’t write big programs) … ERADICATION OF CERTAIN BUG SPECIES (extinction) … BETTER CODE MANAGEMENT (initial code quality by automating workflows … pre-submit checks …) … CONTINUOUS BUILD/INTEGRATION/RELEASE/TEST …

All that testing is getting in the way of quality by James Whittaker, Part Two [TCLGroupLimited, Dec 8, 2011]

TCL presents Google’s James Whittaker on “All that testing is getting in the way of quality”. PART TWO starts with the observation that THE USER IS A BETTER TESTER THAN YOU ARE … leading to the conclusion that TESTERS COULD BE COMPLETELY REMOVED (as unnecessary go-betweens) FROM THE INSIDES OF THE DEVELOPER-USER RELATIONSHIP.

A Brave New World of Testing? An Interview with Google’s James Whittaker by Forrest Shull [IEEE Software, March/April 2012 pp. 4-7]

… In their introduction, the guest editors have compiled a list of questions related to what our future, cloud-intensive world is going to look like—many of which I’ve heard myself from colleagues in government and commercial positions. The one that I hear most often is this: How should organizations leverage the power of this approach to improve testing and quality assurance of software? To get an answer, I turned to James Whittaker, an engineering director at Google, which has been at the forefront of leveraging the cloud. James is a noted expert and author on software testing, whose team has been managing Google’s cloud computing testing. Some excerpts of our conversation:

What is it like right now, looking across cloud computing testing at Google? It sounds like a pretty major undertaking.

In one of your previous interviews, I came across a statement of yours that has become one of my favorite thought-provoking quotes. You said, “Anyone who says that testing is getting harder is doing it wrong.” Could you expand on this a bit?

In the cloud, all the machines automatically work together; there’s monitoring software available, and one test case will run anywhere. There’s not even a test lab. There’s just a section of the datacenter that works for you from a testing point of view. You put a test case there and it runs. And all of the different scheduling software that any datacenter uses to schedule tasks can be used to schedule tests. So, a lot of the stuff that we used to have to write and customize for our test labs, we just don’t need anymore.

The other thing the cloud has done is brought us closer to our users. Think of Google Maps: it’s really impossible to hire a group of testers to exhaustively test it. It’s literally a piece of software of planetary proportions. If there’s a bug in my address on Google Maps, I’m likely to be the only one who will find it. But the cloud also enables us to reach out to users who are early adopters to get better and richer bug feedback than we were ever able to do back in the client-server days, when once software got to the field it was very difficult to update and instrument. Now, it’s easy to update a datacenter, it’s easy to instrument a datacenter. If a customer finds a bug, it’s easy for them to tell us about it, and it’s easy for us to fix it and push that fix to all our users, by just refreshing a browser.

So the cloud really does change things. It’s a different model of development; it’s a different model of testing; it’s a different model of usage.

Regarding testers and the skill sets that they’ve traditionally been applying on the job, does the same skill set still apply? Or are people being asked to develop new skills to take advantage of all these cloud features?

So, if I can paraphrase what you’ve been saying, the cloud is changing the whole underlying economics of software development and software testing. It’s easier and quicker for a company to try something, push it out to users, hear from the users what the problems are, and fix them, than it is to follow the traditional path of getting the requirements right up front, then getting the architecture right and nailed down, then getting the coding done well ….

Absolutely. By the time you do all that stuff, you’re too late. Your competitor’s beaten you to the market. On the cloud, you can really release and iterate—that’s much more the development model of modern times.

But you have to be careful: Google’s not pushing software out to its users saying, “Hey, is this any good? We’re not sure!” There are a lot of intermediate steps. We have an internal process we call dogfooding, as in, if you’re trying to sell dog food, you should eat your own product first to make sure it’s okay. All our software is used internally first by Googlers before we push it out to the world. If you look at something like Google+, which we released last year, we used that internally among Googlers for many months before we released it. In that process of dogfooding Google+, we found far more bugs and far richer bugs than the test team associated with Google+.

The points you’re making, about having representative users from the beginning who are able to use the product and help mature it, represents a much bigger paradigm shift than I had originally realized.

To me, that is just one of the most crucial things that companies absolutely have to get good at. In the past, if you found a bug in, say, your browser, you didn’t know how to report it. You’d have to find some bug-reporting page on the vendor’s site, and it would ask you what operating system you were using and what version of the browser you were using, and what other plug-ins you had installed…. But the machine knows all that stuff! So the idea is that once you crash, or once a user finds a bug, you just grab that machine state and send it back to the vendor so that they can understand the state the user was in exactly.
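As a rough sketch of what “grab that machine state” could look like on the client side (the report endpoint, field names and app_state contents below are invented for illustration; no real vendor API is implied):

```python
# Minimal sketch of client-side bug/crash reporting: capture the machine state
# the user would otherwise have to type into a bug form, and send it to the vendor.
# The endpoint URL and the extra "app_state" payload are hypothetical.
import json
import platform
import urllib.request

def collect_machine_state(app_version: str, app_state: dict) -> dict:
    """Gather the environment details a bug-report form would normally ask for."""
    return {
        "os": platform.system(),
        "os_version": platform.version(),
        "machine": platform.machine(),
        "python_version": platform.python_version(),
        "app_version": app_version,
        "app_state": app_state,  # e.g. open document, last actions, stack trace
    }

def send_bug_report(report: dict, endpoint: str = "https://example.com/bug-report") -> int:
    """POST the report as JSON to the (hypothetical) vendor endpoint; return the HTTP status."""
    request = urllib.request.Request(
        endpoint,
        data=json.dumps(report).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        return response.status

if __name__ == "__main__":
    report = collect_machine_state("1.2.3", {"last_action": "save", "crashed": True})
    print(json.dumps(report, indent=2))  # in practice you would call send_bug_report(report)
```

The environment fields are exactly the questions old bug-report pages used to make the user answer by hand; the interesting part is the application-specific state describing what the user was doing when things went wrong.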

This seems like a very concrete model to use for functional testing. But does the same paradigm work if I’m worried about things like reliability, performance, or throughput?

Or better yet, security, privacy, and so on. I agree with you completely. I think the idea of paying top dollar for engineers to do functional testing really is an artifact of the 1990s and 2000s, and shouldn’t be something that companies invest in heavily in the future. But things like security, privacy, and performance are very technical in nature. You don’t do security testing without understanding a lot about protocols, machine states, or how the Web works; a lot of a priori knowledge is required. You can’t replace that. So when I give advice to functional testers who say that I’m predicting the end of their job, specialization is one of the things I recommend. Specialization is crucially important.

How does the simplistic testing model that we all learned in school—where you go first through unit testing, then integration testing, then system testing—adapt to the new paradigm?

We do integration testing, but we call it something different. People always say that Google just likes to change the names of things, but we did this one on purpose. We don’t have to integrate it from environment to environment, but we do have to integrate it across developers. So developer A writes one module, developer B writes another module; to us, integration testing hits both developer A’s and developer B’s code. And there is testing that you simply do not have to run on the cloud: any sort of configuration test, and any sort of load testing, just isn’t necessary in this new modern environment. Load is taken care of for you; if it slows down, new cells in the datacenter are spun off automatically.
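A minimal sketch of what integration testing “across developers” can look like under these assumptions (the two modules and their owners are invented for illustration, not Google code): developer A owns a record formatter, developer B owns a small store, and the integration test checks that A’s output is acceptable to B’s input.

```python
# Sketch of integration testing "across developers" rather than across environments.
# The two sections below stand in for modules owned by two different developers.
import unittest

# --- developer A's module: formats a record for storage -----------------------
def format_record(user_id: int, message: str) -> str:
    return f"{user_id}|{message.strip()}"

# --- developer B's module: a tiny in-memory store ------------------------------
class RecordStore:
    def __init__(self):
        self._rows = []

    def save(self, row: str) -> None:
        if "|" not in row:
            raise ValueError("row must look like 'id|message'")
        self._rows.append(row)

    def count(self) -> int:
        return len(self._rows)

# --- the integration test: A's output must satisfy B's input contract ---------
class CrossDeveloperIntegrationTest(unittest.TestCase):
    def test_formatted_record_is_storable(self):
        store = RecordStore()
        store.save(format_record(42, "  hello  "))
        self.assertEqual(store.count(), 1)

if __name__ == "__main__":
    unittest.main()
```

Because the cloud makes the execution environment interchangeable, what a test like this pins down is the contract between the two developers’ code rather than the machine it happens to run on.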

When you hire new testers for your teams at Google, is there something in particular that you’re looking for? You mentioned specialization as being important, but is there anything else that makes a good cloud tester versus just a good tester?

For folks who are trying to move legacy systems onto the cloud, does their development and testing process look a lot different from what they’d use when trying to do something more greenfield?

Where are things going in the future? Will abstractions allow developers and testers to worry about even fewer issues over time, or will there be new things that we do need to worry about as more and more people go on the cloud?

There are definitely some new things that we’ll need to worry about. First and foremost, connecting to customers is going to be really important. As much as we have the server side of it down (instead of having a massively complex server, we just have this cloud that takes care of itself), there’s still a lot of variation on the device/user side. If you look at the number of Android devices that are out there, and the number of operating systems and apps that people have configured onto them, that is still a hard testing problem.

The cloud actually makes that easier, too. Crowdsourcing companies are now connecting certain specific people with specific devices to people who are writing apps on those devices. So the idea of leveraging the crowd through the cloud is definitely something that hasn’t been done before, and is a new phenomenon that we’re watching really carefully here.

One thing is for sure, we’re never going to settle on a single platform. Humankind doesn’t seem to be capable of doing that, and I don’t think it would be a good thing to eliminate competition among platforms. The Linux/Windows competition has always been healthy, and the same thing is happening in the mobile space now. So we’re always going to have to develop for multiple platforms, and those platform owners are going to want to innovate as quickly as they can and they’re not always going to be checking with you or each other on those innovations, so the developers are just going to have to be on their toes.

Learn More

My conversation with James touched on many more issues than I could note here. If you’re interested in hearing more of the conversation we had, which ranged over additional issues such as cloud testing tools and handling privacy and robustness, then check out our half-hour audio interview at http://doi.ieeecomputersociety.org/10.1109/MS.2012.23.

More than anything else, my conversation with James made me aware again of the significant changes to the way we do business that accompany the cloud, and the new skills that are becoming important. Perhaps the best summary was James’ comments that “People really need to take the cloud seriously and rethink testing from the ground up. There are a lot of sacred cows in testing that just go away with the transition to the cloud. Keeping an open mind and taking advantage of the efficiencies of the cloud are going to be really important.” I certainly hope the remainder of this special issue on cloud computing will help give you useful food for thought in doing so.

Why I left Google [MSDN Blogs > JW on Tech, March 13, 2012]

Ok, I relent. Everyone wants to know why I left and answering individually isn’t scaling so here it is, laid out in its long form. Read a little (I get to the punch line in the 3rd paragraph) or read it all. But a warning in advance: there is no drama here, no tell-all, no former colleagues bashed and nothing more than you couldn’t already surmise from what’s happening in the press these days surrounding Google and its attitudes toward user privacy and software developers. This is simply a more personal telling.

It wasn’t an easy decision to leave Google. During my time there I became fairly passionate about the company. I keynoted four Google Developer Day events, two Google Test Automation Conferences and was a prolific contributor to the Google testing blog. Recruiters often asked me to help sell high priority candidates on the company. No one had to ask me twice to promote Google and no one was more surprised than me when I could no longer do so. In fact, my last three months working for Google was a whirlwind of desperation, trying in vain to get my passion back.

The Google I was passionate about was a technology company that empowered its employees to innovate. The Google I left was an advertising company with a single corporate-mandated focus.

Technically I suppose Google has always been an advertising company, but for the better part of the last three years, it didn’t feel like one. Google was an ad company only in the sense that a good TV show is an ad company: having great content attracts advertisers.

Under Eric Schmidt ads were always in the background. Google was run like an innovation factory, empowering employees to be entrepreneurial through founder’s awards, peer bonuses and 20% time. Our advertising revenue gave us the headroom to think, innovate and create. Forums like App Engine, Google Labs and open source served as staging grounds for our inventions. The fact that all this was paid for by a cash machine stuffed full of advertising loot was lost on most of us. Maybe the engineers who actually worked on ads felt it, but the rest of us were convinced that Google was a technology company first and foremost; a company that hired smart people and placed a big bet on their ability to innovate.

From this innovation machine came strategically important products like Gmail and Chrome, products that were the result of entrepreneurship at the lowest levels of the company. Of course, such runaway innovative spirit creates some duds, and Google has had their share of those, but Google has always known how to fail fast and learn from it.

In such an environment you don’t have to be part of some executive’s inner circle to succeed. You don’t have to get lucky and land on a sexy project to have a great career. Anyone with ideas or the skills to contribute could get involved. I had any number of opportunities to leave Google during this period, but it was hard to imagine a better place to work.

But that was then, as the saying goes, and this is now.

It turns out that there was one place where the Google innovation machine faltered and that one place mattered a lot: competing with Facebook. Informal efforts produced a couple of antisocial dogs in Wave and Buzz. Orkut never caught on outside Brazil. Like the proverbial hare confident enough in its lead to risk a brief nap, Google awoke from its social dreaming to find its front runner status in ads threatened.

Google could still put ads in front of more people than Facebook, but Facebook knows so much more about those people. Advertisers and publishers cherish this kind of personal information, so much so that they are willing to put the Facebook brand before their own. Exhibit A: http://www.facebook.com/nike, a company with the power and clout of Nike putting their own brand after Facebook’s? No company has ever done that for Google and Google took it personally.

Larry Page himself assumed command to right this wrong. Social became state-owned, a corporate mandate called Google+. It was an ominous name invoking the feeling that Google alone wasn’t enough. Search had to be social. Android had to be social. YouTube, once joyous in their independence, had to be … well, you get the point. Even worse was that innovation had to be social. Ideas that failed to put Google+ at the center of the universe were a distraction.

Suddenly, 20% meant half-assed. Google Labs was shut down. App Engine fees were raised. APIs that had been free for years were deprecated or provided for a fee. As the trappings of entrepreneurship were dismantled, derisive talk of the “old Google” and its feeble attempts at competing with Facebook surfaced to justify a “new Google” that promised “more wood behind fewer arrows.”

The days of old Google hiring smart people and empowering them to invent the future were gone. The new Google knew beyond doubt what the future should look like. Employees had gotten it wrong and corporate intervention would set it right again.

Officially, Google declared that “sharing is broken on the web” and nothing but the full force of our collective minds around Google+ could fix it. You have to admire a company willing to sacrifice sacred cows and rally its talent behind a threat to its business. Had Google been right, the effort would have been heroic and clearly many of us wanted to be part of that outcome. I bought into it. I worked on Google+ as a development director and shipped a bunch of code. But the world never changed; sharing never changed. It’s arguable that we made Facebook better, but all I had to show for it was higher review scores.

As it turned out, sharing was not broken. Sharing was working fine and dandy, Google just wasn’t part of it. People were sharing all around us and seemed quite happy. A user exodus from Facebook never materialized. I couldn’t even get my own teenage daughter to look at Google+ twice, “social isn’t a product,” she told me after I gave her a demo, “social is people and the people are on Facebook.” Google was the rich kid who, after having discovered he wasn’t invited to the party, built his own party in retaliation. The fact that no one came to Google’s party became the elephant in the room.

Google+ and me, we were simply never meant to be. Truth is I’ve never been much on advertising. I don’t click on ads. When Gmail displays ads based on things I type into my email message it creeps me out. I don’t want my search results to contain the rants of Google+ posters (or Facebook’s or Twitter’s for that matter). When I search for “London pub walks” I want better than the sponsored suggestion to “Buy a London pub walk at Wal-Mart.”

The old Google made a fortune on ads because they had good content. It was like TV used to be: make the best show and you get the most ad revenue from commercials. The new Google seems more focused on the commercials themselves.

Perhaps Google is right. Perhaps the future lies in learning as much about people’s personal lives as possible. Perhaps Google is a better judge of when I should call my mom and that my life would be better if I shopped that Nordstrom sale. Perhaps if they nag me enough about all that open time on my calendar I’ll work out more often. Perhaps if they offer an ad for a divorce lawyer because I am writing an email about my 14 year old son breaking up with his girlfriend I’ll appreciate that ad enough to end my own marriage. Or perhaps I’ll figure all this stuff out on my own.

The old Google was a great place to work. The new one?


About Nacsa Sándor

Lazure Kft. • infocommunication cloud consulting • high-tech marketing • contact: snacsa@live.com. Certified electrical and automation engineer (1971). Employers: a Microsoft, EMC, Compaq and Digital veteran; earlier at Hungarian companies (GDS Szoftver, Computrend, SzáMOK, OLAJTERV); currently at Lazure Kft. What I am professionally proud of (in reverse chronological order): – introducing Microsoft .NET 1.0 … .NET 3.5 and Visual Studio Team System in Hungary (2000 – 2008) – making Digital Alpha technology the leading data center and enterprise server platform in Hungary, as a member of a joint team (1993 – 1998) – conceptual modeling (in today’s terminology: domain-driven design) combined with object-oriented programming (1985 – 1993) – post-graduate training in minicomputer software development, concurrent (parallel) programming and other topics (1973 – 1984). Areas I have worked on recently: see lazure2.wordpress.com (Experiencing the Cloud) – predictive strategies based on the cyclical nature of ICT development (also based on my earlier findings from the period 1978 – 1990) – User Experience Design for the Cloud – Marketing Communications based on the Cloud

