Becoming a fantastic Software Engineer in Test and Automation
Hello Hacker News,
I've recently moved into testing from Dev/Support and have had a hard time finding resources that will make me a better tester and a better software engineer in test. The one thing I found on Hacker News, other than hundreds of job postings, was this: https://news.ycombinator.com/item?id=3264223#up_3264286 - which is both very old and doesn't seem to contain great advice.

So HN, how do I become an awesome software engineer in test? Clearly there are a lot of them out there based on the job postings, so where are they? What skills do they have? Where do they go to learn new skills and become better?
Thanks!

Unfortunately, you've left out some of the details necessary to answer your question. The two big questions are:

(1) What are you testing? (e.g. hardware?, software?, firmware?, ... --and more detail is better)

(2) What are you trying to automate? (e.g. an oscilloscope?, a logic probe?, network throughput? computational load? memory usage? algorithm efficiency? a software bug test suite to find regressions? ...)

Other useful questions are:

(1) Does your company do experimental/research work?

(2) Is your company using "Agile", or "TDD" (Test Driven Development),
or some other organizational methodology?

(3) What are your constraints? (e.g. do you need sub-millisecond timing resolution?, are you limited to particular interfaces/buses like Serial, GPIB/HPIB, SCPI, PCIe, USB, Ethernet, ...?, do you need to use emulators?)

(4) Does your company have an existing test regimen?

(5) Is your middle name Tim?

OK, that last one is more than just a joke; it's actually a trick question
and it serves a point. The best test design and automation engineers I've known all have a real knack for doing the unexpected --you can often find bugs by doing unexpected things.

Whether it's called "Test" or "Quality Assurance" or "Total Quality Management" or whatever, the field is absolutely huge, and it can be extremely fun and challenging.

1) Software, more specifically web applications.

2) Automated browser/UI tests; hopefully I'll get to move more toward the API/service/functional level as well. The first step will be to essentially build a regression suite.

Other:

1) Nope.

2) Using Agile.

3) Not constrained other than by management and other workload. Essentially I can make testing here what I want, and I want to make it good. I'm fighting some pretty old ideas on a regular basis.

4) Basically no; they've hired me and another guy to 'develop' it, but I had zero testing experience and the other guy has a few years, though he has never led.

5) No, it's Patrick!

I feel I have a knack for doing unexpected things when I have freedom to think and I'm not hugely rushed. Unfortunately I don't get much support from management. For example, I found an unexpected bug which I and several devs thought was a 'show-stopper'; the product owner called this 'testers running through unrealistic scenarios.'

The kinds of "test" and "verification" automation I've done in the past
are vastly different from the kinds you'll be doing. Though I've never personally faced the really extreme needs (read: "requirements") of driving physical test equipment with a real-time OS to get sub-millisecond resolution, or needed to resort to doing the testing (read: "verification") through simulation for even faster resolutions, the general concepts of "Quality Assurance" will still apply to your situation.

1.) Always show up with solutions, not problems.

With good reason, Rule #1 is the first thing to always remember. You
never want to be the person who impedes development or release. You
always want to be the person who improves development and release. By
the nature of your job, you will be the person who often needs to
deliver the unwanted bad news, but if you gain a reputation for
delivering the bad news with some good news, like ways around a problem,
then people will seldom dread talking to you. Take responsibility for
improving the product, offer fixes, and keep offering fixes even when it's not your job to fix the bugs.

Never utter the words "show stopper." Even if you are right, it may not
be your decision, and you can easily make enemies of the people you need
to work with every day. Instead educate others on the potential harm to
customers, potential loss of customers, and if that doesn't work, the
potential harm to the viability of the company as an on-going venture.
When you need to do this, and you will need to do it, be prepared with
multiple paths for working around or fixing the issue along with a cost
estimate for fixing the issue. In other words, lead others into uttering
the fateful words "show stopper." Surprisingly, your job is to make the risks and consequences known, as well as to provide alternatives to mitigate the risks and avoid the consequences.

Pointing out mistakes is always a touchy situation. Many people react
poorly to being told that something they did is wrong, so try to
memorize the secret formula, "We can improve X by doing Y to avoid Z."
The "we" is important and you can even toss a "probably" in there
somewhere for added effect.

If you can reliably remember the secret formula even when your scalp is sore from pulling out all of your hair, then please tell me how. ;)

2.) You will always have constraints, so know and memorize them.

When an executive spends $200K on a piece of test equipment, and the
other engineers want data at some super fast resolution beyond the
capacity of the equipment, you are the person responsible for knowing
the constraints of the test equipment. This has actually happened to me,
and it's a whole lot of no-fun. You get stuck between a rock (the other
engineers) and a bad place (the exec who doesn't want to look bad for
buying the wrong/cheap equipment). Sure, it may seem anecdotal, but you'll be surprised how often you are asked to do the impossible.

When you know the constraints of your test system, then you can show up saying, "We can do X with what we have currently, or you can push upstream for more investment in test infrastructure. Shall we try doing X to see if it will suffice for your needs?"

In your situation with web apps, particularly mobile-ready web apps,
testing will require a big investment in test infrastructure. If your
web app is using any of the newer direct-to-hardware (WebGL, AudioAPI,
...) features, just using emulation (system/browser images with VMware
or similar) may not suffice to give proper test coverage. Emulated
hardware is never perfect, so you'll often need access to real hardware.
Of course, you can often do tons with just emulation, but knowing where
emulation will fail in strange and unexpected ways means knowing the
constraints of emulation.

Knowing your constraints is knowing what you can actually test. When others have unrealistic expectations, knowing your constraints puts you in charge of the negotiations --and it's always a negotiation. If need be, keep a constraints cheat-sheet around with the details. You'll be surprised how often it comes in handy.

3.) Within your now-known constraints, define the specific capacities
you want to verify, and the means to verify them.

Once you know the desired capacities and the means to test them, the automation of verification becomes a whole lot easier. Your existing bug tracking database should be a good place to start for building up your regression tests, but preventing the reappearance of old bugs is only a small part of the problem. The majority of the problem is finding the never-ending stream of new bugs.

Since browsers and operating systems are constantly changing, you never
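To make the bug-tracker-driven approach concrete, here is a minimal sketch of a regression table where each entry is tied to the ticket it guards against, so a failure points straight at the old bug that resurfaced. All the names and ticket ids here (normalize_username, BUG-1023, ...) are made up for illustration:

```python
# Minimal regression-suite sketch: each check is tagged with the bug
# tracker ticket it guards, built up from the existing bug database.
# Everything here is hypothetical example code, not a real project.

def normalize_username(raw):
    """The function under test: trims whitespace and lowercases."""
    return raw.strip().lower()

# Table of (ticket id, input, expected output) derived from old bugs.
REGRESSIONS = [
    ("BUG-1023", "  Alice ", "alice"),  # stray whitespace broke login
    ("BUG-1107", "BOB", "bob"),         # mixed case made duplicate accounts
    ("BUG-1140", "carol", "carol"),     # already-clean input must pass through
]

def run_regressions():
    """Run every table entry; return failures keyed by ticket id."""
    failures = []
    for ticket, raw, expected in REGRESSIONS:
        actual = normalize_username(raw)
        if actual != expected:
            failures.append((ticket, raw, expected, actual))
    return failures

if __name__ == "__main__":
    failed = run_regressions()
    for ticket, raw, expected, actual in failed:
        print(f"{ticket} regressed: {raw!r} -> {actual!r}, expected {expected!r}")
    print(f"{len(REGRESSIONS) - len(failed)}/{len(REGRESSIONS)} regression checks passed")
```

The same table-driven shape ports directly to pytest's parametrized tests once the suite grows.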
have a solid foundation. This means your test environment requires
extremely strong versioning to manage the vast multitude of browser, OS,
hardware, and version combinations.

It's important to realize that automatic updates are your sworn enemy. They will hose your test environment: if you don't know what you're running, then you don't know what you're testing. Of course you'll test the latest and greatest versions of everything, but each new update is a new environment version that needs to be preserved.

Environment versioning is a real pain. The number of environments you'll
need to manage and test will grow exponentially, so with the always
limiting resource of time, you'll not only need to automate as much as
possible, but you'll also need to do triage. Testing every possible combination is intractable.

Many sites will try to avoid the pain of environment versioning by only
supporting the most current version combinations, but in doing so, they
only avoid the expense of supporting "legacy" combinations. There will
still be plenty of combinations of "newest" so their test infrastructure
will undoubtedly fail to give proper coverage on some of them.

Virtual machines, emulation, simulation, system images (snapshots), and the like, along with your favorite version control system (git, cvs, svn), will really help to manage the environment versioning issues. It can be a ton of work to get everything set up the way you want, but the benefit of being able to give good, repeatable coverage makes it worthwhile.

Personally, I'd go with open source as much as possible since it will
use a normal programming language. It will mean recreating the wheel in
some situations, but for me, it seems better than the alternatives. The
generally known open source web app testing frameworks are Selenium [1],
Watir [2], and RobotFramework [3]. If you hate real text editors and prefer to suffer with an IDE, some IDEs have plug-ins like CubicTest [4] for Eclipse which can drive Selenium/Watir.

[1] http://www.seleniumhq.org/

[3] http://robotframework.org/

Vendor lock-in from proprietary test suites and proprietary languages
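Whichever framework you choose, it pays to keep element locators out of the tests themselves; the page-object pattern is one common way to do that. A framework-agnostic sketch -- the driver interface, locators, and page structure below are stand-ins for illustration, not Selenium's or Watir's actual API:

```python
# Page-object sketch: tests talk to a LoginPage, never to raw locators,
# so a markup change means editing one class instead of every test.
# The locators and the driver's type/click methods are hypothetical;
# in real use the driver would wrap a Selenium/Watir browser session.

class LoginPage:
    USERNAME = ("id", "username")
    PASSWORD = ("id", "password")
    SUBMIT = ("css", "button[type=submit]")

    def __init__(self, driver):
        self.driver = driver

    def login(self, user, password):
        self.driver.type(self.USERNAME, user)
        self.driver.type(self.PASSWORD, password)
        self.driver.click(self.SUBMIT)

class RecordingDriver:
    """Stub driver that records actions, standing in for a real browser."""
    def __init__(self):
        self.actions = []
    def type(self, locator, text):
        self.actions.append(("type", locator, text))
    def click(self, locator):
        self.actions.append(("click", locator))

driver = RecordingDriver()
LoginPage(driver).login("patrick", "s3cret")
print(driver.actions)
```

Because the page object only depends on the small type/click interface, the same test logic can run against a stub in CI and a real browser in the full environment matrix.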
can be immensely aggravating and expensive. You might have the budget
for expensive proprietary test suites, or you might not, but if you ever
need to do something out-of-scope or need to change vendors, then you'll
be proverbially stuffed. Sadly, I don't know the proprietary tools at all, and worse, I'm against them without adequate information. Yes, I suffer from the usual "I can do it" bias and personality fault, namely valuing my time less than valuing my money. Your situation may be different.

There are quite a few different "Software Test" and/or "Quality
Assurance" groups and conferences around. As you might expect, they
often favor proprietary test solutions since it's usually the vendors of
said solutions footing the bill for the conferences. Even with the deck
stacked in favor of the proprietary solutions, the groups/conferences
can be a good place to learn. EuroStar is one common conference but
there are many others if you look around. When looking up EuroStar, I found a writeup from last year that you might want to look over:

http://www.eurostarconferences.com/blog/2013/7/25/comparison...

If you've ever seen the musical called "The Wiz" or even the movie of
the same name, try to memorize the lyrics of the song, "No Bad News". If
you can learn to sing a reasonable A Cappella rendition of it, even better.
It will definitely come in handy, and more often than you'll want to admit. ;)

http://www.youtube.com/watch?v=pQT-QFy5Nig

Good Luck!

Wow, thank you for your incredibly detailed reply. I'll be sure to refer back to it again and again. I often have the problem I'm in the midst of right now: I focus on specifics rather than the bigger picture and the general things you need to keep in mind while working. Thank you very much for taking the time and effort to write that reply; I really appreciate it.