Your gathering place for information and ideas about Quality Assurance, Testing, and other topics of interest.
March 23, 2006
Lessons Learned Sessions
I've conducted some "Lessons Learned" sessions in the past. Some have been very useful, some less so.
When I've written Lessons Learned documents in the past, the purpose was to compile feedback from the entire project team regarding what worked well, and what didn't, and pass that information along to all the relevant parties. It was a learning exercise - intended to help us grow as a team.
Generally, I'd call a meeting in a large conference room and invite pretty much everyone who participated in the release - Managers, Developers, QAers, Product Managers, Documentation, Support, etc, etc.
I would "host" the meeting, and try to act as a facilitator. I'd ask a few simple questions to keep the ideas flowing, ask for clarification when needed, try to keep people on track, and note all the responses.
When the meeting was over, I'd gather and organize all the thoughts.
For me, I usually just had two main categories - What Worked Well, and What Didn't Work Well.
Underneath each, I'd try to group similar feedback into project-specific categories, such as "Build Process", "Documentation", "Meetings", etc, etc.
We usually learned a few things. Often (not always) we learned things we could actually change for the better.
BTW - Unless someone actually died, I prefer not to call such a meeting/document a "Post Mortem".
Labels:
QA
March 21, 2006
SQL Injection
SQL injection is a security vulnerability occurring in the database layer of an application. Its source is the incorrect escaping of dynamically-generated strings embedded in SQL statements. It is an instance of a more general class of vulnerabilities that can occur whenever one programming or scripting language is embedded inside another.
For example, consider a basic HTML login form with two inputs, username and password, which submits to a script called login.php.
The easiest way for login.php to work is by building a database query that looks like this:

"SELECT id
FROM logins
WHERE username = '$username'
AND password = '$password'";

If the variables $username and $password are taken directly from the user's input without checking for special characters, this can easily be compromised. Suppose that we gave "Joe" as a username and that the following string was provided as a password: anything' OR 'x'='x

"SELECT id
FROM logins
WHERE username = 'Joe'
AND password = 'anything' OR 'x'='x'";

Because the application is not really thinking about the query, but just constructing a string, the use of the single quotes has turned the WHERE into a two-component clause. The 'x'='x' part will be true no matter what the first part contains.
This could allow the attacker to bypass the login form without actually knowing a valid username / password combination!
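One common defense is to keep user input out of the SQL text entirely by using parameterized queries. Here's a minimal sketch (my own, using Python's sqlite3 rather than PHP, with a throwaway in-memory table) contrasting the vulnerable string-built query with a parameterized one:

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE logins (id INTEGER PRIMARY KEY, username TEXT, password TEXT)")
conn.execute("INSERT INTO logins (username, password) VALUES ('Joe', 'secret')")

username = "Joe"
password = "anything' OR 'x'='x"

# Vulnerable: the attacker's quotes become part of the SQL statement itself.
vulnerable = ("SELECT id FROM logins WHERE username = '%s' AND password = '%s'"
              % (username, password))
print(conn.execute(vulnerable).fetchall())  # [(1,)] -- login bypassed!

# Safer: placeholders pass the input as data, never as SQL.
safe = "SELECT id FROM logins WHERE username = ? AND password = ?"
print(conn.execute(safe, (username, password)).fetchall())  # [] -- login denied

The same idea applies in PHP via prepared statements; hand-rolled escaping is easy to get wrong.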
For more information about SQL injection, and for some reported cases of exposure in the wild, see:
- http://www.computerworld.com/action/article.do?command=viewArticleBasic&articleId=9080580&pageNumber=1
- http://socialscienceplusplus.blogspot.com/2007/08/connectucom-sql-injection-vulnerability.html
- http://msdn2.microsoft.com/en-us/library/ms161953.aspx
- http://www.sitepoint.com/article/sql-injection-attacks-safe
- http://www.codeproject.com/aspnet/SqlInjection.asp
- http://www.acunetix.com/websitesecurity/sql-injection.htm
- http://unixwiz.net/techtips/sql-injection.html
- http://www.spidynamics.com/spilabs/education/whitepapers.html
- http://www.ngssoftware.com/papers.htm
- http://www.nextgenss.com/papers/advanced_sql_injection.pdf
- http://www.nextgenss.com/papers/more_advanced_sql_injection.pdf
- http://www.spidynamics.com/papers/SQLInjectionWhitePaper.pdf
- http://www.imperva.com/application_defense_center/white_papers/blind_sql_server_injection.html
- http://www.imperva.com/application_defense_center/white_papers/sql_injection_signatures_evasion.html
- http://www.sitepoint.com/article/794
- http://www.owasp.org/software/webgoat.html
- http://en.wikipedia.org/wiki/SQL_injection
- http://searchwindowssecurity.techtarget.com/tip/1,289483,sid45_gci1161833,00.html
Labels:
QA
March 18, 2006
Inexpensive Usability Testing
A while back we had to do some usability testing, but didn't have much of a budget. We ended up getting it done with a setup that was new for me.
I was accustomed to having an expensive lab with one-way mirrors and video cameras to watch and record the user as he/she interacts with the application under test.
Instead, we used a plain, small conference room for the tester. The tester's computer hosted a pcAnywhere session, and the room's speakerphone was turned on.
In a larger conference room, the observers watched the test screen through a pcAnywhere session projected on the wall, and listened in on the speakerphone.
I was very pleased with the results. It all seemed to work well, and was an effective, inexpensive alternative to a real usability lab, IMHO. I plan to use this setup again.
Labels:
QA
March 17, 2006
Being in Quality Assurance is like...
Labels:
QA
March 14, 2006
Internationalization and Localization
There are certainly a lot of issues when preparing to test the Internationalization (i18n) and/or Localization (l10n) of a system.
First, get familiar with how you plan to Internationalize and Localize your application (one common approach to externalizing text is sketched after the links below):
- Same Application or Different Versions?
- Text in Resource Files?
- Using Unicode?
- Localized In-house or by a Separate Organization?
- etc, etc.
Some useful resources:
- http://babelfish.altavista.digital.com/translate.dyn
- http://www.javaworld.com/javaworld/jw-12-1998/jw-12-internationalize.html
- http://www.microsoft.com/windows/ie/features/ime.asp
- http://www.microsoft.com/globaldev/
- http://msdn.microsoft.com/library/default.asp?URL=/library/books/devintl/s24aa.htm
- http://www.bspage.com/address.html
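If you go the "text in resource files" route, most platforms have library support for it. Here's a minimal sketch (my own; the catalog name "myapp" and the locale directory layout are hypothetical) using Python's standard gettext module:

import gettext

# Assumes compiled message catalogs at locale/<lang>/LC_MESSAGES/myapp.mo
t = gettext.translation("myapp", localedir="locale", languages=["de"], fallback=True)
_ = t.gettext

# With a German catalog present this prints the translation;
# with fallback=True and no catalog, it prints the English original.
print(_("Welcome"))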
Consider how it will deal with different datatypes, their localized entry/display formats, and related regional and legal concerns (a small formatting sketch follows this list):
- Numbers
- Dates
- Times
- Time Zones, and any effects of Daylight Saving Time
- Currencies
- Phone Numbers
- Addresses
- Zip Codes / Postal Codes
- String Fields' Character Set
- National Characters and Accent Marks
- Special Characters
- Internet Domain Names
- Encoding for Web Pages
- Use of Abbreviations
- Use of Color
- Use of Icons and other Images
- Paper Sizes (8.5 x 11 vs A4)
- Envelope Sizes
- Localized Sort Order
- Order of Given and Family Names
- Keyboard Types to be Used
- International Laws and Regulations
- Encryption Rules and Regulations
- Copyright Laws
- Security Expectations and Regulations
- Import/Export Limitations
- Tax Issues
- Currency Exchange/Conversion Rules
- Holidays
- Simultaneous with the Original Version?
- Phased Releases?
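To get a feel for how many of these formats shift between locales, here's a minimal sketch (my own; which locale names are installed varies by platform, so the names below are assumptions) using Python's standard locale module:

import locale

for loc in ("en_US.UTF-8", "de_DE.UTF-8"):
    try:
        locale.setlocale(locale.LC_ALL, loc)
    except locale.Error:
        continue  # this locale isn't installed on this machine
    # Number grouping and decimal separators flip between the two locales:
    # en_US: 1,234,567.89    de_DE: 1.234.567,89
    print(loc, locale.format_string("%.2f", 1234567.89, grouping=True))
    # Currency symbol and placement are locale-specific, too.
    print(loc, locale.currency(1234567.89, grouping=True))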
Labels:
QA
Perhaps They Should Have Tested More - McAfee
McAfee virus update wreaks havoc
Update quarantines or deletes legitimate system files
Tom Sanders in California, vnunet.com, 14 Mar 2006
McAfee was forced to publish an update to its virus pattern database on Friday after the previous version mistakenly flagged system files as malware. The error caused several versions of McAfee's antivirus software to quarantine or delete system files, depending on the software's configuration.
Affected applications included Microsoft Excel, Google Toolbar Installer, Macromedia Flash Player and Windows XP.
McAfee has published a full list of files (PDF download) that were incorrectly flagged. The error spanned all operating systems from Linux to OS X and Windows.
"Users who have moved detected files to quarantine should restore them to their original location. Windows users who have had files deleted should restore files from backup or use System Restore," McAfee said in an advisory.
The company had not, at the time of going to press, returned several phone calls from vnunet.com seeking further information.
The Sans Internet Storm Center said that the bad signature files were available for several hours. A user had to run a virus scan for the problem to arise.
While users who have quarantined the infected files should have relatively little trouble restoring them, the error could still cause considerable damage, according to Daniel Wesemann, a volunteer with the Sans Internet Storm Center.
"Things like this can get messy pretty quickly if the antivirus scanner starts to quarantine vital components of your environment," he warned.
In a similar case last month, antivirus firm Sophos wrongly claimed that files on Mac computers running OS X were infected with the Inqtana-B worm. The software in some cases reported over 1,000 infections.
One user reported to vnunet.com that the Sophos mix-up caused the software to delete over 1,200 files from his PC, and that he was forced to completely reinstall the system.
http://www.vnunet.com/vnunet/news/2151887/mcafee-virus-update-wreaks
Labels:
Perhaps They Should Have Tested More,
QA
March 13, 2006
Free Test Plan Template
This is a simple Free Test Plan Template.
Please excuse the formatting - this blog makes it hard to preserve the Word Document formatting I used originally.
{ProductName} {Version}
Test / Verification Plan
Revision 1.x
Author: Joe Strazzere
Control Page
Title: {ProductName} {Version} – Test / Verification Plan
File: {ProductName} {Version} Test Verification Plan.doc
Source Template:
Document Owner: Joe Strazzere
Date of Issue: {Date}
Abstract: This document defines the test verification plan for the
{ProductName} {Version} release.
Revision History
Revision | Date | Initials | Description
1 | mmm dd, yyyy | JSS | Initial document created / sent.
1.1 | mmm dd, yyyy | JSS | Incorporated suggestions from …
Distribution
Name Role
Firstname Lastname Project Manager
Firstname Lastname Engineering Manager
Joe Strazzere QA Manager
Firstname Lastname QA Engineer
Firstname Lastname QA Engineer
Table of Contents
Overview
Roles
Platforms
Test Environment
Tracking
Release Criteria
Categories of Tests
Approximate Count of Test Cases
References
Overview
This document will outline the test verification plan for {ProductName} {Version}, currently scheduled for Q1 yyyy.
Roles
Following are the relevant major Roles assigned to this project:
• Project Manager Firstname Lastname
• Development Manager Firstname Lastname
• Developer Firstname Lastname
• Developer Firstname Lastname
• Developer Firstname Lastname
• QA Manager Firstname Lastname
• QA Engineer Firstname Lastname
• QA Engineer Firstname Lastname
• Doc Manager Firstname Lastname
• Configuration Engineer Firstname Lastname
Platforms
{ProductName} {Version} will be tested on the following Operating Systems:
• Windows 2000 Professional (for Proxy Server only)
• Windows 2000 Server
• Windows 2003
• Windows XP Professional
Test Environment
{ProductName} {Version} will be tested in the QA Lab environment, using accounts created for testing purposes. The following equipment will be used; no new equipment is required:
Machine | CPU | RAM | OS
machinename1 | Dual 2.4 GHz Xeon | 2GB | Windows 2003
machinename2 | Dual 2.4 GHz Xeon | 2GB | Windows 2003
machinename3 | Dual 2.8 GHz Xeon | 2GB | Windows 2003
machinename4 | Dual 2.4 GHz Xeon | 2GB | Windows 2000 Server
Tracking
All Requirements, Tests, and Issue Reports will be stored in {BugTrackingSystem} as follows:
• Database {DatabaseName}
• Project {ProjectName}
• Target Ver {Version}
Release Criteria
The project will be considered ready for Beta when it meets the following criteria:
• All functionality has been tested
• All Issues with Priority=High have been Fixed and Verified
• In the week before Beta, no new Priority=High Issues are found, despite constant testing pressure
The project will be considered ready for GA when it meets the following criteria:
• All functionality has been tested
• All Issues reported from Beta sites have been entered into {BugTrackingSystem}
• All Issues with Priority=High and Priority=Medium have been Fixed and Verified
• In the two weeks before GA, no new Priority=High or Priority=Medium Issues are found, despite constant testing pressure
Categories of Tests
The following major categories of tests must be performed:
• New Functionality Tests – The majority of these tests will focus on testing each of the specific new features of {ProductName} {Version}:
o Support for {NewFeature1}
o Support for {NewFeature2}
o Support for {NewFeature3}
• Integration Testing – Integration testing will verify the interoperability of {ProductName} components.
• Regression Testing – Regression testing will verify that existing features of {ProductName} continue to operate as expected.
o For {ProductName} {Version}, it will be necessary to perform a full regression testing cycle
• System Tests – These tests will focus on the {ProductName} product as a whole, single solution in a production-simulated environment.
o Scale, Load and Duration tests will be performed
• Bug Fix Verification – Throughout the testing cycle, bug fixes will be verified in as timely a manner as possible.
• Interoperability Testing – These tests will focus on the interoperability of {ProductName} with non-{ProductName} components:
o {OtherProductName1}
o {OtherProductName2}
o {OtherProductName3}
o {OtherProductName4}
Approximate Count of Test Cases
Approximately xxx functional Test Cases will be exercised within the following areas; more will be added as necessary:
• {FeatureArea1} 5
• {FeatureArea2} 35
• {FeatureArea3} 10
• {FeatureArea4} 15
• {FeatureArea5} 20
• {FeatureArea6} 20
• {FeatureArea7} 85
• {FeatureArea8} 5
• {FeatureArea9} 15
• {FeatureArea10} 15
• {FeatureArea11} 25
• {FeatureArea12} 5
References
Version {Version} Requirements
• in {BugTrackingSystem}
{ProductName} {Version} Functional Specification
• In Outlook ({FileNameAndLocation} FunctionalSpec.doc)
{ProductName} {Version} Functional Specification Update
• In Outlook ({FileNameAndLocation} Functional Specification Update.doc)
{ProductName} {Version} Reference
{ProductName} {Version} Test Developer User Guide
{ProductName} {PriorVersion1} {PriorVersion2} test cases
• in {BugTrackingSystem}
March 12, 2006
QAer Status Reports
I don't like to burden my QA Team with paperwork. In general, I prefer just talking over receiving reports.
But, when my team gets too large, or too dispersed geographically, I need to ask for Status Reports, to help me keep track of things over time.
Here's an email I send to all new QAers on my team, asking them to send me weekly Status Reports:
I like to receive weekly Status Reports from each QAer. This helps keep me up to speed on what you are doing, and hopefully keeps me in the loop on issues I need to know about.
Please send me the following via email each Friday before you leave.
1) What I worked on this week:
I'm just looking for bullet items here. The basics of what you have done for the week.
I don't need to know how many hours you spent on things, just what you were doing.
For example:
- Wrote Test Plan for new Zerble Suite 7.0 feature XXX
- Completed testing of Framis 3.2 SR6 using AB&T database
- Attended planning meeting for Zerble Suite 8.0
- etc, etc
2) What I plan to work on next week:
Again, just the basics.
I'm looking to see what is on your plate for the upcoming week.
3) Unplanned Activities:
This will help me better keep track of progress on our schedules versus all the other activities we participate in.
This should include:
- any work not normally part of your weekly routine
- any work on customer or hitlist problems
- any work helping out other teams
- basically anything not directly involved with QA, testing, creating test cases, etc for the upcoming scheduled release(s)
4) Time away next week:
Please tell me if you are expecting to be out of the office during the upcoming week.
I don't need to know too much detail here, just when you won't be here, and a general reason as to why.
For example:
- Vacation day Tuesday, February 10
5) Issues and Concerns
Anything I should know about.
If there's anything getting in the way of your work, here's where you can note it.
For example:
- Memory upgrade on Bigtest server has been delayed. Without the additional memory, my load testing cannot be completed.
I don't want this to take more than a few minutes to complete.
Some folks find it simpler to keep track of tasks each day, then just paste them into an email at the end of the week.
I will not be sharing the details of your status reports with anyone, although I do pass along a high-level summary of the QA Team in general to the VP of Engineering.
We'll revise this process as we need to going forward, making sure it is simple, not burdensome, but effective.
Thanks,
-joe
March 11, 2006
Don't Break the Build!
Where I work, we usually perform a daily build.
The idea is to have continuous integration of new features, and to provide quick feedback to the Developers when something isn't working.
Occasionally, we'll run into a stretch in the development cycle where the build is frequently broken. This is bad - very bad!
So, over the years, I've tried different techniques to encourage Developers to unit test their changes before checkin so that they don't break the build.
- I've given out toy bugs
- I've tried funny hats
- I've tried posters with Red, Yellow and Green status indicators
- I even wrote a song once ("May the Build Remain Unbroken")
When someone corrects a broken build, or fixes a particularly bad bug, I often give them a small reward.
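Rewards aside, a mechanical gate helps too. Here's a minimal sketch (my own; the test-suite layout under tests/ is hypothetical) of a pre-checkin script that runs the unit tests and refuses the checkin when they fail:

import subprocess
import sys

def main() -> int:
    # Run the unit test suite; a non-zero exit code should block the checkin.
    result = subprocess.run([sys.executable, "-m", "unittest", "discover", "-s", "tests"])
    if result.returncode != 0:
        print("Unit tests failed - fix them before checking in, or you'll break the build!")
    return result.returncode

if __name__ == "__main__":
    sys.exit(main())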
Labels:
QA
Hiring QA Interns
Where I work now, we hire Interns from local colleges to augment our QA Team. Typically we bring in two students at a time for a 6-month paid internship.
They each work with one of the Senior QAers in a variety of roles - Functional Testing, Scalability Testing, etc. It works out well for us - we get two more QAers when the budget doesn't allow for full-timers. And it works out well for them - they get some hands-on experience.
One of the challenges in hiring an Intern is finding two good candidates from the population of applicants. They are young, inexperienced and most have little in the way of interviewing skills. Some have done an internship before, but many have not.
We have found that it works best to talk with them once, then make a quick decision. I usually talk with them first for about a half-hour, followed by a half-hour each with the two Senior QAers. After we have spoken with a batch of applicants, we get together and discuss the candidates and decide which, if any, deserve offers.
Here are some of the questions I ask when interviewing a potential QA intern. Most of them are not make-or-break questions themselves, but they help us quickly assess the candidate in very general terms:
Do you have any time off planned during the 6-month period of the internship?
In general, we are looking for people who can devote the full 6 months to the internship. We don't want someone who is planning a 2-month trip to Europe in the middle of this period, for example.
What’s your current school schedule like?
If the student likes to take a bunch of easy courses, then we may not be the right internship home.
I also like to find out if they are scheduling lots of days off in their week, or would prefer to be busy.
Last, I try to see if they are early-risers, or late-sleepers. We have room for both - it's just nice to know where they fall.
How many hours per week would you like to work?
We try to make it clear in the job posting that we are looking to fill an 8-hours-per-day, 40-hours-per-week position. But, occasionally we get a candidate who is only looking to work 20 hours or so.
What year are you in?
Our past experience has shown that the later in their academic career we can get them, the better the internship.
For whatever reason, Juniors and Seniors end up being more serious, more experienced and more hardworking in general than their younger counterparts.
Have you worked an Internship before?
Hey, you have to start somewhere. But, if you've already done an internship before, you'll have a better sense of what is expected, and what is in store for you.
Not everyone likes working in a cubicle. If you've done it before and want to do it again, that's a good sign.
Do you have reliable transportation?
We work in a suburban area.
There is bus transportation, but it's not the best way to get here - particularly in bad weather.
If the candidate has a car, we have found that they tend to show up on a more regular basis than if they don't.
What do you like to do when you aren’t studying?
If the candidate spends all their free time clubbing, that might be fun for them.
If they spend free time playing with computers and software, that might be a better indicator of potential success.
Do you know anything about testing/QA?
Few Universities teach anything at all about Testing or Quality Assurance.
But, some candidates have still found a way to understand what testing and QA are, and can express it.
What Operating Systems are you familiar with?
Some candidates know how to use only Windows, and perhaps a small bit of Unix.
Other candidates have been installing Windows and mucking around with Linux since they were children.
Have you had a chance to check out our web site?
We'd like to hire interns who have enough initiative to look at our web site and understand what we do.
Do you have any questions for me?
Hopefully, this open-ended question leads to a discussion of what my company does, what we in QA do, and what the intern can expect.
As I've said, none of these are make or break questions. As much as anything we are looking for someone who is smart, wants to work hard, and is willing to learn.
We've been very lucky to have found some pretty good interns over the years. Some have come back and become employees. Others have gone on to positions in other companies. Most of the time we give them a great recommendation.
Labels:
Interviews,
Management,
QA,
Templates
March 10, 2006
Perhaps They Should Have Tested More - Tarrant County, Texas
Posted on Thu, Mar. 09, 2006
Vote spike blamed on program snafu
By ANNA M. TINSLEY and ANTHONY SPANGLER
STAR-TELEGRAM STAFF WRITERS
An undetected computer glitch in Tarrant County led to inflated election returns in Tuesday's primaries but did not alter the outcome of any local race, elections and county officials said Wednesday.
The error caused Tarrant County to report as many as 100,000 votes in both primaries that never were cast, dropping the local turnout from a possible record high of about 158,103 voters to about 58,000.
Because the errors added votes equally for each candidate, the glitch did not change the outcome of Tarrant County races but narrowed the margin of victory in some statewide races. In the close Republican primary race for Texas Supreme Court, for example, incumbent Don Willett edged past former Justice Steve Smith by only about 1 percentage point with the corrected vote tallies.
Questions about possible problems were raised by election staff late Tuesday night, as it became apparent to some that the county would far exceed the 76,000 votes cast in the 2002 primary elections.
But elections officials did not look into the discrepancies that night because they were dealing with a new system, new procedures and some new equipment, said Gayle Hamilton, Tarrant County's interim elections administrator.
"We didn't think there was a problem," Hamilton said. "We should have stopped right then.
"But we didn't question it at that time."
The problem stemmed from a programming error by Hart InterCivic, which manufactured the equipment and wrote the software for the local voting system. The system is designed to combine electronic early voting results and totals from paper ballots on Election Day.
The error caused the computer to compound the previous vote totals each time the election totals were updated throughout the night, rather than keep a simple running total, officials said.
"The system did what we told it to do," said John Covell, a vice president with Hart. "We told it incorrectly."
The program was designed specifically for Tarrant County, and no other counties reported similar problems, elections officials said.
By 7 a.m. Wednesday, campaign officials for Robert Higgins, who ran against Republican state Rep. Anna Mowery in state House District 97, showed up at election headquarters wanting to know how more than 20,000 people could have voted in that race.
"We were watching the results and we knew what the universe of numbers should be," Higgins said. "We expected about 8,000 in our race and got about 21,000."
Election officials then began reviewing the results and discovered errors. Hart officials were called in and spent much of Wednesday reviewing election results.
By late Wednesday, officials were still running reports showing precinct-by-precinct totals -- about 5,000 pages in all -- to examine and compare the data with information collected by election judges countywide.
"Then we will feel very comfortable that the information is correct," County Administrator G.K. Maenius said. "We're going to be working on this continuously."
Democratic Party Chairman Art Brender said he had been on the verge of calling elections officials to get precinct-by-precinct data when he was told that there had been a problem with totals on election night.
"I was concerned about the results when I saw them," he said. "I thought there were too many."
Republican Party Chairwoman Stephanie Klick also said she was skeptical of the results when she saw that some GOP races had 114,000 voters turning out to cast ballots.
"That would have been a record turnout," she said.
Brender said the glitch drives home the need for a paper trail for the next election. Officials hope by the May elections that a device will be added to the electronic eSlate machines used in early voting to record paper copies of ballots cast. The Texas secretary of state's office must first give its approval.
For the ongoing review of Tarrant County data, printouts kept by election judges are being matched to the recording tape in the voting machines.
"I'm not concerned about the accuracy of data when it came in and was preserved," Brender said. "I'll be comfortable with electronic voting when there's a verifiable paper trail."
County officials say they don't know how much it will cost to correct the numbers. Hamilton said the county will waive the usual charge for candidates who want a recount.
In 2002, Tarrant County election officials did not report final tallies for more than a day after polls closed because of a different programming error that caused machines to ignore votes for individual candidates when a voter cast a straight-party ballot.
Republican and Democratic party officials are responsible for canvassing the election returns, which makes them official, by March 18. The returns will then be turned over to the state, party officials said.
IN THE KNOW
Voting glitch
A sample of the vote tallies illustrates the computer glitch in Tarrant County that led to overcounting of votes for Tuesday's primaries. The results listed below were taken from the Republican governor's primary in Tarrant County. The computer erred by adding previous totals to the running vote total, compounding the number of votes cast each time election officials tallied the totals throughout election night.
Time | Vote counts | Ballots cast | Phantom votes |
8:27 p.m. | 1,352 | 1,352 | 0 |
9:04 p.m. | 6,398 | 5,046 | 1,352 |
9:35 p.m. | 14,129 | 7,731 | 6,398 |
10:15 p.m. | 20,176 | 6,047 | 14,129 |
10:55 p.m. | 27,895 | 7,719 | 20,176 |
12:30 p.m. | 28,374 | 479 | 27,895 |
Source: Tarrant County Elections Center
Anna M. Tinsley, (817) 390-7610 atinsley@star-telegram.com Anthony Spangler, (817) 390-7420 aspangler@star-telegram.com
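The failure mode described above is easy to model. Here's a toy sketch (my own, with made-up numbers; not Hart InterCivic's actual code) of the difference between publishing a running total and compounding the previously published figure back in:

# Each reporting cycle produces a cumulative tally of all ballots counted so far.
cumulative_tallies = [1000, 3500, 7200, 12000]  # hypothetical running tallies

published_correct = 0
published_buggy = 0
for tally in cumulative_tallies:
    published_correct = tally                  # correct: replace with the new running total
    published_buggy = published_buggy + tally  # bug: re-adds every earlier ballot

print(published_correct)  # 12000 real ballots
print(published_buggy)    # 23700 -- the extra 11700 are "phantom votes"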
Labels:
Perhaps They Should Have Tested More,
QA
March 9, 2006
Perhaps They Should Have Tested More - Trend Micro
From NetworkComputing.com
Fix Thyself
Trend Micro's new antivirus update crashes thousands of PCs, demonstrates the need for testing
Trend Micro in April pushed out a faulty antivirus software update that sucked up all the processor cycles in Windows XP SP2 machines and caused many of them to fail. The antivirus software vendor should make restitution for such a fundamental and egregious error.
May 12, 2005 - By Tim Wilson
Every entry-level programmer knows the secret to writing good software: Test, test, test. That's why we were blown away when Trend Micro--an established antivirus software vendor--published an update of its product suite that brought thousands of Windows XP machines to a sudden halt.
On April 22, Trend Micro pushed out updates of its OfficeScan, PC-cillin and several server antivirus products to hundreds of customers, principally those in Japan, where the company has a large client base. Many organizations trust their antivirus vendors to send updates directly to end users' computers, without those updates being vetted or tested by the local IT department, to speed the installation process.
But this time, the update contained a faulty pattern file that sucked up all the processor cycles in Windows XP SP2 and caused the affected PCs to crash. Japanese IT managers traced the problem back to the Trend Micro software, and the vendor pulled the updates after about 90 minutes. But the damage had been done: Hundreds of IT staffers across the country worked the weekend to fix the thousands of damaged machines.
With so many good technologies available for software testing, packaging and distribution, such a blunder is unforgivable. Trend Micro should not only make restitution to its customers for their lost time, it also should eat crow for failing to meet its primary objective: keeping its customers' PCs up and running.
Labels:
Perhaps They Should Have Tested More,
QA
March 7, 2006
Free Issue Tracking Template
Here's a free template of an Issue Tracking system I currently use. (A small data-model sketch follows the field definitions below.)
Issue Number: A system-generated number that uniquely identifies the Issue.
Summary: This is a short, one-line statement that summarizes the issue.
Created: This system-generated field displays the date the Issue was created and the name of the person who created it.
Last Modified: This system-generated field displays the date the Issue was most recently modified and the name of the person who modified it.
Closed: This system-generated field displays the date the Issue was closed and the name of the person who closed it.
Product: This is the product for which the issue applies. It will typically be selected from a list customized for the particular database and project.
Issue Type: This indicates the type of issue. The choices are:
- Bug: This is a bug in the existing product
- Feature Request: This is a request for a new feature
- Task: This is a new development task
- Hitlist: This is a hitlist (ie, customer-and-time-critical) issue
- Flex: This issue holds anticipated Developer time off
- Other: Anything else
Priority: This indicates the priority of the Issue. The choices are:
- Critical: This is the highest priority and should only be given to issues associated with customers being down, or when testing is blocked. These issues will often be added to the hitlist.
- High: This priority is for serious issues that are causing the user a great amount of pain but can be managed in the short-term with a workaround.
- Medium: This is the priority that should be given to most issues.
- Low: This is for issues that are not very important. Easy workarounds exist for these issues.
Target Ver: This is the version of the product in which the issue was (or will be) resolved. All version numbers should have one of the following forms: M.mm where M is the major version number, mm is the minor version number (ex: 2.50), or the code name of the Project in which the issue is to be fixed (ex: Havard).
Work Days: This indicates the Developer’s estimate of the days required to resolve the Issue. This field typically only exists once the Issue reaches Accepted status.
Fix Build: Once fixed, this field indicates the full version-and-build in which the fix is expected to appear. Once the fix is verified, this field indicates the full version-and-build number of the build used while verifying the fix.
Status: This describes the current state of the issue. The possible choices are:
- New: This is the status that is given to all newly created issues.
- Accepted: This is the status that is given to an issue that will be worked on for the next release.
- Fixed: This is the status that is given to an issue that has been fixed (or implemented in the case of a feature request) and needs verification.
- Closed: This is the status given to an issue that has been successfully verified.
- Deferred: This is the status given to issues that will be fixed in a future release.
Description: This is a continuous dialog contributed to by each owner of the issue. Typically, it starts with a description of the problem, the steps required to reproduce the problem, the expected results, and the actual results. It continues with a description of the solution entered by the developer who fixed the problem. It may further continue with notes from the verifier (especially in the case where the verification failed).
Each time more description text is added, the system automatically adds a time stamp.
Attachments: All files which help clarify the issue are attached here
Associated Tests: The Test Case associated with this Issue is linked here
Associated Issues: Any duplicate Issues are linked here. Any other Issues which can help clarify this issue are also linked here.
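Here's a small sketch of how these fields might be modeled in code (my own illustration with hypothetical names, not the schema of any particular tracking tool):

from dataclasses import dataclass, field
from datetime import datetime
from enum import Enum
from typing import List, Optional

class IssueType(Enum):
    BUG = "Bug"
    FEATURE_REQUEST = "Feature Request"
    TASK = "Task"
    HITLIST = "Hitlist"
    FLEX = "Flex"
    OTHER = "Other"

class Priority(Enum):
    CRITICAL = "Critical"
    HIGH = "High"
    MEDIUM = "Medium"
    LOW = "Low"

class Status(Enum):
    NEW = "New"
    ACCEPTED = "Accepted"
    FIXED = "Fixed"
    CLOSED = "Closed"
    DEFERRED = "Deferred"

@dataclass
class Issue:
    number: int                        # system-generated unique identifier
    summary: str                       # short, one-line statement
    product: str
    issue_type: IssueType
    priority: Priority
    status: Status = Status.NEW        # all newly created Issues start as New
    target_version: str = ""           # e.g. "2.50", or a project code name
    work_days: Optional[float] = None  # developer's estimate, set once Accepted
    fix_build: str = ""                # full version-and-build containing the fix
    description: List[str] = field(default_factory=list)  # time-stamped dialog

    def add_note(self, author: str, text: str) -> None:
        # Mimic the system's automatic time stamp on each new description entry.
        stamp = datetime.now().strftime("%m/%d/%y %I:%M %p")
        self.description.append(f"{text}\n[{author} - {stamp}]")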
Here's a sample Issue Report:
Issue Number: 1234
Summary: Right-Click Menu is Missing the Delete Choice
Created: 2/21/06 at 10:43 AM by Joe Strazzere
Last Modified: 3/1/06 at 8:12 AM by Joe Strazzere
Closed: 3/1/06 at 8:12 AM by Joe Strazzere
Product: Whiz-Bang
Issue Type: Bug
Priority: Medium
Issue Build: 2.50.0401
Target Ver: 2.50
Work Days: 1
Fix Build: 2.50.0406
Status: Closed
Description:
When I right-click the Zerble object, the menu which appears does not include the "Delete" choice, as it did in prior versions.
Nothing unexpected appears in the server.log file.
The JavaScript console shows no errors.
To Reproduce:
- Open the Framis in the Whiz-Bang Editor
- Select one of the Zerble objects which appear
- Right-click
- Note the choices in the pop-up menu
Expected Results:
- The pop-up menu choices should be Add, Edit, Delete, Save, Save As, Help, and Exit
Actual Results:
- Delete is missing, the rest are as expected.
- (see attached zerblemenu.jpg)
[Joe Strazzere - 2/21/06 10:43 AM]
The call to popUpMenuSelections was made without the isItDeleted modifier. Should be fixed in the next build.
[Dee Veloper - 2/28/06 6:22 PM]
Verified in build 406 using IE and Firefox.
[Joe Strazzere - 3/1/06 8:12 AM]
Attachments: zerblemenu.jpg
Associated Tests: 142 - Zerble Object Right-Click Menu Choices
Associated Issues: (none)
March 1, 2006
Perhaps They Should Have Tested More - H&R Block
Some major software problems at H&R Block recently:
During a conference call with analysts, company chairman and chief executive Mark Ernst said software problems during the first half of January likely prevented the company from serving an estimated 250,000 clients.
The company also cut its forecast for full-year 2006 earnings, blaming, among other things, “a slower start to the tax filing season than in previous years.” But it acknowledged it compounded the problem by introducing a new technology that went haywire -- and sent a quarter of a million customers to rivals.
...
But in a conference call with investors, Mark Ernst, the company’s chairman and chief executive, said the slow start was exacerbated by “self-inflicted wounds.” Ernst said software-related technology problems left the company unprepared for a surge in January filings by taxpayers expecting refunds and “created a hole out of which we’re working to climb.” He said the problem “cost us 250,000 clients” that were “unable to be recovered.” The company said a new software distribution system introduced in January had caused its offices glitches that would be fixed for a day, then pop up again. It said the problems left some offices unable to process taxes.
...
Besides the problems that Block had with its own tax prep needs, the company also experienced difficulties with the technology in its offices last month that hit its bottom line early in tax season. “Technology problems across the H&R Block network in early January impacted our ability to serve clients in those crucial early weeks,” said Block Chairman Mark A. Ernst. He said the problems had been corrected, but they impacted the company’s ability to serve 250,000 clients at that time of year. Mr. Ernst said he was pleased with sales of the company’s TaxCut software and online tax prep products. Block has implemented an integrated TaxCut marketing and brand-building campaign this year and has revamped the software’s look and feel. However, after the snafu in Block’s in-house tax preparation efforts, the best-selling tax prep software package, Intuit’s TurboTax, may see increased sales from customers who may decide not to buy Block’s TaxCut this year.
...
Tax preparation giant H&R Block has a little problem. It miscalculated its own state income taxes, understating its liabilities by $32 million as of April 30, 2005. The company has also had serious problems with its in-house computer network.
Besides bungling its own taxes, H&R Block had serious problems with its in-house computer network that cut into its business in January.
"Technology problems across the H&R Block network in early January impacted our ability to serve clients in those crucial early weeks," said Block Chairman Mark A. Ernst. He said the problems had been corrected, but they impacted the company's ability to serve 250,000 clients who were trying to get an early start on their taxes. Analysts speculated the internal technical troubles could reflect badly on the company's TaxCut software, to the benefit of its primary competitor, Intuit's TurboTax.
This, and their inability to accurately assess their tax liabilities, even prompted a David Letterman "Top Ten":
Top Ten H&R Block Excuses
10. "Instead of CPA training, employees got CPR training"
9. "Forgot to carry the one 32 million times"
8. "For years we've been secretly funding Hamas"
7. "H was out sick that day and R was on jury duty"
6. "We were using Martha Stewart's guy"
5. "Were testing the world's first accounting monkey"
4. "Come on, it's a couple of dollars. It's not like we shot a guy in the face..."
3. "Hard to stay focused when you've been drinking since April 16th"
2. "Thirty-two million dollars?! We lose that much on a good day"
1. "Hoping for hot make-up sex with the IRS"
And it's not the first time Block has had software problems:
Feb 14, 2000
H&R Block's online tax filing service exposed some customers' sensitive financial records to other customers last weekend, prompting the company to shut down the system yesterday afternoon, CNET News.com has learned.
The company's Web-based tax preparation service, which is the premier sponsor of Yahoo's Tax Center, experienced a technical glitch that accidentally switched some tax filers' records, H&R Block confirmed today. As a result, when some registered users signed on to the service to work on their tax returns, they instead received someone else's filing--including a social security number, home address, annual income and other highly sensitive information.
"What we discovered was that some of our clients' data was appearing in other clients' data files," said Linda McDougall, vice president of communications for H&R Block. "We're keeping it down until we're convinced that the problem has been corrected."
McDougall emphasized that the problem only affected the Web-based preparation and filing of returns. Taxes processed with H&R Block's preparation software or at one of the company's offices were not exposed, she said.
The software glitch revealed the confidential records of at least 50 people, although the full extent of the problem will not be known until the company completes an internal audit, McDougall said. She added that at least 10 customers have contacted the company about the problem.
"Once we determined this, we took our system offline immediately and we began an audit of our entire customer database," McDougall said.
"We're confident that it wasn't due to a hacker--we feel that it was a software problem within our system," she added. "No return has been filed to the Internal Revenue Service that contains inaccurate data."
This is the second time in two weeks that H&R Block's $9.95 "Do-it-yourself" Net filing service--which more than 300,000 people have used so far this year--has suffered a technical problem and had to be shut down. H&R Block expects to handle more than 650,000 returns via the Net this year.
Other Web sites also have had security concerns in recent months. For example, RealNames, a company that substitutes complicated Web addresses with simple keywords, warned its users last week that its customer database had been hacked, and that user credit card numbers and passwords may have been accessed.
The H&R Block privacy breach was no doubt startling to some users who chose the 40-year-old company over other online services, such as Intuit's TurboTax software. User anxiety was intensified because it occurred on the weekend, making it difficult to locate an H&R Block employee who could address the problem.
Joshua Kasteler of the San Francisco Bay area said he was tackling his EZ 1040 on Sunday when the H&R Block system started to act sluggish. Kasteler logged off, and when he signed on to the password-protected site an hour later, he was given access to the records of another H&R Block customer.
"Instead of my information, it was a gentleman from Texas who worked for Advanced Micro Devices," Kasteler said, noting that the forms also listed the other person's phone number, address, social security number and annual income. "I assumed that someone else has my information, too, because this guy's information fell into my lap. I had this guy's life."
Kasteler said he emailed and called H&R Block but still had not heard back from the firm as of late today. So he decided to call the man whose information he had accessed: James Keech, a maintenance technician who also had trouble with the H&R Block site and had been unable to process his return since Thursday.
"When (Kasteler) called, I was freaking," Keech said. "I was like, 'If he's got it, how many other people have my file and aren't being honest and letting me know.' "
Keech said he called H&R Block and was told that there had been a security problem. He has asked that his data be deleted from the system.
"I'll probably go to a regular tax filing office now," he said. "It would have been easier to fill it out on paper."
The 1040 EZ is a simplified IRS form that does not include information such as itemized deductions, capital gains or rental income.
H&R Block's privacy policy states that "information contained in your tax return will be treated with extreme care and confidence...we will never disclose any tax return information without your consent." Like many Web sites, however, the policy doesn't address information that is accidentally disclosed without permission.
With the growth of the Net, consumer advocates have been pushing for umbrella data-protection laws to safeguard U.S. computer users, who may be giving up more information in the digital age that makes them vulnerable to fraud and privacy breaches.
The Clinton administration and Congress, however, have been reluctant to pass new privacy laws that impose stricter penalties for firms that don't secure the data they collect. Instead, the U.S. government has favored industry-developed guidelines.
Labels:
Perhaps They Should Have Tested More,
QA