I do QA for a living. If that’s the end result, it wasn’t intuitive. 😅
“The only intuitive interface is the nipple. After that, it’s all learned.” — traditional 20th-century folk wisdom.
Some babies have to be taught to nurse…
Bottles have nipples
Can you milk a bottle, Greg?
If their issue is with latching, then a bottle’s not gonna change that.
I’m not a professional baby feeder; I just know that when my son wouldn’t latch on a tit, they gave us a bottle and he did just fine.
I’m pretty sure that won’t stand in the way of somebody inventing a square bottle nipple and blaming the users for not using it properly.
I agree to a point, but users also do some weird stuff that you just can’t predict sometimes.
And that’s precisely why QA still exists and why it shouldn’t be the devs. And yet, you’ll still wind up with weird situations, despite your best efforts!
Yeah.
Any good software developer is going to account for and even test all the weird situations they can think of … but not the ones they can’t think of, since they’re not even aware those are a possibility (if they were, they would account for and test them).
Which is why you want somebody with a different mindset to independently come up with their own situations.
It’s not a value judgment on the quality of the developer, it’s just accounting for, at a software development process level, the fact that humans are not all-knowing, not even devs ;)
And some of that is because some users have been trained on some other bad UX.
And this is an incredibly valuable reason to have a technically simple UI, because it fundamentally limits the amount of stupid shit people can do, without it being the fault of the designer.
To be fair, all “users” got what they wanted, so… Success?
“Ugh, it works, but it was overly complicated to get what I needed.”
If they tried opening the door the wrong way, the door is wrong.
This is very perfectionist. Let me install my doors the way that’s comfortable or pleasing. Where I see a knob I’ll reach, and where I see a “pull” sign I pull, or get context clues.
There is research for everything. Let’s say it’s more comfortable to push and the knob is on the right side for me. I could spend way more time and effort than this deserves to appeal to that study; “I have great UX”, I’d tell myself. But then I’d show this product on some Eastern market where they read in “reverse”, and it won’t be comfortable nor “100% natural” for them. Meaning I’d fail; my UX would be horrible for half the planet.
This might be worthwhile for universal things that are already researched, where you don’t need to spend years and a kidney to figure them out. Like maybe how the “next”, “cancel” and “back” buttons sit next to each other. But I mean… just copy the most recent one you used.
You might have noticed at some point that door knobs are universally at the same height, and the same goes for light switches, in houses that don’t suck.
There’s a difference between trying to open a door from the hinged side vs. designing a door that has 14 different deadbolts and three latches on it.
One of those is user error, the other is designed complexity generally being a hindrance to the user.
“Wrong way” for whom?
In software development it ultimately boils down to “are you making software for the end users, or are you making it for yourself?”
Because in your example, that’s what ultimately defines whose “wrong” the developer is supposed to guide themselves by.
(So yeah, making software for fun or your own personal use is going to follow quite different requirement criteria than making software for use by other people.)
Maybe you need better signage. Maybe you need to reverse the direction of the door. Maybe you could automate the door. Or maybe the user is just fucking stupid. 😄
The philosophy is that the user’s intuition is never wrong because that’s what we’re trying to accommodate.
Also, if you have to post a sign, it’s probably broken by design. Users don’t read.
Yeah, who the fuck made this meme? A web programmer?
A) People still get paid to do dedicated QA?
B) If you really think that, you must be a noob.
A) Yes. Large companies have entire departments dedicated to QA, and it’s best not to leave QA to devs, if you can afford it. Dunno what you mean by “still,” since the job never went away.
B) Okay?
Yes, grasshopper.
I was a QA for over 15 years.
Then the “Agile” fad ripped through the industry and QA died.
Dunno what to tell you. I do QA for a living. I see postings all the time for QA positions in other companies, and my company has had QA for at least two decades, with the department expanding over the last three years.
I’m not claiming it’s ubiquitous, but maybe you’re just out of the loop.
In Agile, QA testing should be involved throughout the whole development process, with QA not just following the development, but supporting it. QA testing should be implemented early and continuously, with constant feedback to developers to ensure that any issues are fixed quickly.
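For what it’s worth, a minimal sketch of what that “early and continuous” feedback can look like in practice, assuming a pytest-style suite that runs on every commit (the function and the test cases are made up purely for illustration):

```python
# Illustrative only: a toy function plus the kind of checks QA would add
# early and keep running continuously (e.g. on every commit in CI).
import pytest


def apply_discount(total: float, percent: float) -> float:
    """Toy stand-in for real application code."""
    percent = max(0.0, min(percent, 100.0))  # clamp out-of-range input
    return total * (1.0 - percent / 100.0)


def test_happy_path():
    # The case the developer almost certainly thought of themselves.
    assert apply_discount(100.0, 10) == pytest.approx(90.0)


@pytest.mark.parametrize("percent", [-5, 0, 100, 150])
def test_weird_inputs(percent):
    # The "situations somebody else thought of": negative, zero,
    # full and over-full discounts must never produce a negative total.
    assert 0.0 <= apply_discount(100.0, percent) <= 100.0
```

The point being that the failures show up in the same pipeline the devs already watch, rather than in a report weeks later.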
Hmmm…
Which, when put into practice, means QAs become BAs, no comprehensive QA occurs, and when the code is shit because they have no actual QA support and the scope changes constantly with no firm documented requirements, the dev gets fired.
Great model for people who like to sit in meetings and complain.
Horrible model for the people who actually work.
When did you retire? Agile has been around for at least 20 years, more like 30 if you count Scrum being introduced before Agile was formally defined. No matter how critical I am of Agile, it is hardly a fad at this point.
Agile was definitely taken in with the same irrationality as fashion at some point.
It’s probably the best software development process philosophy for certain environments (for example: where there are fast-changing requirements and easy access to end users), whilst being pretty shit for others (good luck trying to fit it in at a process level when some of the software development is outsourced to independent teams, or using it for high-performance systems design). It eventually came out of that fad period mostly being used for the right things (even if, often, less than properly) and less for the wrong things.
That said, the Agile-as-fad phase was over a decade ago.
Still working.
I stopped being able to find QA work in the early 2010s or so. Converted to BI Developer. Have not encountered a dedicated QA at any of the small assortment of jobs I have had since.
Edit: And fair, despite it being a waste-of-time cult mentality engineered to make developers suffer and enshittify software quality, Agile got enough Kool-Aid drinkers to qualify it as more than a fad.
I work at a company whose entire business model is providing QA to other companies. I work directly with some very large, public companies, and some smaller ones. Almost all of them have some form of dedicated in-house QA, which we supplement.
Bruh I’m a dev who does Agile and we still have a QA department lol
You either never worked with anything that did actual Agile (to be fair, most don’t), or you haven’t done development in a long time if you think that.
A devout Kool-Aid drinker, I see.
Did you buy that Kool-Aid with your story points?
I hear they have a competitive exchange rate to Stanley nickels.
Don’t take this badly but it sounds like you’ve only seen a tiny slice of the software development done out there and had some really bad experiences with Agile in it.
It’s perfectly understandable: there are probably more bad uses of Agile out there than good ones and certain areas of software development tend to be dominated by environments which are big bloody “amateur hour every hour of the day, every day of the year” messes, Agile or no Agile.
That does, however, not mean that your experience stands for the entirety of what’s out there, trumping even the experience of other people who also work in QA in environments where Agile is used.
Agile made Management, who until then had actual Senior Designer-Developers and Technical Architects designing and adjusting actual development processes, think that they had this silver-bullet software development recipe that worked for everything, so they didn’t need those more senior (read: more expensive and unwilling to accept the same level of exploitation as the more junior types) people anymore.
It also drove the part of the Tech Industry that relies mainly on young and inexperienced techies and management (*cough* Startups *cough*) to think they didn’t need experienced techies.
As usual, it turned out that “there are no silver bullets” and things are more complex: Agile doesn’t work well for everything, and various individual practices of it only make sense in some cases (and in some are even required for the rest to work properly) whilst in others they’re a massive waste of time (and in some cases, the useful-wasteful balance depends on frequency and timing). Plus, in some situations (outsourced development) they’re extremely hard or even impossible to pull off at a project scope.
That said, I bet that what you think is “The Industry” is mainly Tech companies in the US, rather than where most software development actually occurs: large non-Tech companies with a high dependency on software for competitive advantage, such as banks, which hence have more than enough specific software requirements to hire vast software development departments to develop custom solutions in-house for their specific needs.
Big companies whose success depends on their core business-side employees doing their work properly care a lot more about software not breaking or delaying their business processes (and hence hire QA to catch those problems in new software before it even gets to the business users) than Tech companies providing software to non-paying retail users who aren’t even their customers (the customers are the advertisers to whom they sell access to those users’ eyeballs), and who will hence shovel just about anything out and hopefully sort out the bugs and lousy UX/UI design through A/B testing and user bug reports.
QA is also known as preventing shit from exploding and losing us millions of dollars in the process. Or, better yet, cybersec. Cybersec is just glorified QA.
I guess I’m just being a snob here.
I worked for an actual QA department that produced actual documentation and ran actual full scale QA cycles.
In the past 15 years, I have seen that practice all but fully disappear and be replaced by people who click at things until they find 1 thing, have a verbal meeting vaguely describing it, and repeat 2 to 3 times a day.
IMO, that isn’t QA. It’s being lazy, illiterate, and whiny while making the dev do ALL of the actual work.
A lot of QA has probably been automated. The entirety of SQL, for instance, is covered by an automated testing suite to ensure functionality.
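As a rough illustration of what that automation can look like, here’s a tiny self-contained check that exercises a SQL query against an in-memory SQLite database (the table and values are made up; a real suite would run many of these against the actual schema on every build):

```python
# Illustrative automated check of a SQL query, runnable with pytest.
import sqlite3


def test_order_total_matches_inserted_rows():
    # Fresh in-memory database per run, so the test is repeatable.
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, amount REAL)")
    db.executemany(
        "INSERT INTO orders (id, amount) VALUES (?, ?)",
        [(1, 10.0), (2, 15.5), (3, 0.0)],
    )

    (total,) = db.execute("SELECT SUM(amount) FROM orders").fetchone()

    # If someone breaks the schema or the query, this fails in CI
    # instead of in front of a user.
    assert total == 25.5
    db.close()
```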
That’s a fair point.
When I departed QA myself, it was at the onset of automation.
In turn, when the QA jobs disappeared, I learned basic scripting and started automating BI processes.
So, I would say:
1. I should hope modern QA departments (as I am told they exist) are automated and share both their tests and their results with devs in an efficient manner.
2. I don’t think QA departments really exist today in a substantive way, and if they do, it isn’t in as cooperative a fashion as described in 1.
Still, I have observed a world where QA went bye-bye. Planning? Drafting a Scope of Work? Doing a proper analysis of the solution you are seeking, fleshing it out, and setting a comprehensive list of firm requirements that define delivery of said solution? Offering the resources to test the deliverable against the well-documented and established requirements to give the all-clear before the solution is delivered?
Doesn’t exist anymore, and modern “QA” is being the lemming who sits in meetings and listens to management, then schedules meetings to sit and complain at the dev about how they aren’t “hitting the mark” (because it was about 4 feet directly in front of them when they published, and is now 5 erratically placed paces behind them).
I think it’s probably because we’ve shifted away from shipping software as a product and onto software as a service. I.e., in the ’90s, if Windows 95 irreversibly corrupted itself, that would be devastating to sales.
But today, with Windows 11? Just roll it out in one of the twenty-three testing branches you have and see what happens, and if shit does break, just work around it. It’ll be fine. Even if something does happen, you can, most of the time, fix it and roll out a new update.
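That “one of the twenty-three testing branches” approach is essentially ring-based (staged) rollout. A hand-wavy sketch of the idea, purely illustrative and not any vendor’s actual mechanism:

```python
# Toy sketch of staged rollout: an update reaches small inner "rings"
# first, and only hits everyone once it has survived those rings.
import hashlib

ROLLOUT_RINGS = {
    "canary": 0.01,  # ~1% of devices get the update first
    "beta": 0.10,    # then ~10%
    "stable": 1.00,  # finally everyone
}


def update_enabled(device_id: str, ring: str) -> bool:
    """Deterministically bucket a device so it always gets the same answer."""
    digest = hashlib.sha256(device_id.encode()).hexdigest()
    bucket = int(digest[:8], 16) / 0xFFFFFFFF  # value in [0.0, 1.0]
    return bucket < ROLLOUT_RINGS[ring]


if __name__ == "__main__":
    # Widen the ring only if telemetry from the previous one looks OK.
    for ring in ("canary", "beta", "stable"):
        hits = sum(update_enabled(f"device-{i}", ring) for i in range(10_000))
        print(f"{ring}: update enabled on {hits} of 10000 simulated devices")
```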
And I also think it’s moved to be more team-centric rather than department-centric. A lot of the theory is probably more of a senior-team-lead type of responsibility, while everyone writing the code can chip in and add some as well. Developers knowing how to write secure code helps, so they should theoretically also be capable of QA themselves to a degree.
Also there’s a lot more money in shipping shit out the door, than there is in shipping a functional product, unfortunately.
Thank you for your TED talk defining enshittification.
Middle management bloat.
Edit: Bonus points for
“Developers knowing how to write secure code helps, so they should theoretically also be capable of QA themselves to a degree.”
Which is straight up just saying “why don’t the devs just do it themselves? I’m busy with meetings to whine back and forth with other middle management.”
pretty much yeah lol