Is there a term for “the user can't use anything wrong” design?











Score: 27 (7 favorites)












I'm of the opinion that the user is always using software or hardware correctly, and that implying otherwise is rude, condescending, and philosophically wrong. For example, I and everyone I know pull USB drives out of a computer without bothering to click eject. OS developers should recognize this and build their software to accommodate it instead of bothering users with "you did that wrong" messages.



Is this a widely-held view among UX designers/developers? Is there an official term for this philosophy?



Edit: It seems I need to clarify what I mean by "the user can't use anything wrong". I'm not saying that the user should be prevented from using something wrong, but that there aren't any "wrong" ways to use something. If a large percentage of users use a microphone as a hammer (as the Shure SM57 famously is), designers should embrace this and improve its hammering capabilities in the next iteration.










  • 50




    What you write about USB drives is, unfortunately, physically impossible. The OS needs to clean up the filesystem before the drive is disconnected, and it cannot know your intentions if you don't warn it. So: what do you do when making something impossible to do wrongly is itself impossible?
    – Jan Dorniak
    2 days ago








  • 34




    This isn't true. A file system can pre-emptively do all of this. And almost all modern operating systems, even Android, do exactly this. The warning messages are there out of habit and in the vain hope it will discourage users from pulling out a memory stick whilst files are being transferred.
    – Confused
    2 days ago






  • 44




    @Confused That is simply not true. By default on Windows write caching is ON and yanking out the drive even if you think you've finished writing to it can and will cause your data to become corrupted. I've seen it. It's not "out of habit" or "in the vain hope" - it is the consequence of an actual feature. You can disable write caching though (it's probably called something like "enable fast removal" in your OS).
    – Lightness Races in Orbit
    yesterday






  • 18




    I think the mistake here is using USB as an example. USB is hardware, and hardware will always have some physical limitations. You might be able to write pure software this way, but not hardware.
    – Dave Cousineau
    yesterday






  • 12




    Another example where the user clearly is using it wrong: storing important items in the trash/recycle bin/deleted items/etc. This is actually disturbingly common...
    – Gordon Davisson
    yesterday



















Tags: user-behavior, user-centered-design






asked 2 days ago by PascLeRasc, edited 3 hours ago




8 Answers























Score: 61













Accommodation for every possible user interaction is impossible.



Let's use your example, but scale the USB drive up to a whole computer. A user can pull the power cord and expect the computer to turn off safely, with all data magically saved to the drive, just like with a USB stick. How should a UX designer prepare for this?





  1. Lock the cord in place so that the user can't yank it out. This is hard to maintain and replace, and costs more money for a feature hardly anyone would want when they can just press the power button. It is also a lot slower if you need to move multiple computers at once, say, when your company changes its location.


  2. Remove computer caches. Data is never delayed, and you don't even have to press save when updating a component. But computer speed now slows to a crawl, and a myriad of security concerns have to be accommodated as well.


  3. Use a mandatory emergency power source. The user is now forced to buy the manufacturer's UPS/battery and has to pay to replace it even if they already have a spare at home.


All of the solutions above are worse than a simple manual that warns users about the danger of unplugging a running computer.



If you don't expect an electric saw to magically stop running right when it touches your finger, then don't expect computers to do all the work for you. That's why designers and programmers have the acronym RTFM.






answered by formicini (new contributor)














  • 55




    "If you don't expect an electric saw to magically stop running right when it touches your finger" is no longer a valid analogy - see sawstop.com/why-sawstop/the-technology
    – manassehkatz
    2 days ago






  • 20




    I should not have underestimated technology. Still, it falls under solution 1 of my example (SawStop is expensive, requires a new table setup, is hard to maintain, and can't cut wet logs), so the analogy is okay. And besides, maybe someday a computer will do all the work for us; you never know.
    – formicini
    2 days ago






  • 7




    @joojaa Nagging is necessary because otherwise users may not even know that what they are doing is wrong. They can't RTFM if they don't know there is a manual in the first place. When was the last time any of us read a smartphone's manual, for example?
    – formicini
    2 days ago






  • 5




    While SawStop is expensive, it's less expensive than the alternative: getting a finger reattached. So it isn't really comparable to locking power cords. Additionally, people don't go around accidentally unplugging computers (sitcoms notwithstanding), whereas they DO accidentally stick their fingers in table saw blades.
    – Draco18s
    yesterday






  • 4




    Voted down as this isn't really an answer to the question.
    – Cyberspark
    yesterday


















Score: 38













Yes, there is a term for this ("the user can't do anything wrong"):



foolproof



But as other answers point out, making something completely foolproof isn't feasible. On Wikipedia I found a quote from Douglas Adams' Mostly Harmless:




a common mistake that people make when trying to design something completely foolproof is to underestimate the ingenuity of complete fools




There is also a term for minimizing what a user can do wrong:



Defensive Design



In Defensive Design you try to design in such a way that users can do least harm, while not expecting to make it completely foolproof. Some techniques include:




  • Automatic random testing: Letting a script give random inputs to your application, hoping to make it crash

  • Monkey testing: User testing, but instructing the users either to try to break the system, or to act as oblivious to the system's workings as possible.
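The "automatic random testing" technique above can be sketched in a few lines. Everything here is hypothetical: `parse_quantity` stands in for whatever function you want to harden, and `ValueError` for its one documented failure mode; anything else that escapes counts as a defect.

```python
import random
import string

def parse_quantity(text):
    """Toy function under test: parses a string like '12 kg'."""
    value, unit = text.split()
    return float(value), unit

def fuzz(runs=1000):
    """Feed random junk to the function. Defensively designed code may
    reject input, but only with its documented exception type; anything
    else is recorded as a crash."""
    crashes = []
    for _ in range(runs):
        junk = "".join(random.choices(string.printable, k=random.randint(0, 20)))
        try:
            parse_quantity(junk)
        except ValueError:
            pass  # documented, acceptable failure mode
        except Exception as exc:
            crashes.append((junk, exc))
    return crashes

print(len(fuzz()), "unexpected crashes")
```

The point is not the toy parser but the harness: random inputs either succeed or fail in the one documented way, and any surprise is logged for fixing.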






answered by ONOZ (new contributor)


















  • There's a rather good "see also" on the wiki page for Defensive Design on the subject of Defensive programming. It describes three rules of thumb for it, the third of which feels most relevant. "Making the software behave in a predictable manner despite unexpected inputs or user actions." The goal of good UX is to present the user with just the thing(s) they want to do, and to make it clear what will happen when they do it.
    – Ruadhan2300
    20 hours ago






  • 2




    "Defensive Design" - good one, that seems to be what the OP is asking in this confusing question.
    – Fattie
    15 hours ago










  • I always say "Fool-proof and idiot-resistant". You can make things that even a fool can't screw up, but no matter how you try to make things idiot-proof, the universe can always make a better idiot.
    – Monty Harder
    9 hours ago










  • Also great answers.
    – Fattie
    3 hours ago


















Score: 33













User-Centered Design



What you’re describing is a consequence of User-Centered Design (coined by Don Norman himself). I’ve heard this principle expressed as “the user is always right” and “it’s not the user’s fault”.



As has been pointed out, this type of thinking is not common enough, even among UX professionals. The issue is that we’re trying to “fix” user behavior, rather than matching the user’s mental model.



In your example, the user’s mental model is that the flash drive is ready and can be removed if no files are being copied to or from it. Therefore, we should design our software and hardware to match this and to prevent any errors that might occur as a result. Here are a few suggestions to accomplish this:




  1. Never keep an external drive in a dirty state longer than necessary. When writing to the drive is complete, get the filesystem into a state where it can be unplugged safely.

  2. Always show an indication or notification when a drive is in use, such as when a file is being saved (which should also be done automatically!). The system should inform users as to exactly what is happening, so that they know the drive should not be unplugged yet.

  3. Ideally, USB ports should be redesigned so that it’s possible for the computer to physically hold the device in place; the operating system would then release the drive when it’s safe to be unplugged. This would make these problems impossible. (This is how CD/DVD-RW drives work when a disc is being burned.) I don’t know if this is feasible from an engineering standpoint, but I think it should have been considered during the design process for USB-C.


  4. Undo. In case a drive has been unplugged while in use, make it possible to fix the issue by plugging it back in so that the system can resume exactly where it left off.
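As a rough illustration of suggestion 1, a program can flush its own writes eagerly so the drive spends as little time as possible in a dirty state. This is only a per-file sketch using Python's standard library; a real OS would do the equivalent at the filesystem layer, and the file name here is made up.

```python
import os
import tempfile

def safe_write(path, data):
    """Write and immediately flush to the device, so the data is on disk
    by the time the call returns and the drive can be yanked safely."""
    with open(path, "wb") as f:
        f.write(data)
        f.flush()              # drain Python's userspace buffer
        os.fsync(f.fileno())   # ask the kernel to push it to the device

path = os.path.join(tempfile.gettempdir(), "demo.bin")
safe_write(path, b"important bytes")
```

The trade-off the comments discuss is visible here: `fsync` on every write is exactly the "disable write caching" setting, which is safer but slower than letting the OS batch writes.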

























  • 24




    (1) Longer than necessary for what, exactly? If the USB disk is on rotational media, it's entirely possible for tens of seconds of writes to be queued up nearly instantaneously. (3) This is a classic example of fixation on a single goal in disregard of cost, other failure modes, user convenience/frustration, and even safety (see MagSafe), unfortunately far too common in UX design.
    – chrylis
    2 days ago






  • 3




    @chrylis And if the software doesn't show some indicator that the data was only enqueued and not yet written it's rubbish. And if there is a point during the file transfer so that the file system breaks when you interrupt the transfer at that point, then the file system is rubbish. I agree on (3) because for USB drives it makes sense to interrupt a transfer by pulling it out.
    – Nobody
    yesterday








  • 1




    @Nobody FAT is a pretty lousy filesystem by modern standards. You won't find much disagreement about that. However, it's a fact of life and a design constraint.
    – chrylis
    yesterday






  • 1




    Yes, this is the correct answer
    – Fattie
    3 hours ago


















Score: 30













I wonder if the concept you are looking for is Poka-yoke (https://en.wikipedia.org/wiki/Poka-yoke). This is often more associated with mechanical design (e.g. zoo cage double doors which can't both be open at the same time) but you can make an analogy with UX design (e.g. don't offer a delete button when there is nothing available to delete).
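The zoo-cage double doors translate naturally into code: design the interface so the invalid state (both doors open at once) simply cannot be reached. A minimal sketch, with a hypothetical `CageDoors` class:

```python
class CageDoors:
    """Poka-yoke sketch of zoo-cage double doors: the interlock makes it
    impossible for both doors to be open at the same time."""

    def __init__(self):
        self._open = None  # None, "inner", or "outer"

    def open_door(self, which):
        if self._open not in (None, which):
            raise RuntimeError(f"cannot open {which}: {self._open} door is open")
        self._open = which

    def close_door(self, which):
        if self._open == which:
            self._open = None

doors = CageDoors()
doors.open_door("outer")   # keeper enters the airlock
doors.close_door("outer")
doors.open_door("inner")   # only possible once the outer door is shut
```

The UX analogy is the same: don't render the "delete" button when there is nothing to delete, so the mistaken action is unrepresentable rather than merely warned against.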






answered by Kit (new contributor)














  • 2




    I like this, thanks. That's a great example about the zoo double doors - it illustrates perfectly how the user shouldn't be able to be at fault.
    – PascLeRasc
    yesterday










  • @PascLeRasc or is it pandering to the lack of common sense...
    – Solar Mike
    yesterday






  • 7




    @SolarMike It's pandering to the bottom line. Lack of common sense is a fact of nature. You can either let people make mistakes, at peril of profits (or safety!) when an error is eventually made, or you can engineer the job so that they cannot mess it up.
    – J...
    yesterday








  • 6




    @SolarMike it's as if you've never heard of Murphy's Law. Or NASA.
    – Confused
    yesterday


















Score: 11













This is a common UX design principle. The best error message is the one you never have to show: avoid the error condition in the first place. There are many collections of design principles out there, but no standard set.



Jakob Nielsen used the term “Error Prevention” in his 10 usability heuristics:
https://www.nngroup.com/articles/ten-usability-heuristics/



"Even better than good error messages is a careful design which prevents a problem from occurring in the first place. Either eliminate error-prone conditions or check for them and present users with a confirmation option before they commit to the action."



Apple refers to it as “User Control" in their IOS guidelines:
https://developer.apple.com/design/human-interface-guidelines/ios/overview/themes/



"The best apps find the correct balance between enabling users and avoiding unwanted outcomes."



























  • Joel Spolsky (praise be) wrote a pretty good article in his blog about this
    – Ruadhan2300
    20 hours ago










  • Or to improve on that, only report error messages which direct the user to how to solve the problem. "File streaming error" isn't a good error message if the actual problem is "Lost internet connection whilst downloading file", just for an example.
    – Graham
    8 hours ago


















Score: 7













No. It is not a widely held view among UX designers. Unfortunately.



Even less so amongst those using SO and considering themselves to be UX Designers.



I suspect this is mainly because UX design is not a rigorous field, nor do its proponents practice patience and understanding of their potential users. Perhaps even worse, they're seemingly of the belief that an ideal UX 'design' exists and can be discerned from data, without realising that this discernment happens through the subjectivity of themselves and their peers. The problem compounds because they're often the least qualified to set criteria for analysis, lacking both insight and intuition, and often not valuing these things at all.



UX Design is one of the few fields suffering from more issues pertaining to self-selection bias than programming. Quite an achievement.


































Score: 4













    Just approaching this question from an analytical perspective, you'll see this mentality in some UX environments and not in others. If users are heavily limited with regard to what they can do, you'll see more preference for UX that follow the principles you describe. The more freedom users are permitted, the less popular these principles are.



    I wouldn't say it's a real name for this effect, but I'd call it "with great power comes great responsibility."



    This is the issue with the USB example which has shown up several times in this thread. A user who can physically modify hardware has a remarkable amount of freedom. They have great power over the system, and thus they have more responsibility for what happens. Sure, I can make a USB device which locks in place until files are done copying. That will work as long as you limit their power to gentle tugs on the hardware along the axis of the USB device. A user with a Sawzall can most definitely do something wrong to my USB device if they aren't responsible enough and aren't aware of what cutting a USB device in half while it is connected can do.



    Let's not even talk about implementing PSU to meet this Sawzall requirement...



    Any system with a compiler has to face this reality. I can and will do something wrong with my compiler. I will break something. I can delete files I wasn't supposed to delete. Heck, I have deleted such files! I even deleted them in parallel with a glorious multithreaded harbinger of doom! It was bad news, and was most definitely "my mistake."



    Contrast that with designing an iPhone app. iOS severely limits what users can do and how they can interact with applications by design. It's the purpose of a good mobile OS. Likewise, app developers often permit very few operations. That keeps your UX simple. In these situations, it's very easy to capture the small range of operations a user can do and prove that the user indeed cannot do anything wrong. In such settings, it makes a lot of sense from a user experience perspective to support this mentality.



    In particular, business apps are designed with this in mind. You really don't want to let a low-paid entry level worker make a catastrophic mistake with your app. Point-of-sale devices are designed to make sure you don't accidentally email a credit card number to some malicious agent in a foreign nation. You just can't do it!



    So we can see both extremes. In some situations you want to make sure the user really can't do anything wrong. In other situations you can't. I think it's pretty reasonable to say there's no dividing line between the mentalities. It's a smooth spectrum from "the user can't do wrong" to "oh my god, the monkey has a knife!"
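The point about a small range of operations can be made concrete: when the set of user actions is tiny, you can exhaustively verify that no sequence of actions ever reaches an invalid state. A sketch with a made-up point-of-sale flow (all state and action names are hypothetical):

```python
from functools import reduce
from itertools import product

VALID_STATES = {"idle", "item_scanned", "paid"}
TRANSITIONS = {
    ("idle", "scan"): "item_scanned",
    ("item_scanned", "scan"): "item_scanned",
    ("item_scanned", "pay"): "paid",
    ("paid", "reset"): "idle",
}

def step(state, action):
    # Actions not offered in a state leave it unchanged: the UI simply
    # doesn't present that button, so the user "can't do anything wrong".
    return TRANSITIONS.get((state, action), state)

# Exhaustively fold every 5-action sequence through the state machine.
all_safe = all(
    reduce(step, seq, "idle") in VALID_STATES
    for seq in product(["scan", "pay", "reset"], repeat=5)
)
print("every action sequence stays valid:", all_safe)
```

With a compiler or a Sawzall the action space is unbounded and this kind of exhaustive proof is hopeless, which is exactly the spectrum this answer describes.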























    • 3




      iOS severely limits what users can do and how they can interact with the applications by design. It's the purpose of a good mobile OS. - how is that good? That's exactly the reason why I dislike iPhones. I don't want the phone to decide which option should be available for me.
      – Džuris
      21 hours ago






    • 1




      @Džuris My dad once characterised the difference between iOS, Windows and Linux as a progression of how much people wanted to be involved in what their computer was doing. iOS users just want to use the applications and Do Things without dealing with a computer, Windows users like a bit more control but ultimately prefer not to think about most of the technical side, and Linux users fear the robot revolution and want to do everything themselves. He was mostly tongue in cheek about it but I think there's a grain of truth there :P
      – Ruadhan2300
      20 hours ago






    • 1




      @Ruadhan2300 your dad was getting close, but not quite right. The objective of iOS users is to be seen as the owner of an (expensive and fashionable) high-tech device. The objective of Windows users is to use the computer apps to get some "real-world" work done. The objective of Linux users is to get Linux itself to work; actually using it once it does work isn't very interesting ;)
      – alephzero
      18 hours ago










    • @Džuris Its the tradeoff of power and responsibility. Most iOS users do not want to be told "the Candy Crush knockoff game you just downloaded had a bug which erased all of your pictures and took them off the cloud before draining your battery when you needed to receive an important phone call." They would like an OS that helps avoid such situations. Arguably all OSs work like that. One of the primary aspects of any reasonable multitasking OS is that it limits application access to hardware so that one app can't take the whole system down. Apps can't insert IRQs.
      – Cort Ammon
      16 hours ago










    • @alephzero Can you please stop posting unsubstantive comments?
      – PascLeRasc
      14 hours ago


















    up vote
    0
    down vote














    OS [and all software] developers should see this and build their software to accommodate this instead of bothering users with "you did that wrong" messages.




    Yes, you're totally, completely, absolutely correct.



    Engineers and companies that do what you say make huge amounts of money.



    Some of the biggest key products of our entire era are totally based on what you describe.




    Is this a widely-held view among UX designers/developers?




    Yes, it's one of the central ideas.



    It is constantly and widely discussed as one of, or the, central issues in UX.



    The BMW 7-series cockpit was a nightmare: you had to fight and search for every function among literally hundreds of choices. The masterpiece Renault Espace cockpit, by contrast, was user-driven and the epitome of this philosophy.




    Is there an official term for this philosophy?




    Sure, it is



    User-driven design



    Not 10 minutes ago I was yelling at some people "make it user-driven". They had some switches etc. that "had to be" set by a customer before use, which is a crap idea. Instead I screamed at everyone to make it "Pascal-style". I literally said "Make this user driven, get rid of the fucking switches."



    Yesterday I spent literally the entire workday dealing with precisely the "Pascal issue" in relation to a product, and with no other issue.



    Two years ago I spent four months personally inventing/engineering/whatever a new sort of algorithm for an unusual graphical interface where the entire end result was eliminating two bad "anti-Pascal-style" actions. (The result made zillions.)



    Note that to some extent, the everyday phrase



    K.I.S.S.



    amounts to, basically, a similar approach.





    Note - since the "Pascal-issue" is indeed so pervasive, there are



    many, many specific terms for subsets of the concept:



    For example, in the literal example you gave, that is known as



    plug-and-play



    or



    hot swappable



    Note that a company we have all heard of, Apple, arguably made some ten billion dollars by being first to market with printers and other peripherals that were more plug-and-play than those of its competitors at the time.



    So, "plug and play" or "hot swappable" is indeed one particular specific subset of the overall user-driven design, KISS-UX, "Pascal-issue".






    share|improve this answer
























      8 Answers


      up vote
      61
      down vote













      Accommodation for every possible user interaction is impossible.



      Let's use your example, but scale the USB drive up to a whole computer. A user could pull the power cord and expect the computer to turn off safely, with all data magically saved to the drive - just like yanking out a USB stick. How should a UX designer prepare for this?





      1. Lock the cord in place so that the user can't yank it out. This makes the machine harder to maintain and replace, and costs more money, for a feature hardly anyone would want when they can just press the power button. It also makes moving multiple computers at once much slower - say, when your company changes its location.


      2. Remove computer caches. Data is never delayed, and you don't even have to press save. But computer speed now slows to a crawl, and a myriad of new security concerns will have to be accommodated as well.


      3. Use a mandatory emergency power source. The user is now forced to buy the manufacturer's UPS/battery and have to pay to get it changed even if they already have a spare at home.


      All of the solutions above are worse than a simple warning in the manual about the danger of unplugging a running computer.



      If you don't expect an electric saw to magically stop running right when it touches your finger, then don't expect computers to do all the work for you. That's why designers and programmers have the acronym RTFM.






      share|improve this answer










      New contributor




      formicini is a new contributor to this site. Take care in asking for clarification, commenting, and answering.
      Check out our Code of Conduct.














      • 55




        "If you don't expect an electric saw to magically stop running right when it touches your finger" is no longer a valid analogy - see sawstop.com/why-sawstop/the-technology
        – manassehkatz
        2 days ago






      • 20




        I should not have underestimated technology. Still, it falls into solution 1 of my example (SawStop is expensive, required a new table setup, hard to maintain and can't chop wet log) so the analogy is okay. And beside, maybe someday a computer will do all the work for us, you never know.
        – formicini
        2 days ago






      • 7




        @joojaa nagging is necessary because else users may not even know what they are doing is wrong. Can't RTFM if they don't know there is a manual in the first place. When was the last time we read a smartphone's manual for example?
        – formicini
        2 days ago






      • 5




        While SawStop is expensive, it's less expensive than the alternative: getting a finger reattached. So it isn't really comparable to locking power cords. Additionally, people don't go around accidentally unplugging computers (sitcoms notwithstanding), whereas they DO accidentally stick their fingers in table saw blades.
        – Draco18s
        yesterday






      • 4




        Voted down as this isn't really an answer to the question.
        – Cyberspark
        yesterday















      edited yesterday

      answered 2 days ago

      formicini








      up vote
      38
      down vote













      Yes, there is a term for this ("the user can't do anything wrong"):



      foolproof



      But as other answers point out, making something completely foolproof isn't feasible. On Wikipedia I found a quote from Douglas Adams' Mostly Harmless:




      A common mistake that people make when trying to design something completely foolproof is to underestimate the ingenuity of complete fools.




      There is also a term for minimizing what a user can do wrong:



      Defensive Design



      In Defensive Design you try to design in such a way that users can do the least harm, while not expecting to make the system completely foolproof. Some techniques include:




      • Automatic random testing: letting a script feed random inputs to your application, hoping to make it crash.

      • Monkey testing: user testing, but instructing the testers either to try to break the system, or to act as oblivious to the system's workings as possible.
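
      The first technique can be sketched in a few lines of Python: hammer a function with random strings and assert only that it never raises. The `parse_age` function here is a hypothetical unit under test, written defensively, not something from a real library:

      ```python
      import random
      import string

      def parse_age(text):
          """Hypothetical unit under test: parse a user-supplied age string defensively."""
          try:
              value = int(text.strip())
          except ValueError:
              return None      # reject garbage instead of crashing
          if 0 <= value <= 150:
              return value
          return None          # reject out-of-range values

      # Automatic random testing: feed arbitrary printable garbage to the
      # function and demand only that it never raises an exception.
      for _ in range(10_000):
          garbage = "".join(random.choices(string.printable, k=random.randint(0, 20)))
          result = parse_age(garbage)  # must not raise
          assert result is None or 0 <= result <= 150
      ```

      Real fuzzers and property-based testing libraries refine this idea with input minimization and coverage guidance, but the contract is the same: no input, however malformed, may crash the system.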






      share|improve this answer








      New contributor




      ONOZ is a new contributor to this site. Take care in asking for clarification, commenting, and answering.
      Check out our Code of Conduct.


















      • There's a rather good "see also" on the wiki page for Defensive Design on the subject of Defensive programming. It describes three rules of thumb for it, the third of which feels most relevant. "Making the software behave in a predictable manner despite unexpected inputs or user actions." The goal of good UX is to present the user with just the thing(s) they want to do, and to make it clear what will happen when they do it.
        – Ruadhan2300
        20 hours ago






      • 2




        "Defensive Design" - good one, that seems to be what the OP is asking in this confusing question.
        – Fattie
        15 hours ago










      • I always say "Fool-proof and idiot-resistant". You can make things that even a fool can't screw up, but no matter how you try to make things idiot-proof, the universe can always make a better idiot.
        – Monty Harder
        9 hours ago










      • Also great answers.
        – Fattie
        3 hours ago















      answered yesterday

      ONOZ




      up vote
      33
      down vote













      User-Centered Design



      What you’re describing is a consequence of User-Centered Design (coined by Don Norman himself). I’ve heard this principle expressed as “the user is always right” and “it’s not the user’s fault”.



      As has been pointed out, this type of thinking is not common enough, even among UX professionals. The issue is that we’re trying to “fix” user behavior, rather than matching the user’s mental model.



      In your example, the user’s mental model is that the flash drive is ready and can be removed if no files are being copied to or from it. Therefore, we should design our software and hardware to match this and to prevent any errors that might occur as a result. Here are a few suggestions to accomplish this:




      1. Never keep an external drive in a dirty state longer than necessary. When writing to the drive is complete, get the filesystem into a state where it can be unplugged safely.

2. Always show an indication or notification when a drive is in use, such as when a file is being saved (which should also happen automatically!). The system should tell users exactly what is happening, so they know the drive should not be unplugged yet.

      3. Ideally, USB ports should be redesigned so that it’s possible for the computer to physically hold the device in place; the operating system would then release the drive when it’s safe to be unplugged. This would make these problems impossible. (This is how CD/DVD-RW drives work when a disc is being burned.) I don’t know if this is feasible from an engineering standpoint, but I think it should have been considered during the design process for USB-C.


      4. Undo. In case a drive has been unplugged while in use, make it possible to fix the issue by plugging it back in so that the system can resume exactly where it left off.
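Point 1 above can be sketched in code. A minimal illustration (Python, not from the answer itself) of flushing writes to the device immediately, so the drive spends as little time as possible in a dirty state:

```python
import os
import tempfile

def write_durably(path: str, data: bytes) -> None:
    """Write data and push it to the device right away, so the
    filesystem is not left in a dirty state longer than necessary."""
    with open(path, "wb") as f:
        f.write(data)
        f.flush()             # drain Python's userspace buffer
        os.fsync(f.fileno())  # ask the OS to commit the blocks to the device

# Usage: once write_durably() returns, this write has reached the disk,
# so an immediate unplug would not lose it.
path = os.path.join(tempfile.mkdtemp(), "note.txt")
write_durably(path, b"hello")
with open(path, "rb") as f:
    print(f.read())  # b'hello'
```

This only protects the data written through it, of course; metadata and other pending writes are the operating system's job, which is exactly the answer's point.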

























      • 24




        (1) Longer than necessary for what, exactly? If the USB disk is on rotational media, it's entirely possible for tens of seconds of writes to be queued up nearly instantaneously. (3) This is a classic example of fixation on a single goal in disregard of cost, other failure modes, user convenience/frustration, and even safety (see MagSafe), unfortunately far too common in UX design.
        – chrylis
        2 days ago






      • 3




        @chrylis And if the software doesn't show some indicator that the data was only enqueued and not yet written it's rubbish. And if there is a point during the file transfer so that the file system breaks when you interrupt the transfer at that point, then the file system is rubbish. I agree on (3) because for USB drives it makes sense to interrupt a transfer by pulling it out.
        – Nobody
        yesterday








      • 1




        @Nobody FAT is a pretty lousy filesystem by modern standards. You won't find much disagreement about that. However, it's a fact of life and a design constraint.
        – chrylis
        yesterday






      • 1




        Yes, this is the correct answer
        – Fattie
        3 hours ago















      edited 17 hours ago

























      answered 2 days ago









      David Regev

      835513





















      up vote
      30
      down vote













      I wonder if the concept you are looking for is Poka-yoke (https://en.wikipedia.org/wiki/Poka-yoke). This is often more associated with mechanical design (e.g. zoo cage double doors which can't both be open at the same time) but you can make an analogy with UX design (e.g. don't offer a delete button when there is nothing available to delete).
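In UI code the same idea often reduces to deriving an action's availability from the current state, so the "wrong" click simply cannot happen. A minimal sketch (Python; the function name is hypothetical, not from any framework):

```python
def delete_enabled(selected_items: list) -> bool:
    """Poka-yoke for a UI: the delete action is only offered when
    there is actually something to delete, so the error state
    ('delete pressed with nothing selected') cannot be reached."""
    return len(selected_items) > 0

print(delete_enabled(["report.pdf"]))  # True  -> show/enable the button
print(delete_enabled([]))              # False -> hide/disable the button
```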














      New contributor




      Kit is a new contributor to this site. Take care in asking for clarification, commenting, and answering.
      Check out our Code of Conduct.














      • 2




        I like this, thanks. That's a great example about the zoo double doors - it illustrates perfectly how the user shouldn't be able to be at fault.
        – PascLeRasc
        yesterday










      • @PascLeRasc or is it pandering to the lack of common sense...
        – Solar Mike
        yesterday






      • 7




        @SolarMike It's pandering to the bottom line. Lack of common sense is a fact of nature. You can either let people make mistakes, at peril of profits (or safety!) when an error is eventually made, or you can engineer the job so that they cannot mess it up.
        – J...
        yesterday








      • 6




        @SolarMike it's as if you've never heard of Murphy's Law. Or NASA.
        – Confused
        yesterday















      answered yesterday









      Kit

      30112







      up vote
      11
      down vote













This is a common UX design principle: the best error message is the one you never have to show. There are many published sets of design principles, but no single standard.



Jakob Nielsen used the term “Error Prevention” in his 10 usability heuristics.
      https://www.nngroup.com/articles/ten-usability-heuristics/



      "Even better than good error messages is a careful design which prevents a problem from occurring in the first place. Either eliminate error-prone conditions or check for them and present users with a confirmation option before they commit to the action."



Apple refers to it as “User Control” in their iOS guidelines:
      https://developer.apple.com/design/human-interface-guidelines/ios/overview/themes/



      "The best apps find the correct balance between enabling users and avoiding unwanted outcomes."
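Nielsen's heuristic names two tactics: eliminate the error-prone condition, or check for it and ask for confirmation before committing. Both might look like this in code (a hedged Python sketch; `confirm` is a hypothetical callback standing in for a real dialog):

```python
import os

def delete_file(path: str, confirm) -> bool:
    """Error prevention per Nielsen's heuristic: eliminate one
    error-prone condition outright, and gate the irreversible
    action behind an explicit confirmation."""
    if not os.path.exists(path):
        return False  # eliminated: 'delete a missing file' can't error
    if not confirm(f"Really delete {path}?"):
        return False  # user backed out before committing
    os.remove(path)
    return True

# Usage with a callback that always declines:
print(delete_file("/tmp/does-not-exist", confirm=lambda msg: False))  # False
```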



























      • Joel Spolsky (praise be) wrote a pretty good article in his blog about this
        – Ruadhan2300
        20 hours ago










      • Or to improve on that, only report error messages which direct the user to how to solve the problem. "File streaming error" isn't a good error message if the actual problem is "Lost internet connection whilst downloading file", just for an example.
        – Graham
        8 hours ago















      answered 2 days ago









      Jeremy Franck

      1963














      up vote
      7
      down vote













      No. It is not a widely held view among UX designers. Unfortunately.



      Even less so amongst those using SO and considering themselves to be UX Designers.



I suspect this is mainly because UX design is not a rigorous field, and many of its proponents don't practice patience and understanding towards their potential users. Perhaps worse, they seem to believe an ideal UX 'design' exists and can be discerned from data alone, without realising that the analysis is filtered through their own subjectivity and that of their peers. This compounds because they're often the least qualified to set the criteria for that analysis, lacking both insight and intuition, and often not valuing those things at all.



      UX Design is one of the few fields suffering from more issues pertaining to self-selection bias than programming. Quite an achievement.






































          share|improve this answer












          share|improve this answer



          share|improve this answer










          answered 2 days ago









          Confused

          1,823516


























              up vote
              4
              down vote













              Approaching this question from an analytical perspective, you'll see this mentality in some UX environments and not in others. If users are heavily limited in what they can do, you'll see more preference for UX designs that follow the principles you describe. The more freedom users are permitted, the less popular these principles are.



              I wouldn't say there's a real name for this effect, but I'd call it "with great power comes great responsibility."



              This is the issue with the USB example which has shown up several times in this thread. A user who can physically modify hardware has a remarkable amount of freedom. They have great power over the system, and thus they have more responsibility for what happens. Sure, I can make a USB device which locks in place until files are done copying. That will work as long as you limit their power to gentle tugs on the hardware along the axis of the USB device. A user with a Sawzall can most definitely do something wrong to my USB device if they aren't responsible enough and aren't aware of what cutting a USB device in half while it is connected can do.



              Let's not even talk about implementing a PSU to meet this Sawzall requirement...



              Any system with a compiler has to face this reality. I can and will do something wrong with my compiler. I will break something. I can delete files I wasn't supposed to delete. Heck, I have deleted such files! I even deleted them in parallel with a glorious multithreaded harbinger of doom! It was bad news, and was most definitely "my mistake."



              Contrast that with designing an iPhone app. iOS severely limits what users can do and how they can interact with applications by design. That's the purpose of a good mobile OS. Likewise, app developers often permit very few operations. That keeps your UX simple. In these situations, it's very easy to capture the small range of operations a user can do and prove that the user indeed cannot do anything wrong. In such settings, it makes a lot of sense from a user experience perspective to support this mentality.



              In particular, business apps are designed with this in mind. You really don't want to let a low-paid entry level worker make a catastrophic mistake with your app. Point-of-sale devices are designed to make sure you don't accidentally email a credit card number to some malicious agent in a foreign nation. You just can't do it!



              So we can see both extremes. In some situations you want to make sure the user really can't do anything wrong. In other situations you can't. I think it's pretty reasonable to say there's no dividing line between the mentalities. It's a smooth spectrum from "the user can't do wrong" to "oh my god, the monkey has a knife!"
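              The "capture the small range of operations and prove the user cannot do anything wrong" idea can be sketched in code. This is a hypothetical illustration (the `Action` and `Register` names are invented, not any real point-of-sale API): by exposing only a closed set of operations, an invalid request becomes unrepresentable rather than merely forbidden.

```python
from enum import Enum

class Action(Enum):
    """The only operations the UI exposes -- nothing else is representable."""
    SCAN_ITEM = "scan"
    VOID_LAST = "void"
    TOTAL = "total"

class Register:
    def __init__(self) -> None:
        self._items: list[float] = []

    def perform(self, action: Action, price: float = 0.0) -> float:
        # Because `action` must be a member of Action, the user literally
        # cannot request anything outside this closed set.
        if action is Action.SCAN_ITEM:
            self._items.append(price)
        elif action is Action.VOID_LAST and self._items:
            self._items.pop()
        # Every operation, including TOTAL, reports the running total.
        return sum(self._items)
```

              With three operations it is feasible to enumerate every reachable state and check it; with a compiler or a Sawzall, it is not.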























              • 3




                iOS severely limits what users can do and how they can interact with the applications by design. It's the purpose of a good mobile OS. - how is that good? That's exactly the reason why I dislike iPhones. I don't want the phone to decide which option should be available for me.
                – Džuris
                21 hours ago






              • 1




                @Džuris My dad once characterised the difference between iOS, Windows and Linux as a progression of how much people wanted to be involved in what their computer was doing. iOS users just want to use the applications and Do Things without dealing with a computer, Windows users like a bit more control but ultimately prefer not to think about most of the technical side, and Linux users fear the robot revolution and want to do everything themselves. He was mostly tongue in cheek about it but I think there's a grain of truth there :P
                – Ruadhan2300
                20 hours ago






              • 1




                @Ruadhan2300 your dad was getting close, but not quite right. The objective of iOS users is to be seen as the owner of an (expensive and fashionable) high tech device. The objective of Windows users is to use computer apps to get some "real-world" work done. The objective of Linux users is to get Linux itself to work - actually using it once it does work isn't very interesting ;)
                – alephzero
                18 hours ago










              • @Džuris Its the tradeoff of power and responsibility. Most iOS users do not want to be told "the Candy Crush knockoff game you just downloaded had a bug which erased all of your pictures and took them off the cloud before draining your battery when you needed to receive an important phone call." They would like an OS that helps avoid such situations. Arguably all OSs work like that. One of the primary aspects of any reasonable multitasking OS is that it limits application access to hardware so that one app can't take the whole system down. Apps can't insert IRQs.
                – Cort Ammon
                16 hours ago










              • @alephzero Can you please stop posting unsubstantive comments?
                – PascLeRasc
                14 hours ago















              answered yesterday









              Cort Ammon

              58125




              up vote
              0
              down vote














              OS [and all software] developers should see this and build their software to accommodate this instead of bothering users with "you did that wrong" messages.




              Yes, you're totally, completely, absolutely correct.



              Engineers and companies that do what you say, make huge amounts of money.



              Some of the biggest key products of our entire era are totally based on what you describe.




              Is this a widely-held view among UX designers/developers?




              Yes, it's one of the central ideas.



              It is constantly and widely discussed as one of, or the, central issues in UX.



              The BMW 7-series cockpit was a nightmare, since you had to fight and search for every function among literally hundreds of choices. The Renault Espace cockpit, by contrast, was a masterpiece: user-driven, and the epitome of this philosophy.




              Is there an official term for this philosophy?




              Sure, it is



              User-driven design



              Not 10 minutes ago I was yelling at some people "make it user-driven". They had some switches etc. that "had to be" set by a customer before use, which is a crap idea. Instead I screamed at everyone to make it "Pascal-style". I literally said "Make this user driven, get rid of the fucking switches."



              Yesterday I literally dealt the entire workday with precisely the "Pascal issue" in relation to a product and no other issue.



              Two years ago I spent four months personally inventing/engineering/whatever a new sort of algorithm for an unusual graphical interface where the entire end result was eliminating two bad "anti-Pascal-style" actions. (The result made zillions.)



              Note that to some extent, the everyday phrase



              K.I.S.S.



              amounts to, basically, a similar approach.





              Note - since the "Pascal-issue" is indeed so pervasive, there are



              many, many specific terms for subsets of the concept:



              For example, in the literal example you gave, that is known as



              plug-and-play



              or



              hot swappable



              Note that a company we have heard of, Apple, arguably made some 10 billion dollars from being first to market with ("more") plug-and-play printers and other peripherals than its competitors of the time, back before you were born.



              So, "plug and play" or "hot swappable" is indeed one particular specific subset of the overall user-driven design, KISS-UX, "Pascal-issue".
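              For the case that prompted the question -- users yanking a drive without ejecting -- "hot swappable" translates, at the software level, into writing defensively. A minimal sketch (the helper name `safe_write` is my own, not a standard API): write to a temporary file, force it onto the physical medium, then atomically rename, so an abrupt removal leaves either the old file or the new one, never a torn mixture.

```python
import os
import tempfile

def safe_write(path: str, data: bytes) -> None:
    """Write `data` to `path` so that an abrupt device removal never
    leaves a half-written file: old contents or new contents survive."""
    dir_ = os.path.dirname(os.path.abspath(path))
    fd, tmp = tempfile.mkstemp(dir=dir_)  # temp file on the same volume
    try:
        os.write(fd, data)
        os.fsync(fd)  # flush past OS caches onto the physical medium
    finally:
        os.close(fd)
    os.replace(tmp, path)  # atomic rename on POSIX and modern Windows
```

              This is the same idea behind the "quick removal" policy modern operating systems tend to apply to removable drives: keep as little unflushed state as possible, so the user's habit of just pulling the stick is no longer "wrong".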






              share|improve this answer



























                up vote
                0
                down vote














                OS [and all software] developers should see this and build their software to accommodate this instead of bothering users with "you did that wrong" messages.




                Yes, you're totally, completely, absolutely correct.



                Engineers and companies that do what you say, make huge amounts of money.



                Some of the biggest key products of our entire era are totally based on what you describe.




                Is this a widely-held view among UX designers/developers?




                Yes, it's one of the central ideas.



                it is constantly and widely discussed as one of, or the, central issues in UX.



                The BMW 7-series was a nightmare since you had to fight and search for every function among literally 100s of choices. Whereas the masterpiece Renault Espace cockpit was (see below) user-driven and the epitome of that.




                Is there an official term for this philosophy?




                Sure, it is



                User-driven design



                Not 10 minutes ago I was telling a team to "make it user-driven". They had switches that "had to be" set by the customer before use, which is a bad idea. I told them: "Make this user-driven; get rid of the switches."



                Yesterday I spent the entire workday on precisely this "Pascal issue" in relation to a product, and nothing else.



                Two years ago I spent four months engineering a new sort of algorithm for an unusual graphical interface whose entire purpose was eliminating two "anti-Pascal-style" actions. (The result made a great deal of money.)



                Note that to some extent, the everyday phrase



                K.I.S.S. ("keep it simple, stupid")



                amounts to a similar approach.





                Note - since the "Pascal-issue" is indeed so pervasive, there are



                many, many specific terms for subsets of the concept:



                For example, in the literal example you gave, that is known as



                plug-and-play



                or



                hot swappable



                Note that Apple, a company we have all heard of, arguably made some 10 billion dollars from being first to market with ("more") plug-and-play printers and other peripherals than its competitors of the time.



                So "plug and play" / "hot swappable" is one specific subset of the overall user-driven design concept ("KISS-UX", the "Pascal issue").
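                On the USB example specifically: software can make unannounced removal safe by flushing writes all the way to the device eagerly, instead of leaving them in a write-back cache. A minimal POSIX-style sketch in Python (the helper name `write_durably` is my own, purely illustrative of the technique, not any OS's actual API):

```python
import os

def write_durably(path: str, data: bytes) -> None:
    """Write data and flush it to stable storage immediately, so that
    pulling the drive right afterwards does not lose the file.
    This eager flushing is what a "no wrong way to unplug" design
    relies on, instead of write-back caching."""
    fd = os.open(path, os.O_WRONLY | os.O_CREAT | os.O_TRUNC)
    try:
        os.write(fd, data)
        os.fsync(fd)  # force the file contents down to the device
    finally:
        os.close(fd)
    # Also flush the directory entry itself (POSIX systems only),
    # so the new filename survives removal too.
    dir_fd = os.open(os.path.dirname(path) or ".", os.O_RDONLY)
    try:
        os.fsync(dir_fd)
    finally:
        os.close(dir_fd)
```

                The trade-off is throughput: synchronous flushing is slower than cached writes, which is why OSes that tolerate surprise removal (as modern Windows does by default for removable drives) disable write caching on those volumes.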






                share|improve this answer

























                  edited 3 hours ago

























                  answered 3 hours ago









                  Fattie

                  773517





                      PascLeRasc is a new contributor. Be nice, and check out our Code of Conduct.










                       
