Do we, as developers, have a UI responsibility?
-
I'm going through expenses, and for anyone living in Canada who doesn't have that weird Canada / US hardware translation unit built into their brain, it's painful. It's the dates. The US, alone, uses mm/dd/yy. The rest of the world except for Belize uses something vaguely sensible. Even Canada. Except Canada has a ton of systems imported directly from the US (or shares systems with their US parent companies), so lots of dates on things like receipts are in the form mm/dd/yy. Or they are dd/mm/yy. You can't tell. 06/07/17. Guess the date. Canadians can tell, just by looking at the date, whether it's June or July. To me that's impossible, yet they seem to do it. Somewhere a programmer decided to output the date this way. Either they just used the default date formatter or they deliberately chose a dd/mm/yy or mm/dd/yy format. 5 seconds of work would enable them to output in dd-MMM-yyyy or dd-MMM-yy or even yyyy-mm-dd or yy-mm-dd format, any of which would allow a high level of accuracy in guessing the date. I'm sure they also thought, at the time, that their decision was a valid one. It wasn't, and it made me wonder whether we as developers have a responsibility to ensure that the information we present to the world is always presented unambiguously. Is this something you do? Is it something your lead actually stops you doing? Or is it something you've not really thought of?
cheers Chris Maunder
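For the record, that "5 seconds of work" really is about 5 seconds in most stacks. A minimal Java sketch, with made-up values rather than anything from a real receipt system:

```java
import java.time.LocalDate;
import java.time.format.DateTimeFormatter;
import java.util.Locale;

public class ReceiptDates {
    public static void main(String[] args) {
        LocalDate date = LocalDate.of(2017, 6, 7);

        // The ambiguous form: is this June 7th or July 6th?
        System.out.println(date.format(DateTimeFormatter.ofPattern("MM/dd/yy")));
        // -> 06/07/17

        // Any of these removes the guesswork:
        System.out.println(date.format(
                DateTimeFormatter.ofPattern("dd-MMM-yyyy", Locale.ENGLISH))); // 07-Jun-2017
        System.out.println(date.format(DateTimeFormatter.ISO_LOCAL_DATE));    // 2017-06-07
    }
}
```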
First, you have to know your audience. The software I develop is only used in the US, so 06/07/2017 is unambiguous. That being said, it could be improved. Why shouldn't we say June 7, 2017, just in case we ever get a ROW contract? To answer your question, UI is our responsibility in the same way that good design principles, good data structures, etc. are our responsibility. Part of our role is to be consultants to the business people and point out these kinds of problems when we think of them. In the same way we might raise issues with how data actually is related when gathering requirements (for example), by asking deeper questions during the requirements phase based on our experience, we should raise these kinds of issues at that time. With this specific issue, though, just having a software standard seems like a good idea: dates should be displayed in one agreed format, whatever works for you.
-
I prefer some variation on yyyy-MMM-dd, although for some cultures it may be yyyyy-MMM-dd or even yyyyyy-MMM-dd. But that's because I like easy sorting, something that is easy to do with a computer but not so easy in a paper ledger. If you know that the application is going to be used on a system that supplies a default date formatter, always use that. If the user doesn't have it set to what they like, at least it will be consistent with most of the other software the user uses. Or you can work in an industry that specifies the format that everybody has to follow. In my case that is ddMMMyy or ddMMMyyyy, both of which are a pain to sort if all you have is text. :( But to answer the question... It depends. For a legacy UI that you don't have time/budget/permission to recode and regression test, stay with the same format of data display. Changing it for your piece will generate user irritation because it is different, or will make them irritated with the older portion because it's not as nice as the new part. For new code, use system defaults, as in the sketch below. Maybe add a section to documentation about setting the system date. Of course that has its pitfalls as well.
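A quick sketch of the "use the system default" advice in Java terms (assuming java.time; Locale.getDefault() stands in for whatever the user's system is set to):

```java
import java.time.LocalDate;
import java.time.format.DateTimeFormatter;
import java.time.format.FormatStyle;
import java.util.Locale;

public class SystemDefaultDates {
    public static void main(String[] args) {
        LocalDate date = LocalDate.of(2017, 8, 9);

        // Let the user's locale decide the presentation; the program
        // never hard-codes mm/dd versus dd/mm at all.
        DateTimeFormatter byLocale = DateTimeFormatter
                .ofLocalizedDate(FormatStyle.MEDIUM)
                .withLocale(Locale.getDefault());
        System.out.println(date.format(byLocale)); // e.g. "Aug 9, 2017" on en_US

        // The industry-mandated ddMMMyy mentioned above:
        System.out.println(date.format(
                DateTimeFormatter.ofPattern("ddMMMyy", Locale.ENGLISH))); // 09Aug17
    }
}
```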
-
Yes, we have a responsibility when creating a UI. You seem to be forgetting what the U stands for. When people talk about dates, they will say "June Seventh" way more often than they say "Seventh of June". Also, when presented with a list of dates, having things in mm/dd format makes them easier to compare at a glance. If you are designing middleware you can be as logical and unambiguous as you please. But if you design a UI and prioritize your personal sense of logic and order above what the users feel is comfortable and familiar, your design will be a failure.
-
Yeah. Just try to implement dd-MMM-yyyy in many places, only to have the [non-technical] "stakeholder" go "Why is the date f!@#$% up? Go fix that. Our stupid users won't unnerstand." Yes Biff - going Biff....
-
Why Do Americans Write Dates: Month-Day-Year? - YouTube[^] TLDW: No one knows, but we've been doing it since colonizing NA. Personally, I usually only care about the month and the current day of the week. The specific day and year are largely irrelevant day-to-day. Maybe that's why? Just a guess.
It's almost certainly a matter of month/day being most often used, year being added on optionally. Consider the MM/DD portion of the date to be day-of-year. It's big-endian, which isn't bad in itself. Appending the year afterwards vs beforehand is normal in speech, but when you write it, it appears that you're now using little-endian: day-of-year/year. It's awkward, but not crazy. If you think that's crazy because mixing little-endian with big-endian is crazy, consider that DD/MM/YYYY is already mixing little- with big-, since the actual digits we write are big-endian. (I prefer YYYY-MM-DD, myself, which is all big-endian and fully consistent, but the American way isn't illogical, per se.)
Jesse
-
Since colonial times... Interesting. The Declaration of Independence is month day year, but I think Lincoln used day month year. But he also used "four score and seven years ago", which is confusing to everyone today. My theory has always been that it was the IRS in the early 1900s using month day year that forced the U.S. to "standardize" on the wacky date format.
-
Are you trying to strong arm me? /ravi
I can't think of an on-topic comment that uses "uranus". :(
-
Yea, he even points out an instance where they use MM/DD/YYYY then DD/MM/YYYY in the same sentence. It seems as soon as people came over to the Americas, you see the MM/DD/YYYY format start being used along with DD/MM/YYYY, and then at some point people just decided to use MM/DD/YYYY exclusively. A historical mystery. Maybe it was just to spite the British?
-
The real goal is to switch everybody to yyyy-mm-dd (e.g. 2017-08-07). 1. It contains all the needed information. 2. It's good for sorting because it goes bigger --> smaller. 3. No one would get confused over which number means what (bigger --> smaller principle). 4. It's already used by 1.5 billion Chinese and possibly in other places in Asia too. While we're at it, we should also switch to a 24H clock (basically for the same reasons). If you like it: use it in your daily life! You might still want to use mm/dd/yy on your IRS reports, but other than that - use it whenever you can and by god we will make the world unite around this one proper date/time format.
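The sorting argument is easy to demonstrate: with yyyy-mm-dd and zero padding, a plain text sort is also a chronological sort. A minimal Java sketch with made-up dates:

```java
import java.util.Arrays;
import java.util.Collections;
import java.util.List;

public class IsoSortDemo {
    public static void main(String[] args) {
        // yyyy-mm-dd, zero-padded: lexicographic order == chronological order.
        List<String> iso = Arrays.asList("2017-08-07", "2016-12-31", "2017-01-02");
        Collections.sort(iso);
        System.out.println(iso); // [2016-12-31, 2017-01-02, 2017-08-07]

        // The same dates as mm/dd/yy sort meaninglessly as text.
        List<String> us = Arrays.asList("08/07/17", "12/31/16", "01/02/17");
        Collections.sort(us);
        System.out.println(us); // [01/02/17, 08/07/17, 12/31/16]
    }
}
```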
Unlike programmers, who always simplify THEIR OWN lives, normal people don't care about your way of sorting. We should think about how dates are used in every case, not only dates of files! And if you're human, you hardly like months in numeric form. What's that month "10"??? November? August? Don't give a damn; it should be LETTERS! And of course, unless you're a time traveler, you know what year today is! So first place should go to the most important parts: day and month. My preference is "dd MMM yyyy" - it's HUMAN and it's convenient. And it never leaves you guessing what the order is. "17 Feb 2017" - simple, obvious.
-
The way an American Papuan speaks has nothing in common with how it should be written. If you see the time "10 hours 12 minutes", IT IS NOT "thousand twelve", whatever military people say. Feel the difference? Same with dates, especially when only the crazy USA has the clumsy order "month day year". The normal order is "day month year". The day number is more important than the year. And if you exchange dates with other people, think twice before you force 'em to scratch their heads over your "17/7/17". The universal date is quite simple: "dd MMM yyyy". No mistakes, no fighting, and it suits far more people than just those living in America.
-
Your time example actually proves my point more than yours. The format you like, dd/mm/yyyy, is in decreasing significance order, meaning most specific to least specific. So if you were to extend that with time, it would be ss:mm:hh dd/mm/yyyy. But I am sure that seems ridiculous to you. We all use cultural rules when choosing how to display data. Do you also insist that languages that are read right-to-left should switch to left-to-right because you feel it suits more people? If you were designing a UI with a status indicator and you chose a fairly standard red/yellow/green scheme, would you refuse to change it after finding out your users were colorblind, because the scheme suits more people? The point is that it is how your users feel about the way data is presented that matters. That is a higher responsibility than adherence to rules you feel are "universal".
-
Well, it always helps to have empathy towards the end-user in order to achieve programs that are as ergonomic and intuitive as possible. But back to your case: the programmer in charge of writing the database transaction layer is supposed to make sure that the date is stored as a time-stamp, whereas the UI programmer should take care of parsing this time-stamp according to the end-user's localization, so that the end-user can, for example, always remotely generate a consistent receipt whatever their location. But yes, both programmers can definitely be the same person. Presenting information to the world in various notations or languages requires more UI effort than using more standardized schemes, but it has its charms. In all cases, if the data is presented ambiguously, then the project is an epic failure (!?), with data dumped to the end-user becoming inconsistent and therefore unusable. In theory, and back to your initial question, inside a project with respective road-maps for several programmers, it would be the UI programmer's job to properly format data that has been consistently stored.
Kaveh Rassoulzadegan
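A sketch of that layering in Java's java.time terms (the method names and locales here are illustrative, not from any actual system):

```java
import java.time.Instant;
import java.time.ZoneId;
import java.time.format.DateTimeFormatter;
import java.time.format.FormatStyle;
import java.util.Locale;

public class ReceiptRendering {
    // Transaction layer: persist an unambiguous instant (UTC), never display text.
    static Instant recordPurchase() {
        return Instant.now();
    }

    // UI layer: render the same instant for whatever locale/zone the user has.
    static String render(Instant stamp, Locale locale, ZoneId zone) {
        return stamp.atZone(zone).format(
                DateTimeFormatter.ofLocalizedDateTime(FormatStyle.MEDIUM)
                                 .withLocale(locale));
    }

    public static void main(String[] args) {
        Instant stamp = recordPurchase();
        System.out.println(render(stamp, Locale.CANADA, ZoneId.of("America/Toronto")));
        System.out.println(render(stamp, Locale.US, ZoneId.of("America/New_York")));
    }
}
```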
-
You can prove whatever you like, but it means nothing to me. If some doc needs a timestamp, it's quite easy: "17 Feb 2017 15:27" - a format understandable by all humans in the world. Everything else you wrote - TL;DR
Yes, that works just fine unless they speak only Russian or Chinese (which accounts for quite a large number of the humans in the world). The fact that you TL;DR a single paragraph means you have no interest in learning anything and are willing to stay ignorant. Good luck with that.
-
This is why we hire developers and designers. And, FWIW: 20170809 FTW
-
Developers have a responsibility to understand the system that they are building, and preferably before they start building the system IMHO. How the system is intended to be used (via a User Interface), and by whom, is part of that understanding. Therefore the answer has to be 'Yes'.
-
Wow, I was surprised that this is even a question, but I'm pleased so many believe they are responsible for a user-friendly UI. Absolutely: even if the programmer is given specific specifications, they need to validate what the user experience will be. We do not program stuff just to move data around. Ultimately a user must interface with the information, and when they do, it should be as intuitive as possible. Nothing is more crazy-making for the user than a UI that is confusing or ambiguous when the designer and programmer (and the whole team) have the power to make clean, sensible user experiences.
-
In my own writings I always use big-endian dates, always with a 4-digit year, and may change e.g. the naming of files received to suit my preferences. I also move the date ahead of any descriptive term, so that it starts the file name / table entry / whatever. Main reason: it allows sorting on the date (which is my most common sorting criterion) as text. But I have a slight feeling of being somewhat nerdy when I do so. Humans can sort the dates as they were originally written; my rewriting is for the machine, not for humans.

And, I must admit, I frequently do not do it that way for other kinds of data. My address book is sorted not big-endian but little-endian. If someone asks me for my birthdate, I state it in little-endian form - and that is a date. Time of day is usually little-endian ("ten to nine" - "eight fifty" sounds like something from an army guy). Friends are named by their first name preceding their family name.

Little-endianness is, in a way, user friendly in that it focuses first on the nearness, and then gradually puts things into a bigger scope. Big-endianness either requires you to start with the universe and narrow down from there, step by step - otherwise, it might be ambiguous. If you make a new friend, telling him where you live may be limited to giving the street name and number; the town, county, state, nation, continent, planet and galaxy are implicit. So, little-endian may be more user friendly.

Actually, you have a similar issue in programming! In most programming languages, the opening of a statement may identify it as an assignment ("X = ..."). But after the assignment operator, which gives you the Grand Overview of the statement, you dive deep into the details of the expression, with priority rules en masse, some of which are so obscure that you have to ignore/override them by use of parentheses. The APL language is 100% consistent: no priorities, main things first, and if you want details, continue reading. In "X = 3 * ...", X is being changed; that is the essential thing. It is being set to 3 times some calculated value, no matter how it is calculated; in most cases, the 3 has high semantic importance (e.g. number of units bought). If you want the details, read on to break the expression up. If you only need an overview, you can read only the first parts of the statements. And I have worked with languages going the other way (but with operator priorities): "(A+4) * B =: C" - first assemble the pieces, then tell what to do with it (i.e. storing in C).
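The date-first naming habit, as a minimal Java sketch (the helper and file names are made up for illustration):

```java
import java.time.LocalDate;
import java.time.format.DateTimeFormatter;

public class FilenameDates {
    // Prefix the ISO date so a plain alphabetical directory listing
    // is also a chronological one.
    static String stamped(String description, LocalDate received) {
        return received.format(DateTimeFormatter.ISO_LOCAL_DATE) + " " + description;
    }

    public static void main(String[] args) {
        System.out.println(stamped("invoice-acme.pdf", LocalDate.of(2017, 8, 7)));
        // -> 2017-08-07 invoice-acme.pdf
    }
}
```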
Member 7989122 wrote:
So how do you correctly sort a table of names containing both Norwegian and Swedish names?
Got similar issues in my home tongue Afrikaans: lots of áâäèéêëïôöûü going on, not to mention other characters from French and German which made their way into it as well. Usually for sorting purposes I tend to first convert them to an invariant culture; you do get some decent libraries which are pretty fast at this.
Member 7989122 wrote:
Our endianness is inconsistent in hundreds of areas; dates is only one.
Definitely correct. Even just addresses are a problem over here (well, not so much a problem as an inconsistency). E.g. in English you state number then street; in Afrikaans it's the other way round, but in both they're then followed by suburb/area, etc.
Member 7989122 wrote:
Big-endianness either requires you to start with the universe and narrow down from there, step by step - otherwise, it might be ambiguous.
Not completely in agreement here. If I follow the same principle for little-endianness, the same idea then means you never stop until you reach the universe. And the same idea about stopping once it becomes obvious can be applied to big-endian too: only start where you think it's no longer obvious (e.g. don't state the country if you're already there). To show an example of big-endian we very commonly use: telephone numbers. We seldom state the area code if they're in the same city; we even more seldom state the country code when they're in the same country. Just because they're big-endian doesn't mean people HAVE to start with the universe.
-
I manipulate and persist dates/times in an unambiguous fashion. I present them according to the user's preferences as indicated by the Windows locale or other mechanism.
Software Zen:
delete this;
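That separation, sketched in Java (using Java's locale machinery as a stand-in for the Windows locale; the dates and locales are arbitrary):

```java
import java.time.LocalDate;
import java.time.format.DateTimeFormatter;
import java.time.format.FormatStyle;
import java.util.Locale;

public class LocalePresentation {
    public static void main(String[] args) {
        // Persisted form: one unambiguous representation, always.
        LocalDate stored = LocalDate.parse("2017-06-07"); // ISO-8601

        // Presented form: whatever the user's locale prefers.
        for (Locale user : new Locale[] { Locale.US, Locale.CANADA_FRENCH, Locale.GERMANY }) {
            System.out.println(user + ": " + stored.format(
                    DateTimeFormatter.ofLocalizedDate(FormatStyle.LONG).withLocale(user)));
        }
        // en_US: June 7, 2017 / fr_CA: 7 juin 2017 / de_DE: 7. Juni 2017
    }
}
```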
Like a good developer should, thank you very much. But in the current "developer" world (i.e. Silicon Valley style), minding that "not all of your millions of users are from the US" is a mind-boggling concept. You think dates are bad? How about keyboard shortcuts that absolutely cannot work (looking at you, Android Studio)? Every time I see a "Ctrl+/" or something like that, it's apparent that nobody tried the software with a non-US/UK keyboard. (Hint: most localized keyboards use key combinations for (), [], \, etc., so shortcuts that require those keys will not work.)