Do we, as developers, have a UI responsibility?
-
I'm going through expenses, and for anyone living in Canada who doesn't have that weird Canada / US hardware translation unit built into their brain, it's painful. It's the dates. The US, alone, uses mm/dd/yy. The rest of the world, except for Belize, uses something vaguely sensible. Even Canada. Except Canada has a ton of systems imported directly from the US (or shares systems with their US parent companies), so lots of dates on things like receipts are in the form mm/dd/yy. Or they are dd/mm/yy. You can't tell. 06/07/17. Guess the date. Canadians can tell, just by looking at the date, whether it's June or July. To me that's impossible, yet they seem to do it. Somewhere a programmer decided to output the date this way. Either they just used the default date formatter or they deliberately chose a dd/mm/yy or mm/dd/yy format. Five seconds of work would enable them to output in dd-MMM-yyyy or dd-MMM-yy, or even yyyy-mm-dd or yy-mm-dd format. Any of which would allow a high level of accuracy in guessing the date. I'm sure they also thought, at the time, that their decision was a valid one. It wasn't, and it made me wonder whether we as developers have a responsibility to ensure that the information we present to the world is always presented unambiguously. Is this something you do? Is it something your lead actually stops you doing? Or is it something you've not really thought of?
cheers Chris Maunder
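A concrete illustration of the "five seconds of work" point, sketched in Python purely for illustration (the systems printing those receipts could of course be written in anything):

```python
from datetime import date

d = date(2017, 6, 7)

# Ambiguous: June 7 or July 6? Depends entirely on the reader's locale.
print(d.strftime("%m/%d/%y"))   # 06/07/17

# One-line alternatives that no reader can misparse:
print(d.strftime("%d-%b-%Y"))   # e.g. 07-Jun-2017 (month name is locale-dependent)
print(d.strftime("%Y-%m-%d"))   # 2017-06-07 (yyyy-mm-dd)
```

Same data, same amount of code, and the second and third forms can't be misread.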
First off, I'd answer your actual question (do devs have a UI responsibility?) with a resounding "yes". As to why US companies insist on forcing the rest of the world to use an illogical date format that practically no other country uses? I think it's the same underlying reason we get poor UIs in general: wrong mindset. As a dev, your job is to write software for **your users**, not for yourself/your ego. This is something a lot of developers seem to forget or aren't aware of. That means putting yourself in your users' shoes and designing software that meets your users' functional requirements while also being a joy to use. Poor UI design is generally the result of only the first part of this equation being deemed important: meeting the functional requirements. The ease of use is an afterthought, and you can guarantee the dev never actually used the UI in a real-life situation (they might, if you're lucky, test a few boundary cases). A prime example of this was the drop-down list in Outlook years ago for choosing the year of birth for a contact. Not only was this a drop-down list with 100 entries or so (one per year), the default value was the current year at the top of the list. Now if they'd tried entering the data of a real person or two, they would immediately have realised that none of their users will have an address book full of babies born this year, and that selecting something like "1967" from a massively long list is a UI fail. But if all you test is "date in the past, today, date in the future", you won't ever see the massive design flaw. MS fixed this in a subsequent update. And this mindset, whereby people find it difficult to empathise and put themselves in other people's shoes, is also the reason for these weird date formats nobody (apart from Americans) wants. As a programmer, you must know that there are different date formats (you should at least know ISO and your local format, if it's not ISO).
So it can't be ignorance, it can only be an unwillingness to put yourself in your users' shoes. Because if you did, you'd immediately understand that no one wants to use an unfamiliar and confusing date format. If your mindset is "we don't care about anyone else", your software suffers as a result. And it won't only be the date format that suffers.
-
Chris Maunder wrote:
Somewhere a programmer decided to output the date this way.
I have another thought about this, on this surprisingly long thread. The US date is written the way we say it: August 7, 2017. I don't know every European language, but I believe a German date would be 7 August 2017. These are how they are spoken. Now, just transform them to a number format whilst maintaining this order natural to the reader. At least to begin with; conventions and adaptations have followed suit. Consider: even the metric system could be recast with a different-sized 'metre', and everything else calculated to remain the same relative to the alternate size (rename them if you wish). But it isn't that way. Only one base-ten system was adopted by (so far as I know) all parties using it. This desire for convention is relatively new. Think of what is now the ultimate example of adopting convention: the Euro, with its particular value. Not a value with any real symbolic significance (or perhaps trying to be similar to the US dollar in value!). So, unless it's from a purely logical standpoint (YYYYMMDD), which is useful for sorting, or the internal representation we so love (which, you recall, is actually a floating-point value, dates like in Star Trek), the date is for a human-readable form (UI, remember!). Honestly, doesn't it make sense to write it the way the reader will naturally say it?
"The difference between genius and stupidity is that genius has its limits." - Albert Einstein
"If you are searching for perfection in others, then you seek disappointment. If you seek perfection in yourself, then you will find failure." - Balboos HaGadol Mar 2010
-
One school of thought says that I should use the default format, because the platform should take it from the OS preferences. In addition, if those are wrong, it is not my fault... right? Well, no. I should take the default date format as another external input, and as with any external input, I should not trust it. Thus, I decided a while ago to use the one and only true date format standard: ISO 8601. For the date, that would be yyyy-MM-dd... unless the client asks for something else, that is. They rarely do. Why might they ask for something different? Here are a few possible reasons:
- The software needs to interact with software made by a third party that expects the default date format.
- The above can happen for legal reasons. For instance, in my country there is a file format designed by the government that uses dd/MM/yyyy. At least it doesn't use two digits for the year.
- The client wants a date format that spells out the month name. Translating those is another issue, so at this point I will probably be using a library that supports this, instead of the default format functions.
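In most standard libraries, ISO 8601 is also the path of least resistance. A minimal Python sketch, purely illustrative (not the poster's actual code):

```python
from datetime import date

d = date(2017, 8, 7)

# ISO 8601 is the built-in textual form; no locale guesswork involved.
print(d.isoformat())  # 2017-08-07

# It also round-trips cleanly (Python 3.7+):
assert date.fromisoformat("2017-08-07") == d
```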
-
I disagree with the foundation of your postulate. I think it's our job, as developers, to remove as many barriers to understanding as possible, but ambiguity exists in the context of the interpretation, not in the processing and presentation of the data. The example that you cite is almost the perfect example: there is a cultural difference between you and the developer of a piece of software. If providing services to your cultural norm is not part of the system design, that UI designer would be doing a BAD job by formatting for Canadian norms. A team member that likes to jump on the "added features" train is a liability; I know this too well as I'm often that guy and need to get slapped with YAGNI periodically. If you are the target market, ergo formatting for your culture (or just general internationalization) is part of the system spec, then what you have is a badly designed product and it might be time to look to alternatives. Honestly, though, kvetching about cultural differences is counterproductive and useless. Especially when you're all wrong and yyyyMMMdd does the best job of removing ambiguity ;P
"There are three kinds of lies: lies, damned lies and statistics." - Benjamin Disraeli
-
Why Do Americans Write Dates: Month-Day-Year? - YouTube[^] TLDW: No one knows but we've been doing it since colonizing NA. Personally I only usually care about the month and current day of the week. The specific day and year are largely irrelevant day-to-day. Maybe that's why? Just a guess.
If that's the case, when/why did the UK switch to day/month/year?
Did you ever see history portrayed as an old man with a wise brow and pulseless heart, weighing all things in the balance of reason? Is not rather the genius of history like an eternal, imploring maiden, full of fire, with a burning heart and flaming soul, humanly warm and humanly beautiful? --Zachris Topelius Training a telescope on one’s own belly button will only reveal lint. You like that? You go right on staring at it. I prefer looking at galaxies. -- Sarah Hoyt
-
I've always thought that the entire world was wrong. In my mind the only reasonable expression would be YYYYMMDD. That should suffice for roughly the next 8,000 years, and by that time I really don't give a rip!
-
These days, with more self-taught programmers in the market, it is no surprise to find anything UI or GUI ignored. My background is as a Mac programmer, and they had an entire manual on user interface design. Buttons were to be x pixels wide and y pixels tall. Today, finding even evenly spaced buttons is a joy. I not only feel responsible for making the GUI make sense, but to make the user experience as easy and intuitive as possible. Most projects do not allow me time to go back and "clean up", so extra effort has to go in at the beginning to click all the proper properties and use the correct widgets. Back to dates, since I deal with a lot of vendors and different business sites, my default is YYYY-MM-DD with YYYY-MMM-DD for monthly reports to differentiate them from daily or one time. A side advantage of this is that daily or one-time reports sort properly in the directory. I have found that projects dealing with multiple vendors and multiple departments quickly adapt to this and I do not have to enforce it. That makes the project run smoother and interactions more friendly. It is the unknown and feeling like you aren't getting requirements that usually cause friction, so it is a proactive effort to get everyone speaking the same language and feeling comfortable.
-
Yeah, I liked that Oracle defaulted to dd-MMM-yy, and we used that a lot. Over the years I've come to prefer some variation of YYYYMMDD, and I use that as time stamps for files, etc. I find it sorts nicely, is totally unambiguous, and when combined as YYYYMMDD_HHNNSS it still sorts, and moves the ball forward. But don't get me started on AM/PM... Who thought of that? And what were they thinking? And time zones, and daylight saving time. Obviously not a lot of computer planning went into any of this, when the first PC could not represent a date before 1/1/1900, lol. Finally, where is the metric system when you need it? There should be 10 seconds per minute, 10 minutes per hour, 10 hours per day, and 10 days per month, etc. How much easier would life be then?
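The sorting property mentioned above is simply lexicographic order coinciding with chronological order. A small Python sketch with made-up file names (the HHNNSS above is the same idea, with NN meaning minutes in some format-string dialects):

```python
from datetime import datetime

# Stamp files as YYYYMMDD_HHMMSS:
stamp = datetime(2017, 8, 7, 14, 30, 5).strftime("%Y%m%d_%H%M%S")
print(stamp)  # 20170807_143005

names = [
    "report_20170807_143005.log",  # hypothetical file names
    "report_20161231_235959.log",
    "report_20170101_000000.log",
]
# A plain text sort is also chronological order:
print(sorted(names))
```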
-
First, you have to know your audience. The software I develop is only used in the US, so 06/07/2017 is unambiguous. That being said, it could be improved. Why shouldn't we say June 7, 2017, just in case we ever get a ROW contract? To answer your question, UI is our responsibility in the same way that good design principles, good data structures, etc. are our responsibility. Part of our role is to be consultants to the business people, and to point out these kinds of problems when we think of them. In the same way we might raise issues with how data actually is related when gathering requirements (for example), by asking deeper questions during the requirements phase based on our experience, we should raise these kinds of issues at that time. For this specific issue, though, just having a software standard seems like a good idea: dates should be displayed as ... or whatever works for you.
-
I prefer some variation on yyyy-MMM-dd, although for some cultures it may be yyyyy-MMM-dd or even yyyyyy-MMM-dd. But that's because I like easy sorting, something that is easy to do with a computer but not so easy in a paper ledger. If you know the application is going to be used on a system that supplies a default date formatter, always use that. If the user doesn't have it set to what they like, at least it will be consistent with most of the other software the user uses. Or you can work in an industry that specifies the format everybody has to follow. In my case that is ddMMMyy or ddMMMyyyy, both of which are a pain to sort if all you have is text. :( But to answer the question... it depends. For a legacy UI that you don't have the time/budget/permission to recode and regression test, stay with the same format of data display. Changing it for your piece will generate user irritation because it is different, or will make them irritated with the older portion because it's not as nice as the new part. For new code, use system defaults. Maybe add a section to the documentation about setting the system date format. Of course, that has its pitfalls as well.
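On the "pain to sort if all you have is text" point: the usual workaround is to sort on the parsed dates rather than the raw strings. A Python sketch, purely illustrative (the %b month-name code assumes an English locale):

```python
from datetime import datetime

# ddMMMyy strings don't sort as text, so sort on the parsed dates instead.
raw = ["17Feb17", "03Jan18", "25Dec16"]
ordered = sorted(raw, key=lambda s: datetime.strptime(s, "%d%b%y"))
print(ordered)  # ['25Dec16', '17Feb17', '03Jan18']
```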
-
Yes, we have a responsibility when creating a UI. You seem to be forgetting what the U stands for. When people talk about dates, people will say "June Seventh" way more often than they say "Seventh of June". Also, when presented with a list of dates, having things in mm/dd format makes them easier to compare at a glance. If you are designing middleware, you can be as logical and unambiguous as you please. But if you design a UI and prioritize your personal sense of logic and order above what the users feel is comfortable and familiar, your design will be a failure.
-
Yeah Just try to implement dd-MMM-yyyy in many places only to have the [non-technical] "stakeholder" go 'Why is the date f!@#$% up? Go fix that. Our stupid users won't unnerstand'. Yes Biff - going Biff....
-
It's almost certainly a matter of month/day being most often used, with the year added on optionally. Consider the MM/DD portion of the date to be "day-of-year". It's big-endian, which isn't bad in itself. Appending the year afterwards vs beforehand is normal in speech, but when you write it, it appears that you're now using little-endian: day-of-year/year. It's awkward, but not crazy. If you think that's crazy because mixing little-endian with big-endian is crazy, consider that DD/MM/YYYY is already mixing little- with big-, since the actual digits we write are big-endian. (I prefer YYYY-MM-DD, myself, which is all big-endian and fully consistent, but the American way isn't illogical, per se.)
Jesse
-
Since colonial times... Interesting. The Declaration of Independence is month day year, but I think Lincoln used day month year. But he also used "four score and seven years ago", which is confusing to everyone today. My theory has always been that it was the IRS in the early 1900s using month day year that forced the U.S. to "standardize" on the wacky date format.
-
Are you trying to strong arm me? /ravi
My new year resolution: 2048 x 1536 Home | Articles | My .NET bits | Freeware ravib(at)ravib(dot)com
I can't think of an on-topic comment that uses "uranus". :(
-
Yea, he even points out an instance where they use MM/DD/YYYY and then DD/MM/YYYY in the same sentence. It seems that as soon as people came over to the Americas, you see the MM/DD/YYYY format start being used alongside DD/MM/YYYY, and then at some point people just decided to use MM/DD/YYYY exclusively. A historical mystery. Maybe it was just to spite the British?
-
The real goal is to switch everybody to yyyy-mm-dd (e.g. 2017-08-07).
1. It contains all the needed information.
2. It's good for sorting, because it goes bigger --> smaller.
3. No one would get confused over which number means what (the bigger --> smaller principle).
4. It's already used by 1.5 billion Chinese, and possibly elsewhere in Asia too.
While we're at it, we should also switch to a 24H clock (basically for the same reasons). If you like it, use it in your daily life! You might still want to use mm/dd/yy on your IRS reports, but other than that, use it whenever you can, and by god we will make the world unite around this one proper date/time format.
Unlike programmers, who always simplify THEIR OWN lives, there are normal people who don't care about your way of sorting. We should think about how dates are used in every case, not only the dates of files! And if you're human, you hardly like months in numeric form. What's that month "10"??? November? August? Who cares; it should be LETTERS! And of course, unless you're a time traveler, you know what year it is today! So first place should be occupied by the most important parts: day and month. My preference is "dd MMM yyyy". It's HUMANLY, it's convenient, and it never leaves you guessing what the order is. "17 Feb 2017" - simple, obvious.
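For what it's worth, the sorting claim in the quoted post is easy to verify: with yyyy-mm-dd, a plain text sort is already a date sort, which is not true of the little-endian forms. A quick Python sketch, purely illustrative:

```python
ddmm = ["07/08/2017", "25/12/2016", "01/01/2017"]
iso  = ["2017-08-07", "2016-12-25", "2017-01-01"]

# Text order is NOT chronological for dd/mm/yyyy strings...
print(sorted(ddmm))  # ['01/01/2017', '07/08/2017', '25/12/2016']

# ...but it IS for ISO strings:
print(sorted(iso))   # ['2016-12-25', '2017-01-01', '2017-08-07']
```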
-
The way an American speaks has nothing in common with how it should be written. If you see the time "10 hours 12 minutes", IT IS NOT "thousand twelve", whatever military people say. Feel the difference? Same with dates, especially when only the crazy USA has the clumsy order "month day year". The normal order is "day month year". The day number is more important than the year. And if you exchange dates with other people, think twice before you force 'em to scratch their heads over your "17/7/17". The universal date is quite simple: "dd MMM yyyy". No mistakes, no fighting, and it suits far more people than just those living in America.
-
Your time example actually proves my point more than yours. The format you like: dd/mm/yyyy is in decreasing significance order, meaning most specific to least specific. So if you were to extend that with time then it would be ss:mm:hh dd/mm/yyyy. But I am sure that seems ridiculous to you. We all use cultural rules when choosing how to display data. Do you also insist that in languages that are read from right-to-left that they should switch to left-to-right because you feel it suits more people? If you were designing a UI with a status indicator and you chose a fairly standard red/yellow/green scheme would you refuse to change it after finding out your users were colorblind because the scheme suits more people? The point is that it is how your users feel about the way data is presented that matters. That is a higher responsibility than adherence to rules you feel are "universal".