Do we, as developers, have a UI responsibility?
-
I'm going through expenses, and for anyone living in Canada who doesn't have that weird Canada/US hardware translation unit built into their brain, it's painful. It's the dates. The US, alone, uses mm/dd/yy. The rest of the world, except for Belize, uses something vaguely sensible. Even Canada. Except Canada has a ton of systems imported directly from the US (or shares systems with their US parent companies), so lots of dates on things like receipts are in the form mm/dd/yy. Or they are dd/mm/yy. You can't tell.

06/07/17. Guess the date. Canadians can tell, just by looking at the date, whether it's June or July. To me that's impossible, yet they seem to do it.

Somewhere a programmer decided to output the date this way. Either they just used the default date formatter or they deliberately chose a dd/mm/yy or mm/dd/yy format. Five seconds of work would enable them to output in dd-MMM-yyyy or dd-MMM-yy, or even yyyy-mm-dd or yy-mm-dd format. Any of which would allow a high level of accuracy in guessing the date. I'm sure they also thought, at the time, that their decision was a valid one. It wasn't, and it made me wonder whether we as developers have a responsibility to ensure that the information we present to the world is always presented unambiguously.

Is this something you do? Is it something your lead actually stops you doing? Or is it something you've not really thought of?
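To make the ambiguity concrete, here's a minimal sketch (Java chosen for illustration; the java.time pattern letters happen to match the formats written above):

```java
import java.time.LocalDate;
import java.time.format.DateTimeFormatter;
import java.util.Locale;

public class AmbiguousDates {
    public static void main(String[] args) {
        String receipt = "06/07/17";

        // The same string parses as a valid date under both conventions:
        LocalDate us   = LocalDate.parse(receipt, DateTimeFormatter.ofPattern("MM/dd/yy"));
        LocalDate rest = LocalDate.parse(receipt, DateTimeFormatter.ofPattern("dd/MM/yy"));
        System.out.println(us);   // 2017-06-07 (June 7th)
        System.out.println(rest); // 2017-07-06 (July 6th)

        // One changed format string removes the ambiguity:
        DateTimeFormatter unambiguous = DateTimeFormatter.ofPattern("dd-MMM-yyyy", Locale.ENGLISH);
        System.out.println(us.format(unambiguous)); // 07-Jun-2017
    }
}
```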
cheers Chris Maunder
I don't display dates in mm/dd/yy unless the stakeholder demands it. I always prefer to use dd-MMM-yyyy for the very reason you cite.
".45 ACP - because shooting twice is just silly" - JSOP, 2010
-----
You can never have too much ammo - unless you're swimming, or on fire. - JSOP, 2010
-----
When you pry the gun from my cold dead hands, be careful - the barrel will be very hot. - JSOP, 2013
-
Not forgetting degrees (temperature) and degrees (angle). The latter is odd because almost all humans use the 0..360 range, whereas almost every math library uses radians. It's easy to visualise a 35-degree slope, but how steep is 0.4 radians? With pi an irrational number, and computers not capable of doing infinite digits yet (not that long ago computers couldn't do over 6 decimal places very well), what a stupid choice that was. Another that's slipping is currency: I'm starting to see single decimals popping up, e.g. $5.5. Sure, cents (pennies if you must) are annoying, but it's just lazy to skip that last digit. (Currently the temperature here is 298 degrees and my chair is tilted at about .1 degrees, just the way this grumpy irrational old man likes it.)
Sin tack the any key okay
We use radians because the constants in (naively implemented) math libraries are inverses of integers. As all serious math libraries use economised polynomials, this is less of a problem these days. Note that IEEE 754-2008 recommends functions such as sinPi, defined as sin(pi*x). This could easily be modified to take angles in degrees, grads, mils, etc.
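As a sketch of the same idea in application code (Java for illustration; sinDegrees is a made-up helper, not a library function): reducing the angle in its natural unit before converting means a huge angle doesn't drag the rounding error of pi/180 along with it.

```java
// Sketch: sine of an angle given in degrees. The angle is reduced modulo
// 360 *before* the degrees-to-radians conversion. IEEEremainder is exact,
// whereas reduction in radians cannot be, since 2*pi is not representable
// in binary floating point.
public final class DegreeTrig {
    private DegreeTrig() {}

    public static double sinDegrees(double degrees) {
        double reduced = Math.IEEEremainder(degrees, 360.0); // exact, result in [-180, 180]
        return Math.sin(Math.toRadians(reduced));
    }

    public static void main(String[] args) {
        // sin(360,000,030 degrees) is sin(30 degrees) = 0.5 exactly.
        System.out.println(sinDegrees(360_000_030.0));                // ~0.5
        System.out.println(Math.sin(Math.toRadians(360_000_030.0))); // visibly off 0.5
    }
}
```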
If you have an important point to make, don't try to be subtle or clever. Use a pile driver. Hit the point once. Then come back and hit it again. Then hit it a third time - a tremendous whack. --Winston Churchill
-
The question seems a bit odd to me in that, since I never went to computer school, I always assumed the point of a UI is to give the user what they want. I've often returned a SQL date, for example, as LEFT(datefield, 11), which gave MMM DD, YYYY automatically (as long as one remembers to sort by the real datetime values). Really, anything else the user needs to look at should be made intelligible too; otherwise the calls come in and it has to be changed. From my point of view, the European convention, dd-mm-yyyy (regardless of delimiters), is every bit as dumb as the US convention: it won't sort correctly without a pain in the ass. So I've taken to YYYYMMDD, or, for human-readable, YYYY.MM.DD, when it's for my own use. Big Endian, I think, is surely the way to go for dates.
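A sketch of that "format for the eye, sort by the real value" split outside the database (Java used for illustration, since LEFT() is a database-specific trick):

```java
import java.time.LocalDate;
import java.time.format.DateTimeFormatter;
import java.util.Comparator;
import java.util.List;
import java.util.Locale;

public class SortByValueDisplayAsText {
    public static void main(String[] args) {
        List<LocalDate> dates = List.of(
                LocalDate.of(2017, 12, 1),
                LocalDate.of(2017, 2, 28),
                LocalDate.of(2016, 8, 7));

        // Human-readable form, equivalent to the MMM DD, YYYY above.
        DateTimeFormatter display = DateTimeFormatter.ofPattern("MMM dd, yyyy", Locale.ENGLISH);

        // Sort on the real date values; format only at the last moment.
        dates.stream()
             .sorted(Comparator.naturalOrder())
             .map(d -> d.format(display))
             .forEach(System.out::println); // Aug 07, 2016 / Feb 28, 2017 / Dec 01, 2017
    }
}
```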
"The difference between genius and stupidity is that genius has its limits." - Albert Einstein
"If you are searching for perfection in others, then you seek disappointment. If you are seek perfection in yourself, then you will find failure." - Balboos HaGadol Mar 2010
W∴ Balboos wrote:
Big Endian, I think, is surely the way to go for dates.
Agreed. Especially if some decent delimiter is used (something like -, /, or .) it is as readable, usable, workable, and consistent as possible. And thus it ends up with the best of all worlds: easy to sort, consistent for storage, no ambiguity, straightforward, and easy to pick up the day and month as well. I mean, how difficult is it to read the last 4 digits of a date instead of the first 4?

For me, I try to use the ISO standard date format as much as possible, though for display I would try to convert to whatever locale setting is in effect. But for storage and working with dates, especially if working with text-based dates, there is simply no alternative to the ISO standard (even allowing for variations in time zones for DateTime data).

Here's the irritating thing though: most programs are designed for the US market. Thus most are by default showing the US-only (not counting some small country somewhere wanting to be "different too" - now that sounds oxi-mor-ish) "randomised" date format. And no, the US is not the largest part of the world, not by a long shot. Meaning that until the user (the largest part of users throughout the world) has realised that the date is in some unexpected rearranged order, they're reading it wrong to begin with.

IMO the default (if not, more properly, displaying to user preferences) should be ISO (i.e. YYYY-MM-DD), as it is impossible to misinterpret: there's never something like YYYY-DD-MM, or at least not that I've come across. So even if someone's used to DD-MM-YY, or for the US "non-world"-citizens MM-DD-YY, it's quick to pick up exactly what is meant (especially if sticking to the 4-digit year).
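That storage-vs-display split is a few lines with any modern date API. A sketch (Java for illustration; the locale is just an example):

```java
import java.time.LocalDate;
import java.time.format.DateTimeFormatter;
import java.time.format.FormatStyle;
import java.util.Locale;

public class IsoStoreLocalShow {
    public static void main(String[] args) {
        LocalDate date = LocalDate.of(2017, 7, 6);

        // Storage / interchange: LocalDate.toString() is already ISO 8601.
        String stored = date.toString(); // "2017-07-06"

        // Display: defer to the user's locale instead of hard-coding a format.
        DateTimeFormatter display = DateTimeFormatter
                .ofLocalizedDate(FormatStyle.LONG)
                .withLocale(Locale.GERMANY); // e.g. "6. Juli 2017" (exact text is JDK/CLDR-dependent)

        System.out.println(stored);
        System.out.println(date.format(display));
    }
}
```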
-
During the US military attack on Iraq in 2003, Americans renamed (or tried to rename) "french fries" to "freedom fries" to indicate their dissatisfaction with French opposition to the bombings. I think that a nice similar political action, in these days of terrorism from the Arabic world, would be to reverse the digit order in our numbers. Our digits are, as most people know, Arabic numerals - exactly the same as the digits used in Arabic writing. Except that in Arabic they are little endian, while in Latin script they are big endian: we write them the same, but the Arabs read them from right to left, and we read them from left to right. To clearly dissociate from Arab culture, we should reverse the digit order and reject doing it "the Arab way". This of course pinpoints another problem: by trying to be different from the Arabs, by writing numbers the other way around, in another sense it makes us more like them! They do it little endian, and we change to little endian, too... Hmmmm....
-
The real goal is to switch everybody to yyyy-mm-dd (e.g. 2017-08-07).
1. It contains all the needed information.
2. It's good for sorting because it goes bigger --> smaller.
3. No one would get confused over which number means what (bigger --> smaller principle).
4. It's already used by 1.5 billion Chinese, and possibly in other places in Asia too.
While we're at it, we should also switch to a 24h clock (basically for the same reasons). If you like it, use it in your daily life! You might still want to use mm/dd/yy on your IRS reports, but other than that, use it whenever you can, and by god we will make the world unite around this one proper date/time format.
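The sorting point needs no date library at all; because the fields run from most to least significant, plain string order is date order. A quick sketch (Java, illustrative):

```java
import java.util.Arrays;

public class IsoSorts {
    public static void main(String[] args) {
        String[] iso = { "2017-08-07", "2017-07-06", "2016-12-31" };
        String[] us  = { "08/07/2017", "07/06/2017", "12/31/2016" };

        Arrays.sort(iso); // lexicographic order == chronological order
        Arrays.sort(us);  // lexicographic order scrambles the chronology

        System.out.println(Arrays.toString(iso)); // [2016-12-31, 2017-07-06, 2017-08-07]
        System.out.println(Arrays.toString(us));  // [07/06/2017, 08/07/2017, 12/31/2016]
    }
}
```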
-
W∴ Balboos wrote:
Big Endian, I think, is surely the way to go for dates.
In my own writings I always use big-endian dates, always with a 4-digit year, and may change e.g. the naming of files received to suit my preferences. I also move the date ahead of any descriptive term, so that it starts the file name / table entry / whatever. Main reason: it allows sorting on the date (which is my most common sorting criterion) as text. But I have a slight feeling of being somewhat nerdy when I do so. Humans can sort the dates as they were originally written; my rewriting is for the machine, not for humans.

And, I must admit, I frequently do not do it that way for other kinds of data. My address book is sorted not big-endian but little-endian. If someone asks me for my birthdate, I state it in little-endian form - and that is a date. Time of day is usually little endian ("ten to nine" - "eight fifty" sounds like something from an army guy). Friends are named by their first name preceding their family name. Little-endianness is, in a way, user friendly in that it focuses first on the nearness, and then gradually puts things into a bigger scope. Big-endianness requires you to start with the universe and narrow down from there, step by step - otherwise, it might be ambiguous. If you make a new friend, telling him where you live may be limited to giving the street name and number; the town, county, state, nation, continent, planet and galaxy are implicit. So, little endian may be more user friendly.

Actually, you have a similar issue in programming! In most programming languages, the opening of a statement may identify it as an assignment ("X = ..."). But after the assignment operator, which gives you the Grand Overview of the statement, you dive deep into the details of the expression, with priority rules en masse, some of which are so obscure that you have to ignore/override them by use of parentheses. The APL language is 100% consistent: no priorities, main things first, and if you want details, continue reading. In "X = 3 * ...", X is being changed; that is the essential thing. It is being set to 3 times some calculated value, no matter how it is calculated; in most cases, the 3 has high semantic importance (e.g. number of units bought). If you want the details, read on to break the expression up. If you only need an overview, you can read only the first parts of the statements. And I have worked with languages going the other way (but with operator priorities): "(A+4) * B =: C" - first assemble the pieces, then tell what to do with them (i.e. storing in C).
-
ISO 8601 or go home!
-
First off, I'd answer your actual question (do devs have a UI responsibility?) with a resounding "yes". As to why US companies insist on forcing the rest of the world to use an illogical date format that practically no other country uses? I think it's the same underlying reason why we get poor UIs in general: wrong mindset.

As a dev, your job is to write software for **your users**, not for yourself/your ego. This is something a lot of developers seem to forget/aren't aware of. That means putting yourself in your users' shoes and designing software that meets your users' functional requirements while also being a joy to use. Poor UI design is generally the result of only the first part of this equation being deemed important: meeting the functional requirements. The ease of use is an afterthought, and you can guarantee the dev never actually used the UI in a real-life situation (they might, if you're lucky, test a few boundary cases).

A prime example of this was the drop-down list in Outlook years ago for choosing the year of birth for a contact. Not only was this a drop-down list with 100 entries or so (one per year), the default value was the current year at the top of the list. Now if they'd tried entering the data of a real person or two, they would immediately have realised that none of their users will have an address book full of babies born this year, and that selecting something like "1967" from a massively long list is a UI fail. But if all you test is "date in the past, today, date in the future", you won't ever see the massive design flaw. MS fixed this in a subsequent update.

And this mindset - whereby people find it difficult to empathise and put themselves in other people's shoes - is also the reason for these weird date formats nobody (apart from Americans) wants. As a programmer, you must know that there are different date formats (you should at least know ISO and your local format, if it's not ISO). So it can't be ignorance; it can only be an unwillingness to put yourself in your users' shoes. Because if you did, you'd immediately understand that no one wants to use an unfamiliar and confusing date format. If your mindset is "we don't care about anyone else", your software suffers as a result. And it won't only be the date format that suffers.
-
Chris Maunder wrote:
Somewhere a programmer decided to output the date this way.
I have another thought about this, on this surprisingly long thread. The US date is written the way we say it: August 7, 2017. I don't know all of the Europolyglot, but I believe a German date would be 7 August 2017. These are how they are spoken. Now, just transform them to a number format whilst maintaining the order natural to the reader. At least to begin with; conventions and adaptations have followed suit.

Consider: even the metric system could be recast with a different-size 'meter', and everything else calculated to remain the same relative to the alternate size (rename them if you wish). But it isn't that way. Only one base-ten system was adopted by (so far as I know) all parties using it. This desire for convention is relatively new. Think of what is now the ultimate example of adopting convention: the Euro. With its particular value. Not a value with any real symbolic significance (or perhaps trying to be similar to the US dollar in value!).

So - unless it's from a purely logical standpoint (YYYYMMDD), which is useful for sorting, or the internal representation we so love (which, you recall, is actually FP values - dates like in Star Trek), the date is for a human-readable form (UI, remember!). Honestly, doesn't it make sense to write it the way the reader will naturally say it?
"The difference between genius and stupidity is that genius has its limits." - Albert Einstein
"If you are searching for perfection in others, then you seek disappointment. If you are seek perfection in yourself, then you will find failure." - Balboos HaGadol Mar 2010
-
One school of thought says that I should use the default format, because the platform should take it from the OS preferences. In addition, if those are wrong, it is not my fault... right? Well, no. I should take the default date format as another external input, and as with any external input, I should not trust it. Thus, I decided a while ago to use the one and only true date format standard: ISO 8601. For the date, that would be yyyy-MM-dd...

...Unless the client asks for something else, that is. They rarely do. Why may they ask for something different? Here are a few possible reasons:
- The software needs to interact with software made by a third party that expects the default date format.
- The above can happen for legal reasons. For instance, in my country there is a file format designed by the government that uses dd/MM/yyyy. At least it doesn't use two digits for the year.
- The client wants a date format that spells out the month name. Translating those is another issue, so at this point I will probably be using a library that supports this, instead of the default format functions.
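One way to keep "ISO by default, something else only if the client insists" honest is a single definition point, so a client's dd/MM/yyyy request becomes a one-line change. A sketch (Java; the class and its names are hypothetical, not from any library):

```java
import java.time.LocalDate;
import java.time.format.DateTimeFormatter;

// Hypothetical app-wide date-format holder: all screens and reports call
// render(), so swapping in a client-requested pattern touches one line.
public final class AppDateFormat {
    // Default: ISO 8601. Replace with ofPattern("dd/MM/yyyy") only on request.
    private static final DateTimeFormatter FORMAT = DateTimeFormatter.ISO_LOCAL_DATE;

    private AppDateFormat() {}

    public static String render(LocalDate date) {
        return date.format(FORMAT);
    }

    public static void main(String[] args) {
        System.out.println(render(LocalDate.of(2017, 8, 7))); // 2017-08-07
    }
}
```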
-
I disagree with the foundation of your postulate. I think it's our job, as developers, to remove as many barriers to understanding as possible, but ambiguity exists in the context of the interpretation, not in the processing and presentation of the data. The example that you cite is almost the perfect example: there is a cultural difference between you and the developer of a piece of software. If providing services to your cultural norm is not part of the system design, that UI designer would be doing a BAD job by formatting for Canadian norms. A team member that likes to jump on the "added features" train is a liability; I know this too well as I'm often that guy and need to get slapped with YAGNI periodically. If you are the target market, ergo formatting for your culture (or just general internationalization) is part of the system spec, then what you have is a badly designed product and it might be time to look to alternatives. Honestly, though, kvetching about cultural differences is counterproductive and useless. Especially when you're all wrong and yyyyMMMdd does the best job of removing ambiguity ;P
"There are three kinds of lies: lies, damned lies and statistics." - Benjamin Disraeli
-
Why Do Americans Write Dates: Month-Day-Year? - YouTube[^] TLDW: No one knows but we've been doing it since colonizing NA. Personally I only usually care about the month and current day of the week. The specific day and year are largely irrelevant day-to-day. Maybe that's why? Just a guess.
If that's the case, when and why did the UK switch to day/month/year?
Did you ever see history portrayed as an old man with a wise brow and pulseless heart, weighing all things in the balance of reason? Is not rather the genius of history like an eternal, imploring maiden, full of fire, with a burning heart and flaming soul, humanly warm and humanly beautiful? --Zachris Topelius Training a telescope on one’s own belly button will only reveal lint. You like that? You go right on staring at it. I prefer looking at galaxies. -- Sarah Hoyt
-
I've always thought that the entire world was wrong. In my mind the only reasonable expression would be YYYYMMDD. That should suffice for the next roughly 8k years, and by that time I really don't give a rip!
-
These days, with more self-taught programmers in the market, it is no surprise to find anything UI or GUI ignored. My background is as a Mac programmer, and they had an entire manual on user interface design. Buttons were to be x pixels wide and y pixels tall. Today, finding even evenly spaced buttons is a joy. I feel responsible not only for making the GUI make sense, but for making the user experience as easy and intuitive as possible. Most projects do not allow me time to go back and "clean up", so extra effort has to go in at the beginning to set all the proper properties and use the correct widgets.

Back to dates: since I deal with a lot of vendors and different business sites, my default is YYYY-MM-DD, with YYYY-MMM-DD for monthly reports to differentiate them from daily or one-time ones. A side advantage of this is that daily or one-time reports sort properly in the directory. I have found that projects dealing with multiple vendors and multiple departments quickly adapt to this and I do not have to enforce it. That makes the project run smoother and interactions more friendly. It is the unknown, and feeling like you aren't getting requirements, that usually cause friction, so it is a proactive effort to get everyone speaking the same language and feeling comfortable.
-
Yeah, I liked that Oracle defaulted to dd-MMM-yy, and we used that a lot. Over the years, I've come to prefer some variation of YYYYMMDD and use that as time stamps for files, etc. I find it sorts nicely, is totally unambiguous, and when combined as YYYYMMDD_HHNNSS it still sorts, and moves the ball forward. But don't get me started on AM/PM... Who thought of that? And what were they thinking? And timezones, and daylight saving time. Obviously not a lot of computer planning went into any of this, when the first PC could not represent a date before 1/1/1900, lol. Finally: where is the metric system when you need it? There should be 10 seconds per minute, 10 minutes per hour, 10 hours per day, and 10 days per month, etc... How much easier would life be then?
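That stamp is a one-liner with any format API; a sketch in Java (the file-name scheme is just an example):

```java
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;

public class FileStamp {
    public static void main(String[] args) {
        // yyyyMMdd_HHmmss: widest field first, so a plain alphabetical
        // directory listing is also chronological order.
        DateTimeFormatter stamp = DateTimeFormatter.ofPattern("yyyyMMdd_HHmmss");
        String name = "backup_" + LocalDateTime.now().format(stamp) + ".log";
        System.out.println(name); // e.g. backup_20170807_142530.log
    }
}
```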
-
First, you have to know your audience. The software I develop is only used in the US, so 06/07/2017 is unambiguous. That being said, it could be improved. Why shouldn't we say June 7, 2017, just in case we ever get a ROW contract? To answer your question, UI is our responsibility in a similar way that good design principles, good data structures, etc. are our responsibility. Part of our role is to be consultants to the business people and point out these kinds of problems when we think of them. In the same way we might raise issues with how data actually is related when gathering requirements (for example), by asking deeper questions during the requirements phase based on our experience, we should raise these kinds of issues at that time. As for this specific issue, though, just having a software standard seems like a good idea: dates should be displayed in one agreed, unambiguous format, whatever works for you.
-
I prefer some variation on yyyy-MMM-dd, although for some cultures it may be yyyyy-MMM-dd or even yyyyyy-MMM-dd. But that's because I like easy sorting, something that is easy to do with a computer but not so easy in a paper ledger. If you know that the application is going to be used on a system that supplies a default date formatter, always use that. If the user doesn't have it set to what they like, at least it will be consistent with most of the other software the user uses. Or you can work in an industry that specifies the format that everybody has to follow. In my case that is ddMMMyy or ddMMMyyyy, both of which are a pain to sort if all you have is text. :(

But to answer the question... It depends. For a legacy UI that you don't have time/budget/permission to recode and regression test, stay with the same format of data display. Changing it for your piece will generate user irritation because it is different, or will make them irritated with the older portion because it's not as nice as the new part. For new code, use system defaults. Maybe add a section to the documentation about setting the system date format. Of course that has its pitfalls as well.
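When all you have is ddMMMyyyy text, parsing it back to real dates is the usual escape hatch for sorting. A sketch (Java; parseCaseInsensitive covers the all-caps 07AUG2017 style such feeds often use):

```java
import java.time.LocalDate;
import java.time.format.DateTimeFormatter;
import java.time.format.DateTimeFormatterBuilder;
import java.util.Comparator;
import java.util.List;
import java.util.Locale;

public class SortDdMmmYyyy {
    public static void main(String[] args) {
        // Case-insensitive so both "07Aug2017" and "07AUG2017" parse.
        DateTimeFormatter f = new DateTimeFormatterBuilder()
                .parseCaseInsensitive()
                .appendPattern("ddMMMyyyy")
                .toFormatter(Locale.ENGLISH);

        List<String> raw = List.of("07AUG2017", "28Feb2017", "31DEC2016");

        raw.stream()
           .sorted(Comparator.comparing(s -> LocalDate.parse(s, f)))
           .forEach(System.out::println); // 31DEC2016, 28Feb2017, 07AUG2017
    }
}
```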
-
Yes, we have a responsibility when creating a UI. You seem to be forgetting what the U stands for. When people talk about dates, people will say "June Seventh" way more often than they say "Seventh of June". Also, when presented with a list of dates, having things in mm/dd format makes them easier to compare at a glance. If you are designing middleware, you can be as logical and unambiguous as you please. But if you design a UI and prioritize your personal sense of logic and order above what the users feel is comfortable and familiar, your design will be a failure.
-
Yeah. Just try to implement dd-MMM-yyyy in many places, only to have the [non-technical] "stakeholder" go 'Why is the date all f!@#$% up? Go fix that. Our stupid users won't unnerstand.' Yes, Biff - going, Biff....
-
It's almost certainly a matter of month/day being most often used, with the year added on optionally. Consider the MM/DD portion of the date to be day-of-year. It's big-endian, which isn't bad in itself. Appending the year afterwards vs beforehand is normal in speech, but when you write it, it appears that you're now using little-endian: day-of-year/year. It's awkward, but not crazy. If you think that's crazy because mixing little-endian with big-endian is crazy, consider that DD/MM/YYYY is already mixing little- with big-, since the actual digits we write are big-endian. (I prefer YYYY-MM-DD, myself, which is all big-endian and fully consistent, but the American way isn't illogical, per se.)
Jesse