Git is an excellent tool for a very specific use case that 99.9% of projects don’t fall into. It’s great if you have a massive source base with many geographically dispersed contributors who don’t always have reliable Internet connectivity. Vastly overcomplicated, IMO, for anything else.
Myron Dombrowski
Posts
-
GIT Time again - what am I missing?
-
Why is a resource fork?
I honestly don’t understand how seeing a full path would help address either of the issues you describe. People generally organized projects by folder. If you wanted to just see the overall organization of a project folder, you would probably put the top-level folder in list view and then use the disclosure arrows next to each child folder to see all the contents in a single hierarchical list.

If you converted a document from one proprietary format to another and then needed to use the original application for it again, you’d obviously need to convert it back or keep a copy in the original format. That has nothing to do with the file system or the platform at all; it’d be the same on every other OS before or since. This really all sounds, as I suspected, like you saw something unfamiliar and just leaned into that meaning “bad”.
-
Why is a resource fork?
The thing to keep in mind is that the fork-based file system and the resource fork specifically were novel and very effective solutions to a couple of very real issues *in the early 1980s*. They should be viewed as an artifact of their time and not through the lens of the present. As others have noted in this thread, of course, fork-supporting file systems are now ubiquitous, but forks are not typically used to hold critical information these days. Just lose-able metadata.
-
Why is a resource fork?
While the Mac of course had paths, they weren’t really used by end users - or even typical code - prior to the release of OS X, so there wasn’t really much value in showing a full path as a string to the user. I’m kind of curious if you recall whether you were doing something that actually required it, or if it was just an abstract expectation you had because it was what you had always seen.
-
Why is a resource fork?
Going to start off with one correction: modern applications are not typically split across multiple forks and haven’t been for literally decades. I’m genuinely surprised you’re running into such applications at all in 2023. I’m also curious what tool you were using to manipulate the files that was breaking them, because Macs have always just transparently handled them.

The original Mac file system supported two separate chunks of data per file system entry. There was a data fork, which was an unstructured store mostly analogous to the single data stream most early file systems had, and a resource fork that held discrete chunks of data tagged with a type and an integer identifier. The resource fork served the dual purpose of simplifying the use of structured data and providing a means to help mitigate the extremely constrained systems of the day. Applications, for example, kept executable code in the resource fork in multiple chunks that could be loaded and unloaded at need (transparently to the coder) to fit within available memory, not unlike overlay files on MS-DOS.

But Mac applications haven’t typically been structured that way since the release of Mac OS X. So again, the fact that you’re running into them in 2023 is somewhat baffling.
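If you want to poke at one of these files yourself, modern macOS still exposes the resource fork through the special “..namedfork/rsrc” pseudo-path, so ordinary file APIs can see it. A minimal sketch in C (the file name here is made up):

```c
/* Minimal sketch: check whether a file has a resource fork on modern
 * macOS, which still exposes it as <file>/..namedfork/rsrc.
 * "example.rsrc-test" is a hypothetical file name. */
#include <stdio.h>
#include <sys/stat.h>

int main(void) {
    struct stat st;
    if (stat("example.rsrc-test/..namedfork/rsrc", &st) == 0)
        printf("resource fork: %lld bytes\n", (long long)st.st_size);
    else
        perror("no resource fork (or no such file)");
    return 0;
}
```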
-
Why is a resource fork?
They haven’t “changed” that because it was never true. The Mac’s equivalent to Windows Explorer is a process called Finder, and it has been the central part of the Mac operating experience since 1984. Unless you were in some sort of kiosk mode you shouldn’t have needed to do anything to find it.
-
Favorite way to categorize programming languages?
As an applied math major at an engineering-focused school back in the 80s, I actually had APL as a required 1-credit course. I kind of enjoyed it *because* it was a little arcane.
-
Favorite way to categorize programming languages?
Honestly, mostly I just think in terms of whether or not a given language is suitable for my current task.
-
"Coder" stock photosThere was a span of a good 15 years or so where it seemed like almost every time I saw a code snippet in marketing materials it was rendered in the Chicago typeface. Chicago, for the young’uns, is the original font used for system text like buttons and menus on the Mac. Aside from being not even close to monospace, it was designed *specifically* to be used as title text on low-res, 1-bit displays. Nobody ever used it for code without first uttering the sentence, “hey, this’ll be funny.” But it sort of entered the zeitgeist as “generic techie font” for a long while.
-
"Coder" stock photosThat, unironically, was my jam in the early 90s.
-
One-Way Programming
I get the sense that a lot of places - especially smaller ones - haven’t really understood and embraced the benefits of internal code reuse. Ideally you shouldn’t be writing code you’ve already written. Maybe your site isn’t achieving that ideal, but it’s a goal you/they should strive for. Happily, where I am, I’m in a position to foster that culture, and both the other coders and management are receptive to the idea.
-
One-Way Programming
PIEBALDconsult wrote:
We don't need help with that, we can do that easily enough ourselves
So you don’t create subroutines either, right? After all, if you know how to write the code you can do it again easily enough. Bonus: no function call overhead. The value in automating what you call low-hanging fruit is that it saves you time and reduces opportunities for error introduced through human intervention. And every solved problem immediately becomes low-hanging fruit. This is literally why libraries exist.
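To put that in code: a trivial hypothetical helper (names mine), written once, turns every later occurrence of the problem into low-hanging fruit, and a modern compiler will usually inline something this small, so even the call overhead evaporates:

```c
#include <stdio.h>

/* Solved once; every caller gets it for free from now on. A modern
 * compiler will typically inline a function this small, so the
 * "function call overhead" objection rarely survives optimization. */
static inline int clamp(int v, int lo, int hi) {
    return v < lo ? lo : (v > hi ? hi : v);
}

int main(void) {
    printf("%d %d %d\n", clamp(-5, 0, 10), clamp(5, 0, 10), clamp(50, 0, 10));
    return 0;
}
```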
-
Cosmetic vs More Efficient
Keep in mind that modern optimizers are *very* good. I wouldn’t assume that your two examples are actually going to result in different compiled code. Definitely lean toward readability.
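One way to check rather than assume: compile both variants with optimization and compare the assembly (cc -O2 -S, or an online tool like Compiler Explorer). An illustrative made-up pair:

```c
/* Two cosmetically different ways to double a value. With -O2, gcc
 * and clang typically emit the same instructions for both, so the
 * "more efficient" version buys nothing. Check: cc -O2 -S double.c */
unsigned doubled_mul(unsigned x)   { return x * 2u; }
unsigned doubled_shift(unsigned x) { return x << 1; }
```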
-
Easter Eggs
I’ve done a few, both in personal projects and (in a much more limited way) things I’ve done at work. The art in the latter case is getting it past code review without being called out.
-
How much coffee does one man need?
For me, any amount of coffee is too much. Never could tolerate the flavor. I generally get by with, like, zero to one cans of Coke per day. Not a huge caffeine consumer.
-
Well that's a new one...
Department of anti-crime seems an obvious one.
-
C declarations are half backward
The short answer is: because the language evolved. The longer, and actually informative, answer is the accepted answer to this post: pointers - Why does the arrow (->) operator in C exist? - Stack Overflow[^]
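The flavor of that answer in a few lines: the arrow is pure shorthand for dereference-then-member, and the “half backward” part shows up as soon as a declaration nests:

```c
#include <stdio.h>

struct point { int x, y; };

int main(void) {
    struct point pt = { 3, 4 };
    struct point *p = &pt;

    /* In modern C these two forms are exactly equivalent; the arrow
     * is a convenience left over from the language's early evolution. */
    printf("%d %d\n", (*p).x, p->y);

    /* The "half backward" part: declarations read inside-out.
     * fp is a pointer to a function taking int and returning int*. */
    int *(*fp)(int) = NULL;
    (void)fp;
    return 0;
}
```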
-
Would this pass code review where you are?
In four words: nope.
-
Why isn’t C# more popular?
Exactly the kind of rebuttal I would expect from someone who doesn’t have a lot of experience. Your pronouncement would be more defensible if you had written “somebody” didn’t do it right, but it’s not necessarily the someone who is writing code today, and the question of what exactly “it” is has a couple of potential answers.

It could be, for example, that a library author meant to conform to a specific predefined protocol and failed. Or it could be they were implementing something new and the documentation they provided is incomplete or incorrect. In especially old code, perhaps they *were* correctly following a known protocol but the protocol itself ended up redefined. Or one of my favorites: a library has multiple functions that accept an allocation as a parameter. Some consume the allocation and others just reference it, and there’s a convention to help you as the library user recognize which are which. But also there’s an old function that doesn’t follow the convention, its behavior is grandfathered in due to being used in existing systems, and the footnote mentioning this legacy deviation is cropped off the bottom of the photocopied documentation you were given.

I’ve run into all of those scenarios in large-scale production systems that I was trying to interface with. It’s easy to make a simplistic assertion that the only reason this is an issue is that somewhere, sometime, somebody did something wrong. You may be 100% correct about that. But you’re making the very point you’re arguing against.

Things like this absolutely happen, and in real life it is one of the most common sources of program misbehavior. We know from decades of experience that this *will* go wrong and that it *will* result in system instability and/or security exposures. So we can cross our fingers and hope, after all this time, as systems continue to increase in complexity, that coders as a population will become perfect at it - or we can automate this tedious, error-prone task for essentially perfect behavior today and let developers spend their time and energy on the real meat of their projects.
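To make the consume-versus-reference trap concrete, a hypothetical sketch (every name here is invented, not from any real library):

```c
#include <stdlib.h>
#include <string.h>

/* Invented library convention:
 *   *_take(...)  consumes the allocation; the library frees it.
 *   *_peek(...)  only reads it; the caller still owns it.       */

void queue_take(char *msg)       { /* ...enqueue, then... */ free(msg); }
void queue_peek(const char *msg) { (void)strlen(msg); /* borrow only */ }

/* The legacy trap: named like a borrow, but it actually frees its
 * argument - per a footnote that fell off the photocopy. */
void queue_log(char *msg)        { /* ...log, then... */ free(msg); }

int main(void) {
    char *a = strdup("hello");
    queue_peek(a);   /* fine: we still own a                  */
    queue_take(a);   /* ownership transferred; don't touch a  */

    char *b = strdup("world");
    queue_log(b);    /* looks like a borrow, actually frees b */
    /* free(b);        ...so this "obvious" cleanup would be
                       a double free                          */
    return 0;
}
```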
-
Why isn’t C# more popular?
A big part of the problem is that it’s not always clear who bears the responsibility for releasing the allocation. If you think otherwise, perhaps it’s you who need to consider alternate careers. Or prepare yourself for a big shock if you’re just getting started and have assumed it’s that simple.
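You don’t even have to leave the C standard library to see how the responsibility shifts call by call:

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(void) {
    /* strdup: the caller owns the result and must free it. */
    char *copy = strdup("hello");
    free(copy);

    /* getenv: the caller must NOT free the result; the pointer
     * belongs to the environment, not to you. */
    char *path = getenv("PATH");
    if (path) printf("PATH starts: %.10s...\n", path);

    /* strtok: returns pointers into the buffer you passed in,
     * so there is nothing separate to release at all. */
    char buf[] = "a,b,c";
    for (char *tok = strtok(buf, ","); tok; tok = strtok(NULL, ","))
        puts(tok);
    return 0;
}
```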