I use the word "lobotomized" a lot, e.g. "WinCE is a lobotomized implementation of Win32" or "the Celeron is a lobotomized version of the Pentium 3." I often wonder, especially as the years pass, how many people actually have any idea what this means. And oftentimes I wonder what kind of antisocial behavior WinCE, the Celeron, etc. engaged in to deserve a lobotomy!
-
More academic idiocy
There's no idiocy at play here. I've lived in Louisiana, and people are happy. I think this is because they have meaningful relationships that transcend their wealth, physical attractiveness, etc. If you meet a Cajun, there is probably someone, somewhere who really loves that person regardless of their income, bra size, etc. That's less true of more advanced areas.

My family lives all over the world, so I see both sides. Often, I wish I was part of a real community with a real culture like Louisiana. When my family gets together, it's cordial enough, but there's no real connection. We're just a grouping of self-interested individuals battling their narcissism with advanced degrees, plastic surgery, etc. No one's ever gonna get up and spontaneously dance, and the minute one of the group falls off the straight-and-narrow, they get ostracized.
-
Is there a site like this one for electronics projects?
OK - I have written about 15 pages complete with diagrams, and I doubt I am even 50% done. So I think this is too much, and am considering a change in strategy.

Currently, my article is broad. It has a basic explanation of the formats I use to present data, i.e. electronic schematics and the "rat's nest" diagram. There are passages of text and graphics devoted to explaining DC circuits, transistor-based logic, and so on. I can see this developing into a fairly valuable 40-page summary of "Computer Engineering for Computer Programmers," but (knowing quite a few programmers) I am not sure this will be considered exciting. I do not want to post anything that will attract "1" votes.

I have also considered making a purposeful effort to re-frame the article along the lines of a "Minimalist DMA Example." This might even entail modifications to the circuits presented. I can envision ways to replace transistors with pull-down resistors. This would achieve the same result, but with a loss of generality, extensibility, etc. This could be posted here, or in another forum, but in any case the overall thrust of the article would be different.

There is also a Yahoo group for the ET-3400, and I am considering posting there as well. The main obstacle is that I am less familiar with the format.
-
Replacing Bytes in an Office 2003 Word Document
jonegerton wrote:
Unfortunately this has to work on thousands of documents (corporate rebrand!)
So I guess you have been tasked with replacing all occurrences of "Tiger Woods" with "Tom Watson" or something along those lines? I tried to make something like that work years ago, i.e. something that tried to re-create the Word "save" logic at a low level. My recollection is that somewhere in the file there is a field holding the length of the data or (less likely) a checksum. I thought I was adjusting that properly, but I never did manage to create "valid" Word documents. Apparently there was some other checksum somewhere that I did not know about.

Eventually I ended up doing the job with automation. I managed to work through the message-box-related issues... I think there are ways to detect the error condition and kill Winword.exe. In the worst case, you could just assume an error occurred after a certain length of time. None of this is beautiful, but in the end it proved more workable than manually messing around with the file. And I did try mightily to make that work... I was just out of college, had been immersed in a thesis that used Intel assembly, and the low-level approach was definitely the one I preferred.

One more thought: the "DOCX" format of Office 2007 is much more regular and well-documented than the old melange of DOC formats. I think a DOCX is basically a zipped-up collection of XML documents and embedded image files. Have you considered converting to DOCX as the first step of the process? It might make your life easier.
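Incidentally, the zip trick is easy to demonstrate. Here is a minimal C# sketch (assuming .NET 4.5's System.IO.Compression; the file name and search strings are placeholders, not anything from your project). Note that a naive string replace can miss phrases that Word has split across multiple XML runs:

    using System.IO;
    using System.IO.Compression;

    class DocxReplace
    {
        static void Main()
        {
            // "report.docx" and the names below are placeholders for illustration.
            using (ZipArchive docx = ZipFile.Open("report.docx", ZipArchiveMode.Update))
            {
                // The main body text lives in this entry of the zip.
                ZipArchiveEntry entry = docx.GetEntry("word/document.xml");

                string xml;
                using (var reader = new StreamReader(entry.Open()))
                    xml = reader.ReadToEnd();

                // Naive replacement; Word sometimes splits a phrase across
                // several <w:t> runs, in which case a simple match misses it.
                xml = xml.Replace("Tiger Woods", "Tom Watson");

                // Rewrite the entry in place (Update mode gives a seekable stream).
                using (Stream stream = entry.Open())
                {
                    stream.SetLength(0);
                    using (var writer = new StreamWriter(stream))
                        writer.Write(xml);
                }
            }
        }
    }

No length fields, no mystery checksums - the zip container handles all of that for you.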
-
New monitor [modified]
The post-CRT monitors seem to fail pretty frequently, in my experience. I've used Acers, Samsungs, and something labeled as a Dell. The 15" square Samsungs in particular seem to have a habit of quietly going dark after a couple of years. I think part of the problem is that the overall unit is simply lighter than the CRT was, and thus fares comparatively poorly in any collisions. The 15" models obviously suffer even more from this problem.
-
Is there a site like this one for electronics projects?
Agreed about the Microchip forums... those guys are really helpful, and willing to give advice, even to a "noob." That is a big deal for me.

I have been asked why I don't use the Atmel AVR chip, which does many of the same things as the PICs, but supposedly with a more modern, less awkward architecture. A big part of the reason I avoid Atmel is that I've been really turned off by the attitude that their proponents seem to have... it's almost a religious anti-PIC zeal. Using a PIC, on the other hand, there is a wealth of application notes, starter kits, etc. available to ease the learning process.

Since there seems to be no big objection, I probably will post my article(s) here. If I post articles for all my PIC, CoCo, and ET-3400 projects, then taken together these would form a kind of "Computer Engineering for Programmers" course.

What I would really like to see eventually is an open-source, grass-roots effort to build a clean-sheet alternative to Wintel and the Internet. There's too much rotten wood in the current system. The Intel instruction set and Unicode are the two examples that pop to mind. When do we ever get a real "version 2.0"? Never?
-
WiX - Are you kidding?
For a programmer, the installer is necessarily an afterthought compared to the product itself. If the programmer is goal-driven, he probably won't want or need anything elaborate just to make an installer. WiX does a good job of getting out of the programmer's way, and it's thus a very natural approach for a programmer.

But if your employer has decided they want one of those slick installers with its own design language, theme music, etc., I don't think WiX is the answer. Such things can be written in a general-purpose language like C or C#. Or maybe some of the custom InstallShield-type programs offer this sort of thing now. They probably do, but they're also expensive and not, in my opinion, any easier to deal with than plain old C or C#.

Incidentally, what you're describing sounds like a bit of a quagmire. There has probably already been some back-and-forth between management and the developers about their "ugly" installer. No doubt the developers feel they've provided what's necessary, and management must be perplexed by their inability to make simple, cosmetic changes. You'll be jumping right into the middle of that, and you'll be without the technical ammunition to defend yourself... you can draw pictures, and the programmers can make installers, but it doesn't sound like either you or they are equipped to really address the issue of "installer ugliness."

Personally, I think installers should be as close to invisible as possible. The use of music, a custom design language not seen anywhere else, etc. tends to piss me off when I'm installing a product. This is especially true when the product is buggy, expensive, or functionally incomplete. For example, Microsoft has time to include custom photos of dorky ASP.NET coders in the Visual Studio installer, but I have been dealing with the same bugs in Visual Studio for the last 10 years. It's a question of priorities, and I don't like their answer.
modified on Wednesday, December 9, 2009 11:37 AM
-
Electricity usage
No, that's not true, at least not to any significant extent. In comparing new and used printers, I would consider such things as lifespan, likelihood of failure, maintenance cost, etc. But the difference in power consumption will be negligible.
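To put a rough number on it, here's a back-of-the-envelope C# sketch. The wattage, duty cycle, and rate below are assumed figures for illustration, not measurements of any particular printer:

    using System;

    class PrinterPower
    {
        static void Main()
        {
            double extraWatts = 30.0;            // assumed extra draw of the older printer
            double hoursPerYear = 8.0 * 365.0;   // assumed: 8 hours a day, year round
            double ratePerKWh = 0.12;            // assumed rate, $/kWh

            double kWhPerYear = extraWatts * hoursPerYear / 1000.0;  // 87.6 kWh
            double costPerYear = kWhPerYear * ratePerKWh;            // about $10.51

            Console.WriteLine("{0:F1} kWh/yr, about ${1:F2}/yr", kWhPerYear, costPerYear);
        }
    }

Ten dollars a year is lost in the noise next to a single toner cartridge or service call.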
-
Is there a site like this one for electronics projects?
I may still post my project here. It does eventually lead to a compiler and an associated app framework. The language is a purely functional, applicative-order language like Scheme, and the framework supports half- and single-precision real-number data types. Most of this - including the master copy of the framework - is written in C. For the framework, this is "if-then-goto" style C which is designed to be hand-assembled onto any platform and emitted by the compilers. Parts of this have been ported to the ET-3400, Microchip PICs, and the Tandy Color Computer II.

Also: do you mean that you used Microchip PIC chips in your railroad project? Those have been another big area of interest for me. It is very easy and cheap to program a PIC chip these days. And I was surprised at how well my language and library worked with the PIC (which is not considered very compiler-friendly). For example, on the 16-series PICs, the hardware stack is limited to seven return addresses. But with a compiler that performs tail recursion optimization, it is possible to calculate things like 33 factorial, which conceptually involves well more than 7 nested function calls. I like this! (A sketch of the idea appears at the end of this post.)

Also, if anyone thinks that all of this is a colossal waste of time, then please let me know. I did not realize that people were doing basically the same thing 30+ years ago. Maybe that should be a wake-up call...?
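For the curious, here is the shape of that factorial in a C# sketch (C# itself does not guarantee tail-call elimination, so treat this as an illustration of the pattern, not PIC code). Because nothing remains to be done after the recursive call, a compiler with tail recursion optimization can reuse one stack frame instead of pushing a return address per call:

    using System;

    class TailFactorial
    {
        // Accumulator-passing factorial: the recursive call is in tail
        // position, so a TCO compiler turns it into a loop. That is how
        // 33! can fit on a PIC with only seven hardware return slots.
        static double Fact(double n, double acc)
        {
            if (n <= 1.0) return acc;
            return Fact(n - 1.0, n * acc);  // tail call
        }

        static void Main()
        {
            // 33! is about 8.68e36, far beyond any 64-bit integer, so an
            // approximate floating-point type stands in here, much like the
            // single-precision reals in my framework.
            Console.WriteLine(Fact(33.0, 1.0));
        }
    }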
-
Is there a site like this one for electronics projects?
I have been working on an electronics project, and I want to post it somewhere in a format similar to the articles here at Code Project. The project essentially shows how to design memory circuits for the Motorola 6800 series of CPUs (6800, 6809, etc.). My work used a Heathkit ET-3400, which is a rudimentary computer that exposes its key circuits on a breadboard. This thing is almost 30 years old, but it is still fairly widespread on eBay, at colleges, etc. I got mine off eBay for about $70 quite recently.

My educational background is in Computer Science, and in writing the article I would assume that same educational background in my readers. I also have some software-related articles involving the 6800 and the ET-3400 which I could post as a sequel.

So, I initially considered posting this memory project here. However, in searching article categories, I get the sense that Code Project articles as a whole reside at a very high level of abstraction. Even in the "Systems" categories, I don't find much machine language, assembly language, or even C. This high-level focus would, I think, make my article seem out of place here. So, does anyone know of a similar site for computer / electronics engineering?

Also, I was a bit surprised at the results of my article searching. This is a "code" site, to be sure, but must that mean user-mode, garbage-collected software only? Is there not a "code" site where one might see analogous articles for C and assembly language? If not, why isn't anyone at all interested in doing this sort of thing on a not-for-profit basis? There seems to be a profusion of high-level application software being coded in the public domain, and I feel that the worlds of systems programming and computer engineering ought to have something similar.
-
boots on the ground
I couldn't hear that without thinking about Dora the Explorer. "He will be a Boots" LOL ... i.e. an animated character incapable of speech outside a few words of broken Spanish?
-
Scott Guthrie
When you have Scott Guthrie and an elite business school vectoring toward the same point, it doesn't take much out-of-the-box thinking to see what will happen when they collide. It's a perfect storm for jargoneering (that's a buzzword of my own, short for "jargon engineering"). The result of this horrible experiment-gone-wrong can be found at http://knowledge.wharton.upenn.edu/article.cfm?articleid=1920 .

And at the end of the day, jargoneering is not rocket science. Just forget about parts of speech or providing a useful service, and master the misleading metaphor. Here are a few of the key takeaways from Scott's little talk (webinar?):

"The second way we will monetize is by having a connection with customers who are building these types of experiences."

"The mobile space is interesting. There's the technical of how you get the software built for it." (Even the interviewer couldn't stomach Scott's innovative use of "technical" as a noun, and the transcript actually inserts "[issue].")

"Obviously, we have a lot of apps that we build, not just in the developer's space, but in the knowledge productivity space and the enterprise space." ("Obvious" and "meaningless" are apparently synonymous now.)

I was over in the "knowledge productivity space" the other day and it smelled like someone spilled coffee or something under the refrigerator.
-
It's bizarre, but
That's one of my favorites. It's still silly and uncommon enough that it doesn't make me angry so much as it makes me bemused... I mean, you really have to be an e-Fanboy power-user of Wannabe 2.0 to say "webinar." Sometimes I use it for humorous effect, e.g.:

Enterprisey Architecty Type: "Have you played with Expression Blend yet? Dude, it's totally sick!"
Me: "Yeah, bro, we should totally e-learn some of that. Maybe we could have a webinar..."
Enterprisey Architecty Type: "Totally! I'll run your webinar idea up the flagpole and see who salutes."
-
Breakpoints in Wincore.cpp
I have been trying to correct an assertion failure in an MFC app for the last few days. It's been really fun... while the rest of you have been doing boring things like riding your outlandish motorcycles or bar-hopping with your chubby girlfriends, I get to spend some real quality time trying to comprehend things like "___Afx_4_U_T_q_HowDoesThisEvenCompile_".

Anyhoo... an assertion fails somewhere in Wincore.cpp (a local called pMap comes up as 0, implying a call from the wrong thread, I am thinking). When the failure occurs, Visual Studio breaks into Wincore.cpp (which it finds and opens) and does a good job of debugging. It shows a call stack, "Watch" and "QuickWatch" seem to work, etc.

The problem arises when I try to do some manual editing to Wincore.cpp, or even just set a breakpoint in it. It quickly becomes obvious that even though Visual Studio itself found and opened this file just a moment ago, it is not actually using the copy of Wincore.cpp that I am seeing as part of the build process. So I can make garbage edits to it, do a "Rebuild All," and still get no errors. This seems completely wrong to me and provokes my ire at a very fundamental level. I mean, I guess Microsoft and I have different definitions of terms like "compile" and "source code."

When I try to simply set a breakpoint in Wincore.cpp without changing it, the breakpoint's red dot indicator goes "hollow" at runtime. When I hover over the breakpoint, Visual Studio tells me that the source code is different from that used in compilation. If I check "allow location to be different" in the breakpoint properties, then the breakpoint gets hit, but Visual Studio claims to have no source code. It prompts me to browse to the original source code. Strangely, there is a text field in this dialog showing where the source code originally resided, and this is some path on the F: drive (which, in my case, is a removable flash drive that isn't even present in the system). I'm not sure how much more "original" I can get than the file I'm TRYING to use, which is in a subfolder of C:\Program Files\Microsoft Visual Studio 2008\ and was opened up for me by Visual Studio. Am I expected to e-mail Raymond Chen for the copy on his hard drive, perhaps?

Generally, I don't need to edit Wincore.cpp, and I would never do so permanently. Also, I suppose I haven't had much experience setting breakpoints in that file. However, I think that what I am trying to do should be possible, and I think that Visual Studio's behavior is very misleading. Am I missing something?
-
Recommended Reporting components
For mostly alphanumeric reports, I suggest you use the regular IO functions of your language to generate HTML. This can be viewed and printed in the browser. I wrote a Windows app that generated invoices that way, and that part of the system turned out really well (a minimal sketch appears at the end of this post). Obviously, this is also a workable approach for a web-based app. If you're trying to do something with dynamic graphics (charts, graphs, etc.), the HTML approach won't work so well. But I still advise you to - as much as possible - just stick with the built-in capabilities of your main language.

As for dedicated reporting tools, I've used several and I can't recommend any of them. The two with which I have the most experience are Brio (a.k.a. SQR) and Crystal Reports. Brio was tolerable, but I never really saw the point. When using it, I never got the sense that it was making me more productive compared to, say, STDIO.H or IOSTREAM.H. Crystal Reports is just bad... it's difficult to install and deploy, overly abstract and graphical, bloated, and unreliable. The HTML-based approach I suggested above actually grew out of a failed attempt to stand up a Crystal environment for the project.

If you absolutely must use a dedicated reporting tool, I definitely suggest you avoid the slick-looking "Business Intelligence" tools that seem to be in vogue. Examples are Cognos and Business Objects. I think these Business Intelligence companies survive by taking a slick sales appeal straight to management ("here's how you can dynamically create slick-looking reports from your desk without any tech skills..."). Managers seem to think reports should be easier to make and should look better... frankly, they're wrong. Reports are application programs and will always suffer from the difficulties associated with application programming.
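As promised above, here is a minimal C# sketch of the HTML approach. The file name and line items are hypothetical, and nothing beyond the standard library is used:

    using System;
    using System.IO;

    class InvoiceReport
    {
        static void Main()
        {
            // Hypothetical line items for illustration.
            var items = new[]
            {
                new { Name = "Widget", Qty = 2, Price = 9.95m },
                new { Name = "Gadget", Qty = 1, Price = 24.50m }
            };

            using (var w = new StreamWriter("invoice.html"))
            {
                w.WriteLine("<html><body><h1>Invoice</h1>");
                w.WriteLine("<table border=\"1\"><tr><th>Item</th><th>Qty</th><th>Price</th></tr>");
                foreach (var item in items)
                    w.WriteLine("<tr><td>{0}</td><td>{1}</td><td>{2:F2}</td></tr>",
                                item.Name, item.Qty, item.Price);
                w.WriteLine("</table></body></html>");
            }
            // Open invoice.html in any browser to view or print.
        }
    }

The browser handles layout, viewing, and printing for free, which is most of what a reporting tool is selling you.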
-
The Horror... [modified]
I disagree with your attitude about the class. There are situations where it's appropriate to use a 3-tier architecture, or to wrap public members in accessor methods. But you need to know how to do things the simple way before you can start thinking about these things. And there are plenty of situations outside class where the "right" way is actually wrong. There are still many environments targeted by various C / C++ compilers where space is quite limited, and in those cases we must not introduce architectural features that don't add new functionality.

As for the 3-tier architecture, it has no place whatsoever in a class assignment or in any other small program. If you're designing a system which is reasonably complex, promises to have a long lifetime, and has identifiable layers, then there may be economic benefit to thinking in terms of interchangeable tiers. But for a class assignment, it's pure overkill... do you really plan on going back to your old homework code and, say, rewriting the GUI using a different architecture? If not, what possible rationale could you have for using a three-tier approach?

I run into this kind of thinking a lot professionally. Generally, I can sort through the quasi-architectural clutter that results, but this three-tier stuff really mucks things up in my experience. I'm not saying that it's a bad pattern, simply that it's overapplied. For example, if you ask one novice programmer to write a client module and another to write a server module, you're quite likely to get some sort of three-tier client running against a three-tier server. The nature of the assignment ("write a client" or "write a server") would seem to dictate a client / server architecture; but these meticulous fresh graduates simply cannot live with themselves if they don't "do things right" and use three tiers. As a result, the vast majority of the three-tier implementations out there are pure rubbish.

My general advice is to remember that code is just a means to an end. Beautiful code does nothing in and of itself. By overdoing one particular assignment, you're only distracting yourself from the other assignments and challenges you're bound to face down the road. This is the sort of thing that is taught in Economics class, which is a great thing for a programmer to take, in my opinion.
-
Waterfall vs Agile and Project Estimates
I would give the same estimate for a given project regardless of methodology. When I give an estimate, I am essentially saying "I think this will take N days if done using reasonable development techniques." These do not have to be the exact development techniques used previously. Methodologies evolve, and what you're describing is the kind of incremental adjustment in methodology which is assumed to be going on all the time in any good shop. Hopefully you will get faster over time, and the estimating model should always be updated as discrepancies are observed, but there's no immediate need that I perceive for you to adjust it right now.

Also, I do not think I have ever seen or heard anyone claim to use the "waterfall method." It's considered a pejorative term these days, almost like saying "my coding style is spaghetti" or "our team's style is garage hacker." When you say "my estimating technique works for waterfall," you're basically saying it's an estimating model for bad techniques.

Finally, I think Agile is much better than waterfall, or (as proponents of Agile might say) much better than BDUF (Big Design Up Front). I don't think there's much value anymore in the style (call it waterfall, BDUF, or just mid-90s orthodoxy) in which the architect types spend weeks or months dicking around with object hierarchies, UML, etc. before coding ever starts. That time almost always ends up wasted, in my experience. In the absence of code, the architects don't have any real basis for their decisions. Programming instructors are quite wise when they implore us to use natural language, pencil and paper, diagrams, etc., but I think many of us in the 90s went too far in this direction. Also, I think people attempted to over-formalize good technique. What emerged from this effort was a bunch of simplistic, canned methodologies that isolated "design" into its own step at the beginning of the process, performed by an elite cadre of non-programmers. Hopefully we have left, or are leaving, this era!
modified on Thursday, December 4, 2008 4:57 PM