I think you totally underestimate the power of computability. Our brains follow physical and chemical laws that can be simulated by means of if-then conditions, for loops and mathematical calculations. Just take a look at the protein-folding simulations that can be carried out today. I think it's just a matter of having enough computing and networking power, as well as enough low-level understanding of how a real neuron operates, for the singularity to become a reality.
sgorozco
-
artificial inteligence is a myth!!! -
A more realistic tech-movie...And please don't forget the beeps and chirps the computer *must* emit while performing the enhancement routine ;)
-
A more realistic tech-movie...I agree completely with Orlin's last remark. After discovering how insanely hard it is to implement reliable audio capture on Android phones (the same code behaves differently on each of our 5 test phones), I respect more how Microsoft has managed to make Windows run reliably on such diverse hardware, not to mention how easy it is to find detailed technical information for its products all around the internet. :cool:
-
Do you think math people are the best programmers?Interesting question! I think that defining a "good programmer" is a very subjective matter. There are many valid, applicable metrics and they usually contradict each other. However, I believe that math-skilled programmers have a very strong advantage when it comes to implementing novel algorithms in certain fields like image processing and pattern recognition. Most academic documents I've seen detailing powerful algorithms in those fields are explained in a strongly mathematical fashion, and the gap between "the math" and "the code", at least for me, is sometimes abysmal - especially after I stumble upon a concrete implementation and see the code that was actually derived from the mathematical formulations described in the paper. ;P
-
TDD : DO I reallly needs to learn it ?IMHO, in such a complex scenario (4000 complex features) I think it would be very difficult to create tests that would probe a significant combination of the application states that can arise in real-life use. Concurrency errors come to mind... To back up my opinion: when many of us developers test our applications manually, we tend to unconsciously follow a certain usage pattern, which hides nasty errors that are then discovered (often surprisingly early) by other users simply employing a different usage pattern. What could be a more invariant usage pattern than a predefined, scripted test? I'm not against TDD (any technique that advances the state of software engineering is welcome), but I think it's no panacea, and I am often wary of any technique that might create a false illusion of safety and foster a relaxed attitude towards code correctness. ;)
-
How was your first day on .net?I used to be an advanced Delphi developer, but then took a job where I was forced to do VB6. X| I felt my hands were tied and was deeply frustrated by that language's lack of power. Then one joyful morning we were green-lit to use VB.net, and I saw FREEDOM again, just like when I was coding in Delphi. The only thing I really missed from VB6 was its edit-and-continue philosophy (the feature was not available in VS2003); I had unknowingly reshaped my coding patterns around it! Anders Hejlsberg has created, IMO, the best development platforms since Borland Turbo Pascal's days. Delphi was a masterpiece, and Microsoft made an extremely good strategic move bringing his talent to the table. :-D
-
For the first time everWell, at least it's VB.net, you still have the full power of the framework and full OOP; it could have been VB6! X| Probably someone else has already suggested it (I didn't read the whole thread), but you could write in C# and use Reflector or some similar tool to turn code to VB.net syntax ;) Another positive trait (IMHO) is the availability of the with statement. I really like it a lot ;) Good luck! :cool:
-
state-of-the art computer/machine vision systemBCDXBOX360 wrote:
vision by machines seems 2 lag behind the simplest animal u can think of (like a cat or something else).
:confused: I doubt that a cat qualifies as the simplest animal I can think of; besides, if cats had such a simple visual system, they probably wouldn't be the formidable predators that they are....
-
Audio processingFast Fourier Transform filtering is certainly an option. It's a very intuitive solution since, as BobJanova states, band-pass filtering can be easily achieved with these steps: 1) perform a forward FFT of the input data, 2) zero the result bins of the frequencies we want to remove, and 3) perform a reverse FFT on the same data. However, this is a computationally expensive approach; if fast execution is a must, I would recommend trying a FIR digital filter instead. I have no experience designing such a filter myself, but there used to be a really great freeware tool around named DSPlay. I Googled for a while and unfortunately can't find it any more. This GUI tool allowed you to easily create band-pass filters by specifying a few input parameters, and it generated very simple C code that was directly usable in C#. Let me take a look, as I'm certain I have the DSPlay tool in a previous hard disk backup... :)
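The three FFT steps above can be sketched in a few lines. This is a minimal illustration using NumPy (the function name, band edges and sample rate are arbitrary example values, not from the thread):

```python
# Band-pass filtering via FFT, as described above: forward FFT,
# zero the unwanted frequency bins, then inverse FFT.
import numpy as np

def fft_bandpass(signal, sample_rate, low_hz, high_hz):
    """Keep only frequencies in [low_hz, high_hz]; zero everything else."""
    spectrum = np.fft.rfft(signal)                       # 1) forward FFT
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    spectrum[(freqs < low_hz) | (freqs > high_hz)] = 0   # 2) zero unwanted bins
    return np.fft.irfft(spectrum, n=len(signal))         # 3) inverse FFT

# Usage: a 50 Hz tone plus a 1000 Hz interferer; keep the band around 50 Hz.
rate = 8000
t = np.arange(rate) / rate
noisy = np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 1000 * t)
clean = fft_bandpass(noisy, rate, 40, 60)
```

Note that this brick-wall zeroing causes ringing at the band edges, which is one more reason a properly designed FIR filter is preferable when quality matters.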
-
Superstitious Programmers.We in the office fear Mercury-retrograde periods like hell. There have been several such periods where things tend to fail at the same time (like four remote machines dying of completely unrelated causes on the same day, or two RAID-5 clusters losing two drives simultaneously!) Odd indeed... :P
-
Into bad habitsHello Fabio, I agree with you 100%. I think strictness in the language is very desirable, especially considering that many of us coders go through some very intense periods where you get to sleep a couple of hours a day if you're lucky. I know there are gifted people who thrive under such intense pressure, but at least in my case, with sleep deprivation I tend to see a dramatic rise in my mistakes, and I certainly like to know the compiler is helping me a bit with the error count. :P Regarding the Meloncard article, I think the developer is cutting himself some slack. The mistake may appear subtle indeed, but I strongly believe it's a mistake that would have shown up in a simple two-concurrent-user test. Maybe I'm wrong, but it's not the first time I've seen a contrived, jargon-filled excuse that attempts to deflect direct responsibility for an issue. Cheers! =)
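To make the two-concurrent-user point concrete, here is a deterministic sketch of the general check-then-act race class (hypothetical code, not the actual Meloncard bug): both requests pass the "already redeemed?" check before either one records the coupon, so the discount is applied twice.

```python
# A check-then-act race: two concurrent requests redeem the same coupon.
# The Barrier forces the worst-case interleaving so the bug fires every run.
import threading

redeemed = set()     # coupons already used
credited = []        # every credit that was actually applied
barrier = threading.Barrier(2)

def redeem(coupon):
    if coupon not in redeemed:   # check...
        barrier.wait()           # both threads have now passed the check
        redeemed.add(coupon)     # ...then act: too late, both will credit
        credited.append(coupon)

threads = [threading.Thread(target=redeem, args=("SAVE10",)) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(len(credited))  # 2 - the same coupon was credited twice
```

A single-user scripted test walks the check and the act back to back and never sees this; two overlapping users find it immediately.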
-
Another language to learn!ii_noname_ii wrote:
There is a much darker purpose for Dart
Yeah, call it "Dart" Vader! ;)
-
Did I St-St-Stutter?ROFL I know exactly what you mean. Stuff like WPARAM and LPARAM both being 32-bit integers... X| I still think the 'p' prefix for pointer variables was clever, if only to make your brain switch to pointer-awareness effortlessly upon variable usage. That's where I see its primary advantage (if not, arguably, its only one). I also hated the different flavors of Hungarian. Everyone I know who uses or used it has their own conventions, defeating one of its primary purposes. :)
-
Did I St-St-Stutter?Lol, I think that may be true with the current generation of compilers, code-completion tools & typesafe frameworks, but back in the early 90's, with most development done in C or C++, it really made some sense. At least to me it was extremely convenient, for example, to instinctively figure out the level of indirection required in pointer operations (when I saw ppnSomething, I knew instantly that I was manipulating a pointer to an int pointer). After being sold on Hungarian, it was hard to work on code that didn't use it, as I had to constantly check declarations and documentation and just felt something was missing. Just my very debatable opinion. ;) -
small, slow memory leakHello, This might not be applicable in your particular scenario, and there are performance issues to account for (starting an exe is relatively slow), but I have dealt with leaky third-party libraries I couldn't fix by wrapping their use in a separate executable file. Instead of linking the library directly into my application, I execute the wrapper process and pass the parameters on the command line. The point is that *all* resources are freed once the wrapper process dies. You could improve performance by creating a service-like app (probably using WCF) that is restarted once a memory-usage threshold is exceeded. This might be a little harder to code, but it avoids the performance penalty incurred each time a process starts. This should only be considered as a last option. Hope this helps, and good luck! :cool: Sergio
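The isolation idea is language-agnostic, so here is a minimal sketch of it in Python (the helper name and the "leaky" function are made up for illustration; in the real scenario the child would call into the leaky third-party library):

```python
# Process isolation for a leaky library: run the leaky work in a child
# process so the OS reclaims *all* of its memory when the child exits.
import subprocess
import sys

def run_isolated(arg):
    """Spawn a child interpreter that does the leaky work and prints a result."""
    child_code = (
        "import sys\n"
        "def leaky_work(x):\n"
        "    junk = [b'x' * 1024] * 10000  # stand-in for the library's leak\n"
        "    return int(x) * 2\n"
        "print(leaky_work(sys.argv[1]))\n"
    )
    out = subprocess.run([sys.executable, "-c", child_code, str(arg)],
                         capture_output=True, text=True, check=True)
    # When the child exits, every byte it leaked is returned to the OS.
    return int(out.stdout)

print(run_isolated(21))  # 42
```

The long-lived-service variant mentioned above trades this per-call spawn cost for an occasional restart when the worker's memory crosses a threshold.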
-
This is why I am starting to loathe programmingHi Fabio, Yes, you are correct, good code should follow the IDisposable pattern documented by Microsoft, that way you can dispose unmanaged resources early if you want, or leave it to the GC. I'll look at your example, because if what you say is true, then Microsoft is not following its own recommended patterns! Thanks for the reply, Gerardo
-
This is why I am starting to loathe programmingI agree with Leppie. The IDisposable pattern (when properly implemented) has a safety net that will release native resources even if we didn't call Dispose() or wrap our instance in a using statement. Garbage collection might take a while to kick in, and yes, it is possible to write an application where garbage collection will never occur; but if you're talking about resource exhaustion, that means you are constantly creating objects, so there IS memory pressure and the GC WILL eventually run (unless, of course, you are unnecessarily keeping references to objects that should be collected, and by doing so inducing a memory leak). Cheers, Gerardo
-
This is why I am starting to loathe programmingHi Fabio, Sorry, your example is not entirely correct. If you do not call Dispose() on a Form class instance, it will still be disposed (with its underlying window handle properly released) when it's garbage collected. The IDisposable pattern (which is implemented, as far as I know, in all .net Framework classes that hold native resources) takes care of precisely this. The only difference is that you don't know exactly when the garbage collection will occur, and since it's triggered by memory-allocation pressure, it might take a while to actually happen. That's why it's wise to call Dispose(). Cheers! :) Gerardo
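The two-path structure described here (deterministic Dispose plus a finalizer safety net) can be sketched by analogy in Python; this is NOT the C# pattern itself, just an illustration of the idea, with a made-up NativeHandle class standing in for a class that owns a native resource:

```python
# Analogy of the IDisposable safety net: close() plays the role of Dispose()
# (deterministic release), __del__ plays the role of the C# finalizer (the
# garbage collector's backstop), and the 'with' block plays the role of 'using'.
released = []  # records which handles got released, and how

class NativeHandle:
    def __init__(self, name):
        self.name = name
        self.closed = False

    def close(self):               # ~ Dispose(): eager, deterministic release
        if not self.closed:
            self.closed = True
            released.append((self.name, "explicit"))

    def __del__(self):             # ~ finalizer: fires at collection time
        if not self.closed:
            self.closed = True
            released.append((self.name, "finalizer"))

    def __enter__(self):           # ~ C# 'using' statement
        return self

    def __exit__(self, *exc):
        self.close()

with NativeHandle("a"):            # released eagerly, like 'using'
    pass
forgotten = NativeHandle("b")
del forgotten                      # no close() call: the safety net fires
print(released)  # [('a', 'explicit'), ('b', 'finalizer')]
```

Just as in .NET, the "forgotten" handle is eventually released, but only when the collector gets to it, which is exactly why calling Dispose() (or close()) explicitly is the wise choice.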
-
Is RAID 5 worth it?Hi Luis, Thanks for the warm welcome. Good luck with your trials! In fact, I've been a Code Project user for quite some time; however, I'm not very active on the forums. I've found some excellent code here, and I'm really grateful to the community - I hope someday I can give something back that's really useful. :) I love programming and I'm totally into pattern recognition techniques. I work at Obsidian, a company some good friends and I formed a few years ago. We monitor TV and radio broadcasts with a software suite we developed in-house (that's why we need so much storage space). Are you also an entrepreneur at Intelectix? Cheers! :cool: Gerardo
-
Is RAID 5 worth it?Hello Luis, greetings from a fellow countryman :) In English, to contribute more to the thread: we're currently working with around 20 servers equipped with 4TB RAID-5 units (five 1TB drives each), and the experience has been quite positive. We're not concerned with read/write performance, mainly information volume, so I can't offer an opinion on that aspect. The controller is hardware-based (Intel), built directly onto the motherboard. Here's our experience so far: we sparingly see soft failures that are simply corrected by a rebuild operation (sometimes taking around 25 hours to complete); we experience a noticeable slow-down during the rebuild, but we still manage to stay fully online. We have survived 6 single-disk failures so far (in a one-year period) with no problems at all rebuilding, and minimal off-line time (we have no spare disk installed). We have also had a simultaneous dual-disk failure, only once, and fortunately we were lucky enough to recover virtually all the information using a wonderful commercial tool. :cool: One CRITICAL factor is temperature. Try to keep your drives well spaced and, if possible, install additional fans to keep them cool. This greatly increased the time between soft failures for us. In a nutshell, I would strongly recommend hardware-based RAID-5! Regards! Gerardo
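For anyone curious why a single-disk failure is survivable at all, here is a tiny sketch of the XOR parity idea behind a RAID-5 rebuild (toy byte strings in place of real disk stripes; real arrays rotate the parity across drives):

```python
# RAID-5 parity in miniature: the parity stripe is the XOR of the data
# stripes, so any ONE lost stripe equals the XOR of all the survivors.
def xor_blocks(blocks):
    out = bytearray(len(blocks[0]))
    for block in blocks:
        for i, byte in enumerate(block):
            out[i] ^= byte
    return bytes(out)

data = [b"disk0", b"disk1", b"disk2", b"disk3"]  # 4 data stripes
parity = xor_blocks(data)                        # the 5th drive holds parity

lost = data.pop(1)                               # one drive fails
rebuilt = xor_blocks(data + [parity])            # rebuild from the survivors
print(rebuilt == lost)  # True
```

It also shows why a second simultaneous failure is fatal without a recovery tool: with two stripes missing, the single parity equation no longer has a unique solution.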